This blog post is around seven or eight years overdue. Anyway, I had been musing about energy grids at the time, thinking about things like long-distance alternating current (AC) power transmission, the rise of solar panels, and consumer electronics. I figured that in 15-20 years (so about 7-12 years from now), we'd start seeing direct current (DC) power become more widely available from the wall. Before that can happen, people need both a supply of DC power and devices that use it.
Thailand was tragically flooded last year. (The number of surprise alligators in the region also likely increased.) Since Thailand is home to much of the world's production capacity for hard drives, the shutdown of facilities there caused the cost per gigabyte of magnetic hard drive storage to balloon to prices not seen since the middle of the last decade. Demand for hard drives dropped. Value-driven consumers may have opted for alternatives such as the (normally more expensive) solid state drives (SSDs) or removable media like DVDs. Other purchases may have been deferred, with people playing catch-as-catch-can with the hard drives they already owned until prices regained some semblance of normalcy. All of this is obvious. But none of it would have merited such a… cache-y blog post title.
Continue reading “Cache-as-cache-can”
Last week, Professor Mor Harchol-Balter visited us at the University of Toronto to deliver a talk as part of the Department of Computer Science's Distinguished Lecture Series. In her excellent talk, she showed how intuition often fails us when scaling systems to meet a given load, even with perfect information about load patterns: the number of computers needed is routinely over-estimated, which increases energy use through over-provisioning. She also presented some experimental results on power-saving policies in data centres that will be appearing in conferences soon. One of her examples of intuition breaking down: given n computers serving r requests per unit time at load l with a response time of t per request, how many computers are required to maintain response time t and load l if r increases by, say, a factor of one hundred?
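The intuitive answer is "one hundred times as many," but queueing theory says otherwise. As a rough sketch (my own illustration, not taken from the talk), if we model the system as an M/M/k queue and use the Erlang-C formula, we can find the smallest number of servers that keeps the mean queueing delay under a target:

```python
import math

def erlang_c(k, rho):
    """Erlang-C: probability that an arriving request must wait in an
    M/M/k queue with offered load rho = lambda/mu (in Erlangs)."""
    term, series = 1.0, 0.0              # term tracks rho^i / i!
    for i in range(k):
        series += term
        term *= rho / (i + 1)            # after the loop, term = rho^k / k!
    top = term / (1 - rho / k)
    return top / (series + top)

def servers_needed(rho, mu, target_wait):
    """Smallest k whose mean queueing delay stays below target_wait."""
    k = int(math.ceil(rho)) + 1          # need k > rho for stability
    while erlang_c(k, rho) / (mu * (k - rho)) > target_wait:
        k += 1
    return k

# One server-second of work arriving per second; keep mean wait under 0.1 s:
print(servers_needed(1.0, 1.0, 0.1))
# One hundred times the load needs far fewer than one hundred times the servers:
print(servers_needed(100.0, 1.0, 0.1))
```

The second call returns just over one hundred servers rather than one hundred times the first answer: at high load, statistical multiplexing means capacity needs grow roughly like the offered load plus a multiple of its square root, which is exactly the kind of counter-intuitive scaling the talk highlighted.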
During her talk, Professor Harchol-Balter mentioned that banks refuse to share computing power with anyone else. This seemed like a missed opportunity for energy savings through virtualization, although, in a short-term economic analysis, millions of dollars of power saved is chump change for these institutions. I later asked her about this, and she conceded that banks would be willing to share their servers with other instances of their own software, given certain constraints. Which brings us to a continuation of my previous depth paper excerpt blog post, Energy-proportional computing.
The following post is an excerpt from a subsection of my Ph.D. depth paper, minus the citations, which appear in the original document. It deals with the concept of energy-proportional computing.
Continue reading “Energy-proportional computing”
I finally registered a domain for myself, but it was something of an impulse buy. The hosting company was having a sale ($2.95/month, domain included), but that wasn't what sold me. It was the energy source for the servers: wind power. Perhaps they have backup fossil-fuel-burning generators and maybe their offices aren't nearly as green, but the folks at FatCow are mooving in the right direction. They lack some features I would have liked, and their servers sometimes seem slow to respond (shutting down servers due to lack of wind?), but we all have to make sacrifices, right?
Oh, and until August 24th, if you sign up at http://www.fatcow.com/secret with code AUG295, you, too, can get web hosting on a clean-energy-powered server for a year for $2.95 per month. I get $25 credit if you enter heresjono.com as your referrer. Please do!
Isn’t the pace of technological advancement remarkable? For less than the price of a month’s worth of kitty litter, one can buy an 8GB microSD memory card that is smaller and lighter than a penny. That’s enough to store the text of all English Wikipedia articles as of March 2010 with room to squeeze in a minimal install of Apple’s current desktop operating system ((2 GiB is sufficient for a bootable disk image of Mac OS X 10.6)). And that’s just one month’s worth of kitty litter.
If, say, your entire office burns down due to an electrical fire from a photocopier next door, off-site backups are worth their weight in platinum. If you don’t believe me, just look up how much professional data recovery services cost and compare that to the weight of a hard drive or fifty. Some people use e-mail as a back-up system, but it isn’t really appealing as a long-term, scalable solution. Amazon’s S3 provides variable pricing and is scalable, but may be overly complicated for backing up small amounts of data. Some off-site back-up services such as Backblaze are great deals for people with large storage needs ((I am not affiliated with or endorsing Backblaze. I just find their blog posts interesting.)).
The following post is a collection of excerpts from a draft of my Ph.D. depth paper, minus the citations, which appear in the original document.
Few computer systems spend all of their time at full utilization; even in always-on settings such as servers, a properly provisioned system will spend almost all of its time at less than 50% utilization. Since most personal computers remain idle for extended periods of time ((In the context of this document, idle systems are ones that are on but performing no useful work.)), consumers should consider a computer's idle power draw when purchasing one. However, performance per watt is the number that is usually compared.
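To see why idle draw can matter more than performance per watt, consider a back-of-the-envelope comparison (the wattage figures below are hypothetical, chosen only for illustration):

```python
HOURS_PER_YEAR = 24 * 365

def annual_kwh(idle_w, load_w, utilization):
    """Annual energy use given idle draw, full-load draw, and the
    fraction of time spent at full load (a two-state simplification)."""
    average_w = idle_w * (1 - utilization) + load_w * utilization
    return average_w * HOURS_PER_YEAR / 1000

# Machine A: better performance per watt at full load, but a high idle draw.
# Machine B: worse at full load, but sips power while idle.
a = annual_kwh(idle_w=80, load_w=180, utilization=0.10)
b = annual_kwh(idle_w=30, load_w=220, utilization=0.10)
print(round(a), round(b))  # → 788 429
```

At 10% utilization, the machine with the lower idle draw uses roughly 45% less energy per year even though it draws more power under load; the larger the idle fraction, the more idle draw dominates the total.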
One year ago last week, I put the finishing touches on a new computer. I drew up my game plan for a low-powered machine in January 2009, but I was teaching a course at the time, so it wasn't until reading week that I found three (almost) uninterrupted days to bring the plan to fruition. While I've seen a number of articles and blog posts about building a low-powered computer, I've seen only a few documenting the end result after real-world use.