The other day Jon was (half) joking that modern enthusiast GPUs should be measured in horsepower, as they now consume about a third of a horsepower on the high end. That got me thinking about a computer system expense evaluation tool I started working on a few years ago, named RealCost. The tool could be used for just about any electronic device and is intended to estimate the total cost of ownership based on the initial expense and the power the device consumes. Consumer-adjustable variables include average hours of use per day, load weighting, planned life, and local energy cost. It is still a work in progress, and I am considering adding variables like opportunity cost, inflation rate, and liquidation value.
The current formula is definitely a simplification, as different GPUs may produce different load levels under the same usage parameters. It also does not account for time-of-day or seasonal variations in energy cost, for performance capabilities, or for whether the heat generated is welcome (I can attest that the hot air blowing out of my rig is most welcome this time of year). Retail cost, currently based on release price, could also become a consumer-adjustable variable, since shoppers willing to wait a few months are most often rewarded with price drops.
To arrive at the RealCost I add Retail Cost and Power Cost. Power Cost is calculated with the following formula: (Average Draw * Average Use Per Day in Hours * Days in Billing Cycle * Average Cost Per Kilowatt-Hour in Local Market / 1000) * (Planned Life in Years * 12). Average Draw is calculated as follows: (Idle Draw * Idle Weighting) + (Loaded Draw * Loaded Weighting).
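The formulas above can be sketched in a few lines of Python. This is only an illustration of the calculation as described; the function names and the sample GPU numbers (60 W idle, 300 W loaded, a $700 retail price) are my own assumptions, not values from RealCost itself.

```python
def average_draw(idle_w, loaded_w, idle_weight, loaded_weight):
    """Weighted average power draw in watts (weights should sum to 1)."""
    return idle_w * idle_weight + loaded_w * loaded_weight

def power_cost(avg_draw_w, hours_per_day, days_per_cycle,
               cost_per_kwh, planned_life_years):
    """Energy cost over the planned life, assuming monthly billing cycles."""
    cycle_kwh = avg_draw_w * hours_per_day * days_per_cycle / 1000
    return cycle_kwh * cost_per_kwh * planned_life_years * 12

def real_cost(retail, avg_draw_w, hours_per_day, days_per_cycle,
              cost_per_kwh, planned_life_years):
    """RealCost = Retail Cost + Power Cost."""
    return retail + power_cost(avg_draw_w, hours_per_day, days_per_cycle,
                               cost_per_kwh, planned_life_years)

# Hypothetical card: 60 W idle / 300 W loaded, weighted 70/30 idle/load
draw = average_draw(60, 300, 0.7, 0.3)            # 132 W average
# $700 retail, 4 h/day, 30-day cycles, $0.15/kWh, 3-year planned life
total = real_cost(700, draw, 4, 30, 0.15, 3)      # ≈ $785.54
```

Under these assumed inputs, power adds roughly $85 to the $700 sticker price over three years; heavier load weighting or higher rates move that number up quickly.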
Power can be expensive, sometimes approaching 50% of the initial AIB expense over a three-year period. While many PC gamers would gladly pay for performance, when two cards finish in a dead heat on their favorite benchmark, understanding the total cost of ownership could be the tiebreaker.