At a first level of analysis, there's a cost per unit of compute work, which lets technologies and designs be compared for efficiency.
Building on this baseline, there's the cost of a task measured in units of computation, e.g., processing an image a given way on a given design requires so much energy.
There's the rate of work, and the efficiency margin traded away to reduce time. In other words, raising the work rate may reduce efficiency due to scaling effects.
There's the relative efficiency of particular codes, and how well algorithms map onto the instruction set architecture.
Then there's implementation waste, such as unneeded or unchecked computation, or overheads that perform no meaningful work but are tolerated.
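The layers above can be made concrete with a toy energy model. This is a minimal sketch assuming dynamic CMOS power scales roughly as P ≈ C·V²·f; the capacitance, voltages, and frequencies are purely illustrative, not measured from any real chip:

```python
# Toy model: energy to finish a fixed workload at two operating points.
# Assumes dynamic power P = C * V^2 * f (all constants illustrative).

def energy_joules(work_ops, freq_hz, volts, cap_farads=1e-9):
    """Energy = power * time; time = work / frequency (1 op per cycle)."""
    power_w = cap_farads * volts**2 * freq_hz
    time_s = work_ops / freq_hz
    return power_w * time_s  # reduces to C * V^2 * work_ops

WORK = 1e9  # one billion operations, the "image" in the example above

slow = energy_joules(WORK, freq_hz=2e9, volts=0.9)
fast = energy_joules(WORK, freq_hz=5e9, volts=1.3)  # higher f needs higher V

print(f"slow: {slow:.2f} J, fast: {fast:.2f} J, penalty: {fast/slow:.2f}x")
```

In this model the energy per task doesn't depend on frequency at a fixed voltage; the penalty comes from the higher voltage needed to sustain the higher clock. That is one concrete form of the "scaling effects" that make racing through a workload cost more joules per image than pacing it.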
If you examine the big picture, there's a reasonable notion of something like vehicle MPG, but for a computer. Such an MPG-like figure, though, is only comparable within a class of device, e.g., passenger cars.
Comparing device classes, such as the relative efficiency of rail versus highway trucking per unit of freight, is another matter, and a similar caveat applies to desktop versus mobile personal computers.
My point here is that the idea of a PC box being arbitrarily costly or hot, without regard to its design -and- workload, is too simplistic.
It's an even further stretch to analogize a hot computing device as having any bearing whatsoever on planetary climate. You'd need to study all the ways people use energy, and the entire ecosystem, to grok where computing fits in. Nothing about this is obvious, and it likely doesn't follow common sense.
As for the cost pressure of electricity on PC design, every modern PC already adjusts its energy consumption, stepping its work rate up and down based on demand.
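That demand-following behavior can be sketched in a few lines. The frequency steps and utilization thresholds below are hypothetical, chosen only to illustrate the idea of stepping the work rate with load:

```python
# Minimal sketch of an on-demand frequency governor. The step table and
# thresholds are hypothetical, not taken from any real CPU or OS policy.

STEPS_MHZ = [800, 1600, 2400, 3200, 4000]

def pick_step(utilization):
    """Scale frequency up when busy, down when idle (thresholds illustrative)."""
    if utilization > 0.8:
        return STEPS_MHZ[-1]  # jump to the top step under heavy load
    index = int(utilization * len(STEPS_MHZ))
    return STEPS_MHZ[min(index, len(STEPS_MHZ) - 1)]

print(pick_step(0.05))  # near idle -> lowest step
print(pick_step(0.95))  # heavy demand -> highest step
```

Real systems implement this in firmware and kernel governors (e.g., Linux CPUFreq) with far more sophisticated policies, but the principle is the same: the box only draws big power when the workload asks for it.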
Design emphasis on efficiency is very important. This became obvious with the adoption of Bitcoin, where mining is engineered to grow more expensive as the finite pool of coins is exhausted, driving up the cost of producing and maintaining the chain of trust, and with it the value of the tokens.
It so happens that Windows PC evolution is driven by marketing excess, similar to muscle cars: racing, leaderboards, bragging rights, and male-pattern competition (an effect similar to male-pattern baldness and color-blindness: a purely genetic trait, so it's not something decidable, it's just how we are).
In general, the desktop PC form factor can afford to be wasteful compared to mobile, because mobile is limited by its battery. There's a culture of waste in desktop PC design, justified by the low cost of its energy relative to other household loads such as stoves and air conditioning. But the true force behind computing efficiency is simply Moore's Law: the rate of improvement in silicon technology has exceeded the demand for computation in personal devices.
In the simplest analysis, it comes down to the workload and the rate at which work is performed. The significance of energy costs to consumers of personal computers, and the devices' utility as space heaters, is anecdotal to larger trends, but it does serve to signify Apple's competency relative to the PC industry.
Apple computers stand out in efficiency not because of inherently more efficient chips, but because Apple has a product emphasis on mobile, enjoys control of the full technology stack, and made it a point of pride to enhance power efficiency. This is about the wisdom to know where to put power in the design, and to know when adding more power doesn't add value to the product.
Windows PCs have been shunned for mobile due to the vicissitudes of their evolution and the consumer market's emphasis on gaming, which is about exploring where excess can take you. So of course we call out their excesses when cost pressures mount. But the choice to be wasteful is emblematic of the USA and our lifestyle, maybe because freedom grows from a surplus, so it's natural to veer toward excess.