
Apple Silicon Mac Pro Revealed at WWDC 2023

At a first level of analysis, there's a cost per unit of compute work, which allows technologies and designs to be compared for efficiency.

Building from this baseline, there's the cost of work as measured in units of computation, e.g., processing an image a particular way on a given design requires so much energy.

There's the rate of work, and the efficiency margins traded away to reduce time; in other words, increasing work rates may reduce efficiency due to scaling effects.

There's the relative efficiency of particular codes, and the efficiency with which algorithms map onto the instruction set architecture.

Then there's implementation waste, such as unneeded or unchecked computation, or overheads that perform no meaningful work but are tolerated.

If you examine the big picture, there's a reasonable notion of something like vehicle MPG, but for a computer: energy per unit of useful work. But such a notion, like MPG itself, is only comparable within a class of device, e.g., passenger cars.
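To make that concrete, here's a minimal sketch of a computing "MPG" as energy per completed task, in Swift; the device names and figures are hypothetical, purely for illustration:

// A computing analogue of MPG: energy consumed per completed task.
// Device names and figures below are hypothetical, for illustration only.
struct TaskEfficiency {
    let device: String
    let joulesPerTask: Double   // lower is better, like gallons per mile
}

// Only meaningful to compare within a device class, just as passenger-car
// MPG isn't compared against freight rail.
let laptops = [
    TaskEfficiency(device: "Laptop A", joulesPerTask: 4.2),
    TaskEfficiency(device: "Laptop B", joulesPerTask: 6.8),
]

if let best = laptops.min(by: { $0.joulesPerTask < $1.joulesPerTask }) {
    print("Most efficient in class: \(best.device)")
}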

Comparison across device classes, such as the relative efficiency of rail versus highway trucking per unit of freight, is another matter, and a similar caveat applies to desktop versus mobile personal computers.

My point here is that the idea of a PC box being arbitrarily costly or hot, without regard to its design and its workload, is too simplistic.

It's even a further stretch to analogize a hot computing device as having any bearing whatsoever on planetary climate. You need to study all the ways people use energy and the entire ecosystem to grok where computing fits in. Nothing about this is obvious and it likely doesn't follow common sense.

As to the cost pressure of electricity on PC design: every PC already adjusts its energy consumption, stepping work rates up and down based on demand.

Design emphasis on efficiency is very important. This became obvious with the adoption of Bitcoin, where tokens are engineered to become more expensive to compute as a finite pool of them is exhausted, thereby driving up the cost of producing and maintaining the chain of trust, and the corresponding value of the tokens.

It so happens that Windows PC evolution is driven by marketing excess, similar to muscle cars: racing, leaderboards, bragging rights, and male-pattern competition (an effect akin to male-pattern baldness or color blindness: a largely genetic trait, so it's not something decidable, it's just how we are).

In general, the desktop PC form factor can afford to be wasteful compared to mobile, because mobile has a battery limit. There's a culture of waste in desktop PC design, grounded in the low cost of its energy relative to other household devices such as stoves and AC. But the true force behind computing efficiency is simply Moore's Law: the rate of improvement of silicon technology is exceeding the demand for computation in personal devices.

In the simplest analysis, it comes down to the workload and the rate at which work is performed. The significance of energy costs to consumers of personal computers, and the devices' utility as space heaters, is anecdotal to larger trends, but it does serve to signal Apple's competency relative to the PC industry.

Apple computers stand out in efficiency not because of inherently more efficient chips, but because Apple has a product emphasis on mobile, enjoys control of the full technology stack, and made it a point of pride to enhance power efficiency. This is about the wisdom to know where to put power in a design, and when adding more power doesn't add value to the product.

The Windows PC has been shunned for mobile due to the vicissitudes of evolution and the consumer market's emphasis on gaming, which is about exploring where excess can take you. So of course we call out its excesses when cost pressures mount. But the choice to be wasteful is emblematic of the USA and our lifestyle, maybe because freedom grows from a surplus, so it's natural to veer towards excess.
 
While energy costs here in the US are still relatively low, the trend seems to be changing quite rapidly.

[Attachment: chart of Con Ed electricity rate increases]
Source: https://abc7ny.com/con-ed-bill-new-york-fossil-fuel/13529638/


9% + 4.3% + 1.7%! Those increases are compounded!

Add in the fact that the entire world seems to be getting warmer with each passing day, and those fire-breathing kilowatt PC rigs make less and less sense. How much harder will your air conditioner need to work to negate the heat your rig is spitting out?
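As a rough back-of-the-envelope answer: essentially all of a PC's power draw ends up as heat in the room, and a typical air conditioner moves roughly 3 W of heat per 1 W of electricity (a COP of about 3, an assumed figure; real units vary). A quick sketch in Swift:

// Extra AC electricity needed to remove a PC's waste heat.
// Assumes COP of ~3; real COPs vary with equipment and outdoor temperature.
let rigPowerWatts = 500.0                 // hypothetical gaming rig draw
let acCOP = 3.0

let extraACWatts = rigPowerWatts / acCOP  // ~167 W extra at the AC
let totalWatts = rigPowerWatts + extraACWatts

print("Rig: \(rigPowerWatts) W, AC overhead: \(extraACWatts) W, total: \(totalWatts) W")

So a 500 W rig effectively costs you something like 667 W while the AC is fighting it.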
I live in a century-old house that stays cool in the summer and warm in the winter. I have electric heating, and even after the increases I do not pay much. I'm using an RX560, so a 500W power supply is completely sufficient; that's only 130W more than the Mac Studio M2 Ultra's power supply.
I work on Monterey, and I know Sonoma will also run on it. I have Mojave on my MacBook. I don't feel the need to switch to the new systems, because they have changed beyond recognition since Ventura, and none of them gives me anything beyond what I already have. So, quietly counting, the Z690 platform will be enough for me even when the i9-14900K comes out. I spent 4x less on this PC than I would have spent on a Mac Studio M2 Ultra. A few months ago I bought a 4TB M.2 SSD, and yesterday I bought 192GB of RAM. I value my freedom too much to buy a Mac Studio.

I was waiting for a new 15" MacBook Air with an M3, but unfortunately Apple gave it a hopeless M2. There is no SD card reader in it, and there are only two USB ports; in my old one I have two USB ports, Thunderbolt, and an SD card reader. Last month I swapped in a 2TB SSD for myself. I have stopped counting on Apple. There are no more interesting solutions there for me.
 

Yes, the Mac Studios have 370W PSUs. No, they do not use 370W of power. I don't know if an M2 Ultra is even capable of pulling 100W. i9-13900Ks have been shown to pull over 250W.

[Chart: CPU power consumption, multithreaded workload]

Source: https://www.techpowerup.com/review/intel-core-i9-13900k/22.html
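For the Mac side of the comparison, macOS ships a powermetrics utility that reports CPU power directly, so the actual draw of an M2 Ultra under load can be checked rather than guessed (run in Terminal; output format varies by macOS version):

# Sample CPU power 5 times at 1-second intervals
sudo powermetrics --samplers cpu_power -i 1000 -n 5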


I envy the people who live in cooler climates. Here in New York City, the summers are miserable and have only gotten worse with the passage of time. If I can eliminate the equivalent of a few 100W incandescent light bulbs, it helps a lot.

As of this moment, it's possible to install Sonoma on a hackintosh but some functions are broken due to the dropping of IO80211FamilyLegacy support. On Sonoma, some things that work on real Intel Macs will not work on a hackintosh. Regardless of whether you use these features, functionality of hackintoshes running Sonoma will be compromised.

Every year, there's criticism that there are no major changes in macOS that are worth upgrading for... A friend recently found an old white MacBook running El Capitan at work, and it's a completely different experience from what Ventura feels like.

If by "freedom" you mean upgradable RAM and SSD, you're not going to find easy ways to do that on current Macs. If you want to stay on macOS, that's the reality you have to live with. Of course, you have the "freedom" of going to Windows or Linux.

If you were contemplating an M3 MacBook Air as a replacement for your rig, then you probably don't need the power of a 13900K anyway.
 
I value my freedom too much to buy a Mac Studio. I have stopped counting on Apple anymore. There are no more interesting solutions for me.

I don't know how to read the term "freedom" WRT hackintosh, because the going-in position is captivity to Apple's stack?

Then there's the enormous cost of hack configuration and maintenance, so it's impossible for me to grok any idea of the "economy" of a hackintosh, except in the sense of having time but not money. Either way, the price must be paid.

If you rationalize these costs as "doing what you like to do," then there's no longer an interesting question of freedom or economy, as you're merely predisposed to your own way: "My reasons are that I like it this way."

I was hoping you were going to say you've finally seen the future of your freedom in some Linux distro, where you get PC price-points on parts, modern kit, and your time spent hacking is invested towards coming generations of kit.
 
Alas, no Logic Pro or Cubase for Linux. And 90% of my beloved plug-ins would only work via Wine, if at all.
I wouldn't be able to use my current expensive audio interface (Mac only) and other specialized equipment.
Moving my photo stuff over to Linux would also be a daunting task.

BTW, installing and tweaking Linux for pro audio is also non-trivial and time-consuming.
 
Some nerding out from 2022 on power management in the M-series big.LITTLE architecture.


Performance: Taken together with the difference in functional units, you'd expect an E core running at maximum frequency and 100% active residency to have a throughput of about a third of a P core at its maximum frequency. In practice, running tight loops of code accessing only registers, E cores can achieve almost twice that expected, giving them nearly two thirds of the throughput of P cores. For example, a task running in two threads allocated to two P cores might complete in 32 seconds, and on two E cores in 52 seconds.

Real-world task performance of E cores isn’t as impressive, though. Compressing an IPSW image using two threads and two P cores takes 32 seconds, but on two E cores takes 134 seconds, for almost a quarter of the performance. Thus, whether code is allocated to the P or E cores can make a substantial difference to the time it takes to complete.

Efficiency: If the relationship between performance and power were linear, then there would be no efficiency benefit to running tasks more slowly on cores that used lower power. Because E cores use less power than that, substantial savings can be made by running tasks on the E cores alone, instead of P cores. One example, based again on file compression, required 10.3 J total energy when run on P cores, and only 3.1 J on the E cores, which is 30%.

Thus, for this specific instance of compression, running its threads entirely on E cores takes four times as long as on P cores, but uses a total of less than a third of the energy.

Let's examine this a little more carefully:
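First, a quick sketch in Swift of what the quoted figures imply, assuming (as quoted) that the energy numbers cover just the compression threads:

// Implied averages from the quoted compression measurements.
let pTime = 32.0, eTime = 134.0       // seconds to complete
let pEnergy = 10.3, eEnergy = 3.1     // joules consumed

let slowdown = eTime / pTime          // ~4.2x longer on E cores
let energyRatio = eEnergy / pEnergy   // ~0.30, i.e. 30% of the energy
let pPower = pEnergy / pTime          // ~0.32 W average on P cores
let ePower = eEnergy / eTime          // ~0.023 W average on E cores

print(slowdown, energyRatio, pPower, ePower)

Same work, roughly 3.3x less energy, for about 4.2x the wall-clock time.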

In the case of file compression, if the compressor running on the P cores just ends up gated by drive I/O, then a total-efficiency paradigm may well want the great energy savings of running on the E cores.

A potential advantage of Apple's vertical integration and systems obsolescence comes into focus here. Under Windows, the dictates of eternal backwards compatibility eclipse any design intention to conserve power: the systems integrator of a Windows (or Linux) stack faces hit-or-miss relevancy of such optimizations (e.g., big.LITTLE) in code libraries and apps, because the integrator is too late to the design party to realize any systems-level advantage. Microsoft's run-anywhere advantage takes precedence over efficiency.
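On Apple's stack, that systems-level hook is concrete: libdispatch quality-of-service classes let code declare intent, and on Apple silicon the scheduler uses them to steer background-class work onto the E cores. A minimal sketch in Swift (exact core placement always remains the scheduler's call):

import Foundation

// On Apple silicon, .background QoS work is generally confined to E cores,
// while .userInitiated work is eligible for P cores.
DispatchQueue.global(qos: .background).async {
    // e.g., bulk file compression, indexing, backups
}

DispatchQueue.global(qos: .userInitiated).async {
    // e.g., work the user is actively waiting on
}

A Windows or Linux integrator can approximate this with thread priorities and scheduler hints, but without stack-wide adoption, libraries and apps opt in only hit-or-miss, which is the point above.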

But it gets weird from a user perspective:

Ultimately the user of a Windows PC may see a radically better synthetic (e.g., Geekbench) compression score while, in real life, the Windows PC and the Mac are working at the same rate, and the total integration of the architecture, taken across many kinds of code on Apple kit, leads to a user experience of substantially increased battery life. But this is difficult to compare or quantify, because a benchmark has no view of total system efficiency.

So while the instruments at hand can't precisely quantify Apple's advantage, and may even reveal an Apple disadvantage, it's possible the Apple device is giving a more well-rounded performance. For Apple, this leads to increased brand loyalty through an intangible sense of quality. But if you try to look for the advantage in synthetic workloads, you may find nothing, leading you to judge Apple's intangibles as snake oil.

Both points of view are true! But Apple's intention to vertically integrate and refine its products still makes sense: it can lead to quantifiable increases in customer engagement with its products. Meanwhile, legions of PC aficionados cry Apple fan-boy BS.

For such reasons, we can surmise that Apple marketing will play a buzz-word performance-claim game, and that it makes sense overall for them to do so even as reviewers cry BS.

Anyone can rightfully argue whether Apple's degree of integration matters to them or not, but Apple can also know empirically that it does matter for broad sectors of the market, and score big wins (e.g., iPhone).

It is a time of reckoning for the hackintosh user. We have to face that Apple only appeared to be cosmopolitan in its earlier decades because of a cultural combination of personal computing zeitgeist, clever marketing, and a need to compete with PC. But while Macs got built out of the same parts as PCs for 15 years, Macs never were PCs. With AppleSi, this downplayed truth is finally coming home to roost, much to the discomfort and chagrin of loyal old-timey users.

You have to face that you were never free. But you are also not free on Windows; it's just a very different nest of constraints.

The freedom is on Linux, but it's a lot more work. And to get there you may have to give something up, which is the price of a previous Devil's bargain with Apple.

Or... Maybe Apple offers true value and you are happy to pay them a modest increment for their good design?

As to upgradability: this is a red herring today for most customers. It's true that Apple traditionally sells under-provisioned kit at an appearance price-point to unwitting buyers in order to preserve its margins, so let the buyer beware. Apple has also produced many ugly and bad designs. But my feeling is that since about 2015, their product spreads have been getting less exploitative every year as they have come fully into their own. Part of this is that personal computers are now considered truly essential appliances, with users having firmer expectations of what they are good for in ordinary life. And Apple has been a trendsetter in setting these expectations, so credit to them where it is due.




Also see ARM big.LITTLE.
 
Every year, there's criticism that there are no major changes in macOS that are worth upgrading for... A friend recently found an old white MacBook running El Capitan at work, and it's a completely different experience from what Ventura feels like.
I totally agree
 
For me, a computer is something that runs the software I prefer to use and have been using for years.
Snow Leopard as an OS was fine. I don't care for most of the inventions and add-ons since OS X 10.6.8.
AirDrop is nice. I fail to come up with anything else.
Mind you, I am not an iCloud user. I don't use Facebook/Twitter/Instagram.
Basically I use Mail, Safari and Chrome, Evernote, Cubase, Logic Pro, the Adobe photo suite, Spotify, Carbon Copy Cloner, and Backblaze on my hack. These programs have been working OK for me for the last 10 years or so (except Backblaze, which I've been using for just 2 years).
I detest the mandatory yearly macOS upgrades with their mostly useless changes.
 