
Apple M2 CPU, MacBook Air, MacBook Pro at WWDC22

Status
Not open for further replies.
Good questions with more implications for hacks.

It's not as tidy as the concern about overcommitting system RAM.

Think of how you've identified drive DRAM as being a cache between the Mac's CPU and flash, then identify system RAM as a cache between the cloud and the Mac device in the same manner.

Everything Apple wants is for your Mac to be a way-station for rented services that it connects to via the network.

Every step of recent Apple device design has moved away from a PC-style unreliable bin-of-parts under the tyranny of the ruling OS to a more tightly integrated and reliable device package that provides an Apple customer a hookup to a tyranny of a ruling cloud service. Next up: AR/VR.

Where does the data sent to the drive come from or go to but RAM? That's where any data you immediately need to work on should be. So the question isn't how the CPU can help get data to the drive, but the other way around. To quote Sun Ra, "space is the place," or rather "RAM is the place." By closely integrating flash with RAM you can optimize the spatial-temporal layout of data to smooth access to a dynamic cloud. Put another way, Apple thinks the last place you inter your data is on your device; they want to rent you a better version of yourself through their Colossus.
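To make the cache-hierarchy analogy concrete, here's a minimal Python sketch of system RAM acting as an LRU cache in front of a slower tier (flash, or ultimately the cloud). The names and the dict standing in for the backing store are purely illustrative, not any actual macOS mechanism:

```python
from collections import OrderedDict

class RamCache:
    """Toy sketch: RAM as an LRU cache over a slower tier (flash/cloud)."""
    def __init__(self, capacity, backing_store):
        self.capacity = capacity
        self.backing = backing_store   # dict standing in for the slower tier
        self.cache = OrderedDict()
        self.hits = self.misses = 0

    def read(self, key):
        if key in self.cache:
            self.cache.move_to_end(key)      # mark as most recently used
            self.hits += 1
            return self.cache[key]
        self.misses += 1                     # "page in" from the slower tier
        value = self.backing[key]
        self.cache[key] = value
        if len(self.cache) > self.capacity:
            self.cache.popitem(last=False)   # evict least recently used
        return value

cloud = {f"blk{i}": i for i in range(10)}
ram = RamCache(capacity=3, backing_store=cloud)
for k in ["blk0", "blk1", "blk0", "blk2", "blk3", "blk0"]:
    ram.read(k)
# after this access pattern: 2 hits, 4 misses, blk1 was evicted
```

The point of the analogy: the same hit/miss/evict logic applies at every level, whether the "backing store" is flash behind RAM or a cloud service behind the whole device.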

For this to work, device subsystems have to be very dependable, which is why the insides of Apple devices are soldered and glued together.

Ultimately, you will choose your device config by the scale and cost of your workload as it relates to cloud services, where if you require more data autonomy you will need a more powerful device, the cost of which will be justified by the value your peonage adds to Apple's hive.

As macOS becomes more tuned to Apple's HW, its optimizations are going to become more tailored to Apple's specific HW and less adaptable to "PCs". An NVMe drive is built according to a system layout assumption that privileges residency of data in local storage, but data in RAM is what should be privileged in a network node. Flash is just a local warehouse.

(Alan Kay argued that today's CPU design is weak because it over-emphasizes RAM as a warehouse at the expense of computing, which is endlessly thwarted by caches, but he was looking — as was Alan Turing in his later work — at the elegance of the computational structure of life.)

Every hackintosher should become keenly aware that the Mac is truly not-a-PC; it's something that emerged from a history that had a major PC era, which Apple has deftly parlayed as a stepping stone to a more profound system of market control. As to what this means for humanity, well, so far these companies are more Dr. Evil than truly evil, but watch out for the banality of evil, it can surprise you!

Jobs will gloat over Gates in the grave. But it's the DoE / ARPA / Xerox PARC / SRI nerds that will rise in the Apocalypse when the accounting of the 9 Billion Names of God is complete.

For those who want to keep with their PC and maintain the freedom to truly invent: Linux, now more than ever.

As usual an intriguing overview. I always respect and appreciate your insights as an industry insider - at least I believe that's what you've hinted at being. :thumbup:

However your use of words like "tyranny" and "Colossus" and "Evil" give me a somewhat impolite vision of yourself as a crusading Knight, armoured and wielding a large blood-stained sword, bellowing at the enemy!

I don't think Apple is in any way evil, nor being tyrannical. We only have to compare them with all the other large tech companies and see that only incredible success at what they do makes Apple stand out more than any other. This in turn leads to "tall poppy" syndrome where everyone feels the need to take a pop at them.

I am a late-comer to the "cloud" methodology, previously being staunch in my belief that local storage is best. Paranoia I guess. But now I pay monthly for extra iCloud storage and services. Not a problem. I believe a lot of Pen-Testing crusaders have tried to poke their swords into online security and, in Apple's case, rarely given me cause to worry. Indeed I've lost more through stupid errors on my part or failing hard-drives. But I have no dark secrets anyway.

As for caches and memory-paging (does anyone even call it that any more?) I would love to learn more about how Apple makes such apparently insignificant machines, when compared with monster PCs, do so much in so little. A friend of mine who is a YouTube creator swapped from a big PC to a base Mac Mini, and knocks out high-quality videos with Final Cut Pro, amazed at how the little Apple machine is so much more efficient than his old hardware. He points out real-time editing without much of a lag. Yes, I understand words like "optimisation" and "walled-garden" but whatever the complaints - it works in real-world applications!

:)
 

I am not an industry insider. Very far from it. I'm a city kid with technical curiosity from nowhere who was lifted by Reagan's defense budgets and the best ways that a democratic state uses its power to make opportunities available through local education and business partnerships. It just so happened chips and PCs went when and where I lived, and I worked in the belly of one of the beasts. Apple products were / are a big part of my surroundings. I learned programming on an Apple II with an 80-column card, dual 5 1/4 inch drives, and UCSD Pascal for that machine. That led to IT, to supercomputer engineering, and ultimately into freelance community engineering. I am a practicing computer professional, but not in the "industry" in any substantial way anymore. Maybe that will change.

As to words like tyranny, this is a direct reference to Microsoft. But underneath is the irrational but healthy fear most of us feel at unaccountable bureaucracies with extraordinary power over our lives. I am aware of my ideology and resolutely pro-American in the sense that I would prefer to be murdered by our guys than by those from other countries, but I am more a true Marxist than most techno weenies in that I agree with that economic analysis and see that it applies in our democracy.
I love movies and they're always playing in my mind, and many of my utterances are oblique movie dialog or sci-fi story references.

Colossus is a direct reference to The Forbin Project, about a NORAD SAGE-style computer getting a mind of its own and putting humanity at its foothills, so to speak — the giant computer is embedded in the Rockies. Another version of this appears in the recent TV show Devs. Colossus does a mind-meld with a Russian version of itself called Guardian, and a story similar to Dr. Strangelove plays out, but with no irony. The story is stupid, but I like the way it expresses certain strange assumptions about the psychic drives of Western American techno nerds.

I take your point about your friend who is a YT (not Young Thing, per Snow Crash) creator very well! Your anecdote is precisely how I see Apple aiming their products.

The next step is the goggles.
 

There's no doubt that all the tech companies want us to subscribe to one service or another but being a very cheap individual, I resist at every turn. The only thing I subscribe to is Apple TV+ because it's cheap and I find the content to be of very high quality. I don't think that macOS nor the Macs push me towards any subscriptions.

I'm old fashioned and choose to store all my data locally. Total capacity of my NAS is fast approaching 100TB with (what I feel to be) good redundancy. I view the SSD storage of my primary system to be "temporary" and archive everything to my NAS.

I started to wonder about the SSD on the Apple Silicon Macs after hearing of the poor SSD performance of the M2 MacBook Pro. I also wondered how RAM size may affect the perceived performance. For example, on an 8GB model, the system can/may be paging a large swap area when these people are conducting their SSD benchmarks.

As I said earlier, I really feel like 8GB is insufficient in 2022 and 16GB should be made the minimum on all Macs now.
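The point about paging contaminating benchmarks can be shown with toy arithmetic (the numbers below are illustrative, not measurements of any real Mac): the drive's fixed bandwidth is shared between the benchmark's own I/O and any swap-file paging happening at the same time, but the benchmark only credits itself for its own bytes, so its reported speed drops.

```python
def apparent_throughput(drive_mbps, benchmark_mb, swap_mb):
    """Reported MB/s when benchmark I/O shares the drive with swap traffic.

    Toy model: total bytes moved = benchmark bytes + concurrent swap bytes,
    all at the drive's fixed bandwidth; the benchmark divides only its own
    bytes by the total elapsed time.
    """
    total_mb = benchmark_mb + swap_mb
    elapsed_s = total_mb / drive_mbps   # time to move everything
    return benchmark_mb / elapsed_s     # MB/s the benchmark reports

# hypothetical 3000 MB/s drive, 6000 MB benchmark run
clean = apparent_throughput(3000, 6000, 0)       # no paging: 3000.0 MB/s
paging = apparent_throughput(3000, 6000, 2000)   # heavy paging: ~2250 MB/s
```

So on an 8GB machine under memory pressure, the same drive can legitimately benchmark slower, which is one reason RAM size and SSD results are hard to untangle.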
 

It's this mixing-up of technologies which I wonder about when reviews come out with their "damning" results.

Like the screen-grab earlier of the temperatures in an M2 laptop. Are there specific sensors in different parts of the M2 or motherboard, to monitor each part separately? It's all "unified" after all. With a PC there are CPU sensors, SSD sensors, North/South Bridge sensors etc. So when I see iStat Menus, or whatever software was being used, giving readings like that, I would like to know if it's accurate. Perhaps iStat has insider knowledge. Or ... maybe someone has been pointing an IR thermometer at the motherboard. Hmm, that's accurate - not.

For sure Apple has made mistakes with thermal throttling; history shows that. Form over function perhaps. I'll be really surprised, though, if those temperature readings are wholly accurate for the M2. That would be a big slip by Apple if so.
 

On my Mac Studio, there are a ton of sensors!!

Screen Shot 2022-07-01 at 5.06.32 PM.png
But I don't think anyone knows what the TjMax for these chips is... Without knowing that, how do they know it's overheating???

Edit:
Btw, those temps are in Fahrenheit. Lol
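Since the readings are in Fahrenheit, a quick conversion helps before comparing them against the Celsius-denominated junction limits people usually quote (the 150 °F input below is just an example figure, not taken from the screenshot):

```python
def f_to_c(temp_f):
    """Convert a Fahrenheit sensor reading to Celsius."""
    return (temp_f - 32) * 5 / 9

# a scary-looking 150 °F reading is only ~65.6 °C, far below the
# ~100 °C TjMax typical of recent Intel parts (Apple Silicon's limit
# is unpublished, as noted above)
reading_c = round(f_to_c(150), 1)   # 65.6
```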
 

Thanks. Someone I can trust giving us something relevant to look at. Excellent. :thumbup:

That's exactly the point, isn't it? Has Apple released this information? But I'm still impressed with all those readings!

:)
 
Go to one megaplex movie theater once a month. You sit through a half hour of horrible movie previews and have annoying people talking right next to you. The cost of the popcorn with fake butter flavor and too much salt is more than what you pay for one month of Apple TV+. Easy to see what the better choice is.
 

At $60/year, I think Apple TV+ is a steal, especially when compared against what Netflix charges. I get full 4K streams without having to pay extra. Ted Lasso, Severance, Foundation, For All Mankind... Really great shows!

And, best of all, I can pause when I need to get more snacks or pee. Something that's not possible in a theatre. Lol
 
On the topic of the overheating "problem" for the M2 MBP: iCave Dave explains why this is not a real-world problem for the majority of M2 MBP owners. Watch the video starting at the 15:31 timestamp. He explains it quite well. His making the thumbnail look like clickbait is just a joke; he's showing what other Apple tech channels do to get views and clicks.

 
I haven't been tracking this, is there actually news about this matter or just hand wringing?

This YT promoter with apparent eye-rolling ADD doing an irritating Russell Brand impression is seriously pitching beverage mugs based on distinctions between hot liquids, coffee vs tea, then segueing into a cross-promotional riff about how the M2 test was intended to "make the mac hurt, make it suffer" — ???

And by fake Russell Brand's reckoning, what was the test?

He says the test used the device to do video work that it is designed to do. (Prolly DaVinci Resolve, based on his lingity-frankity.)

And what happened??

From the original source of this hot story (I slay me), a higher-than-typical max CPU temp was recorded! Zomg

Is this bad, good, terrifying? Idk

Apple's Intel MacBooks have typically allowed a 100C peak steady operating temperature. They also throttle, which is a typical design response to hitting limits.

So what's the deal here?

This Mac is designed with Apple's own silicon. We already know that the M1, like previous MacBooks, is a balance of performance and packaging that requires care in tradeoffs, with all subsystems being worked at the limits, especially power and therefore thermal. The M2 stands out from the M1 with a higher max clock, which implies higher energy, so other aspects being equal => higher peak chip temps.
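Throttling as a design response can be sketched in a few lines. To be clear, the step sizes and limits below are made-up illustrative constants, not Apple's actual power controller, which is far more sophisticated:

```python
def throttle_step(temp_c, clock_mhz, limit_c=100,
                  step_mhz=200, min_mhz=1000, max_mhz=3500):
    """One tick of a toy throttle loop: back the clock off while at or
    over the thermal limit, creep back up when there is headroom."""
    if temp_c >= limit_c:
        return max(min_mhz, clock_mhz - step_mhz)
    return min(max_mhz, clock_mhz + step_mhz)

clock, history = 3500, []
for temp in [95, 101, 103, 99, 96]:   # simulated die temps in Celsius
    clock = throttle_step(temp, clock)
    history.append(clock)
# history == [3500, 3300, 3100, 3300, 3500]
```

Seen this way, briefly touching the limit and backing off is the design working as intended, not a failure, which is why a recorded peak temp by itself tells you little.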

OK, so what? As was pointed out in another post: "What are the thermal limits of this model?" Did anything actually go wrong?

To me the big news is that MacBook fans can still rev: I hate that!

C3PO to R2D2:
- It's our lot in life, it seems we were made to suffer.
- Blee-bloo-tweezz-bleep!
 