
October 18th 2021 Apple Event: M1 Pro/Max MacBook Pros

Status
Not open for further replies.
"When we set the upper limit of PC-DOS at 640K, we thought nobody would ever need that much memory."

— attributed to Bill Gates, chairman of Microsoft (a quote he has denied making)

“Computers in the future may weigh no more than 1.5 tons.”

— Popular Mechanics, forecasting the relentless march of science, 1949
 
Re customer giddiness: I'm stoked too!

I'm surprised to see my points regarded as an existential threat to consumer happiness...

There is a line somewhere near here where appreciation crosses over to fetish, and I feel it too! IOW I'm in good company.

I am trying to steer the conversation to consider that great innovation must be something more than beating Intel at the process game and repositioning the Start Menu for the 11th time.

This is not innovation, it's refinement.

And these new Macs, while traditionally under-provisioned on storage (and priced to milk customers for upgrades), are gorgeous. Maybe the best testament to these new units is that somehow, for me, a 2012 MBP I bought refurbished from Apple is still a fine daily driver. And that is my key point about innovation.

The fact is, an iPhone is performance-competitive with a top-of-the-line Mac Pro at web browsing.

So we have to face that the meaning of this new Mac architecture is about the tasks at hand, not about any absolute measure of power. And Apple is careful to frame it in just this way (remember the megahertz myth of days of old, when knights were bold).

As for fretting about replaceable RAM, storage, etc.: fixing these components in place can only make the device more reliable.

To those who have been on the fence about a needed/wanted upgrade...

NOTE TO BUYERS: DO NOT HEDGE ON STORAGE. Going by Apple's history, you ought to buy the top storage configuration to preserve the long-term value of the device.

IN MY OPINION: 32 GB of RAM is effectively unlimited on these designs, except for some very specific use cases tied to how Apple is marketing the "pro" angle. So topping out the RAM is truly optional. A 16 GB machine will likely show no performance ills for 9 out of 10 users, but what nerd will say no to more?

One last thought:

"Pro" has a more specific meaning for this cycle than ever before. Many users who are actually pros may not fit Apple's definition, because Apple now unquestionably defines the entire stack from electrons to UI clicks, and the performance claims assume this. As to the subjectivity of benchmarks, the rules apply to all. This new design without a doubt steals bragging rights from Intel, which is super cool.

Please enjoy your new devices!
 
...

40 CPU cores and 128 GPU cores = at least a $25,000 Mac Pro. Sorry, in Apple terms $24,999.
There is also the rumor of a more compact Mac Pro. By the way, Snazzy Labs just released a video (yes, another glowing impressions video) that concludes with this:

[Screenshot: the concluding frame of the Snazzy Labs video]


 

You bring up an excellent point. Yes indeed it is about innovation.

With the introduction of the M1, however, we are coming to a point of great importance (or concern) in the world of computing: as users, consumers and developers, what will the computing landscape look like 5 or 10 years from now? We are at a crossroads, and our choices of platform today will no doubt shape the computing platforms and landscape of tomorrow.

A case in point is hardware and software support and development. If you are a developer who has traditionally relied on, say, an x86 platform for a living, then unless you rely on the Apple ecosystem you may be in for a rude awakening in a few years' time (if the market has shifted towards ARM on the desktop). The problem isn't as simple as choosing between Microsoft Windows, Linux or macOS; it's whether the platform you've chosen will still be supported. As Apple transitions towards ASi, there will no doubt be fewer people in the market able to work with x86 (unless they can afford to use and support two or more platforms). Think of games development and the dilemma programmers now face in choosing between ASi/ARM or x86, AMD and/or Nvidia, and macOS/Windows/Linux/Android/PS5/XSX/Switch. Apple isn't a great games platform (nor a great supporter of PC games development, now that it has ousted Nvidia from macOS and upset partner Epic, owner of Unreal Engine).

Then there's Intel with its new Alder Lake/Meteor Lake architectures, which mark a departure from Intel's traditional homogeneous x86 core designs in favor of hybrid performance/efficiency cores. If the latest reports on those chips are to be believed, Alder Lake could be troublesome for anyone running legacy x86 software (whether on Windows or not). The DRM recognition issue, where some copy-protection schemes misread the hybrid cores, was recently reported as a problem for Alder Lake chips (particularly with older apps or games that aren't patched), and who knows what else it could affect? And with Rocket Lake possibly being the last homogeneous-core Intel x86 desktop platform (something Intel may phase out at some point), users wanting to stick with traditional x86 for the next few years may have to go with that, Ice Lake/Comet Lake, or AMD Ryzen/Epyc/Threadripper instead.

There's also word of universities and research labs supposedly moving towards RISC-V platforms, given that Apple is not interested in supporting x86 long-term and dropped Nvidia's CUDA years ago. It all makes the computing market look rather fragmented, and will no doubt force individuals and companies who invest in software and hardware to scrutinise and evaluate their future purchases more closely.

Personally, I would like to stick with x86 as long as possible, because that is what I use and prefer. But the specs and efficiency gains of the M1 are not to be sniffed at. I know a good handful of creative professionals in my industry who would no doubt jump to the M1 Pro or M1 Max right now because it totally fits their requirements: as photographers or directors they shoot high-end HD/4K ads and commercials, do graphics, photo and/or video editing, move around a lot, and can be based practically anywhere. The new MBP fits the bill for them, though maybe not for, say, PC games or Windows software developers.
 
@c-o-pr,

We are congratulating Apple for what they have achieved. We're not bashing Intel, at least I'm not.
What Apple has achieved is fascinating. We have to give them that. I remember using a Macintosh for the first time in primary school, and being fascinated by this thing with a GUI, something I’d never used before.

All these years later, Apple has continued to refine its OS, modified it and put it on a phone and tablet, and now designs its own silicon, with powerful GPUs integrated alongside a comparatively tiny 10-core CPU. They implemented PCIe 4.0, Thunderbolt, mini-LED, and ProMotion. It is fascinating to see them continue to innovate.

We all know Intel has stumbled, and it was sad to see, at least for me personally.

But sometimes in life we have to hit rock bottom and get smacked in the face in order to learn the lesson. Intel was used to being on top, to being the default option: an entitlement mentality. Then it had to learn to play second fiddle, and that had to hurt, especially having its most prestigious customer leave and develop something comparable in performance that consumes far less power. Kinda ironic, too, that Apple's Johny Srouji used to work at Intel.

Gelsinger is now saying that Intel has to earn its customers' business by delivering superior solutions on time. I'm sure Intel has learned its lesson; Gelsinger at least seems to say as much in his many interviews. They appear to be coming back with Alder Lake and the future Lakes.

Apple went its own way and did something marvelous, something it couldn't do with what Intel had on offer. Maybe Intel will catch up and Apple will come back; maybe not. But the world is big enough for Intel, AMD x64, and Apple Silicon. It's not one or the other, it's both.
 
You raise good points. But such is the nature of the capitalist market. If the market demands ARM-based computing, then that is what will become dominant. x86 wasn't always dominant; market forces back in the day around Win 3.1 and Win 95 (and IBM's stumbles) cemented x86's place in history.

Maybe the market wants to move on! AMD and Intel will certainly fight like hell to prevent that from happening. But nothing lasts forever in this world. Given the intense pressure from the ARM invasion, I am predicting some wild designs from AMD and Intel over the next few years. Who knows, maybe one or both of the companies will ultimately add an ARM-based CPU to its portfolio as well (or at least include an ARM-based chiplet). Who knows where we'll be in 5-10 years?
 
AMD has publicly stated that it's been experimenting with ARM cores.
 
V'Ger viewed emotional beings such as humans as an infestation, because we harbor an emotional "dimension" to our existence that V'Ger regarded as illogical and in need of correction. But ultimately it was V'Ger that needed correction. It needed to "leap beyond logic".
That's 18-bit programming for you.

When we talk about need versus want, how do we define those terms? What constitutes need? And how can we calculate that need accurately? Who calculates their needs anyway to that degree? And what constitutes want?

Want is something that is not as bad as to need, but not as good as it would be to have. :)
 