
New Apple Silicon Macs: MacBook Air, MacBook Pro, and Mac Mini

Status
Not open for further replies.
I am highly skeptical when I look at all these claims.
Working in the semiconductor industry, I know quite well what is going on at the technical level: there is no doubt that Intel has been underwhelming in its R&D efforts over the past few years, to the point where it has fallen behind the contract foundries. That said, I find Apple's move bold and extremely risky.

What made me move to a Mac was... Intel: the move to x86, and the hardware compatibility/upgradability with a broader, cheaper market. Going to ARM reminds me of where Apple was with its RISC/PPC hardware architecture, which made them an isolated, elitist, closed platform. It fits the brand's strategy, but it's one that had limited success in the past. What Apple is launching here is a program to do to the PC market exactly what they have done with the mobile phone market. It may or may not work. Even on the Intel platforms you could see it happening, with the increasing restrictions on upgrading memory and storage over time. These moves motivated more people to create Hackintoshes. I have three MacBooks in my household only because:
1. I can have one powerful and flexible working hack, as I couldn't find anything equivalent in the Mac lineup, and nothing close to the price I would be willing to pay even with some technical compromises.
2. They are Intel-based and offer a level of compatibility with Windows PCs and Linux.
3. They are a great bridge to iOS devices.

This move takes out 1 and 2, and may even make me question my iOS bias (3); I fear for them that many will come to the same conclusion. The bet for them is that ARM will displace x86 even in the desktop/laptop segments, using an SoC-based, short-lifetime, non-upgradeable platform like what the mobile space has become.
 
I would wait until the new 2021 Mac mini models with more cores and I/O come out. There's no guarantee of this, but I do think it will happen. Apple has sold a lot of the 6-core Intel models over the last two years. The Mac mini they announced on Tuesday isn't likely to be the only option; it's more like a developer's-kit version for consumers. It's the one desktop they could get ready quickly enough to meet the "end of 2020" deadline.
Yes, I think you're very right about this, but I believe that they moved fast not so much to meet the deadline as to quickly get something inexpensive (they even reduced the price) into the hands of developers, so they can start porting their apps. Like you said, it's more like a DTK for the general public.

Also keep in mind that ARM's new Armv9 ISA seems to be around the corner, so for anyone not in a hurry, it might be worth waiting for new SoCs that support it.

First benchmarks are out and looking good.
I think that Cinebench R23 scores (it now supports the M1) would paint a more real-world picture; Geekbench is probably too bursty and synthetic to draw too many conclusions from.

What Apple is launching here is a program to do to the PC market exactly what they have done with the mobile phone market. It may or may not work. Even on the Intel platforms you could see it happening, with the increasing restrictions on upgrading memory and storage over time. These moves motivated more people to create Hackintoshes.
I share the same sentiment.

Everything is soldered down now: no eGPUs, no modularity, no upgrading, no other operating systems. Everything is being progressively locked down tighter and tighter, with less and less freedom to use your device differently than Apple intended. Need more RAM? You can't; you've got to get one of those new ones. More GPU power? Nope. Windows/Linux? Nope. A faster CPU/SoC? (Ha, good one.) More storage? Well, you can either have it dangling outside or, you guessed it, get one of those new ones.

I don't really see why they dropped eGPU support. The drivers and the pro apps supporting those GPUs are there; why take that performance boost away? The M1's 2.6 TFLOPS are impressive for an iGPU, but nowhere near the 10 TFLOPS of a 5700 XT, the 20 TFLOPS of a 6800 XT, or even the 36 TFLOPS of a (heaven forbid) RTX 3090. It might just be that the drivers are not ready yet and eGPU support will come in the near future, but I'm not so sure. I think Apple will make and support only their own discrete/proprietary (not to mention overpriced) GPUs for the Mac Pro and for eGPU use.
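For reference, peak FP32 figures like these come from a simple formula: ALU count × 2 FLOPs per clock (a fused multiply-add counts as two operations) × clock speed. A quick sketch in Python, assuming the commonly reported M1 GPU configuration of 8 cores × 128 ALUs at roughly 1.278 GHz (the clock is a community estimate, not an Apple-published figure):

```python
def theoretical_tflops(alus: int, clock_ghz: float) -> float:
    """Peak FP32 TFLOPS: each ALU retires one FMA (2 FLOPs) per clock."""
    flops_per_alu_per_clock = 2  # fused multiply-add = 2 FLOPs
    return alus * flops_per_alu_per_clock * clock_ghz / 1000.0

# Assumed M1 GPU config: 8 cores x 128 ALUs, ~1.278 GHz
print(f"M1 iGPU: ~{theoretical_tflops(8 * 128, 1.278):.1f} TFLOPS")  # ~2.6
```

The same formula (with shader count and boost clock swapped in) reproduces the marketing TFLOPS numbers for the discrete cards mentioned above.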

The Mac is becoming a locked-down, fixed, throwaway appliance, akin to an iPhone, and that's a real shame. This might not be a concern for most users, who are content with such an appliance and don't care how or why it works (especially if it performs well), but I think it's a step (or two) backwards for enthusiasts. Maybe I'm overreacting and a more affordable/upgradeable Mac Pro will come along, although with rumors saying it'll be half the size, I'm not very optimistic.
 
Use cases differ, that's why. There will always be some option for people who need to make things, develop things, design things, etc.

At the end of the day, somebody has to write the software for the Mac, iOS, etc. Somebody has to design the chips. These tasks mostly require more than low-end consumer hardware.

Maybe all those people go Linux. Maybe. A long time ago, in a galaxy far, far away, people used Atari STs to develop Genesis console games. So using another platform is not impossible.
 
I don't really see why they dropped eGPU support.
I agree. eGPU support is one of the hallmarks of Thunderbolt. It could be that this initial M1 chip has limited PCIe lanes: the MacBooks only support one external monitor, and the Mac mini doesn't have a 10 Gbps LAN option. It seems the number of PCIe lanes in the M1 is very limited.

I'll be watching very closely to when Apple updates the rest of the line to see if they add more I/O. If not, then I don't know that I'm going to upgrade.
 
I wonder what will happen to Mac applications in two or three years...
Will developers then create two types of Mac apps, one for Intel x86 and one ARM-based?
I remember the original Rosetta, back in 2006, only lasted for three or four years...
So in 2023 we might have to trash our Macs and get those new ones in order to run new applications?
This new generation of Macs doesn't only mean the end of the Hackintosh era, does it?
 

Developers/publishers can keep shipping app updates as universal binaries (including code for both chips) for as long as they want; for relatively simple apps, it's a recompile in Xcode with a flag ticked.
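On disk, a universal app is a Mach-O "fat" binary: a small header carrying the magic number 0xcafebabe, followed by one slice per architecture (you can also inspect a binary with `lipo -archs` or `file`). A minimal sketch of that header check in Python; the app path in the comment is hypothetical:

```python
import struct

FAT_MAGIC = 0xcafebabe  # universal (fat) Mach-O, stored big-endian
FAT_CIGAM = 0xbebafeca  # byte-swapped variant

def is_universal(header: bytes) -> bool:
    """Return True if the first 4 bytes carry the fat-binary magic."""
    if len(header) < 4:
        return False
    (magic,) = struct.unpack(">I", header[:4])
    return magic in (FAT_MAGIC, FAT_CIGAM)

# e.g. (hypothetical path):
# with open("/Applications/SomeApp.app/Contents/MacOS/SomeApp", "rb") as f:
#     print(is_universal(f.read(4)))
```

A single-architecture (thin) Mach-O starts with a different magic (e.g. 0xfeedfacf for 64-bit), so this check distinguishes universal builds from arm64-only or x86_64-only ones.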

Rosetta was for the apps that didn't get updated; it's what allowed old software to run without universal binaries. Dropping it meant you couldn't use the old apps on newer machines once you updated the system, so you basically lost the ability to get the newest whiz-bang system features (and eventually security updates) on new machines. The software we had to stop using was really old software; sometimes disappointing, of course.

So it was really about five years (I think) before the old machines stopped getting Apple's security updates, but (from personal experience) by then the old machines were REALLY long in the tooth and slow. (I'm sure people who had top-of-the-line PowerPC machines don't fully agree with this assessment, of course.)

Of course, some developers want to stop updating for the older systems, not because the old ones are not viable, but because as they add new features in the system and things the newer machines offer that the old ones don't, managing code bases gets more and more complicated. Eventually they stop providing versions that support the older machines - and also some newer software comes out that depends on the newer hardware and you can't get that if you're running older gear.

So my _guess_ is that you can keep running your existing equipment (assuming relatively recent) for about five years without too many significant compromises. To the extent you need the newest whiz-bang features, less; if you are less demanding about new features and willing to do some work to find solutions, maybe longer. (With the x86 platform, extremely high chance you can put an Ubuntu variant on it and use it for a really long time)

Of course during that time you may decide to upgrade to a newer machine and relegate the old gear to some basic use - server or whatever. But that's a decision about what you get from the new gear to make it worthwhile. (I still have a powerbook g4 that runs fine but too slow to be useable - even so, it ran as a simple server for a really long time).

But if you want the latest and greatest photo and video editing software on macos, at some point it's going to make sense to jump. Far enough away there's not much sense stressing about it now.

Keep in mind - compared to the switch to x86 from PowerPC, the software universe has changed a LOT. Far more opensource available, developer tools, etc - a lot of the software that couldn't run without rosetta had been developed on much older development platforms/APIs and couldn't be easily brought up to date. And a very large installed base of gear out there. The PowerPC computers were so different from market-standard (tools available so much more limited) that support of any kind was dropped a lot faster, exacerbated by the fact that they were much less capable (performance wise).

So to boil it down: if you want the latest and greatest, feature- and performance-wise, eventually your x86 gear will slowly become less relevant (possibly more quickly). But you should be able to get quite a lot of use out of it in the interim. I could be wrong, in that Apple is in a different situation and relatively more aggressive about dropping older stuff and APIs, so maybe this gets accelerated; but probably not by that much.

I've plenty of mac/hack gear, too much really. Only the oldest laptop is on the 'stop using soon' list (and it's in its death throes and not used much anyway, way off the supported list). Another may get relegated to simple browsing/office use when I pull the trigger on a new laptop - but it's so underpowered and underspecced any other use has been a pain for some time. Not worried about the hacks for several years. YMMV.
 
My guess is that the decision to stack so many components onto a single SoC is to reduce manufacturing costs and reduce points of failure.

Apple has been trying to eliminate MHz and GHz as a basis of comparison for years. I don't find it surprising at all that they omit any mention of it here. However, looking at the Geekbench results, it appears the M1 is running at 3.2 GHz. Assuming this is true, they are achieving really great IPC from these CPUs!
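One way to see that, without knowing microarchitectural details, is to normalize single-core benchmark scores by clock speed. The numbers below are illustrative, roughly in the ballpark of early published Geekbench 5 results, not exact figures:

```python
def points_per_ghz(score: float, clock_ghz: float) -> float:
    """Crude per-clock efficiency proxy: benchmark points per GHz."""
    return score / clock_ghz

# Illustrative single-core numbers (approximate, not official):
m1 = points_per_ghz(1700, 3.2)   # M1 at ~3.2 GHz
cml = points_per_ghz(1400, 5.3)  # a ~5.3 GHz boost-clock Intel part
print(f"M1: {m1:.0f} pts/GHz vs Intel: {cml:.0f} pts/GHz")
```

On numbers like these, the M1 does roughly twice the work per clock, which is what "great IPC" means in practice: it can match or beat chips running 2 GHz faster.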

I've been doing some more reading on this, and I think my speculation yesterday was pretty off-base. I reckon they will be able to scale it up by adding a lot more cores and processing. There are complications in adding memory and some other bits; they can deal with those by, e.g., moving memory off-package.

Some of the limitations in this first-gen SoC (like memory) are likely driven more by adapting the iPhone/iPad A14 and going for scale and speed to market than by fundamental technical issues.

The next iteration of this (highest probability is the higher-end MacBook Pros and some of the iMacs) could be absolute stonkers.

I'm personally a bit more interested in the additional features this may bring, especially to laptops, but I understand they decided to keep the chip transition to proven designs/formats.

My only outright disappointment is that they haven't brought cellular internet directly to the laptop line. (I'd also feel more comfortable with more memory, but that's balanced against how cheap I am.)
 
This new Macs generation does not only means the end of the Hackintosh era, does it?
Good question! It might not only be the end of the Hackintosh, as many claim, but also the end of all Intel Macs, including the high-priced ones. So would anyone who needs to care about money invest in an Intel Mac from today?
 
So would anyone who needs to care about money invest in an Intel Mac from today?
I think so. It's going to depend on the individual use case. People who dual-boot with Windows and use that OS often will benefit from building an Intel-based hack or buying an Intel Mac. They can keep using macOS in a dual boot for as long as their machine keeps working.

Not everyone upgrades to the latest macOS release as soon as it comes out. Audio professionals are one example of this. Many are still using Mojave to this day and plan to stay there for some time. Many of their plugins just won't work right in Catalina or Big Sur. 32-bit apps won't work either. I've got various computers running many different legacy versions of OS X going back as far as Mavericks from 2013. Those don't cease to function just because Apple stops supporting Intel in a new macOS release 2-3 years from now.

For people such as developers, who must always have the latest Xcode, it would probably be better to just buy a new M1 Mac for their work. Better yet, wait for the refreshed and redesigned 16" MBP in 2021. They should max out the RAM and graphics at time of purchase to get the longest useful life out of it. Better resale value, too.
 
Tim Millet from Apple talks about Unified LPDDR4 Memory and the M1 Chip

 