
October 18th 2021 Apple Event: M1 Pro/Max MacBook Pros

Status: Not open for further replies.
Delivery expected on Tuesday the 2nd. Nice to see many glowing reviews by those who purchased these items to review on their channels. They're less likely to suffer from confirmation bias than people like me who bought one for personal use.

Update: Mine has been shipped via FedEx. UPS is apparently encountering flight delays to some destinations.
I think I've read somewhere that the Max chip performs slightly worse in a 14” chassis than in a 16”, the 16” being bigger and having somewhat better cooling. Did anybody do a comparison test between them?
 
Reasonably, Intel, AMD and Nvidia are all safe because Apple will not take up 100% of the CPU and GPU market. Apple has neither the manufacturing capacity, nor even the will to compete at the lower end of the market. The high-end GPU market is also safe from Apple Silicon, for now.

But Apple Silicon will take some wind out of the sails of x86. And if that example inspires further architectures, possibly with their own OS, computing may get exciting again. It would be like rolling back to the 90s, when there was Microsoft on x86, Apple on 68k and then PowerPC, NeXT on 68k and then x86, Sun on SPARC, SGI on MIPS, BeOS on PowerPC and then x86…
It depends more on the software library than on the speed of those new Apple chips. Right now they are strongly focused on YouTubers and other "Hollywood" types. All I keep hearing is Final Cut, ProRes, and so on. Their software is heavily optimized to fly on those chips, and that works great, I admit. I am also a content creator, but my field is packaging production, which involves design, prepress, packshot illustrations, and 3D models. Ludicrous speed isn't that important to my work, nor is shaving a couple of minutes off render times. Most of my work starts on an empty Illustrator or Photoshop artboard, and speed is measured by how fast I can turn inspiration into something. I just hope they will focus more on creating a better all-around machine than on the narrow view they seem to have right now.
 
Reasonably, Intel, AMD and Nvidia are all safe because Apple will not take up 100% of the CPU and GPU market. Apple has neither the manufacturing capacity, nor even the will to compete at the lower end of the market.
If Apple told TSMC it needed three times the production, it would get it; if Apple told Foxconn it needed three times the production, it would get it. So yes, they have the manufacturing capacity.

The high-end GPU market is also safe from Apple Silicon, for now.
You must be assuming they blew their wad, but in their second offering they are putting on-chip a GPU that is as powerful as a Vega 56, and through optimization they are likely leveraging every ounce of those 10 TFLOPS, whereas the Vega probably isn't. I am pretty sure that Apple has designs on a 64-core GPU that would kill the 6900 XT and likely give Nvidia's 3090 a run for its money, and maybe beat it, because of optimization.
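For scale, here's a rough back-of-the-envelope comparison (a sketch, not a benchmark). The ALU counts and clock speeds below are assumptions pulled from public spec sheets and third-party reporting rather than anything in this thread, and theoretical FP32 TFLOPS says little about real-world results:

```swift
// Rough theoretical FP32 numbers, assuming FMA counts as 2 FLOPs per ALU per clock.
// ALU counts and clocks are assumptions from public spec sheets and third-party
// reporting, not from this thread; real performance depends on far more than this.
func tflops(alus: Double, clockGHz: Double, flopsPerClock: Double = 2) -> Double {
    alus * clockGHz * flopsPerClock / 1000.0
}

let m1Max32   = tflops(alus: 32 * 128, clockGHz: 1.27)   // ≈ 10.4, matching Apple's quoted figure
let rumored64 = tflops(alus: 64 * 128, clockGHz: 1.27)   // ≈ 20.8, if a 64-core part held the same clock
let rx6900XT  = tflops(alus: 5120, clockGHz: 2.25)       // ≈ 23.0 at boost clock
let rtx3090   = tflops(alus: 10496, clockGHz: 1.70)      // ≈ 35.7 at boost clock

print(m1Max32, rumored64, rx6900XT, rtx3090)
```

On paper that puts a hypothetical 64-core part at roughly double the M1 Max figure, still short of the 3090's theoretical peak, so any win really would have to come from optimization and efficiency rather than raw throughput.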

But Apple Silicon will take some wind out of the sails of x86. And if that example inspires further architectures, possibly with their own OS, computing may get exciting again. It would be like rolling back to the 90s, when there was Microsoft on x86, Apple on 68k and then PowerPC, NeXT on 68k and then x86, Sun on SPARC, SGI on MIPS, BeOS on PowerPC and then x86…
With the exception of Sun/Solaris, the rest of them are dead, were very niche in the first place, and did not encourage competition.
 
You must be assuming they blew their wad, but in their second offering they are putting on-chip a GPU that is as powerful as a Vega 56, and through optimization they are likely leveraging every ounce of those 10 TFLOPS, whereas the Vega probably isn't. I am pretty sure that Apple has designs on a 64-core GPU that would kill the 6900 XT and likely give Nvidia's 3090 a run for its money, and maybe beat it, because of optimization.


 

Not really impressed that it beat a 5700 XT, since the 5700 XT has less throughput. The fact it was on battery power is cool, but it does not really influence me.
 
According to Ars Technica, ProMotion is essentially Apple's implementation of FreeSync.
Apple doesn't call Monterey's variable refresh rate implementation "FreeSync," but that's essentially what it is, and monitors marketed as having FreeSync support will support variable refresh rates when used with compatible Macs.

[…]

Apple doesn't document anywhere which Macs support variable refresh rates, so I've tried to do it myself. Based on my own testing plus what we know about which GPUs support FreeSync and variable refresh rates in Windows, here are all the Macs that should be able to support variable refresh rates in Monterey:
  • All Apple Silicon Macs.
  • Nearly all iMacs, 15- and 16-inch MacBook Pros, and iMac Pros with AMD Radeon GPUs, but not the 2013 Mac Pro, which uses AMD GPUs that predate FreeSync support. This is based on AMD documentation about which of its GPUs support FreeSync.
  • The 2020 Intel MacBook Air and 13-inch MacBook Pro with 10th-generation Intel Core CPUs and Intel Iris Plus GPUs. Formerly codenamed Ice Lake, these were the first Intel GPUs to support adaptive sync (the GPUs in Intel's 11th-generation chips support it across the board).
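If you want to see what your own machine reports, here's a minimal Swift sketch using NSScreen.maximumFramesPerSecond (available from macOS 12). It only prints the maximum refresh rate macOS exposes per display; whether variable refresh actually engages still depends on the GPU/monitor combinations listed above:

```swift
// A minimal sketch: list the refresh rate macOS reports for each attached display.
// NSScreen.maximumFramesPerSecond requires macOS 12 (Monterey) or later.
import AppKit

if #available(macOS 12.0, *) {
    for screen in NSScreen.screens {
        print("\(screen.localizedName): up to \(screen.maximumFramesPerSecond) Hz")
    }
}
```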
 
I think I've read somewhere that the Max chip performs slightly worse in a 14” chassis than in a 16”, the 16” being bigger and having somewhat better cooling. Did anybody do a comparison test between them?
The cooling system on the 16" is more effective, which makes sense because there's more room for air intake and exhaust. The cooler itself might also be enhanced in some way for the M1 Max models.

But the performance of the 14" is still beyond my expectations. This is a system I can happily hold onto for a long time. It doesn't matter if the 14" is a few percentage points slower than the 16"! This is also why I have never overclocked my processors, although the reasoning there is more involved.

Overclocking gives you maybe 2% faster benchmarks for 50% more energy consumption. If I exaggerate, so be it. :) In a similar vein, the smaller dimensions of the 14" -- which make the machine more portable -- easily outweigh the minor and imperceptible differences in speed.

I can perceive the smaller size, but I cannot perceive the performance difference...
 
I can perceive the smaller size, but I cannot perceive the performance difference...
From what I've seen in videos and the specs, the 16" seems much too big and bulky to comfortably carry with you to, say, a coffee shop or to take on a flight. It reminds me more of the 17" Intel MBPs of the late 2000s. Not very mobile-friendly. It's really meant to sit on a video editor's desk full time or to be part of a recording studio, etc.
 