
October 18th 2021 Apple Event: M1 Pro/Max MacBook Pros

Status
Not open for further replies.
Height of the menu bar is larger on notched MacBook Pros. Photo below shows 2020 13" Intel MacBook Pro on left and 2021 14" MacBook Pro on right.
Could this mean they might eventually enable touchscreen support on MBP and MBA laptops? The extra space makes it easier to select menu bar items with a finger. Never say never. Everyone thought that MagSafe, HDMI ports and the SD card reader would never appear on an Apple laptop again. Well, we have them now.

[Photo: Screen Shot 13.jpg — menu bar comparison]


Steve Jobs: "Nobody wants a big phone." Now we've got the 13 Pro Max.
"The stylus means we've failed." Now we've got Apple Pencil on iPads, etc.
 

My views regarding touchscreens on laptops can be summarized like this:
  • If the laptop screen flips 180 degrees such that the keyboard becomes hidden, then touch makes sense.
  • If the laptop detaches from the keyboard, then touch makes sense.
For the past six years I have had a conventional (non-flip, non-detachable) Windows laptop with a touchscreen. I have used that touchscreen exactly once in those six years. To me it just makes no sense.

It's also a reason why the Touch Bar was a failure. When we're using the keyboard, we are inclined to keep our wrists planted on the table or on a wrist rest. We try to minimize the movement of our wrists.

The keyboard + trackpad + mouse is a brilliantly effective human interface. Which is why even the iPad got a Magic Keyboard with a trackpad!

If the keyboard / trackpad / mouse are not present or readily available, then touch makes sense.

On a somewhat related topic, one reason Apple chose not to include Face ID on the new 14" and 16" MacBook Pros is that the Touch ID sensor on the keyboard has some advantages:
  • When we use Face ID on iPhone or iPad Pro to authenticate a purchase or a login, we first have to double-click a physical button.
  • Because Face ID is built into the notch (iPhone) or the bezel (iPad Pro), the old round home button could be eliminated to free up screen real estate.
But on the MacBook Pro:
  • The Touch ID sensor sits on the keyboard and takes up no screen real estate.
  • Our hands are already on the keyboard.
  • Instead of double-clicking a button as we do on iPhone and iPad to invoke Face ID, we simply move one finger to the Touch ID key on the keyboard. We can even double-click the Touch ID key.
With Face ID on a MacBook Pro, the process would be:
  • To authenticate a purchase or login, first press some key on the keyboard.
  • Then look up at the Face ID sensor.
But...
  • The dedicated Touch ID key on the keyboard serves both purposes without reducing screen real estate.
 
Yes, I remember Intel took a dig at Apple for not including Face ID on laptops. If I understand correctly, Face ID uses the TrueDepth camera's infrared dot projector for depth analysis when building the facial topography, so that set of electronics would need to be built into Apple's laptops too. Maybe Apple have decided it's not as useful in a desk-based system as in a mobile one, or that they simply need more time to implement it. Mobile phone reviews often point out that Apple's Face ID method is a lot more secure than competitors' approaches that only use the camera and available light.

Touch ID on an Apple keyboard makes so much more sense all round. :thumbup:
 

Unfortunately Steve is not there to guide his company any more. Which is beginning to tell.

It's salutary to read/watch many iPhone reviewers saying how much they like the iPhone 12 mini or iPhone SE over their bigger siblings. And a $99 pencil is something easy to lose, so I see his viewpoint. A pencil or stylus also turns the iPad into a "crossover" device impinging on the territory of Wacom and the like. However, I admit, I can see the attraction. iPads are much nicer.

I suspect the one change we've seen recently that he may like is the new ARM chipsets. He did like ruffling feathers!
 
On the subject of the stylus, Steve Jobs was referring to its use as the primary method of interaction. The Apple Pencil is not a primary method of interaction; instead, it serves use cases that require more precision, such as (a) drawing, (b) signing, and (c) handwritten notes.

On the subject of bigger screens, Jobs and Apple resisted the pressure to follow Samsung's lead with the Galaxy Note, but eventually changed their minds.

On the subject of smaller screens, Jobs and Apple resisted the pressure to follow the lead of smaller Android tablets, but eventually changed their minds and produced the iPad mini.

Can touchscreen come to Mac? It can, but there has to be a viable need or market. It has to make 'enough' sense first. Apple Pencil made sense; bigger iPhone made sense, smaller iPad made sense. Does touchscreen on a conventional laptop make sense?

The failure of the Touch Bar suggests that, at this time, a touchscreen on conventional laptops does not make sense.
 
Does touchscreen on a conventional laptop make sense?

I've only tried one on a Chromebook and didn't like it. As you pointed out earlier, it's fine if the device can be folded into tablet form but even then the quality of capacitive touch is unlikely to be as good as an iPad alone. The Chromebook I tried was positively "squishy"!

As for day-to-day use, I'd prefer mouse and keyboard every time. Drag-and-drop and copy-and-paste are so much easier and quicker, not forgetting the muscle-memory keyboard shortcuts that let you do complex editing in an instant with traditional methods.

With the new ARM silicon and associated processing power, I doubt any of that would change.
 
Someone once said, "Just because we can do a thing, it does not follow that we must do that thing." :)
 
Ars Technica reports on the latest Apple Silicon leaks:
  • Jade for the new Mac Pro = 2 × M1 Max
  • M2 finalised and about to enter trial production
  • M3 already in the works
At that pace, Apple should perhaps beware of the Osborne effect.
 
Monterey and Windows 11 on ARM. Ya ya, couldn't help myself.
  • Took all of 15 minutes to download and install Windows 11 ARM version.
  • It runs surprisingly well (14" MacBook Pro with M1 Max and 32GB LPDDR5).
  • Currently updating itself to the latest Insider Preview, so I'm watching Max Tech's latest YouTube video via Firefox's picture-in-picture mode, which is much better than Safari's.
[Screenshot: Monterey and Win 11.jpg]
 