
Gigabyte Z490 Vision D (Thunderbolt 3) + i5-10400 + AMD RX 580

Well, I too, like many others, have a new 14" M1 Max MacBook Pro ordered, and it is scheduled to be delivered in very early November.

I like the hacks I've built, but they have been pretty high maintenance, and while part of me enjoys the challenge of making them work, another part of me feels that it distracts me from really focusing on my work.

However mixed a blessing that sounds, the $$$ for "real" Macs keeps going up, and that too is a decidedly mixed blessing.

So, for me, I will be especially curious about two issues: first, just how good is my new laptop; and second, when the "possible" new Mac Pro with Apple Silicon comes out next year, just how much $$$ will Apple charge us for it. If it is affordable, or relatively so, that might just say to me that "ready-to-use" trumps the potential savings that might come from a hackintosh.

And, this is strictly my opinion, but after all, time spent doing and re-doing EFIs is also a cost, is it not? I think my time is worth something, after all.

EDIT: @CaseySJ: FYI: I have re-installed my Fenvi HB1200 in my Z490, and it's perfect (so far). It works natively (best I can tell), and best of all, it only needs two antennas, which I also like very, very much. I just needed BT for a short project, and BT works great in this hack. I will wait for the Intel BT drivers to mature for now.

To Each His Own.

Thanks, @CaseySJ!

To follow on these thoughts, I’ve always liked having a real Mac as well as a hack; it satisfies the ‘it just works’ aspect of the real Mac, as well as scratching the geek itch of making a Hackintosh. You get two Macs for the price of 1.5, and the real one is a great backup if something goes wrong on the hack… just my $0.016 (exchange rate)
 

Interesting.... Right now, my "real" Macs are primary, and the hacks are backups (betas, mostly). My thinking is that I need to wait for the Apple Silicon releases next year to see how much they will cost. If they are reasonably affordable (I'm hoping the "small" Mac Pro will come in at about $8-12K or thereabouts), I would keep one of the hacks as a dual-boot (Windows and Mac, maybe even Linux) backup, but I just can't see using a hack as a production machine. That's a risk I just don't want to take. Particularly for video and Mograph (After Effects, etc.), that's just a no-no, at least for me. Audio only, maybe, but Mograph and most video applications, no way.

Another thing: I don't know if this affects you or not, but for me and many other video and audio professionals, you have to think about license costs. Each "Mac," real or not, typically needs licensing for most software. That's also a cost to consider. Yes, you can ignore it in some cases (like Apple's software, for the most part), but most pro software limits installs to 2-3, and then you must pay more if you want to install that software on another machine. So, there's that.
 
@CaseySJ @mm2margaret Regarding the future of Hackintosh, what we know is that Intel appears to be coming back. Alder Lake appears to be a beast. But what we don’t know is whether Alder Lake, Raptor Lake, and the future lakes will continue to work with macOS. Theoretically the future lakes should be able to boot macOS just fine, but we don’t know whether the performance increases that come from mixing and matching multiple x86 core types in the same chip will carry over to macOS. For example, Windows “allegedly” needed a new version with a new thread scheduler to take advantage of Alder Lake’s hybrid architecture… so even if Alder Lake is a performance beast on Windows, it could be gimped on macOS. Secondly, as Intel adds new compute tiles to its future architectures, there’s no guarantee that they’ll work in macOS. As an example, Intel Xe graphics in Tiger Lake and Rocket Lake are much better than their Gen9.x counterparts, but macOS has no drivers for them. This will likely only get worse with time.

And if Apple never uses newer x86 processors in new Macs again, then sooner or later the inevitable will happen, and it will stop supporting x86 builds altogether on new versions of the OS. Monterey was the beginning of a ruthless push to boot x86 from the Mac. Many Intel models that worked fine with Big Sur got cut off for Monterey. I only foresee that trend accelerating for all but the latest Intel Macs. What could happen, however, is that a Sapphire Rapids Xeon or a Meteor Lake Xeon could convince Apple to use x86 in the next Mac Pro. But that’s speculative, given that Apple could also scale up the M1 Max design to add many more CPU and GPU cores to fit in the power envelope of a desktop chassis with a large power supply and appropriate cooling.

Our hobby may be on the way out. Very sad to see, I’ve been hacking for years. But such is life, the only thing promised is change.
 
@dehjomz - Can't disagree much with that analysis. Seems to be dead on as far as I'm concerned... The only thing I would say is that I sincerely doubt Apple would ever use Intel CPUs again. Remember, the cut-off for Intel CPUs in new Macs will come at the two-year time frame that Apple quoted last year, with perhaps one or two more years of software support (only) for Intel-based Macs after that, and that would be it. This would be very similar to what they did with the Power Mac back when they switched to Intel, and I think they will follow that same model with the switch away from Intel.
 
Yeah. I agree. But the difference from the last time Apple jumped ship is that back then, IBM itself was also in decline. This time, on the other hand, Intel, if you believe Gelsinger, seems to be on the cusp of a renaissance in terms of features offered and performance. Intel is first to PCIe 5 and DDR5, and is the first x86 vendor to offer a hybrid architecture. If Intel can ramp its more advanced nodes (Intel 4, Intel 3, 20A, and so on) to volume production on time, then its chip architects can go wild with their designs.

Case in point, Gelsinger announced on the 3Q 2021 earnings call that Intel has powered on its Meteor Lake compute tile built on Intel 4. He also said “I am happy to share that Intel 7, Intel 4, Intel 3, Intel 20A, and Intel 18A are all on or ahead of the timelines we set out in July.” This is the first time in years Intel is not slipping further behind. If Intel really can design more performant chips than Apple could do on its own, then that may influence Apple’s calculus. IBM was never really relevant again, whereas Intel appears to be coming back.

But Intel has a long way to go, and it may never match the performance per watt that Apple can deliver, at least using x86. Therefore, I tend to agree that the end of x86 support on future versions of macOS may be near.

Honestly, if Apple were to commercialize Apple Silicon for the broader ecosystem, that would force the hand of Intel and AMD. This has to make both companies nervous. I know Apple won’t give up its competitive advantage of being the only entity that can use Apple Silicon, but I’d imagine that Asus, Dell, Lenovo, HP and others would love to be able to use Apple Silicon in their products. I think that there is tremendous opportunity for an ARM CPU vendor to emerge and proliferate. Especially for laptop chips.
 
I’m an audio for film/tv professional and have been working on hacks in my edit suite since 2016, with my real Mac being a laptop for when I’m on the mix stage. It took a while to ‘trust’ the hack system but after a year or so of increasingly longer edit stints on it, I had no issues and made it my daily driver. Re: licensing, 99% of my software has the option of iLok authorization, so I just swap the iLok key to the system I need to use. I don’t do anything graphics related really, and for me, Avid Video Engine is always a weaker link in my chain than any sort of hackintosh related graphics issues. It’s shocking how bad it is.
 
Audio, sure. Graphics, Editing and Mograph? For production? No.....

Almost all of my software has a limit on installs, but it is graphics oriented. I have Pro Tools and use it from time to time, but all of the other stuff, Adobe, DaVinci, Nuke, Red Giant (Trapcode), Cinema 4D, etc., and yes, even Media Composer, all have various methods of limiting installs.

Despite all the advances Adobe, DaVinci, and Apple have made, over 90% of Oscar nominees are still cut on Media Composer (Avid), and most network TV is also still cut on MC.
 
Yeah. I agree. But the difference from the last time Apple jumped ship is that back then, IBM itself was also in decline. This time, on the other hand, Intel, if you believe Gelsinger, seems to be on the cusp of a renaissance in terms of features offered and performance. Intel is first to PCIe 5 and DDR5, and is the first x86 vendor to offer a hybrid architecture. If Intel can ramp its more advanced nodes (Intel 4, Intel 3, 20A, and so on) to volume production on time, then its chip architects can go wild with their designs.
Intel has had -- and continues to have -- a profound impact on the semiconductor industry. They have a lot to be proud of. They stumbled on the march forward from 14nm, but now they have fully adopted EUV lithography and are engineering a comeback. I want Intel to regain its footing as the world's preeminent semiconductor manufacturer. I also applaud its decision to enter the foundry business and provide manufacturing capacity that Apple, AMD, Nvidia, Qualcomm, and everyone else can tap in to. TSMC and Samsung should not be the sole gatekeepers of the world's chip production.

Case in point, Gelsinger announced on the 3Q 2021 earnings call that Intel has powered on its Meteor Lake compute tile built on Intel 4. He also said “I am happy to share that Intel 7, Intel 4, Intel 3, Intel 20A, and Intel 18A are all on or ahead of the timelines we set out in July.” This is the first time in years Intel is not slipping further behind. If Intel really can design more performant chips than Apple could do on its own, then that may influence Apple’s calculus. IBM was never really relevant again, whereas Intel appears to be coming back.
Developing and debugging process technology nodes is essential. If Intel can demonstrate a reliable ability to ramp up successive technology nodes (Intel 7, 4, 3, etc.), then they can earn the foundry business of all the major fabless companies.

Whereas TSMC is purely a foundry (they have no chip designs of their own), Samsung has both a foundry business and an IDM business (Integrated Design and Manufacturing) for their own chips. By entering the foundry business, Intel would be following Samsung's example.

But Intel has a long way to go, and it may never match the performance per watt that Apple can deliver, at least using x86. Therefore, I tend to agree that the end of x86 support on future versions of macOS may be near.
x86 has dominated the PC industry for decades. I'm not sure that x86 can continue to evolve and compete on performance per watt. Chip architectures should evolve in fundamental ways. But the sheer mass adoption of x86 over the years seems to have created an equally massive inertia. Should Intel continue to push x86...?

Honestly, if Apple were to commercialize Apple Silicon for the broader ecosystem, that would force the hand of Intel and AMD. This has to make both companies nervous. I know Apple won’t give up its competitive advantage of being the only entity that can use Apple Silicon, but I’d imagine that Asus, Dell, Lenovo, HP and others would love to be able to use Apple Silicon in their products. I think that there is tremendous opportunity for an ARM CPU vendor to emerge and proliferate. Especially for laptop chips.
This is why Nvidia's planned acquisition of ARM makes me nervous. If the new PC computing standard becomes ARM, should it be controlled by a company that actually competes with those who would license ARM?
 
@CaseySJ et al:

I totally agree with your sentiments. I am not anti-Intel at all, except that right now they are stumbling. It's to everyone's benefit (except China's) to have Intel successful, and I really hope they get this straightened out. I also agree with your statement "....TSMC and Samsung should not be the sole gatekeepers of the world's chip production....."

Given China's recent belligerence, TSMC just might not be around much longer.....or might not be as "cooperative" and "open" as they once were.......
 
Excellent points Casey! I completely agree.
 