Hackintosh 2022 and beyond - Is there any point now?

Aside from the cost savings compared to a real Mac, the most interesting post I read recently regarding hackintoshes is that some people are building them during the current energy crisis to double as their central heating at home!

In the building where I reside, the cost of heat is included in the monthly maintenance fee. However, we pay for our own electricity and air conditioning, so having cooler running hardware is far more beneficial to me.

Also, with all the heat generated by those Intel systems come fans and coolers to prevent them from combusting. The peace and serenity that comes with a silent Apple Silicon Mac is priceless.
 
This doesn't mean that YOUR risk is high! Your exposure to ferocious hackers depends on a lot of factors, starting with what you do with your computer: if you don't connect to the Internet (say an audio/video production machine), the risk is nil. :mrgreen:
The security risk is mainly for corporate computers, but even there, most hacking gets in through user error or social engineering of someone on the network.
I've been running obsolete Macs for thirty years now, nothing ever happened. ;)
As long as you can update the browsers, most of the risks for a basic user are covered.
False. If your computer is exposed to the internet, you're at risk. We saw this with Windows XP, for example: many non-corporate machines were hit by worms and other malware and swept into botnets and other bad situations. But it's your life. If you want to put your credit card and private information on a machine with an old, unsupported OS chock full of vulnerabilities and no support from the vendor, go right ahead. Good luck to you.
 
What I find mystifying is that everyone internalizes these epic design failures as something they're doing wrong.

Bringing up XP is a pointless anecdote because that SW didn't even have a firewall when it was introduced, and it was being attached to cable modems, which aggregate on the last mile like a LAN, providing an agar for the growth and propagation of worms. But that was more than 20 years ago. The lore at the time was that an out-of-the-box XP machine was compromised within 15 minutes of being plugged into a cable modem, but that story also took on the status of urban legend.

I suggest looking at it this way: the vulnerability scope of a PC versus common exploits creates a window of opportunity that slides along with releases.

We know certain simple truths, like a PC has to exist before it can be exploited. Similarly, a PC that's retired is no longer vulnerable. Why do I make this stupid point? To show we can guess at something like a bathtub curve of safety: most up-to-date systems have the latest patches to mitigate known exploits, while very out-of-date systems are too obsolete to be targeted.

This is too simplistic; I'm just waving my hands at a way of thinking... not claiming that anyone's risk actually follows such a paradigm.
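If it helps to picture the hand-waving, here's a toy sketch of that bathtub idea in Python. Every number in it is invented purely for illustration; it's the shape of the argument, not anyone's actual risk model.

```python
# Toy model of the "bathtub curve of safety" idea above.
# All numbers are invented for illustration; this is not real risk data.

def relative_risk(years_since_release: float) -> float:
    """Hand-wavy relative exposure of a system as it ages.

    New systems: patched against known exploits -> low risk.
    Middle-aged systems: patches lag but the install base is still big -> risk peaks.
    Very old systems: too obsolete/rare to be worth targeting -> risk tails off.
    """
    patch_coverage = max(0.0, 1.0 - years_since_release / 5.0)    # fades over ~5 years (made up)
    attacker_interest = max(0.0, 1.0 - years_since_release / 20.0)  # fades over ~20 years (made up)
    return (1.0 - patch_coverage) * attacker_interest

if __name__ == "__main__":
    for age in (0, 2, 5, 10, 15, 20):
        print(f"{age:>2} years old -> relative risk {relative_risk(age):.2f}")
```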

The XP origin story can't figure in this because it was fundamentally and intentionally misdesigned in the interests of greed, at the expense of safety, and its early-release vulnerabilities are just as awful today as they were back then. Windows Firewall was added later, and that mitigated the worst of the risk.

The industry doesn't really know or care about individual risk; it deals in the aggregate. An individual PC may catch on fire, who knows. Nvidia approaches power connector design through engineering, not case-by-case individual experience. Sometimes they get it wrong, which we learn through the individual cases.

What I notice about SW is that it's never done, and no one wants it to be done.

The industry mantra is "no warranty express or implied, no fitness".

The individual mantra is "this insanely complex chi-chi just threw all my data away and emptied my bank account, what did I do wrong?!"

As to walking on the bad side of town, etc, there are enormous pitfalls on the web, and even a very secure system can be exploited by talking its user into letting something bad in.

Very hard to generalize about this.

Over history we know a couple of big things: in general, a Mac is 10-100 times less likely to be exploited than a Windows PC, and Apple has been a leader in making sane security enhancements over the last 10 years. Their vertical integration helps them batten down the hatches in a way no PC maker can match.

The exploits follow the money.

Microsoft has considered its users to be disposable for as long as they've been around. A user is someone who serves Microsoft, not the other way around.

As to how rent is extracted by various companies, etc, so complex.

For example, I'm mystified that no one saw the obvious hazard of github coming: "What can go wrong with giving all our code to Microsoft?" Or in the case of Google: "What can go wrong with giving away all our business details?"

As with everything these days it always turns out to be more complex than common sense can manage.
 
Aside from the cost savings compared to a real Mac, the most interesting post I read recently regarding hackintoshes is the fact that there are some who are buying it in the current energy crisis to use it also as their central heater at home!
Yes, one of the uses of my Hackintosh is as a weak heater for my room. It's subtropical, nearly tropical here, so a separate heater isn't economical, and a Hackintosh is enough. I also tend to let it run more in winter.
 
Just throwing this out there. Is there any point in making a hackintosh now that Apple are using the M1 and M2 chips? Or is it still a viable option?

There's no value proposition to hackintosh whatsoever. And maybe there never was. That's debatable at best.

If your time has no monetary value, you enjoy tinkering, and presumably you like something about macOS, you can put together a spec sheet for about 20% less in parts price than a Mac, mostly because of Apple's premiums on storage.

Buuutt... Due to recent deep architecture changes, it's becoming difficult to compare suitability for a purpose based on parts lists, because key Mac features and performance depend on relationships to specific Mac HW. They always have, actually, but now more than ever.

The HW gap was narrow a few years ago and you could hop over it. The gap has widened considerably since Apple Silicon, and this looks like a trend.

Understanding both features and performance across the 3 platforms (if you include Linux) now requires true expertise. Complex.

I happen to like using Mac best for my routine work: it feels familiar and steady. I always felt it's more sane, but I can't justify this belief. I meet people all the time who think this is crazy.

I also love goofing around with the kit, especially the niggling little idiosyncrasies that drive me crazy. I cross-boot Windows and Ubuntu just to keep in touch with what's going on in those realms. But any time I spend in those environments dissuades me from switching my daily driver over.

OTOH I think there's very little I care about that can't be done on any of the 3. Each is its own culture. Windows always feels the stupidest.

One HW-related feature of the Mac I find completely delightful is AirDrop. A simple feature whose value seems far beyond its minor nature. It's gorgeous when you can stop thinking through networking and move data between devices with a tap.

Over a long career in computing, I'm very wary of any convenience that leads to lock-in or dependence on a corporate service. If I do it, as with Gmail, it's because I'm completely clear about my need for the service and its limits. So, for example, AirDrop works whether or not I sign in to iCloud. I don't use any iCloud services except authentication. When devices are signed in, AirDrop works more smoothly because I don't have to confirm the xfer, but it's still great without it. These little creeping Apple conveniences are sane and tempting, but I see them as a slippery slope. Still, Apple is careful not to alienate me for not being a joiner. I like that.

I think another genius invention is iMessage. Great stuff.

My number 1 concern is mastery over my own data and I use care to protect it and avoid dependencies on 3rd parties that could jeopardize my data.

Yet I still live in fear on one hand because there are so many ways to go wrong, and bewilderment on the other, because in the scheme of life data becomes a burden that often is not worth the effort to conserve. I'm getting off course here, but I think we haven't yet confronted a profound aspect of digital life, which is data baggage.

Back on track: for me in 2021, an i9 fell into my lap while my trusty 2008 Mac Pro was very long in the tooth. So I built a hack, and it was a huge pain, but it worked out. It's prolly cost me $20,000 in time. Completely ridiculous. I would not recommend it to anyone. But it's been fun (how perverse is this, my god).

If you're sane you'll get a properly fitted Studio and get to work.
 
I’m on the fence myself. I now have an 11th gen and a 12th gen, and possibly soon a 13th gen, but not a single actual Mac…since the 2011 17-inch that burned out.
 
Last night I was thinking about multi-core benchmarks and the scores from the i9-13900K. It reminded me of when I still had my 12 core/24 thread MacPro5,1. While the benchmark scores were very pretty to look at, the fact was that I didn't have any apps that took advantage of all those cores and threads (the sole exception being Handbrake).

When I decided to put together my first hackintosh, my i7-6700K with just 4 cores/8 threads felt far snappier than the 12 core/24 thread, dual Xeon behemoth that I came from. This was also why I chose to go with an M1 Max instead of the M1 Ultra. I don't run any apps that would take advantage of all the CPU cores (again, with the sole exception of Handbrake).

Even when multitasking, unless the apps are actively doing a lot of work in the background, it's difficult to make use of a lot of cores. This is why a CPU like the 64-core Threadripper 3990X is not a very practical choice for most users.
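Amdahl's law is the usual way to put rough numbers on this. A quick sketch, with made-up parallel fractions chosen only for illustration, not measurements from any real app:

```python
# Amdahl's law: if only a fraction p of a workload can run in parallel,
# the best possible speedup on n cores is 1 / ((1 - p) + p / n).
# The p values below are illustrative guesses, not benchmark data.

def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

if __name__ == "__main__":
    for p in (0.50, 0.90, 0.99):           # how much of the app actually parallelizes
        for n in (4, 8, 24, 64):           # e.g. 6700K, typical desktop, old Mac Pro, 3990X
            print(f"p={p:.2f}  {n:>2} cores -> {amdahl_speedup(p, n):5.1f}x")
```

With p=0.50 even 64 cores top out below 2x, which is roughly why the 4-core machine can feel snappier than the 24-thread behemoth for everyday work.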

I run a Raspberry Pi 4 8GB as a home server. It runs 25 Docker containers 24/7 and I have never seen system load exceed 200% (400% would be max because of the 4 cores).

Moral of the story... Consider what apps you will be running and whether or not those apps can fully take advantage of the hardware. Benchmarks are useful, but you also have to factor in your own use cases.
 
I like to have choices. If I get fed up with Apple I can just boot into Windows on the same hardware and most of my stuff is there. I would prefer Linux, but there's still too much missing to make it a convenient choice.

I would miss Logic Pro, though. Tons of stuff done in LP over the years. And my Mac-only Metric Halo ULN-8.
 
This is like an AA meeting for Hackintoshers. So let me vent...

I hate anything to do with "Proprietary" or giving companies a monopoly, but I do prefer OSX over Windows. So how far into the future will software companies still support Intel Macs? Don't say it!...

"It's in the Cloud"... another word I hate. But I guess I still love and live in analog. Now where did I put that Zip drive? :think:
 
Following "standards" have left PCs in a state of antiquated design with no signs of change. Take for example PCI-e based graphics cards... We now have quadruple wide video cards. What good is an ATX standard based motherboard if the video card blocks all the PCI-e slots?

Also, the positioning of PCI-e slots makes the fans on those video cards blow heat towards the CPU. Those fans also usually don't blow in the same direction as the PC case fans, causing inefficiencies in cooling.

Following ATX power supply standards has made PC power cable management a mess. The motherboard connector went from 20 pins to 24 pins, and it still isn't enough to power the CPU, which needs additional power from 4-, 6-, or 8-pin connectors. This "standard" also does not provide enough power through the PCI-e slots to power video cards, so more cables are needed...
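To put rough numbers on it, here's a tiny sketch. The slot and connector limits are the standard spec figures; the card wattage is just an example I picked:

```python
# Rough PCIe power budget arithmetic. Spec values: ~75 W from the x16 slot,
# ~150 W per 8-pin PEG connector. The card wattage below is an arbitrary example.
import math

SLOT_W = 75          # PCIe x16 slot power, per spec
EIGHT_PIN_W = 150    # 8-pin PCIe power connector, per spec

card_draw_w = 320    # example: a current high-end GPU's board power
beyond_slot = max(0, card_draw_w - SLOT_W)
cables = math.ceil(beyond_slot / EIGHT_PIN_W)
print(f"{card_draw_w} W card: slot covers {SLOT_W} W, "
      f"needs {cables} extra 8-pin cable(s) for the remaining {beyond_slot} W")
```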

"Standards" aren't always better. Often, it's flawed standards that give rise to proprietary.
 