
Apple Previews macOS 11.0 Big Sur - Available Fall 2020

I'm switching to PC. I'm really sorry to leave the Apple OS, but all my apps work better on PC: Cinema 4D and the Adobe suite can take advantage of NVIDIA acceleration, not to mention SolidWorks and other professional apps that only run on Windows. With the end of Hackintosh, is it worth having a Mac just because it is beautiful and very expensive?
Sounds reasonable. Thanks for your participation here over the past 6.5 years. Best of luck with Windows.
 
There are also new vulnerabilities found on AMD CPUs, so it's not just Intel.

Just give it some time and the new Apple Silicon will have some vulnerabilities too. That is just the way it is.

Of course, the fewer vulnerabilities, the better. But then again, a single vulnerability can be much worse than all the others put together.

There is always a risk when using computers or devices. The companies try their best to protect their users, but it doesn't matter if your CPU is impregnable; most users will find a way to put the rope around their own necks by doing other stuff.

Back in the day, Apple was a bit more secure than Windows, but as Apple grew in popularity it also became a target. Apple can try to hide what is in its updates, but we all know.

Then Apple's quality control went down the drain. I'm not being negative or attacking Apple; I'm simply describing the way things are. Everybody knows that Catalina is a dumpster fire, a.k.a. a train wreck, just like iOS 13.

Apple releases an update, then the next day a supplemental update, lol. A fix that fixes the fix, but then it requires another fix, lol.

Anyway, let's see how things play out. We shouldn't worry yet, because we still have a few more years, and maybe something will change that extends the life of Hackintosh, who knows.

But most hardware is vulnerable to attack via software one way or another.
Regards

Edit: this is for the user who didn't get the point.
Tim Cook = Master Windu.

Master Windu was responsible for Anakin turning to the dark side; Tim Cook is responsible for many Mac users turning to Windows.

:lol:
 
First Apple dropped Mac OS 9, then Classic compatibility, then PPC, then NVIDIA drivers, then 32-bit apps; next, Intel. The oldest changes look good in retrospect, but never at the time. Love 'em or leave 'em, I guess.
 
Many people have forgotten a significant issue with Intel CPUs: there is a minimal operating system, MINIX, running inside them, and it is possibly a backdoor.
 
I was not aware of that. Interesting. So glad we are going with RISC in the long run.

Many people have forgotten a significant issue with Intel CPUs: there is a minimal operating system, MINIX, running inside them, and it is possibly a backdoor.
 
Maybe. I know that they seem to be cutting off iGPU support at the HD 5000. The HD 4600 came out slightly (3 months) before the HD 5000 in 2013. Ironically, the Notebookcheck website says this: "Although the HD 5000 features 40 Execution Units, the graphics performance is still somewhat below the HD Graphics 4600 with only 20 EUs."

What may work for the HD 4600 is setting the PlatformID to 0x0D220003 (Intel Iris Pro Graphics 5200) and the device-id to 0x0412 (Intel HD 4600) in your config.plist, as sketched below. I'm not one hundred percent sure of that, but it has worked in previous macOS versions.
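
For anyone who wants to try it, here is a rough sketch of how those values might look in an OpenCore config.plist under DeviceProperties (assuming the WhateverGreen kext is loaded; Clover users would put the equivalent under Devices/Properties). Note that plist <data> values are base64-encoded and the bytes are little-endian, so 0x0D220003 is entered as 03 00 22 0D:

<key>DeviceProperties</key>
<dict>
    <key>Add</key>
    <dict>
        <key>PciRoot(0x0)/Pci(0x2,0x0)</key>
        <dict>
            <!-- Framebuffer spoof: 0x0D220003 (Iris Pro 5200), bytes 03 00 22 0D -->
            <key>AAPL,ig-platform-id</key>
            <data>AwAiDQ==</data>
            <!-- Device spoof: 0x0412 (HD 4600), bytes 12 04 00 00 -->
            <key>device-id</key>
            <data>EgQAAA==</data>
        </dict>
    </dict>
</dict>

Treat this as a starting point, not a guarantee; whether Big Sur still accepts the spoof is exactly what remains to be seen.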

There are 2014 iMacs and Mac minis that have the HD 5000 as their primary GPU and still have support. The official word from Apple is that 2013 iMacs and older 2012 Mac minis have been dropped from support. What doesn't make sense is that the Late 2013 21.5" iMac with Iris Pro 5200 graphics will not work with Big Sur. That is a Big Surprise. Send Tim Cook an email and ask him why: [email protected]


Dear Moderator,
Thanks for the reply and the info. I'll have to wait and see if it's still supported.
 
I was not aware of that. Interesting. So glad we are going with RISC in the long run.
If I recall correctly, even Andrew Tanenbaum, the father of MINIX, was surprised to hear that MINIX is used inside Intel chips.
 
What if Apple leaves Intel... but not the x86 world completely? What if Apple Silicon is meant for low- to mid-range machines? What if Universal binaries are here to stay long enough because they want to use AMD Ryzen CPUs in the high-end machines?
Just watch the keynote. Tim spoke in absolutes: "The transition will be over in two years." They are absolutely leaving Intel and x86. If Apple hasn't switched to AMD by this point, why would they now? Apple has been carefully watching, planning, and executing behind the scenes to prepare Apple Silicon for this transition. They feel now is the time for that change. We've been here before. We know what is going to happen. There isn't going to be "one more thing" in a few years.
 
I was a latecomer to the iPhone, not because of antipathy toward Apple (I've bought thousands of dollars' worth of Apple products since my first Classic II), but because iOS seemed to be basically a kiosk: you can go only where they want you to go. I understand, I guess, the need for a streamlined and limited interface on a mobile device, but the lack of choice has always been a turn-off to me, whether it's iOS or anything else in life. That's why I stayed on Lion 10.7 so long after I built my first hackinbox. As soon as 10.8 came out and started integrating with iOS, I could see the writing on the wall: macOS was on its way to becoming less of a standalone operating system and more and more like iOS, i.e., one with a continually diminishing universe of options and customization possibilities.

The ARM chips seem to portend a decisive, final move in that direction, one leading to the eventual elimination of skanky hacks, good hacks, and even great hacks. The sublimation of man to machine will soon be complete as far as Apple is concerned. Maybe not in the next two years, but we're heading past the final turn into the home stretch.
 
There are also new vulnerabilities found on AMD CPUs, so it's not just Intel.

Just give it some time and the new Apple Silicon will have some vulnerabilities too. That is just the way it is.

It's true that no CPU is perfect, but Intel is by far the worst I've experienced. I had three servers with Intel Atom CPUs, and they failed within one year. It was a design flaw that Intel didn't admit to for a long time. Now I'm using servers with AMD CPUs, and they've been running strong for 6 years. I would rather Apple use AMD CPUs than Intel. Intel used to be a great company, but not anymore, with very high employee turnover and burnout rates. NVIDIA is even worse.
 