
Apple's WWDC Announced for June 3-7 2019

Status
Not open for further replies.
I don't know if you were supposed to post this in another thread, because I don't see any previous comments claiming that AMD is better than Nvidia. The post you quoted in your reply just says Nvidia is possibly out at Apple; it never says AMD is better than Nvidia. At least that's how I understood it.

Just saying. ☺
Yes, I know the person didn't say that AMD is better; I was speaking generally about everything. But let me try to explain a bit better to avoid any misunderstanding.

I'm sure you've heard people claim many times that AMD is better than Nvidia, at least on macOS, and maybe they're right. But it's not that the cards themselves are better; it's that AMD cards have better driver support, because Apple is the one writing the drivers for AMD, or at least optimizing them.

That is, assuming AMD even writes its own macOS drivers.

But install the same cards on Windows, and Nvidia will beat the AMD card by a mile. Sometimes it's not just the hardware; it's also the software. That's why Nvidia needs to provide better web drivers.

Hypothetically speaking, if Nvidia paid Apple to support its latest cards, and Apple were the one writing the Nvidia drivers, do you really think AMD cards would stand a chance? AMD is just lucky that Apple supports them and writes, or performance-optimizes, their drivers, while Nvidia is all alone, trying to figure out what to do without any help from Apple.

That's why the web drivers are taking a bit too long. Trust me, it's all about money.

Even if Nvidia tried to make some kind of deal with Apple, it wouldn't work, because someone would have to agree to a huge pay cut, while AMD cards are a bit more affordable, so Apple can still make money on them.

So a deal with Nvidia isn't good for Apple: even if someone took a pay cut, a unit with an Nvidia card would still cost more than one with an AMD card.

When one person comments and another replies, yes, most of the time the reply follows up on the first post. But the person replying will also make their own points and add things the other person didn't mention or bring up.

That is totally normal.

So my comment wasn't just about Nvidia being better than AMD; I was trying to explain why I don't think Nvidia is dead yet. I also expressed my opinion that Nvidia cards are better and that I hate what Apple is doing. Why are we being forced onto AMD cards when we have Nvidia? People should have the right to choose, but when you choose Apple, you give up that right, because Apple will choose for you.

AMD cards perform well on macOS because Apple is helping them with the software.

peace
 
Fair enough. Personally, I've never had any problem with any of the Nvidia cards I've bought over the years. I totally respect your comment and opinion, but there are many things that can cause that kind of problem, like bad overclocking or a bad power supply. The funny thing is that I ordered an AMD card for my neighbor, and the card came defective from the factory; I had to return it. It's OK, I completely understand that some people like AMD cards the same way I like Nvidia cards. People have different tastes, so I'm cool with that. Peace.
I forgot to mention that my second Nvidia card exploded on the eve of my birthday, so I couldn't give myself another Nvidia card as a birthday gift.
 
Sorry, but I completely disagree. Just because Apple chooses AMD's cheaper cards to make more money doesn't mean AMD cards are better than Nvidia's; everybody knows it's the other way around. The only reason AMD performs well on macOS is that Apple writes the drivers, while Nvidia's newer cards don't have Apple's blessing. Apple only supports up to Kepler on Mojave because some of the old Macs supported on Mojave shipped with Kepler cards, so Apple has to support them. For the newer cards, Maxwell, Pascal, and Turing, it's up to Nvidia to provide web drivers.

The story is very long and I don't want to write a 20-page book, but Nvidia will come out with web drivers. I heard one line that I will never forget: "how quickly people forget." Pascal support took almost a year, and Nvidia only started working on Mojave web drivers a few months ago. In case no one else has noticed, there is a new developer tab in the Nvidia web driver panel.

Don't think I'm defending Nvidia; to be fair, both companies are guilty. To avoid any misunderstanding: Nvidia never followed up on the beta testing in Mojave, and they assumed Apple was going the same route with Metal, but Apple upgraded Metal to Metal 2 and never told Nvidia anything about it (not that they need to). Also, Apple doesn't make any money from Nvidia sales; Nvidia cards generally cost more than AMD cards, so Apple would make less money putting Nvidia chips in its computers. The other side of the story is that Nvidia tells Apple nothing about its hardware, and Apple tells Nvidia nothing about its software. So, as you can see, both are fighting for control and supremacy while we, or at least Nvidia customers, suffer the consequences.

One thing is for sure: we will get web drivers, and we're not talking about crappy drivers. I'm talking about high-performance drivers that will smoke those AMD cards, something similar to Windows, where Nvidia crushes AMD.

Apple is pushing AMD eGPUs so it can make money, and it is trying its best to cut Nvidia's throat, because it knows who the #1 graphics company in the world is, and it's not AMD.

Anyway, later. Regards.
Did not say better!!! I just said Apple is NOT going to release approval for the finished web drivers for Nvidia cards. There are pictures out there on the web of a machine with a fully working Nvidia card (no artifacts or the other problems, and Metal 2 capable) running Mojave. The final approval is Apple's fault in this case. End of discussion.
The Radeon VII is very close to, if not better than, the RTX 2080.
 
The Radeon VII is the new 7 nm card, right? And it's a 16 GB card, right? It and the RTX 2080 are pretty much even; they trade wins in benchmarks by a few frames depending on the game. But it doesn't stand a chance against the 2080 Ti. Also, just wait until Nvidia releases its 7 nm cards.


Most of the time, the company that launches its product last should have the upper hand over the company whose product came earlier. But like I said, even though AMD launched its 7 nm card after the 2080, it couldn't surpass the 2080, not even after adding double the memory capacity.

So the best AMD could do was match the RTX 2080; it doesn't really beat it. Beating a card means winning by a significant margin, consistently, in every game.

AMD needed a 16 GB card just to match an 8 GB card, lol. The way I see it, AMD's bumped-up specs are just inflated numbers; I don't trust them. But that's just me. I respect it if you like AMD, no problem.

The best way I can describe AMD is with car-amp terminology. Many people buy a 4000-watt monoblock amp for their speakers, but those cheap amps don't even reach 2000 watts. That's why it's always better to buy an amp that is certified to deliver exactly what it advertises, or at least close to it.

Certified amps cost more, just like Nvidia costs more than AMD. Why is that? Because you're buying quality.

AMD is just like those cheap amps with bumped-up specs: a 16 GB card that can't beat an 8 GB card.

Flip the coin, and a 16 GB Nvidia card would smoke an 8 GB AMD card, easily by a mile.

Also, Nvidia has some features that AMD doesn't, and disabling those features increases the frame rate.

regards
 
Yes, you had a bad experience with Nvidia cards, and what a bad day for one to decide to blow up. After what happened to you, no one can blame you for never wanting to buy an Nvidia card again; no one wants a card exploding in their face. You have every right to feel the way you feel and to think the way you think.

regards
 

A couple of things to note here. The Radeon VII is $699 US; the RTX 2080 Ti is $1200-$1500, almost double the cost, yet not close to double the performance. I have a LuxMark Ball bench of 59,000+, which is 3rd place in single-card scores; go to the LuxMark website and check. It was done with a reference three-fan stock card on the latest Wattman, at 2150 MHz core and 1200 MHz memory. It is beating the crap out of RTX Titans!

Frame rates are good for gamers, but compute rates are important for developers. The thing is, most people don't game on Apple computers, so the frame-rate argument doesn't make a whole lot of sense in the Apple workload environment. Nvidia's CUDA cores were great for Apple workloads, but AMD cards like the RX 500 series and the Vegas are taking over the crown in this field as far as Apple software apps go. It's a trend, and you can fight it all you want, but that won't make it any better.

It's unfortunate that Nvidia can't get its latest products into the Apple ecosphere, but also ask yourself: why hasn't Nvidia made High Sierra drivers for the RTX series cards? The Mojave lockout should not be an excuse for not making them for High Sierra, Sierra, etc.
 
OK, in that use case it makes sense, but how many Mac Pro / iMac Pro users use their Mac for that kind of thing? I'd think most do video and audio work, no?
Yes. I was speaking in generalities. In some shops, a large part of the appeal is Xeon's reliability, though I know plenty of people who just stick with consumer hardware (Core i7/i9 products). Still, the errors that Xeon/ECC setups prevent could be detrimental to some workflows in the rendering space.
I haven't done extensive testing, but at 4K I don't have any problems with FCP* on my Stork clone. At 8K, however, it's basically unusable unless I make proxies, and even then it's not great. Playing with an iMac Pro at the Apple Store, it didn't even blink at 8K, but those aren't apples-to-apples comparisons, just a quick check, and I wasn't using the same footage for 8K. It may also depend on how Apple optimizes FCP for the hardware (HBM in Vega vs. GDDR5 in my Radeon 580, the better compute on the GPU, etc.).

*DaVinci Resolve also performs great, but it depends on how much footage you have; it eats GPU memory. And Premiere just performs horribly regardless of OS (though obviously better on Windows).
 
At 8K, however, it's basically unusable unless I make proxies, and even then it's not great.
Interesting, so it's most useful for graphics work? Do Xeons even have any advantage for music production?
 
Graphics/video: some benefit. It just depends on the software and workflow. Even then, it's difficult to say for certain unless you can do an apples-to-apples comparison between similar configurations, for example the newly announced iMac i9 8-core with Vega 48 and 32 GB SO-DIMM vs. the base iMac Pro 8-core Xeon with Vega 56 and 32 GB ECC. (I wouldn't be surprised if the newer iMac outperformed in some situations because of the higher clock speed and the iGPU in the Core i9.) All I can say is that every studio I've worked for has been largely Xeon-powered, but that used to be because of the massive core counts and cache that weren't available in the consumer space (and the reliability stuff; workstations tend to be tanks).

Your audio questions would be better answered in the forum; I don't work in audio/music, and it's unlikely any audio engineers are paying much attention to this news post, let alone this far down the thread.
 