
[Success] GIGABYTE Z370 Gaming 7 + Intel Core i7-8700K + RX 580 + (2x) Dell P2715Q 4k @ 60Hz

Maybe I will bite the bullet and just get the same memory and video card. Are there any 64GB memory solutions you would suggest? Also, are there any other bigger Radeon cards that will work? I do lots of video editing and want something powerful.
There are the Vega 56 & 64. I have read somewhere that Final Cut Pro works better with AMD RX & Vega cards compared to NVIDIA cards; maybe the guys that do video editing can elaborate on this.
I used to use an NVIDIA card but switched over to two RX 580's but have since sold one. Has been smooth sailing ever since.
 
Before doing any costly change, I would simply confirm/research possible tweaks needed to get your Nvidia GTX 1080 Ti and your memory working with your build. But it's a vastly different build.

Also, I was very specific with the memory I chose for this build to avoid problems. I looked at the chart of compatible memory sticks for the Gaming 7 motherboard, as tested by the manufacturer. My specific memory selection was no accident; I wanted everything to go smoothly since I was doing a ground-up build.

...and of course, like a total bonehead, I installed my memory sticks in the wrong slots at first. :D

Just wanted to point this out because I place ease of build and compatibility as paramount.


OK. I REALLY appreciate your response and information. Other than these weird glitches (which aren't REALLY that bad, just a little annoying) the system works great! I will look into getting the 1080 Ti to work, but honestly don't even know where to start... if anyone on this thread knows anything about this I'd greatly appreciate it!!

Also, is there any updating to Mojave with this build (even though mine isn't EXACTLY like yours)?
 
Is there any updating to Mojave with this build?
For the time being you'll be staying with High Sierra, at least until when/if Nvidia decides to release macOS Mojave drivers. Most people choosing AMD graphics do so because you can't rely on Nvidia anymore to do the work on new macOS drivers when a completely new version of the OS is released to the public. There isn't enough financial incentive to get them on the ball, so to speak.

Mac Pro 5,1 (MP 2010/2012) users are a slowly dying breed and hackintosh users are really the only ones that need Mojave drivers for their Nvidia card. Those users are a drop in the ocean in relation to all the Windows users that need working graphics drivers. That is why there seems to be no reason for Nvidia to have or meet any deadlines to get the Mojave drivers out. AMD RX series drivers were already in macOS ever since the first beta of Mojave. They are optimized for Macs and Pro programs like FCP X and Logic Pro.
 
This issue thread looks like it might assist you:
https://www.tonymacx86.com/threads/solved-ui-stuttering-freezing-nvidia-issues.250891/

Might be as simple as ensuring you have the correct Nvidia driver version installed:
https://github.com/Benjamin-Dobell/nvidia-update

And once again, I don't have any experience with this since I went with the Radeon for the Mac's built-in support, but hopefully the links above can assist you.
 
I have an MSI GTX 1080 (non-Ti) and haven't experienced any of your issues.

I used Benjamin Dobell's script that @HackaShaq linked above to install the Nvidia web drivers.

A few things I noticed that we have different.

I have an 18,3 SMBIOS (yours seems to be on 14,2). I guess you are on High Sierra 10.13.6?
Did you activate the XMP profile in the BIOS for the RAM?
I would try installing the web drivers with the script and see if maybe a higher SMBIOS will help you too.

Don't update to Mojave yet, as there are no Nvidia web drivers out for it. Stick with HS 10.13.6, which runs like a charm.

I ran various benchmark tests this weekend.

Here are a few results for comparison:

- Geekbench:
single core: 6609
multi core: 31302
OpenCL: 181444

- Cinebench:
1645 | 150.2 fps

- Blender:
BMW: CPU 03:52.75min | GPU 04:25.54min
Classroom: CPU 16:23.14min | GPU 16:46.38min

- Heaven:
Basic: 229.7fps | 5787 points
Extreme: 118.8 fps | 2994 points

- Luxmark:
CPU: 3.477
GPU: 15.537
CPU+GPU: 18.987

The Bruce X 5K test in FCPX I can do in ~ 25sec (ProRes422).

I have the 8700K now running at 5GHz and 32GB RAM at 3200MHz (XMP profile activated in BIOS).
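A side note for anyone comparing the numbers above: the Blender results show the GPU path slightly slower than the CPU path on this build. A small sketch to turn the mm:ss.xx timings from the list above into ratios (only the timings from this post are used; nothing else is assumed):

```python
# Compare the Blender CPU vs GPU render times listed above.
def to_seconds(mmss: str) -> float:
    """Convert an 'MM:SS.ss' timing string to seconds."""
    minutes, seconds = mmss.split(":")
    return int(minutes) * 60 + float(seconds)

results = {
    "BMW":       {"CPU": "03:52.75", "GPU": "04:25.54"},
    "Classroom": {"CPU": "16:23.14", "GPU": "16:46.38"},
}

for scene, times in results.items():
    cpu = to_seconds(times["CPU"])
    gpu = to_seconds(times["GPU"])
    # A ratio above 1.0 means the GPU render was slower than the CPU render.
    print(f"{scene}: GPU/CPU = {gpu / cpu:.2f}")
```

So the GPU render is roughly 14% slower on BMW and about even on Classroom, which lines up with the raw times above.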



If Apple would get their act together and offer better eGPU support, there would be quite a big market of people out there waiting to use powerful Nvidia GPUs for rendering alongside their Macs. The Blackmagic eGPU is a joke for its price and non-upgradable design. With the right support from Apple, the incentive for Nvidia to release web drivers would be there.
 
Just to add my 2c worth, just be careful of the F10 BIOS upgrade. Gigabyte has mucked around with the CPU voltage (it has been increased) and maybe other things that I don't understand, and the end result is lots of heat even at the stock CPU frequency.

Here is a Geekbench result I took yesterday on the F10 BIOS with a 5.0GHz OC (I'm running a 280mm Corsair AIO water cooler). I also ran the Blender Classroom test using the CPU; the temp maxed out at 100C and the CPU frequency started to throttle, so I shut it down!

View attachment 361238
This is today, back on the F7 BIOS and OC at 4.9GHz.
View attachment 361236
This is today running the Blender benchmark and monitoring the CPU temp. It never went over 83C, which is good! This is with my fans set up in quiet mode.
View attachment 361239

I totally agree OC gives a little gain, but is it worth the crashes, instability, and possibly a shortened CPU life?
FWIW I have 3600MHz RAM and that has made my GBench results a little faster.

Maybe later this week I will be delidding my CPU and using liquid metal on it, plus I have a 15% larger copper IHS that I'll fit.
My aim is to keep the 4.9GHz OC but reduce the temps, so I can then run my Corsair ML fans slower for an even quieter build. Plus I might try reducing the CPU voltage for even cooler running.

Jim:geek:
@jb007, my MB is on F10. I have never gone backwards; any tips on this?

BTW, I did the delid thing and did not see temp reductions many claimed.
 
Hi @jiffyslot.
I'll try and answer the Q's to the best of my knowledge/understanding of OC'ing etc.
Others more knowledgeable in these matters please feel free to chime in!
  1. I would not set the CPU to anything other than your CPU ie i7-8700K. Where are you setting your system to an i3-83xx?
  2. I've never had a Xeon CPU, and I'm a little unsure as to what you mean by "evenness of performance". Do you mean the CPU frequency spikes you see when the CPU uses Intel's SpeedStep technology, i.e. it changes the CPU frequency up and down based upon load?
  3. Yes, the BIOS does know what CPU you have and sets up various hardware and software 'things' appropriately. All Intel CPU's have information (ID's) embedded in them so the motherboard can read them.
  4. The more cores that you can give the audio/video tools, the better. You say that you have it running fine temperature-wise; maybe you can do a back-to-back test on how much you really gain with overclocking, i.e. run an encoding session. I think you will be surprised at the small difference in the real-world audio/video use case. Things like RAM speed and hard disk speed play a large part in the speed and feel of these systems.
  5. I don't game either (unless you call a few games of Pinball gaming!), but I do a lot of compiling and I've started to play with Blender and might be doing some Youtube videos so I like to have a fast system. I've OC'ed to 4.9GHz and it's very stable on the F7 BIOS. I do have 3600MHz RAM. Your personal build profile is not visible to me when I click on your name under your Avatar then click on Profile, it says "This member limits who may view their full profile."
    So I'm not sure as to what you have memory wise.
  6. I'm guessing you have air-cooling for your CPU, if it does get hot in your "P. Piggly McHogswine's Porkinarium" (LOL) then maybe consider upgrading to an all in one water cooling like what I have and the Corsair ML (Magnetic Levitation) fans that replaced the standard fans that came with my Fractal Design case. Made a huge difference to noise. I'd do a decibel check for you, but my brother hasn't returned my DB meter... it's only been a year or two!
Jim:geek:
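On the back-to-back overclock test suggested in point 4, a rough upper bound on the gain is just the clock ratio, since even a fully CPU-bound encode scales at best linearly with frequency. A sketch, assuming the 8700K's stock all-core turbo of 4.3GHz against the 4.9GHz OC discussed above (real workloads will gain less because RAM and disk become the bottleneck):

```python
# Upper-bound estimate of the overclocking gain for a CPU-bound workload:
# in the best case, runtime scales inversely with the core clock.
def best_case_speedup(stock_ghz: float, oc_ghz: float) -> float:
    return oc_ghz / stock_ghz

stock = 4.3  # assumed i7-8700K stock all-core turbo, GHz
oc = 4.9     # the overclock discussed above, GHz

print(f"best-case speedup: {best_case_speedup(stock, oc):.2f}x")
# Real encodes gain less than this ~14% ceiling, which is exactly
# the point being made in item 4 above.
```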

Ha ha, see, I think that's what got me in trouble before: profile signatures, personal signatures, who can see what and when.
I thought my speed was listed. My RAM is 64GB @ 2400. AFAICT, there is no "opt-out" of overclocking on my particular mobo (see build signature). IDK how to get around this, so I was thinking of using the lowest i7-8700 OC setting, then changing the "Easy" setting to "Saving" to relax the overclock. I'll not be overclocking my RAM, just using XMP profile 1, as I'd sooner fry the CPU than the RAM. It was speedy RAM. My CPU air-cooler is the size of my head and has a huge tan fan between the two heat sinks/radiators. It doesn't itself get hot. My thermal paste application is either above average, or I won the silicon lottery.

ProTools CPU Hoggery.png
 
Another set of scores:
Screen Shot 2018-10-29 at 10.59.10 PM.png
Screen Shot 2018-10-29 at 11.06.26 PM.png


"RAM Speed" is a puzzle; I don't understand what that number is. I thought that was measured in megahertz?
And the Disk Score is also a real puzzle.
 

F10 firmware is now out, with added support for Intel 9th Gen CPUs

I updated the BIOS and redid my settings (because, like any BIOS update, it resets everything to defaults). No issues with macOS.

EDIT:

I noticed the change with the auto overclock options. It's using higher voltage than before, which means higher temps. They might be trying to resolve the auto overclocks being unstable on some of the less golden CPUs that got unstable with F6 and F7, but the ones that were stable on F7 are now just running hot needlessly. Might have to consider custom OC settings now. I literally gained 10C on max-load CPU temps with the new vCore it's giving on auto 5GHz.

EDIT2: Apparently this board doesn't support custom OC whatsoever; everything is auto and only auto, locked from making any changes despite being an 8700K. So I'm just going to roll back to F7 until hackintosh has support for the 9900K, and THEN I'll use F10.

A BIG no thanks to a majorly overvolted OC running at 80-85C instead of the usual 70-73C it was at before (same test).
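For context on why a vCore bump hits temps so hard: to a first approximation, CPU dynamic power scales with frequency times voltage squared (P ∝ f·V²), so even a modest voltage increase shows up disproportionately as heat. A sketch with hypothetical voltages (the 1.30V/1.38V figures are made-up examples, not the actual F7/F10 values):

```python
# Relative dynamic power from the classic P ~ f * V^2 approximation.
def relative_power(f1, v1, f2, v2):
    """Power of the (f2, v2) operating point relative to (f1, v1)."""
    return (f2 * v2**2) / (f1 * v1**2)

# Hypothetical example: same 5.0GHz auto-OC, but vCore raised 1.30V -> 1.38V.
increase = relative_power(5.0, 1.30, 5.0, 1.38)
print(f"~{(increase - 1) * 100:.0f}% more heat from the voltage bump alone")
```

So a ~6% voltage bump alone adds roughly 13% more heat at the same clock, which is consistent with the temperature jumps being reported on F10.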

@elfcake have a look at the post started by @MysticalOS that I have quoted. Both of us had a discussion about F10 and how Gigabyte had upped the voltage to the CPUs, which MysticalOS finally fixed by manually bringing the voltage down. My tests with F10 have shown around 8C higher idling temps due to this voltage increase. I'm going to de-lid mine on Thursday/Friday, mainly to get my idle temps down further than the current ~6-8C above ambient. Don't forget to report what the ambient air temp is, otherwise temps are meaningless. E.g. if the room was 35C here, mine would be idling around 41C, which then doesn't give much headroom; hence the reason I want it to idle cooler, plus I can also lower the fans for less noise.
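The ambient-relative point above can be made concrete: if the delta over ambient is roughly constant, the idle temperature at any room temperature is simply ambient plus that delta. A small sketch using the ~6-8C figure from the paragraph above (low end assumed):

```python
# Idle temperature estimate, assuming a roughly constant delta over ambient.
def idle_temp(ambient_c: float, delta_c: float) -> float:
    return ambient_c + delta_c

delta = 6.0  # ~6-8C above ambient, as reported above; using the low end

for ambient in (19.5, 25.0, 35.0):
    print(f"ambient {ambient}C -> idle ~{idle_temp(ambient, delta)}C")
# At a 35C room the same build idles around 41C, which is why an
# absolute idle temp is meaningless without the ambient temperature.
```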

I'm not sure why you didn't see much change, but I'll publish the results of my de-lid experiment for all to see.

Jim:geek:
 

What idle looks like as of now (ambient is 19.5C):
Screen Shot 2018-10-29 at 11.15.10 PM.png
 