Graphics card modded, working on Sierra

Discussion in 'Overclocking' started by The Veterant, Feb 18, 2017.

  1. shasen1235

    Joined:
    Jun 15, 2015
    Messages:
    58
    Mobo:
    MSI Z170A Krait
    CPU:
    i7-6700k 4.5GHz
    Graphics:
    Gigabyte 980ti WaterForce
    Mobile Phone:
    Android, iOS, Windows Phone
    Jul 21, 2017 at 9:04 PM #21
A bit late to the party, but here's something I want to point out.

Even though you edited the vBIOS and CUDA-Z reads the matching frequency, you are most likely still running at the same speed as before.

I've done many tests; here are some results:
980 Ti WaterForce stock 1318MHz + 7000MHz (actually runs at 1060MHz + 7000MHz)
Screen Shot 2017-07-20 at 11.38.05 PM.png
(Sorry, no CUDA-Z screenshot, but it's rated at 1318MHz, which matches this card's spec.)

However, here is the result with the custom vBIOS at 1506MHz + 7200MHz:
Screen Shot 2017-07-22 at 3.26.07 AM.png Screen Shot 2017-07-22 at 3.27.23 AM.png
I know there could be some margin of error, but it doesn't make sense that an overclocked vBIOS, with no overheating, shows no improvement and even a worse score than a stock one.

And if you read the clock speed from HWMonitor, both are running at 1060MHz:
Screen Shot 2017-07-22 at 3.40.02 AM.png

So here's my conclusion: overclocking the GPU by editing the vBIOS on macOS is possible, but you'll have to raise the boost table much higher than the clock you actually want in order to achieve any gain over a "reference card".

What, and why? From what I've experienced, the Web Driver treats every vendor card as a reference card with no GPU Boost. Take our 980 Ti: the reference card has a 1000MHz base clock, boosts up to 1075MHz, and gets at most +100MHz on top of that when GPU Boost is available. So I think the 1060MHz figure from HWMonitor is correct, and the one from CUDA-Z is, in a sense, also correct, since it reads the data directly from the vBIOS.

So what if we just raise the boost table, as I mentioned? Well, because of the baseline difference between the two systems (Windows uses the card's real numbers, macOS the reference numbers), you end up with a mismatched voltage table on either Windows or macOS, so you can't get both of them running at the clock you set at the same time. And that's why I ended up giving up on this, unless some day NVIDIA or Apple fixes this issue.
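The offset described in this post can be sketched as a tiny model. To be clear, this is a hypothetical calculation based only on the numbers reported here (vBIOS-rated 1318MHz vs. 1060MHz observed in HWMonitor); the actual Web Driver clocking behavior is not documented, and the offset may well not be constant across clocks or cards:

```python
# Hypothetical model of the macOS clock offset described above.
# Assumption (not confirmed by NVIDIA): the Web Driver runs the card at a
# fixed offset below whatever clock the vBIOS boost table advertises.

VBIOS_RATED_MHZ = 1318   # what CUDA-Z reads from the stock vBIOS
OBSERVED_MHZ = 1060      # what HWMonitor reports the card actually running at

def boost_table_entry_for(target_mhz: int) -> int:
    """Boost-table value you would have to write into the vBIOS to
    actually run at target_mhz under the Web Driver, assuming the
    observed offset stays constant."""
    offset = VBIOS_RATED_MHZ - OBSERVED_MHZ  # 258 MHz "eaten" by the driver
    return target_mhz + offset

# To really run at the card's stock 1318 MHz on macOS, the boost table
# would have to advertise roughly 1576 MHz:
print(boost_table_entry_for(1318))  # -> 1576
```

Under this assumption the voltage-table conflict becomes concrete: a table entry around 1576MHz would make Windows request voltages tuned for 1576MHz, while macOS would only ever clock the card to 1318MHz, so one of the two systems is always running off a mismatched voltage table.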
     
  2. GDS

    Joined:
    May 23, 2010
    Messages:
    154
    Mobo:
    X99 deluxe II
    CPU:
    2696 v4
    Graphics:
    2 TITAN X Maxwell
    Mac:
    MacBook Pro
    Mobile Phone:
    iOS
    Aug 4, 2017 at 2:50 PM #22
I do not agree at all.

In OpenCL with GB 4 (one run after the other, so no reboot or anything in between),
in CUDA (across all programs using this technology),
and in everything else,
my overclocked Titan X runs much better than the stock one.


I will upgrade the second one and post how CUDA performs.

By the way, the mod for Mac is different:
I don't use "boost".
Screen Shot 2017-08-04 at 22.48.08.png


     
  3. GDS

    Joined:
    May 23, 2010
    Messages:
    154
    Mobo:
    X99 deluxe II
    CPU:
    2696 v4
    Graphics:
    2 TITAN X Maxwell
    Mac:
    MacBook Pro
    Mobile Phone:
    iOS
    Aug 8, 2017 at 6:55 PM #23
After some tiny optimization and modification,

here are the results.
It could be worse ;)
Screen Shot 2017-08-09 at 02.30.57.png

By the way, @The Veterant:
would you say the Titan Xp is a dual overclocked Titan X killer?
     
