Recent content by creepyTowel

  1. creepyTowel

    ProperTree in Big Sur crashes Python

    Try this: open ProperTree.command in a text editor and change the first line from #!/usr/bin/env python to #!/usr/bin/env python3. Worked for me, anyway!
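    The edit above can also be scripted. A minimal sketch, assuming ProperTree.command is in the current directory (a stand-in file is created first so the snippet runs anywhere):

    ```shell
    # Stand-in ProperTree.command so the snippet is self-contained;
    # on a real install, skip this step and use your actual file.
    printf '#!/usr/bin/env python\nprint("ProperTree")\n' > ProperTree.command

    # Rewrite the shebang to call python3. Writing to a temp file and
    # moving it back avoids the BSD/GNU differences in sed's -i flag.
    sed '1s|^#!/usr/bin/env python$|#!/usr/bin/env python3|' ProperTree.command \
      > ProperTree.command.tmp && mv ProperTree.command.tmp ProperTree.command

    head -1 ProperTree.command   # -> #!/usr/bin/env python3
    ```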
  2. creepyTowel

    Migration Assistant renders build unusable

    Hi everyone, I'm upgrading from High Sierra to Catalina 10.15.2 via a clean install onto a new SSD, taking the opportunity to swap out the Nvidia 1080ti for an AMD 5700XT. I've put the appropriate startup arguments in Clover and full graphics acceleration is working. The problem comes when I...
  3. creepyTowel

    Need to transition from Nvidia

    The Radeon RX 5700 range gets good reviews and is supported by the latest Catalina version. I'm in the same boat as you and am about to replace my GTX 1080ti with a Gigabyte RX 5700 XT Gaming OC 8GB, which many web reviews rate as the best overall 5700. My setup is essentially the same as...
  4. creepyTowel

    Strange problem with gsync

    The same as you -- 387. macOS 10.13.6 (17G5019), Clover bootloader (4895), Lilu (1.3.4), WhateverGreen (1.2.6).
  5. creepyTowel

    Strange problem with gsync

    Hi there, I haven't set any particular options for Lilu/WhateverGreen. Have you definitely removed NvidiaGraphicsFixup.kext and Shiki.kext from both /Library/Extensions and EFI/Clover/kexts/Other (following the correct procedure to remove from L/E, of course)? Could they be lurking anywhere else...
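    For reference, the removal procedure looks roughly like this. A hedged sketch: ROOT points at a scratch directory so it is safe to run as-is; on a real system these paths would sit under / and the mounted EFI partition, and you would rebuild the kext cache afterwards.

    ```shell
    # Scratch root so this demo touches nothing real; on an actual
    # system these would live under / and the mounted EFI volume.
    ROOT="$(mktemp -d)"
    mkdir -p "$ROOT/Library/Extensions/NvidiaGraphicsFixup.kext" \
             "$ROOT/Library/Extensions/Shiki.kext" \
             "$ROOT/EFI/CLOVER/kexts/Other/NvidiaGraphicsFixup.kext" \
             "$ROOT/EFI/CLOVER/kexts/Other/Shiki.kext"

    # Remove both kexts from both locations.
    for kext in NvidiaGraphicsFixup.kext Shiki.kext; do
      rm -rf "$ROOT/Library/Extensions/$kext" \
             "$ROOT/EFI/CLOVER/kexts/Other/$kext"
    done

    # On a real system, rebuild the kext cache so the removal sticks:
    #   sudo kextcache -i /
    ls -A "$ROOT/Library/Extensions" "$ROOT/EFI/CLOVER/kexts/Other"
    ```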
  6. creepyTowel

    Strange problem with gsync

    This thread hasn't been added to in a while, but I thought I'd mention that for me at least the problem has disappeared. I've moved to WhateverGreen.kext and Lilu to replace NvidiaGraphicsFixup.kext (and Shiki) and moved all kexts to /Library/Extensions from EFI/Clover/kexts/Other (keeping a...
  7. creepyTowel

    Strange problem with gsync

    Interesting -- that doesn't work for me. I have to actually power down and then power up the monitor for this to work.
  8. creepyTowel

    Laughable FCPX performance despite decent specs

    I was thinking of trying something and would be interested to know what others think... If I were to (re-)install my old Radeon HD7950 next to my existing Nvidia GTX 1080ti, but without connecting a monitor to the Radeon card, would FCPX notice it and use it for rendering, bypassing the Nvidia...
  9. creepyTowel

    Strange problem with gsync

    Follow up: the issue does not occur (i.e. the monitor G-sync capability survives booting in and out of macOS) if Nvidia web drivers are not enabled in macOS. Obviously this isn't a satisfactory solution, but at least it narrows the problem down to something to do with the macOS Nvidia web driver.
  10. creepyTowel

    Strange problem with gsync

    FWIW my experience is the same as yours, with an Acer XB321HK G-Sync display. I was blaming the display till I read your post!
  11. creepyTowel

    Acer XB321HK 4k display, HiDPI and G-Sync

    Hi there, does anyone have any experience with the above monitor in Sierra? I expected to see the HiDPI option available in the Displays preference pane, but it's not. Might it be something to do with my machine profile (iMac17,1)? Also, as this monitor has G-Sync built in, that option is now...
  12. creepyTowel

    Hackintosh doesn't shutdown

    What worked for me was changing the BIOS setting "ErP" to enabled. That way, whatever is forcing the restart cannot source enough power to make it happen, so the system stays shut down. Peter.
  13. creepyTowel

    What's the mini-DisplayPort "in" for on the z170x Designare?

    Thanks Wildwillow -- but given that mini-DisplayPort 1.2's maximum supported resolution (according to Wikipedia) is 4096x2160, I suppose that means we're restricted to that...
  14. creepyTowel

    What's the mini-DisplayPort "in" for on the z170x Designare?

    How embarrassing. I missed that in the manual. Thanks Fl0r!an!
  15. creepyTowel

    What's the mini-DisplayPort "in" for on the z170x Designare?

    Gigabyte's z170x Designare board features a mini-DisplayPort "in" socket, about which there seem to be sketchy details at best. Gigabyte describes it as "for future expansion", and review sites say it's "uncertain" what it's for. Could it, tantalisingly, be for accepting the output from a PCIe...