AMD + nVidia in a Mac Pro 5,1?

Forum rules
Before you post read how to get help. Topics in this forum are automatically closed 6 months after creation.
lazarus_long
Level 3
Posts: 107
Joined: Sat Feb 08, 2020 10:13 pm

AMD + nVidia in a Mac Pro 5,1?

Post by lazarus_long »

Right now I have an Apple edition AMD 5770 video card in my Mac Pro running Mint 20.3. I'm interested in the nVidia CUDA libraries for doing machine learning, so I am thinking about installing an nVidia card.

I intend to keep the 5770 for video; I need a Mac card so that I can dual boot between macOS High Sierra and Linux. I'm not interested in driving a monitor with the nVidia card. I don't play games on it, so the 5770 is good enough for video.

The power leads for the video card are on the motherboard, not directly from the PSU. I think there is a power limit of 150 W, so I don't need the latest and greatest monster card. Just something that is good enough to let me experiment.

I don't care that there aren't any macOS drivers for it, as long as it doesn't interfere with the ability to boot.

As near as I can tell, an AMD card can coexist with an nVidia card when running Linux, but I'm not sure. I had a failed attempt to get an RX580 working: with both the 5770 and RX580 installed, the RX580 never worked, and macOS kernel panicked with both installed.

Any idea if this is possible? I guess I am looking for a lower-end, power-efficient card that can do CUDA. Suggestions?
Last edited by LockBot on Fri Sep 29, 2023 10:00 pm, edited 1 time in total.
Reason: Topic automatically closed 6 months after creation. New replies are no longer allowed.
SMG
Level 25
Posts: 31941
Joined: Sun Jul 26, 2020 6:15 pm
Location: USA

Re: AMD + nVidia in a Mac Pro 5,1?

Post by SMG »

lazarus_long wrote: Wed Mar 29, 2023 9:13 am I don't care that there aren't any macOS drivers for it, as long as it doesn't interfere with the ability to boot.
What do macOS drivers have to do with Linux Mint? You will need to be able to install the Nvidia and CUDA drivers in order to do machine learning. Certain older Nvidia GPUs no longer have supported Nvidia drivers for the LM20 or LM21 releases, so that is something to keep in mind. The Nvidia-340 driver can be used on LM20, but not LM21; it is the oldest driver version which still has support.
lazarus_long wrote: Wed Mar 29, 2023 9:13 am Near as I can tell, an AMD card can coexist with nVidia when running Linux, but not sure.
Yes, they can co-exist on Linux.
lazarus_long wrote: Wed Mar 29, 2023 9:13 am I had a failed attempt to get an RX580 working. With both the 5770 and RX580 installed, the RX580 never worked. macOS kernel panicked with both installed.
Those two GPUs use different drivers. I am not sure why the kernel panicked.

Maybe you will find this Reddit topic helpful: What GPU can i use in my 5.1 Mac Pro without flashing it but with boot screen?

This one may also be helpful: I just got a Mac Pro 5,1 as a gift. It needs a GPU and some ram, but it is the dual CPU model. Still Worth it in 2022? In it, someone mentions The Definitive Classic Mac Pro (2006-2012) Upgrade Guide as a helpful resource.
Image
A woman typing on a laptop with LM20.3 Cinnamon.
lazarus_long
Level 3
Posts: 107
Joined: Sat Feb 08, 2020 10:13 pm

Re: AMD + nVidia in a Mac Pro 5,1?

Post by lazarus_long »

SMG wrote: Thu Mar 30, 2023 10:36 pm
lazarus_long wrote: Wed Mar 29, 2023 9:13 am I don't care that there aren't any macOS drivers for it, as long as it doesn't interfere with the ability to boot.
What do macOS drivers have to do with Linux Mint? You will need to be able to install Nvidia and CUDA drivers in order to be able to do machine learning. Certain older Nvidia GPUs no longer have supported Nvidia drivers for LM20 or LM21 versions so that is something to keep in mind. The Nvidia-340 can be used on LM20 versions, but not LM21. That is the oldest driver version which still has support.
lazarus_long wrote: Wed Mar 29, 2023 9:13 am Near as I can tell, an AMD card can coexist with nVidia when running Linux, but not sure.
Yes, they can co-exist on Linux.
lazarus_long wrote: Wed Mar 29, 2023 9:13 am I had a failed attempt to get an RX580 working. With both the 5770 and RX580 installed, the RX580 never worked. macOS kernel panicked with both installed.
Both those GPU use different drivers. Not sure why the kernel panicked.

Maybe you will find this Reddit topic helpful What GPU can i use in my 5.1 Mac Pro without flashing it but with boot screen?

This one may also be helpful I just got a Mac Pro 5,1 as a gift. It needs a GPU and some ram, but it is the dual CPU model. Still Worth it in 2022? because I see someone mentioning The Definitive Classic Mac Pro (2006-2012) Upgrade Guide as a helpful resource.
Mac drivers have nothing to do with Mint, but my Mac is set up to dual boot between High Sierra and Mint. In order to dual boot, I need a boot screen, which is why I have a Mac HD 5770. The HD 5770 is for video. If I can find a suitable nVidia card, it will have no display attached. I just want it for training models (TensorFlow, PyTorch, etc.). I don't really care how fast it is; I just want to learn how to do it.

I am concerned about a second card coexisting with the HD 5770 because of failed attempts to get an RX580 to work. It absolutely would not work at all in Linux, and would only work under macOS if I removed the HD 5770 (with no boot screen). Allegedly refind+ or OpenCore will give you a boot screen with it, but not with mine. After lots of back and forth with the refind+ developer, he decided that for whatever reason my RX580 does not support GOP, so it will not work at all.

Almost no nVidia card can be flashed to support the Mac, so again, an nVidia card is useless for driving a monitor.
SMG
Level 25
Posts: 31941
Joined: Sun Jul 26, 2020 6:15 pm
Location: USA

Re: AMD + nVidia in a Mac Pro 5,1?

Post by SMG »

lazarus_long wrote: Tue Apr 04, 2023 9:03 pm If I can find a suitable nVidia card, it will have no display attached. I just want it for training models (Tensorflow, Pytorch, etc.). I don't really care how fast it is, I just want to learn how to do it.
You seem to be missing the point that you cannot use an Nvidia GPU for training models if there are no Nvidia drivers loaded. It does not matter whether you have a display attached to the Nvidia GPU. You need to have the Nvidia CUDA drivers installed and loaded for the training models to work.

For the drivers to be able to load, the system must recognize the Nvidia GPU. You will need to figure out how to get your Apple hardware to recognize the Nvidia GPU if you want to do training models with it.

All the information from my previous post still applies.
lazarus_long
Level 3
Posts: 107
Joined: Sat Feb 08, 2020 10:13 pm

Re: AMD + nVidia in a Mac Pro 5,1?

Post by lazarus_long »

SMG wrote: Tue Apr 04, 2023 9:42 pm
lazarus_long wrote: Tue Apr 04, 2023 9:03 pm If I can find a suitable nVidia card, it will have no display attached. I just want it for training models (Tensorflow, Pytorch, etc.). I don't really care how fast it is, I just want to learn how to do it.
You seem to be missing the point that you can not use an Nvidia GPU for training models if there are no Nvidia drivers loaded. It does not matter if you have a display attached to the Nvidia GPU or if you do not. You need to have the Nvidia CUDA drivers installed and loaded for the training models to work.

For the drivers to be able to load, the system must recognize the Nvidia GPU. You will need to figure out how to get your Apple hardware to recognize the Nvidia GPU if you want to do training models with it.

All the information from my previous post still applies.
I understand that.

My desires for operating under macOS: I don't care if the card isn't recognized and has no drivers. All I want is for it not to interfere with the HD5770 or cause macOS to kernel panic.

My desires under Linux: for the card to be recognized, for the kernel to load the appropriate driver, and for it not to interfere with the HD5770. Yes, I understand I need to install drivers to get the CUDA libraries. That's all I want out of it. I've given up on Linux being able to drive a monitor attached to a non-Apple-native card, since it couldn't do it with my allegedly supported RX580. The RX580 driver loaded, but refused to output any video.

It also looks like I am limited to a 1000-series card to keep it below the 150 W threshold. Anything higher requires modifying the power supply so the card can plug into it directly; video cards draw their power from the motherboard in the Mac Pro. Is the 1000 series too old to be of any use?
SMG
Level 25
Posts: 31941
Joined: Sun Jul 26, 2020 6:15 pm
Location: USA

Re: AMD + nVidia in a Mac Pro 5,1?

Post by SMG »

lazarus_long wrote: Wed Apr 05, 2023 9:02 am It also looks like I am limited to a 1000 series card to keep it below the 150W threshold. Any higher requires modifying the power supply to plug directly into it. Video cards draw their power from the motherboard in the MacPro. Is this too old to be of any use?
Checking the Nvidia drivers page, the following GPUs are supported with the latest Nvidia-525 drivers:
GeForce 10 Series:
GeForce GTX 1080 Ti, GeForce GTX 1080, GeForce GTX 1070 Ti, GeForce GTX 1070, GeForce GTX 1060, GeForce GTX 1050 Ti, GeForce GTX 1050, GeForce GT 1030, GeForce GT 1010

GeForce 10 Series (Notebooks):
GeForce GTX 1080, GeForce GTX 1070, GeForce GTX 1060, GeForce GTX 1050 Ti, GeForce GTX 1050
I do not know anything about what your hardware can or can not use to know if any of those apply, but hopefully that info helps you.

Also, thanks for the clarification that you have separate expectations for what will work on macOS versus what will work on Linux Mint. That clears up my confusion about what you were saying. :)
lazarus_long
Level 3
Posts: 107
Joined: Sat Feb 08, 2020 10:13 pm

Re: AMD + nVidia in a Mac Pro 5,1?

Post by lazarus_long »

I found a 1060, still waiting for it. Hopefully it works!
lazarus_long
Level 3
Posts: 107
Joined: Sat Feb 08, 2020 10:13 pm

Re: AMD + nVidia in a Mac Pro 5,1?

Post by lazarus_long »

An update: along with the 1060, I also bought an ancient ATI 2600 XT. Why? It is a Mac-supported single-slot card and requires no external power. I plugged it into slot 4 (x4). The 1060 hasn't arrived yet, so I still have the HD 5770 in slot 1 (x16).

The good news: it works in both macOS and Linux. I have a KVM that I am using to share one monitor between both cards. The machine thinks I have two monitors since I have two video cards. I set it to mirror displays, so as I switch back and forth between cards using the KVM, I see the same desktop on both.

TODO:

I am using refind+ for my bootloader. I was hoping the boot screen would output from both cards, but it doesn't; it appears only on the 5770 in slot 1. Is there a way to make it display the boot screen on both cards? If not, can I make it display on the 2600 XT instead of the 5770?

According to the refind+ developer, if the video card supports GOP, the refind+ boot screen should appear even on a non-mac card. I'm not counting on that working with the 1060 though. Hence the 2600XT.
SMG
Level 25
Posts: 31941
Joined: Sun Jul 26, 2020 6:15 pm
Location: USA

Re: AMD + nVidia in a Mac Pro 5,1?

Post by SMG »

lazarus_long wrote: Sat Apr 15, 2023 11:09 am The machine thinks I have 2 monitors since I have 2 video cards.
I suspect that might relate to how the hardware inside the computer is wired.
lazarus_long wrote: Sat Apr 15, 2023 11:09 am According to the refind+ developer, if the video card supports GOP, the refind+ boot screen should appear even on a non-mac card. I'm not counting on that working with the 1060 though. Hence the 2600XT.
It's my understanding that on a PC the firmware (BIOS/UEFI) determines on which connection (which GPU port) the boot menu shows. I am not familiar enough with rEFInd, or with how it uses the firmware to decide where to display the boot screen, to be able to answer your question.
lazarus_long
Level 3
Posts: 107
Joined: Sat Feb 08, 2020 10:13 pm

Re: AMD + nVidia in a Mac Pro 5,1?

Post by lazarus_long »

Progress!!

The 1060 arrived and is installed. I installed the macOS web drivers.

On macOS:
No boot screen from the 1060 (no surprise there). refind+ DOES display the boot screen on the 2600 XT, and once booted up, both the 1060 and the 2600 XT work. Awesome!

On Mint:
The nouveau driver bundled with the distribution works, and Linux outputs to both the 1060 and the 2600 XT. More awesome!

Now the confusion:

I went to nVidia's web page and downloaded the most recent Linux driver: NVIDIA-Linux-x86_64-525.105.17.run
When I ran it, I got the message "You appear to be running an X server; please exit X before installing."

Googling turned up various service names to stop with systemctl stop.

More googling brought me to an nVidia forum that essentially said "never install the drivers using that". They said to use apt-get instead.

So I guess the next questions are:
1) Is it recommended to install the web drivers?
2) Is the web driver required to get CUDA?
3) What is the correct procedure?

I'm running Mint 20.3, Cinnamon desktop.

inxi -G reports the following:

Graphics:
Device-1: AMD RV630 XT [Radeon HD 2600 XT] driver: radeon v: kernel
Device-2: NVIDIA GP106 [GeForce GTX 1060 6GB] driver: nouveau v: kernel
Display: server: X.Org 1.20.13 driver: ati,modesetting,radeon
unloaded: fbdev,vesa resolution: 1920x1080~60Hz, 1920x1080~60Hz
OpenGL: renderer: AMD RV630 (DRM 2.50.0 / 5.4.0-97-generic LLVM 12.0.0)
v: 3.3 Mesa 21.2.6

Thanks!
SMG
Level 25
Posts: 31941
Joined: Sun Jul 26, 2020 6:15 pm
Location: USA

Re: AMD + nVidia in a Mac Pro 5,1?

Post by SMG »

lazarus_long wrote: Sat Apr 15, 2023 4:08 pm Now the confusion:

I went to nVidia's web page and downloaded the most recent linux driver: NVIDIA-Linux-x86_64-525.105.17.run
When I ran it, I got the message "You appear to be running an X server; please exit X before installing."
You have Linux Mint installed so you should be using the Linux Mint tools. Nvidia says right on their website that one should be using the driver from their distribution rather than the one from their website. You can open Driver Manager and click one button to install a proprietary Nvidia driver.

Yes, you need to have an Nvidia driver installed for CUDA to work. However, the driver is included with the CUDA packages. Did you try to use the CUDA packages from the Nvidia website? I don't know what you want to run, so I don't know what versions of CUDA you might need. That would probably determine whether you can use CUDA from the LM20 repos or whether you need to install it from Nvidia's website.
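For the record, the repo-based route can also be done from a terminal instead of Driver Manager. This is only a sketch, not something verified on a Mac Pro: `pick_recommended` is a hypothetical helper that parses `ubuntu-drivers devices`-style output, and the actual package name will be whatever Driver Manager shows for your card.

```shell
# Sketch: ask Mint/Ubuntu's driver tool which Nvidia driver it recommends,
# then install that package from the repos.
pick_recommended() {
    # Reads `ubuntu-drivers devices`-style text on stdin and prints the
    # package name from the first line flagged "recommended".
    awk '$1 == "driver" && /recommended/ { print $3; exit }'
}

# On a real system (requires the ubuntu-drivers-common package):
#   drv=$(ubuntu-drivers devices | pick_recommended)
#   sudo apt update && sudo apt install "$drv"
#   sudo reboot    # then verify the driver loaded with: nvidia-smi
```

The point of parsing for "recommended" is that the tool usually lists several driver series for one GPU, and only one of them is the maintained choice for that card.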
lazarus_long
Level 3
Posts: 107
Joined: Sat Feb 08, 2020 10:13 pm

Re: AMD + nVidia in a Mac Pro 5,1?

Post by lazarus_long »

I guess I have to concede failure. The 1060 doesn't work right under Linux. The default nouveau driver loaded and I got video, but there was no acceleration. Everything felt unresponsive, choppy, and slow. It couldn't even play back video (mkv format).

I switched to the 525 driver; that didn't work at all - no video. I tried running an nVidia CUDA environment, but the Docker image wouldn't start; it could not find an environment.

I was able to install the web drivers under macOS, and it displayed video, but I didn't play with it long enough to tell whether it was working as it should, as I am primarily interested in Linux.

Kind of a bummer, but so it goes.
roblm
Level 15
Posts: 5939
Joined: Sun Feb 24, 2013 2:41 pm

Re: AMD + nVidia in a Mac Pro 5,1?

Post by roblm »

Usually if you want to use the Nvidia driver, then the Nvidia card must be in PCIe slot 1, or a special configuration file must be used, but I don't know if this is true for Apple motherboards.
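On non-Apple hardware, that "special configuration file" is usually an Xorg snippet that pins the Nvidia driver to the card's PCI bus ID. A hypothetical sketch; the BusID value is a placeholder (the real one comes from `inxi -Gx` or `lspci`, and Xorg wants it in decimal):

```text
# Hypothetical /etc/X11/xorg.conf.d/10-nvidia-busid.conf
Section "Device"
    Identifier "NvidiaCard"
    Driver     "nvidia"
    BusID      "PCI:5:0:0"    # placeholder - use your card's actual bus ID
EndSection
```

Whether Xorg on a Mac Pro honors this the same way is exactly the open question here.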
lazarus_long
Level 3
Posts: 107
Joined: Sat Feb 08, 2020 10:13 pm

Re: AMD + nVidia in a Mac Pro 5,1?

Post by lazarus_long »

roblm wrote: Tue Apr 25, 2023 8:21 am Usually if you want to use the Nvidia driver, then the Nvidia card must be in PCIe slot 1, or a special configuration file must be used, but I don't know if this is true for Apple motherboards.
The Mac Pro has 4 slots:
1: 16x, double wide
2: 16x
3: 4x
4: 4x

It was plugged into slot 1, with the ATI 2600XT in slot 4. I've since gone back to the 5770 in slot 1.
roblm
Level 15
Posts: 5939
Joined: Sun Feb 24, 2013 2:41 pm

Re: AMD + nVidia in a Mac Pro 5,1?

Post by roblm »

To test using a configuration file, I need the bus-ID for the AMD GPU. You previously posted the output of inxi -G but the inxi -Gx output is needed.
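One wrinkle once that output is posted: inxi reports the bus ID in hex (e.g. 07:00.0 in the earlier inxi -G output), while Xorg configuration files want it in decimal as PCI:bus:device:function. A small sketch of the conversion (the function name is mine):

```shell
# Convert an inxi/lspci-style PCI bus ID (hex, "bus:device.function")
# into the decimal "PCI:bus:device:function" form Xorg expects.
busid_to_xorg() {
    bus=${1%%:*}          # "07" from "07:00.0"
    rest=${1#*:}
    dev=${rest%%.*}       # "00"
    fn=${rest#*.}         # "0"
    printf 'PCI:%d:%d:%d\n' "$((0x$bus))" "$((0x$dev))" "$((0x$fn))"
}

busid_to_xorg 07:00.0     # -> PCI:7:0:0
```

The hex-to-decimal step only matters once the bus number passes 9 (e.g. 0a:00.0 becomes PCI:10:0:0), which is a classic source of broken BusID lines.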
lazarus_long
Level 3
Posts: 107
Joined: Sat Feb 08, 2020 10:13 pm

Re: AMD + nVidia in a Mac Pro 5,1?

Post by lazarus_long »

Another stupid question:

I am sharing one monitor with 2 video cards and using a KVM to switch back and forth. Unfortunately, Mint thinks there is a monitor connected to both (including the card that is switched out). I have been mirroring so that stuff doesn't get 'lost' on the missing monitor.

In this situation, is performance limited to the weaker of the 2 cards, or will the card that is switched in be used to its full potential?

Alternatively, is there a setting where I can tell Mint "card X has no monitor" or "disable output to card X"?

I'm wondering if this might be contributing to the driver problems.
SMG
Level 25
Posts: 31941
Joined: Sun Jul 26, 2020 6:15 pm
Location: USA

Re: AMD + nVidia in a Mac Pro 5,1?

Post by SMG »

lazarus_long wrote: Wed Jun 14, 2023 8:17 am I am sharing one monitor with 2 video cards and using a KVM to switch back and forth. Unfortunately, Mint thinks there is a monitor connected to both (including the card that is switched out). I have been mirroring so that stuff doesn't get 'lost' on the missing monitor.
I do not understand what you mean by "switching".

Is this your setup?

Code: Select all

           GPU --- cable --
        /                  \
Computer                  KVM -- Monitor
       \                   /
         GPU --- cable --
lazarus_long wrote: Wed Jun 14, 2023 8:17 am In this situation, is performance limited to the weaker of the 2 cards, or will the card that is switched in be used to its full potential?
What does the output of

Code: Select all

inxi -Gxxx
look like?
lazarus_long wrote: Wed Jun 14, 2023 8:17 am Alternatively, is there a setting where I can tell Mint "card X has no monitor" or "disable output to card X"
If the setup I posted above is correct, then just don't attach the GPU to the KVM.
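If detaching it from the KVM is not an option, one thing worth experimenting with (untested on Apple hardware; DVI-1 is a placeholder output name, check `xrandr --query` for the real one) is an Xorg snippet telling X to ignore that connector entirely:

```text
# Hypothetical /etc/X11/xorg.conf.d/20-ignore-output.conf
# Binds a Monitor section with Option "Ignore" to one connector of the card.
Section "Device"
    Identifier "SecondGPU"
    Driver     "radeon"                  # or "nvidia"/"nouveau" as appropriate
    Option     "Monitor-DVI-1" "IgnoredMonitor"
EndSection

Section "Monitor"
    Identifier "IgnoredMonitor"
    Option     "Ignore" "true"
EndSection
```

A non-persistent alternative after login is `xrandr --output DVI-1 --off`.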
lazarus_long
Level 3
Posts: 107
Joined: Sat Feb 08, 2020 10:13 pm

Re: AMD + nVidia in a Mac Pro 5,1?

Post by lazarus_long »

SMG wrote: Wed Jun 14, 2023 6:41 pm
lazarus_long wrote: Wed Jun 14, 2023 8:17 am I am sharing one monitor with 2 video cards and using a KVM to switch back and forth. Unfortunately, Mint thinks there is a monitor connected to both (including the card that is switched out). I have been mirroring so that stuff doesn't get 'lost' on the missing monitor.
I do not understand what you mean by "switching".
lazarus_long wrote: Wed Jun 14, 2023 8:17 am In this situation, is performance limited to the weaker of the 2 cards, or will the card that is switched in be used to its full potential?
What does the output of

Code: Select all

inxi -Gxxx
look like?
lazarus_long wrote: Wed Jun 14, 2023 8:17 am Alternatively, is there a setting where I can tell Mint "card X has no monitor" or "disable output to card X"
If the setup I posted above is correct, then just don't attach the GPU to the KVM.
Yes, your diagram is what I have. The reason for the KVM is so I don't have to crawl around behind the Mac every time I need the boot screen. Plus, I am sharing the monitor with a couple of other Macs. The KVM has 4 ports; the Mac Pro is using 2 of them.

Plus I've had issues with Mint where if a GPU does not have a monitor plugged in when Mint boots, it will never display output if I later plug one in. It needs to always be plugged in. So switching cables from one GPU to the other after booting won't work.

Does mirroring reduce performance to that of the weakest GPU?

Right now both GPUs are Mac native (but weak) cards, but if I can get the driver issue sorted out I would still prefer to have a better GPU for normal use and keep the 2600xt for the boot screen.

Code: Select all

Graphics:  Device-1: Advanced Micro Devices [AMD/ATI] RV630 XT [Radeon HD 2600 XT] vendor: Apple driver: radeon v: kernel 
           bus ID: 07:00.0 chip ID: 1002:9588 
           Device-2: Advanced Micro Devices [AMD/ATI] Juniper XT [Radeon HD 5770] vendor: Apple MacPro5 1 driver: radeon 
           v: kernel bus ID: 08:00.0 chip ID: 1002:68b8 
           Display: server: X.Org 1.20.13 driver: ati,radeon unloaded: fbdev,modesetting,vesa 
           resolution: 1920x1080~60Hz, 1920x1080~60Hz 
           OpenGL: renderer: AMD JUNIPER (DRM 2.50.0 / 5.4.0-97-generic LLVM 12.0.0) v: 3.3 Mesa 21.2.6 compat-v: 3.1 
           direct render: Yes 
Last edited by SMG on Thu Jun 15, 2023 10:03 am, edited 1 time in total.
Reason: Changed c tags to code tags. Code tags retain the formatting of multi-line terminal output.
SMG
Level 25
Posts: 31941
Joined: Sun Jul 26, 2020 6:15 pm
Location: USA

Re: AMD + nVidia in a Mac Pro 5,1?

Post by SMG »

lazarus_long wrote: Wed Jun 14, 2023 11:25 pm yes, your diagram is what I have. The reason for the KVM is so I don't have to crawl around behind the mac every time I need the boot screen.
The boot screen is a function of the hardware/UEFI and not the operating system.
lazarus_long wrote: Wed Jun 14, 2023 11:25 pm Plus I've had issues with Mint where if a GPU does not have a monitor plugged in when Mint boots, it will never display output if I later plug one in. It needs to always be plugged in.
Linux Mint has hot-plug capability. Normally, one does not have to have the monitor plugged in to use it. However, that is with a direct connection and not one to a KVM. (Not all KVMs even work on Linux-based distros.)

I do not know the requirements of Apple hardware so there may be something special about it that it will not register properly to be able to do hot-plugs.
lazarus_long wrote: Wed Jun 14, 2023 11:25 pm Does mirroring reduce performance to that of the weakest GPU?
What monitor connection do you have set up as the primary? The operating system will pick a primary if you do not select one, but you have the option to select it. The operating system does not select based on GPU qualities. (You are vastly overrating the capabilities of the operating system if you think it determines which one is weaker. :P ) Also check to see which GPU is considered Device-1 in the output. That is usually set as the primary renderer, but often that can be changed.
lazarus_long wrote: Wed Jun 14, 2023 11:25 pm Right now both GPUs are Mac native (but weak) cards, but if I can get the driver issue sorted out I would still prefer to have a better GPU for normal use and keep the 2600xt for the boot screen.
You have a very specific, non-standard use case so experimentation will be needed to determine what works.
lazarus_long
Level 3
Posts: 107
Joined: Sat Feb 08, 2020 10:13 pm

Re: AMD + nVidia in a Mac Pro 5,1?

Post by lazarus_long »

SMG wrote: Thu Jun 15, 2023 3:14 pm
lazarus_long wrote: Wed Jun 14, 2023 11:25 pm yes, your diagram is what I have. The reason for the KVM is so I don't have to crawl around behind the mac every time I need the boot screen.
The boot screen is a function of the hardware/UEFI and not the operating system.
Obviously. The reason a special "Mac edition" video card is required for a boot screen is that the Mac Pro is not a complete UEFI implementation; the Mac Pro predates UEFI. As I've said several times already, the reason for the 2600xt is so that I can switch it in when I need a boot screen to choose which OS to boot. The 2600xt and HD 5770 WILL display a boot screen; NEITHER the RX580 nor the 1060 will. The goal is to switch the KVM to the 2600xt, reboot, choose the OS in refind+ and boot, then switch the KVM to the fast GPU.
SMG wrote: Thu Jun 15, 2023 3:14 pm
lazarus_long wrote: Wed Jun 14, 2023 11:25 pm Plus I've had issues with Mint where if a GPU does not have a monitor plugged in when Mint boots, it will never display output if I later plug one in. It needs to always be plugged in.
Linux Mint has hot-plug capability. Normally, one does not have to have the monitor plugged in to use it. However, that is with a direct connection and not one to a KVM. (Not all KVMs even work on Linux-based distros.)

I do not know the requirements of Apple hardware so there may be something special about it that it will not register properly to be able to do hot-plugs.
Possibly related to the Mac not being 100% UEFI compatible. Whatever the reason, monitors are NOT hot-pluggable on my Mac. Manually moving cables after a reboot isn't an option. I did try directly connecting the monitor to the GPU to eliminate the possibility that the problem is the KVM; it made no difference.

Which KVMs only work on some operating systems? All the ones I've seen are just hardware devices that switch ports in and out - there is no software involved, and nothing needs to be installed on the machine. Mine is an IOGear. It will switch any device that has DVI and USB-A ports.

Possibly the KVM adds an electrical load to the GPU's DVI port so that it thinks a monitor is connected, but the KVM only passes through the monitor signal. If there is no monitor switched in, there is no signal to pass through. Linux should not be detecting a monitor when it's switched out.

Therefore my question: is there a way to tell Linux that a specific GPU does not have a monitor attached?
SMG wrote: Thu Jun 15, 2023 3:14 pm
lazarus_long wrote: Wed Jun 14, 2023 11:25 pm Does mirroring reduce performance to that of the weakest GPU?
What monitor connection do you have set up as the primary? The operating system will pick a primary if you do not select one, but you have the option to select it. The operating system does not select based on GPU qualities. (You are vastly overrating the capabilities of the operating system if you think it determines which one is weaker. :P ) Also check to see which GPU is considered Device-1 in the output. That is usually set as the primary renderer, but often that can be changed.
1) When mirroring there is no 'primary'.
2) I know that Linux does not intentionally detect which GPU is weakest and throttle the rest.
3) I used to be a Windows kernel driver developer (but not video drivers). Windows drivers are packet driven: they receive an I/O request packet (IRP) defining the task to do; the driver does it, then completes the packet back to the kernel. Since this is asynchronous, it doesn't hold up user-mode code. I am assuming Linux video works similarly, so that with mirrored displays the following happens when the GPUs need to render an image:

Some sort of packet is created containing the data needed to render the image. The same packet is sent to both GPUs. Both work on the packet and render the same image. When finished, both inform the kernel that the task is complete.

I see 2 possible scenarios in which 2 mirrored GPUs could operate:
a) The kernel is satisfied the task is complete when either GPU acknowledges completion. The kernel will immediately send the next packet to both; the fast GPU never has to wait on the slow GPU, whose late completion acknowledgement is ignored.

b) The kernel can't send the next packet until both GPUs acknowledge completion. Video would therefore be limited to the slowest GPU - not because the kernel is deliberately looking for the slowest, but because it can't send another packet until both complete.

I suspect b), given the abysmal performance when I had the 1060 with the (crap) nouveau driver plus the 2600xt. The system was so slow it was nearly unusable, regardless of which GPU was switched in. Right now, the HD5770 + 2600xt combination is far faster.

So again, the question: does mirroring limit performance to the weakest card?
SMG wrote: Thu Jun 15, 2023 3:14 pm
lazarus_long wrote: Wed Jun 14, 2023 11:25 pm Right now both GPUs are Mac native (but weak) cards, but if I can get the driver issue sorted out I would still prefer to have a better GPU for normal use and keep the 2600xt for the boot screen.
You have a very specific, non-standard use case so experimentation will be needed to determine what works.
Which is why I've been asking specific questions.
Locked