View Full Version : Guide to Switch between VGA cards on hybrid system

9th November 2017, 02:43 PM
Acknowledgment & thanks: First of all, I would like to thank "Alex Deucher" on the Freedesktop.org bug tracker, who helped me solve this issue. I would also like to draw readers' attention to the fact that much of the content in this guide comes from his explanations! In addition, the link to the ArchWiki page in post 2 of this thread was given to me by him .....


The Guide:

Aim of this guide: this guide deals with the scenario of hybrid systems with a secondary (dedicated) VGA card.

General Idea: In this scenario, the driver does dynamic switching. The hardware is the same. There are no displays connected to the dGPU, so it can only be used for offscreen rendering (e.g., render offload), where the frame is rendered by the dGPU and then copied to the integrated chip for display.

The integrated GPU always handles the displays. To verify this, run the following command:

$ glxinfo | grep "OpenGL renderer"
The output should show that the integrated GPU is used by default, like this:

OpenGL renderer string: Mesa DRI Intel(R) HD Graphics 5500 (Broadwell GT2)
To check your dedicated VGA card, run the following command:

$ DRI_PRIME=1 glxinfo | grep "OpenGL renderer"
its output – if a dedicated VGA card is supported on your system – should be something like this:

OpenGL renderer string: AMD OLAND (DRM 2.50.0 / 4.13.11-200.fc26.x86_64, LLVM 4.0.1)
N.B: to list all VGA cards on the system, run:

$ xrandr --listproviders
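As a sketch of what to expect from that command, the sample output below can even be parsed in the shell. This is an illustration, not real output: count_providers is a throwaway helper name of my own, and the ids and names (0x7d/Intel, 0x56/radeon) simply mirror the examples used later in this guide – real output depends on your hardware.

```shell
# count_providers: count "Provider N:" lines in `xrandr --listproviders` output.
count_providers() {
  grep -c '^Provider '
}

# Illustrative sample output (ids/names taken from the examples in this guide).
sample='Providers: number : 2
Provider 0: id: 0x7d cap: 0xb, Source Output, Sink Output, Sink Offload crtcs: 3 outputs: 4 associated providers: 1 name:Intel
Provider 1: id: 0x56 cap: 0xf, Source Output, Sink Output, Source Offload, Sink Offload crtcs: 2 outputs: 0 associated providers: 1 name:radeon'

printf '%s\n' "$sample" | count_providers   # prints 2
```

On a real system you would pipe the live output instead: xrandr --listproviders | count_providers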
On Linux, the user can choose which GPU to use per application.
The user needs to select the dGPU for render offload; otherwise, it is kept powered down to save power.
To actively use the AMD GPU, you need to select it for render offload using xrandr.

PRIME GPU offloading:
GPU-intensive applications should be rendered on the more powerful discrete card. The command:

$ xrandr --setprovideroffloadsink provider sink
can be used to make a render offload provider (provider) send its output to the sink provider (sink, i.e. the provider which has a display connected). The provider and sink identifiers can be numeric (0x7d, 0x56) or a case-sensitive name (Intel, radeon).

N.B: This setting is no longer necessary when using the default intel/modesetting driver from the official repos, as they have DRI3 enabled by default and will therefore automatically make these assignments. Explicitly setting them again does no harm, though.

N.B: GPU offloading is not supported by the closed-source drivers. To get PRIME to work you have to use the discrete card as the primary GPU. For GPU switching …..


$ xrandr --setprovideroffloadsink radeon Intel
You may also use the provider index instead of the provider name:

$ xrandr --setprovideroffloadsink 1 0
Now, you can use your discrete card for the applications that need it most (for example games, 3D modellers, ...) by prepending the DRI_PRIME=1 environment variable, like this:

$ env DRI_PRIME=1 foo
where foo is the name of the targeted application.

Other applications will still use the less power-hungry integrated card.
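The per-application pattern above can be wrapped in a tiny shell function for convenience. This is a sketch: run_on_dgpu is a hypothetical name of my own, not a standard tool, and the example command only demonstrates that the variable reaches the child process.

```shell
# run_on_dgpu: launch any command with the discrete GPU selected for
# render offload (equivalent to prepending `env DRI_PRIME=1`).
run_on_dgpu() {
  DRI_PRIME=1 "$@"
}

# Placeholder command; in real use this would be e.g. `run_on_dgpu foo`.
run_on_dgpu sh -c 'echo "DRI_PRIME=$DRI_PRIME"'   # prints: DRI_PRIME=1
```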

These settings, which are created by using "$ env DRI_PRIME=1 foo", are lost once the X server restarts. To restore the original setting of application foo before restarting the X server, the user needs to run:

$ env DRI_PRIME=0 foo
For convenience, the user may:
1) set DRI_PRIME=1 in the script that starts the game, or
2) set DRI_PRIME=1 in the launchers (just at the beginning of the command). For example:
gscan2pdf should be changed to env DRI_PRIME=1 gscan2pdf
firefox %u should be changed to env DRI_PRIME=1 firefox %u
gimp-2.8 %U should be changed to env DRI_PRIME=1 gimp-2.8 %U
3) alternatively, the user can put it in /etc/X11/xinit/xinitrc.d/. This may reduce your battery life and increase heat, though.
After applying any of the above 3 approaches, the user can still run the modified application on the integrated card by:

$ env DRI_PRIME=0 foo
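Approach 2 above (editing a launcher's command) can also be scripted with sed. The sketch below works on a temporary stand-in file rather than a real launcher; on an actual system you would copy the .desktop file to ~/.local/share/applications/ first and edit the copy, so system files stay untouched.

```shell
# Demonstrate rewriting a launcher's Exec= line to prepend the variable,
# as in the firefox example above. A temp file stands in for the launcher.
desktop=$(mktemp)
printf 'Exec=firefox %%u\n' > "$desktop"

# Prepend `env DRI_PRIME=1 ` to the Exec= command.
sed -i 's|^Exec=|Exec=env DRI_PRIME=1 |' "$desktop"

result=$(cat "$desktop")
echo "$result"   # prints: Exec=env DRI_PRIME=1 firefox %u
rm -f "$desktop"
```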

Final Notes:
1) just to test an application, like foo, with the primary VGA card, without modifying its settings, run:

$ DRI_PRIME=0 foo -info | grep "GL_VENDOR"
2) just to test an application, like foo, with the secondary VGA card, without modifying its settings, run:

$ DRI_PRIME=1 foo -info | grep "GL_VENDOR"
3) to test an application, like foo, with its default settings (DRI_PRIME left unset), run:

$ foo -info | grep "GL_VENDOR"
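The checks above can also be combined into one loop. This is a sketch: for_each_gpu is my own helper name, and the echo is only a placeholder – on a live X session the command would be something like glxinfo | grep "OpenGL renderer" instead.

```shell
# for_each_gpu: run the same command once with DRI_PRIME=0 (integrated)
# and once with DRI_PRIME=1 (discrete), labelling each run.
for_each_gpu() {
  for i in 0 1; do
    printf 'GPU %s: ' "$i"
    DRI_PRIME="$i" "$@"
  done
}

# Placeholder command; on a live X session use glxinfo instead.
for_each_gpu sh -c 'echo "query ran with DRI_PRIME=$DRI_PRIME"'
```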

9th November 2017, 03:04 PM
For more possible scenarios, see this link on ArchWiki:


I converted this page into PDF & attached it here.