by freem » 08 Jun 2020, 18:31
The only thing I know on the topic is that the OpenGL implementation that comes with Intel integrated GPUs generates a lot of noise in valgrind (a runtime tool that lets you analyze a binary for various problems, including memory leaks). That noise makes it hard to identify real problems, and it "magically" disappears when using even old NVidia hardware, with both the non-free official driver and the nouveau implementation. It also disappears when using software rendering. Performance under valgrind is higher with the NVidia GPU than with the integrated one, but without valgrind, the integrated GPU is faster than the added GPU.
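If someone wants to try this themselves, the usual way to hide driver noise in valgrind without changing hardware is a suppressions file. This is only a sketch: the suppression name is made up, and the `obj:` pattern assumes the classic Mesa i965 driver path, which may differ on other setups.

```
{
   intel-dri-driver-noise
   Memcheck:Leak
   match-leak-kinds: all
   ...
   obj:*/dri/i965_dri.so
}
```

You would then pass it with `valgrind --suppressions=that-file.supp ./the-game`; valgrind's `--gen-suppressions=all` option can generate entries matching the actual noise you see.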
I would need to check again, but I think that under valgrind I got similar or close performance with the software implementation and with the integrated GPU. The game I ran those tests on was RedEclipse 1.6.
All those tests were done on Debian 10 "buster"; the dedicated GPU is a GeForce 8400GS, and the CPU (and thus the integrated GPU) is an "Intel(R) Pentium(R) Gold G5500 CPU @ 3.80GHz".
The testing with the NVidia GPU was done through bumblebee, which added some noticeable latency in the game even when bumblebee was not in use.
Valgrind is free software that, if I am not mistaken, translates the program's machine code into its own RISC-like intermediate representation, instruments it, and runs the result, a bit like emulating a CPU. That explains the really low performance, but it is reputed for its accuracy.
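For reference, here is roughly how I combined the pieces above; this is a sketch, not my exact command line, and "./redeclipse" stands in for whatever the game binary is actually called. `LIBGL_ALWAYS_SOFTWARE=1` is Mesa's switch for forcing the software renderer instead of the hardware driver.

```
# Force Mesa's software renderer (llvmpipe) instead of the Intel driver,
# then run the game under valgrind's default memcheck tool:
LIBGL_ALWAYS_SOFTWARE=1 valgrind --leak-check=full --log-file=vg.log ./redeclipse
```

Expect the game to run an order of magnitude slower or worse under valgrind; the leak report lands in vg.log.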
I hope this information helps.