Ranking of Linux Distributions from Best to Evil

I think Samba would be available to anyone on your network who can supply the correct username and password. You could limit Samba to certain IPs if you wanted. The shares aren't inherently open to the internet; you'd have to open a connection via some other option like a VPN tunnel.
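If you do want to lock shares down by IP, Samba's `hosts allow`/`hosts deny` parameters handle it. A minimal sketch (the share name, path, and subnet below are made up for illustration):

```ini
# smb.conf fragment: restrict a share to the local subnet plus one host.
# Share name, path, and addresses are hypothetical examples.
[media]
    path = /srv/media
    hosts allow = 192.168.1. 10.0.0.5
    hosts deny = 0.0.0.0/0
```

A trailing dot (`192.168.1.`) matches the whole subnet; `hosts deny` then rejects everything else.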
 
GreenWithEnvy will allow you to adjust the overall power limit; I don't know of anything that will allow you to change the actual power curve.

GreenWithEnvy will allow you to overclock and create custom fan profiles.
 
I got Samba set up by just following a link I googled. Thank you for that link. With minimal tinkering, it was available from both my phone and my Windows computer. So that worked.

Green With Envy didn't work. It complained about some obscure Nvidia X driver thing or other, I don't even remember. Point is, it wouldn't start. I ended up simply finding a command-line solution.

Bash:
sudo nvidia-smi -pl 320      # set the board power limit to 320 W
sudo nvidia-smi -pm ENABLED  # persistence mode (keeps the driver loaded; also needs root)

That set the power limit to 320 W.

Afterwards I wanted a way to just watch the temperature. Again, none of the easy-to-find GUIs for this were simple or just worked, so I found that the command line had the tools necessary. I mocked up a bash one-liner that gives me a status readout of the GPU on a 0.5-second poll interval (cobbled together via some sed googling and my usual "throw regex at everything" approach):

Bash:
watch -n 0.5 'nvidia-smi -q -d power,temperature,clock,memory,utilization |sed "/BAR1/,/Utilization/{d}" | sed "/Max Clocks/,/Max Customer/{d}" | grep -E "((Graphics|SM|Memory)\s+:\s+[0-9]+\s+\w*Hz$)|(Power Draw\s+:\s+[0-9.]+\s+W)|(GPU Current Temp)|((Total|Used|Free)\s+:\s+[0-9]+\s+\w+B)|((Gpu|Memory)\s+:\s+[0-9]+\s*%)"'
Works great, my power limit also worked just fine:
[screenshot attached]
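For anyone wondering what that grep actually keeps, you can feed the same filter a canned fragment of `nvidia-smi -q`-style output. The field names below match the real report format, but the values are invented; `\s` and `\w` in `grep -E` are GNU extensions, so this assumes GNU grep:

```shell
# Canned sample in the style of `nvidia-smi -q` output; values are made up.
sample='    Power Draw                        : 123.45 W
    GPU Current Temp                  : 65 C
    Graphics                          : 1800 MHz
    Some Other Field                  : 42
    Gpu                               : 97 %'

# Same regex as in the watch one-liner above:
kept=$(printf '%s\n' "$sample" | grep -E \
  "((Graphics|SM|Memory)\s+:\s+[0-9]+\s+\w*Hz$)|(Power Draw\s+:\s+[0-9.]+\s+W)|(GPU Current Temp)|((Total|Used|Free)\s+:\s+[0-9]+\s+\w+B)|((Gpu|Memory)\s+:\s+[0-9]+\s*%)")
printf '%s\n' "$kept"
# "Some Other Field" is dropped; the four GPU lines survive.
```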


Nice and peachy, right? So I think to myself, "Maybe Linux gaming has come a long way, let me try playing something." I look up how, get Lutris going (Lutris comes pre-installed on this distro, so that's nice), log into GOG, and download Cyberpunk. I come back to the computer after a lengthy download process to find that... it needed my permission for a .NET thing. So I click Okay.

Then I'm just at some screen with an exit code of 256, and basically no option but to "abort". So I click Abort and tell it to keep the files. I think to myself, "Well, maybe I can just start the install process again and it'll detect the pre-existing files and fill in the blanks." Lol nope. Not only did it fail to do that, it outright deleted what was there. It was like what, 80 freaking gigs of stuff to download, and it just went and deleted all of it? And starting the installer again just started all of the downloads again. And plocate, after a sudo updatedb, couldn't find the executable...

I try googling around, and no one freaking knows why this 256 happens; some dude just goes "lol yeah just redownload the whole thing to a separate directory". Yeah, I'm not sitting around redownloading 80-some-odd gigs. This is why Linux gaming isn't ready outside of the Steam Deck. I haven't tried Steam on here, so Steam games might work fine, since I know my Steam Deck plays stuff without any issues. Maybe I just had bad luck. At least my mission of running Stable Diffusion with a power limit worked out. Great distro for SD.
 
If you're running the GWE Flatpak, you need to make sure you update your Flatpaks so you're running the same Nvidia libraries within the Flatpak itself. You will need to do this whenever there is a driver update.

I recommend Bottles over Lutris. I've got GOG installed and games running just fine. I recommend installing and using wine-ge-proton8-13 under Bottles, although I'm using soda-7.0-9 for GOG.
 
I don't have Cyberpunk to install, but I mostly use Bottles now. I always had some issues with Lutris, probably my own stupidity. I also never had any problems with Heroic Games Launcher for GOG and EA games. I'm on Fedora and use Bottles via Flatpak. Found the following video with Google. YMMV

Cyberpunk 2077 on Linux via Bottles (No GOG Client)


https://www.youtube.com/watch?v=UWHv_Cu2Gg8
 

I don't remember 100%, but I think the instructions for GWE had an update in there for Flatpak. Personally, I think if multiple things need an update, they should just detect that and do it. If Nobara is aiming to be user friendly, they really need to get stuff like that fixed. It's also kind of a deal breaker to a user just coming from Windows when your "user friendly" installer decides that the right course of action is deleting 80 gigs worth of data that it just finished downloading, after it failed to actually use it.

Also I tried Bottles just now. It was a very brief try:
[screenshot attached]


It is indeed taking a while. Actually, more like it's making no progress because it's hung. There's no GUI popup. Also, there's no easy way to kill it mid-installation; I had to do a "ps -ef | grep setup" and kill all of its instances individually.
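pkill can do that in one shot: it matches against the full command line (`-f`) and signals every match, which is the same effect as walking the `ps -ef | grep` output by hand. A sketch using a harmless stand-in process instead of the real stuck installer:

```shell
# Stand-in for the hung installer processes: a background sleep we can match on.
sleep 300 &
victim=$!

pgrep -f 'sleep 300' >/dev/null && echo "matched"

# One command instead of killing each PID individually:
pkill -f 'sleep 300'

wait "$victim" 2>/dev/null || true   # reap it so the PID is really gone
kill -0 "$victim" 2>/dev/null || echo "all gone"
```

For the real case the pattern would be whatever the stuck setup processes have in their command line; `pkill -9 -f` escalates to SIGKILL only if they ignore the default SIGTERM.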


[screenshot attached]

And this is what GWE returned, by the way. From my googling, I think GWE is almost considered abandonware because its maintainer is going with AMD next gen.

Code:
flatpak update
Looking for updates…

Nothing to do.

I remember now: this is what flatpak also returned yesterday when I tried to update it, before launching GWE. No, it didn't help.


Also apparently killing the installer counts as a "success":
[screenshots attached]


Needless to say: no, it does not run. By the way, according to them this program only has "Minor glitches".

So I create another bottle and try to download the GOG exe manually. Oh wait, I can't download it. GOG detects if you're using Linux to access their web page and won't show the download link for GOG Galaxy. So I have to go into the user agent override setting in Firefox and set it to Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:109.0) Gecko/20100101 Firefox/116.0 to make it pretend it's running on Windows. So I got the exe. Great. Now I go back to Bottles, try to run it and... it tells me I need .NET installed to run it. Which .NET? Who the heck knows? I head over into the dependencies settings of Bottles and try to install... well, every single .NET on the list, except most of them hang, freeze, or tell me that they won't work on a 64-bit machine.

Okay. I guess GOG just... isn't going to work.

My last resort is Steam. Steam is downloading Red Dead 2 at the moment. It's kind of old at this point. Let's see how that goes...
 
I have very little patience with installer glitches. I figure if it's a PITA to get the OS installed, it's gonna be a PITA to get it configured. Then using it will surely be the biggest PITA of all.
 

The OS itself was not hard to install at all; Nobara installed easily. The GPU drivers and everything installed right away, and I had absolutely no issues getting it running.

For my intended purpose, which was running Stable Diffusion, it worked wonderfully. It was arguably easier to set up than Windows, and the performance gains were massive in Stable Diffusion, probably because of much less OS overhead.

The problem is that I just couldn't get any of these Lutris/Bottles/etc. tools to actually install anything worth a darn on this system. To be fair, I also only tried Cyberpunk. As far as Steam, I'm not sure yet...

[screenshot attached]


This is taking a while.
 
Flatpaks are sandboxed applications; they run the way you ideally want software to run on a system. For that reason there will be separate Flatpak updates that are not part of system updates. However, most package managers handle such updates globally behind the scenes; possibly yours doesn't, which is more of a distro issue than a Flatpak issue. Furthermore, the issue is not with GWE; it expects to see libraries that haven't been installed yet. The other possibility is that you're running an Nvidia beta driver that isn't supported by Flatpak yet; the dev of GWE recommends you don't run beta Nvidia drivers for this reason.

EDIT: Just saw that you attempted a flatpak update and none were found, therefore you must be running an Nvidia beta channel driver and its libraries aren't supported by Flatpak (due to the fact it's a beta driver).

EDIT 2: You're running Wayland, and the NV-CONTROL X extension (what Nvidia X Server Settings uses) doesn't work under Wayland at this point in time. Switch to X11 and that'll resolve your issues. Wayland under Nvidia is still in development; in fact, Wayland, period, is still in a state of perpetual beta.

The developer of GWE is looking for developers to take over the project, but at this point in time it is not dead. I'm no rocket scientist and it runs fine here:

[screenshot: GWE running]


Likewise, GOG Galaxy installed and ran just fine on default settings. I didn't see any .NET installer, and I didn't have to manually download the installer off the GOG site. The whole user agent string thing is more of a GOG site issue than a Linux-specific issue:

[screenshot: GOG Galaxy running]


Processing Vulkan shaders can take some time depending on the title; it's a very CPU-intensive, multithreaded process, and some titles aren't coded very well for multithreading. In such cases I skip the post-processing and just play the game. Sometimes there's some judder at first, but it disappears as you play. I've noticed the same judder when first launching certain games under Windows too, until all shaders are cached, especially when storing games on a mechanical HDD. You can set Steam to process shaders in the background when you aren't playing games.

Processing Vulkan shaders doesn't happen on every launch of the game, but it will be triggered when you update drivers or certain libraries. Rolling release distros will trigger shader caching more often, by virtue of updating those libraries more often.

As with everything, YMMV. I don't recommend rolling release distros when running Nvidia hardware/drivers.
 
This is why Linux gaming isn't ready outside of the Steam Deck.
No no no, this is why Linux anything isn't ready. There is an entirely insufficient effort to unfuck the software. I've run, and still do run, tons of Linux installs. I've yet to install a Linux distro on any system (and I've done many) that didn't have some kind of problem. Sometimes I could fix it with hours on the command line. Sometimes not. And then sometimes updates break shit that used to work. It's crazy.
Currently using it on my Chromebook (Xubuntu today) and the bloody brightness keys don't work. It's a kernel issue, wtf.

And I'll add that on the Chromebook, if I don't use a specific kernel, the sound either doesn't work or the volume is way too low. For the too-low-volume kernels I can run some command-line code to fix it, but I have to do it every boot. So I can't exactly just use a different kernel that doesn't have whatever security mitigation, bug, or hell, feature that broke my brightness keys.
 
For the record, my Linux systems run as well as any Windows system, so mostly faultless; no OS is without issues. I can't update the graphics drivers on my Windows 10 based laptop, because if I do, the switchable AMD graphics bug out and it takes a full 5 minutes before I see the login screen on boot. Furthermore, I select 'do not update drivers' in the AMD application, and it still asks if I want to update drivers...
 
Just to add, there are distros that have Samba installed by default, and their implementations are actually very user oriented. Shameless KDE Neon plug:

[screenshot: KDE Neon Samba sharing UI]
 
So I initially got some error code FFFFFFFF when trying to launch RDR2. Googling it, there weren't many solutions. The top post said to unplug things because they might be causing a launch issue, but that sounded stupid to me, so I didn't try it initially. All this computer has plugged into it is an extremely barebones laser mouse (literally, if you looked up "barebones laser mouse" this picture would probably come up, it's tiny) and a curved Windows keyboard; the ergonomic kind:
[screenshot attached]


Well, turns out the keyboard was the issue. I unplugged it and then the game started up. I jacked up all of the settings to the max that I could find. I then unfortunately made the mistake of switching the graphics API from Vulkan to DirectX 12, because that's what some random online post said to do. That caused the game to not boot. I just did a plocate on its XML file and switched it back to Vulkan.
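That kind of fix is a one-liner with sed. The element name below is made up for illustration (the real setting lives in the game's settings XML, found via plocate); this just demonstrates the edit on a throwaway copy:

```shell
# Build a throwaway settings file with a hypothetical element name:
cat > /tmp/rdr2_settings_demo.xml <<'EOF'
<settings>
    <graphicsAPI>DirectX12</graphicsAPI>
</settings>
EOF

# Flip the API back to Vulkan in place:
sed -i 's/DirectX12/Vulkan/' /tmp/rdr2_settings_demo.xml

fixed=$(grep graphicsAPI /tmp/rdr2_settings_demo.xml)
printf '%s\n' "$fixed"
rm /tmp/rdr2_settings_demo.xml
```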

Then it actually ran pretty smoothly.


[screenshots attached]

This is just a 60 Hz 1440p screen I had lying around in the room where this computer is, but qualitatively it ran pretty well. I only had small stutters, but they were extremely infrequent and the game felt pretty good. Only thing is (assuming this was a G-Sync/FreeSync monitor, which it isn't), I have literally no idea if G-Sync is on, nor how to even enable it if it isn't. But it's a start. Looks like as long as your game is running through Steam, it might be okay. Everything else is apparently sketchy AF lol.

As far as Bottles, apparently the keyboard wasn't the issue; it still can't install anything worth a crap.
 
Assuming you're running Wayland, run an X11 session as opposed to a Wayland session. Doing so will likely solve your GWE issue, it may also solve other issues you're experiencing. At the OS login screen there should be an option to launch using X11.

The fact that an Nvidia based ISO possibly enables Wayland by default is quite ridiculous.
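If you're not sure which session you actually landed in, the environment already tells you; this just reads the standard variable (falling back to "unknown" if it isn't set, e.g. in an SSH shell):

```shell
# XDG_SESSION_TYPE is set by the login manager: "x11", "wayland", or "tty".
session="${XDG_SESSION_TYPE:-unknown}"
echo "Session type: $session"

# loginctl can report the same thing per session, e.g.:
#   loginctl show-session "$XDG_SESSION_ID" -p Type
```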
 
EDIT 2: You're running Wayland, and the NV-CONTROL X extension (what Nvidia X Server Settings uses) doesn't work under Wayland at this point in time. Switch to X11 and that'll resolve your issues. Wayland under Nvidia is still in development; in fact, Wayland, period, is still in a state of perpetual beta.

I'm just going to point out that this is what Nobara, which is supposed to be about as "easy" an approach into Linux as possible as far as gaming goes... installed for me by default. I didn't choose this, nor do I even know how to. For all I know, switching to an older version may negatively impact my Stable Diffusion times, which is what this computer was mainly made for. Me trying out Linux gaming was kind of just an on-the-side hobby project. And it probably will impact it, considering AI tech is what Nvidia is focusing on right now. I have a feeling that downgrading would do me more harm than good.

Unless the setting can be easily switched back and forth, but I'm kind of skeptical about anything "easy" at this point, except installing a game through Steam.
 
As stated, when running Nvidia I recommend LTS release distros, but as with everything, YMMV. The problem with gaming release distros is the shameless AMD bias on behalf of developers. As stated, my guess is that you're running a Wayland session, which won't work well with Nvidia. At the OS login screen there should be an option to use X11; it should have been enabled by default had a dev not been pushing their own agenda.
 

I attached an image of my login screen. I don't see anything like that at all. I tried clicking on the Settings icon in the lower right, and this is all it brought up...

Is Xorg supposed to be X11?
 

[attachment: login screen screenshot]
Yes, you want to select GNOME on Xorg. It kinda pisses me off that X11 wasn't the default, considering this distro is supposed to be a user-friendly install for transitioning users, using an Nvidia-based ISO.

Once you have booted into an X11 session, Nvidia X Server Settings should actually show options that you can change (as opposed to under a Wayland session) and GWE should work. You will have to enable Coolbits for fan control; I think the GWE page outlines how to do this.
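For reference, enabling Coolbits is a one-file xorg.conf.d drop-in. A sketch (the file name is arbitrary and the exact path can vary by distro; restart X afterwards):

```
# /etc/X11/xorg.conf.d/20-nvidia-coolbits.conf  (hypothetical file name)
# Coolbits is a bitmask: 4 enables manual fan control, 8 enables clock
# offsets, 16 enables overvoltage; 28 = all three.
Section "Device"
    Identifier "Nvidia Card"
    Driver     "nvidia"
    Option     "Coolbits" "28"
EndSection
```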
 

Yep, that fixed it. GoG is running now.

It seems my Nvidia power limit got reset when I switched to this one, according to GWE (which is also now running). And it looks like GWE is also just using nvidia-smi in the background for changing it lol.
 
Your power limit changes won't persist between boots, as changing power limits requires elevated privileges. The GWE GitHub page outlines how to make the change persistent, but I just open GWE on boot and manually change the option. TBH I mostly put my PC to sleep when it's not in use, so it's really a non-issue for me. The biggest advantage of GWE is the ability to make custom fan profiles; IMO this is a must-have feature under any OS.

I'm glad switching to X11 has helped, I'm thinking of messaging the developer as it should be the default if that's an Nvidia based ISO for transitioning users.
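One common way to make it persistent (not necessarily what the GWE page describes, and the unit name here is made up) is a oneshot systemd service that reapplies the limit at boot, using the same nvidia-smi commands from earlier in the thread:

```ini
# /etc/systemd/system/nvidia-powerlimit.service  (hypothetical unit name)
[Unit]
Description=Reapply NVIDIA power limit at boot
After=multi-user.target

[Service]
Type=oneshot
ExecStart=/usr/bin/nvidia-smi -pm ENABLED
ExecStart=/usr/bin/nvidia-smi -pl 320

[Install]
WantedBy=multi-user.target
```

Enable it once with `sudo systemctl daemon-reload && sudo systemctl enable --now nvidia-powerlimit.service`. Multiple ExecStart lines are allowed because the service is Type=oneshot.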
 
Yeah Cyberpunk ran with no issues now.


[screenshot attached]



On the other hand, something kind of weird:
[screenshots attached]


AMD's FidelityFX was on by default, and it doesn't let me turn on DLSS or ray tracing at all, despite this being an Nvidia card... what? lol. I guess at least it runs, but Cyberpunk without DLSS is very rough, on older GPUs especially.
 

I believe that my toggling the persistence setting with nvidia-smi should have made it persistent. But I wonder if it only made it persistent for Wayland, so when I switched to X11 it got reset.

Either way, I like how I wasted all of that time bitching and trying stuff out, and the entire time all of it was caused by one tiny little cog in the lower right-hand corner of the login screen.
 
I use DLSS under Death Stranding and Quake II RTX/Portal RTX and it works fine. Unfortunately I don't play Cyberpunk.

Is Cyberpunk installed under Steam or Bottles? You may need to manually enable DLSS under Bottles. I recommend saving your Bottles states as snapshots before making any changes, be aware that saving states can take some time as there's a lot to save:

[screenshot: Bottles DLSS option]
 
If "saving state" involves basically copying the entire thing, that's gonna be a bit tough. This only has a very minimalistic 500 GB NVMe 970 EVO Plus that I got from Microcenter for like 30 bucks (it's going to just copy files over to the main system whenever that fills up). I guess it's fine if I only have one game installed on there, but if it's more than one, it's going to start getting pretty expensive to even try that.

Either way, I think mission accomplished for the most part. I don't think I'd use the OS as a daily driver, but I do think that, surprisingly, games seem to run a little smoother on it, despite being basically emulated? It might be fun to at least have it as a dual boot on my main system, but I think that got a bit dicey with the Windows boot manager installed. For now I'm going back to messing around with my 4090 on the main machine while the Linux box chugs away at SD, but thanks for all of the help.
 
I believe the saved state is compressed, but yes, storage is challenging when gaming under Linux, as even cached shaders take up more space. Right now I run all my games off 6 TB of spinning rust, and TBH, because Linux file system performance is so good, I hardly notice any disadvantage to doing so. But as you can see, once tiny teething issues are sorted out, gaming under Linux is relatively straightforward and performance is surprisingly good.

As I stated earlier, these days I'm actually surprised when a game doesn't work under Linux.
 
What's amazing is how much better Stable Diffusion runs on Linux. My 3080 Ti is achieving like 7-9 seconds (with 7 being the average) for each image in its workload, every single time, while power limited down to 320 W. By comparison, this 4090 achieves... basically the same generation times on this Windows system. Granted, that's probably with windows open, taking up graphics card processing cycles and causing various overhead (without that, I've seen it do images in 5-ish seconds on average), but the 4090 is supposed to deliver basically twice the performance for AI work.

I'm going to guess this has something to do with the file system and/or security system on Windows causing a delay after each image generation. I don't see that delay nearly as much on Linux; it has extremely consistent generation times and basically no overhead or delay. Kind of tempted to try this 4090 under it, if I could get a dual boot going.

Nobara was absolutely excellent for Stable Diffusion, but it's kind of confusing why they went with Wayland (whatever it is) as the default, causing all kinds of troubles with anything that isn't running under Steam (or running Stable Diffusion). At least after you change that, everything runs without issue, since it installs the graphics drivers and everything by default.

I guess there's basically no point in using my gaming machine with its 4090 for SD work anymore at all. The Linux box does it just as fast anyway with a 3080 Ti that has a lower power limit. That was kind of the intent of buying this 4090, but I didn't expect it to work out this well.

Linux's version of Samba also works really well, and is extremely fast; generally much faster than when I transfer files between my Windows shared folders. I think I might move some of my storage into a Linux-based NAS at some point.

I guess that's the summary of it.
 
It's so refreshing to read some positive insight from a (possibly) transitioning user, I'm a little chuffed I could help. Bear in mind that if you keep your OS installs isolated to separate SSDs, you can just choose your boot device via your UEFI boot menu and each OS will know nothing about the other - so no mucking around with GRUB.
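That pick-the-boot-device approach can also be driven from inside Linux with `efibootmgr`. A hedged sketch: the entry numbers are machine-specific, and the tool needs root plus an actual EFI-booted system, hence the guard.

```shell
#!/bin/sh
# Sketch: inspect UEFI boot entries from a running Linux system.
# efibootmgr only works on EFI-booted machines, so guard before calling it.
if [ -d /sys/firmware/efi ] && command -v efibootmgr >/dev/null 2>&1; then
    efibootmgr              # lists Boot0000*, Boot0001*, ... one per OS/disk
    # efibootmgr -n 0001    # example: boot entry 0001 on the NEXT boot only
else
    echo "no UEFI environment detected; nothing to do"
fi
```

The commented `-n` (BootNext) line is the handy one: it boots the other OS exactly once without changing the default order.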

Glad Mazz was here to help on the Nvidia front. I run AMD on both my PC and laptops.
I can say that in this instance it was a pleasure. :)
 
What's amazing is how much better Stable Diffusion runs on Linux. My 3080 Ti is achieving like 7-9 seconds (with 7 being the average) for each image in its workload. Every single time, while power limited down to 320W. By comparison, this 4090 achieves... basically the same generation times on this Windows system. Granted, that's probably with application windows open, taking up graphics card cycles and causing various overhead (without that, I've seen it do images in 5-ish seconds on average), but the 4090 is supposed to deliver basically twice the performance for AI work.

I'm going to guess this has something to do with the file system and/or security software on Windows causing a delay after each image generation. I don't see that delay nearly as much on Linux; it has extremely consistent generation times and basically no overhead. Kind of tempted to try this 4090 under it, if I could get a dual boot going.

Nobara was absolutely excellent for Stable Diffusion, but it's kind of confusing why they went with Wayland (whatever it is) as the default, causing all kinds of trouble with anything that isn't running under Steam (or running Stable Diffusion). At least once you change that, everything runs without issue, since the distro installs the graphics drivers and everything by default.

I guess there's basically no point in using my gaming machine with its 4090 for SD work anymore at all. The Linux box does it just as fast anyway with a 3080 Ti that has a lower power limit. That was kind of the intent of buying this 4090, but I didn't expect it to work out this well.

Linux's version of Samba also works really well, and is extremely fast; generally much faster than when I transfer files between my Windows machines' shared folders. I think I might move some of my storage onto a Linux-based NAS at some point.

I guess that's the summary of it.
Those of us who at least dabble in the distributed computing realm have known for a while that most DC projects run faster or more efficiently under Linux than Windows. In some cases the difference can be 20%+. I have no doubt that OS overhead is a big part of it, and anything storage-heavy is definitely going to be faster on just about any filesystem other than NTFS.

In at least some cases gaming performance can be better under Linux. I play an old MMO, Lord of the Rings Online, and when I swapped full-time from Windows to Manjaro there were a lot of performance improvements for that game. Framerates were steadier, with much less stuttering. More noticeable were character login and logout times. For high-level characters that have done a lot of quests/deeds/etc., logging in and out was hell. At times it would take so long that the client would drop the connection and require a restart. As soon as I swapped to Linux (running the game through Steam/Proton) those issues completely disappeared, and as long as the servers feed my system the (considerable) data quickly, high-level character logins are faster than brand-new character logins were under Windows. NTFS is such a shitshow compared to almost any other filesystem out there.
 
For the record, my Linux systems run as well as any Windows system - so mostly faultless; no OS is without issues. I can't update the graphics drivers on my Windows 10 based laptop, because if I do, the switchable AMD graphics bug out and it takes a full 5 minutes before I see the login screen on boot. Furthermore, I select 'do not update drivers' in the AMD application, and it still asks if I want to update drivers...
I'll admit it depends on what you need to do. I expect there is plenty of scientific and/or computer-science-related software that works best on Linux.

Laptops are a whole other issue. Mobile hardware has always been funky. What you're describing sounds more like Linux drivers not getting the same updates as Windows drivers, and thus not breaking.

I have experienced this exact issue under Windows.
My opinion has always been that laptops are OK for light work, and desktops for CAD and games. But then, I don't make any effort to keep up with the times either.
 
Those of us who at least dabble in the distributed computing realm have known for a while that most DC projects run faster or more efficiently under Linux than Windows. In some cases the difference can be 20%+. I have no doubt that OS overhead is a big part of it, and anything storage-heavy is definitely going to be faster on just about any filesystem other than NTFS.

In at least some cases gaming performance can be better under Linux. I play an old MMO, Lord of the Rings Online, and when I swapped full-time from Windows to Manjaro there were a lot of performance improvements for that game. Framerates were steadier, with much less stuttering. More noticeable were character login and logout times. For high-level characters that have done a lot of quests/deeds/etc., logging in and out was hell. At times it would take so long that the client would drop the connection and require a restart. As soon as I swapped to Linux (running the game through Steam/Proton) those issues completely disappeared, and as long as the servers feed my system the (considerable) data quickly, high-level character logins are faster than brand-new character logins were under Windows. NTFS is such a shitshow compared to almost any other filesystem out there.

Well, this isn't really distributed computing. It's all done on one GPU and one machine, on a per-image basis. (At least with the webui, which is what I'm using to run batch jobs.)

But I guess I didn't realize that it would run that much better on Linux regardless. I was looking at an iterations-per-second table on Reddit:
https://old.reddit.com/r/StableDiff...d_gpus_and_iterationssecond_based_on/jk7n210/
And was viewing the reported Linux 3080 Ti rate with some skepticism (not least because only 3 people reported it). But I guess if they're running an even more stripped-down version of Linux, like Arch, I could see it. That being said, I don't think mine is ever going to get that high, nor do I care, because I don't want anything to do with Arch.

Now, I did encounter one issue. When waking the monitor and logging back in after the computer had locked itself, I tried starting Firefox and it was extremely slow and stuttering; I could barely move the mouse; I had to just kill Stable Diffusion. That's something I encountered on an Ubuntu VM as well. I think I need to just turn off the automatic lock screen; it doesn't seem to do much good, mostly harm. This is kind of relevant because I'm also aiming to move some Selenium cronjobs (well, already have, actually) onto this machine that do some logins and automated web-site interaction for me.

Installing Node and npm, followed by Selenium, was actually much easier than it was on Ubuntu. Fedora's packages are much more up to date, and my web scripts worked practically out of the box. Zero issues. But if the machine starts freezing when the cronjobs pull up their headless browsers, that's going to be a very bad thing. I might need to look into getting more RAM for this box, since it only has 16GB right now. Maybe also a better cooler, since this cooler is kind of garbage. The case is also garbage, but for the money it's really performing well. In case anyone's curious lol (though it's a bit off topic):
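On the 16GB worry: /proc/meminfo is enough to keep an eye on headroom before the cronjobs spin up their headless browsers. A small sketch, run here against sample text so the arithmetic is visible (the numbers are made up; on a real box you'd read the file directly).

```shell
# /proc/meminfo reports kB; 1048576 kB = 1 GiB. The sample stands in for
# the real file; replace `echo "$sample"` with `cat /proc/meminfo`.
sample='MemTotal:       16315252 kB
MemAvailable:    8123456 kB'
echo "$sample" | awk '/MemAvailable:/ { printf "%.1f GiB available\n", $2 / 1048576 }'
```

Against the sample above this prints `7.7 GiB available`; MemAvailable is the kernel's estimate of what can be claimed without swapping, which is the number that matters here.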

20230819_173112.jpg


(There's practically no cable management and I had to bend the cable management guide to get the GPU in)
 
Also I made a slightly more complete version of my system temperature monitor thingy... because I'm just too lazy to download some GUI and try to get it working:

Bash:
watch -n 0.5 'echo "GPU Stuff:"; nvidia-smi -q -d power,temperature,clock,memory,utilization |sed "/BAR1/,/Utilization/{d}" | sed "/Max Clocks/,/Max Customer/{d}" | grep -E "((Graphics|SM|Memory)\s+:\s+[0-9]+\s+\w*Hz$)|(Power Draw\s+:\s+[0-9.]+\s+W)|(GPU Current Temp)|((Total|Used|Free)\s+:\s+[0-9]+\s+\w+B)|((Gpu|Memory)\s+:\s+[0-9]+\s*%)"; echo "CPU Stuff:"; echo -n "        CPU TEMP                          : " ; cat /sys/class/thermal/thermal_zone0/temp | sed -re "s/([0-9]+)[0-9]{3}/\1 C/"; echo "Memory:"; cat /proc/meminfo | grep -E "Mem(Total|Free|Available):"'
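For anyone squinting at the CPU TEMP part of that one-liner: the kernel exposes thermal_zone temperatures in millidegrees Celsius, and the sed expression just strips the last three digits. Demonstrated on sample input (45000 standing in for a real reading):

```shell
# 45000 millidegrees -> "45 C": the capture group keeps the leading digits,
# and the trailing [0-9]{3} swallows the last three.
printf '45000\n' | sed -E 's/([0-9]+)[0-9]{3}/\1 C/'
# prints: 45 C
```

Note this is integer truncation, not rounding, which is plenty for watching a GPU box cook.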
 
I'll admit it depends on what you need to do. I expect there is plenty of scientific and/or computer-science-related software that works best on Linux.
I don't do any scientific work, or use any science-related software, aside from the time I was running FAH during COVID. I use this PC for the daily running of my business: checking email via Thunderbird, browsing the web via Firefox, serving media via SMB to my Google TV, and gaming. Considering my use case it's great - I encounter little in the way of issues.

Well, this isn't really distributed computing. It's all done on one GPU and one machine, on a per-image basis. (At least with the webui, which is what I'm using to run batch jobs.)

But I guess I didn't realize that it would run that much better on Linux regardless. I was looking at an iterations-per-second table on Reddit:
https://old.reddit.com/r/StableDiff...d_gpus_and_iterationssecond_based_on/jk7n210/
And was viewing the reported Linux 3080 Ti rate with some skepticism (not least because only 3 people reported it). But I guess if they're running an even more stripped-down version of Linux, like Arch, I could see it. That being said, I don't think mine is ever going to get that high, nor do I care, because I don't want anything to do with Arch.

Now, I did encounter one issue. When waking the monitor and logging back in after the computer had locked itself, I tried starting Firefox and it was extremely slow and stuttering; I could barely move the mouse; I had to just kill Stable Diffusion. That's something I encountered on an Ubuntu VM as well. I think I need to just turn off the automatic lock screen; it doesn't seem to do much good, mostly harm. This is kind of relevant because I'm also aiming to move some Selenium cronjobs (well, already have, actually) onto this machine that do some logins and automated web-site interaction for me.

Installing Node and npm, followed by Selenium, was actually much easier than it was on Ubuntu. Fedora's packages are much more up to date, and my web scripts worked practically out of the box. Zero issues. But if the machine starts freezing when the cronjobs pull up their headless browsers, that's going to be a very bad thing. I might need to look into getting more RAM for this box, since it only has 16GB right now. Maybe also a better cooler, since this cooler is kind of garbage. The case is also garbage, but for the money it's really performing well. In case anyone's curious lol (though it's a bit off topic):

View attachment 591701

(There's practically no cable management and I had to bend the cable management guide to get the GPU in)
Here are some shots of my rig. It's difficult to get decent photos due to the lighting in this room and my potato phone, but you get the idea. Specs in sig.

Case top.jpg

Case side 1.jpg

Case side 3.jpg

Desktop new dock_mod.png
 
Well, this isn't really distributed computing. It's all done on one GPU and one machine, on a per-image basis. (At least with the webui, which is what I'm using to run batch jobs.)

But I guess I didn't realize that it would run that much better on Linux regardless. I was looking at an iterations-per-second table on Reddit:
https://old.reddit.com/r/StableDiff...d_gpus_and_iterationssecond_based_on/jk7n210/
And was viewing the reported Linux 3080 Ti rate with some skepticism (not least because only 3 people reported it). But I guess if they're running an even more stripped-down version of Linux, like Arch, I could see it. That being said, I don't think mine is ever going to get that high, nor do I care, because I don't want anything to do with Arch.

Now, I did encounter one issue. When waking the monitor and logging back in after the computer had locked itself, I tried starting Firefox and it was extremely slow and stuttering; I could barely move the mouse; I had to just kill Stable Diffusion. That's something I encountered on an Ubuntu VM as well. I think I need to just turn off the automatic lock screen; it doesn't seem to do much good, mostly harm. This is kind of relevant because I'm also aiming to move some Selenium cronjobs (well, already have, actually) onto this machine that do some logins and automated web-site interaction for me.

Installing Node and npm, followed by Selenium, was actually much easier than it was on Ubuntu. Fedora's packages are much more up to date, and my web scripts worked practically out of the box. Zero issues. But if the machine starts freezing when the cronjobs pull up their headless browsers, that's going to be a very bad thing. I might need to look into getting more RAM for this box, since it only has 16GB right now. Maybe also a better cooler, since this cooler is kind of garbage. The case is also garbage, but for the money it's really performing well. In case anyone's curious lol (though it's a bit off topic):

View attachment 591701

(There's practically no cable management and I had to bend the cable management guide to get the GPU in)
Most DC projects use just one computer and one GPU, although on the CPU side some projects can use more than one CPU if available. The work unit is simply a slice of a much bigger project, which is crunched and then sent back, at which point another is started. It's much the same thing. The efficiency of Linux over Windows is quite considerable at times. Not everything is faster, and some things can even be slower, but for the most part, in my experience, Linux performance is better than Windows for the same workload.
 
Also I made a slightly more complete version of my system temperature monitor thingy... because I'm just too lazy to download some GUI and try to get it working:

Bash:
watch -n 0.5 'echo "GPU Stuff:"; nvidia-smi -q -d power,temperature,clock,memory,utilization |sed "/BAR1/,/Utilization/{d}" | sed "/Max Clocks/,/Max Customer/{d}" | grep -E "((Graphics|SM|Memory)\s+:\s+[0-9]+\s+\w*Hz$)|(Power Draw\s+:\s+[0-9.]+\s+W)|(GPU Current Temp)|((Total|Used|Free)\s+:\s+[0-9]+\s+\w+B)|((Gpu|Memory)\s+:\s+[0-9]+\s*%)"; echo "CPU Stuff:"; echo -n "        CPU TEMP                          : " ; cat /sys/class/thermal/thermal_zone0/temp | sed -re "s/([0-9]+)[0-9]{3}/\1 C/"; echo "Memory:"; cat /proc/meminfo | grep -E "Mem(Total|Free|Available):"'
That GPU bash script is actually pretty good, outstanding job.
GPU Bash_Mod.png
 
Question: I have a spare Dell E6540 with Intel HD Graphics 4600 and an AMD Radeon 8790, an i7-4800MQ @ 2.7GHz, and 16 GB of RAM. How would Nobara run on this laptop with older (pre-2015) games? The CPU is a 4-core, but during some of my testing with Windows it runs HOT, even on a laptop cooler. I was able to play Far Cry 4, but after about 10 minutes it began to slow to a crawl from the heat.

I also have a spare Asus ROG G751JT gaming laptop with an NVIDIA GeForce GTX 970M (3GB of video RAM), running an i7-4710HQ @ 2.5GHz. That CPU also has 4 cores, with 16GB of RAM.

Both of these laptops have SSDs.
 
Question: I have a spare Dell E6540 with Intel HD Graphics 4600 and an AMD Radeon 8790, an i7-4800MQ @ 2.7GHz, and 16 GB of RAM. How would Nobara run on this laptop with older (pre-2015) games? The CPU is a 4-core, but during some of my testing with Windows it runs HOT, even on a laptop cooler. I was able to play Far Cry 4, but after about 10 minutes it began to slow to a crawl from the heat.

I also have a spare Asus ROG G751JT gaming laptop with an NVIDIA GeForce GTX 970M (3GB of video RAM), running an i7-4710HQ @ 2.5GHz. That CPU also has 4 cores, with 16GB of RAM.

Both of these laptops have SSDs.
I don't really know a lot about the AMD side of the fence, but I can say that my 980 Ti is still supported by Nvidia under Linux, so I assume your 970M would be too.
 
I just tried to install Nobara on a Dell 5590 and could not get it to boot after the installation. I downloaded the latest Nobara-38-Official-2023-07-27.iso. The installation went fine with no issues. I went into the BIOS and disabled Secure Boot and TPM 2.0, but it still won't boot. Not sure what else to try at this point.
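One thing worth ruling out on a box that installs fine but won't boot: whether the live USB (and therefore the installer) ran in UEFI or legacy BIOS mode. If the install went in one way but the firmware is set to boot the other way, it won't find the bootloader. From the live environment, a quick check:

```shell
# If /sys/firmware/efi exists, the kernel was booted via UEFI;
# otherwise it came up through legacy BIOS / CSM.
if [ -d /sys/firmware/efi ]; then
    echo "booted in UEFI mode"
else
    echo "booted in legacy BIOS mode"
fi
```

If that says legacy but the Dell's boot list is set to UEFI (or vice versa), matching the two up - or re-running the install with the USB stick booted in the right mode - is the usual fix.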
 