

I just finished Witch Spring R the other day. I think I managed to see almost everything the game has to offer. A very simple and mostly light-hearted game.
Now I just started a playthrough of Cassette Beasts. I am intrigued by the story.
Isn’t it also super common in Mexican cuisine?
I love cumin and it is probably in my top 5 of most used spices in the kitchen. You would hate me!
sudo is not simply a tool to grant admin privileges, but a tool to manage elevated permissions or run commands in a different user's context.
These things become a lot more relevant once you use the tools professionally. On a well-configured system you may only run the things you have been explicitly permitted to run.
To be completely honest, sudo is basically pointless in a single-user context. There is almost no reason to even have it installed. It does make dealing with different environments easier though.
Anyway, as I said, much of this does not matter if you are the system's administrator. On the other hand, there is no benefit in getting used to bad practices either, in case you have to unlearn them later.
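To illustrate what "a different user's context" means in practice, a minimal sketch (the user name, commands and sudoers rule here are all made-up examples):

```shell
# Run a single command as another (non-root) user, e.g. a service account:
sudo -u postgres psql -c 'SELECT 1;'

# A sudoers rule (always edit with visudo) that permits exactly one command
# and nothing else -- "alice" and the service name are hypothetical:
# alice ALL=(root) NOPASSWD: /usr/bin/systemctl restart nginx.service
```

That second pattern is what "only allowed to run the things you are explicitly permitted" looks like on a well-configured system.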
One more thing: what you suggest with chroot is one of the very reasons why you should not do that. You might have handed over the keys to break out of the chroot. It is a well-known attack vector, which boils down to: never run anything as root in a chroot environment.
sudoedit opens the editor as your own user and only writes the file back as root. For a single user who is also the admin of the system, this does not matter in many cases.
In a multi-user context you can easily escape your editor and run a shell, which would allow a non-admin user to escalate their privileges. So from a security standpoint this separation has to exist, and it does for exactly that reason.
Of course it also prevents some mistakes from happening: a bad editor plugin cannot easily destroy your whole system, and so on. It boils down to good practice.
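A hedged sketch of the difference (the path and editor choice are just examples):

```shell
# sudoedit copies the file, opens the copy in YOUR editor running as YOUR
# user, and only the final write-back happens as root:
SUDO_EDITOR=vim sudoedit /etc/hosts

# In contrast, this runs the entire editor process as root -- a user who is
# only allowed "sudo vim" could type :!sh inside vim and get a root shell:
sudo vim /etc/hosts
```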
Ben Duerr is one of my favorite metal vocalists! I didn’t know the song, but of course I had to check it out and I don’t regret it.
According to a ProtonDB user, the specific crashes I am referring to have finally been fixed with 545.29.02. So two weeks ago, for a five-year-old card. Good job, Nvidia!
I would have loved to have that earlier, because I threw mine out after all the frustration with Nvidia, and I still doubt it is fully working now.
Don't get me wrong, it's great for others stuck with Nvidia hardware. But I would never recommend buying any Nvidia hardware for Linux. The experience is miserable compared to AMD.
Try playing games like Cyberpunk. I dare you :)
You are lucky if you can play without a crash for even one minute with that card. I am not exaggerating. Something is seriously messed up with the 20XX series.
Also Wayland is still a mess for Nvidia cards overall which is becoming more and more important.
You could try disabling VRR in your display settings. I believe it is set to auto by default if supported, but it does not work properly with some monitors, which causes flickering.
It kinda is though. IIRC it received an interrupt it shouldn't have received and doesn't know how to resolve. It is not supposed to ignore it, but the only other option at that point is crashing. So basically it continues in a dazed and confused state.
Of course the message could be clearer, but at least it also makes the message easily searchable.
Same. A 7800 XT is on its way as we speak, replacing my 2080 Super. I am just sick of Nvidia, even though performance-wise it wouldn't be necessary.
I am aware, but check the referenced issues. Support was merged about a year ago, and at least GNOME on Wayland should work out of the box. It's incomplete, but it should be working.
Also, Barrier is considered abandoned at this point; the previous maintainers forked it, and that fork is Input Leap.
Check the Input Leap project. While I haven't tested it myself, Wayland support was added about a year ago. Back then you still needed to rebuild some packages, but reading the issue tracker now, it seems to have come a long way.
Unfortunately it is still not considered production-ready. At this point I assume they will have it implemented and ready well before Synergy though.
Sounds like you don’t clean your package cache. You can enable the paccache.timer to handle it for you on a weekly basis.
https://wiki.archlinux.org/title/pacman#Cleaning_the_package_cache
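For reference, roughly what that looks like on Arch (paccache ships with the pacman-contrib package):

```shell
# One-off: remove all but the three most recent versions of each cached package
sudo paccache -r

# Ongoing: let the bundled systemd timer do this weekly
sudo pacman -S --needed pacman-contrib
sudo systemctl enable --now paccache.timer
```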
There was (or is) a bug in WebKit when using Nvidia. If that's the case, remove the Nvidia driver and use nouveau instead. After logging in you can reinstall the Nvidia driver again.
https://gitlab.gnome.org/GNOME/gnome-control-center/-/issues/2498
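Roughly the workaround I mean, assuming Arch package names (adjust for the dkms/open driver variants if you use those):

```shell
# Remove the proprietary driver; the in-kernel nouveau driver takes over
# automatically after a reboot:
sudo pacman -Rns nvidia nvidia-utils
sudo reboot

# Once you have logged in and changed the setting, reinstall and reboot:
sudo pacman -S nvidia nvidia-utils
sudo reboot
```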
The DS5 (DualSense) is probably the best you can choose. The build quality is good and it works fine with Linux via Bluetooth. It is also not some exotic choice and is widely supported.
The Xbox controllers also work fine, but they lack the gyro and touchpad. The build quality wasn't great when I last used one either (the A button not registering every press and a wonky d-pad). They are a bit cheaper, but also much worse imo.
I updated my post with my own results. tl;dr: it doesn't work at all.
Does this fix the RTX 20XX (Super) crashes in Cyberpunk 2077? Did anyone test it?
Edit: Now that I had some time, I tested the beta driver on Arch Linux with an RTX 2080 Super, and I only get a black screen or an immediate crash depending on the flags I use. So the beta driver made things worse than before, and the game is still unplayable.
I am starting to believe it just affects the 2000 series of cards then, although some of the driver bugs causing crashes should affect all modern Nvidia cards equally.
I am confused why that’s the case though.
I looked through ProtonDB again, and it looks like everyone using 20XX cards cannot play the game, while things look fine for 30XX with some minor tweaks. For older cards it is a mixed bag, but there are very few reports overall.
Looks like it really is just Turing cards that are affected then. Bummer!
Not the person above, but if this is an issue you ever run into, you are doing it "wrong". Not really, but let me explain.
Having it on a separate partition has a few advantages like different mount flags (e.g. noexec), easier backup management (especially snapshots) and some other benefits like using your home for a new installation (like OP wants to) or it prevents some critical failures in case you accidentally fill it up (e.g. partial writes or services cannot start).
I often cannot decide on specific partition sizes either, because requirements may change depending on what you do. Hence I would just stick with reasonable defaults for the installation and use some form of volume management instead. If you want to use ext4, XFS etc., I would recommend LVM, as it gives you a lot of freedom (resizing of volumes, snapshots, adding additional drives, mixed RAID modes etc.). Alternatively there are btrfs, ZFS or bcachefs, to name the most common file systems that implement their own idea of storage pools and volumes.
You should never need to resize a partition; there are more modern approaches. Create a single partition (plus a small EFI partition somewhere) and never bother with partitions again. The (performance) overhead is negligible, and it gives you many additional benefits I didn't even mention. Your complaint is a solved problem.
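For example, growing a home volume later under LVM is a one-liner (the volume group and LV names here are made up):

```shell
# Grow the logical volume by 50 GiB and resize the filesystem in one go
# (-r calls fsadm, which handles ext4/XFS):
sudo lvextend -L +50G -r /dev/vg0/home

# Or do it in two explicit steps for ext4:
sudo lvextend -L +50G /dev/vg0/home
sudo resize2fs /dev/vg0/home
```

No repartitioning, no moving data around, and it works while the filesystem is mounted.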