430
submitted 5 months ago by Cossty@lemmy.world to c/linuxmemes@lemmy.world
[-] Shady_Shiroe@lemmy.world 66 points 5 months ago

I started using AMD because it was the "more bang for your buck" option, and thanks to my cheapness I have always had a great experience with Linux, excluding WiFi breaking every few months.

[-] Rustmilian@lemmy.world 23 points 5 months ago
[-] Virgo@lemmy.world 63 points 5 months ago

Leave my wife’s icard out of your goddamn mouth

[-] MeanEYE@lemmy.world 4 points 5 months ago

I went with AMD because I got fed up with nVidia, much like OP did, or at least the guy in the screenshot. Never looked back. Sure, AMD requires a binary blob to initialize the card, but it just works, and I've had zero issues since. Upgrade hardware? Just transfer the drive to a new machine and voilà, you're ready to go.

[-] otacon239@feddit.de 45 points 5 months ago

I feel like I’m from an alien planet. I’ve been using nVidia cards exclusively since around 2014 and while I’ve certainly not had a perfect track record, 90% of the time, I’ve been pretty plug-and-play. Maybe I’ve been lucky or maybe it’s because I stick to the popular distros.

In either case, from the perspective of openness, I do agree with the community that drivers shouldn’t be shrouded in mystery.

[-] AProfessional@lemmy.world 36 points 5 months ago* (last edited 5 months ago)

You just don’t notice what doesn’t work, like video decoding in your browser. You probably didn’t use a laptop with hybrid graphics. And you might not use GNOME, which defaults to Wayland, which was broken on Nvidia for many years. And you might use an outdated kernel, so it never broke. And you don’t use software that uses modern Linux features like dmabuf.

It’s fair to have never hit this situation, but it’s an easy one to run into.

[-] someacnt_@lemmy.world 19 points 5 months ago

Welp. My GNOME defaults to X11, and I'm using a laptop. That said, it doesn't use hybrid graphics, but honestly, using only the dedicated card works well enough. Anyway, fk nvidia. Their greed is overwhelming.

[-] AProfessional@lemmy.world 5 points 5 months ago

Yes, it did fall back to X11, but it was truly a fallback: no developer used X11, features were ignored there, and it was just a worse experience.

[-] nekusoul@lemmy.nekusoul.de 4 points 5 months ago

Or you might want to use G-Sync or other forms of VRR on a multimonitor setup, which you can't do under X11 and is broken on Wayland.

[-] Cossty@lemmy.world 6 points 5 months ago

I have a 1060; I bought it when it came out. Three years after that I completely switched to Linux. There were some problems with it on rolling distributions. And I still can't figure out hardware acceleration in Firefox. It either doesn't work or is barely distinguishable from software decoding. I still get a lot of skipped frames.
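
For what it's worth, hardware video decode in Firefox on NVIDIA usually goes through the nvidia-vaapi-driver shim plus a couple of flags. A rough checklist, assuming that package is installed (exact package names and prefs may differ per distro and Firefox version):

```shell
# Point libva at the NVIDIA (NVDEC-backed) VA-API driver
export LIBVA_DRIVER_NAME=nvidia
# Older Firefox builds need the RDD sandbox relaxed for the NVIDIA shim
export MOZ_DISABLE_RDD_SANDBOX=1

# Sanity check: vainfo should list NVDEC-backed H.264/VP9/AV1 profiles
vainfo

# Then in about:config set:
#   media.ffmpeg.vaapi.enabled = true
# and check about:support -> "Media" to confirm hardware decoding is active
```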

[-] kbal@fedia.io 6 points 5 months ago

90% of the time, I’ve been pretty plug-and-play.

If it only works 90% of the time that's not so good really.

[-] MeanEYE@lemmy.world 2 points 5 months ago

My thinking exactly. That's your card not working one day out of every ten. Imagine having issues once a week. I'd burn that card with thermite and never look back.

[-] dinckelman@lemmy.world 5 points 5 months ago

With a 1080 Ti, I've had my fair share of issues, but compared to how it used to be, it's a night and day difference. If you're still an X11 purist, everything works perfectly, and on Wayland everything works even better than that, assuming you can launch your software in native Wayland mode.

[-] neidu2@feddit.nl 3 points 5 months ago* (last edited 5 months ago)

Exclusively nvidia cards (or at least nvidia-based cards) since I got my GeForce 256 in 2001 after ditching my Voodoo2. No major issues beyond the ones I've caused myself.

[-] pH3ra@lemmy.ml 44 points 5 months ago
[-] elxeno@lemm.ee 41 points 5 months ago
[-] possiblylinux127@lemmy.zip 3 points 5 months ago

For how long? Fly too close to the sun and you will get burned.

[-] lemmeee@sh.itjust.works 34 points 5 months ago

It's important to point out that AMD isn't perfect either (I don't know about Intel), since it requires you to install proprietary firmware. But it's obviously a huge improvement, since it doesn't require proprietary drivers. If we forced Nvidia to do what AMD does, we would be in a much better position. So if you care about freedom, Nvidia is the last company you should choose.

[-] possiblylinux127@lemmy.zip 21 points 5 months ago

If you are that concerned about firmware, modern hardware is not your friend. Everything from your CPU to your WiFi to integrated graphics requires proprietary software.

[-] nexussapphire@lemm.ee 4 points 5 months ago

Things are changing fast. Nvidia has their own "open source" kernel modules that are almost identical to the proprietary ones, and the NVK project has open source drivers that might outperform the proprietary drivers in most games.

By the end of this year, the only reason you might install the proprietary drivers is CUDA, and potentially OpenCL. I think they're protective of their drivers because about the only thing separating their RTX cards from their Quadro cards is the drivers and software-locked features. Quadros probably get put in more Linux systems than any other type of system.

[-] baseless_discourse@mander.xyz 13 points 5 months ago

The open source driver is maintained by the community, which spends a long time just getting over the man-made barriers put up by nvidia.

If you think locking down hardware is a practice against the spirit of open source, don't throw money at them.

[-] HappyFrog@lemmy.blahaj.zone 25 points 5 months ago

I'd react the same if I was forced to install arch

[-] nexussapphire@lemm.ee 8 points 5 months ago

You say that like it's a bad thing. 😄 Whatever he learns on Arch he can bring with him to any other distro. Heck, he could have tried it on the other distros to get his system working.

I'm not trying to be mean, but this sounds like someone who didn't understand his system at all, and he's about to learn a lot.

[-] queue@lemmy.blahaj.zone 25 points 5 months ago

I never understand why in 2024 you'd buy nvidia, unless you like paying more for less, or buying from scalpers for even more money. I guess some people really just go "More money spent on it, more better" no matter what.

[-] finkrat@lemmy.world 28 points 5 months ago

People just think "gaming?? OH NO I NEED MY NVIDIA!!!!" while AMD is sitting there like "hey. Hey I have a card that'll work. Hey. Card. Right here. Works better in Linux. Less headaches. Hello. Hey person. Card. Hi."

[-] mr_right@lemmy.dbzer0.com 6 points 5 months ago

Only if you consider ray tracing to be a gimmick (which it is) is AMD the obvious way to go.

In reality, it's because people bought their laptops and desktops before switching and want to use their existing graphics cards.

[-] finkrat@lemmy.world 3 points 5 months ago

This is a very good point, I forgot gaming laptops are almost exclusively nvidia

[-] Bye@lemmy.world 23 points 5 months ago
[-] angel@iusearchlinux.fyi 3 points 5 months ago

Haven’t tried it, but might be worth looking into: https://github.com/vosen/ZLUDA

[-] bjoern_tantau@swg-empire.de 13 points 5 months ago

A 1070 is hardly a card anyone would buy in 2024. Maybe they were running Windows before that and didn't care that much.

Also, hard to believe, but for a long while nVidia actually gave you the better experience on Linux. Before AMD bought ATI, and probably for a good while after the sale, the ATI drivers sucked ass.

[-] eager_eagle@lemmy.world 5 points 5 months ago* (last edited 5 months ago)

I'd like to buy AMD, but I have all these use cases

  • HDMI 2.1 (4K @ 120Hz) - relevant after recent news, if planning to use open source drivers
  • CUDA + Machine Learning applications
  • DLSS still visually better than FSR
  • Ray Tracing still better on GeForce cards
[-] twinnie@feddit.uk 5 points 5 months ago

Because the features are better. That’s why most FPS comparisons of AMD and Nvidia always turn off the ray tracing.

[-] Aurenkin@sh.itjust.works 4 points 5 months ago

NVIDIA still have the best performing cards if you care about ray tracing. I honestly think that's the only reason to consider buying NVIDIA but you pay a heck of a premium for that.

[-] Aurenkin@sh.itjust.works 23 points 5 months ago

I haven't had any serious problems on PopOS but I've still experienced the shittiness that is NVIDIA on Linux. Starfield was broken for months due to a graphics driver bug, then when it was finally fixed, that driver version broke Cyberpunk... Fucking hell NVIDIA.

[-] octoblade@lemmynsfw.com 9 points 5 months ago

I switched from a GTX 1080 to an Arc A770 for this exact reason. I was sick of putting up with the bullshit NVIDIA drivers. I am much happier with the Intel card, with the exception of it not having VR support.

[-] Cringe2793@lemmy.world 8 points 5 months ago

That cursor in the center of the screenshot is really annoying

[-] dinckelman@lemmy.world 8 points 5 months ago

There is zero doubt that Nvidia has stalled a lot of progress in the Linux desktop scene; however, half of what this guy is describing is pure misunderstanding and lack of knowledge.

[-] Cossty@lemmy.world 3 points 5 months ago

I mean... if a whole day of troubleshooting and googling didn't fix the problem on two distributions, then I don't think the problem is with them. I've had my fair share of nvidia shenanigans over the years. When I couldn't fix it, or didn't want to deal with it, I just switched distros until it worked.

[-] apt_install_coffee@lemmy.ml 8 points 5 months ago* (last edited 5 months ago)

I recently bought a 7800 XT for the same reason, NVIDIA drivers giving me trouble in games and generally making it harder to maintain my system. Unfortunately I ran headfirst into the 6.6 reset bug that made general usage an absolute nightmare.

Open source drivers are still miles ahead of NVIDIA's binary blob if only because I could shift to 6.7 when it released to fix it, but I guess GPU drivers are always going to be GPU drivers.

[-] Pantherina@feddit.de 7 points 5 months ago

This dude should totally use a ublue-nvidia image before the next Arch update kills their system again.

[-] palordrolap@kbin.social 7 points 5 months ago

There was a period, however brief, about, oh, 13 or so years ago where the recommendation was to avoid AMD entirely and go Intel and NVIDIA. Guess when I bought the parts for my PC?

My system before that was entirely AMD / ATI, but then, that was never a Linux machine. Nonetheless, the fashion when I built that was to avoid Intel and NVIDIA.

Literally the only real problem I've had on Linux with my ancient setup is the fact that one time two or three years back, a kernel and the legacy NVIDIA driver didn't play nice and I had to stick with an older kernel for a while.

Now my problem is that my NVIDIA card is so old that Debian stable doesn't support it any more and so neither do any distros descended from it. The OEM driver from NVIDIA themselves is a pain to install by comparison to the old .deb method, but compared to what I hear about other NVIDIA users, I'm a living miracle.

It might also help that I haven't played anything more modern than Minecraft, but I have no trouble with YouTube and streaming sites that I've noticed, nor with any of the old games.

You can guarantee that by the time I get it together enough to buy a new system with AMD processor and graphics, that will mark the turning point when something happens to cause everyone to swing back the other way again, at least for graphics.

[-] possiblylinux127@lemmy.zip 2 points 5 months ago

Honestly, you should be able to pick up an old GPU on eBay for not that much.

Or for that matter, you could pick up an entire workstation that would have better performance.

[-] Alfons@feddit.de 4 points 5 months ago* (last edited 5 months ago)

Needed to switch from Debian to Manjaro because of some gcc version conflicts between the Linux kernel and the nvidia driver kernel module. The only fix was to install a newer or older kernel, which is a pain in the ass with Debian but easy with Manjaro :)
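
(On Debian, a newer kernel can usually be pulled from backports rather than switching distros. A rough sketch for Debian 12 "bookworm"; the DKMS-packaged nvidia driver then rebuilds against the new headers on install:)

```shell
# Enable the backports repository (Debian 12 "bookworm" example)
echo 'deb http://deb.debian.org/debian bookworm-backports main contrib non-free-firmware' \
    | sudo tee /etc/apt/sources.list.d/backports.list
sudo apt update

# Install the newer kernel and matching headers from backports
sudo apt install -t bookworm-backports linux-image-amd64 linux-headers-amd64

# Reinstall the nvidia driver so DKMS builds the module for the new kernel
sudo apt install -t bookworm-backports nvidia-driver
```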

Also switching between newest „gaming“ drivers and cuda always broke my system and drove me crazy. So many hours lost because of nvidia.

I also have to work with some nvidia edge devices. No fresh install without new issues, I can assure you.

Edit: FYI, although I am somewhat tech-savvy, I only recently switched completely to Linux. Hence, there might be a good way to handle CUDA drivers and „gaming“ drivers.

[-] hoanbridgetroll@midwest.social 4 points 5 months ago

I tried doing a TimeShift restore last week for another software issue, and nvidia drivers crapped the bed (again). Decided it was time to bite the bullet and swapped out my 1080 Ti with a 7700 XT. Did a clean install of Manjaro, and it was eerie how simple it was to get everything including Wayland to work. Should’ve done it two years ago.

this post was submitted on 10 Mar 2024
430 points (96.7% liked)