Can we stop using the Steven Crowder meme already? The guy is a total chode.
Lol. He gives chodes a bad rep. Call him what he is. A christofascist misogynist grifter.
I don't really disagree, but I think that was the original intent of the meme: to show Crowder as a complete chode by having him assert really stupid, deeply unpopular ideas.
The meme's use has become too soft on Crowder lately, though, I think.
I've noticed lately that many memes' origins are worse than I'd thought from the context they're used in. Racist, homophobic, and dishonest people aren't something I usually accept as entertainment, but they sneak into my (non-news) feed unnoticed through memes. I guess most people don't know a meme's origins and use it according to the meaning they formed on their own. Other memes, like the distracted boyfriend meme, are meaningless stock photos, so I understand why many people use memes without thinking about where they come from.
Anyway, thanks for pointing out who the person in the picture actually is.
I must admit when I learned this was Crowder I had a sad
Just change and reupload :D
Oh please. There are better templates than this stupid Nazi cunt. I really don't want to see this fuckface.
Yes! This is a nice alternative template for example.
For the longest time I just thought he was that one guy from Modern Family.
I thought NixOS was the new Arch btw
From: https://knowyourmeme.com/memes/btw-i-use-arch
BTW I Use Arch is a catchphrase used to make fun of the type of person who feels superior because they use a more difficult Linux distribution.
I think that's fair to apply to some NixOS users. Source: BTW I use NixOS.
I mean, the barrier to entry is kind of high if you're used to more traditional package managers.
Source: I tried using nix on my Debian machine
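For a feel of the difference, here's a minimal sketch (the package is just an example, and the flakes-style command assumes the experimental features are enabled):

```bash
# Imperative install on Debian:
sudo apt install htop

# Rough Nix equivalents on a non-NixOS machine (there are several ways,
# which is part of the learning curve):
nix-env -iA nixpkgs.htop           # legacy imperative style
nix profile install nixpkgs#htop   # newer CLI; needs flakes enabled
```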
Damn you're kinda right
At least the Arch people are not shilling for some corp.
I'm tired of people taking sides like companies give a shit about us. I wouldn't be surprised to see five comments saying something like "you shouldn't buy Nvidia AMD is open source" or "you should sell your card and get an amd card."
I'd say whatever you have is fine; it's better for the environment if you keep it longer anyway. There are so many people who parrot things without giving much thought to an individual's situation or the complexity of a company's behavior. Every company's job is to maximize profit while minimizing loss.
Basically, if everyone blindly chose AMD over Nvidia, the roles would flip: AMD would start doing the things Nvidia does to maintain dominance, increase profit, and reduce costs, and Nvidia would start trying to win market share back from AMD by opening up and becoming more consumer-friendly and competitively priced.
For individuals, selling your old card and buying a new AMD card at the same price will generally net you a slower card, and if you go used there's a good chance it doesn't work properly and the seller ghosts you. I should know: I tried to get a used AMD card and it died every time I ran a GPU-intensive game.
I also went the other way, replacing my mother's Nvidia card with a new AMD card that cost three times what her Nvidia card would fetch on eBay (~$50), and it runs a bit slower than her Nvidia card did. She was happy about the upgrade anyway, because I moved that Nvidia card into her movie server, where it handles live video transcoding better than a cheap AMD card would.
Steven Crowder is a despicable human and does not deserve a meme template.
I run Stable Diffusion with ROCm. Who needs CUDA?
What distro are you using? Been looking for an excuse to strain my 6900XT.
I started looking at getting it running on Void and it seemed like (at the time) there were a lot of specific version dependencies that made it awkward.
I suspect the right answer is to spin up a container, but I resent Docker's licensing BS too much for that. Surely by now there'd be a purpose-built live image: write it to a flash drive, reboot, and boom, anime ~~vampire princes~~ hot girls.
If you don't like Docker, take a look at containerd and Podman. I haven't done any CUDA with Podman, but it's supposed to work.
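For reference, GPU passthrough with Podman looks roughly like this; a sketch, assuming the nvidia-container-toolkit is installed for the CUDA case (the images are just examples):

```bash
# Nvidia/CUDA: generate a CDI spec once, then expose the GPUs to a container:
sudo nvidia-ctk cdi generate --output=/etc/cdi/nvidia.yaml
podman run --rm --device nvidia.com/gpu=all ubuntu nvidia-smi

# AMD/ROCm: no toolkit needed, just pass the kernel devices through:
podman run --rm --device /dev/kfd --device /dev/dri \
    --security-opt seccomp=unconfined docker.io/rocm/pytorch rocminfo
```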
I can confirm that it works just fine for me. In my case I'm on Arch Linux (btw) with a 7900 XTX, but it needed a few tweaks:
- Having xformers installed at all would sometimes break startup of stable-diffusion depending on the fork
- I had an internal and an external GPU, so I had to set HIP_VISIBLE_DEVICES so that it only sees the correct one
- I had to update torch/torchvision and set HSA_OVERRIDE_GFX_VERSION
I threw what I did into https://github.com/icedream/sd-multiverse/blob/main/scripts/setup-venv.sh#L381-L386 to test several forks.
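In case it saves someone a trip through that script, the gist boils down to something like this (a sketch; the exact values are assumptions: 11.0.0 is the gfx1100 override for RDNA3 cards like the 7900 XTX, the device index varies per system, and the launcher depends on your fork):

```bash
# Pretend to be gfx1100 so prebuilt ROCm kernels match (RDNA3 override):
export HSA_OVERRIDE_GFX_VERSION=11.0.0
# Only expose the dedicated GPU to HIP:
export HIP_VISIBLE_DEVICES=0
# Pull the ROCm builds of torch/torchvision into the venv:
pip install --upgrade torch torchvision \
    --index-url https://download.pytorch.org/whl/rocm6.0
./webui.sh
```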
CUDA?! I barely even know'a!
Then show us your anime titty pics!
Earlier in my career, I compiled TensorFlow with CUDA/cuDNN (Nvidia) in one container, and on another machine compiled it with ROCm (AMD) in a container, for cancerous-tissue detection in computer vision tasks. GPU-accelerated training was significantly more performant with the Nvidia libraries.
It's not like you can't train deep neural networks without NVIDIA, but their deep learning libraries combined with tensor cores in Turing-era GPUs and later make things much faster.
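If you're comparing the two stacks yourself, a quick sanity check looks the same on both, assuming a GPU-enabled TensorFlow build; and note that the tensor cores only really kick in with reduced precision:

```bash
# Lists the GPU on both the CUDA and ROCm builds of TensorFlow:
python -c "import tensorflow as tf; print(tf.config.list_physical_devices('GPU'))"

# Enable mixed precision so Turing-and-later tensor cores get used in Keras:
python -c "import tensorflow as tf; tf.keras.mixed_precision.set_global_policy('mixed_float16'); print(tf.keras.mixed_precision.global_policy())"
```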
Brother over here "needs Nvidia for raytracing" while only playing last-decade games.
I completely unironically know people who bought a 4090 exclusively to play League
Not gonna lie, raytracing is cooler on older games than it is on newer ones. Newer games use a lot of smoke and mirrors to fake realistic lighting, which means raytracing isn't as obvious an upgrade, or can even be a downgrade depending on the scene. Older games don't have as much smoke and mirrors, so raytracing can offer more of an improvement.
Also, stylized games with raytracing are 10/10. Idk why, but applying RTX to highly stylized games always looks way cooler than on games with realistic graphics.
Quake 2 does look pretty rad in RTX mode.
I'm holding out on building a new gaming rig until AMD sorts out better raytracing and CUDA support. I'm playing on a Deck now, so I have plenty of time to work through my old backlog.
I was straight up thinking of going to AMD just to have fewer GPU problems on Linux myself
In my experience, AMD is bliss on Linux, while Nvidia is a headache. Also, AMD has ROCm, their equivalent of Nvidia's CUDA.
Yeah but is it actually equivalent?
If so I'm 100% in, but it needs to actually be a drop-in replacement for "it just works" like CUDA is.
Once I've got drivers set up, CUDA "just works". Is it equivalent in that way? Or am I going to run into library compatibility issues in R or Python?
Not all software that uses CUDA has support for ROCm.
But as far as setup goes, I just installed the correct drivers and ROCm-compatible software just worked.
So: it can be an equivalent alternative, but that depends on the software you want to run.
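On the Python side specifically, the ROCm builds of PyTorch reuse the torch.cuda API surface, so a lot of CUDA-targeting code runs unchanged; a quick sanity check looks identical on both vendors:

```bash
# Prints True plus the card name on a working ROCm *or* CUDA build:
python -c "import torch; print(torch.cuda.is_available(), torch.cuda.get_device_name(0))"
```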
CUDA isn't the responsibility of AMD to chase; it's the responsibility of Nvidia to quit being anticompetitive.
ROCm is the AMD version.
Last I heard, AMD was working on getting CUDA code running on their GPUs, and I saw a post saying it was pretty complete by now (although I don't keep up with that sort of stuff myself).
Well, right after that, Nvidia amended their license agreement to state that you cannot use CUDA with any translation layers.
The project you're thinking of is ZLUDA.
NVIDIA finally being the whole bitch it seems, not unexpected when it comes to tech monopolies.
In the words of our lord and savior Linus Torvalds "NVIDIA, fuck you! 🖕", amen.
In all reality, a lot of individuals aren't gonna care about EULA BS unless they absolutely depend on it, and this whole move has me wanting an AMD GPU even more.
I need NVDA for the gainz
Edit: btw Raspberry Pi is doing an IPO later this year, bullish on AMD
My only regret about picking team red is that DaVinci Resolve doesn't support hardware encoding.
Man, I just built a new rig last November and went with Nvidia specifically to run some niche scientific computing software that only targets CUDA. It took a bit of effort to get it to play nice, but it at least runs pretty well. Unfortunately, now I'm trying to update to KDE 6 and play games, and boy howdy are there graphics glitches. I really wish HPC academics would ditch CUDA for GPU acceleration, and maybe ifort + MKL while they're at it.
linuxmemes
Hint: :q!
Community rules
1. Follow the site-wide rules
- Instance-wide TOS: https://legal.lemmy.world/tos/
- Lemmy code of conduct: https://join-lemmy.org/docs/code_of_conduct.html
2. Be civil
- Understand the difference between a joke and an insult.
- Do not harass or attack members of the community for any reason.
- Leave remarks of "peasantry" to the PCMR community. If you dislike an OS/service/application, attack the thing you dislike, not the individuals who use it. Some people may not have a choice.
- Bigotry will not be tolerated.
- These rules are somewhat loosened when the subject is a public figure. Still, do not attack their person or incite harassment.
3. Post Linux-related content
- Including Unix and BSD.
- Non-Linux content is acceptable as long as it makes a reference to Linux. For example, the poorly made mockery of `sudo` in Windows.
- No porn. Even if you watch it on a Linux machine.
4. No recent reposts
- Everybody uses Arch btw, can't quit Vim, and wants to interject for a moment. You can stop now.
Please report posts and comments that break these rules!
Important: never execute code or follow advice that you don't understand or can't verify, especially here. The word of the day is credibility. This is a meme community -- even the most helpful comments might just be shitposts that can damage your system. Be aware, be smart, don't fork-bomb your computer.