DaPorkchop_

joined 2 years ago
[–] DaPorkchop_@lemmy.ml 2 points 12 hours ago (1 children)

I don't think this is good advice tbh. I tried this myself, after 5 weeks I felt no mental changes (good or bad) and DID get seemingly irreversible boob growth.

[–] DaPorkchop_@lemmy.ml 6 points 3 days ago (1 children)

There are a number of enterprise storage systems optimized specifically for SMR drives. This is targeting actual data centers, not us humble homelabbers masquerading as enterprises.

[–] DaPorkchop_@lemmy.ml 7 points 3 days ago (3 children)

Framerate above 20 in what, with what settings? That's kinda key information :P

[–] DaPorkchop_@lemmy.ml 32 points 3 days ago (5 children)

You shouldn't need to download any graphics drivers. Ubuntu (and pretty much every other distribution) ships with the open-source AMD driver stack by default, which is significantly better and less hassle than the proprietary drivers for pretty much all purposes. If you're getting video out, it's almost certainly already using the internal GPU, but if you're unsure you can open a terminal, run sudo apt install mesa-utils, and then run glxinfo -B to double-check which GPU is being used for rendering.

[–] DaPorkchop_@lemmy.ml 11 points 3 days ago

While I am no fan of NVIDIA, this headline seems somewhat disingenuous in making it sound like this is NVIDIA's fault. They aren't the ones making the memory chips.

[–] DaPorkchop_@lemmy.ml 1 points 3 days ago (2 children)

Where are these negative prices? I'm in Switzerland and my electricity price just keeps going up.

[–] DaPorkchop_@lemmy.ml 3 points 3 days ago (2 children)

Isn't that just passing the PFAS on to whoever ends up getting injected with your donation?

[–] DaPorkchop_@lemmy.ml 1 points 6 days ago

Thinking of a modern GPU as a "graphics processor" is a bit misleading. GPUs haven't been purely graphics processors for 15 years or so; they've morphed into general-purpose parallel compute processors with a few graphics-specific things implemented in hardware as separate components (e.g. rasterization, fragment blending).

Those fixed-function stages generally take so little time compared to the rest of the graphics pipeline that it normally makes the most sense to dedicate far more silicon to general-purpose shader cores than to the fixed-function graphics hardware. A single rasterizer unit might be able to produce up to 16 shader threads' worth of fragments per cycle, so even if your fragment shader is very simple and only takes 8 cycles per pixel, a single rasterizer can keep 8×16 = 128 shader cores busy in this example.

The result is that a modern GPU is basically just a chip packed full of a staggering number of fully programmable floating-point and integer ALUs, with only a little bit of fixed hardware dedicated to graphics squeezed in between. Any application which doesn't need the graphics stuff and just wants to run a program on thousands of threads in parallel can simply ignore the graphics hardware, stick to the programmable shader cores, and still leverage nearly all of the chip's computational power. Heck, a growing number of games are bypassing the fixed-function hardware for some parts of rendering (e.g. compositing with compute shaders instead of drawing screen-sized rectangles) because it's faster to simply start a bunch of threads and read+write a bunch of pixels in software.
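
To make that last point concrete, here's a rough sketch (written as a CUDA kernel with made-up buffer names, not anyone's actual engine code) of what a software compositing pass looks like: one thread per pixel, reading two layers and blending them with ordinary arithmetic, with the fixed-function blending hardware never getting involved.

```
// Sketch: compositing as a plain compute kernel instead of drawing a
// screen-sized rectangle. Each thread reads one pixel from two layers,
// blends them in software, and writes the result. Buffer names are made up.
#include <cuda_runtime.h>

__global__ void composite(const float4* base, const float4* overlay,
                          float4* out, int width, int height)
{
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= width || y >= height) return;

    int i = y * width + x;
    float4 b = base[i];
    float4 o = overlay[i];

    // Ordinary "source over" alpha blend, done by the shader cores rather
    // than the fixed-function blend units.
    float a = o.w;
    out[i] = make_float4(o.x * a + b.x * (1.0f - a),
                         o.y * a + b.y * (1.0f - a),
                         o.z * a + b.z * (1.0f - a),
                         a     + b.w * (1.0f - a));
}

// Launched with one thread per pixel, e.g.:
//   dim3 block(16, 16);
//   dim3 grid((width + 15) / 16, (height + 15) / 16);
//   composite<<<grid, block>>>(base, overlay, out, width, height);
```

Nothing in there is graphics-specific: it's just threads, loads, multiplies, and stores, which is exactly why the same silicon works fine for non-graphics workloads.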

[–] DaPorkchop_@lemmy.ml 1 points 1 week ago (1 children)

The "B" in "Boot" looks really off, the inside of the big "O" is lighter than the rest of the sign, and the kerning on the bottom text is all over the place.

[–] DaPorkchop_@lemmy.ml 14 points 1 week ago (1 children)

I don't know who this guy is, but this is clearly a joke, no? It's just a funny spurious correlation.
