[-] ashe@lemmy.starless.one 45 points 1 year ago* (last edited 1 year ago)

You can run an LLM on a phone (tried it myself once, with llama.cpp), but even on the simplest model I could find it was doing maybe one word every few seconds while using up 100% of the CPU. The quality is terrible, and your battery wouldn't last an hour.
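
If anyone wants to try it themselves: I used the llama.cpp CLI itself, but the llama-cpp-python bindings do the same thing in a few lines. Everything below (model file, context size, thread count) is a placeholder sketch, not my exact setup:

```python
# Rough sketch of running a small quantized GGUF model on-device with
# the llama-cpp-python bindings. Model path and parameters are placeholders.
from llama_cpp import Llama

llm = Llama(
    model_path="./tiny-model-q4.gguf",  # hypothetical small quantized model
    n_ctx=512,       # keep the context small to fit in phone RAM
    n_threads=4,     # most phone SoCs only have a few fast cores
)

out = llm("Q: What is the capital of France?\nA:", max_tokens=32)
print(out["choices"][0]["text"])
```

Even with a setup like this, expect it to peg the CPU and crawl along at a word every few seconds.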

[-] ashe@lemmy.starless.one 34 points 1 year ago

"How to get a job: have work experience."

"How to get work experience: get a job."

[-] ashe@lemmy.starless.one 16 points 1 year ago

Restricting the internet based on where you happen to live can only end badly.

[-] ashe@lemmy.starless.one 19 points 1 year ago* (last edited 1 year ago)

...for a few hours, until you need water. And food. And shelter.

Please don't tell me you think it's a-okay that people end up living alone in the woods because our extremely advanced modern society, with practically limitless resources compared to nearly all of history, can't provide basic needs like that for everyone participating in it.

[-] ashe@lemmy.starless.one 25 points 1 year ago

Exactly. Automation shouldn't kick some people out of jobs while leaving others just as overworked as before; it should take over the tasks that don't absolutely need humans and reduce the workload of (currently) irreplaceable people, so the work one person did before can be shared among more people who each still get the same salary.

Hell, unemployment as a whole shouldn't exist in the modern era. If there are "too few jobs", decrease working hours and increase wages accordingly so the total monthly/yearly/whatever pay stays the same. And if there just physically aren't enough resources to accommodate so many people having decent salaries (which is absolutely not the case right now), then we should start talking about overpopulation.

[-] ashe@lemmy.starless.one 35 points 1 year ago

I could be wrong, but I don't think there's even a way to fully prevent adblocking without something like the proposed Web Environment Integrity API, since it's all client-side and the browser can simply choose not to render any ads.

Overall I do agree that fewer people using adblockers means less attention from corporations and fewer adblock-blockers like YouTube's, but I'm conflicted on whether that's a good enough reason to have most people suffer through so many ads.

[-] ashe@lemmy.starless.one 14 points 1 year ago

Yeah, it's insane we still have to deal with this in 2023... and it's even worse for trans people, "transgenderism must be eradicated from public life entirely" and all that.

There are people who aren't financially independent yet that are facing the very real possibility of getting disowned by their family and thrown out on the street if they come out as anything but cishet. It sucks, but keeping this kind of information private can be lifesaving.

[-] ashe@lemmy.starless.one 20 points 1 year ago* (last edited 1 year ago)

> Its enough to stream 4k compressed

no it isn't.

[-] ashe@lemmy.starless.one 15 points 1 year ago

https://doi.org/10.1093/mnras/stad2032

I wasn't able to read the actual paper since it's behind a paywall, but it's not exclusively a tired light (TL) model. They say this in the abstract:

> Deep space observations of the James Webb Space Telescope (JWST) have revealed that the structure and masses of very early Universe galaxies at high redshifts (z ∼ 15), existing at ∼0.3 Gyr after the Big Bang, may be as evolved as the galaxies in existence for ∼10 Gyr. The JWST findings are thus in strong tension with the ΛCDM cosmological model.
>
> While tired light (TL) models have been shown to comply with the JWST angular galaxy size data, they cannot satisfactorily explain isotropy of the cosmic microwave background (CMB) observations or fit the supernovae distance modulus vs. redshift data well.
>
> We present a model with covarying coupling constants (CCC), starting from the modified FLRW metric and resulting Einstein and Friedmann equations, and a CCC + TL hybrid model. They fit the Pantheon+ data admirably, and the CCC + TL model is compliant with the JWST observations. [..] One could infer the CCC model as an extension of the ΛCDM model with a dynamic cosmological constant.
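
(For anyone unfamiliar, the "distance modulus vs. redshift" bit is just the standard textbook relation between apparent magnitude m, absolute magnitude M, and luminosity distance, not something specific to this paper:

$$\mu(z) = m - M = 5\,\log_{10}\!\left(\frac{d_L(z)}{10\ \mathrm{pc}}\right)$$

Each model, whether ΛCDM, TL, CCC, or CCC + TL, predicts its own d_L(z), and that prediction is what gets fit against the Pantheon+ supernova data.)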

[-] ashe@lemmy.starless.one 19 points 1 year ago

Where did you get either of those statistics?

[-] ashe@lemmy.starless.one 35 points 1 year ago

I don't want them to know anything that isn't completely necessary, and even that should be wiped as soon as it's no longer relevant. Why should I be okay with corps recording all of my online behavior and preferences just so they can sell that info for a bit of extra profit?

submitted 1 year ago* (last edited 1 year ago) by ashe@lemmy.starless.one to c/selfhosted@lemmy.world

I currently have an old office PC turned into a 24/7 Linux server for self-hosting, and a desktop mostly for programming and playing games (Linux as the host plus a Windows VM with a passed-through GPU). The server's i5-3330 usually sits at ~10-15% usage.

Here's the actual idea: what if, instead of having a separate server and desktop, I had one beefy computer running 24/7 as the server, and just spun up a Linux or Windows VM whenever I needed a desktop? GPUs and USB devices would be passed through, and I could buy a PCIe SATA or NVMe controller to pass through as well, so I wouldn't have to worry about virtualized disk overhead.

I'm almost certain I could make this work, but I wonder if it's even worth it: would it consume less power? What about wear on the components from staying powered 24/7? Accessing the NAS would certainly be faster without the whole "Network-Attached" part, and powering the desktop on for remote access could just be a command over SSH instead of some convoluted remote Wake-on-LAN setup that I haven't bothered with yet.
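
To make the SSH part concrete, this is roughly what I have in mind, using the libvirt Python bindings. The URI and VM name are made up for the sketch; `virsh start desktop` over plain SSH would do the same thing:

```python
# Rough sketch: start the desktop VM remotely through libvirt over SSH.
# Assumes libvirt-python is installed and a VM defined as "desktop";
# both the connection URI and the VM name are placeholders.
import libvirt

conn = libvirt.open("qemu+ssh://me@homeserver/system")  # hypothetical host
dom = conn.lookupByName("desktop")

if not dom.isActive():
    dom.create()  # equivalent to `virsh start desktop`
    print("desktop VM starting")
else:
    print("desktop VM already running")

conn.close()
```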

I'd love to hear your thoughts on this.

Edit 2 months later: Just bought a 7950X3D. The 3D V-cache half of it runs the virtualized desktop, with the other cores used for the host and other VMs. It works perfectly when passing through a dedicated GPU, but iGPU passthrough seems very difficult if not impossible; I couldn't manage it.

Edit even later-er: iGPU passthrough is possible on Ryzen 7000 after all; everything works great now.
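
In case anyone wants to copy the core split, here's a rough sketch of restricting the desktop VM's vCPUs to the V-cache CCD with the libvirt Python bindings. In practice you'd probably bake this into the domain XML with `<vcpupin>` instead, and the assumption that host CPUs 0-15 are the V-cache CCD's threads should be checked against `lscpu -e` and the cache topology on your own system:

```python
# Rough sketch: pin a running VM's vCPUs to the 3D V-cache CCD.
# Assumes libvirt-python, a running domain named "desktop" with 16 vCPUs,
# and that host CPUs 0-15 are the V-cache CCD's threads (verify with lscpu!).
import libvirt

HOST_CPUS = 32                   # 7950X3D: 16 cores / 32 threads
VCACHE_CPUS = set(range(0, 16))  # assumption: CCD0's threads carry the V-cache

conn = libvirt.open("qemu:///system")
dom = conn.lookupByName("desktop")  # placeholder VM name

# Restrict every vCPU to the V-cache CPU set (not a strict 1:1 pinning).
cpumap = tuple(cpu in VCACHE_CPUS for cpu in range(HOST_CPUS))
for vcpu in range(16):
    dom.pinVcpu(vcpu, cpumap)

conn.close()
```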
