[-] thundermoose@lemmy.world 2 points 3 weeks ago

It's really more of a proxy setup that I'm looking for. With Thunderbird, you can get what I'm describing for a single client. But if I want to have access to those emails from several clients, there needs to be a shared server to access.

docker-mbsync might be a component I could use, but it doesn't sound like there's a ready-made solution for this today.
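For reference, the underlying pattern (whether via docker-mbsync or plain isync/mbsync on a timer) is a channel that pulls IMAP into a local Maildir, which an IMAP daemon like Dovecot could then serve to multiple clients. A minimal sketch of an ~/.mbsyncrc, with placeholder host/user, assuming a recent isync (older versions spell the options SSLType/Master/Slave instead of TLSType/Far/Near):

```
IMAPAccount upstream
Host imap.example.com
User me@example.com
PassCmd "pass show imap.example.com"
TLSType IMAPS

IMAPStore upstream-remote
Account upstream

MaildirStore local
Path ~/Mail/
Inbox ~/Mail/INBOX

Channel mirror
Far :upstream-remote:
Near :local:
Patterns *
Create Near
Sync Pull
```

Running `mbsync mirror` on a schedule keeps the Maildir current; the "shared server" piece is whatever IMAP server you point at that Maildir.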

[-] thundermoose@lemmy.world 1 point 3 weeks ago

Yeah, they are ideally the same mailbox. I'd like a similar experience to Gmail, but with all the emails rehomed to my server.

[-] thundermoose@lemmy.world 3 points 2 months ago

Ralph Nader saying that he thinks the death toll is over 200k is not a reasonable source to cite. The 30-50k estimates from most sources are already appallingly high. There's an active contingent of Ben Shapiro types trying to convince everyone that what Israel is doing is fine; don't give them ammo to cast doubt on the official death count.

[-] thundermoose@lemmy.world 4 points 3 months ago

I've been using Mint for about 6 months now and it works with Nvidia just fine, BUT the new-user experience isn't great. You have to use the nomodeset kernel option and install the Nvidia drivers, otherwise you'll boot to a black screen.

Helpful guide: https://forums.linuxmint.com/viewtopic.php?t=421550
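Since this trips up a lot of new installs, the gist (paths are the Mint/Ubuntu defaults, and the driver package below is just an example; Mint's Driver Manager will recommend the right one for your card):

```
# 1. At the GRUB boot menu, press 'e' and add nomodeset to the line
#    starting with "linux", or make it persistent in /etc/default/grub:
GRUB_CMDLINE_LINUX_DEFAULT="quiet splash nomodeset"

# 2. Apply the change and install the proprietary driver:
sudo update-grub
sudo apt install nvidia-driver-535   # example version; prefer Driver Manager's pick

# 3. Once the Nvidia driver is installed, nomodeset can be removed again.
```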

[-] thundermoose@lemmy.world 4 points 4 months ago

You're using "machine learning" interchangeably with "AI." We've been doing ML for decades, but it's not what most people would consider AI and it's definitely not what I'm referring to when I say "AI winter."

"Generative AI" is the more precise term for what most people are thinking of when they say "AI" today, and it's what is driving investments right now. It's still very unclear what the actual value of this bubble is. There are tons of promises and a few clear use cases, but not much on-the-ground proof yet that it's as wildly profitable as the industry claims.

[-] thundermoose@lemmy.world 2 points 4 months ago

AI is not self-sustaining yet. Nvidia is doing well selling shovels, but most AI companies are not profitable. Stock prices and investor valuations are effectively bets on the future, not measurements of current success.

From this Forbes list of top AI companies, all but one make their money from something besides AI directly. Several of them rode the Web3 hype wave too; that didn't make them Web3 companies.

We're still in the early days of AI adoption and most reports of AI-driven profit increases should be taken with a large grain of salt. Some parts of AI are going to be useful, but that doesn't mean another winter won't come when the bubble bursts.

[-] thundermoose@lemmy.world 2 points 5 months ago

I do quite like the stability of Cinnamon/Debian, and I think this problem is solvable (even if I have to solve it myself). I generally do not want to spend a lot of time futzing around with my desktop environment, but this is one thing I need to have.

[-] thundermoose@lemmy.world 3 points 6 months ago

Hyperfixating on producing performant code by using Rust (when you code in a very particular way) makes applications worse. Good API and system design are a lot easier when you aren't constantly having to think about memory allocations and reference counting. Rust puts that dead-center of the developer experience with pointers/ownership/Arcs/Mutexes/etc., and for most webapps it just doesn't matter how memory is allocated. It's cognitive load for no reason.
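To make the cognitive-load point concrete, here's a standalone sketch (not Lemmy's actual code; names are made up) of the ceremony needed just to share one counter across request-handling threads in Rust:

```rust
use std::sync::{Arc, Mutex};
use std::thread;

// Every worker that touches shared state pays the Arc::clone + move +
// lock() ceremony, even when the "business logic" is a single line.
fn count_visits(workers: usize) -> u64 {
    let visits = Arc::new(Mutex::new(0u64));
    let handles: Vec<_> = (0..workers)
        .map(|_| {
            let visits = Arc::clone(&visits); // one clone per thread
            thread::spawn(move || {
                *visits.lock().unwrap() += 1; // the actual logic
            })
        })
        .collect();
    for h in handles {
        h.join().unwrap();
    }
    let total = *visits.lock().unwrap();
    total
}

fn main() {
    println!("{}", count_visits(4)); // prints 4
}
```

A higher-level language still needs synchronization, of course, but the ownership plumbing (Arc::clone, move closures, lock scopes) wouldn't leak into every call site the way it does here.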

The actual code running for the majority of webapps (including Lemmy) is not that complicated: you're just applying some business logic and doing CRUD operations against datastores. It's a lot more important to consider how your app interacts with its dependencies than how to get your business logic to be hyper-efficient. Your code is going to be waiting on network I/O and DB operations most of the time anyway.

Hindsight is 20/20 and I'm not faulting anyone for not thinking through a personal project, but I don't think Rust did Lemmy any favors. At the end of the day, it doesn't matter how performant your code is if you make bad design and dependency choices. Rust makes it harder to see these bad choices because you have to spend so much time in the weeds.

To be clear, I'm not shitting on Rust. I've used it for a few projects and it's great for apps where processing performance is important. It's just not a good choice for most webapps; you'd be far better off in a higher-level language.

[-] thundermoose@lemmy.world 4 points 6 months ago

I wouldn't shortchange how much lowering the barrier to entry can help. You have to fight Rust a lot to build anything complex, and that can have a chilling effect on contributions. This is not a dig at Rust; it forces you to build things in a particular way because it must guarantee memory safety at compile time. That isn't to say that Rust's approach is the only way to be sure your code is safe, mind you, just that Rust's insistence on memory safety at compile time is constraining.

To be frank, this isn't necessary most of the time, and Rust will force you to spend ages worrying about problems that may not apply to your project. Java gets a bad rap, but it's second only to Python in ease of use. When you're working on an API-driven webapp, you really don't need Rust's efficiency as much as you need a well-defined architecture that people can easily contribute to.

I doubt it'll magically fix everything on its own, but a combo of good contribution policies and a more approachable codebase might.

[-] thundermoose@lemmy.world 2 points 6 months ago

i ain't won jack alot from the squattery

[-] thundermoose@lemmy.world 3 points 8 months ago

I think operating at 5V input might be a technical constraint for them. Compatibility revisions for existing hardware are a lot more difficult if the input voltage is 9x higher. Addressing that isn't as easy as slapping a buck converter on the board.

Not saying requiring 5A was the right call, just that I can see reasons for not using USB-PD.

[-] thundermoose@lemmy.world 4 points 9 months ago

In reading this thread, I get the sense that some people don't (or can't) separate gameplay and story. Saying, "this is a great game" to me has nothing to do with the story; the way a game plays can exist entirely outside a story. The two can work together well and create a fantastic experience, but "game" seems like it ought to refer to the thing you do since, you know, you're playing it.

My personal favorite example of this is Outer Wilds. The thing you played was a platformer puzzle game and it was executed very well. The story drove the gameplay perfectly and was a fantastic mystery you solved as you played. As an experience, it was about perfect to me; the gameplay was fun and the story made everything you did meaningful.

I loved the story of TLoU and was thrilled when HBO adapted it. Honestly, it's hard to imagine anyone enjoying the thing TLoU had you do separately from the story it was telling. It was basically "walk here, press X" most of the time with some brief interludes of clunky shooting and quicktime events.

I get the gameplay making the story more immersive, but there's no reason the gameplay shouldn't be judged on its own merit separately from the story.


thundermoose

joined 1 year ago