Contramuffin

joined 2 years ago
[–] Contramuffin@lemmy.world 2 points 3 days ago (1 children)

That would only encourage billionaires to nope off into space fantasy land (and of course they would still maintain power over governments). As it is right now, billionaires being on the same planet as normal people is at least incentivizing them to keep the planet in somewhat hospitable conditions.

As it is now, the best course of action would be to depose billionaires more quickly than they can escape off-planet. And to do it in a public and spectacular way, to put fear into the billionaires.

[–] Contramuffin@lemmy.world 2 points 4 days ago* (last edited 4 days ago) (1 children)

Add the backports PPA; that's what I do. Currently on Plasma 6.5.4 on Kubuntu.
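For anyone wanting to try this, the usual steps look roughly like the following. I'm assuming the PPA is still named `kubuntu-ppa/backports`; check its Launchpad page before adding it:

```shell
# Add the Kubuntu backports PPA (name assumed; verify on Launchpad first)
sudo add-apt-repository ppa:kubuntu-ppa/backports
sudo apt update
# Pull in the newer Plasma packages
sudo apt full-upgrade
```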

[–] Contramuffin@lemmy.world 2 points 5 days ago

Clearly you've never actually used petri dishes. Relevant meme:

[–] Contramuffin@lemmy.world 6 points 6 days ago

Hot Cheetos taste like cardboard to me. I've started preferring regular Cheetos over hot Cheetos, because at least they have flavor. That should probably answer your question.

I don't dump hot sauce on everything; that's like asking if someone with a sweet tooth dumps sugar onto everything. Also, there are different types of hot sauce with different levels of spice. You don't need to drown food in hot sauce; you just need to buy a spicier hot sauce and keep adding the same amount.

Interesting thing to note: your sense of taste and your physiological response to spice are two different things. There are (many) times when I eat something "spicy" and I'll sweat, but I don't taste the spice.

[–] Contramuffin@lemmy.world 4 points 1 week ago* (last edited 1 week ago)

Realistically, most of the antibiotic resistance issues actually arise from antibiotic usage in farm animals. Turns out you get faster, fatter growth if you microdose animals with antibiotics. Plus there's the added benefit that you don't need to care as much about animal hygiene or illnesses if they're just always on antibiotics. Of course, that's the perfect circumstance for promoting antibiotic resistance. And at the current massive scale of animal farming, antibiotic resistance spreads quickly.

But, you know, that's an acceptable cost when you consider all the shareholder value that you create by having slightly fatter animals.

Funny thing is, antibiotic resistance is an energetically costly adaptation, and studies show that if you drop antibiotic usage below a certain threshold, evolution actually favors deleting antibiotic resistance genes. In other words, if we stopped using antibiotics on farm animals, a large amount of antibiotic resistance would just evaporate, basically overnight. Then again, that would never happen with our current governments.
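The fitness-cost argument can be sketched with a toy two-strain competition model (all numbers here are made up for illustration, not from any real study): the resistant strain pays a growth penalty, while the antibiotic kills a fraction of the susceptible strain each generation. Above some antibiotic pressure the resistant strain takes over; remove the antibiotic and the growth cost drives it back out.

```python
def resistant_fraction(antibiotic_kill, cost=0.10, generations=200, start=0.5):
    """Toy two-strain competition model (illustrative numbers only).

    Both strains double each generation; the resistant strain pays a
    growth cost, and the susceptible strain loses `antibiotic_kill`
    of its offspring to the antibiotic.
    """
    s, r = 1 - start, start
    for _ in range(generations):
        s *= 2.0 * (1 - antibiotic_kill)  # susceptible: fast, but drug-sensitive
        r *= 2.0 * (1 - cost)             # resistant: immune, but slower
        total = s + r
        s, r = s / total, r / total       # track population fractions only
    return r

# Heavy antibiotic use: resistance sweeps to near 100%
heavy = resistant_fraction(antibiotic_kill=0.3)
# No antibiotic use: the 10% growth cost drives resistance out
none = resistant_fraction(antibiotic_kill=0.0)
```

Obviously real population genetics is messier (horizontal gene transfer, compensatory mutations), but this is the basic selection logic behind "resistance evaporates when the pressure drops."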

[–] Contramuffin@lemmy.world 11 points 1 week ago (1 children)

My understanding is that rice doesn't need to be soaking in water either, but the flooding helps with weeds, since rice can survive being submerged and most other plants can't.

[–] Contramuffin@lemmy.world 2 points 1 week ago* (last edited 1 week ago)

I've encountered that issue. Check your GPU settings. It seems that on some GPUs, the default voltage is too low, so if you run any game that taxes the GPU too much, it'll just crash and require a hard reboot. The solution is to either lower your GPU clocks or raise your GPU voltage. There was a post on Lemmy a while back that I've saved. Let me try to dig it up...

Edit: https://lemmy.dbzer0.com/post/58404212
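If you want to experiment with capping clocks before digging through that post, on AMD cards this can be done through sysfs. Everything below assumes the `amdgpu` driver and that your card is `card0` (both assumptions; NVIDIA uses entirely different tools):

```shell
# Assumes an AMD GPU on the amdgpu driver; run as root.
# List the available shader clock states (the '*' marks the active one)
cat /sys/class/drm/card0/device/pp_dpm_sclk

# Take manual control of the performance level
echo manual > /sys/class/drm/card0/device/power_dpm_force_performance_level

# Restrict the GPU to its lower clock states (state indices from the list above)
echo "0 1 2" > /sys/class/drm/card0/device/pp_dpm_sclk
```

These settings reset on reboot, so it's a safe way to test whether clocks are really the culprit before committing to a permanent undervolt/underclock.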

[–] Contramuffin@lemmy.world 1 points 1 week ago

If there's something you can buy, it's both less hassle and higher quality to just buy it rather than try to print it yourself. A 3D printer really shines when there's something you want that you can't buy. Custom parts, repair parts, things of that nature. Unfortunately, that also means the things you print aren't likely to be something you can plan for ahead of time.

[–] Contramuffin@lemmy.world 2 points 1 week ago* (last edited 1 week ago)

I get what you mean. I've suspected it's a combination of factors:

  1. People have a name for it now. You can't announce, or be proud of, something that you don't have a name for

  2. People are more accepting of autism now. You'd be more incentivized to hide autism if people thought it was a bad thing

  3. Autistic people tend to attract other autistic people. If you know one autistic person, you probably know a whole bunch of other autistic people too

But also, I just think that a lot of people underestimated how many people were autistic back then. A lot of high-functioning autistic people can pass for normal until you really get to know them. For instance, I'm like 99% sure that both of my parents are high-functioning autistic, and nobody ever suspected they might be. I brought up the possibility to them and their response was just, "yeah, I figured."

[–] Contramuffin@lemmy.world 4 points 1 week ago (2 children)

You know how there's the old schoolhouse stereotype that there's always a "weird kid" in every class? There's a good chance that kid was an undiagnosed autist.

The current estimate for autism rates is around 1 in 30, which means every classroom of ~30 kids is expected to have 1 autistic kid. That matches perfectly with the "weird kid in class" stereotype. People have recognized autism forever; that's why the stereotype exists. It's just that they didn't have an actual word for it yet.

[–] Contramuffin@lemmy.world 1 points 1 week ago

"explore and recombine" isn't really the words I would use to describe generative AI. Remember that it is a deterministic algorithm, so it can't really "explore." I think it would be more accurate to say that it interpolates patterns from its training data.

As for comparison to humans, you bring up an interesting point, but one that I think is somewhat oversimplified. It is true that human brains are physical systems, but just because it is physical does not mean that it is deterministic. No computer is able to come even close to modeling a mouse brain, let alone a human brain.

And sure, you could make the argument that you could strip out all extraneous neurons from a human brain to make it deterministic. Remove all the unpredictable elements: memory neurons, mirror neurons, emotional neurons. In that case, sure - you'd probably get something similar to AI. But I think the vast majority of people would then agree that this clump of neurons is no longer a human.

A human uses their entire lived experience to weigh a response. A human pulls from their childhood experience of being scared of monsters in order to make horror. An AI does not do this. It creates horror by interpolating between existing horror art to estimate what horror could be. You are not seeing an AI's fear - you are seeing other people's fears, reflected and filtered through the algorithm.

More importantly, a human brain is plastic, meaning that it can learn and change. If a human is told that they are wrong, they will correct themselves next time. This is not what happens with an AI. The only way that an AI can "learn" is by adding to its training data and then retraining the algorithm. It's not really "learning"; it's more accurate to say that you're deleting the old model and creating a new one that holds more training data. If this were applied to humans, it would be as if you grew an entirely new brain every single time you learned something new. Sounds inefficient? That's because it is. Why do you think AI is using up so much electricity and resources? Prompting a model and generating output doesn't use that many resources; it's actually the training and retraining that does.

To summarize: AI is a tool. It's a pretty smart tool, but it's a tool. It has some properties that are analogous to human brains, but lacks some properties that make it truly similar. It is in techbros' best interests to hype up the similarities and hide the dissimilarities, because hype drives up the stock prices. That's not to say that AI is completely useless. Just as you have said in your comment, I think it can be used to help make art, in a similar way that cameras have been used to help make art.

But in the end, when you cede the decision-making to the AI (that is, when you rely on AI for too much of your workflow), my belief is that the product is no longer yours. How can you claim that a generated art piece is yours if you didn't choose to paint a little easter egg in the background? If you didn't decide to use the color purple for this object? If you didn't accidentally paint the lips slightly skewed? Even supposing that an AI is completely human-like, the art is still not yours, because at that point, you're basically just commissioning an artist, and you definitely don't own art that you've commissioned.

To be clear, this is my stance on other tools as well, not just AI

[–] Contramuffin@lemmy.world 5 points 1 week ago* (last edited 1 week ago) (2 children)

I think there's a bit of a misconception about what exactly AI is. Despite what techbros try to make it seem, AI is not thinking in any way. It doesn't make decisions, because there's no entity there to decide. It is not an entity. It is an algorithm.

Specifically, it is a statistical algorithm. It is designed to associate an input to an output. When you do it to billions of input-output pairs, you can then use the power of statistics to interpolate and extrapolate, so that you can guess what the output might be, given a new input that you haven't seen before. In other words, you can perfectly replicate any AI with a big enough sheet of paper and enough time and patience.
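The "interpolate from input-output pairs" idea in miniature (a deliberately tiny stand-in: real generative models do this over billions of high-dimensional pairs, but the principle is the same):

```python
# Miniature stand-in for "learn input->output pairs, then guess new ones."
# The output for an unseen input is built entirely from the training pairs;
# same input in, same output out, every time.
training_pairs = [(0.0, 0.0), (1.0, 2.0), (2.0, 4.0), (4.0, 8.0)]

def predict(x):
    pairs = sorted(training_pairs)
    # clamp outside the training range (it can't conjure data it never saw)
    if x <= pairs[0][0]:
        return pairs[0][1]
    if x >= pairs[-1][0]:
        return pairs[-1][1]
    # linear interpolation between the two nearest seen inputs
    for (x0, y0), (x1, y1) in zip(pairs, pairs[1:]):
        if x0 <= x <= x1:
            t = (x - x0) / (x1 - x0)
            return y0 + t * (y1 - y0)

print(predict(3.0))  # 6.0: never seen, but fully determined by the pairs
```

Scale the pairs up by nine orders of magnitude and swap the straight line for a neural network, and you have the gist of it, which is also why you could, in principle, do the same thing with a big enough sheet of paper.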

That is why AI outputs can't be considered novel. Inherently, it is just a tool that processes data. As an analogy, you haven't generated any new data by taking the average of 5 numbers in Excel; you have merely processed the existing data.

Even if a human learns from AI-generated art, their art is still art, because a human is not a deterministic algorithm.

The problem arises when someone uses generative AI for a significant and notable portion of their workflow. At this point, this is essentially equivalent to applying a filter to someone else's artwork and calling it new. The debate lies in that there is no clear point for when AI takes up an appropriate vs. inappropriately large portion of a person's workflow...

 

To be pedantic, some MAPKs are activated by mitogens. We're conveniently ignoring that fact, because it's funnier that some MAPKs are not.

4
r/notkenm (lemmy.world)
submitted 2 years ago* (last edited 2 years ago) by Contramuffin@lemmy.world to c/requests@lemmit.online
2
r/inceltear (lemmy.world)
submitted 2 years ago* (last edited 2 years ago) by Contramuffin@lemmy.world to c/requests@lemmit.online
3
r/inceltears (lemmy.world)
submitted 2 years ago* (last edited 2 years ago) by Contramuffin@lemmy.world to c/requests@lemmit.online
5
r/engineeringporn (lemmy.world)
submitted 2 years ago* (last edited 2 years ago) by Contramuffin@lemmy.world to c/requests@lemmit.online