Saledovil

joined 2 years ago
[–] Saledovil@sh.itjust.works 1 points 8 hours ago

Maybe there were cookies from the original account on both devices?

[–] Saledovil@sh.itjust.works 1 points 2 days ago

To clarify something, I don't believe that current AI chatbots are sentient in any shape or form, and as they are now, they never will be. There's at least one piece missing before we have sentient AI, and until we have that, making the models larger won't make them sentient. LLM chatbots take the text and calculate how likely each word is to follow it. Then, based on these probabilities, a result is picked at random. That's the reason for the hallucinations that can be observed. It's also the reason why the hallucinations will never go away.
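Roughly, the mechanism looks like the sketch below (a toy illustration in Python; the prompt and the probabilities are made up, a real model scores its entire vocabulary):

```python
import random

# Toy next-token distribution for the prompt "The capital of France is".
# Made-up numbers; a real LLM assigns a probability to every token in its vocabulary.
next_token_probs = {
    "Paris": 0.90,
    "Lyon": 0.04,
    "London": 0.03,   # wrong, but still has a nonzero probability
    "Berlin": 0.03,
}

# Sampling: pick the next word at random, weighted by its probability.
# Usually you get "Paris"; occasionally you get a confidently wrong answer.
tokens = list(next_token_probs)
weights = list(next_token_probs.values())
print(random.choices(tokens, weights=weights, k=1)[0])
```

Because the output is a weighted random draw, even a small probability mass on a wrong continuation is enough to produce a hallucination now and then.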

The AI industry lives on speculative hype; all the big players are losing money on it. Hence, people saying that AI can become a god and kill us all help further that hype. After all, if it can become a god, then all we need to do is tame said god. Of course, the truth is that it currently can't become a god, and maybe the singularity is impossible. As long as no government takes the AI doomers seriously, they provide free advertisement.

Hence AI should be opposed on the basis that it's unreliable and wasteful, not that it's an existential threat. Claiming that current AI is an existential threat fosters hype, which increases investment, which in turn results in more environmental damage from wasteful energy usage.

[–] Saledovil@sh.itjust.works 18 points 3 days ago

Judge finds that Anthropic has to pay restitution to the Reddit users. Affirms that posts belong to users.

Well, I can dream.

[–] Saledovil@sh.itjust.works 4 points 3 days ago (2 children)

Hey, just wanted to plug an grassroots advocacy nonprofit, PauseAI, that’s lobbying to pause AI development and/or increase regulations on AI due to concerns around the environment, jobs, and safety. [emphasis added]

No, they're concerned about AI becoming sentient, taking over the world, and killing us all. This, in turn, makes them little different from the people pushing for unlimited AI development, as the only difference between those two groups is that the latter believes they'll be able to control the superintelligence.

If you look at their sources, they most prominently feature surveys of people who overestimate what we currently call AI. Other surveys are flat-out misrepresented. The survey cited for a 25% chance that we'll reach AGI, the 2025 State of AI Engineering, admits that for P(doom) they didn't define 'doom', nor the time frame of said doom. So, basically, if we die out because we all fap to AI images of titties instead of getting laid, that counts as AI-induced doom. Also, on said survey, 10% answered 0% chance, with 0% being one of the only two precise options offered; most other options covered ranges of 25 percentage points each. The other precise option was 100%.

Basically, those guys are useful idiots for the AI industry, pushing a narrative not too dissimilar from the one pushed by the AI boosters. Don't support them.

[–] Saledovil@sh.itjust.works 2 points 3 days ago

No, it is not security through obscurity. It’s a message signature algorithm, which are used in cryptography all the time.

Yes, it is. The scheme is that when you take a picture, the camera signs said picture. The key is stored somewhere in the camera. Hence, the secrecy of the key hinges on the attacker not knowing how the camera accesses the key. Once the attacker knows that, they can extract the key from the camera. So security hinges on the secrecy of the camera design/the protocol the camera uses to access the key, in addition to the secrecy of the key itself. Therefore, it is security by obscurity.
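To make this concrete, here's a minimal sketch of such a sign-in-camera scheme (illustrative Python using the `cryptography` package; the key handling and the names are my assumptions, not any specific vendor's design):

```python
# Minimal sketch of a sign-in-camera scheme (illustrative only).
# The scheme stands or falls on the private key staying secret inside the
# device -- which is exactly the obscurity being relied on.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# In a real camera this key would be burned into firmware or a secure element.
CAMERA_PRIVATE_KEY = Ed25519PrivateKey.generate()
CAMERA_PUBLIC_KEY = CAMERA_PRIVATE_KEY.public_key()

def sign_photo(image_bytes: bytes) -> bytes:
    """What the camera does when it takes a picture."""
    return CAMERA_PRIVATE_KEY.sign(image_bytes)

def verify_photo(image_bytes: bytes, signature: bytes) -> bool:
    """What anyone holding the vendor's public key can check."""
    try:
        CAMERA_PUBLIC_KEY.verify(signature, image_bytes)
        return True
    except Exception:
        return False

# An attacker who extracts the private key from the hardware can sign any
# fabricated image, and it will verify just fine.
fake = b"totally genuine photograph"
print(verify_photo(fake, sign_photo(fake)))  # True
```

The signature math itself is sound; the weak point is that the signing key has to live in hardware the attacker can hold in their hands.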

[–] Saledovil@sh.itjust.works 2 points 4 days ago (2 children)

That's security by obscurity. Given time, an attacker with physical access to the device will get every bit of data from it. And yes, you could mark it as compromised, but then there's nothing stopping the attacker from just buying another camera and stripping the key from that one too, since they already know how. And yes, you could revoke all the keys for the entire model range and come up with a different puzzle for the next camera, but the attacker will just crack that one as well.

Hiding the key on the camera in such a way that the camera can access it, but nobody else can, is impossible. We simply need to accept that a photograph or a video is no longer evidence.

The idea in your second paragraph is good though, and much easier to implement than your first one.

[–] Saledovil@sh.itjust.works 1 points 1 week ago

Well, the entity in the comic is like an actual, sentient AI rather than what we have now, given that it expresses and acts on a desire (in this case, wanting to be human). We're probably several breakthroughs away from actually building a sentient AI.

[–] Saledovil@sh.itjust.works 19 points 1 week ago (1 children)

It's like, most trans people are working class. In fact, unless membership in a group is defined by wealth, most members of said group are working class.

[–] Saledovil@sh.itjust.works 2 points 1 week ago (1 children)

What I mean is, there's nothing we could do if the aliens did that to us.

[–] Saledovil@sh.itjust.works 7 points 1 week ago (4 children)

Probably not. Just being capable of interstellar spaceflight opens up some really nice ways to kill a lot of people, such as redirecting asteroids, or just dropping nuclear bombs from orbit. There'd be nothing we could do. Also, what if the aliens show up with more soldiers than we have people?

[–] Saledovil@sh.itjust.works 2 points 1 week ago

A transhuman. Funnily enough, the term is normally used to describe humans who want to be computers. I suppose we could call those transCPUs.

[–] Saledovil@sh.itjust.works 2 points 1 week ago

Gnosia

Single player social deduction game/ visual novel.

 

Hello, as stated in the title, I used to be able to generate a batch of 4 images, but when I try to do this now, I get the following error: CUDA out of memory. Tried to allocate 4.50 GiB. GPU 0 has a total capacity of 7.78 GiB of which 780.00 MiB is free. Including non-PyTorch memory, this process has 6.57 GiB memory in use. Of the allocated memory 6.37 GiB is allocated by PyTorch, and 56.09 MiB is reserved by PyTorch but unallocated. If reserved but unallocated memory is large try setting PYTORCH_CUDA_ALLOC_CONF=expandable_segments:True to avoid fragmentation. See documentation for Memory Management

This started happening right after I updated SD.Next to the most recent version. I don't know which version I was using beforehand, since I don't update it frequently. I assume I installed it sometime around April this year.

I'm using an NVIDIA GeForce RTX 2070.
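The error message itself suggests setting PYTORCH_CUDA_ALLOC_CONF=expandable_segments:True. For reference, here's roughly what that looks like when set from Python before torch is imported (just a sketch; I haven't confirmed whether this alone is enough on my setup, and exporting the variable in the shell before launching SD.Next should have the same effect):

```python
import os

# Must be set before the first CUDA allocation, i.e. before importing torch.
# "expandable_segments:True" lets the caching allocator grow its segments
# instead of reserving fixed-size blocks, which reduces fragmentation.
os.environ["PYTORCH_CUDA_ALLOC_CONF"] = "expandable_segments:True"

import torch

print(torch.cuda.get_device_name(0))   # e.g. "NVIDIA GeForce RTX 2070"
print(torch.cuda.mem_get_info())       # (free_bytes, total_bytes)
```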

Does anybody have any idea what I could try?

 

In the mod "Save Our Ship 2" I managed to capture a pirate ship. The large, red ship is the pirate ship, and the small asymmetrical ship is mine. First, the pirates sent a boarding party using small personal shuttles. These landed spread out around my ship, allowing my colonists to gang up on the individual pirates and take them out.

Then I sent my guys over to the pirate ship, in an effort to take them out. I had them use the airlock as a chokepoint. The pirates threw themselves at my colonists until they routed. The pirates then tried running to the edge of the map in order to escape. A lot of the remaining pirates didn't have space suits at this point, and I had lined up my colonists to shoot the fleeing pirates, so none of them actually managed to reach the edge of the map. Which wouldn't have helped them either way, because they're in geostationary orbit.

This would be a lot more difficult if the enemy AI wasn't brain dead.

 

Marked as a spoiler because it's a monster from Anomaly. The thing is, these things are not scary, because they don't have the AI necessary to capitalize on their invisibility. They act like typical raiders, meaning you can place your tough melee guys in a chokepoint, and they'll come to get their skulls bashed in.

It would probably be better if they instead acted like predatory animals, milling around on the map, and occasionally hunting one of your colonists. If they'd then avoid groups of colonists, while also always attacking in a group themselves, they'd be a truly terrifying monster. Basically, you'd have to hide out in your base, or go out to hunt them. And if you do choose to wait them out, there would be no indication that they've left.

 

Follow-up to my last post: the problem has been resolved using a killbox. Admittedly, I had to reload several times before I got it right. So in about 5 out of 6 universes, the colony died.

 

The ongoing toxic fallout means that the sunblocker the mechanoids brought along won't cause any damage for the time being.

 

Game is "Vintage Story". It's similar to Minecraft, but slower paced.

 

Using the Create mod as part of the 'All the Mods 8' modpack, I built a bread factory. The contraption on the right automatically harvests the wheat. The wheat is separated from the wheat seeds using a brass tunnel. At this point, half the wheat is stored in a chest to be used as animal feed. The rest of the wheat is fed into a millstone, which turns it into flour. Flour is a feature unique to the Create mod, and it allows for more efficient bread baking. By mixing the flour with water in a mixer, dough is created, which is then baked in the automatic oven, which uses the bulk blasting feature to turn dough into bread. This allows creating one bread from one unit of wheat. The entire machine is powered by a large water wheel.

One way to improve the machine would be to make the farming area larger: currently, 60 plants are growing at the same time, and making the area 11 * 11 instead of 9 * 9 would increase the number of crops being fed into the machine. Also, I should probably decorate the bakery as well.

 

A small workshop with 2 machines, built using the Create mod as part of the 'AllTheMods8' modpack. I like how the energy logistics naturally look more interesting compared to other tech mods.
