submitted 1 month ago by boem@lemmy.world to c/technology@lemmy.world
[-] MajorHavoc@programming.dev 132 points 1 month ago* (last edited 1 month ago)

"asked if Neuralink would perform another surgery to fix or replace the implant, but the company declined"

Evidence of whether the company saw them as a person, or felt any ethical obligation...

It's an interesting era when an organization can have a single user, choose to leave that user with 85% of the promised functionality no longer working, and yet happily pursue its second user.

[-] deegeese@sopuli.xyz 83 points 1 month ago

It’s not a patient to help, it’s an early prototype to be abandoned.

[-] Sludgehammer@lemmy.world 35 points 1 month ago

Move fast and break people.

[-] Kraiden@kbin.run 52 points 1 month ago

I prefer flipping that number on its head: 15%. They delivered 15% of what they promised and are now saying "fuck it."

It's the equivalent of writing your name on the exam, and then sitting there doodling for the rest of the time.

[-] golli@lemm.ee 41 points 1 month ago

with 85% of the promised functionality no longer functional

To be fair, 85% of the threads retracting doesn't seem to translate to an equal loss of function. The article mentions:

Neuralink was quick to note that it was able to adjust the algorithm used for decoding those neuronal signals to compensate for the lost electrode data. The adjustments were effective enough to regain and then exceed performance on at least one metric—the bits-per-second (BPS) rate used to measure how quickly and accurately a patient with an implant can control a computer cursor.

I think it will be impossible for us to assess how much it actually impacts function in real-world use.

It seems clear that this is a case of learning by trial and error, which, considering the stakes, doesn't seem like the right approach.

The question this article doesn't answer is whether they have learned anything at all, or are just proceeding to do the same thing again. And if they have learned something, is there anything preventing it from being applied to the first patient?
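For context on the BPS figure quoted above: cursor-control bitrate in BCI work is commonly computed Fitts's-law style, as an index of difficulty in bits divided by the time to acquire a target. This is a hedged sketch of that general idea, not Neuralink's exact method; the function name and the example numbers are illustrative assumptions.

```python
import math

def fitts_bps(distance: float, width: float, movement_time_s: float) -> float:
    """Fitts's-law-style throughput: index of difficulty (bits) per second.

    distance: cursor travel distance to the target
    width: target width (same units as distance)
    movement_time_s: time taken to acquire the target, in seconds
    """
    # Shannon formulation of the index of difficulty, in bits
    index_of_difficulty = math.log2(distance / width + 1.0)
    return index_of_difficulty / movement_time_s

# Illustrative numbers: a target 310 units away and 10 units wide reached
# in one second is log2(32) = 5 bits of difficulty, i.e. 5 BPS.
print(fitts_bps(310, 10, 1.0))  # 5.0
```

The point of a throughput metric like this is that a decoder tweak which lets the patient hit the same targets faster raises BPS even with fewer working electrodes, which is presumably how "regain and then exceed performance" is possible.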

[-] MajorHavoc@programming.dev 20 points 1 month ago

if they have learned something, is there anything preventing it from being applied to the first patient?

That's part of what makes me see this as a really bad look.

"Install it deeper" isn't rocket science, and it sounds like their first volunteer is willing.

They just want the extra data from leaving their first volunteer where they landed.

Human subject experiments are supposed to carry more long term obligation than this.

[-] HeyThisIsntTheYMCA@lemmy.world 6 points 1 month ago* (last edited 1 month ago)

Seriously. My father was part of a Deep Brain Stimulation trial. The follow-up was ten years, just for the trial. The implant itself lasted the rest of his life, which, I'm not feeling like doing the math, was another five or six years.

[-] CosmicCleric@lemmy.world 11 points 1 month ago* (last edited 1 month ago)

I think it will be impossible for us to asses how much it actually impacts function in real world use case.

It does seem fair to say, though, that with 85% fewer probes feeding in data, you're losing a fair to large amount of fidelity, and an algorithm can only make up for so much of that.

A potentially bad analogy, but think of it as a high bitrate versus a low bitrate when listening to music. The quality of the music will be notably different, but you would still be able to hear both songs in their entirety.

At the end of the day, data the algorithm was originally expected to work with is now missing.
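The fidelity point can be put in toy-model terms: estimating a signal from many noisy channels averages the noise down, and dropping most of the channels brings the noise back. The channel counts and noise level below are made-up assumptions, purely for illustration.

```python
import random
import statistics

random.seed(0)

def decoded_error(n_channels: int, trials: int = 2000) -> float:
    """Average absolute error when estimating a constant signal as the
    mean of noisy channel readings. More channels average out more noise."""
    true_signal = 1.0
    errors = []
    for _ in range(trials):
        readings = [true_signal + random.gauss(0, 0.5) for _ in range(n_channels)]
        estimate = statistics.fmean(readings)
        errors.append(abs(estimate - true_signal))
    return statistics.fmean(errors)

full = decoded_error(64)     # hypothetical full electrode count
reduced = decoded_error(10)  # roughly 15% of channels remaining
print(full, reduced)  # error grows as the channel count drops
```

A smarter decoder can reweight the surviving channels, but it can't recover information that was never recorded, which is the commenter's point.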

~Anti~ ~Commercial-AI~ ~license~ ~(CC~ ~BY-NC-SA~ ~4.0)~

[-] NotMyOldRedditName@lemmy.world 7 points 1 month ago

Currently they've had him control a cursor. He can left-click and right-click with it.

It seems he can now perform as well as he could before the problem, so that extra bitrate apparently wasn't needed for that task.

What this probably means is that, as they learn more about it, he won't be able to do as much.

Maybe a year in, the second patient, with full fidelity, is able to attach it to a robotic arm and fetch themselves a drink, while Nolan, though he can still click just as well, won't be able to do that.

Also, if they do fix it eventually (they didn't say never, just not yet), they'll never know whether that discrepancy would have occurred.

[-] lorkano@lemmy.world 6 points 1 month ago

For sure they learned something; they must have some ideas about why those threads retracted. They also confirmed the viability of the technology with the tests done before the retraction.

[-] Hacksaw@lemmy.ca 1 points 1 month ago

This was a known problem that they didn't fix in the animal models before moving to human trials. They learned nothing. All they did was scrap someone's brain. But I'm sure it's no big deal, he was a cripple, right, he should be happy to be part of this /s

[-] angrymouse@lemmy.world 11 points 1 month ago

Actually, 85% retracted, but the remaining 15% seem to barely receive signals. It's even worse.

[-] lorkano@lemmy.world 6 points 1 month ago

While it sounds like a dick move, there was probably a reason they'd prefer other patients. Maybe it's riskier to do the surgery a second time? I don't really blame them for this one; their goal is to take the best steps to develop the technology before making it widespread and really functional. I do blame them for all those animal deaths, though.

[-] mojofrododojo@lemmy.world 22 points 1 month ago

there probably was a reason they would prefer other patients.

yeah, they fucked up on this one and want a new test subject.

[-] Madison420@lemmy.world 0 points 1 month ago

I mean, yes. It wouldn't be much of a study if two different tests were tried, or even the same test installed twice.

[-] MajorHavoc@programming.dev 11 points 1 month ago

Yeah. I can also think of lots of plausible reasons, but if those were the real reasons, the company should still be making commitments and plans with their first user...

The healthy stuff sounds like: "We intend X follow up procedure, but it needs to follow Y precaution."

Hell, even companies that have no intention to help usually take the time to lie and claim that they do.

[-] xxd@discuss.tchncs.de 42 points 1 month ago

The algorithm team must have been working overtime to get passable results with 85% of the data missing!

Also, it must feel absolutely horrifying to hear Neuralink decline a surgery to fix your implant. I guess they're still used to the "try, fail, abandon" strategy from their animal tests?

[-] Etterra@lemmy.world 26 points 1 month ago

Funny, I think that's how Elon tests everything. Teslas, especially that cybertruck, Twitter, rockets, his children...

[-] kokesh@lemmy.world 27 points 1 month ago

Implant it in Musk. He is a superhuman genius; he can handle it.

[-] ConstantPain@lemmy.world 9 points 1 month ago

Presumably, the person who volunteered knew all the risks and implications, so you can shit all you want on their decisions, but that's how trials work. There's no promise you'll come out with a functional product.

[-] Jimmyeatsausage@lemmy.world 9 points 1 month ago

It's a pretty big presumption that Elon Musk is providing transparent and accurate information to consumers about a technology he's hoping to sell. While I'd agree with the premise normally, he's kind of a known bad actor at this point. I'm a pretty firm believer in informed consent for this kind of stuff; I just don't see much reason to trust that Musk is willing to fully inform someone of the limitations, constraints, or risks involved in anything he has a personal stake in. If you aren't informed, you can't provide consent.

[-] ConstantPain@lemmy.world 5 points 1 month ago* (last edited 1 month ago)

Dude was not a consumer; he was a volunteer, a big difference. And, as far as I know, they followed the required protocols for these types of trials. Everything else is speculation.

[-] Hacksaw@lemmy.ca 2 points 1 month ago

In an interview with the Journal, Neuralink's first patient, 29-year-old Noland Arbaugh, opened up about the roller-coaster experience. "I was on such a high and then to be brought down that low. It was very, very hard," Arbaugh said. "I cried." He initially asked if Neuralink would perform another surgery to fix or replace the implant, but the company declined, telling him it wanted to wait for more information.

[-] Kolanaki@yiffit.net 8 points 1 month ago

Can these implants even do anything more, or do it better, than a simple external EEG cap? I haven't seen them show any benefit of being implanted directly in the brain over external devices that have existed commercially since the '80s and '90s.

[-] deegeese@sopuli.xyz 5 points 1 month ago

These are far more sensitive, allowing the user better speed/precision.

But once they lose 85% of the sensors, all that goes out the window.

[-] webghost0101@sopuli.xyz 4 points 1 month ago* (last edited 1 month ago)

Makes sense, but imagine 10 or 20 years from now. I doubt there will be enough of a difference to offset the risks by then.

Should we really rush out an invasive implant that barely works, rather than perfect what we will naturally want to use in the future anyway?

[-] Bimfred@lemmy.world 3 points 1 month ago* (last edited 1 month ago)

We (as in humanity) can continue to develop both EEG caps and direct implants. The technology is young and there's no telling what side benefits and additional functionality either one can have.

And the implant, much like early EEG devices, barely works for now. Imagine what they'll be capable of 10-20 years down the line.

[-] wax@feddit.nu 7 points 1 month ago

Overpromise & underdeliver

this post was submitted on 20 May 2024
180 points (95.0% liked)
