
TechTakes


Big brain tech dude got yet another clueless take over at HackerNews etc? Here's the place to vent. Orange site, VC foolishness, all welcome.

This is not debate club. Unless it’s amusing debate.

For actually-good tech, you want our NotAwfulTech community

[–] nfultz@awful.systems 4 points 5 hours ago

Cybersecurity insurance was a topic last term at the tech/law group on campus; see also Josephine Wolff: https://direct.mit.edu/books/oa-monograph/5373/Cyberinsurance-PolicyRethinking-Risk-in-an-Age-of

This month I found out that my business insurer had split cybersec out of the general policy a couple of years ago and never told me, so I had to pay a $300 upcharge for it on a new contract that specifically required it. Plus a new $7 terrorism fee.

Probably you can s/cyber/AI/g and guess where things are heading.

[–] cornflake@awful.systems 4 points 5 hours ago

Ever since the Great Recession I've felt that the executive class is the most parasitic layer of all, regularly working against even the shareholders' interests, let alone anyone else's. Risk management is a huge part of this disconnect: these execs simply don't care about the downsides.

I have an anecdote, not directly about insurance but about liability.

I was involved in re-negotiating a Master Services Agreement with a tech consulting firm. The sticking point was a set of terms that essentially said “we might use AI, we won’t tell you if we do, and if we do and it goes wrong, we accept no liability”. They would not budge on that.

I quit before it got hashed out, but I bet it got signed anyhow. People are so blasé about anything AI.

[–] Tar_alcaran@sh.itjust.works 10 points 1 day ago

I recently heard a director at a major contractor say that they'll start using AI to design things as soon as the AI company accepts the same liability as traditional (read: actual) design companies.

Which is, of course, never.