[-] FiskFisk33@startrek.website 43 points 6 months ago* (last edited 6 months ago)

Body camera video equivalent to 25 million copies of “Barbie”

Literally anything but the metric system

[-] rickyrigatoni@lemm.ee 13 points 6 months ago

it's called SEO and it's an art

[-] FiskFisk33@startrek.website 5 points 6 months ago

SEO is the bane of the internet. SEO is why I have to scroll through a novella every time I want to check out a recipe

[-] a1studmuffin@aussie.zone 5 points 6 months ago

We don't even need to choose! Just use hours, months, years, decades! But no, Barbie movies.

[-] Everythingispenguins@lemmy.world 4 points 6 months ago* (last edited 6 months ago)

There is no common metric measure of time.

Edit -common

[-] SkybreakerEngineer@lemmy.world 5 points 6 months ago

I'm sorry, can you restate that in terms of the number of hyperfine ground-state transitions of a Cs-133 atom?

[-] TonyTonyChopper@mander.xyz 3 points 6 months ago
[-] Everythingispenguins@lemmy.world 2 points 6 months ago

Yes it is, but SI is not all metric. Metric is fundamentally a base-10 system; time is base 60. You can probably thank the ancient Sumerians for that, though there's some debate.

At one point the French tried to make metric time a thing, but it didn't stick.

https://en.m.wikipedia.org/wiki/Decimal_time
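
For anyone curious, a quick sketch of the conversion the linked article describes (10 hours/day, 100 minutes/hour, 100 seconds/minute; the code itself is just my own illustration):

```python
def to_decimal_time(hh: int, mm: int, ss: int) -> tuple[int, int, int]:
    """Convert a standard 24h clock reading to French Revolutionary
    decimal time: 10 hours/day, 100 minutes/hour, 100 seconds/minute."""
    std_seconds = hh * 3600 + mm * 60 + ss
    # A day holds 86400 standard seconds but 100000 decimal seconds
    dec_total = std_seconds * 100000 // 86400
    dh, rem = divmod(dec_total, 10000)  # 10000 decimal seconds per decimal hour
    dm, ds = divmod(rem, 100)
    return dh, dm, ds
```

Noon comes out as decimal 5:00:00 (halfway through the day), and standard 18:00 lands at decimal 7:50:00.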

[-] TonyTonyChopper@mander.xyz 1 points 6 months ago

Short times are always given in powers of ten of a second (ms, μs, ns), and long ones can be too.

[-] Everythingispenguins@lemmy.world 2 points 6 months ago* (last edited 6 months ago)

And machinists in America use decimal inches, but I don't think anyone would say inches are metric.

Spelling

[-] Evkob@lemmy.ca 34 points 6 months ago

Body camera video equivalent of 25 million copies of "Barbie"

Is this a typical unit of measurement in journalism? Like what even is this? Crappy in-article advertising? Some weird SEO shit? An odd attempt to be cool and hip?

[-] Darkassassin07@lemmy.ca 20 points 6 months ago

It's America; anything but metric.

[-] AtmaJnana@lemmy.world 2 points 6 months ago

And which "metric" unit of time measurement do you prefer?

[-] Darkassassin07@lemmy.ca 3 points 6 months ago
[-] AtmaJnana@lemmy.world 3 points 6 months ago* (last edited 6 months ago)

I prefer seconds since 00:00:00 UTC on Jan 1st, 1970
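
Which is easy to compute, for the record; a minimal stdlib-only Python sketch, using this post's submission date as the example:

```python
from datetime import datetime, timezone

def epoch_seconds(dt: datetime) -> int:
    """Seconds elapsed since 00:00:00 UTC on Jan 1st, 1970 (the Unix epoch)."""
    return int(dt.timestamp())

# Midnight UTC on the day this post was submitted
print(epoch_seconds(datetime(2024, 2, 2, tzinfo=timezone.utc)))  # 1706832000
```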

[-] drbluefall@toast.ooo 5 points 6 months ago
[-] octopus_ink@lemmy.ml 31 points 6 months ago

That sounds like a big investment to find no wrongdoing by officers.

[-] OmnislashIsACloudApp@lemmy.world 24 points 6 months ago

Oh great, I'm sure the training for this will not result in a bunch of things getting "reviewed" and no one being responsible for mistakes at all...

[-] 1984@lemmy.today 1 points 6 months ago

Sounds like humans, so I guess it's AI progress? :p

[-] DontMakeMoreBabies@kbin.social 21 points 6 months ago

Would you rather these things never be reviewed? Isn't something better than nothing?

You'll literally never be able to afford (or hire) enough people to review the data they are taking in...

I mean unless we start killing billionaires and taking their shit.

[-] otter@lemmy.ca 9 points 6 months ago

Yea I share the same concerns about the "AI", but this sounds like a good thing. It's going through footage that wasn't going to be looked at (because there wasn't a complaint / investigation), and it's flagging things that should be reviewed. It's a positive step

What we should look into for this program is

  • how the flags are being set, and what kind of interaction will warrant a flag
  • what changes are made to training as a result of this data
  • how the privacy is being handled, and where the data is going (ex. Don't use this footage to train some model, especially because not every interaction is out in the public)
[-] Darkassassin07@lemmy.ca 6 points 6 months ago

Make it publicly accessible. It'll most certainly get watched and problems will be reported to be investigated further.

[-] Hawk@lemmy.dbzer0.com 3 points 6 months ago

Corporations would be delighted to analyze all this footage.

[-] Rivalarrival@lemmy.today 2 points 6 months ago

File a complaint, and you get to view the video. If nobody files a complaint, there is no need to view the video.

Indeed, nobody should be looking at the video unless a complaint is filed.

[-] MaxPow3r11@lemmy.world 1 points 6 months ago

WE should be able to review it/see it ALL.

We pay these fucks to torture and kill with our tax $.

They should have nothing to hide from us.

[-] PhlubbaDubba@lemm.ee 1 points 6 months ago

Well, you could rig the cameras to turn on when the cop gets out of their car, breaking the footage into the specific encounters where the cop had to interact with someone. Identify the files by the date, time, and badge number of the cop the camera is assigned to, and now you've got an easy-to-search database of footage for whenever an incident is reported, either by the cop (because they had to issue paperwork for it) or by whoever they were interacting with (because they want to lodge a complaint).

While randomly selecting files not involved in an ongoing investigation could be helpful as potential training material, we don't actually HAVE to assign a dedicated review resource to scan for bad behaviour or material relevant to investigations, since in both cases someone is already incentivized to start the process that pulls the relevant footage anyway.
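
That indexing scheme could be as simple as a lookup keyed on badge number and time; a rough sketch (all names here are illustrative, not from any real system):

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class Encounter:
    badge: str       # badge number of the officer the camera is assigned to
    start: datetime  # camera activated (officer exits vehicle)
    end: datetime    # camera deactivated
    file_path: str   # where the raw footage lives

@dataclass
class FootageIndex:
    _encounters: list[Encounter] = field(default_factory=list)

    def add(self, e: Encounter) -> None:
        self._encounters.append(e)

    def lookup(self, badge: str, when: datetime) -> list[Encounter]:
        """Find footage for a given officer covering a given moment,
        e.g. the time cited in a complaint or an incident report."""
        return [e for e in self._encounters
                if e.badge == badge and e.start <= when <= e.end]
```

A complaint then only needs a badge number and an approximate time to pull the exact clip.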

[-] terminhell@lemmy.world 20 points 6 months ago

What if all the cam footage was just uploaded to something like YouTube? Publicly visible by, ya know, the very citizens that pay for it and that it's supposed to work for...

[-] Hawk@lemmy.dbzer0.com 4 points 6 months ago

Wouldn't that be a huge privacy issue?

[-] lir@lemmy.world 1 points 6 months ago

The police are already a huge rights issue when they're acting without oversight

[-] ech@lemm.ee 12 points 6 months ago

Ah, good. I had "racist profiling ~~AI~~LLM" on my 2024 bingo card

[-] Darkassassin07@lemmy.ca 10 points 6 months ago

Yes, because AI has a firm grasp on nuanced topics like law enforcement and civilian/human rights...

You may as well play the video to an empty room.

[-] themurphy@lemmy.world 7 points 6 months ago* (last edited 6 months ago)

ITT: People who are scared of things they don't understand, which in this case is AI.

In this case, the "AI" program is nothing more than pattern recognition software setting a timestamp where it believes there's something to be looked at. Then an officer can take a look.

It saves so much time, and it filters out anything irrelevant. But be careful, because it's labelled "AI". Scary.

EDIT: The replies to this comment confirm that you don't understand AI, because if you did, you'd know that this system scanning video is not an LLM (large language model). It's not even the same kind of system at its core.
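
To be concrete about what "setting a timestamp where it believes there's something to be looked at" means, a toy sketch; keyword matching on a transcript stands in for whatever actual classifier such a system uses:

```python
def flag_transcript(segments: list[tuple[float, str]]) -> list[float]:
    """segments: (timestamp_seconds, transcript_text) pairs.
    Returns the timestamps a human reviewer should jump to.
    Keyword matching is purely a stand-in for the real model."""
    keywords = {"stop resisting", "use of force", "weapon"}
    return [ts for ts, text in segments
            if any(k in text.lower() for k in keywords)]
```

The point is the output: a short list of timestamps for an officer to review, not a verdict.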

[-] Voroxpete@sh.itjust.works 11 points 6 months ago* (last edited 6 months ago)

This is an astonishingly bad take.

Almost every AI system is a black box. Even if you open source the code and the training data, it's almost impossible to know anything about the current state of a machine learning model.

So the entire premise here is that a completely unaccountable system - whose decisions are basically impossible to understand or scrutinize - gets to decide what data is or isn't relevant.

When an AI says "No crime spotted here", who gets to even know that it did that? If a human is reviewing all of the footage, then why have the AI? You're doing the same amount of human work anyway. So as soon as you introduce this system, you remove a huge amount of human oversight, and replace it with decisions that dramatically affect human lives - that could potentially be life or death if it's the difference between a bad cop being taken off the street or not - being made by a completely unaccountable system.

Who's to say the training data fed into this system won't result in it, say, becoming effectively blind to police violence against black people?

And if that doesn't scare you, it absolutely should.

[-] Killing_Spark@feddit.de 10 points 6 months ago

It's also potentially skipping some of the parts that should be looked at. It depends on the training set.

[-] fluxion@lemmy.world 4 points 6 months ago

It's not that AI is scary, it's that AI is dumb as fuck.

[-] badbytes@lemmy.world 6 points 6 months ago

And I'm sure the criminal acts by police will get filtered out.

[-] HawlSera@lemm.ee 3 points 6 months ago

I wonder if it's one of those AI that can't see darker skin colors...

this post was submitted on 02 Feb 2024
216 points (98.2% liked)

Technology
