GamingChairModel

joined 2 years ago
[–] GamingChairModel@lemmy.world 53 points 3 weeks ago

Sci-Fi Author: In my book I invented the Torment Nexus as a cautionary tale

Tech Company: At long last, we have created the Torment Nexus from classic sci-fi novel Don't Create The Torment Nexus

[–] GamingChairModel@lemmy.world 3 points 3 weeks ago (1 children)

It wasn't the buffer itself that drew power. It was the need to physically spin the disc faster in order to read ahead and fill the buffer, so it drew more power even if you kept the player perfectly still. And if the read actually did skip, it had to seek back to where it was and rebuild the buffer all over again.

[–] GamingChairModel@lemmy.world 4 points 4 weeks ago (2 children)

You don't remember NetZero, do you? A free dial-up ISP that gave you an Internet connection on the condition that you surrender something like 25% of your screen to animated banner ads while you were online.

Or BonziBuddy? Literal spyware.

What about all the MSIE toolbars, some of which had spyware, and many of which had ads?

Or just plain old email spam in the days before more sophisticated filters came out?

C'mon, you're looking at the 1990s through rose-tinted glasses. I'd argue that the typical web user saw more ads in 1998 than in 2008.

[–] GamingChairModel@lemmy.world 37 points 4 weeks ago (6 children)

No, 1990s internet just hadn't actually fulfilled the full potential of the web.

Video and audio required plugins, most of which were proprietary. Kids today don't realize that before YouTube, the best place to watch trailers for upcoming movies was Apple's website, as Apple tried to drive adoption of QuickTime.

Speaking of plugins, much of the web was hidden behind embedded flash elements, and linking to resources was limited. I could view something in my browser, but if I sent the URL to a friend they might still need to navigate within that embedded element to get to whatever it was I was talking about.

And good luck getting plugins if you weren't on the operating system the site expected. Microsoft was so busy fracturing web standards around Windows that most site publishers simply ignored Mac and Linux users (and even ignored any browser other than MSIE).

Search engines were garbage. Yahoo actually provided decent competition to the search engines by paying humans to manually maintain a directory and review user submissions before adding new sites to it.

People's identities were largely tied to their internet service provider, which might have been a phone company, a university, or an employer. The publicly available email services, the ones not tied to an ISP, employer, or university, were unreliable and inconvenient. We sometimes had to literally disconnect from the internet and dial in somewhere else just to fetch mail in Eudora or whatever.

Email servers held mail only long enough for you to download your copy, and then deleted it from the server. If you wanted to reread an archived email, you had to go back to the specific computer you had downloaded it to, because you couldn't just log into the email service from somewhere else. This was a pain when you used the computer labs at your university (because very few of us had laptops).

User interactions with websites were clunky. Almost everything a user submitted to a site required an actual HTTP POST and a reload of the entire page. AJAX changed the web significantly in the mid-2000s. The simple act of dragging a Google Maps map around and zooming in and out was revolutionary.
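For contrast, here's a minimal TypeScript sketch of the two patterns. The endpoint and element IDs are made up, and it uses the modern fetch API rather than the XMLHttpRequest of that era, but the idea is the same: send only the data that changed and patch the page in place instead of reloading everything.

```typescript
// Old pattern: a plain <form method="POST" action="/comments"> submit made the
// server send back an entirely new HTML page, which the browser re-rendered
// from scratch.

// AJAX-era pattern (hypothetical /api/comments endpoint and element IDs):
// POST just the data, get just the data back, and update one piece of the DOM.
async function postComment(text: string): Promise<void> {
  const response = await fetch("/api/comments", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ text }),
  });
  const saved: { id: number; text: string } = await response.json();

  // Patch the comment list in place -- no full page reload.
  const item = document.createElement("li");
  item.textContent = saved.text;
  document.getElementById("comment-list")?.appendChild(item);
}
```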

Everything was insecure. Encryption was rare, and even if present was usually quite weak. Security was an afterthought, and lots of people broke their computers downloading or running the wrong thing.

Nope, I think 2005-2015 was the golden age of the internet. Late enough that the tech had started to support easy, democratized use, but early enough that the corporations hadn't ruined everything.

[–] GamingChairModel@lemmy.world 3 points 4 weeks ago (2 children)

I'm not sure that would work. Admins need to manage their instance users, yes, but they also need to look out for the posts and comments in the communities hosted on their instance, and be one level of appeal above the mods of those communities. Including the ability to actually delete content hosted in those communities, or cached media on their own servers, in response to legal obligations.

[–] GamingChairModel@lemmy.world 15 points 4 weeks ago (1 children)

Yes, it's the exact same practice.

The main difference, though, is that Amazon as a company doesn't rely on this "just walk out" business in any way that matters to its overall finances. So Amazon churns along, while that one insignificant business unit gets quietly shut down.

The company in this post, though, doesn't have a trillion-dollar business subsidizing the losses from its AI scheme.

[–] GamingChairModel@lemmy.world 4 points 1 month ago

> They're actually only about 48% accurate, meaning that they're more often wrong than right and you are 2% more likely to guess the right answer.

Wait, what are the Bayesian priors? Are we assuming the baseline is 50% true and 50% false? And what is the error rate in false positives versus false negatives? All of these matter for determining, after the fact, how much probability to assign to the test being right or wrong.

Put another way, imagine a stupid device that just says "true" literally every time. If I hook that device up to a person who never lies, then that machine is 100% accurate! If I hook that same device to a person who only lies 5% of the time, it's still 95% accurate.

So what do you mean by 48% accurate? That's not enough information to do anything with.
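A quick sketch of why the headline number is meaningless on its own (TypeScript just for the arithmetic, and the specific numbers are purely illustrative): overall "accuracy" is a blend of the prior and the two error rates, so very different devices can all report the same figure.

```typescript
// Overall "accuracy" depends on the base rate of lying, not just the device.
// truthRate:   fraction of statements that are actually true (the prior)
// sensitivity: P(device says "true" | statement is true)
// specificity: P(device says "lie"  | statement is a lie)
function accuracy(truthRate: number, sensitivity: number, specificity: number): number {
  return truthRate * sensitivity + (1 - truthRate) * specificity;
}

// The dumb device that always says "true": sensitivity 1, specificity 0.
console.log(accuracy(1.0, 1, 0));  // 1.00 -- "100% accurate" on someone who never lies
console.log(accuracy(0.95, 1, 0)); // 0.95 -- "95% accurate" on someone who lies 5% of the time

// A bare "48% accurate" could come from almost any mix of priors and error
// rates, e.g. (illustrative numbers only):
console.log(accuracy(0.5, 0.6, 0.36)); // 0.48
```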

[–] GamingChairModel@lemmy.world 12 points 1 month ago

Yeah, from what I remember of what Web 2.0 was, it was services that could be interactive in the browser window, without loading a whole new page each time the user submitted information through HTTP POST. "Ajax" was a hot buzzword among web/tech companies.

Flickr was mind-blowing in that you could edit photo captions and titles without navigating away from the page. Gmail could refresh the inbox without reloading the sidebar. Google Maps was impressive in that you could drag the map around and zoom within the window, while it fetched the necessary graphical elements on demand.

Or maybe Web 2.0 included the ability to layer state on top of the stateless HTTP protocol. You could log into a page and it would show you only the new/unread items for you personally, rather than showing literally every visitor the exact same thing at the exact same URL.
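Roughly how that per-user state gets bolted onto stateless HTTP, as a minimal sketch rather than any particular framework's API (the handlers and the in-memory store below are invented for illustration): the server mints a session ID at login, sends it back in a cookie, and every later request carries it so the same URL can render differently per person.

```typescript
// Invented-for-illustration session store: maps a session ID (carried in a
// cookie) to the per-user state the server remembers between stateless requests.
const sessions = new Map<string, { userId: string; lastReadItem: number }>();

function handleLogin(userId: string): string {
  // Mint a session ID (not cryptographically secure -- just a sketch) and
  // remember who it belongs to. Returned to the browser as:
  //   Set-Cookie: session=<sessionId>
  const sessionId = Math.random().toString(36).slice(2);
  sessions.set(sessionId, { userId, lastReadItem: 0 });
  return sessionId;
}

function handleInbox(cookieSessionId: string | undefined): string {
  // The cookie rides along on every subsequent request, so the same URL can
  // show personalized content instead of one identical page for everyone.
  const session = cookieSessionId ? sessions.get(cookieSessionId) : undefined;
  if (!session) return "Generic page for logged-out visitors";
  return `Unread items for ${session.userId} after item #${session.lastReadItem}`;
}
```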

Social networking became possible with Web 2.0 technologies, but I wouldn't define Web 2.0 as inherently social. User interaction with a service was the core, and whether the service connected users to each other by design was kinda beside the point.

[–] GamingChairModel@lemmy.world 2 points 1 month ago

Teslas will (allegedly) start on a small, low-complexity street grid in Austin, exact size TBA. Presumably, they're mapping the shit out of it and throwing compute power at analyzing their existing data for that postage stamp.

Lol where are the Tesla fanboys insisting that geofencing isn't useful for developing self driving tech?

[–] GamingChairModel@lemmy.world 3 points 1 month ago (1 children)

Wouldn't a louder room raise the noise floor, too, so that any quieter signal couldn't be extracted from the noisy background?

If we were to put a microphone and recording device in that room, could any amount of audio processing extract the sound of the small server from the background noise of all the bigger servers? Because if not, then that's not just an auditory processing problem, but a genuine example of destruction of information.
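Back-of-the-envelope version for the single-microphone case (illustrative sound levels, and assuming the sources are uncorrelated so their powers add): the quiet server barely nudges the total level, which is why it ends up below anything the recording chain can resolve.

```typescript
// Uncorrelated sound sources add in power, not in decibels.
const toPower = (db: number): number => Math.pow(10, db / 10);
const toDb = (power: number): number => 10 * Math.log10(power);

const loudServersDb = 85; // combined level of the big servers (illustrative)
const quietServerDb = 45; // the one small server we care about (illustrative)

const totalWith = toDb(toPower(loudServersDb) + toPower(quietServerDb));
const totalWithout = loudServersDb;

// The quiet server changes the total level by about 0.0004 dB -- far below
// what a single microphone, its ADC, or any amount of post-processing on one
// channel can reliably pull back out of the background.
console.log((totalWith - totalWithout).toFixed(6)); // ~0.000434
```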

[–] GamingChairModel@lemmy.world 1 points 1 month ago (1 children)

> taking a shot at installing a new OS

To be clear, I had been on Ubuntu for about 4 years by then, having switched when 6.06 LTS came out. And several years before that, I had installed Windows Me, the XP beta, and the first official XP release on a home-built machine, the first computer that was actually mine, bought with student loan money paid out because my degree program required all students to have their own computer.

But the freedom to tinker with software was by no means the same as the flexibility to acquire spare hardware. Computers were really expensive in the '90s and still pretty expensive in the 2000s, especially laptops, at a time when color LCD technology was still pretty new.

That's why I assumed you were a different age from me, either old enough to have been tinkering with computers long enough to have spare parts, or young enough to still live with middle class parents who had computers and Internet at home.

[–] GamingChairModel@lemmy.world 4 points 1 month ago (2 children)

That's never really been true. It's a cat and mouse game.

If Google actually used its 2015 or 2005 algorithms as written, but on a 2025 index of webpages, that ranking system would be dogshit because the spammers have already figured out how to crowd out the actual quality pages with their own manipulated results.

Tricking the 2015 engine with 2025 SEO techniques is easy. The real problem is that Google hasn't actually been on the winning side of ranking for quality in maybe 5-10 years, and quietly outsourced its ranking to the big user-driven sites: Pinterest, Quora, Stack Overflow, Reddit, even Twitter to some degree. If a result was responsive and ranked highly on those user-voted sites, it was probably a good result. That approach worked just long enough for each of those services to drown in SEO spam of their own, so they're all much worse than they were in 2015, and ranking search results based on those sites no longer works either.

There's no turning back. We need to adopt new rankings for the new reality, not try to return to when we were able to get good results.
