
cross-posted from: https://lemmy.ml/post/14100831

"No, seriously. All those things Google couldn't find anymore? Top of the search pile. Queries that generated pages of spam in Google results? Fucking pristine on Kagi – the right answers, over and over."

[-] homesweethomeMrL@lemmy.world 79 points 4 months ago

Even after all that payola, Google is still absurdly profitable. They have so much money, they were able to do a $80 billion stock buyback. Just a few months later, Google fired 12,000 skilled technical workers. Essentially, Google is saying that they don't need to spend money on quality, because we're all locked into using Google search. It's cheaper to buy the default search box everywhere in the world than it is to make a product that is so good that even if we tried another search engine, we'd still prefer Google.

It’s been easily 15 years since I thought Google search was good.

[-] foggy@lemmy.world 20 points 4 months ago* (last edited 4 months ago)

It was not long after the SSL thing that it became actively garbage. That was what, 2018?

But yeah, it's been bad since at least 2012.

[-] AnActOfCreation@programming.dev 11 points 4 months ago
[-] foggy@lemmy.world 27 points 4 months ago

Google stopped indexing all websites without SSL certificates in July 2018.

For example, darklyrics.com is a website I and many others grew up using as a resource for understanding lyrics. They've stubbornly not gotten an SSL certificate because they transact zero data beyond band-name searches. However, without one, they do not show up in Google search results.

This is one of literally millions of examples. Some are more reasonable than others, but it was still a massive blow to the efficacy of their search.

[-] SkyNTP@lemmy.ml 36 points 4 months ago

They've stubbornly not gotten an SSL because they transact 0 data beyond band name searches.

Even if sites do not store user account data, such as passwords, ALL websites, and I mean ALL, handle user data, because merely accessing pages (urls) is user data.

Stubbornness is not a good reason not to set up SSL. Encryption should always be on, all the time, for everything.

[-] Bogasse@lemmy.ml 14 points 4 months ago* (last edited 4 months ago)

And it's not only about user data; it would also expose the website to content spoofing on public wifi, which would, for example, allow an attacker to inject phishing content into the website.

SSL encrypts the data you're sending but it also ensures that you're communicating only with who you think you are. Without SSL you can't be confident about any of that.
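To make that concrete, here's a minimal Python sketch showing that a stock TLS client context already enforces both halves of the guarantee: the certificate chain must validate, and the certificate must match the hostname you asked for (identity, not just encryption). This uses only the standard library `ssl` module; no network access is needed to inspect the defaults.

```python
import ssl

# A default client-side context bakes in both checks:
ctx = ssl.create_default_context()
print(ctx.check_hostname)                    # True: cert must match the hostname (identity)
print(ctx.verify_mode == ssl.CERT_REQUIRED)  # True: cert chain must validate (trust)
```

If either check were disabled, an on-path attacker on public wifi could present their own certificate and you'd be none the wiser, which is exactly the spoofing scenario described above.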

[-] pixxelkick@lemmy.world 1 points 4 months ago

If a website has literally no login system, there's nothing to phish.

There is honestly no reason to use SSL on a static website that has no login system and just displays some content.

I.e. a static blog or the like, where the only content on the website is just "look at this stuff, okay thank you!"

[-] Bogasse@lemmy.ml 1 points 4 months ago

That's still my point: for example, you could inject your own login system ("create an account to keep track of your favorite artists", or some new shiny feature). From there you can get people's personal information, potentially a password they reuse on other services.

A URL is something the general public will trust; if the content can be messed with, you can repurpose the website's reputation. I took phishing as an example, but even my not-so-creative, non-expert brain can think of other things: asking for donations, propaganda, advertising, censorship, ...

[-] db0@lemmy.dbzer0.com -4 points 4 months ago

SSL doesn't hide the URL you're visiting.

[-] rikudou@lemmings.world 17 points 4 months ago

It does. Anyone sniffing the traffic can only see the domain.

[-] hansl@lemmy.world 0 points 4 months ago

Not if you use DNSSEC.

[-] stsquad@lemmy.ml 6 points 4 months ago

Yes it does. You can derive the domain by snooping DNS lookups, but the URL path is part of the encrypted GET request.

[-] AProfessional@lemmy.world 7 points 4 months ago

The domain is a public part of TLS itself, SNI, for now.
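You can see this without any packet sniffer. The sketch below, using only Python's standard-library `ssl` module and an in-memory BIO pair (no network), captures the raw ClientHello bytes a client would put on the wire and shows the hostname sitting in them in cleartext via the SNI extension. The hostname `darklyrics.com` is taken from the example earlier in the thread; the URL path never appears here because it is only sent after encryption is established.

```python
import ssl

# Build a client TLS object wired to in-memory buffers instead of a socket.
ctx = ssl.create_default_context()
incoming, outgoing = ssl.MemoryBIO(), ssl.MemoryBIO()
tls = ctx.wrap_bio(incoming, outgoing, server_hostname="darklyrics.com")

try:
    tls.do_handshake()       # can't complete without a real peer...
except ssl.SSLWantReadError:
    pass                     # ...but the ClientHello has already been written

hello = outgoing.read()      # the exact first bytes that would hit the wire
print(b"darklyrics.com" in hello)  # True: the SNI hostname is plaintext
```

Encrypted ClientHello (ECH) is the proposal that would close this gap, but until it's deployed end to end, the domain you visit is visible to anyone on the path.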

[-] tgxn@lemmy.tgxn.net 1 points 4 months ago

Yeah, we need encrypted SNI. I hear it's coming soon.

[-] AnActOfCreation@programming.dev 16 points 4 months ago

Hmm, I hate Google as much as the next guy and am actively trying to de-Google myself, but I'm not sure I can get behind the outrage here. Certificates are free and easy to obtain with Let's Encrypt, so there's really no excuse for sites not to support encrypted traffic these days. I'm sure Google does lots of things to delist the small guys and promote their big payers, but I don't think this is one of them.
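For reference, "easy to obtain" really is a couple of commands with certbot, Let's Encrypt's recommended client. This is a hedged sketch assuming a Debian-style server running nginx; `example.com` is a placeholder for your own domain, and package names can differ by distribution.

```shell
# Install certbot with its nginx plugin (Debian/Ubuntu package names assumed)
sudo apt install certbot python3-certbot-nginx

# Obtain a free certificate and let certbot edit the nginx config for you
sudo certbot --nginx -d example.com -d www.example.com

# Confirm automatic renewal works without actually renewing
sudo certbot renew --dry-run
```

Renewal is scheduled automatically by the package, so after the initial setup there is essentially zero ongoing maintenance.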

this post was submitted on 05 Apr 2024
277 points (86.4% liked)

Technology
