[-] algernon@lemmy.ml 186 points 3 months ago

Sadly, that's not code Linus wrote. Nor one he merged. (It's from git, copied from rsync, committed by Junio)

[-] algernon@lemmy.ml 12 points 3 months ago

That would result in those fediverse servers theoretically requesting ~333 requests/second * 114 MB ≈ 38 GB/s.

On the other hand, if the linked site did not serve garbage, and fit in ~1 MB like a normal site, then this would be only ~325 MB/s, and while that's still high, it's not the end of the world. If it's a site that actually puts effort into being optimized, and a request fits in ~300 kB (still a lot, in my book, for what is essentially a preview, with only tiny parts of the actual content loaded), then we're looking at ~95 MB/s.

If said site puts effort into making its previews reasonable, and serves ~30 kB, then that's ~9 MB/s. It's 3190 in the Year of Our Lady Discord. A potato can serve that.
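The arithmetic above can be sketched in a few lines. This assumes my ~333 fetches/second figure and uses decimal units throughout, so the results land close to (not exactly on) the numbers quoted:

```python
# Back-of-the-envelope bandwidth for fediverse link-preview fetches.
# Assumption: ~333.33 preview requests per second hitting the linked site.
REQUESTS_PER_SECOND = 1000 / 3

def bandwidth_mb_per_s(page_size_kb: float) -> float:
    """Outbound bandwidth in MB/s for a given preview payload size (in kB)."""
    return REQUESTS_PER_SECOND * page_size_kb / 1000  # kB -> MB

for size_kb, label in [(114_000, "bloated 114 MB page"),
                       (1_000, "normal ~1 MB page"),
                       (300, "optimized ~300 kB page"),
                       (30, "lean ~30 kB preview")]:
    print(f"{label}: ~{bandwidth_mb_per_s(size_kb):,.0f} MB/s")
```

The 114 MB case works out to ~38,000 MB/s, i.e. the ~38 GB/s above; the lean case is a rounding error for any real server.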

[-] algernon@lemmy.ml 23 points 3 months ago

I only serve bloat to AI crawlers.

map $http_user_agent $badagent {
  default     0;
  # list of AI crawler user agents in "~crawler 1" format
}

if ($badagent) {
  rewrite ^ /gpt;
}

location /gpt {
  proxy_pass https://courses.cs.washington.edu/courses/cse163/20wi/files/lectures/L04/bee-movie.txt;
}

...is a wonderful thing to put in my nginx config. (you can try curl -Is -H "User-Agent: GPTBot" https://chronicles.mad-scientist.club/robots.txt | grep content-length: to see it in action ;))
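The map above is just a pattern match on the User-Agent header. A minimal Python sketch of the same dispatch logic (the crawler names here are illustrative stand-ins, not my actual `~crawler 1` map entries):

```python
import re

# Illustrative AI-crawler patterns -- stand-ins for the real map entries.
AI_CRAWLERS = re.compile(r"GPTBot|ClaudeBot|CCBot|Bytespider", re.IGNORECASE)

def route(user_agent: str) -> str:
    """Mimic the nginx map + if: AI crawlers get /gpt, everyone else passes through."""
    if AI_CRAWLERS.search(user_agent):
        return "/gpt"          # served the Bee Movie script
    return "/requested-path"   # normal visitors get the real content

print(route("GPTBot/1.1"))                     # -> /gpt
print(route("Mozilla/5.0 (X11; Linux x86_64)"))  # -> /requested-path
```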

[-] algernon@lemmy.ml 60 points 3 months ago

...and here I am, running a blog that wouldn't bat an eye at 15k hits a second, and I could run it on a potato. Probably because I don't serve hundreds of megabytes of garbage to visitors. (The preview image is also controllable iirc, so just, like, set it to something reasonably sized.)

[-] algernon@lemmy.ml 33 points 3 months ago

There are no bugs. Just happy little accidental features.

[-] algernon@lemmy.ml 24 points 5 months ago

Fair bias notice: I am a Forgejo contributor.

I switched from Gitea to Forgejo when Forgejo was announced, and it was as simple as swapping the binary/docker image. It remains that simple today, and will for the foreseeable future, because Forgejo cherry-picks most of Gitea's changes on a weekly basis. Forgejo will remain a drop-in replacement until the codebases diverge, that is, until such time comes that we decide not to pick a feature or change; even then, if you don't rely on that feature, it's still a drop-in replacement. (So far, a few things are implemented differently in Forgejo, but in a compatible way.)

Let me offer a few reasons to switch:

  • Forgejo - as of today, and for the foreseeable future - includes everything in Gitea, but with more tests, and more features on top. A few features Forgejo has that Gitea does not:
    • Forgejo makes it possible to let any signed-in user edit wikis (like GitHub); Gitea restricts editing to collaborators only. (Forgejo defaults to that too, but the default can be changed.) Mind you, this is not in a Forgejo release yet; it will come in the next release, probably in April.
    • Gitea has support for showing an Action status badge. Forgejo has badges for action statuses, stars, forks, issues, and pull requests.
    • ...there are numerous other features being developed for Forgejo that will not make it into Gitea unless they cherry pick it (they don't do that), or reimplement it (wasting a lot of time, and potentially introducing bugs).
  • Forgejo puts a lot of effort into testing. Every feature developed for Forgejo needs a reasonable amount of tests. For most of the things we cherry-pick from Gitea, we write tests if they don't have any (we write plenty of tests for code originating in Gitea).
  • Forgejo is developed in the open, using free tools: we use Forgejo to host the code, issues and releases, Forgejo Actions for CI, and Weblate for translations. Gitea uses GitHub to host the code, issues and releases, uses GitHub CI, and CrowdIn for translations (all of them proprietary platforms).
  • Forgejo accepts contributions without requiring copyright assignment, Gitea does not.
  • Forgejo routinely cherry picks from Gitea, Gitea does not cherry pick from Forgejo (they do tend to reimplement things we've done, though, a huge waste of time if you ask me).
  • Forgejo isn't going anywhere anytime soon, see the sustainability repo. There are people committed to working on it, there are people paid to work on it, and there's a fairly healthy community around it already.
[-] algernon@lemmy.ml 26 points 5 months ago

Aren't all consoles like that, though? They all run mainstream operating systems, and are basically locked-down PCs in a fancy box. If anything, the Steam Deck is further from a PC than an Xbox/PS, due to being handheld, with an embedded screen and controller, while the Xbox and its friends require a display and an external controller (like a PC).

[-] algernon@lemmy.ml 36 points 5 months ago

Steam Deck, because it is handheld, and can run a lot of my Steam games. I can also dock it to a big screen and attach a controller.

[-] algernon@lemmy.ml 110 points 5 months ago

The single best thing I like about Zed is how they unironically put up a video on their homepage where they take a perfectly fine function, and butcher it with irrelevant features using Copilot, and in the process:

  • Make the function's name not match what it is actually doing.
  • Hardcode three special cases for no good reason.
  • Write no tests at all.
  • Update the documentation, but make the short version of it misleading, suggesting it accepts all named colors, rather than just three. (The long description clarifies that, so it's not completely bad.)
  • Show how engineering the prompt to do what they want takes more time than just writing the code in the first place.

And that's supposed to be a feature. I wonder how they'd feel if someone sent them a pull request done in a similar manner, resulting in similarly bad code.

I think I'll remain firmly in the "if FPS is an important metric in your editor, you're doing something wrong" camp, and will also steer clear of anything that hypes up the plagiarism parrots as something that'd be a net win.

[-] algernon@lemmy.ml 17 points 6 months ago

Nevertheless, as Bluesky grows, there are likely to be multiple professionally-run indexers for various purposes. For example, a company that performs sentiment analysis on social media activity about brands could easily create a whole-network index that provides insights to their clients.

(source)

Is that supposed to be a selling point? Because I'd like to stay far, far away from that, thank you very much.

[-] algernon@lemmy.ml 43 points 6 months ago

Very bad, because the usability of such a scheme would be a nightmare. If you have to unzip the files every time you need a password, that's a huge burden. Not to mention that unzipping leaves the plaintext files lying around, unprotected, until you delete them again (assuming you remember to delete them at all). And if you keep the plaintext files around, and only encrypt & zip for backups, that's worse than just backing up the plaintext directly, because it gives you a false sense of security. You want to minimize the amount of time passwords sit around in the clear.

Just use a password manager like Bitwarden. Simpler, more practical, more secure.

[-] algernon@lemmy.ml 32 points 7 months ago

There are worse things out there than Great Old Ones. You might invoke Perl by accident.


algernon

joined 7 months ago