I never used Twitter, apart from occasionally hearing about tweets secondhand, but I have been enjoying Mastodon because in practice it's basically just a way for me to have a feed of cool astronomy pictures.
You know, I wasn't that impressed by this article, but I am coming around to your point of view given that additional context.
It is hard to see how the explicit goal of not receiving updates too early is reconciled with the goal of not sacrificing security. Shouldn't there be no such thing as "too early" when it comes to security updates?
Because that way you can use it wherever something accepts WASM. In particular, as mentioned in the linked article, Javy started its life as a way for you to submit code to Shopify Functions in JavaScript, as Shopify Functions lets you submit code as WASM so that you can program in whatever language you prefer.
Hence the "unless there's something seriously wrong with the previous research" part. That is always a possibility, of course, but it's much less likely that is the case then that this single study is the thing that is wrong.
There's essentially an open standard for streaming video now, so it's not like the old days when you needed to download a platform-specific component to watch it. I use Linux as my primary environment and I can't even remember the last time I had trouble with it; certainly not for several years at least. I've used Netflix, DisneyPlus, Amazon, Paramount+, and probably others.
Just as a heads up, though, if you are using Firefox then the first time you go to any of these sites it will ask whether you are fine with enabling support for DRM video, and you need to click "Yes". This is a one-time thing. (It does this because if you are an open source purist you might not want DRM enabled, so Firefox prefers to get your permission first; most browsers just assume that you don't care and enable it by default.)
Nah, at this point his only option is to cancel Starship and redirect all of its development funding into building a time machine so that he can dramatically increase the amount of weed he was smoking at the time he got the brilliant idea to buy Twitter so that his brain is made incapable of actually following through with it.
It can be nice not to have to worry about types when you are doing exploratory programming. For example, I once started by writing a function that did a computation and then returned another function constructed from the result of that computation, and then realized that I'd actually like to attach some metadata to that function. In Python, that is super-easy: you just add a new attribute to the object and you're done. At some point I wanted to tag it with an attribute that was itself a function, and that was easy as well. Eventually I got to the point where I was tagging it with a zillion functions and realized that I was being silly and replaced it with a proper class with methods. If I'd known in advance that this is where I was going to end up then I would have started with the class, but it was only after messing around that I got a solid notion of what the shape of the thing I was constructing should be, and it helped that I was able to mess around with things in arbitrary ways until I figured out what I really wanted without the language getting in my way at intermediate points.
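To make that concrete, here is a rough sketch of the progression I mean (the names are made up for illustration, not the actual code I was writing):

    # Exploratory version: build a function, then bolt metadata onto it.
    def make_scaler(factor):
        def scale(x):
            return x * factor
        # Python happily lets you attach arbitrary attributes to a function object.
        scale.factor = factor
        scale.describe = lambda: f"scales by {factor}"
        return scale

    f = make_scaler(3)
    print(f(10))         # 30
    print(f.describe())  # scales by 3

    # What it eventually turned into once the shape was clear: a proper class.
    class Scaler:
        def __init__(self, factor):
            self.factor = factor

        def __call__(self, x):
            return x * self.factor

        def describe(self):
            return f"scales by {self.factor}"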
Just to be clear, I am not saying that this is the only or best way to program, just that there are situations where having this level of flexibility available in the language can be incredibly freeing.
And don't get me wrong, I also love types for two reasons. First, because they let you create a machine-checked specification of what your code is doing, and the more powerful the type system, the better you can do at capturing important invariants in the types. Second, because powerful type systems enable their own kind of exploratory programming where instead of experimenting with code until it does what you want you instead experiment with the types until they express how you want your program to behave, after which writing the implementation is often very straightforward because it is so heavily constrained by the types (and the compiler will tell you when you screwed up).
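Python's optional annotations are nowhere near the kind of powerful type system I have in mind, but even they give a taste of it: once you've sketched the signature below, the implementation is more or less forced by the types.

    from typing import Callable, Optional, TypeVar

    A = TypeVar("A")
    B = TypeVar("B")

    # The signature pins down the behavior: apply f if there is a value,
    # otherwise propagate None.
    def map_optional(f: Callable[[A], B], x: Optional[A]) -> Optional[B]:
        if x is None:
            return None
        return f(x)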
Oh, wow, where is this?
Interesting, but are those commits to the glibc library itself or commits to the Debian package of it? The link makes it look like the latter, but I could be wrong.
In fairness, given everything else I've heard about this venture, there probably was cyanide available, it's just that the company decided to cheap out and instead of buying actual cyanide they packed a large sack of almonds.
I'd be interested in hearing what it is about the language that has gotten you so excited about it.