[-] chrash0@lemmy.world 20 points 1 month ago

you have to do a lot of squinting to accept this take.

so his wins were copying competitors, and even those products didn’t see success until they were completely revolutionized (is Bing in 2024 a Ballmer success? is .NET becoming widespread his doing?). one thing Nadella did was embrace the competitive landscape and open source, with key acquisitions like GitHub and the open sourcing of .NET, and i honestly don’t have the time to fully rebut this hot take. but i don’t think the Ballmer haters are totally off base here. even if some of the products started under Ballmer are now successful, it feels disingenuous to attribute their success to him. it’s like an alcoholic dad taking credit for his kid becoming an actor. Microsoft is successful despite him.

[-] chrash0@lemmy.world 25 points 1 month ago

these days Hyprland but previously i3.

i basically live in the terminal unless i'm playing games or in the browser. these days i use most apps full screen and switch between desktops, and i launch apps using wofi/rofi. this has all become very specialized over the past decade, and it almost has a “security by obscurity” effect where it’s not obvious how to do anything on my machines unless you have my muscle memory.

not that i necessarily recommend this approach generally, but i find value in mostly using a keyboard to control my machines and minimizing visual clutter. i don’t even have desktop icons or a wallpaper.
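
for reference, a minimal sketch of the kind of bindings i mean in hyprland.conf (the keys and the kitty terminal here are illustrative, not my actual config):

```
# launch apps with wofi instead of hunting for icons
bind = SUPER, D, exec, wofi --show drun

# terminal one keypress away (kitty is just an example)
bind = SUPER, RETURN, exec, kitty

# fullscreen the focused window
bind = SUPER, F, fullscreen

# jump straight to a workspace instead of alt-tabbing
bind = SUPER, 1, workspace, 1
bind = SUPER, 2, workspace, 2
```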

[-] chrash0@lemmy.world 99 points 2 months ago

this is one of those facts i have to struggle to keep to myself to avoid coming off as an insufferable nerd

[-] chrash0@lemmy.world 69 points 3 months ago

the semantics of C make that virtually impossible. the compiler would have to reject patterns that are currently valid C and more than likely heavily used in existing code, thus we have Rust, which built its semantics around those safety concepts from the beginning. there’s just no way for the compiler to know the lifetime of some variables without some semantic indication.
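
a contrived illustration (the classic `longest` example, nothing specific to any one codebase): the lifetime annotation is exactly the semantic indication C has no room for.

```rust
// in C, returning a pointer tied to a shorter-lived value compiles fine
// and dangles at runtime. in Rust, the signature itself carries the
// lifetime information, so the equivalent mistake is a compile error.

fn longest<'a>(x: &'a str, y: &'a str) -> &'a str {
    // 'a says: the return value lives no longer than the
    // shorter-lived of the two inputs
    if x.len() > y.len() { x } else { y }
}

fn main() {
    let a = String::from("long string is long");
    let result;
    {
        let b = String::from("short");
        result = longest(&a, &b);
        println!("{result}"); // fine: b is still alive here
    }
    // println!("{result}"); // error[E0597]: `b` does not live long enough
}
```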

[-] chrash0@lemmy.world 57 points 5 months ago

most Zionists i’ve met are white Protestants, and most Jews i’ve met aren’t Zionists…

[-] chrash0@lemmy.world 22 points 5 months ago

lol this is like Ben Shapiro telling people in areas threatened by climate change to sell their houses. “to who? fucking Aquaman?”

best case you’ll get $10 and whoever bought it will end up back here

[-] chrash0@lemmy.world 69 points 5 months ago

> Users who need to run their application in Python 2 should do so on a platform that offers support for it

damn go off

[-] chrash0@lemmy.world 29 points 6 months ago

i guess the rare thing is the public commitment, but Apple has generally had a good track record for updates compared to its Android counterparts, which have previously missed their support targets or set laughable ones like 2 years.

[-] chrash0@lemmy.world 22 points 6 months ago

as big as the circle jerk is here against AI, i think it’s on the whole a good thing if we use it for what it’s actually good at: approximating an answer. but once companies start promising things like security, which require 100% accuracy, they totally lose me. as someone who has worked on recognition systems, i will be opting out so fast of things like facial scanning at the point of sale. it’s not AI, because it’s not actually intelligent: you can’t reason with it or change its mind without rigorous retraining.

write some shitty code for me to fix? fine. buy a TV whose facial scanning was built by whatever bs contractor bid the lowest? gtfo. startup founders, executives, and managers will promise the moon when they’re so far up their own ass they’ve never even seen it.

[-] chrash0@lemmy.world 60 points 7 months ago

honestly 8 space indents always felt a bit ridiculous to me. i usually use 4 since it’s more conventional in most languages but could also be happy with 2.

weird hill to die on. use the default settings unless you have a good reason not to. the argument itself is a waste of time on projects that want to get things done.
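
if a project really needs to settle it, settle it once in the repo and move on. a minimal .editorconfig sketch (the values are just an example, not a recommendation):

```ini
# .editorconfig — let the repo decide, not the argument
root = true

[*]
indent_style = space
indent_size = 4

# Makefiles require tabs regardless of anyone's opinion
[Makefile]
indent_style = tab
```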

[-] chrash0@lemmy.world 32 points 7 months ago* (last edited 7 months ago)

i really want to like Nix.

gave it a shot a few years ago, but i felt like documentation and community support weren’t really there yet. this was long before Nix surpassed Arch in number of available packages. people still complain about documentation, especially of the Nix language. i see a lot of package authors using it, which tempts me to start using at least the package manager, but a lot of packages don’t ship Nix files. the allure of GitOpsing my entire OS is very tempting, but then there have been these rumors (now confirmed) of new forks, and Guix splintered off much earlier. for something that’s ostensibly supposed to be the most stable OS, that makes me nervous. it also seems to carry some nontrivial overhead: building packages from source, retaining old package generations, etc.

the pitch for Nix is really appealing, but with so much uncertainty it’s hard to pull the trigger on migrating anything. heck, if i could pull off some PoCs, i think my enterprise job might consider adopting it, but it’s as hard a recommend for me today as it was 5 years ago.
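
to be concrete about what the pitch is: a single file like this (the hostname and layout are hypothetical) can describe the whole machine, which is what makes the GitOps angle so tempting.

```nix
{
  description = "whole OS as a git repo";

  inputs.nixpkgs.url = "github:NixOS/nixpkgs/nixos-unstable";

  outputs = { self, nixpkgs }: {
    # `nixos-rebuild switch --flake .#myhost` rebuilds the entire system
    nixosConfigurations.myhost = nixpkgs.lib.nixosSystem {
      system = "x86_64-linux";
      modules = [ ./configuration.nix ];
    };
  };
}
```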

[-] chrash0@lemmy.world 58 points 7 months ago

what else would it be? it’s a pretty common embedded target. dev kits from Qualcomm come with Android and use the Android bootloader and debug protocols at the very least.

nobody is out here running a plain Linux kernel and maintaining a UI stack while AOSP exists. would be a foolish waste of time for companies like Rabbit to use anything else imo.

to say it’s “just an Android device” is both true and a mischaracterization. it’s likely got a lot in common with a smartphone, but they’ve made modifications and aren’t supporting app stores or sideloading. that doesn’t mean you can’t do it, just don’t be surprised when it doesn’t map 1:1 to a stock phone.
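
if you have one of these devices and want to check for yourself, the standard Android debug tooling is the tell (assuming the vendor left debugging reachable, which they often don’t):

```sh
# if it answers adb, it's Android under the hood
adb devices
adb shell getprop ro.build.version.release   # Android version
adb shell getprop ro.product.manufacturer    # who built the image

# the bootloader side of the same story
fastboot devices
```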
