[-] chrash0@lemmy.world 68 points 3 days ago

> Users who need to run their application in Python 2 should do so on a platform that offers support for it

damn go off

[-] chrash0@lemmy.world 12 points 1 week ago* (last edited 1 week ago)

even though crits on ability checks aren’t rules as written, i like to honor them, with a bit of nuance. if you’re super stealthy and roll a 1, maybe it makes a small noise but doesn’t cause an alarm. if you’re dumping strength on your wet noodle wizard, maybe you’re able to move that heavy thing an inch on a 20. it’s always situational though. people get excited to see a crit, and i think it makes the game more fun.

[-] chrash0@lemmy.world 15 points 2 weeks ago

nushell is excellent for dealing with structured data. it’s also great as a scripting language.

[-] chrash0@lemmy.world 29 points 1 month ago

i guess the rare thing is the public commitment, but Apple has generally had a good track record for updates compared to its Android counterparts, which have either missed their support goals or set laughably short ones like 2 years.

[-] chrash0@lemmy.world 18 points 1 month ago

same as with crypto. the software community started using GPUs for deep learning, and the GPU makers were just meeting that demand

[-] chrash0@lemmy.world 12 points 1 month ago

tbh this research has been ongoing for a while. this guy has been working on the problem for years in his homelab. it’s also long been suggested that this could be a step toward better efficiency.

this definitely doesn’t spell the end of digital electronics. at the end of the day, we’re still going to want light switches, and it’s not practical to have a butter-spreading robot that can experience an existential crisis. neural networks, both organic and artificial, perform more or less the same function: given some input, predict an output and attempt to learn from that outcome. the neat part is that when you pile up a trillion of them, you get a being that can efficiently adapt to scenarios it’s not familiar with.
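to make that loop concrete, here’s a toy sketch (my illustration, nothing from the article) of a single artificial neuron doing exactly that: predict, compare, adjust.

```python
# toy sketch of the predict -> compare -> adjust loop a single artificial
# neuron runs; purely illustrative, not from the article
def train_neuron(samples, lr=0.1, epochs=50):
    w, b = 0.0, 0.0  # start knowing nothing
    for _ in range(epochs):
        for x, target in samples:
            pred = w * x + b     # given some input, predict an output
            err = pred - target  # compare the prediction to the outcome
            w -= lr * err * x    # nudge the weights to shrink the error
            b -= lr * err
    return w, b

# learn y = 2x + 1 from a handful of examples
w, b = train_neuron([(0, 1), (1, 3), (2, 5), (3, 7)])
print(w, b)  # ends up close to 2.0 and 1.0
```

pile up a few trillion of those little adjusters and you get the kind of adaptability described above.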

you’ll notice they’re not advertising any experimental results with regard to prediction benchmarks. that’s because 1) this isn’t large-scale enough to compete with state-of-the-art ANNs, 2) the relatively low resolution (16-bit) means inputs and outputs will be simple, and 3) this is more of a SaaS product than an introduction to organic computing as a concept.

it looks like a neat API if you want to start messing with these concepts without having to build a lab.

[-] chrash0@lemmy.world 22 points 1 month ago

as big as the circle jerk against AI is here, i think it’s on the whole a good thing if we use it for what it’s actually good at: approximating an answer. but once companies start promising things like security that require 100% accuracy, they totally lose me. as someone who has worked on recognition systems, i will be opting out so fast of things like facial scanning at the point of sale. it’s not AI because it’s not actually intelligent: you can’t reason with it or change its mind without rigorous retraining. write some shitty code for me to fix? fine. buy a TV with whatever bs contractor bid the lowest for the facial scanning job? gtfo. startup founders, executives, and managers will promise the moon when they’re so far up their own ass they’ve never even seen it.

[-] chrash0@lemmy.world 15 points 1 month ago

there are language models that are quite feasible to run locally for easier tasks like this. “local” rules out both ChatGPT and Copilot, since those models are enormous. these days “AI” generally means machine-learned neural networks, even if a pile of if-else statements used to pass for it.

not sure how they’re going to handle low-resource machines, but as far as AI integrations go this one is rather tame
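for a sense of scale, running a genuinely small model locally is only a few lines with something like Hugging Face transformers; distilgpt2 below is just a stand-in for whatever small model fits the task, not a guess at what they’re shipping:

```python
# minimal sketch of running a small language model fully locally with the
# Hugging Face transformers library; distilgpt2 (~82M parameters) is only
# an example of something small enough for a laptop CPU
from transformers import pipeline

generator = pipeline("text-generation", model="distilgpt2")

out = generator("The quick brown fox", max_new_tokens=20, do_sample=False)
print(out[0]["generated_text"])  # model runs entirely on this machine
```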

[-] chrash0@lemmy.world 13 points 1 month ago

if it’s easier to pay, people spend more

[-] chrash0@lemmy.world 60 points 1 month ago

honestly, 8-space indents always felt a bit ridiculous to me. i usually use 4 since it’s more conventional in most languages, but i could also be happy with 2.

weird hill to die on. use the default settings unless you have a good reason not to. arguing about it is a waste of time on projects that want to get things done.

[-] chrash0@lemmy.world 32 points 2 months ago* (last edited 2 months ago)

i really want to like Nix.

gave it a shot a few years ago, but i felt like the documentation and community support weren’t really there yet. that was long before Nix surpassed Arch in number of available packages, and people still complain about the documentation, especially for the Nix language itself. i see a lot of package authors using it, which tempts me to start with at least the package manager, but plenty of projects don’t. the allure of GitOpsing my entire OS is strong, but there have been these rumors (now confirmed) of new forks, while Guix splintered off much earlier. for something that’s ostensibly the most stable OS, that makes me nervous. it also seems to carry some nontrivial overhead: building packages, retaining old packages, etc.

the pitch for Nix is really appealing, but with so much uncertainty it’s hard to pull the trigger on migrating anything. heck, if i could pull off some PoCs, i think my enterprise job might consider adopting it, but it’s as hard a recommend for me today as it was 5 years ago.

[-] chrash0@lemmy.world 58 points 2 months ago

what else would it be? Android is a pretty common embedded target, and dev kits from Qualcomm come with Android and use the Android bootloader and debug protocols at the very least.

nobody is out here running a plain Linux kernel and maintaining a UI stack while AOSP exists. would be a foolish waste of time for companies like Rabbit to use anything else imo.

to say it’s “just an Android device” is both true and a mischaracterization. it’s likely got a lot in common with a smartphone, but they’ve made modifications and aren’t supporting app stores or sideloading. that doesn’t mean you can’t do it, just don’t be surprised when it doesn’t work 1:1.

