submitted 6 months ago by lseif@sopuli.xyz to c/programmerhumor@lemmy.ml
stingpie@lemmy.world 20 points 6 months ago

Can't find the exact source (I'm on mobile right now), but the code for the GPT-2 encoder uses a UTF-8 byte to Unicode lookup table to keep the vocab size down. https://github.com/openai/gpt-2/blob/master/src/encoder.py
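
For what it's worth, here's a rough sketch of the idea (paraphrased from memory of the bytes_to_unicode helper in encoder.py, so treat it as an approximation rather than the exact upstream code): every possible byte gets mapped to a printable Unicode character, so byte-level BPE can work with a base alphabet of just 256 symbols and never needs an unknown token.

```python
# Rough sketch, not the exact upstream code: map all 256 byte values to
# printable Unicode characters so BPE can treat raw UTF-8 bytes as text.
def bytes_to_unicode():
    # Bytes that are already nice printable characters keep their own codepoint.
    bs = (list(range(ord("!"), ord("~") + 1))
          + list(range(ord("¡"), ord("¬") + 1))
          + list(range(ord("®"), ord("ÿ") + 1)))
    cs = bs[:]
    n = 0
    # Everything else (control chars, whitespace, etc.) gets shifted into an
    # unused Unicode range so the mapping stays one-to-one.
    for b in range(256):
        if b not in bs:
            bs.append(b)
            cs.append(256 + n)
            n += 1
    return dict(zip(bs, (chr(c) for c in cs)))

byte_encoder = bytes_to_unicode()
print("".join(byte_encoder[b] for b in "hello world".encode("utf-8")))
```

Running that, a space comes out as 'Ġ', which is why that character shows up all over GPT-2 token dumps.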

crispy_kilt@feddit.de 3 points 6 months ago

Seriously? Python for massive amounts of data? It's a nice scripting language, but it's excruciatingly slow.

stingpie@lemmy.world 6 points 6 months ago

There are bindings for Java and C++, but Python is the industry standard for AI. The machine learning libraries themselves are actually written in C++ and are used through Python bindings, so Python doesn't tend to slow things down; machine learning is GPU-bound anyway. There are also library-specific compilers that encourage you to write Pythonic code which gets compiled down to C++.
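
To make that concrete, here's a minimal sketch assuming PyTorch (just one example of such a library, not something named above): the Python lines are thin glue, and the heavy lifting happens in compiled C++/CUDA kernels.

```python
# Minimal sketch (assumes PyTorch is installed; falls back to CPU if no GPU).
import torch

device = "cuda" if torch.cuda.is_available() else "cpu"

# Thin glue: the matmul below dispatches to a single C++/CUDA kernel, so
# interpreter overhead is negligible next to the actual computation.
a = torch.randn(4096, 4096, device=device)
b = torch.randn(4096, 4096, device=device)
c = a @ b
print(c.shape, c.device)

# "Pythonic code that gets compiled": TorchScript is one example of a
# library-specific compiler that takes a subset of Python and compiles it
# so it can run outside the Python interpreter.
@torch.jit.script
def fused(x: torch.Tensor, y: torch.Tensor) -> torch.Tensor:
    return torch.relu(x @ y) + 1.0

print(fused(a, b).shape)
```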
