[–] BatmanAoD@programming.dev 1 points 10 hours ago (1 children)

Without one, the run time system must assign some semantics to the source code, no matter how erroneous it is.

That's just not true; as the comment above points out, Python also has no separate compilation step and yet it did not adopt this philosophy. Interpreted languages were common before JavaScript; in fact, most LISP variants are interpreted, and LISP is older than C.
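
A minimal sketch of what I mean, in Python: rather than inventing a meaning for ill-typed code, it just raises at run time.

    # Python refuses to coerce; JavaScript would evaluate "1" + 1 to the string "11".
    try:
        "1" + 1
    except TypeError as e:
        print(e)  # can only concatenate str (not "int") to str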

Moreover, even JavaScript does sometimes throw errors, because some code is simply not syntactically valid, or has no valid semantics even in a language as permissive as JavaScript.

So Eich et al. absolutely could have made more things invalid, despite the risk that end-users would see the resulting error.

[–] bss03@infosec.pub 1 points 9 hours ago (1 children)

Python also has no separate compilation step and yet it did not adopt this philosophy

Yes, it did. It didn't assign exactly the same semantics, but it DOES assign a run-time semantics to min().

[–] BatmanAoD@programming.dev 3 points 8 hours ago* (last edited 8 hours ago)

I'm addressing the bit that I quoted, saying that an interpreted language "must" have valid semantics for all code. I'm not specifically addressing whether or not JavaScript is right in this particular case of min().

...but also, what are you talking about? It throws a TypeError if you call min() with no arguments.
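
A quick sketch, for the record:

    # Python assigns no value to min() with zero arguments; it raises at run time.
    # (JavaScript's Math.min() with no arguments, by contrast, quietly returns Infinity.)
    try:
        min()
    except TypeError as e:
        print(e)  # e.g. "min expected at least 1 argument, got 0"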