this post was submitted on 16 Jul 2023
1671 points (96.5% liked)
Memes
I remember how confused I was when I first encountered i=i+1... like, what 🤨? How can this be correct, this thing has to be wrong... and then you start seeing the logic behind it and you're like "oooh, yeah, that seems to work... but still, this is wrong on almost every level in math"... and then you grow a bit older and realize that coding has nothing to do with math, instead it's got everything to do with problem solving. If you like to name your variables peach, grape, c*nt, you can, and if that helps you solve the problem, even better, just make it work, i.e. solve the problem 🤷.
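A minimal sketch of why i = i + 1 reads wrong as an equation but works fine as code: it's an assignment, evaluated right-to-left (the variable names here are just illustrative).

```python
# i = i + 1 is an assignment, not an equation: the right-hand side
# is evaluated first, then the result is stored back into the name i.
i = 5
i = i + 1   # evaluate 5 + 1, then rebind i to 6
print(i)    # 6

# The name itself carries no meaning to the machine; this is equivalent:
peach = 5
peach = peach + 1
print(peach)  # 6
```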
Wait until you realize what math is all about
I think I do understand, but I'd rather embarrass myself 😅.
Coding has nothing to do with math yet the entire basis of computing and programming is Boolean algebra.
I meant as in real world applications, like how much math do you need to know to sort a table or search through an array.
But isn't that kinda true for most things? If you go down deep enough, almost all tasks end up in physics and thus maths somewhere. But if I'm stacking shelves, I don't care that there are some pretty complicated mathy physics things that determine how much weight I can stack on the shelf. I just stack it.
That's kinda how most of programming is related to maths. Yeah, math makes it all run, but I mostly just see maybe a little algebra and very simple boolean logic.
And the rest of my work is following best practices and trying to make sense of requirements.
This is what I was actually trying to say, thanks for elaborating 😄.
you don't need to worry about the load capacity of the shelf, but only because somebody else already engineered it to be sufficient for the expected load. i'd argue that you aren't the coder in this analogy, you're the end user.
But how often, as a coder, are you going low-level?
If I want to sort a list, I don't invent a sorting algo.
I don't even code out a known sorting algo.
I just type .sort(), and I don't even care which algo is used. Same with most other things. Thinking about different kinds of lists/maps/sets is something you do in university.
In reality, many languages (like e.g. Python) don't even give you the choice. There are list(), dict(), set() and that's it. And even in languages like Java, everybody just goes for ArrayList, HashMap and HashSet. Can't remember a single time since university where I was like "You know what I'd fancy now? A LinkedList." I honestly don't even know if Java offers any Map/Set implementations that don't use hash buckets.
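A quick sketch of the point above in Python: the built-in containers cover nearly everything, and the algorithm behind .sort() (Timsort, in CPython's case) is a detail you never have to pick.

```python
# Sorting: no algorithm choice, no implementation choice.
nums = [3, 1, 2]
nums.sort()              # sorts in place; CPython uses Timsort internally
print(nums)              # [1, 2, 3]

# The everyday trio: list, dict, set (illustrative names).
inventory = {"peach": 2, "grape": 5}   # dict: hash-based key lookup
fruits = {"peach", "grape"}            # set: hash-based membership test
print("peach" in inventory)            # True
print("grape" in fruits)               # True
```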
And even of Boolean logic we only use a fraction. We use and, or, not and equals. We don't use nand, nor, identity, xor, both material conditional variants, the material biconditional or their negations.
A monad is just a monoid in the category of endofunctors, what's the problem?
I'm not that good of a coder or mathematician to know what that quote means 😅😅.
It's from a longer quote in "A Brief, Incomplete and Mostly Wrong History of Programming Languages" about the language Haskell:
Some other languages like e.g. Rust also use monads. The point I was trying to make humorously was that many programming languages sometimes do use math concepts, sometimes even very abstract maths (like monads), and while it's not maths per se, programming and computer science in general can have quite a bit to do with maths sometimes.
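For a concrete flavour of the joke: below is a toy Maybe monad sketched in Python. The class and function names are made up for illustration; languages like Haskell (Maybe) and Rust (Option) provide this properly.

```python
class Maybe:
    """Toy Maybe monad: wraps a value that might be absent (None)."""
    def __init__(self, value):
        self.value = value

    def bind(self, fn):
        # The monadic part: if there's no value, short-circuit and
        # propagate the "nothing"; otherwise apply fn, which must
        # itself return a Maybe.
        if self.value is None:
            return self
        return fn(self.value)

def half(n):
    # A partial operation: only even numbers can be halved "cleanly".
    return Maybe(n // 2) if n % 2 == 0 else Maybe(None)

# Chaining works without any manual None checks between steps:
print(Maybe(8).bind(half).bind(half).value)  # 2
print(Maybe(3).bind(half).bind(half).value)  # None (fails at step one)
```

The practical payoff is the same as Rust's `?` or Haskell's `do` notation: the failure plumbing lives in one place (bind) instead of being repeated at every call site.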
Yeah, I get what you're trying to say now 😄. Still, they're mostly used when doing algos, which in real-world practical examples is almost never. We do all sorts of repetitive things, like sorting or user-input blocks, but new algos is... something that you might do at NASA, CERN or Wall Street, not your everyday programming job. Sure, you might optimize a thing or two here and there, but that's about it 🤷.
I mean, coding does have to do with math, it's usually just different notation. i = i + 1 in math notation is just i := i + 1.
That's advanced calculus, and my guess is, those notations were made up to give rise to a new field in math, which has more to do with computers than math, so I don't think that counts.
What discipline do you think Alan Turing and von Neumann were in?
Computation theory, but that's not math as in regular math. It's just a fancy way of expressing how things inside a computer work, so we can actually make better versions of it. You just have to express it somehow in math terms.
It's like saying engineers use math all the time. No, they don't. We use simple approximations of what is actually happening to dumb down the problem, because it does the job nicely and no one will notice the difference between what we used, a simple approximation, and the real thing, a full-blown advanced calculus model of the thing we're working on.
You mean they were not mathematics department professors?
Where?