Not my problem sort (infosec.pub)
Hello programmers...
I recently took a course that went through basic Python, C, and C++.
I had a hard time implementing various sorting algorithms by hand (these were exercises for exam study). Are there any resources you folks would recommend so that I can build a better grasp of sorting implementations and their efficiency?
Skiena's Algorithm Design Manual is very widely recommended for learning algorithms; I've also heard good things about A Common Sense Guide to Data Structures and Algorithms. Skiena also has video lectures on YouTube if you prefer videos.
From what I've seen, A Common Sense Guide seems to be geared more towards newer programmers, while Skiena assumes more experience. Consequently, Skiena goes into more depth, while A Common Sense Guide seems more focused on what you specifically asked for.
Algorithm Design Manual
A Common Sense Guide
Thank you, awesome! I will definitely check out this material :)
don't get discouraged. sorting algorithms come up frequently in interviews, and yes, you use them a decent amount (especially in a lower-level language like C, where you'll often end up writing your own), but they are one of the harder things to visualize in terms of how they work. I'd say avoid anything recursive for now until you can get selection and insertion sort down pat. check out the GeeksforGeeks articles on them, but also don't be afraid to google willy-nilly; you'll find the resource that makes it click eventually.
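for reference, here's a rough python sketch of both (just an illustration, the function names are made up and it's not meant as a canonical implementation, so double-check it against whatever resource you end up using):

```python
def selection_sort(items):
    # repeatedly find the smallest remaining element and swap it
    # to the front of the unsorted region
    for i in range(len(items)):
        smallest = i
        for j in range(i + 1, len(items)):
            if items[j] < items[smallest]:
                smallest = j
        items[i], items[smallest] = items[smallest], items[i]
    return items


def insertion_sort(items):
    # walk each element backwards through the already-sorted prefix
    # until it sits in the right spot
    for i in range(1, len(items)):
        current = items[i]
        j = i - 1
        while j >= 0 and items[j] > current:
            items[j + 1] = items[j]
            j -= 1
        items[j + 1] = current
    return items


print(selection_sort([5, 2, 9, 1]))  # [1, 2, 5, 9]
print(insertion_sort([5, 2, 9, 1]))  # [1, 2, 5, 9]
```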
in terms of efficiency, it does become a little more difficult to grasp without some math background. big O is known as asymptotic notation, and it describes how a function grows. for example, if you graph f1(x) = 15·log10(x) and f2(x) = x, you'll notice that once x is bigger than 20, f2(x) always has a higher output value than f1(x). in computer science terms, we'd say f1 is O(log(n)), meaning it has logarithmic growth, and f2 is O(n), or linear growth.

the formal definition of big O is that f(x) is O(g(x)) if and only if (sometimes abbreviated as iff) there exist constants C and N such that |f(x)| <= C|g(x)| for all x > N. in our example, we can pick C = 1 and N = 20, which fulfills the definition since |15·log10(x)| <= 1·|x| whenever x > 20. therefore, f1(x) is O(f2(x)).

apologies for just throwing numbers at you (or if you've heard all this before), but having even just the most basic grasp of the math is gonna help a lot. again, in terms of resources, GeeksforGeeks is always great, and googling can help you find thousands more. trust that you are not the first person to have trouble with these; most people before you have asked online about it as well.
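if you want to poke at that definition numerically, here's a quick throwaway sketch (the bounded_above helper is just something made up for illustration, not from any library):

```python
import math

def bounded_above(f, g, C, N, xs):
    # checks |f(x)| <= C * |g(x)| for every sampled x > N
    return all(abs(f(x)) <= C * abs(g(x)) for x in xs if x > N)

f1 = lambda x: 15 * math.log10(x)  # logarithmic growth
f2 = lambda x: x                   # linear growth

samples = range(2, 10_000)
print(bounded_above(f1, f2, C=1, N=20, xs=samples))  # True: witnesses f1 being O(f2)
print(bounded_above(f2, f1, C=1, N=20, xs=samples))  # False: these constants don't bound f2 by f1
```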
I also highly recommend grabbing a copy of Discrete Mathematics and Its Applications by Kenneth Rosen to dig further into the math. there are a few other types of asymptotic notation, such as big omega and big theta, even little o, that I didn't mention here but are useful for comparing functions in slightly different ways. it's a big book, but it starts at the bottom and is generally pretty well written and well laid out.
feel free to dm me if you have questions, I'm always down to talk math and comp sci.
edit: in our example, we could also pick C = 19 and N = 1, or probably a few other combinations as well. as long as the pair of constants satisfies the definition, it's correct.
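same kind of throwaway check as above, just with the alternate constants (again, only a sketch):

```python
import math

# C = 19, N = 1 also works as a witness that 15*log10(x) is O(x)
C, N = 19, 1
print(all(abs(15 * math.log10(x)) <= C * abs(x) for x in range(N + 1, 10_000)))  # True
```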