Really annoys me that this is actually O(n log n), because for large enough n the merge sort will take longer than n*1e6 seconds. Randall should know better!
You should know better too! Behaviour at large n is irrelevant to "best case" complexity analysis of sorting algorithms.
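For reference, the comic's gag can be sketched roughly like this. The function and parameter names are my reconstruction, not the comic's exact code, and the per-element delay is made configurable so the sketch actually finishes; the comic's version uses roughly 1e6 seconds per element.

```python
import time

def merge_sort(items):
    # standard O(n log n) merge sort, returning a new sorted list
    if len(items) <= 1:
        return list(items)
    mid = len(items) // 2
    left = merge_sort(items[:mid])
    right = merge_sort(items[mid:])
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

def linear_sort(items, seconds_per_item=1e-6):
    # Sleep n * seconds_per_item seconds so the linear sleep dominates
    # the runtime -- which, as the parent comment points out, only holds
    # until n log n overtakes n * seconds_per_item for large enough n.
    time.sleep(len(items) * seconds_per_item)
    return merge_sort(items)
```

The joke is that padding with a huge linear-time sleep makes the runtime "look" O(n), but the asymptotic bound is still O(n log n) because the merge sort eventually dominates.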
They need to fix their mobile website. It has large side margins for no reason, and the comic is tiny. I have to zoom in every time I visit to read the comic. Makes no sense.
There is m.xkcd.com, but I don't link to that when I post here; I only use it to copy the title text.
In this day and age, the regular site should serve a mobile-friendly page on a phone. There is CSS to detect the browser size and orientation and change the style.
Can you do it without loading a bunch of heavy scripts? Making an HTML page responsive is always a challenge for me, since I'm not a web developer. I only write HTML when I have to share some data visualization, and I couldn't figure out how to make it responsive without pulling in Bootstrap, some UI framework, etc. and using their classes and scripts.
I'd love it if vanilla CSS just had an if-statement-like thing for "portrait/landscape" or "above/below a threshold" for content width and fonts.
It actually does: there's "@media", which lets you query things about the browser, like touchscreen vs. mouse input and minimum/maximum width/height.
Example:
@media screen and (max-width: 1300px) {
  /* styles for screens narrower than 1300px */
}
And if anyone asks you to optimise the function, just mess with the sleep constant!
Sleep(1e5.9[...]), where [...] is everything else, and hope that the compiler or interpreter can handle non-integer exponents for this kind of scientific notation.
It'd be easier to do 9e5, 8e5, and so on, though: a linear decrease in time with each "optimization". 1e5.9 seems risky.
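For what it's worth, the risk is real: e-notation requires an integer exponent in most languages, so a parser will reject 1e5.9 outright and the value has to be spelled as an explicit power instead. A quick Python illustration:

```python
# "1e5.9" is not a valid float literal: e-notation only allows an
# integer exponent, so parsing it raises ValueError.
try:
    float("1e5.9")
except ValueError:
    parses = False
else:
    parses = True

# The intended value has to be written as an explicit power instead;
# it lands between 9e5 and 1e6, as the "risky optimization" intends.
value = 10 ** 5.9
```

So a compiler won't "handle" the non-integer exponent; you'd have to compute the power yourself.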
xkcd
A community for a webcomic of romance, sarcasm, math, and language.