87
submitted 1 year ago by L4s@lemmy.world to c/technology@lemmy.world

IT needs more brains, so why is it so bad at getting them?::Open-book exams aren’t nearly open enough

[-] otl 20 points 1 year ago

For me, that feeling of needing to learn new things comes not from new tech or tooling, but from needing to solve different problems all the time. There is definitely a fast-moving, hype-driven churn in web development (particularly frontend development!), and it really does wear me down. But outside of that, in IT you're almost always interacting with stuff that has been the same for decades.

Off the top of my head...

Networking. From Ethernet and Wi-Fi, up through TCP/IP and packet switching, to application protocols like HTTP.
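
To see how stable these layers are: an HTTP/1.1 request is still just plain text framed by TCP, essentially unchanged since the late-1990s spec. A minimal sketch (nothing here touches the network; `example.com` is just a placeholder host):

```python
# Build a raw HTTP/1.1 GET request by hand. This is the same plain-text
# wire format that has been in use for decades; the host is a placeholder.
def build_http_get(host: str, path: str = "/") -> bytes:
    lines = [
        f"GET {path} HTTP/1.1",
        f"Host: {host}",
        "Connection: close",
        "",  # blank line terminates the header section
        "",
    ]
    return "\r\n".join(lines).encode("ascii")

request = build_http_get("example.com")
```

Sent over an ordinary TCP socket (e.g. `socket.create_connection((host, 80))`), those bytes are a complete, valid request, which is exactly why decades-old networking knowledge still pays off.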

Operating systems. Vastly dominated by Windows and Linux. UNIX dates back to the 70s, and Windows on the NT kernel is no spring chicken either.

Hardware. There have been amazing developments over the years. But incredibly this has been mostly transparent to IT workers.

Programming. Check The Top Programming Languages 2023. Python, Java, C: decades old.

User interfaces. Desktop GUI principles are unchanged, and iOS and Android are about 15 years old now.

Even dealing with public cloud infrastructure, you're still dealing with datacentres and servers. Instead of connecting to stuff over a serial console, you're getting the same data over VNC over HTTP. When you ask for 50 database servers, you make some HTTP request to some service. You wait, and you get a cluster of MySQL or PostgreSQL (written in C!) running on a UNIX-like OS (written in C!), which you interact with via SQL (almost 50 years old now?) over TCP/IP.
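
That "ask for 50 database servers" step really is just an HTTP call carrying a small JSON document. A hypothetical sketch of such a request body — the field names and endpoint here are invented for illustration, not any real provider's API:

```python
import json

# Hypothetical provisioning payload; resource/field names are made up
# and do not correspond to any real cloud provider's API.
def build_provision_request(engine: str, count: int) -> bytes:
    payload = {
        "resource": "database-server",
        "engine": engine,   # e.g. "postgresql" or "mysql"
        "count": count,
    }
    return json.dumps(payload).encode("utf-8")

body = build_provision_request("postgresql", 50)
# In practice this body would be POSTed to the provider's HTTP API,
# e.g. urllib.request.Request("https://cloud.example/v1/instances", data=body)
```

Strip away the branding and it's JSON over HTTP over TCP/IP — the same stack as everything else.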

As I spend more time in the industry I am constantly learning. But this comes more from me wanting to, or needing to, dig deeper.

[-] kapx132@lemmy.world 3 points 1 year ago

Hardware. There have been amazing developments over the years. But incredibly this has been mostly transparent to IT workers

I'd argue that hardware has actually gotten worse over the years, at least in terms of repairability.

[-] Aceticon@lemmy.world 3 points 1 year ago* (last edited 1 year ago)

This is also my experience.

Whilst one can viably move around in IT to stay near the bleeding edge (which itself shifts from area to area slowly, over timeframes of a decade or so), most of what's done in IT is pretty much the same old same old, maybe with bigger tech stacks because the expectations of fancy features keep going up whilst the timeframes stay the same. For example, integration with remote systems over the network used to be a pretty big deal, but nowadays it's very much expected as the norm in plenty of situations. So you end up with ever larger frameworks and ever thicker stacks of external dependencies: 20 or 30 years ago it was normal to manually manage the entire hierarchy of library dependencies, whilst nowadays you pull a clean project from source control and spend the next half hour waiting for the dependencies to be downloaded by whatever dependency-management system the project's build framework (itself much more complex) uses.
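
Those ever-thicker dependency stacks are just transitive closure at work: each direct dependency pulls in its own dependencies, and so on. A toy sketch with an invented package graph (all package names are made up):

```python
from collections import deque

# Toy dependency graph: each package lists its direct dependencies.
# All names here are invented for illustration.
DEPS = {
    "myapp":         ["web-framework", "db-driver"],
    "web-framework": ["http-lib", "templating"],
    "db-driver":     ["http-lib"],
    "http-lib":      ["tls-lib"],
    "templating":    [],
    "tls-lib":       [],
}

def transitive_deps(pkg: str) -> set[str]:
    """Breadth-first walk of the graph: everything the package pulls in."""
    seen, queue = set(), deque(DEPS.get(pkg, []))
    while queue:
        dep = queue.popleft()
        if dep not in seen:
            seen.add(dep)
            queue.extend(DEPS.get(dep, []))
    return seen

# Two direct dependencies fan out to five packages in total.
print(sorted(transitive_deps("myapp")))
```

Two direct dependencies already become five downloads here; scale the graph up to real-world sizes and you get that half-hour wait on a fresh checkout.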

this post was submitted on 05 Sep 2023
87 points (93.9% liked)
