submitted 1 year ago by cyu@sh.itjust.works to c/technology@lemmy.ml
all 30 comments
[-] Kubenqpl@szmer.info 25 points 1 year ago

We were taught obsolete things in university already 5 years ago

[-] queermunist@lemmy.ml 15 points 1 year ago

"But we refuse to train those students after school or allow them to go back to college for free."

[-] beejjorgensen 14 points 1 year ago

I was taught obsolete things in college in the early 90s. But FORTRAN wasn't the useful part of the class--problem-solving and broader language exposure was.

People focus on random technologies that are being used in class as being obsolete, but that's not the point of college. You can learn technologies on your own, and if you have trouble with that, maybe practicing it in college is a good idea.

Basically we're going to drill on technology-agnostic fundamentals for 4 years, and use a wide variety of technologies and languages as vehicles for that so you get a good breadth of experience.

[-] ribboo@lemm.ee 2 points 1 year ago

So much this.

People want more “real world usage” in college and school overall. Teach kids how to do taxes, teach engineers how to use X and Y software.

Well, in 10 years there's new software that does your taxes another way, plenty of laws have changed, and there's new stuff to consider. And that software the engineers were taught? It's obsolete.

That’s why focus should be on getting people to a place where they themselves can acquire the skills needed to do those things by themselves.

[-] agressivelyPassive@feddit.de 1 points 1 year ago

Exactly. Whether you're learning Fortran, C, or Python doesn't really matter. The core concepts are almost the same.

[-] ch00f@lemmy.world 11 points 1 year ago* (last edited 1 year ago)

I remember my microcontroller course professor telling us that if we just wanted to learn how to program assembly for microcontrollers, we could just pick up a book and skip the class.

Instead, he intended to teach us problem solving with microcontrollers.

The class was based around the Intel 8085 architecture, and this was in 2010. When I left the class, I started trying to make things using 8085s and assembly. These chips were so old, they needed external memory and flash storage to operate.

Anyway, I eventually learned about the larger microcontroller world: writing C, 32-bit processors, real-time debugging, etc.

Understanding the fundamental goings on of assembly has been helpful, but it was only ever a building block.

[-] xchino@programming.dev 8 points 1 year ago

That's nothing new, I learned Novell Netware in college and Pascal in high school.

[-] XGM@lemmy.world 1 points 1 year ago

Funnily enough, I retired a dozen Netware servers in the past year, the last one just a month ago. To say they were old and outdated would be an understatement.

[-] Established_Trial@lemm.ee 8 points 1 year ago

See I’m ahead of the curve; I didn’t learn anything in college

[-] SheeEttin@lemmy.world 3 points 1 year ago

I thought my data structures class was useful. A few others were interesting. But other than that, no, Java development was not useful to anyone's daily life.

[-] sylver_dragon@lemmy.world 1 points 1 year ago

no, Java development was not useful to anyone’s daily life.

You've never worked with the US Federal Government. For every software problem the Government has, there is a Java application written to make your life a living hell trying to solve that problem. It's also even odds that said application requires a version of Java about a decade old and just mysteriously breaks with anything newer.

[-] SheeEttin@lemmy.world 1 points 1 year ago

I think that was supposed to be my daily life. Not sure what happened between brain and fingers there. Java development was probably useful to some of my classmates.

[-] qooqie@lemmy.world 8 points 1 year ago

Haven’t they been saying this since the early 2000s?

[-] pixxelkick@lemmy.world 8 points 1 year ago

The important skills haven't changed in awhile.

Version control still works the same overall.

The concepts of CI/CD are still just as important.

Understanding Arrange/Act/Assert (A/A/A) for unit testing is still the same.

All the useful patterns are just as useful.

All the same antipatterns are just as important to watch out for.

Largely speaking while languages may evolve, the core foundational principles of how to write Good Clean Code remains the same.

[-] inspired@kbin.social 7 points 1 year ago

The obsolete skill they are learning is "prompt engineering".

[-] IzzyData@lemmy.ml 6 points 1 year ago* (last edited 1 year ago)

If some piece of knowledge or skill becomes obsolete less than 4 years from its inception, then it was not important in the first place.

[-] DarthYoshiBoy@kbin.social 6 points 1 year ago

Funny story time, intentionally vague to shield identities:

I have a friend who was hired a while ago to teach a course at a local University for their new CS degree with a focus on video games. He was a bit of an expert in a particular portion of the material that they needed, and when they started putting out feelers to find someone to teach the subject matter, everyone locally in the industry gave him the highest praise and said he was the man for the job. The University met with him and eventually selected him to teach, which he did for 3 semesters. After 3 semesters, they dropped him because he didn't himself have a college degree in what he was teaching (something he had made very clear in the hiring process).

He went into making games straight out of high school; he was basically there at the ground floor, self-taught, and acknowledged by everyone in the industry locally as a foremost expert in the field where they had him teaching. But they couldn't keep him, because without his having a degree their program couldn't be accredited. So... they wanted him to have a degree in a subject he was an originator of, and without that degree they had to drop him.

He makes financial software now because the games industry was/is brutal and he wanted to see his family now and then. I've always found it hilarious that a University had to let him go because otherwise the snake wasn't eating its own tail and the ouroboros apparently can't have that.

[-] Zima@kbin.social 2 points 1 year ago

I think the article should focus on how everyone, or at least most people, keeps up with the times at work. At least when I studied, my teachers understood this issue and focused on providing a good theoretical foundation to build on; the particular technologies are just examples of what's available at the time you're being educated, not the actual focus of the education.

[-] Natal@lemmy.world 2 points 1 year ago

Applies to many fields. I studied translation at university and, kudos to the head teacher, he kept saying we worked on current software for illustration, but the point was to learn transferable skills to apply to whatever tools are trendy once on the market. Turns out I work at a firm running software more outdated than what my uni used. But I always agreed with the dude: we'll have to adapt or die as businesses.

[-] autotldr@lemmings.world 1 points 1 year ago

This is the best summary I could come up with:


In an essay, Hyams shared his top concerns around AI — one of which is how technologies like OpenAI's ChatGPT will affect the job market.

"With AI, it's conceivable that students might now find themselves learning skills in college that are obsolete by the time they graduate," Hyams wrote in the essay.

"The higher the likelihood that a job can be done remotely, the greater its potential exposure is to GenAI-driven change," the researchers wrote, referring to generative artificial intelligence.

The CEO's thoughts on AI come as labor experts and white-collar workers alike grow increasingly worried that powerful tools like ChatGPT may one day replace their jobs.

After all, employees across industries have been using ChatGPT to develop code, write real estate listings, and generate lesson plans.

For instance, Hyams said that Indeed's AI technology, which recommends opportunities to its site visitors, helps people get hired "every three seconds."


The original article contains 463 words, the summary contains 148 words. Saved 68%. I'm a bot and I'm open source!

this post was submitted on 23 Sep 2023
80 points (92.6% liked)

Technology

34819 readers

This is the official technology community of Lemmy.ml for all news related to creation and use of technology, and to facilitate civil, meaningful discussion around it.


Ask in DM before posting product reviews or ads. All such posts otherwise are subject to removal.


Rules:

1: All Lemmy rules apply

2: Do not post low effort posts

3: NEVER post naziped*gore stuff

4: Always post article URLs or their archived version URLs as sources, NOT screenshots. Help the blind users.

5: personal rants of Big Tech CEOs like Elon Musk are unwelcome (does not include posts about their companies affecting wide range of people)

6: no advertisement posts unless verified as legitimate and non-exploitative/non-consumerist

7: crypto related posts, unless essential, are disallowed

founded 5 years ago