139
submitted 1 year ago* (last edited 1 year ago) by Extrasvhx9he@lemmy.today to c/asklemmy@lemmy.ml

Another post regarding time travel got me wondering: how far back in time could I hypothetically leave a modern computer and still have the most capable engineers of that era be able to reverse engineer it, at least partially?

all 27 comments
[-] flamingo_pinyata@sopuli.xyz 127 points 1 year ago* (last edited 1 year ago)

The biggest issue would be the microchips, which require some really precise machinery to manufacture.

1930s - complete reverse engineering
By then they had an understanding of both semiconductors and computational theory. Using semiconductive materials to compute wasn't yet a thing, but there wouldn't be much surprise at the concept. Some kind of reproduction is likely: probably not a 5 nm manufacturing process like modern chip factories, but they could make something that works.

1890s - eventual understanding, but not able to manufacture
Measuring devices were sensitive enough by then to detect tiny electrical fluctuations. They would be able to tell that the device works by processing electrical signals, and could even capture those signals. The biggest missing piece is mathematical theory: they wouldn't immediately understand how those electrical signals produce images and results. Reproduction: no. Maybe they would get an idea of what's needed (refining silicon and introducing other stuff into it), but there's no way they could do it with the equipment of the day.

1830s - electricity goes into a tiny box and does calculations, wow!
This is the age of the first great electrical discoveries. They would be in awe at what is possible, and would understand at a high level how it's supposed to work. Absolutely no way to make it themselves.

1730s - magic, burn the witch!

[-] TonyToniToneOfficial@lemmy.ml 32 points 1 year ago* (last edited 1 year ago)

1730s - magic, burn the witch!

Sir Bedevere: And what do you burn, apart from witches?

Peasant: More witches!

[-] ColeSloth@discuss.tchncs.de 24 points 1 year ago

The novel ways we've come up with to make processors and circuit boards over the past 40 years have been pretty amazing. I believe you're giving the people of the 1930s too much credit here. For instance, the entire industry has known for 40+ years that making chips smaller, with more transistors, yields better performance; it has taken this long to come up with the manufacturing "tricks" needed to get down to what we have today. The same goes for RAM and hard drives.

And the code that makes it all run would be completely unreadable, to say nothing of understanding code for things that wouldn't have been named, created, or even thought of yet, or of how to program and read anything off a solid-state drive or the RAM.

The first "digital computer" was made in 1945. Giving them a laptop in the 1930s might bump that date up a bit, but most of what has happened since then has been refining the manufacturing process. They wouldn't be able to recreate the laptop at all; not even in the 1980s could they have built it.

[-] wabafee@lemm.ee 4 points 1 year ago* (last edited 1 year ago)
[-] Smokeydope@lemmy.world 2 points 1 year ago

[-] Hexagon@feddit.it 28 points 1 year ago

Depends on what you expect them to do, exactly. Today's transistors aren't much different from older ones, mainly just smaller. People of, say, 20-30 years ago may have had the technology to inspect them (an electron microscope or something like that) and the knowledge to understand them, but not the equipment to reproduce them.

If you go much farther back in time, say before integrated circuits (1960) or even transistors (1947) were invented, I think it's unlikely that anyone could reverse engineer the thing.

[-] PowerCrazy@lemmy.ml 22 points 1 year ago

Not very far, tbh. The basic concepts of how to arrange transistors to do useful work are well understood and have been since before the transistor was invented. The biggest problem major CPU manufacturers face is how to physically create those CPUs. The industrial processes that make them possible are technological marvels, but the engineers absolutely know what they want to do, just not how to do it. https://www.tomshardware.com/news/intels-long-awaited-fab-42-is-fully-operational

[-] WebTheWitted@beehaw.org 5 points 1 year ago* (last edited 1 year ago)

Yeah, modern CPU production is incredible and a pet interest of mine lately. I'd highly recommend the Asianometry YT channel if anyone wants to go deep.

https://youtube.com/@Asianometry

[-] PipedLinkBot@feddit.rocks 1 points 1 year ago

Here is an alternative Piped link(s):

https://piped.video/@Asianometry?si=qj4fqBK2pwyQozG2

Piped is a privacy-respecting open-source alternative frontend to YouTube.

I'm open-source; check me out at GitHub.

[-] intensely_human@lemm.ee 15 points 1 year ago

ITT: people conflating “reverse engineer” with “emulate”

[-] Bwaz@lemmy.world 13 points 1 year ago

Zero years. Having a computer chip wouldn't give much of a clue about how it was made.

[-] jhulten@infosec.pub 13 points 1 year ago

The bane of all time travel is materials engineering and supply chains.

[-] ninjan@lemmy.mildgrim.com 13 points 1 year ago

That depends on what we mean by reverse engineer.

Working out the overall purpose and function of each component, the PCB, and the PSU could go pretty far back, maybe even prior to the invention of the semiconductor. That said, I think a lack of knowledge of electricity, and even of AC current, would make it very hard, since they couldn't power it on. So my bet is around 1880, and it would need to be investigated by Nikola Tesla.

But if we mean constructing a similar one, we're going to need a lot of technology that you can't infer from looking at the components, no matter what tools you have. The construction of a modern CPU/GPU is absolutely mind-blowingly complex. 10 years back: for sure. 20 years: likely. 30 years: I'm unsure. 40 years: it's going to be extremely alien. 50 years: completely impossible.

[-] shinigamiookamiryuu@lemm.ee 12 points 1 year ago

When the hot air balloon was invented, citizens thought it was a monster and beat up one of the first ones when it landed, and that was in the 1700s (right before the Hartlepool monkey incident, go figure). If people couldn't fathom the mechanisms of the hot air balloon, an invention of their own day, it would surprise me if anyone before the advent of retro computers would understand that a modern one wasn't some kind of golem.

[-] Candelestine@lemmy.world 9 points 1 year ago

Good post, this was fun to read.

[-] j4k3@lemmy.world 9 points 1 year ago

I think the answer is somewhere in here: https://en.m.wikipedia.org/wiki/Timeline_of_microscope_technology

I mean it's just layers that can be removed by lapping. The real question is the ability to see the smallest features.

Chip fabs are the most expensive human industry in all of history. Production requires massive amounts of rare resources and extreme tooling precision. Like, start looking up some of the nastiest chemicals that have ever been produced, mostly ones intended to kill people, and you're looking at the inventory stocking list for a fab.

The YT channel Asianometry is based out of Taiwan and has a lot of ties to the industry, if you want a good idea of what's involved in the various fab nodes and their histories.

[-] ShaunaTheDead@kbin.social 8 points 1 year ago

Technically everything that a computer does can be simulated using any medium, pen and paper for example, or rocks and sand (relevant XKCD).

As for actually creating the parts needed, well, a modern computer is just a very advanced Turing machine, which only requires three parts to operate: a tape for storing memory, a read/write head for reading and altering the data in memory, and a state-transition table that tells the head what to write and which way to move along the memory tape.

The memory tape and the transition table themselves can be made of anything, even pen and paper or rocks as in the previous examples. The read/write head could be anything as well; earlier generations of computers used vacuum tubes switching on and off for that role.

So conceptually, the answer is any time humans were intellectually capable of reasoning out the logic; their computer would just run much slower and be less useful the farther back in time you go.
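
To make that concrete, here's a minimal Python sketch of the idea (the transition-table format and the bit-flipping example are made up purely for illustration): a tape, a head position, and a state-transition table really are all you need.

```python
# A toy Turing machine: a tape, a head position, and a state-transition table.
# The example table below flips every bit on the tape and then halts.
# (This example only ever moves right, so the tape is only grown on that side.)

def run_turing_machine(tape, table, state="start", blank="_"):
    tape = list(tape)
    head = 0
    while state != "halt":
        symbol = tape[head] if head < len(tape) else blank
        write, move, state = table[(state, symbol)]
        if head == len(tape):
            tape.append(blank)          # grow the tape as needed
        tape[head] = write
        head += 1 if move == "R" else -1
    return "".join(tape)

# (current state, symbol read) -> (symbol to write, move direction, next state)
flip_bits = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}

print(run_turing_machine("101100", flip_bits))  # prints 010011_
```

Everything a modern CPU does is, in principle, reducible to this kind of table lookup; the silicon just does it billions of times per second.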

[-] intensely_human@lemm.ee 20 points 1 year ago

“Reverse engineering” means tearing a machine down to figure out how it works.

Regardless of how much computation you can do with an abacus or an army of men with flags acting as logic gates, without sufficient microscopy you cannot reverse engineer a microchip.

That’s what this question is getting at: what previous incarnations of civilization would be able to study a computer and figure out what it’s doing?

[-] ShaunaTheDead@kbin.social 1 points 1 year ago

Well, if we're considering alternate histories where a civilization gains access to a working computer, then it's basically impossible to tell. It depends on so many variables: whether someone in that time period takes a significant enough interest to even look into it in the first place, whether they're smart enough to work out what it's doing, and even whose hands the computer falls into.

There's a famous example of an ancient Roman trinket that was kept in the collection of a wealthy person. It was a small device that, when placed over hot water, would spin. We would recognize that device today as a steam turbine, and we would know that it could have sparked the Industrial Revolution if the right person had gotten a chance to look at it.

So if an ancient civilization got their hands on a modern computer and managed to do anything useful at all with it, it would alter world history to the point that we wouldn't recognize it anymore. Even if they didn't directly reverse engineer the computer but instead gained insight into other technologies like electricity or plastic production, it would alter world history in such a way that the modern computer would almost certainly be produced much earlier than in our own history, which kind of nullifies the point of the question.

[-] KISSmyOS@lemmy.world 6 points 1 year ago

I'd say that before the 19th century, this theoretical computer would be so much slower that the idea of constructing such a machine wouldn't even occur to people, because there would be no problem it could solve and no task it could help with.

[-] intensely_human@lemm.ee 2 points 1 year ago

It would help with playing solitaire if they didn’t have a deck of cards nearby

[-] SatanicNotMessianic@lemmy.ml 3 points 1 year ago
You have to define what you mean by "modern computer." If we really break things down, an abacus of infinite size would be Turing complete. It would take a really long time to play Doom on it, though, and it would need a person (or people) to operate it. However, the technology to do so would have been available starting around 2500 BCE, or even earlier if you want your time traveller to also invent the abacus. If you want something a bit more pragmatic, we can look to Charles Babbage and Ada Lovelace, who are generally credited with creating the world's first programmable computer, with a number of its functions still in use today. Babbage was working in the mid-19th century, but given knowledge of his work, the idea could probably be pushed back a bit as well. If you want to go in the other direction and make it even weirder and less practical, you can perform computation with a large room full of people passing slips of paper back and forth, each doing a simple logical operation on them (see the sketch at the end of this comment).

My point is that there's the current state of hardware technology, which depends on a whole chain of technological advances, and then there's computational logic, which is where the "universal" part of the universal Turing machine comes in.

If you're talking solely about hardware and modern electronics, there's a whole set of dependencies on industrial engineering and chemistry that goes from gears to vacuum tubes to diodes, which is interesting in its own right. What I guess I'm saying is that the advancements in the theory of computation (elements of theoretical architectures and mathematics) are distinct from the hardware they run on. If you were to go back and teach calculus and the theory of computation to Da Vinci, I imagine he'd come up with something clever.
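
To make the "room full of people passing slips of paper" picture concrete, here's a minimal Python sketch (the wiring and names are hypothetical, purely for illustration) in which each "person" only ever computes a NAND of two bits, yet chaining them is already enough to add numbers.

```python
# Each "person" performs a single NAND; everything else is just wiring.
def nand(a, b):
    return 0 if (a and b) else 1

# Classic constructions: AND, OR, XOR built from nothing but NAND.
def and_(a, b):
    return nand(nand(a, b), nand(a, b))

def or_(a, b):
    return nand(nand(a, a), nand(b, b))

def xor(a, b):
    t = nand(a, b)
    return nand(nand(a, t), nand(b, t))

# A full adder: add two bits plus a carry, producing a sum bit and a carry bit.
def full_adder(a, b, carry_in):
    partial = xor(a, b)
    total = xor(partial, carry_in)
    carry_out = or_(and_(a, b), and_(partial, carry_in))
    return total, carry_out

# Add two 4-bit numbers given least-significant bit first.
def add4(xs, ys):
    out, carry = [], 0
    for a, b in zip(xs, ys):
        s, carry = full_adder(a, b, carry)
        out.append(s)
    return out + [carry]

print(add4([1, 0, 1, 0], [1, 1, 0, 0]))  # 5 + 3 = 8 -> [0, 0, 0, 1, 0]
```

NAND is functionally complete, so every other gate, and ultimately an entire ALU, can be built by composing that one operation; the hard part has always been the physical substrate, not the logic.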

[-] MomoTimeToDie@sh.itjust.works -1 points 1 year ago

Probably not very far, all things considered, because if you go too far back, modern semiconductors might as well be a magic rock as far as the technology of the time is concerned. You can't just crack open that flashy new Ryzen to see what makes it tick.
