Another post about time travel got me wondering: how far back in time could I hypothetically leave a modern computer and have the most capable engineers of the era reverse engineer it, at least partially?

  • flamingo_pinyata@sopuli.xyz · 130 points · 1 year ago (edited)

    The biggest issue would be the microchips, which require some really precise machinery to manufacture.

    1930s - complete reverse engineering
    By then they had an understanding of both semiconductors and computational theory. Using semi-conductive materials to compute wasn’t yet a thing, but the concept wouldn’t come as much of a surprise. Some kind of reproduction is likely: not a 5 nm manufacturing process like modern chip factories use, but they could make something.

    1890s - eventual understanding, but not able to manufacture
    Measuring devices were sensitive enough by then to detect tiny electrical fluctuations. They would be able to tell that the device functions by processing electrical signals, and even capture those signals. The biggest missing piece is mathematical theory - they wouldn’t immediately understand how those electrical signals produce images and results. Reproduction - no. Maybe they would get an idea of what’s needed - refining silicon and introducing other materials into it - but there’s no way they could do it with the equipment of the day.

    1830s - electricity goes into a tiny box and does calculations, wow!
    This is the age of the first great electrical discoveries. They would be in awe of what is possible, and understand at a high level how it’s supposed to work. Absolutely no way to make it themselves.

    1730s - magic, burn the witch!

  • PowerCrazy@lemmy.ml · 27 points · 1 year ago

    Not very far tbh. The basic concepts of how to arrange transistors to do useful work are well understood and have been since before the transistor was invented. The biggest problem that major CPU manufacturers face is how to physically create those CPUs. The industrial processes that bring us those chips are technological marvels, but the engineers know exactly what they want to do, just not how to do it. https://www.tomshardware.com/news/intels-long-awaited-fab-42-is-fully-operational

  • Hexagon@feddit.it · 27 points · 1 year ago

    Depends on what you expect them to do, exactly. Today’s transistors aren’t much different from older ones, mainly just smaller. People of, say, 20-30 years ago would have the technology to inspect them (an electron microscope or something like that) and the knowledge to understand them, but not the equipment to reproduce them.

    If you go much farther back in time, say before integrated circuits (1960) or even transistors (1947) were invented, I think it’s unlikely that anyone could reverse engineer the thing.

  • Bwaz@lemmy.world · 14 points · 1 year ago

    Zero years. Having a computer chip wouldn’t give much of a clue about how it was made.

  • ninjan@lemmy.mildgrim.com · 13 points · 1 year ago

    That depends on what we mean by reverse engineer.

    The overall purpose and function of each component, the PCB, and the PSU could be worked out pretty far back, maybe even before the invention of the semiconductor. But I think the lack of knowledge of electricity, and of AC current in particular, would make it very hard, since they couldn’t power it on. So my bet is around 1880, and it would need to be investigated by Nikola Tesla.

    But if we mean constructing a similar one, we’re going to need a lot of tech which you can’t infer from looking at the components, no matter what tools you have. The construction of a modern CPU/GPU chip is absolutely mind-blowingly complex. 10 years back, for sure; 20 years, likely; 30 years, I’m unsure; 40 years, it’s going to be extremely alien; 50 years, completely impossible.

  • Call me Lenny/Leni@lemm.ee · 13 points · 1 year ago

    When the hot air balloon was invented, citizens thought it was a monster and beat up one of the first ones when it landed, and that was in the 1700s (and that was right before the Hartlepool monkey incident, go figure). If people couldn’t fathom the mechanisms of the hot air balloon, an invention of their own day, it would surprise me if anyone before the advent of the retro computer would understand that a modern one wasn’t some kind of golem.

  • ShaunaTheDead@kbin.social · 13 points · 1 year ago

    Technically everything that a computer does can be simulated using any medium, pen and paper for example, or rocks and sand (relevant XKCD).

    As for actually creating the parts needed: a modern computer is just a very advanced Turing machine, which only requires three parts to operate: a tape for storing memory, a read/write head for reading and altering the data in memory, and a state transition table that instructs the head to move left or right on the memory tape.

    The memory tape and state transition table themselves can be anything, even pen marks or rocks as in the previous examples, and the read/write head could be anything as well. In earlier generations of computers, the switching of vacuum tubes on and off served the role of the read/write mechanism.

    So conceptually, the answer is any time that humans were intellectually capable of reasoning out the logic. Their computer would just run much slower and be less useful the farther back in time you go.
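    To make the three-part machine above concrete, here is a minimal sketch of a one-tape Turing machine in Python. The machine, states, and transition table are my own hypothetical example (a binary incrementer), not something from the thread; the point is just that the whole "computer" is nothing but a tape, a head, and a lookup table.

```python
# Minimal Turing machine: tape + read/write head + state transition table.
# This hypothetical example increments a binary number by 1.

def run_turing_machine(tape, rules, state="start", head=0, halt="halt"):
    """Run a one-tape Turing machine until it reaches the halt state."""
    cells = dict(enumerate(tape))  # sparse tape; unwritten cells are blank "_"
    while state != halt:
        symbol = cells.get(head, "_")
        # The transition table maps (state, symbol) -> (write, move, next state)
        write, move, state = rules[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells)).strip("_")

# Transition table for binary increment: scan right to the end of the
# number, then carry 1s leftward until a 0 or blank absorbs the carry.
rules = {
    ("start", "0"): ("0", "R", "start"),
    ("start", "1"): ("1", "R", "start"),
    ("start", "_"): ("_", "L", "carry"),
    ("carry", "1"): ("0", "L", "carry"),
    ("carry", "0"): ("1", "L", "done"),
    ("carry", "_"): ("1", "L", "done"),
    ("done", "0"): ("0", "L", "done"),
    ("done", "1"): ("1", "L", "done"),
    ("done", "_"): ("_", "R", "halt"),
}

print(run_turing_machine("1011", rules))  # prints "1100" (11 + 1 = 12)
```

    Nothing here depends on silicon: the `cells` dict could be rocks on sand and the `rules` table a sheet of paper, which is exactly the point of the comment above.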

  • j4k3@lemmy.world · 9 points · 1 year ago

    I think the answer is somewhere in here: https://en.m.wikipedia.org/wiki/Timeline_of_microscope_technology

    I mean it’s just layers that can be removed by lapping. The real question is the ability to see the smallest features.

    Chip fabs are the most expensive human industry in all of history. Production requires massive amounts of rare resources and extreme tooling precision. Start looking up some of the nastiest chemicals that have ever been produced, mostly those intended to kill people, and you’re looking at the inventory stocking list for a fab.

    The YT channel Asianometry is based out of Taiwan and has a lot of ties to the industry, if you want a good idea of what is involved in the various fab nodes and their histories.

  • SatanicNotMessianic@lemmy.ml · 5 points · 1 year ago
    You have to define what you mean by “modern computer.” If we really break things down, an abacus of infinite size would be Turing complete. It would take a really long time to play Doom on it, though. It would also need a person (or people) to operate it. However, the technology to do so would have been available starting around 2500 BCE. It could even be much earlier, if you want to have your time traveller also invent the abacus.

    If you want something a bit more pragmatic, we can look to Charles Babbage and Ada Lovelace, who are generally credited with creating the world’s first programmable computer, with a number of concepts still in use today. Babbage was working in the mid-19th century, but given knowledge of his work, it could probably be reverse engineered back a bit as well.

    If you want to go in the other direction and make it even weirder and less practical, you can perform computation with a large room filled with people passing slips of paper back and forth after doing a simple logical operation on them.

    My point is that there’s the current state of hardware technology, which depends on a whole chain of technological advances, and there’s computation logic, by which we see the “universal” part of the universal Turing machine.

    If you’re talking solely about hardware and modern electronics, there’s a whole set of dependencies on industrial engineering and chemistry that goes from gears to vacuum tubes to diodes, which is interesting in its own right. What I guess I’m saying is that the advancements in the theory of computation (elements of theoretical architectures and mathematics) is distinct from the hardware it runs on. If you were to go back and teach the calculus and the theory of computation to Da Vinci, I imagine he’d come up with something clever.
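    The "room full of people passing slips of paper" idea works because any computation can be built from one simple logical operation. A sketch, with hypothetical wiring of my own choosing: each person computes a single NAND, and composing NANDs yields every other gate and eventually an adder.

```python
# Each "person" performs one NAND on the slips of paper handed to them.
# NAND is functionally complete: XOR, AND, and therefore arithmetic
# can all be wired up from it. The gate constructions below are the
# standard ones; the room-of-people framing is just an analogy.

def nand(a, b):
    return 0 if (a and b) else 1

def xor(a, b):         # four people (NANDs)
    t = nand(a, b)
    return nand(nand(a, t), nand(b, t))

def and_(a, b):        # two people
    return nand(nand(a, b), nand(a, b))

def half_adder(a, b):
    """Add two bits: returns (sum, carry)."""
    return xor(a, b), and_(a, b)

print(half_adder(1, 1))  # prints (0, 1): 1 + 1 = binary 10
```

    Chain enough half adders together and the room can add arbitrary numbers; the hardware is people and paper, not transistors.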

  • MomoTimeToDie@sh.itjust.works · 0 points · 1 year ago

    Probably not very far, all things considered, because if you go too far back, a modern semiconductor might as well be a magic rock as far as the technology of the time is concerned. You can’t just crack open that flashy new Ryzen to see what makes it tick.