Author Topic: If a computer were sent back to 1960, how would it affect computer evolution?  (Read 4668 times)

0 Members and 1 Guest are viewing this topic.

Offline aqarwaen

  • Contributor
  • Posts: 38
  • Country: us
Let's say it was possible to send a modern-day computer back to the early 1960s or 1970s.
Two computers would be sent: one for teardown, and a second with all the information stored on an SSD/HDD.
That computer's drive would hold data on how the hardware works and how to manufacture computers. I wonder how such a thing would affect computer evolution and research. What would be different today?
 

Offline onesixright

  • Frequent Contributor
  • **
  • Posts: 586
  • Country: nl
Depends on what you're looking at.

Looking at hardware: according to Moore, the technology should right now be roughly at a 2070 level (+50 years). That assumes his law holds up, and it's already slowing down a bit. So more like 2050. Guessing :-)

https://en.wikipedia.org/wiki/Moore%27s_law
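The projection above is easy to sanity-check with a quick sketch (the two-year doubling period and the 50-year gap are the post's assumptions, not precise figures):

```python
# Back-of-the-envelope Moore's-law projection: transistor counts double
# roughly every two years, so a machine from 50 years in the future is
# ahead by 50 / 2 = 25 doublings.
GAP_YEARS = 50          # 1960s recipient vs. a present-day machine (assumed)
DOUBLING_PERIOD = 2     # years per doubling, classic formulation (assumed)

doublings = GAP_YEARS // DOUBLING_PERIOD
advantage = 2 ** doublings
print(f"{doublings} doublings -> roughly {advantage:,}x the transistor count")
```

With these numbers it comes out to 25 doublings, about a 33-million-fold transistor budget, which is why the machine itself would be so far beyond anything buildable at the time.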

 

Offline BrianHG

  • Super Contributor
  • ***
  • Posts: 2959
  • Country: ca
Nothing.  Computing today is not just about horsepower; it's also all the artists, illustrators, musicians, and user-interface specialists, plus getting people with the money and power to make such a device affordable and generate enough public interest to make these things useful.  That also means fast networking infrastructure like the internet, which lets individuals communicate and expands the use of that horsepower as well.

You might find a niche R&D lab with specialists doing some fancy math and limited physics experiments, but it could never expand to the rest of the world; consider just the materials and manufacturing infrastructure needed to make a single copy.  It's not just the CPUs and RAM; don't forget the power transistors and compact low-ESR capacitors, even for the power supply.  Such a computer would be treated as a military secret, hidden away, and used to break foreign cryptography.  That's about it.
« Last Edit: August 14, 2018, 05:26:15 pm by BrianHG »
__________
BrianHG.
 

Offline lapm

  • Frequent Contributor
  • **
  • Posts: 557
  • Country: fi
Would the engineers of that day even understand today's technology? Could they even image a modern 1x-nanometer line-width CPU with the equipment of that time?

Yes, I believe they would soon figure out it's a computer like nothing they had seen before. But replicating the technology used... that's a different matter. In the best case, fragments of the technology might spur progress in some areas, like better transistor manufacturing.

It might speed up the shrinking of line widths in processors, since they would now have evidence of what's possible.
Electronics, Linux, Programming, Science... im interested all of it...
 

Offline Pinkus

  • Frequent Contributor
  • **
  • Posts: 601
The CIA would have stored it away. Same as with the Ark of the Covenant.
Proof:
 

Offline HoracioDos

  • Frequent Contributor
  • **
  • Posts: 331
  • Country: ar
  • Just an IT monkey with a DSO
Windows would have been born earlier, and it would have been even heavier. Please send back something else.
 

Offline onesixright

  • Frequent Contributor
  • **
  • Posts: 586
  • Country: nl
Windows would have been born earlier, and it would have been even heavier. Please send back something else.
Stolen earlier that is... [emoji848]
 

Offline Berni

  • Super Contributor
  • ***
  • Posts: 2411
  • Country: si
It might speed things up a little by showing them what can be done, but that's about it.

They would still need to develop all of this chip manufacturing technology from scratch. Even if they had the complete plans for a modern PC, they would not be able to do much with them, since they had no way of putting a billion transistors in a chip. They would have trouble even looking at a CPU die without today's fancy electron microscopes, and they would have problems probing the high-speed buses with the test equipment of the time.

Provided they also had the software and documentation to write programs for it, it would be very useful as the world's fastest supercomputer, perhaps furthering science by letting scientists run massive simulations. But given the importance of such a thing, this computer would likely end up captured by the government for military use.

The actual problem with computing in the 60s was that a person couldn't own a computer. The technology to manufacture a computer powerful enough to be useful, yet cheap enough for an average person to afford, was missing. There wasn't even commercial interest in making one. It didn't make sense for an average person to own a computer when computers were giant, power-hungry, incredibly expensive machines, so complicated to use that the average person couldn't even use one to calculate the result of 1+1.

It's the home computer revolution that really started it all. This is when computers became cheap enough for an average person to own.

So if you ask me, if you wanted to kickstart the computing revolution, you would need to give a pallet full of 68000 or Z80 CPUs to every major consumer electronics manufacturer in the 60s, along with some 16K RAM chips and a video generator IC, including documentation on how to use them and how to build a computer with them. This would get the first home computers into the hands of the people, and once that happens there is a big market, which funnels lots of R&D money into developing these chips and mass-producing them. It would still take a while, but old home computers are simple enough to be understood by a single engineer and would not need chip manufacturing processes anywhere near as advanced.
« Last Edit: August 14, 2018, 05:51:06 pm by Berni »
 

Offline Berni

  • Super Contributor
  • ***
  • Posts: 2411
  • Country: si
Oh, and if you really want to upset the flow of history, teach an engineer Latin, let him take a suitcase full of books, and send him back about 2000 years.

The technology back then was advanced enough to give access to basic materials. So a person armed with modern knowledge, who became successful enough in society to obtain these materials and tools, could build some amazing technology from them.

One route is building a steam engine with enough power output to be useful, potentially kickstarting the industrial revolution.

Another route is making electricity practical. This unlocks many very useful inventions, such as long-distance communication (something the Roman empire could really make good use of) or, when coupled with the steam engine, efficient transfer of power. It would still take a lot of technological progress before a useful transistor could be made, but the technology used to make these electromechanical contraptions could be repurposed and combined with glassblowing to make the delicate parts for a thermionic valve. And once you have that, you basically have electronics.

This would give the Roman empire an incredible advantage and reshape nations across the entire planet. That's a lot of history to correct, so historians would probably get pretty upset with you for sending that smart ass back in time.
« Last Edit: August 14, 2018, 06:10:34 pm by Berni »
 

Offline schmitt trigger

  • Super Contributor
  • ***
  • Posts: 1328
  • Country: mx
Thank goodness that the flux capacitors are out of stock.

https://www.jaycar.com.au/flux-capacitor/p/OUTATIME
 

Offline Gyro

  • Super Contributor
  • ***
  • Posts: 4663
  • Country: gb
Oh, and if you really want to upset the flow of history, teach an engineer Latin, let him take a suitcase full of books, and send him back about 2000 years.

The technology back then was advanced enough to give access to basic materials. So a person armed with modern knowledge, who became successful enough in society to obtain these materials and tools, could build some amazing technology from them.

One route is building a steam engine with enough power output to be useful, potentially kickstarting the industrial revolution.

Another route is making electricity practical. This unlocks many very useful inventions, such as long-distance communication (something the Roman empire could really make good use of) or, when coupled with the steam engine, efficient transfer of power. It would still take a lot of technological progress before a useful transistor could be made, but the technology used to make these electromechanical contraptions could be repurposed and combined with glassblowing to make the delicate parts for a thermionic valve. And once you have that, you basically have electronics.

This would give the Roman empire an incredible advantage and reshape nations across the entire planet. That's a lot of history to correct, so historians would probably get pretty upset with you for sending that smart ass back in time.

Of course, given a 2000-year head start, we'd be enjoying some really nice warm summers by now!  :)
Chris

"Victor Meldrew, the Crimson Avenger!"
 

Offline Homer J Simpson

  • Super Contributor
  • ***
  • Posts: 1054
  • Country: us


Watch Star Trek Voyager, "Future's End", season 3, episodes 8 and 9.

 

Offline Richard Crowley

  • Super Contributor
  • ***
  • Posts: 4310
  • Country: us
  • KE7GKP
Sending a modern computer back many decades may in a limited way influence the evolution of software.
But you would have needed to kick-start the development of the vacuum tube and then transistors sufficiently to have the semiconductor infrastructure in place to support advanced design of computers that could not exist without massive integration of circuits.
 

Offline jmelson

  • Super Contributor
  • ***
  • Posts: 1241
  • Country: us
Let's say it was possible to send a modern-day computer back to the early 1960s or 1970s.
Two computers would be sent: one for teardown, and a second with all the information stored on an SSD/HDD.
That computer's drive would hold data on how the hardware works and how to manufacture computers. I wonder how such a thing would affect computer evolution and research. What would be different today?
Not all that much.  Instead of the computer, send them the information on IC design now used, and the gear used to fabricate the ICs.  In 1960 they didn't even have real ICs yet, although I think a couple of VERY primitive ones had been built in labs.
By the mid-1960s, ICs were coming into production, and by 1970 they were mainstream items, although at a very small scale of integration.  Then it was a LONG slog to work up to modern LSI, with density that the engineers of 1960/1970 could hardly dream about.

A whole lot of issues in extremely pure crystal growth, fine-line mask generation, and UV exposure technology had to be worked through, as well as the semiconductor physics as transistors got smaller and smaller.  Even in 1970, CMOS was in its infancy and almost everything was still bipolar.  CMOS really started getting traction around 1975-1980.

Jon
 

Offline jmelson

  • Super Contributor
  • ***
  • Posts: 1241
  • Country: us
Sending a modern computer back many decades may in a limited way influence the evolution of software.
But you would have needed to kick-start the development of the vacuum tube and then transistors sufficiently to have the semiconductor infrastructure in place to support advanced design of computers that could not exist without massive integration of circuits.
Not sure how many decades you are referring to in this thread.  IBM was building transistorized computers starting around 1959.  And they built some large systems, such as the 707x and 709x models.  The 7090, which came out in 1959, had 11,000 circuit boards with 50,000 discrete transistors, which quite boggles my mind!  So vacuum tubes were definitely obsolete for computers as of 1959 or so.  Still, these were HUGE systems; the 7090 filled about a dozen refrigerator-sized cabinets.  (IBM also made smaller systems for both business and scientific computing.)

Jon
 

Offline Gyro

  • Super Contributor
  • ***
  • Posts: 4663
  • Country: gb
Surely they'd either have been too scared of damaging such an unknown technology to do any sort of in-depth investigation... or they would have tried to investigate it (take an IC apart, maybe even the PCB) and irreversibly broken it. Any technology that far in advance of where you are now is probably going to end up either locked up or broken. Sending back information would be vastly more helpful than an actual device.

P.S. In the 60s, did they have any concept of a semiconductor triode, i.e. a field-effect, voltage-controlled semiconductor junction with a gate? That would have slowed the investigation down a bit!
« Last Edit: August 14, 2018, 07:50:44 pm by Gyro »
Chris

"Victor Meldrew, the Crimson Avenger!"
 

Online rsjsouza

  • Super Contributor
  • ***
  • Posts: 3393
  • Country: us
  • Eternally curious
    • Vbe - vídeo blog eletrônico
How can you say for sure they didn't take a modern computer back to the 1960s? My suspicion is that they did, and finally dropped the idea of the analog computer once and for all! A similar thing happened with discrete transistors... The future was in those newfangled so-called integrated circuits that started to become more and more ubiquitous in the 60s.  :-DD
Vbe - vídeo blog eletrônico http://videos.vbeletronico.com

Oh, the "whys" of the datasheets... The information is there not to be an axiomatic truth, but instead each speck of data must be slowly inhaled while carefully performing a deep search inside oneself to find the true metaphysical sense...
 

Offline In Vacuo Veritas

  • Frequent Contributor
  • **
  • Banned!
  • Posts: 319
  • Country: ca
  • I like vacuum tubes. Electrons exist, holes don't.
 

Offline Kjelt

  • Super Contributor
  • ***
  • Posts: 5653
  • Country: nl
They had an analog navigation/sun-moon-planets computer around 87 BC.
What happened to that knowledge? Why did it take another 1600 years before it was reinvented?

https://en.m.wikipedia.org/wiki/Antikythera_mechanism
 

Offline Richard Crowley

  • Super Contributor
  • ***
  • Posts: 4310
  • Country: us
  • KE7GKP
P.S. In the 60s, did they have any concept of a semiconductor triode, i.e. a field-effect, voltage-controlled semiconductor junction with a gate? That would have slowed the investigation down a bit!

1947 The modern, practical transistor was invented at Bell Labs by Bardeen, Brattain and Shockley. Both junction and field-effect, IIRC. 

1967 The modern, practical integrated circuit was invented by Kilby (TI) and Noyce (Fairchild).

1970 The first practical IC microprocessor (Intel 4004) in 1970.

1974 IMSAI introduced their microcomputer kit using the Intel 8080.

 
The following users thanked this post: Gyro

Offline NiHaoMike

  • Super Contributor
  • ***
  • Posts: 5532
  • Country: us
  • "Don't turn it on - Take it apart!"
    • Facebook Page
I would send some RISC-V boards complete with all available documentation. And for some real fun, create a scene that looks like a crashed UFO.
Cryptocurrency has taught me to love math and at the same time be baffled by it.

Cryptocurrency lesson 0: Altcoins and Bitcoin are not the same thing.
 

Offline Jr460

  • Regular Contributor
  • *
  • Posts: 110
Don't send a machine back.  Send them architecture guides for processors and how long each design lasted.  They can then see which ideas were dead ends and which features they really need to focus on.

Oh, and make sure they get the reports of the next round of Intel hardware bugs that were released today, 8/14.  (Just a nice way of saying: don't do this crap to make the chip faster, it led to much pain later.)

Maybe if you want to send back hardware, try some of the early 10 Mbit half-duplex Ethernet controllers for a non-PC-based machine, with a note: hey, we now run 1G full-duplex into everyone's house. Let them think our carriers supply that speed to everyone's door.  That would help a lot.
 

Online TK

  • Super Contributor
  • ***
  • Posts: 1119
  • Country: us
  • I am a Systems Analyst who plays with Electronics
I think any venture trying to get such an advanced computer to market would have failed.  History has proven that being too early is no guarantee of success...

https://appleinsider.com/articles/18/05/11/general-magic-tells-story-of-apple-vets-who-created-a-smartphone-15-years-too-early

From a pure technology perspective, computer architecture has not changed much since its inception.  CPU instruction pipelining, cache memory, virtual machines, static memory, intelligent peripherals, networking, email, and SGML (the father of HTML) all existed in the 60s-70s (IBM mainframes, OS/360, VM).  The cost was extremely high, though, so none of it could be used in low-cost computers.
« Last Edit: August 14, 2018, 11:12:25 pm by TK »
 

Offline KE5FX

  • Super Contributor
  • ***
  • Posts: 1071
  • Country: us
    • KE5FX.COM
They had an analog navigation/sun-moon-planets computer around 87 BC.
What happened to that knowledge? Why did it take another 1600 years before it was reinvented?

https://en.m.wikipedia.org/wiki/Antikythera_mechanism

Technology tends to show up when it's needed, and tends to disappear when it's not.  The earliest steam engine we know about was built by Hero of Alexandria, and it's very possible that the idea was already old at the time.  But who needs a steam engine when you have slaves?  As a result, nothing much happened for 2000 years until the economics underlying human labor were rethought and reworked. 

The same is likely true of electrochemical batteries.  If you have slaves, you don't need motors.  If you don't have mechanical motors, you don't need fuel or batteries.

If you took a 14-nm CPU back to the 1960s, it would have no effect.  They'd put the die under their strongest optical microscopes and it would still look like a featureless gray blob.  There's no way to rush the evolution that happened over the next 50 years, because it happened in lockstep with market needs that also evolved.
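The "featureless gray blob" point can be made quantitative with the Abbe diffraction limit, d ≈ λ/(2·NA); the wavelength and numerical aperture below are typical assumed values, not figures from the post:

```python
# Abbe diffraction limit: smallest resolvable feature d = wavelength / (2 * NA).
# Even a superb oil-immersion objective of the era cannot get near 14 nm.
wavelength_nm = 550        # mid-visible green light (assumed)
numerical_aperture = 1.4   # high-end oil-immersion objective (assumed)

d_nm = wavelength_nm / (2 * numerical_aperture)
print(f"optical limit ~{d_nm:.0f} nm, vs. 14 nm transistor features")
```

That works out to roughly 200 nm, more than an order of magnitude coarser than the features on the die, so visible light genuinely cannot resolve them no matter how good the optics.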
 
The following users thanked this post: Kjelt, SparkyFX

Online LaserSteve

  • Frequent Contributor
  • **
  • Posts: 812
  • Country: us
Send them something usable, such as optical fiber 15 years ahead of its time, CW diode lasers, and how to make fast EO modulators.  Give them P- and N-type gallium arsenide wafers, 741 op-amps, and LEDs.  Optical or quantum computing would be in my cell phone by now if you did.

Steve
"I've Never Heard of a Nuclear Meltdown Caused by a Buffer Overflow"  filssavi
 

Offline CatalinaWOW

  • Super Contributor
  • ***
  • Posts: 3241
  • Country: us
If you sent those computers back loaded with all the processing/fabrication information required, it might in the long run have gotten us to where we are a couple of decades earlier.

The machines themselves would have been proof of concept, and would need to be loaded with things that people of the day would recognize as useful: NASTRAN, CAD, CAM, ECAD, MATLAB, spreadsheets, and the like.  No internet browsers, email, Facebook, or games.  Photoshop is an interesting question.  Then, with the impetus of the Cold War, and if the stored information was detailed enough and covered all of the intervening technology generations, development might have begun with the dream of building a few dozen or maybe even a few thousand of these machines.  Maybe, if the relative success of the weather-prediction arrays and the Sandia nuclear simulation work were made clear, they would be dreaming of a few tens of thousands.  As the infrastructure built up, something like the market that has really driven the development might occur.  Or might not, since it might all be implemented in an Area 51-type facility.
 

Offline Circlotron

  • Super Contributor
  • ***
  • Posts: 1613
  • Country: au
Whatever computer you do end up sending into the past in an effort to kickstart the future, make sure it uses a linear address space. None of that stupid segment:offset rubbish!
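The segment:offset scheme being complained about fits in a few lines: real-mode x86 computes the physical address as segment × 16 + offset, so many distinct pointers alias the same byte. A minimal sketch (the 0xB8000 text-buffer address is a classic illustration, not something from the post):

```python
# Real-mode 8086 addressing: a 20-bit physical address is formed as
# segment * 16 + offset, so different segment:offset pairs can name
# the same byte - the aliasing that makes pointer comparison painful.
def linear_address(segment: int, offset: int) -> int:
    """Physical address of a real-mode segment:offset pointer."""
    return ((segment << 4) + offset) & 0xFFFFF  # wrap to 20 bits

# Two different-looking pointers, one physical location:
print(hex(linear_address(0xB800, 0x0000)))  # 0xb8000 (classic text buffer)
print(hex(linear_address(0xB000, 0x8000)))  # 0xb8000 again
```

A flat linear address space avoids exactly this many-to-one mapping, which is why the post asks for it.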
 

Online rsjsouza

  • Super Contributor
  • ***
  • Posts: 3393
  • Country: us
  • Eternally curious
    • Vbe - vídeo blog eletrônico
P.S. In the 60s, did they have any concept of a semiconductor triode, i.e. a field-effect, voltage-controlled semiconductor junction with a gate? That would have slowed the investigation down a bit!

1947 The modern, practical transistor was invented at Bell Labs by Bardeen, Brattain and Shockley. Both junction and field-effect, IIRC. 

1958 (Kilby), 1959 (Noyce) The modern, practical integrated circuit was invented by Kilby (TI) and Noyce (Fairchild).

1970 The first practical IC microprocessor (Intel 4004) in 1970.

1974 IMSAI introduced their microcomputer kit using the Intel 8080.
Just an adjustment... :)

IIRC, the earliest patent for a field-effect transistor dates back to the 1920s - thermionic valves simply had much more popularity due to radio.
Vbe - vídeo blog eletrônico http://videos.vbeletronico.com

Oh, the "whys" of the datasheets... The information is there not to be an axiomatic truth, but instead each speck of data must be slowly inhaled while carefully performing a deep search inside oneself to find the true metaphysical sense...
 

Offline aargee

  • Frequent Contributor
  • **
  • Posts: 724
  • Country: au
I think it would have to be the whole package, hardware and software.
To make a difference in technology? Hard; the whole context of a discovery is sometimes needed. Sure, showing how to dope a semiconductor junction or similar advancements would offer tiny leaps, but the jigsaw puzzle is hard without the "how-to-do-it" direction of a human.

But what if you sent back a laptop with Excel or Mathcad on it to Bletchley Park at the start of their code-breaking activities, or to Germany in 1939? This could very well have won the war, very early in the piece. What sort of timeline would we be travelling on now? How quickly would our technology have improved? Or would it have set us back, because all those technological advancements made during WWII were never created and built? Lots of options and what-ifs there...
Not easy, not hard, just need to be incentivised.
 

Offline Brumby

  • Supporter
  • ****
  • Posts: 9226
  • Country: au
A lot of our "desires" stem not from a lack of thinking about them, but from lacking the means to create them.

Advanced microprocessor fabrication (well, what we call "advanced" today) would have been impossible in 1960, so anything sent back would be a scientific curiosity - portrayed rather well by the Terminator 2 movie, I think.  It might have sparked a rush of interest in developing the fabrication processes - but such efforts would have waned, because the tech to do so was not within the scope of possibility.

Even if they were to find a way to fabricate the silicon at the sizes required, they would not have gone through the years of process development and refinement that taught the designers what works, what doesn't - and, more importantly, why.  That knowledge would need to be gained before yields became acceptable.

Yes, it would likely have had an impact on the development timeline - but I don't think it would have been as dramatic as some might want to believe.
 

Offline Towger

  • Super Contributor
  • ***
  • Posts: 1551
  • Country: ie
The problem with too big a jump is that the technology appears as magic.
My grandparents had a series of books from the 1940s-50s (could be earlier?) covering various industrial processes, in enough detail to get you/civilisation started if the knowledge were lost.  The advantage of technology from that era is that it is much easier to reproduce.

Edit: Crossed with Brumby, who put it better.
« Last Edit: August 15, 2018, 05:58:52 am by Towger »
 

Offline Kjelt

  • Super Contributor
  • ***
  • Posts: 5653
  • Country: nl
If you don't have mechanical motors, you don't need fuel or batteries.
There are indications that the ancient Egyptians already had primitive electrochemical/galvanic cells, probably used for electroplating or medical purposes.

Quote
If you took a 14-nm CPU back to the 1960s, it would have no effect.  They'd put the die under their strongest optical microscopes and it would still look like a featureless gray blob.  There's no way to rush the evolution that happened over the next 50 years, because it happened in lockstep with market needs that also evolved.
Agreed; the same would hold if an alien race landed on the planet tomorrow and wanted to share their technology.
If the tech is more than 50 years ahead of ours, OR it is built from combinations of elements not yet invented here, or from elements that cannot be found on our planet, we have a lot to catch up on and learn before we can understand it.

Your example of the 14nm CPU: there is no way they could have made the tech in the 60s to replicate those delicate structures.
If you sent back an early-70s 4-bit CPU, IMO you might have more success in advancing the technology. Pure speculation of course.
« Last Edit: August 15, 2018, 07:02:02 am by Kjelt »
 

Online vk6zgo

  • Super Contributor
  • ***
  • Posts: 4857
  • Country: au
Surely they'd either have been too scared of damaging such an unknown technology to do any sort of in-depth investigation... or they would have tried to investigate it (take an IC apart, maybe even the PCB) and irreversibly broken it. Any technology that far in advance of where you are now is probably going to end up either locked up or broken. Sending back information would be vastly more helpful than an actual device.

P.S. In the 60s, did they have any concept of a semiconductor triode, i.e. a field-effect, voltage-controlled semiconductor junction with a gate? That would have slowed the investigation down a bit!

FETs were already around in the 1960s, but they were mainly used as small-signal RF amplifiers.
Silicon FETs were the most common, but CMOS was being developed rapidly.

Actually, FETs were the first type of semiconductor amplifying device to be investigated, back in the 1920s/30s, but the work never got far due to the difficulty of making one that worked!

It is tempting to think that people in earlier times were ignorant of concepts which are now commonplace, but research labs were often looking at things which didn't become commercial reality for many years.
 

Online vk6zgo

  • Super Contributor
  • ***
  • Posts: 4857
  • Country: au
They had an analog navigation/sun-moon-planets computer around 87 BC.
What happened to that knowledge? Why did it take another 1600 years before it was reinvented?

https://en.m.wikipedia.org/wiki/Antikythera_mechanism

Technology tends to show up when it's needed, and tends to disappear when it's not.  The earliest steam engine we know about was built by Hero of Alexandria, and it's very possible that the idea was already old at the time.  But who needs a steam engine when you have slaves?  As a result, nothing much happened for 2000 years until the economics underlying human labor were rethought and reworked. 

The same is likely true of electrochemical batteries.  If you have slaves, you don't need motors.  If you don't have mechanical motors, you don't need fuel or batteries.

If you took a 14-nm CPU back to the 1960s, it would have no effect.  They'd put the die under their strongest optical microscopes and it would still look like a featureless gray blob.  There's no way to rush the evolution that happened over the next 50 years, because it happened in lockstep with market needs that also evolved.
They already had electron microscopes.
 

Online vk6zgo

  • Super Contributor
  • ***
  • Posts: 4857
  • Country: au
Send them something usable, such as optical fiber 15 years ahead of its time, CW diode lasers, and how to make fast EO modulators.  Give them P- and N-type gallium arsenide wafers, 741 op-amps, and LEDs.  Optical or quantum computing would be in my cell phone by now if you did.

Steve
741s date from 1967; semiconductor lasers were well known (though maybe not CW ones) but not in general use; LEDs were common in the mid-to-late '60s, used as indicator lights on panels.

Gallium Arsenide was certainly known by development labs.
Optical fibre was well known.
In the 1970s, decorative lamps using reject sections of optical fibre were a bit of a fad.
 

Offline T3sl4co1l

  • Super Contributor
  • ***
  • Posts: 13731
  • Country: us
  • Expert, Analog Electronics, PCB Layout, EMC
    • Seven Transistor Labs
The central conceit of these kinds of thought experiments is this:

A lot of them are just that, conceptual: you could hang out with da Vinci, language barrier aside, and have interesting discussions on a variety of subjects.  He would surely learn a lot, and you would stand to learn many things as well.

But as it turns out, semiconductor manufacture actually is hard.  It's multidisciplinary, involving many branches of chemistry and physics (with a bit of psychology thrown in to manage the inevitable meat-based support), on scales and precision unprecedented until then.

To say that semiconductor production could've been advanced, even by say ten years, is probably a very aggressive forecast!

In the late 19th century, say: machining was well matured, with fair precision, say, less than a tenth of a mm being a regular thing.  That's nice, but still a broad side of a barn compared to the micron precision demanded by semiconductors as we know them.  You simply can't have a machinist turning handles on a (large-reduction-ratio pantograph) Etch-a-Sketch and expect it to come out consistently. :P

At the same time (turn of the last century, say), some semiconductor theory and observation was just beginning to develop, but the theoretical tools were unprepared to handle it (physicists quickly learned that condensed-matter physics is hard, too!).  Observations were inconsistent, spooky even.  One would just as well write off another's results as "impossible to reproduce", or worse things.  Likewise, those working with such materials might've been disinclined to publish due to possibly being labeled as a loon?

The materials were partly to blame, but they didn't have a good way of knowing that.  Analytical chemistry couldn't detect parts per billion, except in some rare cases.  Studies would've been easier if the materials were pure and homogeneous from the start, but again, there wasn't much understanding of impurities, and dislocations and other defects.

And it took advances in polymer chemistry and photochemistry to make the step-and-repeat method possible, as we know it today.  At that time, photography was a fairly ordinary thing, but it tended to be grainy, and color photography for example was only in laboratories.  The best plastics at the time were mostly natural: rubber, lacquer, asphalt, and bakelite came a bit later (early 20th century).  It wasn't until much later that organic photochemistry became much better understood, and photographic resists became possible.

These just to name a few!

Incidentally, I don't think they would've had too much trouble cooking up the other process handling equipment, things like electric heaters, plasma generators and vacuum chambers.  Hard vacuum was studied at the time, though I don't know how useful it was in terms of engineering or other practice (ahem, quack medicine using x-rays aside).  Not that such equipment would've been accurate enough, either: you need a feedback control system to operate these things, which would've been quite the challenge before electronic controls came along.  (You can make a hydraulic or mechanical control system, but it's quite noisy (due to turbulence or rotating and sliding parts), and inevitably slow.)

Most of all, with so many highly-refined (for the time) technologies committed to one single goal, what good is it?  Who's buying these chips?  What are they using them for?

The driving force through the 50s and 60s was military technology, willing to pay high prices for top tech.  Consumer applications came as well, but only with the crudest of parts (germanium BJTs), and sparingly at that (unless they were faulty, in which case you might have a "six transistor" (or more) pocket radio ;) ).

You can't change an economy overnight.  It takes decades upon decades of rollout to get everyone invested into new technologies, and driving their further advance.  Nay, it's better to think of semiconductors as one spoke, among many, in the enormous feedback wheel that has pushed technology forward over the last century.

On a related subject: what, then, would be most practical to send back in history?

Probably, vacuum tubes would be achievable within one's [remaining] lifetime, given suitable patronage.

Going back to da Vinci: suppose you brought with you, in your head, the designs for electrical power generation, transmission and transformation; designs for vacuum pumps and handling equipment; and the chemical and metallurgical knowledge to win the required materials from ores.  While all this would be most peculiar to da Vinci, let's say you win his confidence and you work together to create these fantastical creations.

You could start by grading and stockpiling ores, refining them to the various metals needed (iron and copper being relatively easy; nickel less so, and harder still for tungsten, molybdenum and others).  This already requires a lot of support apparatus, as only a few of these can be processed in the traditional method (as you'd find in a handy copy of, not De Re Metallica, but perhaps the parent texts it would be compiled from).  You would have to do much of the analytical work of, say, Lavoisier, but three centuries earlier.  For example, tungsten and molybdenum might be hydrogen-reduced (since, you can't make a fire hot enough to melt them, and you wouldn't want to anyway, because both readily form carbides).  This requires oil of vitriol (sulfuric acid), say.
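The hydrogen reduction mentioned above, written out as the textbook net reactions (my addition, not from the post; modern practice runs these at roughly 700-1000 °C under flowing hydrogen):

```latex
\mathrm{WO_3} + 3\,\mathrm{H_2} \;\longrightarrow\; \mathrm{W} + 3\,\mathrm{H_2O}
\qquad
\mathrm{MoO_3} + 3\,\mathrm{H_2} \;\longrightarrow\; \mathrm{Mo} + 3\,\mathrm{H_2O}
```

Both reductions avoid carbon entirely, which is the point: a carbon-fired furnace would give you the carbides instead of the metals.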

In the end, what would you create?  You need something to show your patron.  You might create a telegraph, and show how it can be deployed over great distances to communicate between towns, or how an army might deploy wires as it marches, so it can be commanded instantly.  (Which, I suspect, is a very modern conception of warfare, and without the rapid supply lines we take for granted today, rapid communication probably wouldn't be all that tactically useful after all.)

Tim
« Last Edit: August 15, 2018, 07:34:58 am by T3sl4co1l »
Seven Transistor Labs, LLC
Electronic design, from concept to prototype.
Bringing a project to life?  Send me a message!
 

Offline Kjelt

  • Super Contributor
  • ***
  • Posts: 5653
  • Country: nl
In short: you can invent the future but be unable to build it.
Da Vinci had drawings of flying machines and helicopters but was unable to build them and put them into practice.
Babbage drew the first mechanical calculator/computer, but the precision needed to create the brass gears was not available; only 100 years later was it possible to build it, and it proved to work as invented.
 

Offline Kalvin

  • Super Contributor
  • ***
  • Posts: 1787
  • Country: fi
  • Embedded SW/HW.
Somewhere in downunder: "Hi guys, look what I've got! Today is teardown-tuesday ..."
 

Offline Rerouter

  • Super Contributor
  • ***
  • Posts: 4312
  • Country: au
  • Question Everything... Except This Statement
If you packed it full of technical and scientific books / manuals / papers, then it would matter. In fact, if word of it got out, it could even start serious international tensions, depending on who found it first.

Imagine what a country's military command would do to try and secure a device housing detailed technical information on the next 60 years of human advancement.

Equally, to put it in perspective: most research publisher databases, technical book catalogs and Wikipedia could fit into a desktop PC case packed with high-density hard drives.

The fun part is they would have no way to get the information out other than transcribing it, which would be more than 60 years' worth of reading.
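A back-of-envelope check of both claims above, with every figure a loudly guessed order of magnitude rather than the measured size of any real archive:

```python
import math

# Guessed sizes, not measured values for any real collection:
text_tb = 10.0    # plain text of papers, books and Wikipedia combined
media_tb = 590.0  # page scans, figures and datasets
drive_tb = 10.0   # capacity of one high-density hard drive

drives = math.ceil((text_tb + media_tb) / drive_tb)

# Reading time for the text portion alone, non-stop at 250 words/minute,
# assuming ~5 bytes per word of plain text.
words = text_tb * 1e12 / 5.0
years = words / 250.0 / 60.0 / 24.0 / 365.0

print(f"{drives} drives, ~{years:,.0f} years of continuous reading")
```

Even with generous guesses the storage fits in one tower case, while reading just the text would take millennia, making "more than 60 years" a very safe understatement.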
 

Offline Berni

  • Super Contributor
  • ***
  • Posts: 2411
  • Country: si
Yes, a lot of modern inventions stand on so much of the science of the last 100 years. In these modern times we have large teams working on research projects in big labs filled with all sorts of sophisticated and expensive equipment. We have already invented the easy things, so the things that are left are the difficult ones: the ones we didn't have the knowledge to understand or the technology to implement.

There are still lots of impressive things you could build before the year 1500 if you knew how. There is no way you could build a jet engine with the metallurgy of the time, but you could build an electric generator, a motor and even a light bulb. Everyone knew iron back then, so magnetic cores are not a problem (tho the losses would be awful with a solid core). Copper and brass were there too, so you can make wire and brass bearings. Insulation can be cloth and tar, or clay for solid insulators. Glass was known too, and you can make a vacuum pump using mercury falling under gravity through a tube, while carbonized string can be used as a filament, making light bulbs possible. Batteries are also not hard to make: they had lead and copper, and sulfur that can easily be turned into sulfuric acid. These electrical contraptions would likely be pretty unreliable and very inefficient by today's standards, due to missing the processes to make them to very tight tolerances, but they would work well enough to be useful.
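A quick sanity check on the mercury vacuum pump idea, using standard physical constants (the numbers are mine, not from the post):

```python
# Atmospheric pressure can only support a mercury column about 760 mm tall,
# so mercury falling through a taller sealed tube leaves a vacuum above it.
P_atm = 101_325.0   # Pa, standard atmosphere
rho_hg = 13_534.0   # kg/m^3, density of mercury
g = 9.81            # m/s^2

h = P_atm / (rho_hg * g)  # barometric height of mercury, in metres
print(f"mercury column: {h * 1000:.0f} mm -> the drop tube must be taller")
```

This is essentially the Sprengel pump principle later used to evacuate early light bulbs: any tube longer than that barometric height, with mercury trickling down it, traps and carries away the gas above.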

It's similar with making a flying machine. You can make a perfectly good plane using wood and cloth; they just didn't have a great understanding of aerodynamics, the physics behind it, and actual plane design in general (since nobody had got one working). Still, you could only make a gliding aircraft to glide off a hill, or just make a parachute. Powered flight needs a lot of power, so pedaling an aircraft is not really feasible. Yes, it can be done and has been done, but only with a highly optimized design, an athlete pedaling, and even then just for a little while. So you couldn't make a plane that can take off from a runway until you invent the internal combustion engine.

Making firearms was also perfectly possible 2000 years ago. All you need is a metal pipe and ball along with some gunpowder. They already knew about the materials that make up gunpowder a long time ago, so all that was needed was to crush and mix them in the right proportions. This obviously gives a massive military advantage.

Back in those days they had access to the materials; they just lacked the knowledge needed to turn those materials into these inventions.
 

Offline Brumby

  • Supporter
  • ****
  • Posts: 9226
  • Country: au
Imagine what a country's military command would do to try and secure a device housing detailed technical information on the next 60 years of human advancement.

That's probably a more likely fate!

With such a resource in the hands of the government - especially the military - it would get squirreled away in such a manner that it would make Area 51 look like a Disney ride.
 

Online richard.cs

  • Frequent Contributor
  • **
  • Posts: 667
  • Country: gb
  • Electronics engineer from Southampton, UK.
    • Random stuff I've built (mostly non-electronic and fairly dated).
I'm fairly convinced that piston-engine powered flight would be achievable by a time traveller to the Roman era*, assuming he could convince the locals not to kill him and fund his big project somehow (knowledge of future gold mine locations, perhaps?). I suspect that it's more practical to do that than to seriously advance semiconductor understanding and technology by taking a modern PC to the 1960s. As others have mentioned, the technology is very difficult to perfect, the microscopic scales make it incredibly difficult to reverse engineer, and then there's the problem that you really need a computer to design something with a billion transistors - the circuit doesn't fit very well into a human brain.

* I'm picturing a lightweight timber-and-silk construction, perhaps something like a Blériot XI, powered by a cast-bronze inline-4. The engine is obviously the bigger challenge; fuelling it with ethanol isn't too hard, but lifetime might be fairly short without modern lubricating oils. Bronze should be fine for the block, head and pistons; the most difficult things might be the crankshaft and valves in the pre-steel era. Wrought iron is far from ideal here but could probably be made to work. Making small quantities of blister steel could be another option. Magneto ignition only requires wire-drawing tech and the ability to create reasonable permanent magnets. In some ways a pulse jet might be easier.
 

Offline In Vacuo Veritas

  • Frequent Contributor
  • **
  • Banned!
  • Posts: 319
  • Country: ca
  • I like vacuum tubes. Electrons exist, holes don't.
I'm fairly convinced that piston-engine powered flight would be achievable by a time traveller to the Roman era*, assuming he could convince the locals not to kill him and fund his big project somehow (knowledge of future gold mine locations, perhaps?). I suspect that it's more practical to do that than to seriously advance semiconductor understanding and technology by taking a modern PC to the 1960s. As others have mentioned, the technology is very difficult to perfect, the microscopic scales make it incredibly difficult to reverse engineer, and then there's the problem that you really need a computer to design something with a billion transistors - the circuit doesn't fit very well into a human brain.

* I'm picturing a lightweight timber-and-silk construction, perhaps something like a Blériot XI, powered by a cast-bronze inline-4. The engine is obviously the bigger challenge; fuelling it with ethanol isn't too hard, but lifetime might be fairly short without modern lubricating oils. Bronze should be fine for the block, head and pistons; the most difficult things might be the crankshaft and valves in the pre-steel era. Wrought iron is far from ideal here but could probably be made to work. Making small quantities of blister steel could be another option. Magneto ignition only requires wire-drawing tech and the ability to create reasonable permanent magnets. In some ways a pulse jet might be easier.

Unfortunately the Greeks had an abundant supply of slaves, and regarded anyone, even a Greek, who dabbled with the real world as inferior. That's what slaves are for. No one bothered to count teeth because Aristotle had settled it.

https://scientiasalon.wordpress.com/2014/10/03/rescuing-aristotle/

You see, counting is work.

But:

https://en.wikipedia.org/wiki/Aeolipile

They could have done something from there.

They were also capable of forging quite long pieces of iron.
 

Offline Berni

  • Super Contributor
  • ***
  • Posts: 2411
  • Country: si
This Roman steam engine, the aeolipile, is often given a bit too much credit as the first steam engine. It was mostly just a cool desk toy that people found interesting for getting movement out of a fire. Such a design is not practical for producing enough power to actually drive something, and nobody thought it was even possible to get the power that a few horses provide from just steam.

While yes, they did have a lot of slaves, there are still things that need more power than a few humans can provide continuously. They used windmills and waterwheels for that purpose, but a practical steam engine could provide this sort of power anywhere, at any time, including on ships.

Also, sometimes one invention can make other inventions practical. The Romans did have running water, but they always used gravity to move it around. Pumping and lifting water takes a lot of work, so it was not practical to move large amounts of it. But if they had had a continuous source of enough power, like a steam engine, they would probably also have invented some sort of pump to use it. (Yes, I know they had the Archimedes screw, but it didn't see heavy use.)

Tho an internal combustion engine for an airplane, that's quite a bit of a stretch for Roman times in my opinion. Sure, you could make an internal combustion engine, since that's quite similar to a steam engine and you could build one of those. Brass is certainly a good candidate for such parts, and they had that. The problem is getting a high enough power-to-weight ratio out of such an engine. Brass is heavy and not very strong, and you need to make your engine to very tight tolerances to get the most power from it. You would also need to zone in on the ideal operating conditions, while it's difficult to make many prototypes because the parts would be very labor-intensive to make. So you would probably get at most a tractor out of an internal combustion engine, but that's an application where a steam engine can work too, and the steam engine can run on anything that burns while internal combustion can't (so it's more economical, as fuel is cheaper).
« Last Edit: August 16, 2018, 05:49:35 am by Berni »
 

Offline SparkyFX

  • Frequent Contributor
  • **
  • Posts: 508
  • Country: de
It is also a lot more fun and dedication to come up with a concept yourself instead of reading other peoples concepts and trying to copy them.
Support your local planet.
 

Offline @rt

  • Frequent Contributor
  • **
  • Posts: 960
They wouldn’t know what to do with the hardware other than break it.
It would be more useful in the hands of an educational institution (with software) to use as a supercomputer for research.
 

Online richard.cs

  • Frequent Contributor
  • **
  • Posts: 667
  • Country: gb
  • Electronics engineer from Southampton, UK.
    • Random stuff I've built (mostly non-electronic and fairly dated).
Tho an internal combustion engine for an airplane, that's quite a bit of a stretch for Roman times in my opinion. Sure, you could make an internal combustion engine, since that's quite similar to a steam engine and you could build one of those. Brass is certainly a good candidate for such parts, and they had that. The problem is getting a high enough power-to-weight ratio out of such an engine. Brass is heavy and not very strong, and you need to make your engine to very tight tolerances to get the most power from it. You would also need to zone in on the ideal operating conditions, while it's difficult to make many prototypes because the parts would be very labor-intensive to make. So you would probably get at most a tractor out of an internal combustion engine, but that's an application where a steam engine can work too, and the steam engine can run on anything that burns while internal combustion can't (so it's more economical, as fuel is cheaper).

I don't disagree that it's difficult, but I think it would be possible in a reasonable timescale, perhaps 10-20 years after arrival, giving you time to build a few machine tools, source materials, etc. Perfectly fitting parts can be produced by filing/scraping rather than machining, given cheap labour; they won't be interchangeable with other parts, but they will fit together with their matching part. If you look at some of the internal combustion engines from the early 20th century, they were pretty primitive by modern standards; the parts manufacture is very doable, it's the metallurgy that is most challenging. On a conventional design it is probably the connecting rods and crankshaft that would benefit most from modern steels; without strength here it is hard to achieve the high rotational speed needed for good power-to-weight in the overall engine. Likewise, aluminium pistons would be great (reduced connecting-rod stress for the same RPM), but making aluminium in usable quantities would be hugely difficult, perhaps not possible in a lifetime from that starting point.

As a reference, the Wright Flyer had a 12 hp, 80 kg engine, though they calculated that their limit was around 8 hp, 200 kg. With modern aerodynamic knowledge, especially better wing design, flight should be possible with less power. A "modern" low-compression, 1970s-designed lawnmower engine is around 10 hp and 20 kg. It doesn't feel unreasonable that you could build one from inferior materials and still keep it under 100 kg.
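Those figures can be lined up as power-to-weight ratios; the "hypothetical bronze inline-4" row is my invented target from the 100 kg guess above, not a real engine:

```python
# Power-to-weight comparison using the figures quoted above (hp, kg).
engines = {
    "Wright Flyer engine (1903)":   (12.0,  80.0),
    "Wrights' calculated limit":    ( 8.0, 200.0),
    "1970s lawnmower engine":       (10.0,  20.0),
    "hypothetical bronze inline-4": (10.0, 100.0),
}

limit = 8.0 / 200.0  # the Wrights' own minimum: 0.04 hp/kg

for name, (hp, kg) in engines.items():
    ratio = hp / kg
    verdict = "enough" if ratio >= limit else "too heavy"
    print(f"{name:30s} {ratio:5.3f} hp/kg ({verdict})")
```

Even the deliberately pessimistic bronze engine sits at 0.1 hp/kg, comfortably above the 0.04 hp/kg floor the Wrights calculated.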
 

Offline Kjelt

  • Super Contributor
  • ***
  • Posts: 5653
  • Country: nl
They wouldn’t know what to do with the hardware other than break it.
Yeah put 5V on the processor,  pooof  :)
« Last Edit: August 16, 2018, 09:39:00 am by Kjelt »
 

Online borjam

  • Supporter
  • ****
  • Posts: 739
  • Country: es
  • EA2EKH
They would have a very hard time understanding the software alone.

Object orientation didn't exist. Unix was created in 1969 and most computer systems were just running batch jobs at the time. If concurrent programming is unfamiliar to most nowadays, at that time it was arcane knowledge for operating system programmers.

 

Online Ian.M

  • Super Contributor
  • ***
  • Posts: 8011
@richard.cs,
OTOH a pulsejet aero-engine should be well within the capabilities of Roman era ironworkers.  The most difficult part to make would probably be the fuel injector, as the fuel pump (windmill driven) could be made of brass.  It probably wouldn't have enough thrust to get off the ground so would need a catapult launch system.
 

Offline Kjelt

  • Super Contributor
  • ***
  • Posts: 5653
  • Country: nl
Object orientation didn't exist.
what a blessed time that was  :)
 

Online vk6zgo

  • Super Contributor
  • ***
  • Posts: 4857
  • Country: au
They wouldn’t know what to do with the hardware other than break it.
It would be more useful in the hands of an educational institution (with software) to use as a supercomputer for research.

People in the 1960s weren't stupid!
The reason things weren't done was that the technology hadn't developed to make things which EEs could already imagine.
Development of specialist devices & techniques requires a market for such technology.

It would be difficult, but not impossible, to reverse engineer the thing hardware-wise; there were quite sophisticated development labs in that decade.

The first thing they would do is try to make it work, not "tear it down".
All you really have to do with a computer is find the "on" button; then, if it has the correct OS, it will start up.

OK, it comes up with a start screen, they could determine roughly how the screen worked, put further work on that in the "to do" box, then try to make it actually do something.

If kids of two or three can make a computer work, do you really think a 40-year-old PhD wouldn't be able to do so?

If it was a laptop, they would no doubt be horrified by the incredibly flimsy mechanical design.
Materials scientists would be depressed by the lack of progress in the strength of plastic materials.
 

Offline @rt

  • Frequent Contributor
  • **
  • Posts: 960
I didn’t say they were stupid. In the later tube era, they wouldn’t know what to do with the hardware other than break it (in the sense of reproducing it).
Even if all the chips were decapped perfectly, that doesn't help their non-existent fabrication process any, even though by the '60s there were certainly people thinking a lot about semiconductor technology.

So as I said, better in the hands of an educational institution that will use the computer.
 
The following users thanked this post: Richard Crowley

Online LaserSteve

  • Frequent Contributor
  • **
  • Posts: 812
  • Country: us
All fine and dandy to propose all this, but where are you going to get a good lathe and a good milling machine? Yes, I've seen the incredible hand-crafted rifles and shotguns in the Armor Court at the Cleveland Museum of Art, but without mills, lathes, vacuum pumps, glass-to-metal seals, Pyrex and fused silica, you'd spend 20-30 years just tooling up.
"I've Never Heard of a Nuclear Meltdown Caused by a Buffer Overflow"  filssavi
 

Offline particleman

  • Regular Contributor
  • *
  • Posts: 115
We would have bypassed an era of quality-made products and "advanced" straight into plastic crap. Glad that didn't happen; I like my through-hole oscilloscopes and audio gear.
 

Offline CatalinaWOW

  • Super Contributor
  • ***
  • Posts: 3241
  • Country: us
All fine and dandy to propose all this, but where are you going to get a good lathe and a good milling machine? Yes, I've seen the incredible hand-crafted rifles and shotguns in the Armor Court at the Cleveland Museum of Art, but without mills, lathes, vacuum pumps, glass-to-metal seals, Pyrex and fused silica, you'd spend 20-30 years just tooling up.

While I personally think that computers in the da Vinci era would have been so confounding as to attract no interest, 20-30 years to tool up would still put us in the computer age hundreds of years earlier, even if you throw in a century to electrify and industrialize a continent. The same would apply to the Roman era. And in spite of the slave-based economy, they would be interested, based on the weapons and ground transportation possibilities identified in the encyclopedias you would include. Weapons and transport are the lifeblood of empire.
 

Offline SeanB

  • Super Contributor
  • ***
  • Posts: 15091
  • Country: za
Probably the biggest thing to improve the Roman era, aside from teaching them Byzantine maths, would be the single item that made the modern world possible: the blast furnace. While the Romans had the ability to make reasonable wrought iron, the ability to make low-cost steel in volume would have been a big change. Just take a 15 m tall column of burning carbon (coal likely, but they used a lot of wood there as charcoal) with iron ore, silica and limestone as flux, blow air through it from the bottom while feeding in fresh finely ground material at the top, and tap the molten iron off along with the slag at the base. At that time steel was a very hit-or-miss affair, passed down from craftsman to apprentice, with almost no knowledge of which steps were actually useful to convert basic wrought iron from a bloomery (basically high-carbon steel with large amounts of impurities) into a lower-carbon steel capable of holding an edge, one that could be hardened to a brittle exterior with a ductile interior and did not shatter like wrought iron.
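For reference, the net chemistry of the column described above boils down to three textbook reactions (my addition, not from the post): the carbon burns to carbon monoxide, the CO reduces the ore, and the limestone flux carries off the silica as slag:

```latex
2\,\mathrm{C} + \mathrm{O_2} \;\longrightarrow\; 2\,\mathrm{CO} \\
\mathrm{Fe_2O_3} + 3\,\mathrm{CO} \;\longrightarrow\; 2\,\mathrm{Fe} + 3\,\mathrm{CO_2} \\
\mathrm{CaCO_3} \;\longrightarrow\; \mathrm{CaO} + \mathrm{CO_2},
\qquad
\mathrm{CaO} + \mathrm{SiO_2} \;\longrightarrow\; \mathrm{CaSiO_3}
```

None of this needs anything beyond air blast, height and heat, which is why the design is such a plausible transplant.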
 

Offline Towger

  • Super Contributor
  • ***
  • Posts: 1551
  • Country: ie
Easy to reproduce with the right knowledge - Penicillin: https://en.m.wikipedia.org/wiki/Penicillin
 

Online LaserSteve

  • Frequent Contributor
  • **
  • Posts: 812
  • Country: us
Drop a PC onto Newton's, Heaviside's or Lord Rayleigh's desk... Now that I'd pay to see...
~
Go to 1939-1940.
Really, just drop a case of helium-neon laser tubes, with a text on how to coat the dielectric mirrors, onto a desk at the Tizard Mission, the Rad Lab, Bell Labs, NCR, Berkeley, Delco, MIT, NRL, or Tuxedo Park.

It's something they could almost make, it can be powered by technology prevalent at the time, and it gives you an unbeatable measurement standard and diagnostic tool in surveying, construction, physics, chemistry, bio, ME and EE... Truly a device that changed the world nearly as much as the transistor. Or just teach the chemists to make a simple, easy-to-build atmospheric-pressure nitrogen laser in the near UV. Impact not as big, but lasers revolutionized chemistry.

Steve
« Last Edit: August 17, 2018, 06:27:48 pm by LaserSteve »
"I've Never Heard of a Nuclear Meltdown Caused by a Buffer Overflow"  filssavi
 

Offline TerraHertz

  • Super Contributor
  • ***
  • Posts: 3569
  • Country: au
  • Why shouldn't we question everything?
    • It's not really a Blog
I'd rather let's say it was possible to go back to around 200BC, and speak with whoever manufactured the Antikythera Mechanism.

Then teach those guys how to also make iron (and machine guns), and a quick course in memetics, cognitive bias, scientific method, making long-lasting documents, and how to structure a society to be both stable and able to defend itself against being overrun by psychopaths from inside and barbarians from outside (oh wait, apparently we don't know how to do that) so hopefully that level of technology wasn't lost to history for over a thousand years.

Failing that, I think the point for the biggest possible improvement to present computing science, would be to drop about a 20 page document on the desk of the group defining the ASCII standard in early 1960s. Listing all the critical information organizational concepts that present ASCII fails to include. And which result in a vast cascade of horrible workarounds and failings through the entire field of CompSci, that few even see because they are so used to 'the way things are'.




http://apod.nasa.gov/apod/ap110109.html
http://en.wikipedia.org/wiki/Antikythera_mechanism
http://www.antikythera-mechanism.gr/
/watch?v=RLPVCJjTNgk
http://www.world-mysteries.com/sar_4.htm        Part 1
http://www.world-mysteries.com/sar_4_2.htm      Part 2
Collecting old scopes, logic analyzers, and unfinished projects. http://everist.org
 

Online vk6zgo

  • Super Contributor
  • ***
  • Posts: 4857
  • Country: au
This thread has split into two, really.
(1) Dropping a modern computer into 1960 or so.
&
(2) Doing something broadly similar to Ancient Rome or other civilisations of roughly that era.

With (1), younger people continuously underestimate the capabilities of mid-20th-century science & technology, with comments like "They would break it trying to tear it down."
I believe it is much more likely, investigation would proceed very cautiously, exhausting all that could be found out externally, before trying to open up device packages.

If nothing else, it would give people ideas which could perhaps be implemented with existing technology,
& point them in useful directions.
They would be unlikely to attempt to duplicate the actual device.

(2) In my opinion, the best thing to introduce to the Ancient world would be a Montgolfier type hot air balloon, as this would be well within the capabilities of their technology.
 

Offline Mr. Scram

  • Super Contributor
  • ***
  • Posts: 7905
  • Country: 00
  • Display aficionado
@richard.cs,
OTOH a pulsejet aero-engine should be well within the capabilities of Roman era ironworkers.  The most difficult part to make would probably be the fuel injector, as the fuel pump (windmill driven) could be made of brass.  It probably wouldn't have enough thrust to get off the ground so would need a catapult launch system.
I don't think Roman metallurgy would suit a typically hot-running pulse jet design. We have to remember that the Germans in WWII had trouble building long-lasting jet engines because they lacked alloys resilient enough to stand the heat. The Brits sidestepped the problem by using a less efficient motor design. We see the same issue throughout history: steam engines only became more powerful after better alloys were developed. They knew how to increase power output, but the metals of the time wouldn't allow it.

It seems the progress of technology is much more a collection of advances making other advances possible, throwing a spanner in the works of sending a modern computer into the past.
 

Online Ian.M

  • Super Contributor
  • ***
  • Posts: 8011
Pulse jets run at a very low compression ratio, so the casing isn't heavily stressed, and of course there are no highly stressed rotating parts. That was why WWII Germany could mass-produce them out of mild steel sheet, although, as you point out, they had little success with turbine jet engines.

Early WW1-era aero engines typically had service intervals for a full overhaul as low as 10 to 20 hours. It wouldn't be *that* hard to develop a pulse jet design capable of running for 10 hours, with a replaceable combustion chamber.

A bigger concern would be vibration, which, with no good metal springs or elastomers other than animal sinew, would be difficult to handle in a lightweight airframe. I suppose it's possible one could develop a rubber industry, as there are some rubber-latex-producing plants native to Africa and India (see http://www.faculty.ucr.edu/~legneref/botany/rubber.htm), but unless one's time-travelling engineer got very lucky and had access to a lot of specialist reference books, the odds of success in a reasonable timespan are poor. OTOH, developing the 'rubber' prophylactic would probably be a good path to sufficient wealth and influence to be able to proceed with other engineering projects.
 

Offline Mr. Scram

  • Super Contributor
  • ***
  • Posts: 7905
  • Country: 00
  • Display aficionado
Pulse jets run at a very low compression ratio, so the casing isn't heavily stressed, and of course there are no highly stressed rotating parts. That was why WWII Germany could mass-produce them out of mild steel sheet, although, as you point out, they had little success with turbine jet engines.

Early WW1-era aero engines typically had service intervals for a full overhaul as low as 10 to 20 hours. It wouldn't be *that* hard to develop a pulse jet design capable of running for 10 hours, with a replaceable combustion chamber.

A bigger concern would be vibration, which, with no good metal springs or elastomers other than animal sinew, would be difficult to handle in a lightweight airframe. I suppose it's possible one could develop a rubber industry, as there are some rubber-latex-producing plants native to Africa and India (see http://www.faculty.ucr.edu/~legneref/botany/rubber.htm), but unless one's time-travelling engineer got very lucky and had access to a lot of specialist reference books, the odds of success in a reasonable timespan are poor. OTOH, developing the 'rubber' prophylactic would probably be a good path to sufficient wealth and influence to be able to proceed with other engineering projects.
I see more issues with the heat pulse jet engines tend to produce than with the stress.
 

Offline CatalinaWOW

  • Super Contributor
  • ***
  • Posts: 3241
  • Country: us
Back to the 1960 branch: maybe the biggest impact would be the even quicker demise of the second-tier big-iron computing companies, Burroughs, Sperry and the like. And maybe Cray would never have happened.
 

Online richard.cs

  • Frequent Contributor
  • **
  • Posts: 667
  • Country: gb
  • Electronics engineer from Southampton, UK.
    • Random stuff I've built (mostly non-electronic and fairly dated).
I don't think Roman metallurgy would suit a typically hot running pulse jet engine design. We have to remember that the Germans in WWII had trouble building long lasting jet engines because they lacked alloys resilient enough to stand the heat.
I had assumed a valveless design here, which at the end of the day is just a cleverly shaped hollow structure. It could be constructed from riveted iron plate, and I would not expect it to have a significantly shorter life than one of the modern amateur designs made from mild steel (a few tens of hours?), though it would almost certainly be heavier. Based on the videos I have seen of them operating, a surface temperature of 600-800 °C seems pretty normal.

Using animal sinew to suspend it in the airframe seems a reasonable technique, obviously with some intermediate part to protect it from the heat.
 

Online Ian.M

  • Super Contributor
  • ***
  • Posts: 8011
The problem with animal sinew is that its elasticity and length vary dramatically with ambient humidity. Treating the sinews with glycerine could help control that; otherwise shrinkage due to IR radiation from the engine drying the sinew could rip the airframe apart.

The Romans had a sizeable asbestos cloth industry, so thermally isolated mounts of riveted sandwich construction, with an inner metal plate wrapped in asbestos cloth between two outer plates, would be reasonably practical. Lightweight thermal IR shielding would be a lot harder; there's no aluminium-foil-covered rockwool till you've got a 20th-century-equivalent tech base. Maybe gold leaf adhered to layered asbestos cloth using waterglass?

Probably the easiest option would be riveted on lugs for chains or rods to get the mounts out of the zone where IR radiation would char any organics, then protect the sinews with asbestos cloth sleeves.
« Last Edit: August 20, 2018, 11:21:52 am by Ian.M »
 

Offline edy

  • Super Contributor
  • ***
  • Posts: 1937
  • Country: ca
    • DevHackMod Channel
Reminds me of this movie (The Last Mimzy). Anyone else see it? The video below shows some clips from the movie (amid the nanotech narration), but if you ever see the movie from start to finish, it's pretty strange. And without spoiling too much more, it probably gives a reasonable picture of how scientists from 1960 would react to stuff sent to them from today... The miniaturization and integration of ICs over the past 50 years is probably on the same scale as the jump from The Last Mimzy's future back to our time.

« Last Edit: August 20, 2018, 11:59:49 pm by edy »
YouTube: www.devhackmod.com
"Ye cannae change the laws of physics, captain" - Scotty
 

Offline BrianHG

  • Super Contributor
  • ***
  • Posts: 2959
  • Country: ca
EVERYBODY has ignored 'Where is the money?'.  Suppose you sent back all the technical manuals, engineering docs, sample ICs & complete computers by the boatload.  Who on Earth will pay to build all the infrastructure to manufacture everything, from the plastics, high density circuit boards, ICs and testing to the hard drives (just the hard drive heads alone have dimensions and spacings counted in atoms) and the super clean rooms?  What stores would buy such super expensive equipment to begin with?  Who would fund home computers at a time when IBM said only a few would be needed on Earth?  No one in their right mind could invest in everything needed just to make 1 PC, not even the US military.  And if the US military decided to do so, each PC's cost would approach the 100 million dollar mark at least, just to cover all the setup and the decades of testing to get the first CPU and RAM off the shelf.  You must gear up big from stage 1 and build chips in the millions to get the price down per CPU and RAM chip.

The fact of the matter is that our technology progression isn't delayed by our knowledge of how to build the IC, but by every sector of the technology having to align as the market for the IC comes into existence, so that magnitudes of profit can be properly realized to pay for the engineering.  So I say we are advancing as fast as economics will allow, and sending everything back to 1960 wouldn't change a thing, other than someone reading the documents and carefully timing patents just ahead of the advancements in available infrastructure and, if smart, making a mint in the process.
« Last Edit: August 21, 2018, 01:21:33 am by BrianHG »
__________
BrianHG.
 

Offline CatalinaWOW

  • Super Contributor
  • ***
  • Posts: 3241
  • Country: us
Quote from: BrianHG
EVERYBODY has ignored 'Where is the money?'.  Having sent back all the technical manuals, engineering docs, sample ICs & complete computers by the boatloads.  Who on Earth will pay for building all the infrastructure to manufacture and build everything [...]  So I say, we are advancing as fast as economics will allow and that sending everything back to 1960 wouldn't change a thing other than someone reading the documents and carefully timing patents just ahead of the advancements in available infrastructure and if smart, making a mint if done right.

Completely right and wrong at the same time.  Having all the answers laid out ahead of time changes the economics: it eliminates blind alleys and eliminates the risk of investment.  The patent question is interesting.  The documentation's existence could be interpreted as prior art and invalidate all the described patents, which would be a negative change in the economics that would slow development.
 

Online borjam

  • Supporter
  • ****
  • Posts: 739
  • Country: es
  • EA2EKH
Quote
Object orientation didn't exist.
what a blessed time that was  :)
Indeed!

Object orientation is brilliant as an intellectual achievement, but it brings serious problems of its own.
 

Offline Kjelt

  • Super Contributor
  • ***
  • Posts: 5653
  • Country: nl
Quote
Reminds me of this movie (The Last Mimzy). Anyone else see it?
Nope but it is on my watchlist now, thanks  :)
 

Offline BrianHG

  • Super Contributor
  • ***
  • Posts: 2959
  • Country: ca
Quote
Completely right and wrong at the same time   Having all the answers laid out ahead of time changes the economics.  Eliminates blind alleys and eliminates the risk of investment.
I would have to disagree here.  You may eliminate a ton of hidden risks, however, you still need customers for this stuff.  And in the '60s, as with my parents in the '80s, computers were idiot boxes not worth buying or using for anything.  Do you know how long it took the sizeable community of artists who used computers to make art just to have their work accepted as art?

You may knock us a few years ahead, but nothing like the 40 to 70 years one might dream of.
__________
BrianHG.
 

Offline Brumby

  • Supporter
  • ****
  • Posts: 9226
  • Country: au
There is also the distinct possibility that by sending back "advanced" technology that you will actually slow down technological development.

Visions of what is "over the horizon" will detract from seeing where the stepping stones need to be placed in order to get there.  It is not inconceivable that someone will plot a direct course to the objective, only to encounter difficulties that they want to push through instead of abandoning them for a more circuitous route that will prove successful.  Not to mention all the "incidental" discoveries along the way that subsequently proved to be valuable in their own right.
 

Offline BrianHG

  • Super Contributor
  • ***
  • Posts: 2959
  • Country: ca
Quote
......... Not to mention all the "incidental" discoveries along the way that subsequently proved to be valuable in their own right.
How right you are!  I have completely missed that one...
Something I should have noticed, as I am a specialty troubleshooter and am usually contracted to come in after the fact, clean up other engineers' problems, and explain what they've done wrong.
« Last Edit: August 22, 2018, 05:08:32 am by BrianHG »
__________
BrianHG.
 

Offline jmelson

  • Super Contributor
  • ***
  • Posts: 1241
  • Country: us
Quote
There is also the distinct possibility that by sending back "advanced" technology that you will actually slow down technological development.
Yes, for a number of years in the '70s and '80s, it was said, fairly seriously, that the IBM 360 slowed down computer advancement in the Soviet Union by at least a decade.  Instead of thinking ahead, they blindly copied a design that was already a little dated when introduced in 1965.  They did develop their own circuit technology, but they just STOLE the OS, compilers, etc.

Jon
 
The following users thanked this post: Richard Crowley

Offline Circlotron

  • Super Contributor
  • ***
  • Posts: 1613
  • Country: au
Instead of sending back the latest and greatest computer that would be impossible for them to reproduce, why not something more in reach like say a Z80 or even a 68000 based system? Still quite a jump from what they had at the time. They might be way impressed instead of just hopelessly bamboozled. A case of less is more.
 

Offline TerraHertz

  • Super Contributor
  • ***
  • Posts: 3569
  • Country: au
  • Why shouldn't we question everything?
    • It's not really a Blog
Also, do we want to advance computer evolution?

http://www.smbc-comics.com/comic/captcha
Collecting old scopes, logic analyzers, and unfinished projects. http://everist.org
 
The following users thanked this post: BrianHG

Offline Mr. Scram

  • Super Contributor
  • ***
  • Posts: 7905
  • Country: 00
  • Display aficionado
Quote
Also, do we want to advance computer evolution?

http://www.smbc-comics.com/comic/captcha
Why not? Humans will go extinct eventually, and whether our evolutionary branch ends there, we're superseded by our own evolved selves, or we're superseded by our own technology isn't wholly relevant to us. Thinking our species can and should remain relevant is just more typical human hubris. Whatever happens, we've had a pretty good run.
 

Offline RayHolt

  • Contributor
  • Posts: 5
  • Country: us
"1970 The first practical IC microprocessor (Intel 4004) in 1970."

I could not think of a more IMPRACTICAL small computer chip set than the 4-bit 4004. It was so incapable that Intel quickly replaced it with the 8008 and then the 8080.  Dated LATE 1971, and not really used much until 1974, as engineers did not know how to program it.

On the practical side: a 20-bit microprocessor chip set actually flew a mil-spec fighter jet in 1970, all with the exact same technology as the 4004.
http://FirstMicroprocessor.com
 
The following users thanked this post: Richard Crowley

Offline Berni

  • Super Contributor
  • ***
  • Posts: 2411
  • Country: si
That is quite the impressive 20bit chip.

You have to keep in mind that the Intel 4004 CPU was originally designed to run a calculator, and that's it. Any electronic calculator of the time had to be heavily optimized for part count, so they were never powerful. The humans using them are very slow, so they didn't need to calculate particularly fast either. Also, the chip design processes of the time were very primitive: the schematics for the whole CPU were drawn by hand, and the manufacturing masks were laid out onto transparent film by hand using black tape, then optically reduced down to the final mask. The manufacturing process was also not as sophisticated, so yield might suffer if the chip got too big. To top it off, this chip was designed on an incredibly tight deadline, so the thing had to be kept as simple as possible and use as few transistors as possible.

What you get as a result of this "transistor penny pinching" is a very dumb and clumsy CPU. But it was plenty powerful to run the calculator it was designed for. Only after it was designed was it realized how useful a CPU on a chip is, and that it's worth putting in more effort to make one better.

You could see the legacy of transistor penny pinching in the good ol' PIC16F series of MCUs. Memory addressing is horrible, as it can only address 128 bytes of memory at a time (including hardware registers) so paging needs to be constantly used, there are barely any CPU registers so you constantly have to shuffle things to RAM and back, the call stack is in hardware and limited in depth, and the instruction set leaves a lot to be desired. Yet this crappy MCU got everywhere because it was cheap.
 
The following users thanked this post: Richard Crowley

Offline KL27x

  • Super Contributor
  • ***
  • Posts: 3444
  • Country: us
Quote
it can only address 128 bytes of memory at a time (including hardware registers) so paging needs to be constantly used
128 bytes of data memory (including hardware registers, so maybe 96 or so bytes of actual user memory) per bank, not per page.

The pages are for the program memory. So both banking and paging must be constantly used. |O But pages are larger, at least.

Also, you have to remember what the competition was back then, and what it took for a chip company to succeed. There was no internet. There was no Adafruit or Sparkfun teaching people how to buy a chip, wire it up, and make stuff as a hobby. There was no ICSP; early chips were OTP. The cost of microcontrollers wasn't as dirt cheap as it is now. The reason they succeeded may have had something to do with the infrastructure, toolchain, documentation, availability, and marketing, in addition to whatever the price point per device was.  And also the peripherals. I believe they were a bit novel in making the PIC available to individual engineers, students, and small businesses, whereas some of the other chips were sorta only available to specific industries or buyers who wanted thousands of chips, minimum.

As for the OP... Steve Jobs would have been a motivational speaker living in a van down by the river?
« Last Edit: January 10, 2019, 07:45:39 am by KL27x »
 

Offline Berni

  • Super Contributor
  • ***
  • Posts: 2411
  • Country: si
Ah yes, sorry, I meant to say banking rather than paging. But the worst thing about the PIC16F family is really the CPU registers. You only get a single 8-bit accumulator (W) as your general purpose register to actually do stuff. All other registers are for status, the PC, and the awful memory addressing.

And yes, Microchip always tried to make their chips as accessible as possible; cost has to do with it too.

The importance of getting your chips into the hands of the average Joe did show, too. Their biggest competitor, Atmel, was making 8-bit MCUs for the exact same market with an architecture superior to the PIC's, yet Microchip still kept a big following and their chips found their way into so many products. Atmel was still very successful, but you would expect them to sell even better since they made better chips.
 

Offline Richard Crowley

  • Super Contributor
  • ***
  • Posts: 4310
  • Country: us
  • KE7GKP
Quote
I could not think of a more IMPRACTICAL small computer chip set than the 4-bit 4004.
"Practical" in the sense that it was used in a commercial product and sold for a profit by the manufacturer.  i.e. not  laboratory curiosity or a secret government project.

Quote
It was so incapable Intel quickly replaced it with the 8008 and then the 8080.  Date LATE 1971 and not really  used much until 1974 as engineers did not know how to program.
Here in the commercial, for-profit world, the concept of a monolithic processor evolved into the 8008 and then the 8080. The 4004 had a 10-year product life-span. Even today, very small-scale microcontrollers are very practical for many applications.  Ben Heck recently posted a video about the ATTINY10 microcontroller in a SOT-23-6 package, and Dave has been doing videos about the 3-cent Padauk PMS150C in SOP-8.


Quote
On a practical side a 20-bit microprocessor chip set that actually flew a mil-spec fighterjet in 1970 and all with the exact technology as the 4004.
http://FirstMicroprocessor.com

A very remarkable accomplishment, and well ahead of what Intel was doing.  Too bad it was hidden away for so long because of government/military secrecy. Also too bad such technology was apparently unknown to NASA. The MIT-developed technology for the Apollo Guidance Computer seems pretty crude by comparison. They even used hand-woven mag core technology to store the firmware.

Your story deserves better coverage.  Have you contacted the computer history people? There are many videos on YouTube covering much more mundane projects than what you are describing.

The 4004 was developed rather from the opposite end of the process.  Federico Faggin had to "sell" Intel management on making a general-purpose, programmable solution for Busicom's desk calculator.  And then Intel had to buy back the rights from Busicom in order to sell it as a general commercial product.  Very interesting to hear your scenario of just pouring more money/resources into the project to meet the schedule.

Mr. Holt's video is very interesting and amazing that it is not better covered by documentary videos, etc.

https://youtu.be/aVEm5SSUULc
« Last Edit: January 10, 2019, 01:25:48 pm by Richard Crowley »
 

