So PCs are x86 CPUs, while Apple, Android, and most other things are ARM?
Mostly now yeah. But:
- Apple and other manufacturers (with support from Microsoft) are looking into adopting ARM platforms for their laptops. Some of the newer smartphone chips should reportedly compete with Intel Celerons and Pentiums, which is plenty for a lot of people.
- There are also x86 Android machines with Intel Atoms, but they haven't really caught on.
What is used inside an Arduino or a PIC? I know these aren't x86 or ARM, but are they simpler versions of this?
The original Arduino uses the AVR core.
The PIC family has some variety in cores, like the PIC16 and PIC18, which are slightly different. However, their general design philosophy remains the same.
Both chips have 8-bit CPUs and take different approaches to the microcomputer problem (for example, Harvard vs. von Neumann architectures).
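To make that Harvard point concrete: on the AVR, flash (program memory) and SRAM (data memory) are separate address spaces, so constants you keep in flash have to be fetched with dedicated instructions. avr-libc exposes this as PROGMEM and pgm_read_byte(). A minimal sketch, assuming avr-gcc and avr-libc:

```c
/* Sketch only: on the AVR's Harvard architecture, program memory (flash)
 * and data memory (SRAM) are separate address spaces, so a constant kept
 * in flash is read with pgm_read_byte() instead of a plain pointer
 * dereference (which would be enough on a von Neumann machine). */
#include <stdint.h>
#include <avr/pgmspace.h>

const char greeting[] PROGMEM = "hello";   /* lives in flash, not copied to SRAM */

char read_flash_char(uint8_t i)
{
    return pgm_read_byte(&greeting[i]);    /* LPM instruction under the hood */
}
```

On an x86 or ARM machine the same constant would just be a normal array you index directly; the split address space is the visible consequence of the Harvard design.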
ARM is yet another flavour.
And x86, in theory, is too.
However, after many years of CPU generations, people noticed two groups starting to appear: RISC and CISC processors. RISC CPUs use very simple instructions, so to do anything meaningful you need many of them. CISC tries to pack a complex operation into a single instruction. It turns out that building CISC processors is really hard, so you often see modern x86 implementations translating CISC instructions into RISC-like operations using microcode inside the CPU.
That is not to say RISC processors can't have microcode as well; it's a common approach to implementing the controller for a datapath.
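As a toy illustration of the RISC/CISC split (hypothetical code, not any real instruction set): the same "add a value to a word in memory" can be seen as one CISC-style operation, or as the three simpler micro-ops that a modern x86 front end would roughly break it into.

```c
/* Toy model of CISC vs RISC: the names and the decomposition are made up
 * for illustration, but they mirror the load/modify/store split that
 * x86 micro-op translation performs internally. */
#include <stdio.h>
#include <stdint.h>

static uint32_t memory[16];                  /* pretend data memory */

/* CISC view: one "instruction" that reads, adds and writes back. */
static void cisc_add_to_mem(unsigned addr, uint32_t value)
{
    memory[addr] += value;
}

/* RISC view: the same work as three simple steps (micro-ops). */
static void risc_add_to_mem(unsigned addr, uint32_t value)
{
    uint32_t reg = memory[addr];             /* micro-op 1: load  */
    reg += value;                            /* micro-op 2: add   */
    memory[addr] = reg;                      /* micro-op 3: store */
}

int main(void)
{
    memory[3] = 40;
    cisc_add_to_mem(3, 2);                   /* memory[3] becomes 42 */
    memory[7] = 40;
    risc_add_to_mem(7, 2);                   /* memory[7] also becomes 42 */
    printf("%u %u\n", (unsigned)memory[3], (unsigned)memory[7]);
    return 0;
}
```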
Does software dictate the processor, or is it the other way around?
Software does mostly dictate what we use. The only reason we like to keep x86 around is that all our software runs on it. The industry's computing demands keep growing, and that software runs on x86, so we want faster x86 chips.
We of course also want faster microcontrollers and ARM SoCs in smartphones, but those applications are heavily tied to power budgets. You can't fit a 65 W CPU in a smartphone without it catching fire or burning your hand. So why would a vendor build a 65 W ARM CPU nobody wants to buy? It can't fit in a smartphone, and PC software doesn't run on it (yet).
If a player like Microsoft is able to make the Windows platform agnostic (i.e. the whole ecosystem runs on both ARM and x86), then you will probably see this situation change.
Will we ever see another architecture come out, hypothetically, or have we found that "round is the best shape for the wheel", with ARM being a circular wheel and x86 being a 900-sided polygon that we add sides to as new technology comes out?
Actually, Intel went one step further and placed a bet on VLIW processors with the Itanium series. They abandoned it a few years ago. Again, it's not that it was an inherently bad architecture, but there were some difficulties, and above all we had x86-64, which was far easier to adopt.
In addition, I see CPU architectures as something where the "ultimate" design is a unicorn. Let's face it: you cannot make memory buses infinitely wide, memories infinitely large, or have an infinite number of instructions or registers. In the end you decide to make an instruction 32 bits because that's what current technology can cost-efficiently support. Then you have a trade-off over how you devote those bits to what. Some trade-off may prove better in one scenario, but there are a dozen more scenarios...
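To make that bit-budget trade-off concrete, here is a sketch of a hypothetical 32-bit instruction encoding (the field layout is invented for illustration, not taken from a real ISA): once you spend 6 bits on the opcode and two 5-bit register fields, only 16 bits remain for an immediate, and wanting 64 registers instead of 32 would have to steal bits from one of the other fields.

```c
/* Hypothetical 32-bit instruction word, for illustration only:
 *   [31:26] opcode  (6 bits -> up to 64 opcodes)
 *   [25:21] rd      (5 bits -> up to 32 registers)
 *   [20:16] rs1     (5 bits)
 *   [15:0]  imm16   (whatever is left over) */
#include <stdio.h>
#include <inttypes.h>

static uint32_t encode(uint32_t opcode, uint32_t rd, uint32_t rs1, uint32_t imm16)
{
    return ((opcode & 0x3F) << 26) |
           ((rd     & 0x1F) << 21) |
           ((rs1    & 0x1F) << 16) |
            (imm16  & 0xFFFF);
}

int main(void)
{
    uint32_t insn = encode(0x08, 3, 1, 100); /* e.g. a made-up "addi r3, r1, 100" */
    printf("0x%08" PRIX32 "\n", insn);
    return 0;
}
```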
Or is a Turing machine a Turing machine, and coming up with something else just doesn't make sense, given that binary, transistor-based gates make the abstract 1/0 take physical form?
If a system is Turing complete, it means it can simulate a Turing machine. A Turing machine in turn can model any algorithm you can think of. It will do it very slowly, and will never be used in practice, but fundamentally it's possible.
You can build hardware accelerators that perform only one task, and do it well (and power-efficiently), but you can't use that as a general-purpose computer.
The Turing machine itself has nothing to do with logic gates. Computer scientists describe it using sets and tuples; no notion of 1s and 0s is strictly necessary here.
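As a toy illustration of how little machinery the definition needs, here is a tiny Turing machine written out in C: just a state set, a tape alphabet, and a transition function, with no gates in sight. The particular machine (flip every bit, halt at the first blank) is made up for the example.

```c
/* Toy Turing machine: states, tape symbols and a transition function.
 * This one walks right over a binary tape, flips every bit, and halts
 * when it reaches the first blank cell. */
#include <stdio.h>
#include <string.h>

#define BLANK '_'

int main(void)
{
    char tape[32] = "1011";                     /* initial tape contents */
    memset(tape + 4, BLANK, sizeof tape - 5);   /* rest of the tape is blank */
    tape[31] = '\0';

    enum { FLIP, HALT } state = FLIP;           /* the (tiny) state set */
    int head = 0;                               /* tape head position */

    while (state != HALT) {
        char sym = tape[head];
        /* transition function: delta(state, symbol) -> (write, move, next state) */
        if (sym == '0')      { tape[head] = '1'; head++; }
        else if (sym == '1') { tape[head] = '0'; head++; }
        else                 { state = HALT; }  /* blank: stop */
    }

    printf("tape: %.10s\n", tape);              /* "0100______" */
    return 0;
}
```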
I think if quantum computers become common you won't actually own one; instead you'll buy a monthly plan where you get a percentage of its resources or its time, and your device will just be a terminal. A wireless mesh network could make this a great idea with nothing centrally controlled, but if we keep the monopoly with ISPs like we do now, it would be a dark and expensive monopoly. I've heard ISPs want to change their business model to the cable TV model, where you buy packages like you do with TV: a movie channel package, a sports package, etc. They would offer a search engine package, a social media package, or start selling things à la carte. This would make the internet very expensive (and highly profitable) and cost us our internet freedom. Having worked at Comcast, I can say they would love this idea. When Netflix stopped mailing DVDs and started streaming, the CEO said "Netflix is making an extra bazillion dollars a year and WE ARE ENTITLED TO THAT MONEY". Funny that he used "entitled", since that party is all about ending "entitlements" like the Social Security that we are in fact entitled to, because unlike Comcast we paid into it. Damn right we should get our entitlements. But that's the end of my political rant; couldn't help myself.
First of all, I think you need to be less paranoid.
Secondly, I don't think we will have quantum computers in the hands of consumers for decades to come, if there is ever a practical implementation of them.
When making chips, do they all follow a basic design that we found to work best over the years, unless you want something very specialized, in which case you use an ASIC? But it seems even an ASIC can be replaced by an FPGA, unless you need maximum efficiency, like a Bitcoin miner where each watt of heat means lost profit or makes it unprofitable.
"Found to work the best" => proven to improve designs in academia and commercial research centers at the big manufacturers.
The only difference between an ASIC and a "normal" chip is the number of applications it targets. Both are designed using the same techniques.
FPGAs are similar to design for and more accessible, but both worlds have their ins and outs, tricks, and gotchas.
ASICs are more power-efficient and let you really work at the gate level, but they only make sense at sufficient volume.
Seems like in the future everything will contain an ATmega-style Arduino chip at its core and you'll just build your device around it. Alarm clock? Throw a $1 ATmega in it. Dishwasher? ATmega. Then repairing them would be possible, since they'd just be interchangeable modules. A whole industry could come from this: a company needs a product based on this chip, and other companies could offer the firmware to make it work, leaving the first company to better spend its R&D on making the product instead of writing code. If I were an EE and wanted a Star Trek house, everything would work like this.
Companies want to make money to please their shareholders. They have no interest in pleasing customers.
Cynicism aside, manufacturers really have no incentive to work together. I think the only industry where you see that happen is the automobile industry, in particular the more price-pressured market segments. Again, this is purely financially driven: developing a platform for a car that must not cost more than $10k is almost impossible, unless you can team up to double or triple your effective sales figures.