There is no single chip that comes anywhere near the amount of hardware that I am aiming for.
Are you sure? A hundred MachXO3Ls would probably be covered by a single high-end Xilinx device.
The XO3 is just to make things easier at first, thanks to its nonvolatile setup and such. The ECP5 is the low-cost part I aim to use once I'm acquainted with the hardware. It's true that 100 of these chips could be covered by some high-end FPGAs, but that's significantly more expensive, and even of the highest-end FPGA I'd need multiple units. 100 MachXO3s was a guesstimate for one module; with stackable modules I might end up with over 100 modules some day (100,000 MachXO3s, though I'd probably switch to the ECP5 at that point, as that's a lot of money).

The ECP5 can be had at about $1.20 per thousand CLBs. The only thing close to that is Intel's Cyclone V E at $1.80 (a new surprise I discovered; might be worth the saved board space and simpler hardware management, though costlier if one breaks, and less surface area for heat if that's an issue). Above the Cyclone V E, price per thousand CLBs climbs rapidly, up to around $55 per thousand. Things like RAM are irrelevant, since the amount that'll be needed doesn't exist even in high-end FPGAs.

Am I using the wrong spec? I know from one of Dave's videos that FPGAs are a bit hard to compare, and I don't believe all logic elements are created equal, so I figure Digi-Key's "Logic Blocks" filter is what I should use to get a general comparison.
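The comparison described above boils down to normalizing unit price by logic-block count. A minimal sketch of that arithmetic (the device names and raw price/block figures below are made-up placeholders, chosen only so the results reproduce the $1.20, $1.80, and $55-per-thousand figures quoted in the post):

```python
# Sketch: normalize FPGA unit price by logic-block count.
# The candidate entries are illustrative placeholders, not real quotes.

def price_per_k_blocks(unit_price_usd: float, logic_blocks: int) -> float:
    """Dollars per 1000 logic blocks for one device."""
    return unit_price_usd / (logic_blocks / 1000)

# Hypothetical devices: (unit price in USD, logic-block count).
candidates = {
    "budget_part":  (12.00, 10_000),    # works out to $1.20 per 1000
    "midrange_part": (54.00, 30_000),   # works out to $1.80 per 1000
    "highend_part": (2_200.00, 40_000), # works out to $55.00 per 1000
}

# Cheapest per logic block first.
for name, (price, blocks) in sorted(
    candidates.items(), key=lambda kv: price_per_k_blocks(*kv[1])
):
    print(f"{name}: ${price_per_k_blocks(price, blocks):.2f} per 1000 blocks")
```

This is the per-unit view only; it ignores the board, power, and interconnect costs of using many small parts instead of one large one.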
Realistically you can interconnect 6-8 high-end FPGAs on a single board, but such boards are already sold as finished products, and it is cheaper to buy one.
With 100 chips you will go mad ensuring signal integrity and writing constraint files.
Those finished products you speak of are too small; no FPGA is big enough, and there is never an end to the power needed.
100 chips: the idea is to make one single module, get it going well, and make tons of those to just "plug in" to each other. Might be fewer per module, might be more; we'll see. Signal integrity in FPGAs isn't something I know much about, so that might be one of the unknowns I was looking for, but every problem has a solution.
Starting out, I've selected an FPGA that has nonvolatile, instant-on, background-reconfigurable configuration memory, as this takes care of a lot of FPGA issues.
Like what? The non-volatile chips tend to be small, the bigger chips all use some form of EEPROM storage for the bitstream. In addition, the EEPROM can contain initial memory contents and, in some cases, multiple images.
I was imagining that nonvolatile configuration eliminates the need for a processor to program them all (save for the initial programming). "Instant on", on top of the nonvolatile config, eliminates the need to manage a lot of stuff like power disconnects; no need to set it up to reload the configuration from memory. No boot screen or anything, just bam, your computer's on and fully loaded, lol. More of a "wouldn't that be cool" kind of feature I'm hoping for.
Most importantly, background reconfiguration: being able to reconfigure the chip as it's running. Say one FPGA is "running Firefox" and we close Firefox in the operating system. With background programming we can immediately "stop Firefox and load iTunes", a seamless transition to reconfigure (probably not 100% seamless), rather than a ton of middle work like powering that single FPGA down, turning it back on, and going through some configuration process. My thinking is it makes things simpler (less on my plate initially) and takes less time to change what the FPGA is doing, and they'll all be changing what they do frequently, so that translates to clicking Firefox and bam, Firefox is open. Then again, I've never programmed an FPGA, so I may be way off.
In my view, you are attributing capabilities to FPGAs and compilers that simply don't exist. We can use C to describe hardware (I have never done it; I use VHDL), but when we want to start running application code, there needs to be some kind of processor to run it, unless the application can be fully described in hardware. Maybe it can...
I imagine I'd have to make my own compiler. I'm not really sure about it, but if it can't be helped, a soft (or hard) processor core can be used for compatibility with normal programs. I believe it's doable, though: take every single construct in C, then Python, and all that, and translate it to VHDL.
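To make the "translate it to VHDL" idea concrete: a toy sketch (nothing like a real C-to-HDL tool, and the function and entity names here are invented for the example) that emits a VHDL entity for the C expression `c = a + b`. The point is that a C expression can map to pure combinational logic with no processor involved, while control flow, pointers, and dynamic memory are where real tools get hard.

```python
# Toy sketch: emit VHDL for a fixed combinational expression, a vastly
# simplified picture of what C-to-HDL tools do. Names are made up.

def emit_adder_vhdl(entity_name: str, width: int) -> str:
    """Generate a VHDL entity computing c = a + b on unsigned inputs."""
    return f"""library ieee;
use ieee.std_logic_1164.all;
use ieee.numeric_std.all;

entity {entity_name} is
  port (
    a : in  unsigned({width - 1} downto 0);
    b : in  unsigned({width - 1} downto 0);
    c : out unsigned({width} downto 0)  -- one extra bit for the carry
  );
end entity;

architecture rtl of {entity_name} is
begin
  -- The C expression `c = a + b;` becomes pure combinational logic.
  c <= resize(a, {width + 1}) + resize(b, {width + 1});
end architecture;
"""

print(emit_adder_vhdl("adder8", 8))
```

A real compiler would also have to schedule sequential statements across clock cycles, infer state machines from loops, and share hardware between operations, which is exactly why C-to-HDL is a research field rather than a weekend project.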
https://en.wikipedia.org/wiki/C_to_HDL This link seems to suggest C-to-HDL tools already exist. It's not something I've given too much thought, though; "we'll get there when we get there". At worst, I've gotta plop in a processor for compatibility. I'm really just wanting it to be usable for normal everyday stuff; serious use of it would all be programs written specifically for the hardware/setup. It'd be cool to have hardware that can do major things in the background without slowing anything else down, like some advanced AI programs that view your camera and in the background save everyone's face, create 3D maps of all the places it sees, listen to people's voices and gather data to make deepfake profiles, crack the encryption of every single Wi-Fi SSID it sees before you even think about connecting to one, mine Bitcoin, host a website, listen to all radio frequencies and analyze and/or record all communications in plain text (then further filter out which ones aren't of interest to the user), do a detailed analysis of every single stock on the stock market and report its best predictions (YOUR AI vs. some service), control a swarm of 500 drones as individual units (not just a single mass they all stick to), hack into traffic cams as you drive along and make you priority number one for green lights wherever you go, monitor your wife's data (vocal tone, wording, hormones if you've got an olfactometer or something like that) along with input from AI-built "rules of engagement" to help let you know what to do when she's angry, synthesize AI-made music according to your iTunes library and what it thinks you'd like, host the Matrix, watch what you are doing on the computer and how you react to each thing and then predetermine how the rest of your life is gonna go after it's gathered enough data, host physics simulators detailed enough that one is built to simulate the human body's response to experimental surgeries and procedures...
allowing AI to come up with new medicine and find the way to upload our minds onto these FPGA supercomputers (or them onto our brains), stuff like that, all in the background while not slowing down the loading of Club Penguin one bit.