I'm not sure I got what the OP meant in this topic.
The iCE40s are cheap devices, Lattice provides free tools for all of them (AFAIK; I've only used the iCE40UP line so far), and you can embed or design your own programming "probe" for pretty cheap. The same applies to most FPGAs, at least for entry- to mid-level parts. Programming a Xilinx part only requires an FTDI chip, and the same goes for Lattice.
Really? There's no proprietary programmer I have to buy? Any generic FTDI programmer will work? I guess I haven't seen anyone do this; all the guides online show people using dev boards. I assumed you had to program in some sort of bootloader using a proprietary programmer. Does this mean I could theoretically use an existing iCE40 dev board, which would have the FTDI chip (?), as an in-system programmer for other iCE40 chips?
The FT2232H Mini Module works with Lattice Diamond.
I don't use dev kits (no reason to use them) and we use the free tools for everything here. That includes the Microsemi ProASIC3E 3000, various Artix-7, Spartan-6 and Spartan-3A, and Intel Cyclone parts. I used a Lattice MachXO2 for a little side project.
So how do you test your ideas beforehand? I don't really have the balls or the patience to have a PCB manufactured before I'm sure what I'm doing is going to work.
Well, let's just say I've been doing this professionally for a looooong time.
And with the kind of products we design, there's no point to "testing ideas beforehand" on an eval board. Not when the products have multiple high-speed LVDS DDR serial ADCs for sensor-output digitizing, a sequencer/controller for the sensor, multiple slower SPI ADCs for housekeeping, a gigabit communications channel implementing a custom protocol over fiber, a command parser, a system initializer module, a side-band serial (UART) interface, on-board non-volatile parameter storage, and so on. If you have to design a board (or boards) to glue onto an eval kit, well, you might as well just design the whole damn board that goes into the product.
I spend a lot of time simulating the design, which means finding or writing models of the various peripherals. It means testing and functional verification of all of the low-level modules. It means scratching out timing diagrams and sequences on quadrille paper before even thinking about writing VHDL, and certainly before synthesis and place-and-route. When I do get to the fitter tools, most of the time is spent closing timing. Sometimes that's easy. Sometimes it's not, so you spend time pipelining and whatnot until you meet timing, and since adding pipeline stages changes the design, you have to go back and re-verify the code to make sure you account for the added latency.
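To make the latency point concrete, here's a toy sketch (made-up module, not from any real design) of the kind of pipeline register you end up adding to close timing. Note the extra clock of latency the rest of the design now has to account for:

```vhdl
library ieee;
use ieee.std_logic_1164.all;
use ieee.numeric_std.all;

entity mult_add is
    port (
        clk    : in  std_logic;
        a, b   : in  signed(15 downto 0);
        c      : in  signed(31 downto 0);
        result : out signed(31 downto 0)
    );
end entity mult_add;

architecture rtl of mult_add is
    -- Extra register between the multiplier and the adder, added purely
    -- to close timing.
    signal prod_r : signed(31 downto 0) := (others => '0');
begin
    process (clk)
    begin
        if rising_edge(clk) then
            -- Stage 1: register the product so the multiply and the add
            -- aren't in the same combinational path.
            prod_r <= a * b;
            -- Stage 2: the result now trails the inputs by two clocks
            -- instead of one.
            result <= prod_r + c;
        end if;
    end process;
end architecture rtl;
```

The change looks trivial, but every downstream module and every testbench that assumed one clock of latency has to be re-checked, which is why the re-verification pass isn't optional.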
It's called engineering.
Before we send first-article boards out for fab, enough of the FPGA design is complete to say that our pinout choices will work. This means understanding the chip architecture so you can tell the layout person, "These pins can't change because they're clocks, these pins have to be on a 3.3 V bank, these pins are LVDS so they have to be on the 2.5 V bank, these pins connect to the gigabit transceiver so they can't change, these pins are for the memory interface and can't change." And when layout settles on the pinout, those assignments go straight into the design and get checked by running the FPGA fitter tools against them.
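For illustration only (the names and pin groupings are invented, not from any real board), those notes to layout usually boil down to something like the comments on a top-level port list:

```vhdl
library ieee;
use ieee.std_logic_1164.all;

entity sensor_top is
    port (
        -- Clock from the ADC: must land on a clock-capable pin, can't move.
        adc_dco_p, adc_dco_n      : in    std_logic;
        -- LVDS serial data from the ADCs: has to sit on the 2.5 V bank.
        adc_d_p, adc_d_n          : in    std_logic_vector(3 downto 0);
        -- Housekeeping SPI: needs a 3.3 V bank, exact pins are flexible.
        hk_sclk, hk_mosi, hk_cs_n : out   std_logic;
        hk_miso                   : in    std_logic;
        -- Memory interface: pinout fixed, can't change.
        mem_dq                    : inout std_logic_vector(15 downto 0);
        -- Gigabit transceiver: dedicated pins, no choice at all.
        gt_tx_p, gt_tx_n          : out   std_logic;
        gt_rx_p, gt_rx_n          : in    std_logic
    );
end entity sensor_top;

architecture rtl of sensor_top is
begin
    -- Architecture body omitted; only the port list matters here.
end architecture rtl;
```

Once layout hands back actual pin numbers, they get entered as constraints and the fitter confirms the bank, I/O-standard, and clock-pin rules before the board ever goes out.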
So when we get first-article boards back from assembly, a lot of the design already works. A lot of what doesn't work comes from misunderstanding the data sheet for some other part. (You've read data sheets, right? So you know that they are a fountain of correctness and completeness!) Sometimes those other parts simply don't work as advertised. I spent way too much time trying to figure out why an ADC, with the simplest of interfaces, was giving us nonsensical noise histograms. I tried all manner of control timing changes to get it to work properly, and it turns out that the ADC quantizer is broken. Well, it meets the published spec, which means the 16-bit converter is really only good for 12 bits over the full temperature range.
I'm a believer in in-circuit debug tools like ChipScope, but not as a substitute for simulation and verification prior to synthesis. You want to start synthesis with the idea that your design is correct. I know there are a lot of burn-and-crash guys who pooh-pooh simulation, but when the designs get large, you end up wasting a lot of time analyzing in-system. This is exacerbated by long tool run-times. But! ChipScope is great for telling you that the peripheral connected to the FPGA doesn't behave the way you think it should. When that happens I update my models, re-simulate, then re-spin with the FPGA tools. And then you realize, like with the ADC above, that the part doesn't work ...
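Those peripheral models don't have to be fancy. Here's the flavor of thing I mean: a made-up behavioral model of a generic convert-start/busy/data ADC (the part, ports, and timing are invented for the example). When in-system debug shows the real part doing something different, the fix is usually a constant or a wait statement here, then a re-simulation:

```vhdl
library ieee;
use ieee.std_logic_1164.all;
use ieee.numeric_std.all;

entity adc_model is
    generic (
        -- Conversion time in clocks; an assumed number, not from any datasheet.
        T_CONV : natural := 12
    );
    port (
        clk    : in  std_logic;
        convst : in  std_logic;                      -- conversion start
        busy   : out std_logic;
        dout   : out std_logic_vector(15 downto 0)
    );
end entity adc_model;

architecture behav of adc_model is
begin
    process
        -- Canned ramp data: good enough to verify the interface logic.
        variable sample : unsigned(15 downto 0) := to_unsigned(0, 16);
    begin
        busy <= '0';
        wait until rising_edge(convst);
        busy <= '1';
        -- Model the conversion delay.
        for i in 1 to T_CONV loop
            wait until rising_edge(clk);
        end loop;
        busy <= '0';
        dout <= std_logic_vector(sample);  -- data held until the next conversion
        sample := sample + 1;              -- ramp so successive reads differ
    end process;
end architecture behav;
```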
Anyway, all of the above is the difference between the hobbyist and the professional. And we like to say, "If it was easy, everybody would be doing it."
Good luck.