Thanks for all the answers. I ended up with a nifty solution.
This smells like a continuous integration server (Jenkins/Bamboo) with automated builds...
Spot on. Though entirely custom, it's fast, feature-rich, and the next step in how hardware and software are built and tested.
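For anyone curious what "automated builds" boils down to here: at its core such a pipeline is just an ordered build → flash → test loop that stops on the first failure and reports per-stage results. A minimal sketch; the stage names and stub steps below are hypothetical stand-ins for real build/flash/test commands, not the poster's actual system:

```python
# Minimal CI-style pipeline runner: run ordered stages, stop at the first
# failure, and report per-stage results. Stage bodies are stubs standing in
# for real "build firmware", "flash board", "run on-target tests" commands.

def run_pipeline(stages):
    """stages: list of (name, callable) pairs; each callable returns True/False."""
    results = []
    for name, step in stages:
        ok = step()
        results.append((name, ok))
        if not ok:
            break  # fail fast, like a CI server marking the build red
    return results

if __name__ == "__main__":
    pipeline = [
        ("build",  lambda: True),   # e.g. cross-compile the firmware image
        ("flash",  lambda: True),   # e.g. write the image to the boot medium
        ("test",   lambda: False),  # e.g. run the on-target test suite
        ("report", lambda: True),   # never reached once a stage fails
    ]
    for name, ok in run_pipeline(pipeline):
        print(f"{name}: {'PASS' if ok else 'FAIL'}")
```

A real server (Jenkins, Bamboo) adds scheduling, history, and notifications on top of exactly this fail-fast loop.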
So they're doing it wrong too? I really doubt that.
The folks who worked on this are amazing and a great inspiration! I've spoken with them at length about this problem, and while this was a fast solution at the time, there is much room for improvement.
You should distinguish between different use cases:
1. Testing software during development: this is much faster in a virtualized environment (a soft test bed).
2. Testing boards during manufacturing: this can be done with a bunch of SD cards.
Of course, if you have hardware that can't be virtualized, then you have to test on the hardware itself, where such an SD card switch or even an SD emulator has its benefits.
And as always, there are many ways to achieve the same goal.
So feel free to doubt anything you wish... but I've been there. Just sharing my experience.
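To make the SD-card-switch idea concrete: the usual trick is a mux that routes the card either to the host (for flashing) or to the device under test (for booting). A sketch of that control flow, assuming nothing about the actual rig; the `Gpio` class below is a stand-in, since the real mux hardware and select pins are board-specific:

```python
# Sketch of driving an SD mux: route the card to the host to write a fresh
# image, then route it to the DUT and power-cycle so it boots that image.
# Gpio is a stand-in; a real rig would toggle an actual GPIO line.

class Gpio:
    """Stand-in for a real GPIO line controlling the mux select pin."""
    def __init__(self):
        self.level = 0

    def set(self, level):
        self.level = level

HOST, DUT = 0, 1

def provision_and_boot(mux_select, flash_image, power_cycle):
    mux_select.set(HOST)   # card appears to the host as a block device
    flash_image()          # e.g. write the test image onto the card
    mux_select.set(DUT)    # hand the card over to the device under test
    power_cycle()          # DUT boots the image it was just given
    return mux_select.level
```

An SD emulator replaces the physical card entirely and serves the image over the SD bus, which removes card wear and swap time but is the same flash-then-boot cycle.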
It's quite amazing how many sharp people tell someone they are wrong or doing things "inefficiently" without even understanding the problem. You have made so many assumptions without even attempting to understand my use case.
I'm not using x86, but custom RISC hardware. I've already tested in software, and trying to emulate it in virtualization would be 100x slower than hardware and would require FPGAs. E.g., how do you "test" HDMI/LVDS bring-up, or the hardware-accelerated network features of your NIC, in software?
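One concrete way such bring-up tests can be automated is to capture the DUT's serial console during boot and check it for the drivers' success messages. A sketch under assumptions: the marker strings below are invented examples, and real ones would come from your actual kernel/driver logs:

```python
# Check a captured boot log for expected bring-up markers. The marker strings
# are invented examples; real ones come from your kernel/driver output.

EXPECTED_MARKERS = [
    "hdmi: link trained",      # hypothetical HDMI/LVDS bring-up message
    "eth0: offload enabled",   # hypothetical NIC hardware-offload message
]

def check_boot_log(log_text, markers=EXPECTED_MARKERS):
    """Return the list of markers missing from the captured console output."""
    return [m for m in markers if m not in log_text]

sample_log = "booting...\nhdmi: link trained\neth0: offload enabled\nlogin:"
```

The point stands that this only proves anything when the log comes from real silicon: a virtualized board would happily print whatever its model was told to print.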
What's the difference between theory and practice? In theory, there is no difference. If you are not testing on actual hardware, you are shipping your customers lots of bugs.