In 2000 I was working at a high-speed industrial camera company on a team to build an extremely ambitious device: A camera that could take video at 100,000 frames per second while being subjected to 100G forces. We wanted to eliminate film from the industrial market, where so-called "streak" cameras could have frame rates up to 1,000,000 frames per second (most famously used to record nuclear bomb tests).
As an embedded real-time software engineer, my job was to define and implement the interface layer between the low-level camera hardware and the external high-level user interface. The camera hardware was a fiercely complex dynamic state machine, while the user interface exposed many controls that could easily interfere with each other (the valid range of parameter X depended on the settings of parameters Y and Z, each of which depended on A, B, and C).
My solution was to get the hardware engineers to implement a "what if" capability, where the camera would return an error rather than silently accept an invalid setting. This capability allowed both the low-level hardware and the user interface to evolve almost independently, while my middle layer tracked two whole-camera states: The last successful configuration and the last rejected one, with the differences between them made available to the UI.
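In rough terms, the middle layer worked something like the sketch below. The names and the two-state bookkeeping details are illustrative only, not the actual camera API:

```python
# Illustrative sketch of the "what if" middle layer (hypothetical names,
# not the real camera interface). The hardware validates a proposed
# configuration as a whole; the middle layer keeps the last accepted and
# last rejected configurations and reports which settings differ.

class CameraMiddleLayer:
    def __init__(self, hardware):
        self.hw = hardware      # assumed to expose try_configure(config) -> bool
        self.last_good = {}     # last configuration the hardware accepted
        self.last_bad = {}      # last configuration the hardware rejected

    def apply(self, **changes):
        """Propose a whole-camera configuration built from the last good one."""
        proposed = {**self.last_good, **changes}
        if self.hw.try_configure(proposed):   # the hardware's "what if" check
            self.last_good = proposed
            return True, {}
        self.last_bad = proposed
        return False, self.diff()             # tell the UI which settings conflict

    def diff(self):
        """Settings that differ between the last accepted and last rejected configs."""
        keys = set(self.last_good) | set(self.last_bad)
        return {k: (self.last_good.get(k), self.last_bad.get(k))
                for k in keys
                if self.last_good.get(k) != self.last_bad.get(k)}
```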
But who cares about software?
When our first all-up alpha camera was built (electrically correct, but quite unable to fit in the final enclosure), the first high-speed video run (at 85K fps) produced all black frames when pointed out the window at a brightly sun-lit outdoor scene. The hardware folks were going crazy, thinking we had failed to correctly read the pixels from our custom sensor.
I sat back and did the math, calculating the number of photons needed to reach half-exposure on a pixel, then dividing by the exposure time to get the required flux. That flux level was enormous: about 30% of what pointing the camera directly at a full-field view of the sun would yield. We weren't about to try that, since we didn't want to melt the sensor in our first working prototype.
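The back-of-envelope version of that math looks roughly like this; every sensor figure below is an illustrative placeholder rather than our actual specification:

```python
# Back-of-envelope photon-flux estimate (all sensor numbers are assumptions,
# chosen only to show the shape of the calculation).

full_well_electrons = 20_000      # assumed pixel full-well capacity (e-)
quantum_efficiency  = 0.4         # assumed photon-to-electron conversion fraction
pixel_pitch_m       = 10e-6       # assumed pixel pitch: 10 micrometres
frame_rate_hz       = 85_000      # the 85K fps run from the story
exposure_s          = 1.0 / frame_rate_hz   # assume exposure spans the frame time

# Photons needed on one pixel to reach half of full well
photons_half_exposure = (full_well_electrons / 2) / quantum_efficiency

# Required photon flux per pixel, and per unit area at the sensor
flux_photons_per_s = photons_half_exposure / exposure_s
flux_per_m2_per_s  = flux_photons_per_s / (pixel_pitch_m ** 2)

print(f"photons for half exposure: {photons_half_exposure:,.0f}")
print(f"required flux: {flux_photons_per_s:,.0f} photons/pixel/s")
print(f"               ~{flux_per_m2_per_s:.2e} photons/m^2/s at the sensor")
```

With those placeholder numbers the required flux comes out in the billions of photons per pixel per second, which is the sense in which "enormous" is meant above.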
But what other bright light source could we use? In our conference room was an early DLP video projector that used a high-pressure gas-discharge tube mounted behind an IR filter. We aimed the camera through the filter and directly imaged the arc in the lamp, clearly showing how it danced within the confines of the bulb.
We immediately posted the video to our website, and just a few days later the US President of a Very Large Japanese projector manufacturer called, wanting to see our prototype camera in action.
It turned out that this 'arc wander' was a serious problem for projector manufacturers, since it meant the optical focus had to encompass the entire arc volume rather than just a central portion. In many projectors of the day, the effect was visible in the projected image as a quiver or shake that produced a soft, defocused look. One corrective approach had been to make the bulbs smaller, but that made it much more difficult to remove the generated heat, greatly reducing bulb life.
By the time the Japanese delegation arrived, we had the first of three beta units built. We repeated our experiment for them, using bulbs they had brought with them. On the spot, they offered us an insane price for one of our beta units, wanting to leave with it that day. We made them wait until all three beta units were ready, then sold them one.
The beta units did fit in the final enclosure, but they didn't come close to meeting the 100G impact requirements. Why would any camera need to survive 100G treatment? That's more than enough force to tear any lens right off the front of the camera. Our design goal was for our camera to survive a lifetime of such impacts without affecting normal operation (aside from having to replace the lens).
Our camera had two main markets. The first was the auto crash-test industry, which needed cameras inside cars during high-speed crashes. The second was military weapons testing, where the closer the camera could be to the blast, the better the video.
Once such high-speed video was available, some of the spin-off uses surprised us. For example, one customer used our video to greatly advance the state of the art in Finite Element Analysis of structures, refining their analytical techniques until the simulated impact precisely matched the actual deformation captured by our cameras.
Unfortunately, the development costs for this camera bankrupted the company: The expected pre-orders failed to appear (there was a recession at the time), while our lines of credit became too expensive to service. The company was sold to a competitor for a song, and the entire engineering team was laid off just weeks after the camera work was completed.