How can you not support JTAG?
I didn't want to bore people with a repost, and assumed they could read it on IGG if they wanted. [Or are they visible to backers only?]
Anyway, here it is:

Quote: Charles McGrath posted an announcement:
Hello All,
I saw some discussion in the comments about the thermal resolution of the camera. So I thought that I’d talk a little about that.
Our MCU does have an onboard analog-to-digital converter (ADC) that runs at 1 Msps (million samples per second), but we won’t be using it due to a logic level discrepancy with the microbolometer. Instead, we have a dedicated ADC on the board to convert the image data.
With an image resolution of 160×120, each frame contains 19,200 pixels. At 30 fps, that gives us 576,000 data points each second. For the best thermal resolution we are sampling at 12 bits, giving us about 6.9 Mbps of output.
Cheers!
That's actually the most plausible thing I've seen from them to date, apart from the 'logic level discrepancy', which could just be him not quite understanding the words the tech guy is saying.
Quite plausible you'd need 12 bits for full range from an imager.
Don't the SAM3X chips have a 12-bit ADC with 1M samples/sec built in?
http://www.atmel.com/devices/SAM3X4C.aspx?tab=parameters
I can't comprehend what "logic level discrepancy" means in terms of sampling with an ADC?!
I agree wholeheartedly. The only thing that matters at this point is tangible proof of a working camera prototype, or notification of delivery. Any other discussion is just a waste of words. I would expect a mass exodus soon unless they produce something of substance.
If I were them, I would have been in serious PR panic mode a long time ago and squashed the naysayers' objections head-on. To let it get this far without showing anything of substance is just project suicide. It would be trivial to snap a photo of the prototype case(s), prototype board(s), a test image or whatever.
P.S. Unlike what I would give to MuOptiBullSh*t I heap tons of applause onto IR Blue for making an awesome, legit product. These guys really deserve it, so here's a link: http://www.kickstarter.com/projects/andyrawson/ir-blue-thermal-imaging-smartphone-accessory
Unfortunately I came too late to get one, but there'll probably be Eagle files etc. available soon so you can make your own.
Can't a fresnel lens and a cold mirror work for the optics?
No. All I'm going to say is that plastic and glass completely block the passage of thermal wavelengths.
A germanium lens that size would cost around $450 (edmundoptics.com/optics/optical-lenses/ir-lenses/germanium-meniscus-lenses/3081).
OK, y'all. As far as I can tell, this is how the whole thing has played out.
John McGrath, the lead guy, is damn rich. On a whim, he decided to buy a $6,000 FLIR E50 or E60 (other possibilities there), or just acquired one somehow. (flir.com/cs/emea/en/view/?id=41372) He found it pretty neat, and had the childish thought that he would make his own and mass-produce it for less. (Kind of like what the guys in October Sky were thinking, but far more moronic and petty.)
I was just recently asked about the exact same type of project idea, but for medical ultrasound imaging. The brilliant idea was that most of the cost of medical imaging is in the display, so if they replaced the graphical display with an iPad, they could produce it for a fraction of the cost. Just design an ultrasound probe that plugs into an iPhone or iPad -- it should be easy with an Arduino, right? What's next? CT scan or MRI, anyone?
Were they knowledgeable about the requirements for developing medical devices? The mandatory processes and documentation I have seen for developing medical equipment are not for the faint of heart. If I understood it correctly, you can't even do some of it retroactively; i.e., you can't just happily develop something however you like and somehow get it certified at the end. You have to follow the mandatory processes right from the start of development, otherwise you have no chance of getting it certified.