Awesome, this looks very promising :-+Thank you! Looking ahead: I used an FX2LP for USB; the FPGA streams data over an 8-bit bus. As a starting point I used this Cypress project: https://community.cypress.com/docs/DOC-14406. More details to follow... With regard to the focus system, at first I didn't believe it either)
I'm looking forward to learning more about your project, especially how you interfaced the µbolometer and implemented UVC.
Didn't think it would be possible to get a motorized focus inside the tiny housing.
Keep up the good work!
What a superb project, well executed and very professional in its appearance :clap: :-+Wow, many thanks! :) As you are the owner of one of these cameras, I hope for your help with testing ;)
As I own one of these cameras I will await further details with GREAT interest :-+
Quite interested in the interface with the sensor and the reverse engineering. Can't wait to see the code! (and hopefully a writeup)Hi! The interface with the sensor is, obviously, the trickiest part. In fact, there are a lot of things that I still don't understand, but I hope for community help.
I also have an NV3 and was working on reverse engineering efforts to interface with the original circuitry.Hi! I think we could collaborate for a better result. I don't have an NV2; as far as I know, it is a previous-generation camera with an internal shutter (behind the lens). Surely you know there is a great writeup about the NV2: https://debugmo.de/2018/12/autoliv-nv2-teardown/
I just got some boards I made for interfacing to the NV2 going out to some friends to help with development/testing, although it's mostly used as a test platform, not such a professional-looking project as yours. I look forward to your progress and wish you luck!
Very cool project!Hello! Thanks a lot! Yes, you are right. I decided against using the original circuitry in my project, because it may not survive when the front window is broken. The original electronics also limits some possibilities.
Do I understand correctly that you're using the sensor and other hardware, but not the PCBs/FPGA of the NV3?
Can't wait to see more details!
Thus, I'm really impressed by the project.Thank you! OK, I think this will be the starting point of the story.
Are you using just the FPA or also some parts of the NV3?
So if it is just the casing, lens, FFC flag and microbolometer PCB that is used in your design, presumably this project could be adapted to the earlier NV2 model?Yes, you are right. I believe it can be adapted to any camera that needs deep raw image processing.
You have basically built a thermal imaging camera from scratch, which is a significant achievement when you do not have the datasheet for the microbolometer or its recommended bias voltage values. Building the back-end video processing package and firmware is a further very impressive achievement. You have skills :-+ Much respect for you :clap:Thanks a lot! The bias values were picked by hand; I'm still looking for a way to calculate these values properly. Do you have any knowledge about that? After tons of experiments I found out that this is a 6-bit value per pixel, and I could then feed the values properly to the FPA core.
Great to read this thread as it gradually reveals the work you have done. Loving this :-+Thanks a lot!
First technical question.Yes, it's taken into account. As you know, all FPA pixels are different; they all have a different response to the same amount of IR radiation. To solve this problem and make the image uniform, we have to determine a gain and an offset parameter for each pixel. The gain parameter changes with camera temperature quite slowly, whereas the pixel offsets change noticeably with a temperature difference of only ~0.2°C. To compensate for this temperature drift, an FFC (flat-field correction) procedure is initiated: we use the external shutter to close the front of the lens and, assuming the shutter surface is uniform, recalculate the offset parameters, improving the image. You can run FFC manually any time you want to improve the image quality, or enable automatic mode, in which the firmware monitors the enclosure temperature via the integrated temperature ICs and initiates FFC periodically.
I know it is maybe too early at this stage, but have you taken into account the heating of your camera enclosure during operation (electronics, battery, external heat sources) and the effect of its temperature changes on camera (FPA) operation?
Yes, it's taken into account. As you know, all FPA pixels are different; they all have a different response to the same amount of IR radiation. To solve this problem and make the image uniform, we have to determine a gain and an offset parameter for each pixel. The gain parameter changes with camera temperature quite slowly, whereas the pixel offsets change noticeably with a temperature difference of only ~0.2°C. To compensate for this temperature drift, an FFC (flat-field correction) procedure is initiated: we use the external shutter to close the front of the lens and, assuming the shutter surface is uniform, recalculate the offset parameters, improving the image. You can run FFC manually any time you want to improve the image quality, or enable automatic mode, in which the firmware monitors the enclosure temperature via the integrated temperature ICs and initiates FFC periodically.Yes, you have two things: NUC and thermal drift caused by several factors. I was asking about the latter. You are correcting it using a shutter. My question was about the dynamics of that process, i.e. how often the shutter has to be activated to keep decent image quality/temperature measurement accuracy.
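For anyone following along, the gain/offset correction and shutter-based FFC described above can be sketched in a few lines of Python. This is only an illustration of the idea - the function names, the flat-target choice and the float pipeline are my assumptions, not VGN's actual firmware, which does this in FPGA logic:

```python
import numpy as np

def apply_nuc(raw, gain, offset):
    # Two-point non-uniformity correction: each pixel gets its own
    # gain (calibrated rarely) and offset (refreshed by every FFC).
    return gain * raw.astype(np.float32) + offset

def run_ffc(shutter_frames, gain):
    # Average a few frames of the closed (assumed uniform) shutter,
    # then pick offsets that make the corrected shutter image flat.
    mean_shutter = np.mean(shutter_frames, axis=0).astype(np.float32)
    target = np.mean(gain * mean_shutter)   # flatten to the frame mean
    return target - gain * mean_shutter     # new per-pixel offsets
```

With unit gains, the returned offsets exactly cancel the fixed-pattern offsets measured on the shutter, which is the whole point of the procedure.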
The amount of work you have done is amazing.Thank you! Well, yes, I realize that it is hard to criticize the design without the schematics, but maybe you have some global ideas that do not require the schematics at this point. The problem is that I'm going to make quite a lot of changes for the next hardware revision, and I think it would be better if I publish the very latest schematics. I believe we will have enough time to discuss and make changes to the design, if necessary, before I send the new files for manufacturing.
I think it makes no sense to criticize the design, because we do not see the schematic diagram, but it looks great and professional.
I am more than impressed by the effort so far. Wish I had the knowledge, experience and tools to take on such a project. But it really shows what can be done... and motivates me to move on with my project (in 14 days.)Thanks a lot! I wish you good luck with your project!
I am loving this thread.Many thanks, Fraser!
...
Yes, you have two things: NUC and thermal drift caused by several factors. I was asking about the latter. You are correcting it using a shutter. My question was about the dynamics of that process, i.e. how often the shutter has to be activated to keep decent image quality/temperature measurement accuracy.Thank you, Max! Basically, one should perform an FFC every time the camera's temperature changes by some threshold value, up or down. There is no need to perform an FFC if the temperature is stable; the image quality will not degrade over time. I can't say exactly how often the shutter has to be activated, as it depends on the environment temperature, how fast the camera heats up, and the temperature-change threshold. I don't think this is a point of worry unless you use it on a UAV, because FFC causes the image to freeze for a short period (~1 sec). There are two ways to mitigate this problem: first, you can initiate FFC yourself; second, the firmware can display a countdown before FFC to help the pilot get ready.
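The "how often" policy above boils down to a tiny state machine. Here is a sketch of that logic (the 0.2°C figure comes from the earlier post; the class name and the periodic max-interval fallback are my own assumptions, not something from the actual firmware):

```python
class FfcScheduler:
    """Hypothetical automatic-FFC policy: trigger whenever the enclosure
    temperature has drifted more than `threshold` degrees since the last
    FFC, or after `max_interval` seconds regardless."""

    def __init__(self, threshold=0.2, max_interval=300.0):
        self.threshold = threshold
        self.max_interval = max_interval
        self.last_temp = None
        self.last_time = None

    def should_ffc(self, temp_c, now_s):
        if self.last_temp is None:
            return True  # no calibration yet: always run one
        drift = abs(temp_c - self.last_temp)
        return drift >= self.threshold or (now_s - self.last_time) >= self.max_interval

    def mark_done(self, temp_c, now_s):
        # Record the conditions at the moment the shutter was closed.
        self.last_temp, self.last_time = temp_c, now_s
```

A UAV build could gate `should_ffc` behind the pilot-countdown behaviour described above instead of triggering immediately.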
Knowing about some of the challenges you were and are facing, I'm impressed.
I see there's no visual camera... right?
Maybe it can be combined with this one: https://openmv.io/products/openmv-cam-h7
and let the OpenMV board deliver a corner-detection overlay for an MSX-like image.
It's a really awesome project.Thank you!
Currently I'm working on an open-source solution too, but it's more an interface to an existing camera, not a completely new device.
VGN, incredible work. Another one here interested in the possibility of adapting your solution for the NV2 ISC0601 sensor. I'm another one with one of those lying around.Thanks!
I'm very impressed with this project - I know a little (mostly theoretical) about thermal imager design and it's a significant undertaking, normally involving a commercial team and significant money! Very interested to learn more.Thanks!
I agree that a visual camera in the MSX style (i.e. for edge enhancement etc.) isn't necessary with a high-resolution (>240p) thermal imager. If you're using the camera for security or something, however, it can be useful as a separate imager, e.g. to zoom in on someone that you've detected with a wider-angle thermal imager.Integration of a visual camera is quite challenging, though not impossible. The problem is that right now there are no free IOs on the headers that connect the boards to each other, and IO muxing is also not easy. I also doubt that, for example, in UAV mode this visual camera would be better than a specialized FPV one, while at the same time you could easily multiplex the outputs of an FPV visual camera and the thermal camera. Generally, I think there should be really strong arguments for visual camera integration.
The brain of the whole device is a Spartan-6 FPGA. As a memory buffer I decided to use HyperRAM. The HyperRAM memory controller was also developed from scratch, as there was no high-performance implementation of this IP core in the open-source space. In fact, this IP core is a separate project of mine.Commercial camera cores use things like PSRAM (pseudo-static RAM) for this reason too - lower power consumption than DDR2/3 plus a controller.
You could ask me why I didn't use the FPGA's free integrated memory controller with DDR2/DDR3 memory. The reason is very high power consumption: DDR3 plus the integrated controller would burn even more power than the whole camera consumes.
The red bond wire is a bug fix, though it is not the last bug. Using the Spartan-6 was a mistake. This FPGA is quite powerful, but also old: very poor support, no updates, some bugs in the IDE, endless loops in the peripheral drivers... :palm:
That's why I'm planning to replace it with a Spartan-7 XC7S50 in the same package. I hope the migration will be easy, as I have previous experience with Xilinx 7-series FPGAs.
Holy crap this is basically the HOLY GRAIL for my projects. I'll be watching this with INTENSE interest, it's essentially my dreams come true.Thanks! All the fun is ahead ;)
Since you're planning an FPGA swap anyway, it's worth considering the Lattice ECP5 series - the open-source toolchain for them is very good these days.I thought about that a long time ago, when I realized the swap was needed. Well, yes, an open-source toolchain is very cool and important (especially for security-focused projects), but I'm not completely sure it is worth doing. I have several arguments:
Also, it would probably tempt more people to get involved once you open source the project.It will be open source anyway; please be patient :)
Also, it would be nice for the uninitiated to know what to look for to get these modules, the NV3 ones.66549322653 or 9322653 - BMW
Perhaps the vibrations come from the lack of lubrication of the gears, apart from the slightly rough surface finish. You should apply some type of non-hygroscopic grease, such as PTFE grease.When I disassembled this motor, I found out that there was no grease at all in the gearbox. I tried applying bike grease (the only grease I had), but things got worse: the no-load current increased. It looks like too much grease in the gearbox prevented normal rotation; the amount of lubrication must be just right. On the other hand, by default at max voltage (3.0V) the output shaft rotates at 240 RPM, so with a 1:136 ratio the motor rotates at about 32640 RPM, which is a lot. I suppose the vibration is probably caused by rotor imbalance.
This is the most impressive thermal camera project that I have seen to date. You have manufactured a very professional mechanical and electronic solution. If I was working on your PCB’s I would honestly think that they were the product of one of the well known thermal camera manufacturers :-+ Most impressive and I can only dream of having such design skills :-+ :-+Many thanks for your support! :)
That is so beautifully engineered; my plans for a 3D-printed follow focus (external) that I sketched out during some downtime at an exam look pathetic.Thank you! Just keep it up. If your design or piece of code starts looking pathetic to you, that means your skills are growing; this is a good sign. :-+ Wish you luck with your project! ;)
I will take a deep look at this complete project and see which parts I can somehow use for mine. Experience and tools are what I feel I am lacking.Feel free to use any piece of this design. I'm going to start committing some sources by next week. I'll leave a link to my GitHub repo.
Soooo, VGN is the initials of three people, right? Vince, Gary, and Nancy? Because this project seems to be the work of a team of hardware designers, FPGA engineers, and electronics engineers all working together :-+Thank you! Hahaha, not Gary, me and Nancy call him Gabe ;)
Really impressive work, bravo!
This project is great, lots of features. It is looking quite professional ! :-+Thanks!
Keep the images and details coming.
Just wanted to say I joined this forum just to say thank you, and how impressed I am with your design. I can't wait until we can order the components. I just got myself a new Audi 4GO980552A camera off eBay for £295 so I can do this project. Should I buy the FPGA now as well, or will you be selling all the circuit boards as a kit?Thank you! I'm really very glad that you and other people are interested in this project; that additionally motivates me to continue. Yes, I have plans to make some kind of kit that anyone can buy. This kit will include everything you need except the thermal camera, SD card and battery (due to some problems with air transportation of batteries). Later I will determine the exact kit contents. There is no need to buy an FPGA for this project. This camera is designed in such a way that you will not even need a soldering iron to put all the parts together. I will also make efforts to keep the functionality and price competitive with other brands, though that will not be easy.
If anyone is interested in getting the same camera, there is one left at that price (less than half the normal price for a used one) and the link is:
https://www.ebay.co.uk/itm/293490373204
This project is one of the most exciting projects I've seen on this forum; I never expected a personal project could be so complete and elegant.Thank you! :)
I'm considering buying a second-hand BMW camera for the FPA; I wonder, are there any pitfalls to avoid?
I would be also quite interested in the kit (given the price is reasonable ;) ), as I don't have the equipment and experience to work on BGA packages.
Got my Autoliv NV3 today; does anyone know of a guide to hook it up to my PC or a Raspberry Pi? I've tried googling, but everything I get is about the NV2 (or saying the NV3 isn't possible, which it obviously must be now). I have to admit to being out of my depth with automotive systems, and I don't even recognise the connector (circular with 4 sprung connectors inside).This thread is actually the guide. :D
You cannot beat a decent microbolometer coupled with a good lens :-+
At least you get a decent, low-noise image to process :) Struggling with what the FLIR Lepton and Seek cores produce is no fun at all.
import numpy as np
import cv2

# Per-pixel merit: trust flat areas, distrust edges (the stripe
# estimate should come from smooth regions only).
merit = np.abs(cv2.Laplacian(cv2.GaussianBlur(imgFlt, (3, 3), 3), cv2.CV_32F)) / 65536
merit = 1 / pow(merit + 0.005, 3)
# Row-wise stripe estimates, left and right halves separately.
leftErr = imgFlt[:, 0:leftHalf] - cv2.GaussianBlur(imgFlt[:, 0:leftHalf], (1, 5), 10)
horiLeftAcc = np.average(leftErr, weights=merit[:, 0:leftHalf], axis=1)
rightErr = imgFlt[:, leftHalf:] - cv2.GaussianBlur(imgFlt[:, leftHalf:], (1, 5), 10)
horiRightAcc = np.average(rightErr, weights=merit[:, leftHalf:], axis=1)
# Column-wise stripe estimate (the original blurred the unfiltered
# img here, which looks like a typo - use imgFlt throughout).
vertErr = imgFlt - cv2.GaussianBlur(imgFlt, (5, 1), 10)
vertAcc = np.average(vertErr, weights=merit, axis=0)
for n in range(np.size(imgFlt, 0)):
    imgFlt[n, 0:leftHalf] = cv2.subtract(imgFlt[n, 0:leftHalf], (k[0] * horiLeftAcc[n])).squeeze(1)
    imgFlt[n, leftHalf:] = cv2.subtract(imgFlt[n, leftHalf:], (k[1] * horiRightAcc[n])).squeeze(1)
for n in range(np.size(imgFlt, 1)):
    imgFlt[:, n] = cv2.subtract(imgFlt[:, n], (k[2] * vertAcc[n])).squeeze(1)
However, it looks like VGN's masking in the Fourier space did a better job. I tried a similar method before, but it looks like I made a mistake by also masking the part close to zero frequency, and it never worked. :-\I made the same mistake at first)
This thread is actually the guide. :DYou know, I thought that as soon as I pressed the post button (that's what I get for asking questions when I'm half asleep). In my head there was some magical M12 CAN -> USB connector, heh! Any news on when things will be ready for us to try? (No rush, just over-excitement, like a child with a new toy!)
The data lines of this 4-pin connector go to a MAX9259 IC: https://datasheets.maximintegrated.com/en/ds/MAX9259-MAX9260.pdf
This IC implements some proprietary gigabit link protocol...
Hi, I'm so amazed (as others are too) at the incredible work you have done. :-+
Again, it's really unbelievable how you have done such work in such a short time...
Just a quick question regarding the last video: I noticed you have a motorized focus on the camera? Is it auto or manual? I found it a bit slow, but it stops at the focal point without overshoot. Is it phase detection or even laser assisted?Yes, the focus is motorized; you can find focus system PCB photos in my previous posts. There are two modes, manual and auto. In the video you saw manual mode: I changed the focus myself. The autofocus function is under development now. It is based on image sharpness detection. Right now it is implemented in software and works too slowly and not precisely. I definitely should implement hardware acceleration of the image sharpness calculation. In that case I will be able to detect sharpness gradient changes at the full frame rate, and accidental sharpness changes due to hand shake will not confuse the algorithm, I hope.
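For anyone curious, a contrast-based autofocus like the one described can be prototyped as a simple hill climb. This is a pure-NumPy sketch, not VGN's implementation: `grab_frame`/`move_focus` are hypothetical callbacks standing in for the real sensor and focus motor, and the metric is a mean-squared-gradient stand-in for a Laplacian-based sharpness measure:

```python
import numpy as np

def sharpness(frame):
    # Contrast metric: mean squared gradient. Peaks when the image
    # is in focus; any edge-energy measure works similarly.
    gy, gx = np.gradient(frame.astype(np.float32))
    return float(np.mean(gx * gx + gy * gy))

def autofocus(grab_frame, move_focus, steps=30, step_size=1):
    # Hill climb: keep stepping while sharpness improves; once it
    # drops we have passed the peak, so step back once and stop.
    best, direction = sharpness(grab_frame()), 1
    for _ in range(steps):
        move_focus(direction * step_size)
        s = sharpness(grab_frame())
        if s < best:
            move_focus(-direction * step_size)  # step back to the peak
            break
        best = s
    return best
```

The hand-shake problem VGN mentions shows up here as noise in `sharpness` between steps; evaluating the metric at the full frame rate (in hardware) lets you average it out before deciding to reverse.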
I have to admit, I was getting a little worried when we hadn't heard from you in a while, so I'm pleased it is going well.Sorry for that; I will try not to make long delays, or at least to warn about them.
1) Will the FPGA be able to handle the thermal data processing and the autofocus and keep the frame rate? I'm not familiar with the Spartan-6 board.The new hardware will be based on the newest Spartan-7. Of course the FPGA will be able to handle the thermal data processing, autofocus and many other things simultaneously; FPGAs are very good at parallel processing.
2) Is there an ETA on when we can get a look at some of the code and STL files? I'm just being impatient with excitement, I know. But the code will be fascinating to look through for the image processing alone, and the STL files will be a good test for my printer (whenever it eventually arrives, grrr).Don't get me wrong, this is my first open-source project. Though I have developed quite a lot of different devices in small teams with private repositories, I don't have experience with open source and a wide community. I'm not sure I know the proper way of organizing such projects. Is it better to make a public repo from the very beginning, or to keep it private for those of you who would like to collaborate, and reveal the sources with the first batch of devices? I keep thinking about it. Maybe you or anybody else can give advice?
3) I'm not entirely sure whereabouts you are in your project roadmap; it seems you are getting close to a finished product, just requiring some tweaks (sometimes the phase that takes the longest, but hopefully everyone on here can help once it goes on GitHub).
4) Any idea how much the kit minus the NV3 will cost? I'm moving house at the moment, so counting the pennies, and want to make sure I put some money aside. Even better if you could say something like: the circuitry will cost about X and the housing/gears/internal parts should be about Y, etc.I will provide this information as soon as I get the final BOM. But I'm going to keep the price competitive with the HT-301, TE-Q1, HT A2, FLIR C2 and other "cheap" thermals, though OpenIRV is going to be more powerful and multifunctional. Also keep in mind that the batch size makes a difference to the final price.
If you are looking for testers, I would be happy to be one.I'm very glad, thank you! It would be cool if you, and all other people who would like to be a tester or even a developer of the first kit batch, emailed me (view my profile for the email).
Do you know if a 4H0980552 is an acceptable version? Note the H instead of G. How would I find out what that means?I don't know what each letter actually means, but according to the photos, 4H0980552 is the previous generation (NV2). This project was initially designed for the NV3, but I have plans to design the hardware in a way that supports more sensor types.
The easiest take-away is the connector - it's always this HSD ("weird automotive never seen before") connector, and either there are two additional pins next to it (NV2) or not (NV3). NV2 has Power (separate), CAN and LVDS so it needs 6 pins, whereas NV3 has Power and this "weird automotive never seen before bidirectional highspeed interface that's neither CAN nor LVDS but can replace both", and hence only needs 4 pins.Exactly!
The case is also distinctively different, but harder to describe/see.
Well, for the NV2 we have https://www.eevblog.com/forum/thermal-imaging/autoliv-nv2-on-raspberry-pi/, but VGN's work of course is far more impressive.tmbinc, btw, I have to mention that I saw your investigation of the NV2 on debugmo.de long ago. That is incredible work, and it inspired me to work on the NV3! I'm a newbie on this forum and hadn't remembered your nickname, so I only recently realized that was you, haha)) ;D
Welcome to my GitHub repo! ;)
https://github.com/OVGN/OpenIRV
STEP models of all mechanical parts are available. You can try to print some of them for a test. It would be cool if you showed us your attempts, especially the gears. I used SLA printing but didn't try any other technologies. Keep in mind that these parts are not the release version; I'm going to make some changes, possibly not backward compatible.
I just received an eBay NV3 that seems in near-new condition, with just a plastic tab on the window holder broken.This window-holder locking tab breaks very easily. Mine was broken too)
Opening the case reveals a ribbon cable from below going up to the window that is not shown in your pictures. What is that for? I can probably get a picture, but it's pretty tight. How do I unhook it?This is actually an internal heater with an integrated SMD thermistor. The original camera activates this heater in cold weather to melt ice on the front window, as ice is not IR transparent. There is an FPC connector at the other end of this cable. The heater itself is glued to the plastic housing with double-sided tape. You don't need this part with OpenIRV.
Here's an attempt to print the worm with a regular printer, a Prusa MK3s. Looks bad, but in my experience it will probably still work.Nice try!) I counted 31 layers per 5.1mm, so it looks like your layer height is around 0.15mm. I think you can probably get much better quality with FDM if you use a 0.05mm layer height; that is exactly the layer height I used while printing my parts on SLA.
Nice try!) I counted 31 layers per 5.1mm, so it looks like your layer height is around 0.15mm. I think you can probably get much better quality with FDM if you use a 0.05mm layer height; that is exactly the layer height I used while printing my parts on SLA.Yes on both. I never tried smaller than 0.15. I have a 0.3 nozzle to try too.
Yes on both. Never tried smaller than 0.15. I have a 0.3 nozzle to try too. Here's an uncleaned-up gear for a test.Yes, this one looks much better! But I'm not sure about those four tips. Anyway, FDM looks promising.
Just curious, is it necessary to use black?Do you mean the plastic color? What difference does it make?
The inside of the gear beveled surface is nearly a lens shade. On a visible light camera this would all be black. Is this necessary? Almost seems like white would emit less IR. But I'm ignorant of this IR stuff.
I have designed a few stainless-steel springs, however. The material is critical. Not sure about stencil material; if you can bend it sharply and it doesn't spring back, it's the wrong kind.
The inside of the gear's beveled surface is nearly a lens shade. On a visible-light camera this would all be black. Is this necessary? It almost seems like white would emit less IR. But I'm ignorant of this IR stuff.The camera FOV is 24° h x 18° v, according to this documentation: http://www.safetyvision.com/sites/safetyvision.com/files/FLIR_PathFindIRII_User_Guide_1.pdf
I have designed a few stainless-steel springs, however. The material is critical. Not sure about stencil material; if you can bend it sharply and it doesn't spring back, it's the wrong kind.I believe stencil material and thickness should be consistent, as these parameters are very critical for PCB manufacturing. Anyway, this is a matter of experiment.
Barev, VGN!Barev, Ruhkukah!) Thank you for your test! Looks really good. You'd better try a 0.025mm layer height for the worm too; that way the worm surface will not be so stepped. Looking forward to your results)
I took a stab at printing these two challenging focus parts (worm and wheel). Part of the worm broke off after my kid played with it (to speed things up I used 5% infill, so it is pretty fragile) :-DD Now reprinting it.
IMO the quality is good enough for the purpose; both parts are very smooth to the touch. For the worm I used black PLA filament, 0.05mm layer height, a 0.15mm TriangleLab nozzle and 0.15mm extrusion width, 2 perimeters. For the wheel I tried a 0.025mm layer height. My printer is a stock Prusa MK3s. I didn't use any supports, so the cross and the circular plate needed a little bit of cleaning.
See attached.
VGN, I applaud your work - it looks very professional and top-notch in every aspect. Looking forward to becoming a tester!Thanks a lot! Don't forget to email me to become a tester. You can find my email in the attached images of the first post.
1. Is the FLIR PathFindIR II _identical_ to the Autoliv NV3? I know it looks pretty much spot on, but I wonder if it's the same thing. Is that an Autoliv OEM product then? If so, is there also an NV2-based PathFind device?I don't know exactly, but I think the NV3 is identical to the FLIR PathFindIR II, with some light changes that each car manufacturer makes to fit this camera into their car.
Can we use the Autoliv NV4 (12µm, 640x480, 50°x39°) sensor with this interface in the future?I'm trying to do my best to support as wide a range of IR cores as possible: more RAM, higher RAM throughput, more IOs on the sensor connector, more adjustable power rails on the same connector, etc. In the best case, we will just have to make a cheap adapter board to connect the current hardware to a new sensor and develop an HDL module to interface with it. Also, probably, a new sensor enclosure.
It would be cool if these sensors became available and we could use the same interface.
Is it maybe an idea to use a ring magnet instead of the spring for the lens-thread compensation? I think for 2mm there is always enough force available.You are right, this is definitely a good idea! :-+
Four 1.5mm-high magnets generate enough force, pulling a metal ring mounted on the lens. I glued the magnets directly to the housing, but I think a special thin plastic frame to hold the magnets properly is really needed. We will also have to glue the lens to the focus wheel gear. I used a hot glue gun instead of cyanoacrylate for a test.
That sounds great for this little test. What I meant is using a ring magnet of the right dimensions. They can be bought via eBay. Have a look at the attached picture. If there is a fitting magnet, no additional plastic frame for the magnets is needed; just glue the ring magnet to the housing.Yes, a ring magnet would be the best solution. On the other hand, small magnets are very cheap and easy to find, while ring magnets are quite exotic. Anyway, I measured the mounting-seat dimensions; you can find them in the attachments.
Had a look at the schematic. Will debouncing of the buttons be done in VHDL via a filter in the FPGA? On a microcontroller it is an annoying task to debounce something.The button pull-ups and debouncing capacitors are on the P-board (peripherals). Of course, there is additional debouncing logic in the HDL design. Agreed, filtering buttons in the MCU is a poor idea.
In the original design they seem to have gone to great lengths to keep the lens assembly from rotating when focusing. There are 4 sliding Teflon surfaces between the focusing ring and the lens. Was that to keep wear particles from the threads off the sensor?This unit looks quite over-engineered to me. Maybe I don't understand something, but really, I have no idea why they used so many parts just for a fixed focus...
Looks like the foam gasket that seals the bottom of the lens to the housing is not thick enough to seal for close focusing if it requires more than about 0.5mm expansion from the stock setting.Right, but I don't think we need this foam gasket; it slightly hinders lens rotation. Also, this camera is not supposed to be used in any aggressive environment: no water/dust protection. BTW, even the FLIR Vue Pro is not water/dust resistant.
A popular way to get around lead-screw slop on 3D printers is plastic nuts. They apparently last long enough for that extreme task. Why not replace the threaded focusing ring with a combined focusing ring + ring gear version? The SLA printer may be up to the task. The threads don't have to be perfect; the errors control the slop. And you get rid of a glue joint!I'm afraid a custom plastic nut would probably rotate too stiffly, and this tiny focus motor wouldn't be able to move it. All the more so since I'm going to replace the focus motor with a higher-RPM one. We should keep the lens rotating very easily and without backlash; otherwise there is no chance of fast focus adjustment.The threaded portion could be backed with a circlip if required. Edit: not enough room.
Here's what I was thinking of:WOW! I take my words back. These cutouts are very interesting! :-+ :clap:
There is enough room for a steel ring spring that could be tensioned for a nice fit. I haven't tried on my filament printer. Wonder how it would do on SLA.
1. How are you going to make the steel ring?Just bend some music wire around a pipe of a guessed dia.
2. What about a special internal border that will help to center and fit the lens? Check out attachments. But I'm not completely sure how to make it properly, this is just a suggestion.
3. So the holding ring thread pitch is 0.508 (20mil)?! :palm: I thought it was a metric 0.5mm pitch, though I didn't measure it, not even sure that I could...I don't know the pitch either, but the dia. is .998 inch, so I assume 1 inch with 50 threads/inch. Could be something else. It's so short I'm not sure it matters, especially with plastic threads.
I thought you had an SLA printer for the parts you posted?No, I don't have it, I just order printing.
@VGN Thinking about the focus. Your original design just needs a weaker spring. The spring deflection is proportional to thickness^3. Maybe the existing spring can be etched in something like PCB copper etch solution to the proper thickness. Not sure if it has a coating on it.I'm also leaning toward this. I'm going to try both the stencil spring and the magnet-based one.
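Since deflection at a given force scales with the inverse of stiffness, and stiffness goes as thickness cubed, a small amount of etching goes a long way. A quick sketch (the thicknesses here are made-up example values, not measurements of the actual spring):

```python
def stiffness_ratio(t_new: float, t_orig: float) -> float:
    """Leaf-spring stiffness scales with thickness^3 (width and length
    unchanged), so deflection for a given force scales with the inverse."""
    return (t_new / t_orig) ** 3

# Etching a hypothetical 0.10 mm spring down to 0.08 mm:
ratio = stiffness_ratio(0.08, 0.10)   # 0.512 -> roughly half as stiff
```

So removing only 20% of the thickness already halves the spring force, which is why etching (rather than finding a different spring) is attractive here.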
The spring deflection is proportional to thickness^3. Maybe the existing spring can be etched in something like PCB copper etch solution to the proper thickness. Not sure if it has a coating on it.Not sure if that's going to work. Also, far from everyone will be able to repeat this.
Yes, a ring magnet would be the best solution. On the other hand, small magnets are very cheap and easy to find, while ring magnets are quite exotic. Anyway, I measured the mounting seat dimensions, you can find them in the attachments.You are right. I also didn't find a fitting ring magnet. I think it would be a lot better to use normal magnets.
Had a look at your new schematics. Great work!Thank you! Feel free to ask me any questions! Any suggestions are welcome for discussion.
For debouncing of the buttons I usually use the following circuit. So the bouncing is filtered even when the button is pushed.Oh...right, pull-up + RC filter looks much better, going to fix this. Thanks!
Do you know the standby current of the whole system?Of course, 8uA (micro amps).
Button 3 seems to turn on the whole system.Right, button 3 and also the EXT_GPI/PWR_ENA pin at the X9 connector are used to enable the whole system.
amazing project!Hi! Many thanks!
It clearly shows your dedication in every aspect of the complete project.
The lovely designed Focus mechanism.
The Housing.
The Software and even the PCBs are designed with love!
I really appreciate this.
You are reading the sensor out with 60fps?
How low can the clock for reading go, so it is possible to read out at 10fps? Or is it just internally fixed to 60fps?We don't have, and I believe will never have, access to datasheets of this core, so I actually don't know that. The original board is forwarding a clock at 73.636MHz, and I'm using the same value. You will be able to experiment with the clock rate as soon as the HDL design is published.
For what is the CMD pin on the sensor? 1-Wire? Any information on that?Forget about any standard interface with this FPA, everything is proprietary, the CMD interface as well. There are a lot of things to tell about it, and I'm preparing a special manual for this. The short answer: this is a 1-bit command line, synchronous to the main clock (73.636MHz), used to control the sensor. According to my investigations, a special 22-byte command is sent to the sensor to get each new frame. The first two bytes of the command are 0x3F and 0xC0. I'm 99.9% sure that this is a preamble that helps the core to detect a new command. Why? There is no other strobe to validate the incoming command, and 0xC0 = ~0x3F, very likely to be a preamble. Also, the core does not stream anything if I flip any bit in these two bytes. The other 20 bytes keep some special data to control the sensor. Unfortunately, the command fields are not byte-aligned, which makes investigation a bit harder. There are a lot of bits that do not affect anything. Also, a command for one sensor may not fit another, as thermal array parameters are very different from core to core. Though I don't know what all the command fields are doing, I could find a common command pattern that should fit all sensors, and a single special field that you can quite easily tune to get the image. I have tested this algorithm on my three FPAs and could make all of them work well. I'm sure that the more people play with this sensor, the faster we will find out the function of each command field. Another way is to reverse engineer the firmware and get this data from the original EEPROM. But I'm not sure that this is a good idea, as this is the intellectual property of Autoliv/FLIR.
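Based on the description above, assembling and serializing such a frame-request command can be sketched as follows. This is a minimal sketch, not the project's actual code: only the two preamble bytes come from the post; the payload contents and the bit order on the wire are assumptions.

```python
# Sketch of the 22-byte per-frame FPA command described above.
# Only the preamble bytes (0x3F, then its bitwise complement 0xC0) are
# known with confidence; the remaining 20 bytes carry proprietary,
# non-byte-aligned fields and are shown here as a zero placeholder.

PREAMBLE = bytes([0x3F, 0xC0])          # note: 0xC0 == (~0x3F) & 0xFF

def build_fpa_command(payload: bytes) -> bytes:
    """Prepend the 2-byte preamble to the 20 payload bytes."""
    assert len(payload) == 20, "command body must be exactly 20 bytes"
    return PREAMBLE + payload

def to_cmd_bitstream(cmd: bytes):
    """Serialize the command for the 1-bit CMD line, MSB-first.
    (The bit order on the wire is an assumption, not a measured fact.)"""
    return [(byte >> (7 - i)) & 1 for byte in cmd for i in range(8)]

cmd = build_fpa_command(bytes(20))      # all-zero placeholder payload
bits = to_cmd_bitstream(cmd)            # 176 bits to clock out
```

In a real design this serialization would live in the HDL, clocked out at 73.636MHz; the Python version only pins down the byte layout being discussed.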
Do you think it's possible to have the sensor connected via a shielded cable from about 50cms long to the controller electronics?Do you mean the ISC0901B0 sensor?
Yes, only the sensor in as compact a housing as possible.The sensor lines' IO standard is LVCMOS 2V5. The data stream is running at 73.636MHz SDR. 50cm is a quite long distance; I'd not say that this is impossible, but it will be a bit challenging. You should take into account proper termination of the I/O lines to exclude signal "ringing". I also think that using buffers/repeaters for the five high speed lines, i.e. clock, bias, data_odd, data_even, cmd, is a good idea.
Many thanks for the comprehensive answer.It was worth starting with this)) In this case I don't really think that this expander will be reliable enough. At the same time, the dimensions and weight of the camera are quite critical for small UAVs that cannot carry this big-body camera. But I think I have a solution.
And no it's no secret at all, I would like to mount the sensor on a very small pan/tilt head on a Trimble UX5 drone.
Transmission could be done with HDMI or AV, 2.4GHZ or 5.0GHZ FHSS.Thanks, streaming HDMI is not a problem, though I can't decide what kind of connector is better to use. I saw different FPV HDMI transmitting hardware, everything looks to be proprietary.
I could opt for a bigger pan-tilt head but I'm afraid this causes too much drag for this fixed wing UAV.Looks like this drone is not designed to carry any camera outside of the body. But there is some free space at the nose, probably you could make some mount for a forward looking camera. In any case, the best FPV mode of this thermal camera that I can suggest to you is 90g within dimensions of ~50x50x50mm.
You are reading the sensor out with 60fps?
Yes, the raw 14-bit per pixel data is coming out at 60fps. I left in my previous posts a link to my Google Drive with a video captured right from the sensor.
Here is the link: https://drive.google.com/file/d/1QfK6TBKxTnJjb9-Vjo1R4MsX56F_mhFq/view?usp=sharing
I advise you to download (~180MB) this video before viewing, as YouTube downgrades the video quality too much.
On the other hand, I don't think that you really need it. If you would like to integrate this camera with any other hardware, you can choose USB, HDMI, AV or low-speed interface access over the GPIO lines at the X9 connector of the P-board. I have plans to implement a SPI interface over the GPIO lines, so that you can control the camera and read out the video data at any FPS you would like.
Hello VGN, in the FPA timing I measured, there is another signal whose function I don't understand. This signal is temporarily called "PIN6", and it appears together with the FPA data outputs PIN2 and PIN3. I don't know what it does. Is it "exposure/integration" control?If you are asking about the ISC0901B0, your "PIN6" looks like a pixel BIAS value, check out the schematic of the M-board in my repo. The controller sends 7-bit (in fact only 6 bits are significant) LSB-first BIAS values for each pixel, each row. The approximate value is about ~0x25.
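For clarity, the LSB-first serialization described here can be written out explicitly. This is a sketch: only the 7-bit width, the LSB-first order and the typical ~0x25 value come from the post above.

```python
def bias_to_bits(value: int, width: int = 7):
    """Serialize one per-pixel BIAS value LSB-first for the wire.
    Seven bits are sent, though reportedly only the low 6 are significant."""
    return [(value >> i) & 1 for i in range(width)]

# The typical BIAS value mentioned above, ~0x25 (0b0100101):
bits = bias_to_bits(0x25)   # -> [1, 0, 1, 0, 0, 1, 0], bit 0 first
```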
In addition, from the rising edge of FSYNC (DATA IN) to the rising edge of the first data output, there are a total of 200 clocks. I think the FPA samples on the falling edge of the clock.Is FSYNC (DATA IN) the CMD line in my terms? Anyway, don't focus on the number of clocks of the data output relative to the command. There is a parameter in the command that determines this value.
I would be _really_ curious about firmware dumps of these devices, or any specification that go beyond the two-page summaries.A few years ago, after I did a teardown video on one of these, someone emailed me, with some info they'd reverse-engineered, but didn't want to publish due to ITAR concerns.
The main point was that they had found that the original FPGA used a Microblaze core, and it was easy to find the code image in the flash memory, and after disassembling with IDA, they managed to reverse-engineer the handshake protocol used to authenticate the camera with the control box.
There might be some useful insights in to how any sensor-unique data is stored.
so 'only' need to find focal length for first lens :-+What's the purpose of its focal length? For f-stop calculation you need the effective f.l. of the complete objective.
Debouncing entirely in software is trivially easy - you simply sample the buttons every 20-40ms or so.
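The sampled-debounce idea is easy to state precisely. A minimal sketch (in Python for illustration; the real thing would be a timer-driven MCU loop or an HDL filter, and the 3-sample threshold here is an arbitrary choice):

```python
# Periodic-sampling debounce: the raw pin is sampled every 20 ms and a
# state change is accepted only after N identical consecutive samples.

DEBOUNCE_SAMPLES = 3  # 3 samples x 20 ms = 60 ms of stability required

class Debouncer:
    def __init__(self):
        self.stable = 0      # last accepted (debounced) state
        self.candidate = 0   # state we are currently counting toward
        self.count = 0

    def sample(self, raw):
        """Call once per 20 ms tick with the raw pin level (0/1)."""
        if raw == self.stable:
            self.candidate, self.count = raw, 0   # bounce back: reset
        elif raw == self.candidate:
            self.count += 1
            if self.count >= DEBOUNCE_SAMPLES:
                self.stable, self.count = raw, 0  # change accepted
        else:
            self.candidate, self.count = raw, 1   # new candidate state
        return self.stable

# A noisy press: bounces for a few ticks, then holds high.
db = Debouncer()
states = [db.sample(r) for r in [0, 1, 0, 1, 1, 1, 1]]
```

The same counter structure maps almost one-to-one onto a shift-register filter in VHDL.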
Hi Mike, yes, that was me. I've since then published my findings - in slightly redacted form - in https://www.eevblog.com/forum/thermal-imaging/autoliv-nv2-on-raspberry-pi/ (https://www.eevblog.com/forum/thermal-imaging/autoliv-nv2-on-raspberry-pi/)
Congratulations, your Altium skills are top notch.Thanks, though there are so many things to learn)
New focus speed is for my purpose now fast enough.One more video of focusing right from the thermal camera. This is not autofocus yet, I'm adjusting the focus manually, and not accurately enough sometimes. The autofocus will do this job much better than me.
If someone in the EU has a faulty sensor, please send it to me. I'll be happy to get some SEM images. (I'll try my best ;) to persuade my advisor to allow me to do that)It would be really nice to get SEM photos, but it is not mandatory, as we know the most important value - the pixel pitch of 17um. FLIR's Tau2 camera core has a 17 μm pixel pitch and we know that it is based on the ISC0901.
So simply shifting the array for a 1/2 of pixel pitch value in X and Y direction, we could increase the resolution from 336x256 to 672x512.Bear in mind you may need a much better ( more expensive) lens to get improved resolution. If it is the case that it's only sampling a fairly small percentage of the sensor area, then you would see noticeable aliasing artifacts (e.g. jaggies on straight edges) unless the existing optics system has been designed to slightly blur the image to antialias it. It may also be the case that there is some post-processing going on as well to reduce aliasing effects.
Yes, but it is probably engineered to that resolution, giving just the right amount of blur to avoid aliasing artifacts at that resolution
Wouldn't it have to be a pretty terrible lens to not be able to resolve 336x256, for example?
Pixel shifting can be found in Sony and Hasselblad cameras. There are many threads in this subforum that touch on the idea, and some did tests. Pixel shifting is not viable for video, but a slow photo mode for high resolution stills would be a great feature I would love to have.
Pixel shifting is not viable for video, but a slow photomode for high resolution stills would be a great feature I would love to have.True, but piezo actuators are extremely fast. So theoretically it is possible to shoot a video, though the FPS will be n^2 times lower, where n is a factor of resolution enlargement.
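The frame-rate penalty mentioned here is straightforward to quantify (the 60fps base rate is the sensor's native rate from earlier in the thread; n is the linear resolution enlargement factor):

```python
def shifted_video_fps(base_fps: float, n: int) -> float:
    """Pixel shifting needs n*n sub-frames per output frame (n = linear
    resolution enlargement factor), so the effective video rate drops
    by a factor of n^2."""
    return base_fps / (n * n)

# Doubling resolution (336x256 -> 672x512) at a 60 fps native rate:
fps_2x = shifted_video_fps(60, 2)   # 15 fps effective video rate
```

So 2x superresolution video would still run at a usable 15fps, provided the actuators can settle between sub-frames.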
The claims about active area vs pixel pitch sound a little odd. We know that Leonardo/DRS has a special pixel design that is a lot more area efficient than the one shown here, using "micro umbrellas". It might be the sole reason why they managed to make 10μm somewhat usable.That was news to me! So it turns out that each "micro umbrella" just heats up the same VOx film, which is placed under it, supported by the same known thermal isolation pattern? And we have almost no pixel gaps. This is really incredible! Check out the attachments.
Also a thought: while you shift the sensor, why not include a whole 5-axis platform for in-body stabilization? Sony uses that for their pixel shifting as well. And they capture like 12 individual frames on the a7RIV to get 120MP and no interpolation.That is going to be too complex and expensive... Moreover, no idea what to do with 5 axes.
Do you hope to shift by perfectly 1/2 the pixel pitch or would 1.5 also work? If you go beyond 4x this would require getting 0.33,0.66 or even 0.25, -0.25 too. While it might seem that perfect precision makes this sound possible - the real world has limitations in holding perfectly still. So a potential photomode could also include stacking of same pixels shifted by 1 to reduce noise.
Vibration induced resonance and the effects of G forces in microbolometer pixels......Thanks, Fraser. That should be taken into account, but I think that the vibration frequency caused by the pixel shifting actuators will be far away from the sensor's pixel resonance frequency.
Bear in mind you may need a much better (more expensive) lens to get improved resolution.Thanks, Mike, I didn't think about that. Well, that may be a huge problem. I'm not really good with optics, but the original FLIR E8 lens is much worse than this one, and it was also designed for the very same ISC0901 sensor.
1. The piezo actuator, this piece of...shiceramics... costs around $100. Yes, it is even more expensive than the FPGA. And... we need two of them, for the X and Y axes. :scared:
I would give these a try: ...Hm... the characteristics look quite fantastic for these dimensions. Have you ever used these parts?
180µm for 100V, should be around 18µm for 10V, and as you are 3D printing your housing and absolute accuracy is not required for this application, I think these would be sufficient.
Wouldn't it have to be a pretty terrible lens to not be able to resolve 336x256, for example?
Do you have any ideas how to check this lens's capabilities?
For example, parts from different well-known manufacturers with common dimensions have common characteristics:
Kemet's AE0203D18H18DF from here: https://ru.mouser.com/datasheet/2/212/1/KEM_P0101_AE-1518874.pdf (https://ru.mouser.com/datasheet/2/212/1/KEM_P0101_AE-1518874.pdf)
PI's P-882.51 from here: https://static.pi-usa.us/fileadmin/user_upload/physik_instrumente/files/datasheets/P-882-Datasheet.pdf (https://static.pi-usa.us/fileadmin/user_upload/physik_instrumente/files/datasheets/P-882-Datasheet.pdf)
I have absolutely no experience with piezo actuators, but the Chinese ones look really strange. Maybe I'm wrong, not sure this is true. We need someone who can verify that.
At first you can probably do a simple check:Sure.
Can you get a sharp image of a point or edge source?
The real thing is called measuring the modulation transfer function (MTF) of the lens.
Basically you describe an image in terms of spatial frequencies: how quickly the luminosity changes across pixels.
Details equate to high frequencies, for example. Since you played with frequency-domain filtering of images, you'll quickly grasp how that works.
Think of lenses like low-pass filters: in order to get more details, you need to grab more high-frequencies.
Lens designers would probably match the cut-off to the targeted sensor pitch if it allows to build it cheaper.
From the article:Ok, so if I got it right, we need a faster lens if we want to decrease the pixel pitch. Probably, first we should determine the lens f-number.
Since the lens performance is essentially diffraction limited, there is only one way to increase the MTF of the lens so that it remains constant for the higher spatial frequency. That is to make the lens faster in the ratio of the change in pixel pitch.
You can probably get to half the pitch without too much trouble (or software sharpening), but more than that you'll probably notice the lens limitations.If I'm not wrong, not only lens limitations. As the pixel itself is not a point, i.e. it has some dimensions, we will also face pixel overlap at very small steps.
I have no experience with those yet. I did an experiment with a low cost piezo buzzer (with the intention of building an Scanning Fabry-Perot Interferometer like http://repairfaq.cis.upenn.edu/Misc/sale/sfpiins1.htm (http://repairfaq.cis.upenn.edu/Misc/sale/sfpiins1.htm)).
This is the type of piezo I am talking about: https://nl.aliexpress.com/item/4000120679339.html?spm=a2g0o.productlist.0.0.5cde25c3H0l9jK&algo_pvid=39a8075f-6e25-492a-911d-34217e99d7f2&algo_expid=39a8075f-6e25-492a-911d-34217e99d7f2-3&btsid=0bb0623416055593233587628ed6a4&ws_ab_test=searchweb0_0,searchweb201602_,searchweb201603_ (https://nl.aliexpress.com/item/4000120679339.html?spm=a2g0o.productlist.0.0.5cde25c3H0l9jK&algo_pvid=39a8075f-6e25-492a-911d-34217e99d7f2&algo_expid=39a8075f-6e25-492a-911d-34217e99d7f2-3&btsid=0bb0623416055593233587628ed6a4&ws_ab_test=searchweb0_0,searchweb201602_,searchweb201603_)
These have a ceramic layer of only approximately 0.2mm thick. It was connected directly to my signal generator with a 10Vpp output. A mirror was glued to the piezo, and distance was measured with a DIY interferometer (details here: http://www.repairfaq.org/sam/uMD1/ (http://www.repairfaq.org/sam/uMD1/)). This gave me a little less than 250nm pp. Taking into account the ceramic material was only 0.2mm thick, this would mean 6.25µm pp for a 5mm thick material. This is below the 18µm spec for 10V, but if similar specs, with a higher voltage, 17µm must be easily achievable.
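The extrapolation used here (peak-to-peak stroke scaled linearly with ceramic thickness at a fixed drive voltage) works out as follows. Note that the linear-with-thickness scaling is this post's assumption; it holds for stacks built from layers of a fixed thickness rather than for a single slab of bulk ceramic.

```python
def scaled_stroke(measured_pp_nm: float,
                  t_measured_mm: float,
                  t_target_mm: float) -> float:
    """Scale a measured peak-to-peak stroke linearly with ceramic
    thickness at the same drive voltage, as assumed in the post."""
    return measured_pp_nm * (t_target_mm / t_measured_mm)

# 250 nm pp measured on a 0.2 mm layer -> predicted stroke for 5 mm:
pp_nm = scaled_stroke(250, 0.2, 5.0)   # ~6250 nm = 6.25 um, as quoted
```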
To be sure, I also ordered a set (https://nl.aliexpress.com/item/32952294614.html?spm=a2g0o.productlist.0.0.4da739052Pa8qc&algo_pvid=aa231da8-f9e6-4c2d-97a4-be556ee5db3d&algo_expid=aa231da8-f9e6-4c2d-97a4-be556ee5db3d-0&btsid=0bb0624516055606928738744e86ab&ws_ab_test=searchweb0_0,searchweb201602_,searchweb201603_ (https://nl.aliexpress.com/item/32952294614.html?spm=a2g0o.productlist.0.0.4da739052Pa8qc&algo_pvid=aa231da8-f9e6-4c2d-97a4-be556ee5db3d&algo_expid=aa231da8-f9e6-4c2d-97a4-be556ee5db3d-0&btsid=0bb0624516055606928738744e86ab&ws_ab_test=searchweb0_0,searchweb201602_,searchweb201603_)) and will test them the same way.
This gave me a little less than 250nm pp. Taking into account the ceramic material was only 0.2mm thick, this would mean 6.25µm pp for a 5mm thick material. This is below the 18µm spec for 10V, but if similar specs, with a higher voltage, 17µm must be easily achievable.Agree, that makes sense.
To be sure, I also ordered a set and will test them the same way.Anyway, I will be very grateful if you share your results with us!
Experiment #2:
The distance to the object is about 0.4m:
Found this "datasheet" for AL1.65X1.65X5D-4F, according to it only 3,8um@90V :-//
https://img.alicdn.com/imgextra/i2/71977092/TB2wz4ybNaK.eBjSZFAXXczFXXa_!!71977092.png (https://img.alicdn.com/imgextra/i2/71977092/TB2wz4ybNaK.eBjSZFAXXczFXXa_!!71977092.png)
I'd say your lens looks definitely adequate to take advantage of superresolution.I hope that very much too. But I also want to try to prove this theoretically, if it is possible.
The ultimate limit will be the Airy disk dia, i.e. physics.Ok, let's calculate this Airy disk dia.
http://www.calctool.org/CALC/phys/optics/spot_size (http://www.calctool.org/CALC/phys/optics/spot_size)
Thank you for highlighting this situation as I have not previously looked at it for thermal lenses :-+Thanks to LesioQ and our cruel, but beautiful universe! ;)
The game doesn't end with calculating the airy disc diameter ! That's a common mistake.First, thank you for this article, it is really mind changing. I recommend everyone to read it.
What do you want to do next?
--> sample it at more than just one pixel per airy diameter!
Because you're not really interested in sampling the airy disc with one pixel, but sampling the tiny contrast (in amplitude AND space) resulting from merging two close diffracted spots.
You can sample at a pitch much less than the airy diameter.
I found that document that appears to explains it quite well (I had a quick glimpse at it but it seems good).
That one says a pitch of 1/6 of the airy diameter!Here is a quote from the article, explaining why we are talking about 1/6 of the airy diameter, for those who didn't find it:
Since we know from the Rayleigh criterion that Δxmin equals the radius of the Airy disc, it follows that the best 'match' between sensor characteristics and the optics is achieved when the detector pitch p equals one-third of the Airy disc radius. In other words: three pixels suffice to resolve the Airy disc radius entirely. The reasoning then goes that above that threshold, the optics diffraction limitation kicks in. So digitising the radius of the PSF's central part with more than three pixels does not improve the final spatial resolution of the image because the detector only resolves diffraction blur at that point.
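Putting numbers to the quoted rule, using the f/1.25 aperture and LWIR wavelengths discussed elsewhere in this thread as example inputs:

```python
def airy_radius_um(wavelength_um: float, f_number: float) -> float:
    """Airy disc radius: r = 1.22 * lambda * N."""
    return 1.22 * wavelength_um * f_number

# Example: lambda = 10 um (mid-LWIR), N = 1.25 (this thread's lens).
r = airy_radius_um(10.0, 1.25)   # 15.25 um Airy radius
pitch = r / 3                    # ~5.1 um pitch per the one-third rule
```

By that criterion a 17um-pitch sensor on this lens is well above the diffraction-matched pitch, which supports the case for sub-pixel shifting.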
One minor detail. The focal length given on a lens is a value when focusing your camera at infinity. In most cases a so-called effective focal length should be used instead, also when calculating the Airy disk diameter.
First, let's get the f-number of the lens.
According to this datasheet (table 6.1): http://www.safetyvision.com/sites/safetyvision.com/files/FLIR_PathFindIRII_User_Guide_1.pdf (http://www.safetyvision.com/sites/safetyvision.com/files/FLIR_PathFindIRII_User_Guide_1.pdf)
The lens focal length is f = 19mm (seems to be true). The aperture circle dia D = 15mm. In this way f-number = f/D = 19/15 = 1.26(6). The closest standard value is 1.25.
So, let's say that we have a f/1.25 lens.
Finally, for the LWIR spectral band of (8 - 14um) we have an Airy disk dia of (21.35 - 42.7um). Well... even the smallest value of this range is much larger than the actual pixel size for a 17um pitch technology...
Again, I'm not good enough with optics, but the result looks a bit weird to me, I didn't expect such values. Any ideas what I am doing or interpreting wrong? |O
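The numbers are easy to re-check with d = 2.44·λ·N. One small observation: the quoted 21.35um lower bound is exactly what the formula gives for λ = 7um, while the stated 8um band edge would give ~24.4um; either way the conclusion is unchanged.

```python
def airy_diameter_um(wavelength_um: float, f_number: float) -> float:
    """Airy disc diameter: d = 2.44 * lambda * N."""
    return 2.44 * wavelength_um * f_number

N = 19 / 15                             # ~1.27, rounded to f/1.25 above
d_7um  = airy_diameter_um(7.0, 1.25)    # 21.35 um (the quoted low end)
d_8um  = airy_diameter_um(8.0, 1.25)    # 24.4 um
d_14um = airy_diameter_um(14.0, 1.25)   # 42.7 um (the quoted high end)
```

So the arithmetic itself is fine; the surprise that the Airy disc exceeds the 17um pitch is real physics, and the posts in this thread about sampling at a fraction of the Airy diameter explain why that is not a contradiction.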
One minor detail. The focal length given on a lens is a value when focusing your camera at infinity. In most cases a so-called effective focal length should be used instead, also when calculating the Airy disk diameter.I'll be grateful if you explain it in more detail and attach a picture if possible. Effective Focal Length (EFL) is a bit of an overloaded term. The resulting focal length of multiple lenses is called EFL. Photographers also call EFL the focal length adjusted by the camera's crop factor. But it looks like you are talking about something else.
Please look at the link below.
Can't wait to see the boards populated and heading to first tests.Me too)
That looks so professional :-+Thanks! Doing my best)
Seems your project is going well. When will you accept beta-testers? ;D Planning a crowdfunding of some sort?I'm still accepting beta-testers, check out the very first post of this thread (I have updated it for clarity), you will find some details about beta testing. More information will follow.
Planning a crowdfunding of some sort?Sure, but first I must make sure that every single part of this complex device is working properly. I really don't want anyone to face hardware problems, not even beta-testers.
Result: with a 0 to 10 volt sine wave, I measure about a ±0.4µm signal, which matches the 3.7µm@90V from the datasheet...Thanks so much, _Wim_, for proving the characteristics! :-+
Kemet's AE0203D18H18DF from here: https://ru.mouser.com/datasheet/2/212/1/KEM_P0101_AE-1518874.pdfThough Kemet's one is available at both Digikey and Mouser, I hope to find something cheaper if it is possible.
PI's P-882.51 from here: https://static.pi-usa.us/fileadmin/user_upload/physik_instrumente/files/datasheets/P-882-Datasheet.pdf
Well, looks like the best candidates are still Kemet's AE0203D18H18DF and PI's P-882.51 (linked above).
Since I also like the inbuilt superresolution-stage idea, I redrew the XY-Stage as board in KiCAD and ordered a few. Mine are based on either MLCCs or those shitty AliExpress piezo actuators. However I have very low confidence in MLCCs as they most certainly will crack and the piezos from AliExpress seem to have too little travel. :/
Anyways this will be a fun little thing to play with. If someone wants a board or two, DM me :)
As you are doing your own mechanical design also, would integrating a lever to increase the stroke be an option?Thank you, yes, I thought about that. There are special series of APA (amplified piezo actuators): https://www.cedrat-technologies.com/en/products/actuators/amplified-piezo-actuators.html (https://www.cedrat-technologies.com/en/products/actuators/amplified-piezo-actuators.html)
_Wim_, thanks for your tests! I'm not sure, but probably the voltage level is too low. A man here could achieve 800 nm over 100 V with an 1812 capacitor: https://dberard.com/2015/08/16/mlcc-piezo-actuators/ (https://dberard.com/2015/08/16/mlcc-piezo-actuators/)
Also I found a piezo actuator driver HV56020, that integrates a step-up DC-DC converter (up to 225V) and two high voltage amplifiers for driving two piezo actuators.
Datasheet: https://ru.mouser.com/datasheet/2/268/HV56020-Data-Sheet-DS20006335A-1843786.pdf (https://ru.mouser.com/datasheet/2/268/HV56020-Data-Sheet-DS20006335A-1843786.pdf)
Mouser link: https://ru.mouser.com/ProductDetail/Microchip-Technology/HV56020T-V-KXX?qs=vmHwEFxEFR%252BS41dx8RulPQ%3D%3D (https://ru.mouser.com/ProductDetail/Microchip-Technology/HV56020T-V-KXX?qs=vmHwEFxEFR%252BS41dx8RulPQ%3D%3D)
The only problem with this HV56020 is poor documentation. I can't understand what kind of transformer is used, there is no information about it in the datasheet. I couldn't find any application notes or examples, evaluation boards, etc. Also, no support at the Microchip forum: https://www.microchip.com/forums/m1144592.aspx (https://www.microchip.com/forums/m1144592.aspx)
The schematics are attached:
https://www.coilcraft.com/en-us/products/power/coupled-inductors/1-n-shielded-coupled/lpr/za9735/ (https://www.coilcraft.com/en-us/products/power/coupled-inductors/1-n-shielded-coupled/lpr/za9735/)No way... :palm: Thanks, _Wim_ . Passive parts designed exclusively for certain ICs are so annoying! BTW, this inductor is not available at Mouser/Digikey, though at least we now know its parameters. Unfortunately there are still a lot of undocumented things, like the feedback resistor divider, diodes, DC-DC step-up caps, Rsht and so on. It is ridiculous to have such a useless datasheet for this quite non-standard application. The only hope is that Microchip will finally release an appnote or evaluation board. :-\
What OS are you using for the firmware? You might look into the Zephyr project. It is an embedded RTOS supported by the Linux Foundation.Hi! Oh... forgot to say, I'm using FreeRTOS. It is running on a 100MHz 32-bit MicroBlaze CPU; available RAM is about 8MB.
First, congratulations for your work, this is simply awesome !!Hi, thanks! Yes, unfortunately ImGui, which you recommended, is C++, though there is the cimgui project that wraps the original ImGui API: https://github.com/cimgui/cimgui
REMOVED: I didn't see that you asked for a C GUI, not C++
Looks like a very interesting project and I'm very impressed by the professional level of the design. Bookmarked for sure and seems like I might have to get one of those NV3 modules before summer :popcorn:Hi, thanks! Consider emailing me if you would like to become a beta-tester (details are in the first post of this thread).
Did you consider a helical barrel arrangement for focus instead of the screw? I think a focusing helicoid would be doable with small nylon washers.Hi! In fact, no, I haven't considered this way of focusing. I'm not really good with optics, but I think helical focusing mechanics are mostly needed for zooming, where large lens travel is required. But this camera isn't supposed to have zoom (except a digital one). The main idea was to overcome the fixed lens focus distance range limitations (the factory focus distance starts from ~3 meters). Fortunately, the 2mm lens travel of the screw-based focus is enough to cover distances from ~8cm to infinity. It is also quite cheap and reliable (almost no wear) and reuses quite a lot of original camera parts. Moreover, I have redesigned some parts and fixed some problems with speed, vibration and noise (will make a post later).
This massive one uses actual bearings but you get the idea.
I have been thinking about something that locks securely like a PL mount, but the flange distance I have to work with doesn't allow that at all, as one of the lenses I am planning to use is rather large. It will have to be mounted to rails or a plate, which makes changing lenses by turning and such more difficult.Why not use a special flange adapter to get the proper flange distance for your large lens?
I have bought the camera module (AUDI) and am ready to fabricate the PCB.I haven't finished testing the hardware yet; I hope to finish by the end of this month. Also, this is supposed to be a kit with fully assembled PCBs + enclosure and other mechanical parts, though for the most brave and experienced of you, I could ship bare PCBs with BOMs, assembly schemes, etc...
Do you have a current BOM or should I take from schematic?
Yesterday I came across the LVGL library: https://lvgl.io/ (https://lvgl.io/), github: https://github.com/lvgl/lvgl (https://github.com/lvgl/lvgl)
LVGL looks promising, worth a try. Any thoughts?
I highly recommend you go with LVGL. It's extremely simple to implement, can be augmented with 2D acceleration, and is fairly easily extensible.Hi, Spirit532! Thanks, going to try it!
I'm using it in a commercial project right now, no issues whatsoever.
Supported outputs:
1. 320x240 TFT LCD
2. USB UVC (a common webcam protocol)
3. HDMI 480p/720p (probably even 1080p with Spartan-7 FPGA)
Received an Audi 4G0980552 part.
The exact clock value is 73.636MHz. There is no encryption or any other secure communication between the camera and the FPGA. If you send the clock, a valid command and valid bias values, you will get valid data on both data lines (even + odd).VGN, do you know if the FPGA queries the sensor at all before receiving a key?
Just probed the DATA_EVEN and CLK lines: there is a ~75MHz clock and some activity on the data line (at least, as much as I could see with a DS1052). Judging from the captures in the Vue336 interface thread, DATA_EVEN comes from the sensor only, so it appears the FPGA is receiving data from the sensor even without any security key.
VGN, you mentioned the difficulty of obtaining bias values for each pixel. Looking at your HDL sources, I now understand that a bias value is sent by the FPGA to the sensor for each pixel?Yes, that is right. The FPGA sends a 6-bit bias value for each pixel. (In fact it sends 7 bits, but only 6 are significant.)
That is quite interesting; in that case, it seems to me that there is significant manufacturing variation in each microbolometer (that is, even more significant than the usual non-uniformity which is handled by the NUC).This is also true. There are two ways to solve this problem. Bias data and the command word can be retrieved from the original board's NOR flash (I have had some success here). The other way is to recalculate these values via a two-point calibration.
Can you share the bias values for your sensor? I wonder if there is any discernible pattern in the values.Follow this reply: https://www.eevblog.com/forum/thermal-imaging/openirv-isc0901b0-(autoliv-nv3-flir-e4568)-based-opensource-thermal-camera/msg3254678/#msg3254678 (https://www.eevblog.com/forum/thermal-imaging/openirv-isc0901b0-(autoliv-nv3-flir-e4568)-based-opensource-thermal-camera/msg3254678/#msg3254678)
Anyway, I'm especially curious about this, because I hooked my unit to a logic analyzer, and the FPGA doesn't seem to send any bias values at all. Consequently, after the per-frame command packet, I see something that resembles one image line, and then only the preamble on the data lines from the sensor. I wonder if it's a security feature (I've heard about a lockdown mode if the FPGA receives incorrect data on the NV2), or if my unit is somehow damaged. There does appear to be some activity on the SPI flash lines; I'll probably try dumping it next.This is OK; the camera does not output any bias values at this point, it waits for a command from the control unit. Don't worry, your unit is not damaged, since you see preambles on the data lines (10101010101010). The preambles are generated by the sensor, which means it is alive. As I said previously, there are no secure protocols between the sensor and the FPGA. The first three lines are pipeline garbage; they don't look like thermal data, with no correlation to the scene or sensor temperature. You will get the first image line on the 4th line of output. But right now you see zeros after the 10101010101010 preambles because you are not sending any bias at all. There is also a big pipeline delay (3 lines) between the transfer of the bias values and the pixel output. After the command transfer we start sending bias values, then we see 3 strange data lines from the sensor (I call it pipeline garbage), and only after that do we get valid thermal data. Check out the ISC0901_capture.v design file.
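To make the framing described above concrete, here is a minimal C sketch of the two checks a capture routine would do: match the alternating line preamble and skip the three pipeline-garbage lines. The 14-bit preamble width, the word layout and the helper names are my assumptions for illustration; the real logic lives in the Verilog (ISC0901_capture.v), not in C.

```c
#include <stdbool.h>
#include <stdint.h>

/* Assumed preamble: 14 alternating bits, 10101010101010 = 0x2AAA. */
#define PREAMBLE_MASK    0x3FFFu
#define PREAMBLE_PATTERN 0x2AAAu

/* First lines of every frame carry no thermal data (pipeline latency
   between sending bias values and receiving valid pixels). */
#define PIPELINE_GARBAGE_LINES 3

/* Check whether a captured word starts a sensor line. */
static bool is_preamble(uint16_t word)
{
    return (word & PREAMBLE_MASK) == PREAMBLE_PATTERN;
}

/* Only lines after the pipeline-garbage region hold real pixels;
   line 3 (0-based) is the first valid image line, i.e. the "4th line". */
static bool line_is_valid(int line_index)
{
    return line_index >= PIPELINE_GARBAGE_LINES;
}
```

With no bias values sent, the capture would see `is_preamble()` succeed on every line but only zeros after it, which matches the observation in the post above.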
Failing that, I wonder how detectors are characterized at the factory. Maybe it's as simple as a binary search on the bias value for each pixel that yields the most uniform output of the detector.I think you are right. There are only 64 levels of bias. We just have to take 128 pictures of two simple blackbody sources and find the bias value for each pixel that gives the best pixel dynamic output range. But the picture will not be uniform at this step; we will have to calculate an individual gain value per pixel to get a uniform image.
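The per-pixel bias search suggested above can be sketched as a binary search over the 64 bias levels, aiming each pixel's raw output at a mid-scale target while looking at a uniform blackbody. Everything here is hypothetical: `read_pixel_fn` is a placeholder for whatever capture path the real firmware would use, and the sketch assumes the pixel output grows roughly monotonically with bias, which real noisy pixels only approximate.

```c
#include <stdint.h>

/* Hypothetical capture callback: raw ADC value of pixel (x, y) at a
   given bias, while the sensor views a uniform blackbody. */
typedef uint16_t (*read_pixel_fn)(int x, int y, uint8_t bias);

/* Binary search for the bias (0..63) whose output is closest to
   `target`, giving the pixel headroom in both directions. */
static uint8_t find_pixel_bias(int x, int y, uint16_t target,
                               read_pixel_fn read_pixel)
{
    uint8_t lo = 0, hi = 63, best = 0;
    uint16_t best_err = 0xFFFF;

    while (lo <= hi) {
        uint8_t mid = (uint8_t)((lo + hi) / 2);
        uint16_t v = read_pixel(x, y, mid);
        uint16_t err = (v > target) ? (uint16_t)(v - target)
                                    : (uint16_t)(target - v);
        if (err < best_err) { best_err = err; best = mid; }
        if (v < target) {
            lo = (uint8_t)(mid + 1);
        } else if (v > target) {
            if (mid == 0) break;   /* avoid uint8_t underflow */
            hi = (uint8_t)(mid - 1);
        } else {
            break;                 /* exact hit */
        }
    }
    return best;
}

/* Simulated pixel for illustration only: output linear in bias. */
static uint16_t fake_read_pixel(int x, int y, uint8_t bias)
{
    (void)x; (void)y;
    return (uint16_t)(bias * 100u);
}
```

As noted in the reply, this only equalizes the operating point; a per-pixel gain correction is still needed afterwards to get a uniform image.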
I think you're doing too well by yourself for people to post useful comments.Thanks for your support! It really helps to keep working)
So in the end I can just say you're doing an amazing work :-+
I agree that there is some fantastic work going on. I only wish I had more time available to look into your design in more detail. I would like to have a go at interfacing a ULIS sensor to it in place of the FLIR one, as I have a couple without working cores attached (both VGA and QVGA).Thanks! I think this is possible, but you will have to make a special adapter board for your sensors, check and tune the power rails to suit ULIS power requirements, and implement an HDL design for the FPGA that captures raw data and feeds it to the image processing pipeline. While developing, I keep in mind that the image processing cores should be able to support other sensors and resolutions.
Which ones do you have? Most of them seem to have analog output, although some of the newer ones have digital output (which, although simpler protocol-wise, is not compatible with the ISC0901B0).For this case we have 20 high-performance IOs on the sensor connector of the M-board. I think this is enough to connect an ADC, which can be placed on a special adapter board. Yep, this is quite tricky, but worth doing if you want to bring alive some really cool and expensive sensor.
Which ones do you have? Most of them seem to have analog output, although some of the newer ones have digital output (which, although simpler protocol-wise, is not compatible with the ISC0901B0).The VGA sensor has analogue output only; the QVGA has digital as well (with a different package/pinout though :palm:). I would only be using the analogue option, and I have the power supply, biasing and ADC in hand; it's the HDL design and modification that would be the big job.
After a quick search.Thanks, I sent a letter to infinity-component.hk, let's see their answer. Though I agree with Hydron, parts availability and prices on this market are toooooo good to be true...
...
I have no experience with those suppliers, so...?
Thanks for your detailed answer Vaagn. It is really disappointing that we can't get all the parts right now. I also had a look and there seems to be no (trusted) supplier for the FPGA. That seems to be a global issue right now, same with graphics cards for PCs and the automotive industry. Can't wait to get a working camera.I'm not giving up! After a closer look I found that the XC7S50-2CSGA324C could theoretically be purchased on Mouser in two months, which is much better than waiting for September. I should look through the whole BOM one more time to understand the scale of this problem.
Another question:Sure! All parts are printed on a Formlabs Form 2; the resin type is Formlabs Grey. After thorough UV curing, the parts and their surfaces become so hard that friction is very low and there is no wear at all. The gears are really very hard to break; I wouldn't say they are fragile at all. I could make a crash-test video of the focus gears if you'd really like.
Do you know which material the 3D printing service used? I own a 3D resin printer and parts printed with normal resin are brittle and not very durable. PLA from an FDM printer is much more durable for gears and so on.
Hi, everyone. I have some good and bad news.
2. Bad news. Right now, I'm at the step of ordering components to manufacture devices for you. I decided to use my own funds to produce the first batch of devices, as
I believe that this will be the most safe and comfortable way for me and you.
When I was developing this device, I preferred to use the newest and most popular parts that could be easily ordered.
Well, this is sad, but today there is huge trouble with electronic components all over the world. I can't understand what's going on..., but most active parts have just disappeared from the market. For example, the most critical component of the OpenIRV device, the FPGA, is out of stock.
I'm sure it's one of the most asked questions, but before putting down my name as a potential buyer/beta tester, I was wondering if you are any closer to a rough idea of the price range (very rough). Or even a confidence interval (say, between $500 and $1,000)4) Any idea how much the kit minus the NV3 will cost? I'm moving house at the moment, so I'm counting the pennies and want to make sure I put some money aside. Even better if you could say something like: the circuitry will cost about X and the housing/gears/internal parts about Y, etc.I will provide this information as soon as I get the final BOM. But I'm going to keep the price competitive with the HT-301, TE-Q1, HT A2, FLIR C2 and other "cheap" thermals, though OpenIRV is going to be more powerful and multifunctional. Also keep in mind that the batch size affects the final price.
I'm sure it's one of the most asked questions, but before putting down my name as a potential buyer/beta tester, I was wondering if you are any closer to a rough idea of the price range (very rough). Or even a confidence interval (say, between $500 and $1,000)A very rough price for the kit is definitely going to be under $500.
I'm also trying to understand if your goal is to offer a full solution (even as a kit) that includes the thermal camera module, or if the plan is for people to source an NV3/Audi camera on their own (I see quite a few Audi cameras going for ~$500 on eBay)There is no way to ship thermal camera modules, because of the export/import restrictions of different countries, though it is possible to find the NV3 camera almost anywhere in the world. The goal is to design a kit that is almost ready to use. This photo shows the whole set of parts that are going to be reused from the original camera: https://www.eevblog.com/forum/thermal-imaging/openirv-isc0901b0-(autoliv-nv3-flir-e4568)-based-opensource-thermal-camera/?action=dlattach;attach=1022356;image (https://www.eevblog.com/forum/thermal-imaging/openirv-isc0901b0-(autoliv-nv3-flir-e4568)-based-opensource-thermal-camera/?action=dlattach;attach=1022356;image)
(I see quite a few Audi cameras going for ~$500 on eBay)This is quite a lot, probably because the part is new. Try looking for other part numbers:
Hi guys, it was such an interesting topic, but for some reason it stalled; I would like it to continue...Hello! No worries! The project is alive! I will report updates soon.
I also want to join the test; how do I get the test board? ThanksSolved in PM.
What are the test results in the new version, does everything work?
When do you expect the first test sets?
Some form of B.O.M. would be welcome, so it's clear what more is required for initial functionality testing, as it may be affected by parts availability...Sure, BOM will be published today.
I was researching the idea of a XY displacement enhanced thermal camera when I found this thread, very impressive VGN!Hi, thanks)
I have the feeling I am a few minutes late, but if you still need beta-testers let me know. I would absolutely join a crowdfunding and I am happy to help with beta-testing, documentation, ... if needed. Already ordered an NV3 (broken window, crossing fingers...) on eBay :-+You are not late; consider emailing me with the title "OpenIRV.Developer/Tester". You can find my email address in the attached pictures of the very first post of this thread.
Also the nudge to finally register to the forum and not only reading all the interesting stuff that's going on here...Welcome)
Is it just the primary lens or just the protective flat lens at the front of the unit?
VGN, can we use any FPGA, XC7S50 or XC7S25, with the 324 package?The XC7S25 cannot be used, mostly because IO BANK35 and BANK16 are NC on the XC7S25. Also, the design is too big to fit in an XC7S25.
Hi guys, this topic is very delayed; it would have been very relevant a year ago. Now the Chinese are tearing the thermal imager market apart with low prices and very good quality. I have been working with thermal imagers for a long time, and American FLIR has now receded into the background; it is very difficult to sell.
I would like to know what the price will be for boards without a sensor, at least approximately, so that it would be interesting to watch this topic: approximately 100, 200, 300 or 400 US dollars? If you answer, this will be a big plus; it interests everyone. ThanksIf you are asking about fully assembled PCBs, I cannot give an exact price right now; nobody knows when parts will be available or how the price will change.
How is the production going for you VGN?I have been pretty overworked the last three weeks. But no worries, I'm continuing.
Hi,VGNHello!
When I use 0x25 as bias, I get this image. Is my FPA damaged?
There shouldn't be a bigger problem with the camera itself (we have a lot of car parts), but what about the rest?Hi, thanks a lot. If you are asking about the OpenIRV hardware parts: yep, this is a big problem. Almost no FPGAs or some other parts are available in stock anywhere in the world, due to the global chip shortage. We can do nothing about it, just wait for this hell to end or find a way to deal with it. The initial plan was to ship fully assembled boards, but even people from the USA cannot buy parts, so I decided to soon start shipping only bare PCBs for those who were lucky enough to get parts and have the skills to assemble the PCBs.
I came across this interesting article about the ISC1406L sensor ("micron" teardown).You can easily find articles about other sensors, including ISC0901: https://www.systemplus.fr/wp-content/uploads/2017/06/SP17330_Autoliv_Night_Vision_FLIR_microbolometer_Sample_System_Plus_Consulting.pdf (https://www.systemplus.fr/wp-content/uploads/2017/06/SP17330_Autoliv_Night_Vision_FLIR_microbolometer_Sample_System_Plus_Consulting.pdf)
Hi, I've started looking at a Tau2 640x512 sensor in hope of interfacing it with this project. The sensor interface sounds like it might be identical to the ISC0901's. I've attached an image of what I think the sensor is doing. I guess the last two signals are pixel valid/enable like in the HDL? I can't see a CMD signal, however.Hi, Greg!
VGN, could the 3 lines of pipeline garbage you note in ISC0901_capture.v be values that reflect/help with row/column noise? Could they be unexposed pixels? My camera reliably has empty, data, empty for the three lines, and I feel like there's a chance it's not garbage.In fact, I still have no idea what these 3 lines contain. I couldn't find any correlation between the content of the first 3 lines and the actual scene.
Greg! Glad to see you in the thread finally.Thanks for the welcome; I was hoping to show up right before Xmas and get a present!
I think the last two pins in your image are enable and cmd; there is no pixel-valid strobe from the sensor.You're right, thanks. I can see a command once per frame on pin 10, and pin 11 is enable.
Thank you for your reminder. I found 2 sets of 336*256*2 bytes of data in the original SPI flash file. The data looks like 0xA0xx, 0x9Fxx. I used this data to generate this image. Are they bias data? (Attachment Link)Unfortunately no, these are gain values. These tables are in fact almost useless, as I already have an algorithm that can recalculate each pixel's gain using two uniform images with a temperature difference of about 30-40 degrees, which is called two-point calibration. This is quite easy to do. The quality of the thermal images you saw in this thread is based on this calibration. Bad pixels are also detected and marked for the pixel-replacement pipeline.
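The two-point calibration mentioned above can be sketched in a few lines of C: from two averaged frames of a uniform blackbody at two temperatures, compute a per-pixel gain and offset that map every pixel onto the frame means. The function names and the dead-pixel flagging are my own illustration, not the project's actual algorithm.

```c
#include <stddef.h>

/* Classic two-point NUC: given averaged uniform frames `cold` and
   `hot` (n pixels each), compute per-pixel gain/offset so that
   corrected = gain*raw + offset maps every pixel to the frame means. */
static void two_point_nuc(const float *cold, const float *hot,
                          size_t n, float *gain, float *offset)
{
    float mean_cold = 0.0f, mean_hot = 0.0f;
    for (size_t i = 0; i < n; i++) {
        mean_cold += cold[i];
        mean_hot  += hot[i];
    }
    mean_cold /= (float)n;
    mean_hot  /= (float)n;

    for (size_t i = 0; i < n; i++) {
        float span = hot[i] - cold[i];
        /* A near-zero span means the pixel barely responds: flag it so
           the bad-pixel replacement pipeline can take over. */
        if (span > -1e-3f && span < 1e-3f) {
            gain[i]   = 0.0f;   /* flag: dead pixel */
            offset[i] = 0.0f;
            continue;
        }
        gain[i]   = (mean_hot - mean_cold) / span;
        offset[i] = mean_cold - gain[i] * cold[i];
    }
}

/* Apply the correction to one raw pixel value. */
static float nuc_apply(float raw, float gain, float offset)
{
    return gain * raw + offset;
}
```

A 30-40 degree spread between the two blackbody frames, as suggested in the reply, keeps `span` large enough that the per-pixel division stays well-conditioned.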
Except for the 4 "dead rows", the main problem now is that it can only be used to detect objects with relatively high temperature; an image at about 40 degrees Celsius is submerged in noise.This is OK; as soon as you have the proper command pattern and bias values, you will be able to recalculate the pixel gains. Finally, after offset correction, which is done periodically by the internal shutter, your camera's image quality will match mine.
I'm intending to make a flex PCB to adapt the Tau2 through-hole pinout in my previous pictures to the Molex 52991-0408 to match the ISC0901B0 pinout.:-+
VGN, do you think it would be reasonable to have a common bitstream between the two FPAs if I use an ID resistor on the adaptor? Maybe a sensor ID in general might avoid frying an FPA?I haven't decided yet if a common bitstream is reasonable. Probably not, as it is going to be a waste of logic. Also, some FPAs will require different bias, core or VCCIO voltages, which can be tuned by a feedback resistor only. If some FPAs are very similar, I think a common bitstream could be used.
BTW, I think X1 and X2 are mixed up between the PCB and the documentation.This is because I haven't updated the schematics yet. The repo currently has v1.0.0, while the new boards are v1.1.0. I will upload the new schematics this week.
Hi, here's a picture of what I'm thinking of making. It's my second go at a PCB; anyone have suggestions? It's ~26mm x 30mm. It'll fold down 180 degrees and present the same interface as the ISC0901B0.Good job!
I'm assuming the OpenIRV M board is the same size as the autoliv board so I don't think it will fit in the Tau case.Right, I think a 3d printed adapter can solve this problem.
Right, attempt two! This one should allow an I2C or UNI/O EEPROM. I left the I2C option in because they go up to 2Mbit. I think with some compression you could store something interesting, like the bias values, with the sensor. With UNI/O it will only use one IO.The ISC0901B0 requires 84KB for a single bias frame. As far as I know, the original NV3 has 4 tables to cover a wide camera temperature working range, i.e. we should store 84x4 = 336KB of bias values, i.e. we need a 4Mbit EEPROM. But as far as I know, you can hardly find a 1-Wire, UNI/O or even I2C EEPROM of such capacity. For higher resolution sensors, we will probably require even more. Also, loading the bias table from an I2C EEPROM will be slower than from QSPI x4 @ 65MHz, which impacts startup time.
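The storage numbers in this exchange can be checked with some quick arithmetic (the 4-table count is as reported for the original NV3; byte-per-pixel storage is an assumption that matches the ~84KB-per-table figure, while the 6-bit-packed figure matches the ~64KB estimate in the follow-up):

```c
#include <stdint.h>

/* ISC0901B0 geometry and the table count quoted in the thread. */
#define FPA_WIDTH   336u
#define FPA_HEIGHT  256u
#define BIAS_TABLES 4u

/* One byte per 6-bit bias value: 336*256 = 86016 B, i.e. 84 KB. */
static uint32_t bias_table_bytes(void)
{
    return FPA_WIDTH * FPA_HEIGHT;
}

/* All four temperature-range tables: 4 * 84 KB = 336 KB,
   which is why a ~4 Mbit EEPROM would be needed. */
static uint32_t bias_total_bytes(void)
{
    return bias_table_bytes() * BIAS_TABLES;
}

/* Tightly packed 6-bit values: 336*256*6/8 = 64512 B, i.e. 63 KB. */
static uint32_t bias_packed_bytes(void)
{
    return FPA_WIDTH * FPA_HEIGHT * 6u / 8u;
}
```

So packing the 6-bit values only saves a quarter per table; compression would have to do the rest to fit small serial EEPROMs.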
I think we can only rely on VCC_SYS being present in the bootloader, and I've assumed the IO voltage will start off at 2.5V. Hopefully SENSOR_IO_0 could sense the pull-up to stay compatible with an FPA that needs every pin(?). VGN, would you be interested in using this in the bootloader to ID other sensors?We can also rely on VCCIO_FPGA and VCCIO_FPGA+SENSOR (controlled via the Q3 MOSFET). The bootloader can switch VCCIO_FPGA+SENSOR on, which will power a 1-Wire ID EEPROM that can communicate with the FPGA over pin 19 of the X2 connector, which already has a pull-up resistor R6. Though I think this resistor should be a bit stronger; probably 4.7K will be fine for 1-Wire. How about AT21CS or 11LCxxx EEPROMs?
Switching the IO voltage after the bootloader could be a problem, though. What if you passed VCCIO_FPGA through? These EEPROMs can do 1.8V to 5.5V; I could get rid of the regulator and they'd follow the IO voltage.This is already done) Just look through the schematics again. BTW, I have uploaded the latest v1.1.0 schematics. VCCIO_FPGA+SENSOR is the same VCCIO_FPGA, controlled via the Q3 MOSFET.
The ISC0901B0 requires 84KB for a single bias frame. As far as I know, the original NV3 has 4 tables to cover a wide camera temperature working range, i.e. we should store 84x4 = 336KB of bias values, i.e. we need a 4Mbit EEPROM. But as far as I know, you can hardly find a 1-Wire, UNI/O or even I2C EEPROM of such capacity. For higher resolution sensors, we will probably require even more. Also, loading the bias table from an I2C EEPROM will be slower than from QSPI x4 @ 65MHz, which impacts startup time.I was thinking that because the bias is 6-bit, it's really only ~64KB per bias array. A lot of the data should be similar(?) and compress well. But I didn't realise there were at least 4! And slow too; I take your points. I wonder how compressible they are, though.
We can also rely on VCCIO_FPGA and VCCIO_FPGA+SENSOR (controlled via the Q3 MOSFET). The bootloader can switch VCCIO_FPGA+SENSOR on, which will power a 1-Wire ID EEPROM that can communicate with the FPGA over pin 19 of the X2 connector, which already has a pull-up resistor R6. Though I think this resistor should be a bit stronger; probably 4.7K will be fine for 1-Wire. How about AT21CS or 11LCxxx EEPROMs?I didn't read the schematic very carefully, sorry. I assumed the regulators were being digitally adjusted with PWM or something, so I was thinking we had to decide which voltages to program and make sure not to enable anything in the bootloader... :palm:
There are 4 groups of 336*256*9-byte data in the flash; are they encrypted bias data?Not only bias, but also a complex offset table, some kind of gain, etc. There are 4 tables to cover wide temperature ranges. The original tables are quite complex; as I understand it, they hold precalculated offset values to minimize the need for frequent shutter calibration. I decided not to use this model. Bias, gain and offset tables are enough for a handheld device.
Hi Vaagn,
do you think it is possible to order the stencils for most of the PCBs?
I think otherwise it will be a pain to assemble them.
Hello. I haven't decided yet. At least I'm going to open-source the stencil gerbers, but I'm not sure if they will be included in the kit. On the other hand, you are right, assembling the M-board and P-board is a pain, though I have assembled them without stencils.
Hi, it would be great if you could just release the stencil gerbers, so everybody who needs them can order them themselves. It won't be that expensive. I want to use my reflow oven for assembly. Thanks for your answer. I'm already excited about the upcoming news on the development status.
Does anyone else have a Tau 2?I have a Tau1 640 with a broken ribbon-cable connector on the green board; does your solution fit the Tau1?)
I have a Tau1 640 with a broken ribbon-cable connector on the green board; does your solution fit the Tau1?)I'm not sure, sorry; I've never seen a Tau 1. I think you'd have to replicate the investigation I did earlier in the thread. I don't even know if the adaptor works with the Tau 2 yet, though ;D I guess as long as the power lines are correct, the rest could be sorted out in the FPGA? I think this PDF: https://flir.netx.net/file/asset/12412/original/attachment (https://flir.netx.net/file/asset/12412/original/attachment) suggests the Tau 2 is an incremental upgrade over the Tau 1. It does mention higher frame rates, but all the features look like they could be FPGA improvements to me. Anyone else know?
I'm in the UK and possibly interested... Just need more spare time and funds for this project.This project will help you see how quickly you're burning through money though
Could you share a CPLD dump?
I have little experience with UAVs, and absolutely no experience with FPV, FPV cameras, interfaces, etc... Are you still flying on NTSC or PAL? ;D
That's why I have some question to you and community:
1. What do you think about this addon? Is it worth doing?
2. What about total weight and dimensions?
3. What kind of interfaces do FPV pilots use nowadays? Is AV enough? What is the output format (NTSC/PAL/other)?
4. What else is reasonable to add to the board?
In my projects, I like to implement such interactions over UART:
- 2 bytes for synchronization;
- an information byte with settings flags (palette, calibration, autofocus);
- an information byte with 255 gradations of focus;
- the last byte is the modulo-256 sum of the two information bytes.
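The frame layout described above can be sketched in a few lines of C. The two sync byte values (0xAA 0x55) are an assumption for illustration; the post only says "2 bytes for synchronization". The checksum is the modulo-256 sum of the two information bytes, as stated.

```c
#include <stdbool.h>
#include <stdint.h>

/* Assumed sync bytes; any fixed pair unlikely in the payload works. */
#define SYNC0 0xAA
#define SYNC1 0x55

/* Build the 5-byte frame: sync, sync, flags, focus, checksum. */
static void build_frame(uint8_t flags, uint8_t focus, uint8_t out[5])
{
    out[0] = SYNC0;
    out[1] = SYNC1;
    out[2] = flags;                    /* palette / calibration / autofocus bits */
    out[3] = focus;                    /* focus position, 0..255 */
    out[4] = (uint8_t)(flags + focus); /* modulo-256 sum of the info bytes */
}

/* Validate sync and checksum; on success extract the two info bytes. */
static bool parse_frame(const uint8_t in[5], uint8_t *flags, uint8_t *focus)
{
    if (in[0] != SYNC0 || in[1] != SYNC1)
        return false;
    if ((uint8_t)(in[2] + in[3]) != in[4])
        return false;
    *flags = in[2];
    *focus = in[3];
    return true;
}
```

A fixed 5-byte frame like this is easy to resynchronize on a noisy UART link: the receiver just scans for the sync pair and rejects anything whose checksum fails.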
I think, for unification, you could reuse a UART protocol such as the FLIR Photon's or Tau's (there is a detailed open description of the commands; the file is attached). At least as a starting point.Yes, good idea
I don't have the technical knowledge to understand exactly what is going on, but I heard that the camera can be activated by following the URLs below.I know of similar projects, but they use the original video processing unit from the car. The activator works with that unit.
https://github.com/pavelmalik/BMWCanBridge (https://github.com/pavelmalik/BMWCanBridge)
https://www.bmwcustomretrofit.com/navigation-retrofits/BMW-Night-Vision-System-Activator (https://www.bmwcustomretrofit.com/navigation-retrofits/BMW-Night-Vision-System-Activator)
These are not broken pixels; this is something else, I don't know what. The pixels will be visible on the sides in a clean image.
Hi, are you still able to get SEM images of a sensor if I send you a full camera?
I have a few spare NV3 cameras lying around that could be put to good use...
Denis_K, we could cooperate; the advantages are:
- logistics chains between our countries are well established;
- in connection with the current terrible events, several of my colleagues are in Armenia (next to the author);
- I have access to the European auto-parts market, where the NV3 is much cheaper than in our countries.
Hello!
Have added datasheets for LCD and focus motor: https://github.com/OVGN/OpenIRV/tree/master/docs/BOM/BOM_v1.1.0
You can easily find both. Note that a lot of manufacturers make the same 6mm geared DC motors.
Use the provided motor datasheet as a reference. The key parameters are diameter (6mm), gear ratio (~1:26), shaft geometry and voltage (nominal 3.0V).
Hi VGN, hope everything is OK. Chipageddon is less serious now. Will you publish the source, only the binaries, or do you have something else in mind?
Daniel
Now that I've obtained the FPA and have some FPGAs on order, I'm working with some coworkers to redo this from the baselines that exist here and take it in our own direction, as it appears VGN has gone dark.
https://github.com/festlv/isc0901b0-breakout (https://github.com/festlv/isc0901b0-breakout)
This is the project the USB webcam converter is based on. This one is "just" a breakout board that supplies power. It does have unidirectional shutter control; however, on his board VGN used an H-bridge driver to give bidirectional shutter control (which also gave him a driver for the focus motor, as it's a dual H-bridge driver). The shutter I have definitely needs bidirectional control to return fully to the open position, but it's somewhat damaged, so I'm not certain this applies to undamaged shutters.