Acute Logic & Lead CCTV PE-1005S HD Camera Module
OzOnE:
Finally sending all 44 bytes (plus the address) to the cam, and the Wire library is now saying Success on each try. ;) I had to update the ESP32 library manually... http://wei48221.blogspot.com/2018/04/arduino-esp32-how-to-update-to-latest.html

Then I was seeing the first response byte as 0x03, which is RAE (Recovered ALT Bit Error). So you have to set the SR bit on the very first command, to reset the whole protocol state on the cam, which also copies the current ALT bit from packet byte 0 (sent from the ESP) into the cam. The cam should then toggle the ALT bit on each successive command. The ESP should read back that bit, invert it, then write it on the next command.

Sheesh. Talk about going overboard on the error correction. lol I mean, it already has a checksum, and refuses any packets with fewer than 44 data bytes, so it seems a bit excessive to have that whole ALT bit mechanism too.

Anywho, I can FINALLY read back the serial number from the cam. :) Now going to try the zoom command again.

It's possible the camera is in the IDLE state at power-on. Setting it to DVC (Digital Video Cam) or DSC (Digital Stills Cam) mode requires configuring the interrupt type, sending a "Mode Change" request command, then triggering an interrupt. I think if the cam was already in IDLE mode, though, it wouldn't be outputting video at all yet, so fingers crossed.

EDIT: Awesome - It ZOOOOMED! lol
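For reference, here's a rough sketch of how that 44-byte send and the SR / ALT handshake could look in the Arduino code. The I2C address, the bit positions of SR and ALT within byte 0, and the checksum scheme are all placeholder guesses for illustration, not values from the actual protocol document:

#include <Wire.h>

// Rough sketch of the 44-byte command / SR / ALT handshake described above.
// CAM_I2C_ADDR, the bit positions of SR and ALT within packet byte 0, and the
// checksum scheme (8-bit sum in the last byte) are placeholders/guesses only.

const uint8_t CAM_I2C_ADDR = 0x48;   // hypothetical 7-bit address
const uint8_t PKT_LEN      = 44;     // 44 data bytes per command packet

const uint8_t SR_BIT  = 0x80;        // assumed position of the protocol-reset flag in byte 0
const uint8_t ALT_BIT = 0x01;        // assumed position of the alternating bit in byte 0

static bool altBit = false;          // our copy of the ALT bit

bool sendCamCommand(uint8_t *pkt, bool firstCommand) {
  // SR on the very first command resets the cam's protocol state and
  // latches our ALT bit from byte 0 into the cam.
  if (firstCommand) pkt[0] |= SR_BIT;
  if (altBit) pkt[0] |= ALT_BIT; else pkt[0] &= ~ALT_BIT;

  uint8_t sum = 0;                             // assumed checksum: 8-bit sum of bytes 0..42
  for (uint8_t i = 0; i < PKT_LEN - 1; i++) sum += pkt[i];
  pkt[PKT_LEN - 1] = sum;

  Wire.beginTransmission(CAM_I2C_ADDR);
  Wire.write(pkt, PKT_LEN);
  return (Wire.endTransmission() == 0);        // 0 is the "Success" the Wire library reports
}

// The cam toggles its ALT bit on each response; we read it back, invert it,
// and use that value for the next command.  A status byte of 0x03 is RAE
// (Recovered ALT Bit Error), meaning the handshake slipped and needs the SR reset again.
void updateAltFromResponse(const uint8_t *resp) {
  altBit = !(resp[0] & ALT_BIT);
}

The point of the alternating bit is presumably to catch dropped or repeated packets on top of the checksum, which is why losing sync shows up as that RAE status until SR resets everything.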
OzOnE:
YCbCr to RGB on the FPGA is almost working now. The image isn't aligned properly, and will have some glitches, since it's not in sync with the HDMI test pattern gen yet. (It positions the image randomly each time the FPGA is configured, and then it very slowly creeps up the screen. The test pattern gen is driven from the camera's 66 MHz clock via a PLL, so the clocks are roughly in sync, but Hsync / Vsync are not.)

For some reason, the usual Green output from the YCbCr block isn't working right yet - it's constantly maxed out. You can see the colour fringing on the first image, where I'm using the Luma signal direct from the camera for the Green output. The Cb and Cr signals have a slight delay from the MAC (Multiply-Accumulate) operation. On the second image, I've delayed the Green data by one pixel clock. Still not quite perfect, but this will all get sorted eventually.

This is 720p@30Hz, but you can see the aspect ratio isn't quite right either, due to the different pixel clocks and the crude line buffer. Even then, the overall image quality looks very nice already. Please keep in mind this is not a direct HDMI capture either - I'm using analog Component from the tvONE scaler box to an AVerMedia H727 capture card. Long story. lol

And yes, the camera is upside-down atm. The first image was as-is, but with the Rasp Pi box upside-down. The second image was flipped in Paint Shop Pro, as I haven't got the vertical flip option working on the camera itself yet.

I've added some more timeout stuff to the ESP32 code, which seems to be working OK now. I want to add my OSD block next.

Maybe the final version of the PCB could have the ESP32 module already on it, which would allow for very fast boot times. It could also be used to control the camera from a simple web page. I don't think it would be trivial to stream the video itself via the ESP32 as well, unless the FPGA can be made to output the video data in the same format as the OV2640, perhaps... https://randomnerdtutorials.com/esp32-cam-video-streaming-face-recognition-arduino-ide/ The "MIPI" connector on this cam board only has nine FPGA IO pins available, but that might be just enough for BT.656-style video (8 bits, plus the clock).

Here's the current Arduino code if anyone finds it useful... https://mega.nz/#!v8JHiQTS!Di-HcoJxq6NW58v9Fns0MAcfSaoBNZjyOY1GwszaF1U
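As an aside on the Green output issue above, here's a minimal fixed-point sketch of the YCbCr-to-RGB maths. Full-range BT.601 coefficients are assumed here purely for illustration; the cam's 720p output is more likely BT.709 with studio-range levels, so the real multipliers and offsets in the FPGA's MAC will differ. It mainly shows why raw Luma is only an approximation of Green - the proper Green term subtracts scaled Cb and Cr, which is where the fringing and slightly washed-out look come from:

// Minimal fixed-point YCbCr -> RGB sketch (full-range BT.601 coefficients, Q8 fixed point).
// The real conversion on the FPGA probably uses BT.709 / studio-range values instead;
// this just shows the structure of the maths the MAC implements.

static inline uint8_t clamp8(int v) { return v < 0 ? 0 : (v > 255 ? 255 : v); }

void ycbcrToRgb(uint8_t y, uint8_t cb, uint8_t cr,
                uint8_t &r, uint8_t &g, uint8_t &b) {
  int d = cb - 128;                              // centre the chroma channels
  int e = cr - 128;
  r = clamp8(y + ((359 * e) >> 8));              // R = Y + 1.402*Cr'
  g = clamp8(y - ((88 * d + 183 * e) >> 8));     // G = Y - 0.344*Cb' - 0.714*Cr'
  b = clamp8(y + ((454 * d) >> 8));              // B = Y + 1.772*Cb'
}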
OzOnE:
Wow. The macro on this thing is insane. These boards were only about 12mm from the lens, and the Auto Focus worked no problem. :o
Yansi:
I'm starting to think I should get a cam or two as well, since the support HW for it seems quite simple - I expected much, much more trouble with this. It would probably be a good motivator for me to improve my VHDL skills. Are any cams still available?
OzOnE:
Took quite a while to get the OSD working, mainly because the differences in the Arduino SPI functions on the ESP32 kept tripping me up. I was also being very lazy initially, and not soldering a dedicated ground wire for SPI, instead relying on the one ground wire already in use for I2C. Of course, ground bounce caused glitches on the SPI SS_N signal and screwed everything up.

My OSD code basically allows you to embed graphics tiles within the text lines (4BPP, same as the Genesis/MD). I do need to change that Sonic graphic, though, as it's been like that since about 2013. lol Obviously the OSD window is very small right now, but it can be resized quite easily, and positioned anywhere on the screen.

The next thing I want to do is hook up a small joystick module or a few buttons, so I can control the zoom and focus directly, or via the OSD. I'm trying not to add too much feature creep, as it's getting relatively complex already.

The camera I2C functions are working quite reliably now. I added a timeout / retry for the response data, fixed the response check itself, and I'm now also doing a checksum on the response, to see if it matches what is expected. I can also change the video output format, but it looks like it requires an interrupt pulse before the change takes effect. That doesn't seem to be mentioned specifically in the manual, but I could be missing something.

It's working better at 720p@60Hz atm. It's a better match to the 720p pattern gen, but still not properly synced, so every time I configure the FPGA, the whole image rolls vertically. But the image does at least stay put afterwards. Looks like 1080p is having trouble with my bodge wires, which was fully expected.

The image probably isn't supposed to have that "Matrix" style green tint to it, but I'm still using the Luma channel for "Green" atm. I don't quite get how Luma is working so well for green, as it would surely contain brighter areas for the red and blue content? Maybe that's why the image looks a tiny bit washed out?

The OSD logic alone is taking up around half of the FPGA, but there are definitely some savings to be made there. (It has a way of displaying a much larger image which can scroll a tile map, like a game does. I'll probably remove that.) SignalTap is taking up about a fifth of the logic, too. In total, I'm using 80% of the FPGA.
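A rough sketch of what that response timeout / retry plus checksum check could look like. The response length, retry count, delay, and checksum scheme are assumptions, and the I2C address is the same placeholder as in the earlier sketch, not the camera's real one:

#include <Wire.h>

// Placeholders again (see the earlier sketch): address, response size,
// retry count and checksum scheme are all guesses for illustration.
const uint8_t  CAM_I2C_ADDR   = 0x48;
const uint8_t  RESP_LEN       = 44;   // assumed: responses mirror the 44-byte command size
const uint8_t  MAX_RETRIES    = 3;
const uint16_t RETRY_DELAY_MS = 20;

bool readCamResponse(uint8_t *resp) {
  for (uint8_t attempt = 0; attempt < MAX_RETRIES; attempt++) {
    uint8_t got = Wire.requestFrom(CAM_I2C_ADDR, RESP_LEN);
    if (got == RESP_LEN) {
      for (uint8_t i = 0; i < RESP_LEN; i++) resp[i] = Wire.read();

      // Verify the response checksum (assumed: 8-bit sum of all bytes except the last).
      uint8_t sum = 0;
      for (uint8_t i = 0; i < RESP_LEN - 1; i++) sum += resp[i];
      if (sum == resp[RESP_LEN - 1]) return true;
    }
    delay(RETRY_DELAY_MS);   // give the cam a moment before retrying
  }
  return false;              // caller can re-send the command or report the failure
}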