Author Topic: Help request to finish coding a Kodak DCS (first commercial DSLR) storage unit.


Offline pieroc91Topic starter

  • Contributor
  • Posts: 46
  • Country: ar
Hi everyone!

TL;DR: I have a DueProLogic FPGA board based on an Altera Cyclone IV. At the moment I've coded enough to command the camera to stream the sensor data, 8 bits in parallel at 6 MHz, and with the appropriate clock signal I can read it back and dump it on some LEDs. Now my coding skills have reached their limit and I need help storing or transferring that data.

TL;RA (Too Long Read Anyway):

I'll start with a little bit of story. Back in 2018 I got a Kodak DCS, the very first commercial DSLR. Made in a small run, it is a very important item that shaped today's life: it paved the way for every digital camera we have, including the one on top of your display or on the back of your phone. The one I got was incomplete and in very rough shape; most importantly, the DSU (Digital Storage Unit), which contains most of the logic, was missing.
In a quick Googling session I came across the schematics and original source code: Jim McGarvey, who was the lead engineer of the design, took care to put a lot of resources from the camera online. Having all this, I came up with the idea of replacing that bulk of PLDs and ancient storage technology with something more modern; at that moment I thought of an Arduino.
Not understanding much of what I was doing, I sent an e-mail to Jim McGarvey himself telling him about my idea. A few days later, to my surprise, Jim answered. He told me he had had the same idea, but that I would need an FPGA since timing is critical, and even more surprisingly he sent me a scan of an original hand-written paper defining the communication protocol between the winder (the part I have) and the DSU. That was more than enough motivation to still be here 5 years later keeping up with it.

I originally started posting about the project over here https://www.nikonweb.com/forum/viewtopic.php?t=841 .
The progress and the excitement are clearly visible in those posts. I got farther than I would've ever imagined and even got to fully control the camera and transmit a few light-sensitive bytes, but given my almost null knowledge of high-level coding, interfacing an SD card or negotiating with a PC is really far from what I can accomplish. Still, I feel the project stopped very close to the point where results can be seen.

So, here's where things stand now: I've travelled back and reunited with all the parts, tested everything, and it works exactly as it did the day I stored it. Since I have some time to spend on it, I'm reaching out to the community looking for someone interested in this project.

I really appreciate any comment on the subject. Even if you can't help, a thumbs up is good enough for me, so thanks to everyone reading this, and I hope we can breathe some life back into the DCS.

 

Online Berni

  • Super Contributor
  • ***
  • Posts: 4946
  • Country: si
Impressive amount of effort into that thing.

It shouldn't be hard to transfer an image over a UART serial port to a computer. An excellent excuse to learn some programming for it.

It would be more difficult to create a self-contained unit with an MCU, LCD, SD card, etc. You need a fair bit of embedded programming to get all of that working and running fast. Still, you could use Arduino stuff as a quick jumping-off point.
 

Offline pieroc91Topic starter

  • Contributor
  • Posts: 46
  • Country: ar
Hi Berni, thanks for your answer, and yes, you are absolutely right about everything.

It is indeed a lot of effort that I've put into it, but I loved doing it and learnt a lot along the way; I'd do it again if I had to.

About the UART: that's also the way to go, but then I have to learn about coding for a modern PC operating system, and that's a completely different topic. I still tried it based on some examples included with the FPGA board and the results were far from useful; the computer side overflowed very quickly and it was also very slow... real coding skills were required there.

And lastly, the self-contained unit is the final step, and yes, using a dedicated MCU is the way to go. An Arduino might or might not work; it would need to be a really fast one to keep up with the speed of the UART and to write fast enough to a high-speed SD card (the data is not being buffered in dedicated DRAM like in actual cameras), and regular ATmega328-based ones cannot do that at all. Using a Raspberry Pi could be more reasonable; it is powerful enough to do all the tasks without a problem.
Once I tried adding a small 128x96 I2C OLED display directly to the FPGA and bit-banging the data to it... that was a whole lot of nonsense.
 

Online Berni

  • Super Contributor
  • ***
  • Posts: 4946
  • Country: si
You can use serial terminal programs (like PuTTY, TeraTerm, RealTerm, etc.) to dump the data from a serial port directly into a file; this can help you debug and make sure the data is making it in. If you are using Linux you can just pipe the serial port device directly into a file from the terminal.

Serial ports are pretty slow, so it should not be hard to make an application that receives at a data rate of a few Mbit per second. The main thing to keep in mind when writing software under an OS is that you no longer have full control over when the CPU will be executing your code. So for servicing a serial port one would typically have some sort of event fire periodically (be it a timer every few milliseconds, or a receive event from the serial port). Inside that event you keep taking data out of the serial port for as long as there are bytes in the buffer, looping in place to pump it out; this will drain the buffer near instantly (since PCs are very fast), then you exit the event and wait for a new one. PCs always tend to have a few kilobytes of buffer and you can usually increase it into the megabytes if you want (but it doesn't make sense to do that when you only have a transfer rate of around 300 KB/s).
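For what it's worth, here is a minimal POSIX C sketch of that drain-the-buffer loop, reading a serial device into a file. The device path and baud rate are placeholders, not anything specific to this project:

Code:
/* Minimal sketch: drain a serial port into a file.
   /dev/ttyUSB0 and the baud rate are placeholders. */
#include <stdio.h>
#include <fcntl.h>
#include <unistd.h>
#include <termios.h>

int main(void)
{
    int fd = open("/dev/ttyUSB0", O_RDONLY | O_NOCTTY);
    if (fd < 0) { perror("open"); return 1; }

    struct termios tio;
    tcgetattr(fd, &tio);
    cfmakeraw(&tio);                 /* raw bytes, no line discipline */
    cfsetispeed(&tio, B115200);      /* placeholder baud rate */
    tcsetattr(fd, TCSANOW, &tio);

    FILE *out = fopen("capture.raw", "wb");
    unsigned char buf[4096];
    ssize_t n;
    while ((n = read(fd, buf, sizeof buf)) > 0)   /* keep pulling until the buffer is drained */
        fwrite(buf, 1, (size_t)n, out);

    fclose(out);
    close(fd);
    return 0;
}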

Something like C# makes writing Windows apps really easy, as the IDE sets up all the GUI for you and the language is much like plain old C. The only real caveat is the garbage collection, so you just need to be mindful of when memory is being allocated; everything else is very fast.
 

Offline pieroc91Topic starter

  • Contributor
  • Posts: 46
  • Country: ar
Yeah, classic serial ports are out of scope because of speed; the camera's transfer rate is 8 parallel signals at a 6 MHz clock, so 6 MB/s, and doubling that data rate would be very nice to avoid data corruption.
The FPGA board has an FT2232H chip which can create a real UART over the USB port (plus other cool modes), but that requires talking to a specific driver and writing some code to do it.
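For reference, a rough sketch of the PC side talking to that chip through FTDI's D2XX library. FT_Open, FT_GetQueueStatus and FT_Read are the standard D2XX calls; the device index, file name and the ~1.3 MB stop condition are just placeholders:

Code:
/* Sketch only: pull bytes from the FT2232H via D2XX and append them
   to a file, stopping after roughly one image worth of data. */
#include <stdio.h>
#include "ftd2xx.h"

#define IMAGE_BYTES 1300000UL   /* placeholder image size */

int main(void)
{
    FT_HANDLE ft;
    if (FT_Open(0, &ft) != FT_OK) { fprintf(stderr, "FT_Open failed\n"); return 1; }

    FILE *out = fopen("capture.raw", "wb");
    unsigned char buf[65536];
    unsigned long total = 0;

    while (total < IMAGE_BYTES) {
        DWORD queued = 0, got = 0;
        FT_GetQueueStatus(ft, &queued);      /* bytes waiting in the driver buffer */
        if (queued == 0) continue;           /* real code would sleep or use event notification */
        if (queued > sizeof buf) queued = sizeof buf;
        if (FT_Read(ft, buf, queued, &got) == FT_OK && got > 0) {
            fwrite(buf, 1, got, out);
            total += got;
        }
    }
    fclose(out);
    FT_Close(ft);
    return 0;
}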

Here is the board and its characteristics https://earthpeopletechnology.com/products-page-2/modules/dueprologic (mine is the same but without the LED array).

C# is what's used in the included examples, but my knowledge of high-level coding is near zero; the last time I tried was with Visual Basic 6 and the results weren't great either.

I tried modifying the examples and understood what the driver expects, but when it came to dealing with the data inside the C# code I only managed to crash the program, or not get it to compile at all.

This was the fastest I could get data to transfer without overflowing something or losing a lot of it.

On an important matter, the data must be read as fast as possible. The CCD, once exposed to light, retains the picture, and via some sort of analog shift register it spits out one pixel at a time at a fixed 6 MHz rate to the ADC, which converts it to 8 parallel bits. When the current line of pixels has been read completely, the sensor must be clocked and the whole picture shifts by one line so the next line falls into the shift-register buffer.

The only possible moment where the CCD can take a pause is between lines. For testing purposes this is kinda okay, but for final usage it is not acceptable, since the CCD itself will start collecting noise from ambient heat and the image will end up severely degraded.
 

Online Berni

  • Super Contributor
  • ***
  • Posts: 4946
  • Country: si
Getting data into a PC this fast is a different matter.

You generally can't rely on things like USB to run that fast without a fair bit of buffering. The USB controller in a PC is often handling multiple devices over a built-in "hub", and due to the way USB 2.0 works there is a lot of CPU involvement in moving data, so USB transfer rates can hitch if the computer is very busy. Though 6 MB/s is only 48 Mbit, so it's definitely possible to do, but it does move up from being just a serial port.

If you want to grab photos from a camera sensor in real time, the way to do it is with chips designed for webcams that take an RGB video stream and push it over USB. This makes things easier on the PC side, since it has drivers specifically for dealing with live video data, making them fast.

But if you are going after capturing just one still image, then it makes way more sense to first capture your image into RAM as fast as possible and then slowly transfer it over at any pace you want. I would guess the resolution of the camera is not massive, and SRAM chips of a few megabytes are reasonably cheap. Stick that on your FPGA and you can pump data in as fast as you want.
 

Offline pieroc91Topic starter

  • Contributor
  • Posts: 46
  • Country: ar
Yes, UART at 48 Mbit is quite doable, and since the final design is to have a dedicated MCU to crunch the data it was actually a pretty good way to go. When testing on a PC the CPU usage doesn't matter, since it is dedicated to only doing that; I think it's the easiest way to go.

Adding some RAM was part of the original scope; each picture is only 1.3 MB. The thing about RAM is the refresh cycles: that's a lot to code and to wire, and I would still be left with the transferring issue. My idea before the UART was to use a modern, very high-speed SD card which can do 48 Mbit/s without a problem and fix two problems at once, even without a file system: use it as a block device, add an end-of-file marker, and make individual files from the PC.
This could also be used with the dedicated MCU by sharing the SD card's bus: first the FPGA saves the data stream to the SD card at a specific address and tells the MCU when it has finished, then the MCU creates a file with the content of that address. Since the data is already stored, the MCU can do this at whatever rate it wants.
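A minimal PC-side sketch of that block-device idea: split a raw card dump into one file per picture, assuming images are stored back to back and terminated by a marker. The 8-byte marker value and 2 MB buffer here are made up just for illustration:

Code:
/* Split a raw SD-card dump into one file per image. The end-of-image
   marker and maximum image size are made-up placeholders. */
#include <stdio.h>
#include <string.h>

static const unsigned char EOI_MARKER[8] = {0xDE,0xAD,0xBE,0xEF,0xDE,0xAD,0xBE,0xEF};

int main(void)
{
    FILE *dump = fopen("sdcard.img", "rb");
    if (!dump) { perror("sdcard.img"); return 1; }

    static unsigned char image[2 * 1024 * 1024];   /* > 1.3 MB per picture */
    size_t len = 0;
    int count = 0, c;

    while ((c = fgetc(dump)) != EOF) {
        image[len++] = (unsigned char)c;
        /* Check whether the buffer now ends with the marker. */
        if (len >= sizeof EOI_MARKER &&
            memcmp(image + len - sizeof EOI_MARKER, EOI_MARKER, sizeof EOI_MARKER) == 0) {
            char name[32];
            snprintf(name, sizeof name, "image_%03d.raw", count++);
            FILE *out = fopen(name, "wb");
            fwrite(image, 1, len - sizeof EOI_MARKER, out);  /* drop the marker itself */
            fclose(out);
            len = 0;
        }
        if (len == sizeof image) len = 0;  /* runaway guard: no marker found */
    }
    fclose(dump);
    return 0;
}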
 

Online Berni

  • Super Contributor
  • ***
  • Posts: 4946
  • Country: si
SRAM doesn't need any refresh cycles or special controllers. You set the address pins and the data appears on the data bus some nanoseconds later. You can commonly find these simple asynchronous SRAM chips up to 8MB in size. It is DRAM that needs refreshing but there is no reason to use that here since you don't need 1GB of RAM (Unless you plan to record video with this camera)

SD cards don't provide a stable write speed down to this kind of fine-grained timing. Internally they have a controller that erases and wear-levels the raw NAND flash while you are using it, so there are small hitches in performance while it does that (and they depend on how much junk there is on the card and how old it is). There is also a filesystem to be taken care of if you want the cards to be readable by anything else (like a PC). Handling a filesystem in an FPGA is all sorts of complicated. Though in this case you could in theory have the MCU allocate a file of the correct size and then tell the FPGA the sequence of sectors that file occupies so it can overwrite them. But you don't want to be messing with filesystems in this much depth if you have trouble with high-level stuff.

48M baud is a lot for a UART; I don't think many MCUs will let you run one that fast (since the UART controller has to be clocked much faster than the baud rate). Something like SPI can go faster, and QSPI can definitely reach that rate.
 

Offline pieroc91Topic starter

  • Contributor
  • Posts: 46
  • Country: ar
Oh, you're right, I had it wrong about the RAM; static RAM should be the way to do it correctly and completely avoid data corruption. I'll see if I can scavenge some RAM chip and try to wire it up.
(I say scavenge because where I am it's not easy to buy from overseas; there's normally a 60-day shipping time plus lots of taxes and customs paperwork, a real PITA.)

And you got my attention with QSPI, there's even a DDR mode. Do you know if it's somewhat doable with an ESP32? I read that it only does dual SPI.

I know SD cards are not really suitable for this kind of application; I hoped that by using really high-speed ones that can record 4K video I could use the extra bandwidth as margin. There are cards rated at 120 MB/s continuous write which are expensive but fairly easy to get.

For a final design I really like the idea of keeping the current FPGA, adding SRAM as a buffer, and using an ESP32 to handle the filesystem and the SD card operations, plus a small 128x64 OLED screen as a status display.
No preview display or fancy stuff; I already have another, more modern Kodak DCS that only has a status display, and that's actually all that's needed to shoot.
 

Offline Wolfram

  • Frequent Contributor
  • **
  • Posts: 382
  • Country: no
Nice project, it's good that this piece of camera history gets a new lease on life. Could QSPI PSRAM be an option here? There are options with plenty of bandwidth at a good price, in easy-to-use packages like SO-8, requiring few I/O pins and minimal wiring. Something like the APS1604 or related devices. Note that I don't have any experience with these parts, I just thought they could be worth considering here.
 
The following users thanked this post: pieroc91

Online Berni

  • Super Contributor
  • ***
  • Posts: 4946
  • Country: si
That is a pretty interesting chip.

It is DRAM but with self-refresh, so it looks like SRAM apart from having pages. The speed of about 40 MB/s write and 70 MB/s read is pretty impressive for just 6 wires. Though the interface does need a bit of a memory controller in the FPGA to tack the read/write commands onto the start of a transfer, that sounds worth it for not needing to run the 30-odd wires that a regular asynchronous SRAM chip would need.
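For reference, a tiny C model of the header that controller has to prepend to each transfer, assuming the common pSRAM command set (0x03 read, 0x02 write, 24-bit address); the quad-mode opcodes and wait states would need to be checked against the APS1604 datasheet:

Code:
/* Reference model of the header a QSPI-PSRAM controller prepends to each
   transfer. Opcodes follow the common pSRAM SPI command set; verify against
   the actual APS1604 datasheet before relying on them. */
#include <stdint.h>
#include <stddef.h>

#define PSRAM_CMD_READ  0x03u
#define PSRAM_CMD_WRITE 0x02u

/* Fill 'hdr' (4 bytes) with the command + 24-bit address that must be
   clocked out before the data phase. Returns the header length. */
static size_t psram_build_header(uint8_t cmd, uint32_t addr, uint8_t hdr[4])
{
    hdr[0] = cmd;
    hdr[1] = (uint8_t)(addr >> 16);   /* address, most significant byte first */
    hdr[2] = (uint8_t)(addr >> 8);
    hdr[3] = (uint8_t)(addr);
    return 4;
}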
 

Offline pieroc91Topic starter

  • Contributor
  • Posts: 46
  • Country: ar
Hi Wolfram, thanks for your answer.

That chip looks really good and it's incredible to have that kind of advanced IC for less than $2.

Hi Berni, yes, being SPI there's some code to add for the protocol and for the specific IC commands, but it's a really nice path to take. After searching for real SRAM chips I started asking myself if I had enough free I/O on the FPGA; the only easy way to get SRAM chips I could find was an old L2 cache from a Pentium III Xeon server, which is 18 bits wide and means a lot of wires.

Also, being SPI is perfect because I can hook an Arduino up to the bus to move the data out to an SD card or a PC after the FPGA is done with it; since the data is already in RAM, speed isn't a problem anymore.
I'm going to research it further and see how I can get one to try.

I wish I were in another country; I'd already be ordering a few of those.
 

Offline leftsquarebracket

  • Newbie
  • Posts: 2
  • Country: us
Hi! I remember seeing your posts on NikonWeb and I'm glad to see you're still working on this! That's super cool that you talked with Jim himself as well.

I just found a full DCS kit (with DSU, cables, batteries, manuals, software, etc) and a second DSU+winder+back without the F3 in December and things seem like they should work, but being 30 years old there's some tinkering ahead to get the DSUs up and checked out. Something more portable than the original DSU pack is always an interesting idea, especially as these units continue to age.

I'm not sure how you're faring with trying to get parts or which ESP32 module/board you have in particular, but there are some "hacks" to get them to capture 8- to 16-bit parallel data directly with the I2S module and DMA. It looks like a popular solution for reading out data from (ironically) camera modules; on GitHub there's demo code (e.g. "esp32-cam-demo") to do this, and it looks like CNLohr has a project named "esp32-cnlohr-demo" where there's a relatively standalone version in i2s_stream_in.c which can even take an external clock (like the FPGA, or maybe even the camera directly).

The ESP32 and DMA overall are a bit outside my wheelhouse, but maybe you'll find these helpful. My apologies for not throwing in links, I'm not sure if they'll get flagged on my first post.
 
The following users thanked this post: pieroc91

Offline pieroc91Topic starter

  • Contributor
  • Posts: 46
  • Country: ar
Hey, how are you? It's so cool that you found more DCS hardware, thanks for the follow-up.
Yes, I'm still at it, and I'm especially excited since I finally got some images out of it; from what I can tell I think my CCD might have some damage, but it still makes pictures.
(I used to know somebody who had a back with no more parts than that; I hope he still has it.)

As always I posted the progress on NikonWeb, although the forum seems to be completely dead by now.

If you need help with your DCS hardware just tell me, I'm all in on preserving these devices. A must on every DCS is replacing every tantalum capacitor; they go short, always. Normally they just prevent the camera from working and no further damage occurs, but better safe than sorry.

Very interesting, what you tell me about the ESP32; I just got an ESP32-CAM because it's the only way to source QSPI PSRAM over here, so... I have one to tinker with.

I'm quite positive that everything I've added to my DCS could be replaced with a single ESP32, but some very skillful coding is required to not screw up the timings, which I've found to be not forgiving at all.

Right now I'm struggling with the dark current and getting it to expose at the correct moment; it shoots randomly when flushing the sensor, quite erratic behavior right now...


 

Offline leftsquarebracket

  • Newbie
  • Posts: 2
  • Country: us
I'm good! I've actually just been a NikonWeb lurker, I just signed up so maybe together we can breathe some life back into it!

I am not surprised about the tantalums, in no small part because of Dave's videos. I've only done a few quick tests (I also got an original AC adapter) and the DSUs boot and the camera fires and winds, but they're not without issue. The displays have a lot of horizontal smearing, one has a busted character LCD, the SRAM batteries are probably long dead, I'm sure there's more. I'd love to round up an old Macintosh to run the full kit and see if the old SCSI hard drives are still kicking too. Thank you for the offer for help!

Since I can't comment on NW yet, your progress is amazing! Your drawing of Lenna is great, and the images you're getting are already super impressive.

Thinking about it and poking through the schematics I've got questions and thoughts for you:
  • Which way does the sensor read out? Line by line, or column by column?
  • Could the slant in your images come from expecting an extra pixel per row, or one missing clock?
  • Could some of what's flat white be the CCD saturating and blooming, if you have to manually light it up with a flash?
  • The schematics aren't super clear where the camera interfaces with the DSU, but given the rest of the design does there need to be more termination/buffering on the receiving end of the clock/data lines? Also, could some of the missing data/clocks be due to noise from the leads for the logic analyzer?
  • I'd guess the banding would be coming from the analog side, prior to the ADC. Could it be noise from your power supply side?
  • As far as when you're exposing the image, it looks like you've hooked up all the signals on the front of the winder. What do HGATE, ATN, and /LO get or tell you? HGATE seems like "horizontal gate", like maybe it indicates what's valid image data?

I've previously built a very rough scanning CCD camera (a linear CCD on a breakout, moved by a stepper motor slide, in a cardboard box with a lens stuck in it) and I've seen dark current/dark voltage mentioned as something that represents the output of pixels that truly receive no light, representing the darkest dark you can read. Down the line you'd normalize against that being your "true" black reference. I'd almost expect the ADC to do that for you, given the electronics, but I haven't mapped out the whole schematic.

To not upend all your work so far, I wonder if that parallel+I2S bus for the ESP32-CAM interface could help get the data out of the FPGA quickly. Even if you move the PSRAM to the FPGA for a fast buffer, if you're controlling both ends of your own parallel interface after that image data is captured, you could signal when an image is received and the ESP32 could clock the FPGA to read out image data at its own pace, and buffer data to some of its 500+K of internal SRAM while it writes out to an SD card.
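Just to make the idea concrete, a rough sketch in plain C (not tested on hardware): the ESP32 toggles a read clock, samples the 8 data pins and streams the bytes to the SD card. gpio_set_level/gpio_get_level are the standard ESP-IDF calls, but the pin assignments, the one-byte-per-clock handshake and the /sdcard mount point are only assumptions:

Code:
/* Sketch only: bit-bang readout of an image already buffered in the FPGA.
   Assumes the FPGA presents one byte per rising edge of RD_CLK, the pins
   below are already configured (RD_CLK output, data pins inputs), and the
   SD card is already mounted at /sdcard. All pin numbers are placeholders. */
#include <stdio.h>
#include "driver/gpio.h"

#define RD_CLK      GPIO_NUM_12
static const gpio_num_t DATA_PINS[8] = {
    GPIO_NUM_13, GPIO_NUM_14, GPIO_NUM_15, GPIO_NUM_2,
    GPIO_NUM_4,  GPIO_NUM_16, GPIO_NUM_17, GPIO_NUM_5
};
#define IMAGE_BYTES 1300000u                 /* ~1.3 MB per picture */

void dump_image_to_sd(void)
{
    FILE *out = fopen("/sdcard/image.raw", "wb");
    if (!out) return;

    unsigned char chunk[4096];
    size_t fill = 0;
    for (size_t i = 0; i < IMAGE_BYTES; i++) {
        gpio_set_level(RD_CLK, 1);           /* ask the FPGA for the next byte */
        unsigned char b = 0;
        for (int bit = 0; bit < 8; bit++)
            b |= (unsigned char)(gpio_get_level(DATA_PINS[bit]) << bit);
        gpio_set_level(RD_CLK, 0);

        chunk[fill++] = b;
        if (fill == sizeof chunk) {          /* write in chunks, not byte by byte */
            fwrite(chunk, 1, fill, out);
            fill = 0;
        }
    }
    if (fill) fwrite(chunk, 1, fill, out);
    fclose(out);
}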
 

Offline pieroc91Topic starter

  • Contributor
  • Posts: 46
  • Country: ar
Quote
I'm good! I've actually just been a NikonWeb lurker, I just signed up so maybe together we can breathe some life back into it!

That's very cool; that site has a lot of good information and has gathered everyone who's interested in the subject. I hope it stays alive.

Quote
I am not surprised about the tantalums, in no small part because of Dave's videos. I've only done a few quick tests (I also got an original AC adapter) and the DSUs boot and the camera fires and winds, but they're not without issue. The displays have a lot of horizontal smearing, one has a busted character LCD, the SRAM batteries are probably long dead, I'm sure there's more. I'd love to round up an old Macintosh to run the full kit and see if the old SCSI hard drives are still kicking too. Thank you for the offer for help!
Yeah, having an original set would be really cool. As far as I know there are no complete systems that are actually functional; NikonWeb's webmaster himself made a post about his struggle, and even with his working set he couldn't download the pictures.
Tantalums on mine were kind of scary; the body just burst into smoke while testing some simple stuff. The internal battery must be replaced, but that's easy, it's detailed in the schematics; I guess it's only for the RTC, the RAM is actually DRAM and it should work regardless of the battery.
The character LCD looks like a regular 16x2 or 20x2 parallel-input one, very popular in Arduino stuff, except it comes pre-soldered with an I2C converter board.
The HDD can be fun, maybe recovering some of the remaining data can shed some light on its past life; you can easily test those with an external enclosure on an old Macintosh, doing a full dump and examining it with a hex editor.
Lastly, the horizontal smearing on the LCD... I've seen laptops with a lot of smearing that was sometimes caused by leaky SMD caps; I've replaced a few with some success, so it's a good idea to check them out.
Quote

Since I can't comment on NW yet, your progress is amazing! Your drawing of Lenna is great, and the images you're getting are already super impressive.
Thank you!
Quote

Thinking about it and poking through the schematics I've got questions and thoughts for you:
  • Which way does the sensor read out? Line by line, or column by column?
It is line by line; as viewed from the viewfinder it's left to right and top to bottom, so the read image doesn't need to be reversed or mirrored. There's a shift register before the first line, and it gets loaded with the first line when the V1 and V2 signals are sent; that moves the whole image on the sensor one row closer to the shift register, then the shift register spits out one pixel at a time.

  • Could the slant in your images come from expecting an extra pixel per row, or one missing clock?
It is some sort of jitter, with the logic analyzer (I'm using a $10 24 MHz one) missing one pixel, making some lines one pixel shorter and stealing it from the next line, thus shifting it one pixel to the left.

  • Could some of what's flat white be the CCD saturating and blooming, if you have to manually light it up with a flash?
In another topic I asked the gurus on the site about it, and while I was testing what they asked me, things started to improve a lot; all the artifacting was gone, leaving me only with the left-side bloom, and everyone pointed out that it could be a CCD driving problem.
Thinking about it, I found that I'm turning all the electronics off and on at the start of each line; that can easily be the cause of the bloom, since the electronics haven't settled when I'm exposing. I've tried to correct some of the code and the winder stopped giving me data, so... I need to work further on the code. https://www.eevblog.com/forum/projects/any-ccd-sensor-guru-in-the-house-is-my-ccd-damaged

  • The schematics aren't super clear where the camera interfaces with the DSU, but given the rest of the design does there need to be more termination/buffering on the receiving end of the clock/data lines? Also, could some of the missing data/clocks be due to noise from the leads for the logic analyzer?
Not sure, but I don't think so; everything is terminated and buffered everywhere, it is over-engineered in every aspect. I've tested with the clock, which should be the fastest signal, and I got a perfect reading.

  • I'd guess the banding would be coming from the analog side, prior to the ADC. Could it be noise from your power supply side?
I guess it was just a loose connection on the CCD pins; after reseating it the problem went away.
As for the power supply, it is also over-engineered; on top of that, when I recapped it I added newer SEPC ultra-low-ESR, higher-capacity caps in place of the dead ones.

  • As far as when you're exposing the image, it looks like you've hooked up all the signals on the front of the winder. What do HGATE, ATN, and /LO get or tell you? HGATE seems like "horizontal gate", like maybe it indicates what's valid image data?
I asked Jim this exact question: HGATE goes high whenever the logic gate controlling the CCD clock turns on.
I also misread /LO; it's actually /LD for Latch Data, and it does just that, it latches the direction of the data on the signal bus.
And I quote what Jim said to me about the ATN signal: "As for ATN signal i know nothing of it."


Quote
I've previously built a very rough scanning CCD camera (a linear CCD on a breakout, moved by a stepper motor slide, in a cardboard box with a lens stuck in it) and I've seen dark current/dark voltage mentioned as something that represents the output of pixels that truly receive no light, representing the darkest dark you can read. Down the line you'd normalize against that being your "true" black reference. I'd almost expect the ADC to do that for you, given the electronics, but I haven't mapped out the whole schematic.

Very cool project, I've seen a few like it online some time ago, very cool indeed. As for the dark current, I have a lot of that on my mind right now, since I've been re-reading the original source code all day and it calls a function named "DARK_ENABLE" a lot, and I have no idea what it does. I need to study that part of the code better, since I guess it could improve the image quality a lot.

Quote
To not upend all your work so far, I wonder if that parallel+I2S bus for the ESP32-CAM interface could help get the data out of the FPGA quickly. Even if you move the PSRAM to the FPGA for a fast buffer, if you're controlling both ends of your own parallel interface after that image data is captured, you could signal when an image is received and the ESP32 could clock the FPGA to read out image data at its own pace, and buffer data to some of its 500+K of internal SRAM while it writes out to an SD card.
I want to get to the part of dealing with the ESP32; I think it has the power to completely solve the saving problem. Originally I thought of scavenging the PSRAM for the FPGA and using it as its own RAM, then, when the readout is finished, tunnelling the pins of the bus to other FPGA pins, making it visible to the ESP32 as if the memory were actually soldered to it, and just reading it back. But I guess I'll get to that in a few days; development will once again start to slow down since I'm on vacation and that ends today, but I'll still keep working on it in my free time, which turns out to be quite a lot these days.
 

Offline pieroc91Topic starter

  • Contributor
  • Posts: 46
  • Country: ar
Well, I'm unearthing this post since I'm ready to begin with the recorder part. I'm happy enough with the FPGA side of things, and the limiting factor right now is the logic analyzer.
I've already tested the ESP32 with the sample code and it's working fine, so I can focus on that, but I feel completely lost reading the code; I'm used to gate-level code.

Electrically speaking, the included OV2640 interface is already 8 bits plus a clock signal, so for what I want it is somewhat pin-compatible; I think it should be a matter of wiring and recoding what's needed.

TL;DR: I need to read the 6 MHz parallel ADC output synced with its sync signal, tunnel that data into the PSRAM, and finally dump it to the SD card.

Any help is appreciated. Thanks!
 

Offline Boscoe

  • Frequent Contributor
  • **
  • Posts: 276
This is very cool. I'm actually working on making a digital back for my Nikon F3 with a B&W µ4/3 sensor, which looks to be the same size used in this version. I have one KAF-8300, which is the only sensor I could find that would fit inside the frame to get the right focusing distance, but it's now gone obsolete and epically expensive.

As for getting the data to the PC, use the FT232H or a variant and this IP core on OpenCores. I have modified it to work with two FIFOs rather than the Avalon-MM interface, easy enough; then you can just pipe the data into the FIFOs. For this kind of thing, in the past I'd set the first byte in a frame or line to 255, and if there were any other bytes in the frame at 255 I'd rewrite them to 254, so there was something to look for in the data on the PC side.

https://opencores.org/projects/usb_ft232h_avalon-mm_interface
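To illustrate that marker scheme, a small PC-side C sketch that locates frame boundaries in a captured stream (the file name is a placeholder):

Code:
/* Sketch of the marker scheme: byte 255 only ever appears as the first
   byte of a frame (any payload 255s were rewritten to 254 by the sender),
   so frame boundaries can be found by scanning for 0xFF. */
#include <stdio.h>

int main(void)
{
    FILE *in = fopen("capture.raw", "rb");    /* placeholder file name */
    if (!in) { perror("capture.raw"); return 1; }

    long frame_start = -1, pos = 0;
    int frames = 0, c;
    while ((c = fgetc(in)) != EOF) {
        if (c == 0xFF) {                      /* start-of-frame marker */
            if (frame_start >= 0)
                printf("frame %d: offset %ld, length %ld\n",
                       frames++, frame_start, pos - frame_start);
            frame_start = pos;
        }
        pos++;
    }
    if (frame_start >= 0)                     /* report the final frame */
        printf("frame %d: offset %ld, length %ld\n",
               frames, frame_start, pos - frame_start);
    fclose(in);
    return 0;
}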

 
 

Offline pieroc91Topic starter

  • Contributor
  • Posts: 46
  • Country: ar
So cool! The F3 is an amazing body and surely one of the most modifiable; I think it's the best one for heavy DIYers.

I was okay using it tethered for testing purposes but now I want a self-contained device so I can take it to the streets, hence the ESP.

The more I read, the more I find the ESP is going to be a tough challenge; I'm actually thinking of going back to the idea of a shared PSRAM where the FPGA dumps the data and the ESP dumps the PSRAM to an SD card.

DMA and parallel synced data seem to require a lot of deep knowledge of how the ESP actually works and how to work around things to achieve what you want.

The last pic I took with my DCS while tethered:
 

Offline Boscoe

  • Frequent Contributor
  • **
  • Posts: 276
Nice!

Sorry I hadn’t read that far down the thread.

Then (having done a lot of reading around this) I would say your best bet is to use a regular fast micro like an STM32H7. Try to find a micro with OCTOSPI for the interface to the FPGA. These interfaces are simple, fast, and most likely have DMA. That means you can be DMAing data in from the FPGA into one buffer while DMAing the other buffer out to the SD card. When both DMAs have completed, swap the buffers and start again. This way you'll be able to get an overall throughput close to the SD card's speed.
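A bare-bones sketch of that ping-pong scheme; start_fpga_rx_dma, start_sd_write_dma and the *_done polls are hypothetical stand-ins for whatever HAL or driver calls end up being used, and the chunk size is arbitrary:

Code:
/* Ping-pong double buffering: receive from the FPGA into one buffer while
   the previous buffer is written to the SD card. The dma_* style functions
   below are hypothetical stand-ins, not real HAL calls. Assumes total_bytes
   is a multiple of CHUNK_BYTES. */
#include <stdbool.h>
#include <stddef.h>
#include <stdint.h>

#define CHUNK_BYTES 32768u

extern void start_fpga_rx_dma(uint8_t *dst, size_t len);        /* hypothetical */
extern void start_sd_write_dma(const uint8_t *src, size_t len); /* hypothetical */
extern bool fpga_rx_dma_done(void);                             /* hypothetical */
extern bool sd_write_dma_done(void);                            /* hypothetical */

static uint8_t buf_a[CHUNK_BYTES], buf_b[CHUNK_BYTES];

void capture_image(size_t total_bytes)
{
    uint8_t *rx = buf_a, *wr = buf_b;
    size_t received = 0;
    bool have_pending_write = false;

    start_fpga_rx_dma(rx, CHUNK_BYTES);                 /* prime the first receive */
    while (received < total_bytes) {
        while (!fpga_rx_dma_done()) { /* spin; a real design would sleep on an IRQ */ }
        while (have_pending_write && !sd_write_dma_done()) { }

        /* Swap roles: the freshly filled buffer goes to the card,
           the other one receives the next chunk. */
        uint8_t *tmp = rx; rx = wr; wr = tmp;
        received += CHUNK_BYTES;
        if (received < total_bytes)
            start_fpga_rx_dma(rx, CHUNK_BYTES);
        start_sd_write_dma(wr, CHUNK_BYTES);
        have_pending_write = true;
    }
    while (!sd_write_dma_done()) { }                    /* flush the last chunk */
}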
 

Offline pieroc91Topic starter

  • Contributor
  • Posts: 46
  • Country: ar
STM32s are an option too, but I can only get the most basic ones over here, and I'm not sure they'd be much different from the ESP32 I already have.

Storage speed is not a concern, but buffer speed is; that's why I need some kind of RAM. PSRAM is fast enough, and since it comes in QSPI variants I see a possibility of connecting it directly to the FPGA.

Directly transmitting the data to the uC and making the uC buffer it is possible too, but since I'm not very good at coding that and I can't find any example to use as a starting point, I think I'll go the PSRAM route.

Once the data is buffered I can share the QSPI bus with any uC and use some simple code to dump the info and save it to an SD card.
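To close the loop on that plan, a hedged sketch of the uC side dumping the shared PSRAM to a file: spi_select/spi_transfer/spi_deselect are hypothetical driver stubs, the 0x03 read opcode is the common pSRAM one (check the actual part's datasheet), and it assumes the FPGA has already released the bus and a filesystem is mounted:

Code:
/* Sketch: read the image back out of the QSPI PSRAM in plain SPI mode
   and save it to a file. The spi_* functions are hypothetical stubs. */
#include <stdio.h>
#include <stdint.h>
#include <stddef.h>

extern void spi_select(void);                                         /* hypothetical */
extern void spi_transfer(const uint8_t *tx, uint8_t *rx, size_t len); /* hypothetical */
extern void spi_deselect(void);                                       /* hypothetical */

#define PSRAM_CMD_READ 0x03u
#define IMAGE_BYTES    1300000u        /* ~1.3 MB per picture */

void dump_psram_to_file(const char *path)
{
    FILE *out = fopen(path, "wb");
    if (!out) return;

    uint8_t chunk[1024];
    for (uint32_t addr = 0; addr < IMAGE_BYTES; addr += sizeof chunk) {
        size_t n = IMAGE_BYTES - addr;
        if (n > sizeof chunk) n = sizeof chunk;

        uint8_t hdr[4] = { PSRAM_CMD_READ,
                           (uint8_t)(addr >> 16), (uint8_t)(addr >> 8), (uint8_t)addr };
        spi_select();
        spi_transfer(hdr, NULL, sizeof hdr);   /* command + 24-bit address */
        spi_transfer(NULL, chunk, n);          /* clock the data back out */
        spi_deselect();
        fwrite(chunk, 1, n, out);
    }
    fclose(out);
}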
 

