
Arrow DECA MAX 10 board for $37


migry:
Sorry if I wasn't clear, and sorry for not knowing how to quote properly.

"It sounds like he is generating the video clock for the STe and sampling the binary data from the ST's display chip."

Yes.

The STe version of the Atari allows the video chip (aka the shifter) to be driven from an external 32MHz clock.

The HDMI generation (which I have done many times on various hobbyist FPGA boards and found amazingly interesting) is given below. I chose a 50Hz mode, which needs a 54MHz pixel clock. I use a Dell monitor as my display.


--- Code: ---// FORMAT_1440x576p_50Hz
// FORMAT 29,30  == 1440 x 576 Progressive @ 50Hz
// pixel clock =  54.000MHz
// aspect ratio = 4:3 & 16:9
// horiz = 32.0us => 31.250kHz
// frame = 20.0ms => 50Hz
--- End code ---

The line frequencies and pixel counts work out perfectly at 50Hz, but unfortunately not for 60Hz (I found something close, but the Atari VSYNC was 58Hz or thereabouts).

The prototype, which used a plug-in breadboard and lots of "DuPont" wires, eventually worked flawlessly. Each set of 4 digital RGB outputs from the "shifter" (which go through resistors on the motherboard to generate an analogue voltage) is buffered by my 74LVC8T245 voltage translator/buffers and then input to the FPGA; the same goes for VSYNC, HSYNC and DE. Only the 32MHz external clock goes from the FPGA back to the Atari (there is a pin for it on the 13-pin video connector). I had two PCBs made to tidy up the setup, but I didn't consider the issues of running a bus at this speed. Anyway, the glitch problem is fixed with 1k series resistors, which just "take the edge off", so to speak, working against the input capacitance of the FPGA pins.

It's been a while, but the digital parallel video rate is 16MHz for medium res and 8MHz for low res.

In order to find the first pixel on the top active video line, I use the DE (display enable) input. This input is sampled by the 32MHz clock. I am pretty sure the current configuration samples DE at a bad point, with the positive edge of the 32MHz clock too close to where DE transitions. I simply need to find a way to delay DE in order to get a clean sample. Also, the Atari has a feature (bug) where it can power up in one of 4 clock positions relative to DE.
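One way to get such a delay is a short tap-selectable pipe, so the sample point can be moved away from the DE transition in whole 32MHz clock steps. A minimal sketch, assuming a hypothetical 2-bit `phase_sel` register (all names here are invented):

--- Code: ---// Re-sample DE through a small shift register and pick one tap,
// moving the effective sample point in 31.25ns steps.
reg [3:0] de_pipe;
reg       de_clean;

always @(posedge clk_32m) begin
    de_pipe  <= {de_pipe[2:0], DE_in};  // shift raw DE in each clock
    de_clean <= de_pipe[phase_sel];     // choose 1 of 4 delayed copies
end
--- End code ---

The same selector could in principle also absorb the 4 possible power-up clock positions mentioned above.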

Just FYI, the digital video (with its 15.625kHz line rate) is read into a memory, then each scan line is output twice at the HDMI rate of 31.25kHz.
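As a sketch of that line-doubling idea (signal names invented; a real design would ping-pong two buffers so the write side never chases the read side):

--- Code: ---// Write one scan line at the ST rate, read it out twice at the HDMI rate.
reg [11:0] line_buf [0:1023];    // one line of 12-bit (4:4:4) RGB
reg [9:0]  wr_addr, rd_addr;
reg [11:0] rgb_out;

always @(posedge clk_atari)      // write side: ST pixel clock domain
    if (de_clean) begin
        line_buf[wr_addr] <= rgb_in;
        wr_addr           <= wr_addr + 1'b1;
    end

always @(posedge clk_hdmi) begin // read side: 54MHz domain
    rgb_out <= line_buf[rd_addr];
    rd_addr <= rd_addr + 1'b1;   // wraps twice per input scan line
end
--- End code ---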

As long as I sample the RGB input busses and DE "cleanly" the picture is rock solid.

As a side note, I made several attempts to use the DECA's audio codec and failed miserably, so I used a fairly dumb analogue-to-I2S converter board instead. The HDMI output, thanks to the HDMI chip on the DECA, carries both video and audio.

The problem is that I don't know how to use the I/O features of the MAX10 FPGA; I currently use them in a rather dumb way. I am looking for any insight, or pointers to example code, showing how to properly use the advanced(?) I/O features — just like the DDR controller already described in this thread, where I assume the clock needs aligning to the data window, which is more or less the same problem I face.

Please dumb down your replies!  ;D

BrianHG:
In Quartus, use the IP Catalog and select Basic Functions / I/O / GPIO Lite Intel FPGA IP.
Choose a Verilog or VHDL file name and configure your I/O port.
You can set the port register to DDR; see the documentation there.
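For illustration, the generated IP is then instantiated like any other module. The module and port names below are placeholders — the real ones depend on what you enter in the wizard, so copy them from the generated file:

--- Code: ---// Placeholder names only - check the generated .v file for the real ones.
my_gpio_ddr_in u_ddr_in (
    .inclock ( clk_32m ),   // capture clock
    .pad_in  ( DE_pin  ),   // the physical FPGA pin
    .din     ( de_ddr  )    // 2 bits: rising- and falling-edge samples
);
--- End code ---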

You will not find the I/O timing de-skew features here.
Such fine timing is useless at 32MHz, as the adjustable skew comes in +/- 50ps delay adjustments.
(Do you know how fine 50ps is?)

To constrain your I/O port to specific timing, you will need to create an .sdc file.  This may be out of your league for now, but again, this constraint file works in the range of +/- 0.5 to a few nanoseconds.
(Your video clock is 16MHz; do you know how many nanoseconds that is?)
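As a taste of what such a constraint file looks like, here is a hypothetical fragment. The pin names and delay values are placeholders; the real numbers have to come from the actual board:

--- Code: ---# Define the 32MHz clock arriving from the Atari (period in ns).
create_clock -name atari_clk -period 31.25 [get_ports ATARI_CLK_32M]

# Tell the fitter when DE is valid relative to that clock (example values).
set_input_delay -clock atari_clk -max 10.0 [get_ports DE_in]
set_input_delay -clock atari_clk -min  2.0 [get_ports DE_in]
--- End code ---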

My recommendation — and your best solution — is to clock your inputs at 256MHz: 8x your current 32MHz reference clock, 16x the speed of your video output data.  Create a software-adjustable pipe delay, one for your syncs and DE, another for your video data, and operate there in standard Verilog.
(With a 256MHz input sampling clock, do you know how many positions you may take from that Atari ST source 16MHz video output?)
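A sketch of that adjustable pipe delay (names invented; `delay_sel` would be a 4-bit register set from software):

--- Code: ---// Oversample DE at 256MHz and pick 1 of 16 positions (~3.9ns steps)
// within one 16MHz pixel period.
reg [15:0] de_pipe;
reg        de_sampled;

always @(posedge clk_256m) begin
    de_pipe    <= {de_pipe[14:0], DE_in};
    de_sampled <= de_pipe[delay_sel];   // software-chosen tap
end
--- End code ---

The video data would get an identical pipe of its own, so syncs and pixels can be delayed independently.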

The super-complicated +/- 50ps alignment delay won't do a thing for your 16MHz-clocked DE signal coming from your Atari ST.  If your Atari's video were clocked at 500MHz, then I would first say your PCBs and wiring need to be improved, and then you would want to use the +/- 50ps alignment delay capabilities in the advanced I/O features of the MAX10 to compensate for signal timing.


Q: How did you divide your output 54MHz into the ST's 16MHz?

migry:
So Brian, thank you for taking the time to read and answer my questions. You have made suggestions, and I will explore those which make sense to help me toward a solution. I haven't used anything except the basic I/O pad, so I really ought to find some learning material to understand more about these special I/O pins (features?) and how to instantiate them. I've found a few useful PDFs, but nothing regarding delay primitives.


--- Quote ---(Do you know how fine 50ps is?)
(Your video clock is 16MHz, do you know how many nanoseconds this is?)
(With a 256MHz input data clock, do you know how many positions you may take from that Atari ST source 16MHz video output?)
This may be out of your league for now...
--- End quote ---

Is there really any need to be so insulting?



--- Quote ---Q: How did you divide your output 54MHz into the ST's 16MHz?
--- End quote ---


--- Code: ---PLL_HDMI pll_hdmi_inst (
  .inclk0 ( MAX10_CLK2_50 ),
  .c0     ( w_pll_hdmi    ) // 54MHz
);

PLL_ATARI pll_atari_inst (
  .inclk0 ( w_pll_hdmi  ),
  .c0     ( w_clk_atari ) // 32MHz
);

--- End code ---

The numbers work out in that: 54 * 16 / 27 = 32.

BrianHG:

--- Quote from: migry on January 20, 2022, 11:09:03 pm ---

--- Quote ---(Do you know how fine 50ps is?)
(Your video clock is 16MHz, do you know how many nanoseconds this is?)
(With a 256MHz input data clock, do you know how many positions you may take from that Atari ST source 16MHz video output?)
This may be out of your league for now...
--- End quote ---

Is there really any need to be so insulting?


--- End quote ---
No insult intended.  I was just trying to convey the vast difference in timing scope between a 16/32MHz system from the days of 25-15ns PLDs and an FPGA which can easily run at 400MHz.  When the documentation talks about I/O timing, its available adjustments are for correcting timing inside that tiny period, where simple coding cannot achieve the required tiny delays.

You are working in a world where the kind of timing adjustment you need is done in HDL code, not in the realm of adjusting an I/O's delay by +/- a few hundred picoseconds.

For your 'PLL_ATARI pll_atari_inst', if you are outputting that 32MHz on an I/O directly, timing isn't guaranteed unless it is a specific dedicated PLL output.  Your output will be better quality if you run the PLL at 64MHz and generate the output in logic clocked at that rate with ' out <= !out; ', making a registered output at half the frequency.
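The toggle-register trick is just (signal names invented):

--- Code: ---// 64MHz PLL clock in, registered 32MHz square wave out.
reg clk32_out = 1'b0;

always @(posedge clk_64m)
    clk32_out <= ~clk32_out;   // toggles on every 64MHz edge -> 32MHz
--- End code ---

Because the output comes straight from a flop, its edges track the 64MHz clock; routing it through a fast output register still helps keep the pin timing predictable.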


I still say: make your PLL generate 256MHz from that 54MHz, and from it generate the 32MHz out — i.e. count 8 clocks per output cycle — and select an adjustable position within the count at which to take 1 sample every 16 clocks, making your adjustable delayed input for DE.
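That divide-by-8 plus phase pick might look like this (all names invented; `de_phase` would be a 3-bit software-set register):

--- Code: ---// One 256MHz domain: generate the 32MHz output and time the DE sample.
reg [2:0] cnt8;
reg       clk32_q, de_q;

always @(posedge clk_256m) begin
    cnt8    <= cnt8 + 1'b1;   // wraps every 8 clocks = one 32MHz period
    clk32_q <= cnt8[2];       // bit 2 toggles every 4 clocks -> 32MHz
    if (cnt8 == de_phase)     // selectable position within the period
        de_q <= DE_in;
end
--- End code ---

Because the 32MHz output and the DE sample point come from the same counter, moving `de_phase` shifts the sample relative to the clock you are feeding the Atari.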

normi:
The only option available on the Arrow site is to request a quote; it says only 2 in stock, but I doubt they have any available. When was the last time these were in stock at the $37 price?
I did a quote request, but I'm waiting for a response.
