Author Topic: Arrow DECA MAX 10 board for $37  (Read 41039 times)


Offline asmi

  • Super Contributor
  • ***
  • Posts: 2728
  • Country: ca
Re: Arrow DECA MAX 10 board for $37
« Reply #175 on: November 13, 2021, 01:44:33 am »
Sure, lecture me on how DVCS work, it was obvious I didn't know.
Yes, that much was obvious...

The github link I gave shortly explains the two approaches you can use, and I explained numerous times why I favor the collaborator approach to the fork approach UNLESS you have a good reason to do it with forks. I have given my main reasons, which seem to be consistently ignored. Good, I just hope at least a few people have gotten my points. For the others, I don't care. And I have used DVCS for years, thank you. Using the collaborator model doesn't mean that you're not using git as a distributed system either. DVCS have a number of benefits over centralized systems even when not "forking". But I think the term "fork" has been largely abused here.
...and so is the fact that you don't use it properly. When using git you stick to a "single branch/fork - single developer" approach, or you're going to have massive headaches when several devs constantly step on each other's toes.

Oh and a final thought related to this discussion. The most successful open-source projects have pretty much all used a form of vertical organization - the Linux kernel being a prime example - which is likely how they could become successful in the end. As Linux shows, the use of a DVCS is absolutely not contradictory to having some amount of vertical project management, and is still quite useful in this setting. Github, while promoting, whether they want it or not, the dilution of open-source projects, hinders vertical project management. Not saying it prevents it, but it makes it unnecessarily harder. Now, what happens, to be honest, is that many of the successful projects, even when they have a github repo, are primarily managed outside of github. And to get back to Brian here, even if it's a small project, what I've seen so far is that it's what he does. (IIRC, he has even stated on his github page to use this forum for any communication... ;D )
Git was designed by Linus precisely for Linux, and each kernel dev always works in his own branch; once a feature is implemented, there is a pull request (PR) to merge it back into the mainline. The primary job of maintainers is to pick and choose which PRs to include in the release. And the exact same mechanism is used for accepting code from external entities for inclusion into the mainline (for example, this is how RISC-V and ARM support was added to the mainline).

So yeah, forks are the norm in git, not the exception, nor any sort of insult or whatever other nonsense you fantasized. With that, I think this discussion is over.

Online SiliconWizard

  • Super Contributor
  • ***
  • Posts: 14309
  • Country: fr
Re: Arrow DECA MAX 10 board for $37
« Reply #176 on: November 13, 2021, 11:57:08 pm »
So yes, it's obvious I don't know. Sure. It looks impossible to have a different view on some topics, in particular anything related to software development, discussions of which often end up very badly. I'm not surprised. You can absolutely disagree with me and find my concerns invalid or at least largely exaggerated, no problem. Claiming I don't know how a DVCS works is something else. Thanks for the smile =)

It's also obvious you didn't address most of the points I made, nor cared to understand them. That's possibly mainly due to how the term "fork" is abused here, although I tried explaining what my concerns were, and why making forked projects *public*, as on github, is NOT quite the same thing as each developer working on their own repo: it promotes project dilution, spreading the potential users and contributors across several instances of the same project. That's a concern I'm far from the only one to have expressed. And while you can indeed do all this with a DVCS, there are myriad ways to use one, with different workflows. Not to mention closed-source projects, for which DVCSs are also used a lot these days.

There are a ton of different workflows, and very few of them include each developer in a team having their own forked *repo* on one (or several) shared server(s). A branch is not a complete forked repo. I get the impression you see absolutely no difference. Again, git has no concept of fork by itself. All it has are clones and branches. Even a cloned repo is a slightly different beast from a fork (as in github and gitlab), which they themselves explain if you care to read the docs. But I'm the one not knowing here.

As to my comment on excessive branching, this is also nothing new. In well-managed projects, excessive branching is avoided - using the right workflows - because it can quickly make merges an intractable problem. That doesn't mean DVCSs are not appropriate. Not at all. It's all about structuring things a bit instead of pure anarchy.

I still stand by the opinion that Github tends to promote bad use of git and VCS in general, because it's used a lot as *cloud storage* rather than as a truly collaborative tool. Small projects are likely to be more affected than big ones, for which users will naturally tend to flock to the repo that has the most activity. I've seen stats showing that a large majority of forks on github are just dead beef. I'd rather dead beef stayed on personal computers than sit in public, polluting projects. I'm not just saying that in theory: I've seen a lot of projects, some I'm interested in, diluted this way and thus harder to contribute to or even use. I doubt I'm the only one.

Again, everyone working on their own branch and having multiple *public* forks of a project are two different things, which you apparently don't want to acknowledge. Fine. Maintaining or contributing to an open-source project takes some work and active collaboration, even if it's just to report bugs (which can be every bit as useful as contributing code, btw, and for which splitting reports across many different forks makes them all the harder to aggregate. That was one of my points too.) Clicking on a "fork" button takes a split second, otoh.

As to the way the Linux project is handled - it definitely doesn't use git in the way that is common on Github, and has indeed a very specific workflow. It doesn't take anything from github. See what every pull request gets as an answer: https://github.com/torvalds/linux/pull/805 . The github repo is read-only, as they explain, but "funnily" enough, it has almost 40k forks. No, Linus himself certainly doesn't use git the way that looks "usual" on github.

So yeah, I'll leave it at that, but just wanted to make my points clear. If they are still not, oh well.
« Last Edit: November 14, 2021, 02:39:38 am by SiliconWizard »
 

Online BrianHG

  • Super Contributor
  • ***
  • Posts: 7661
  • Country: ca
Re: Arrow DECA MAX 10 board for $37
« Reply #177 on: November 19, 2021, 02:19:45 am »
LOL, Arrow now has just received 2 more DECA boards in stock.
 


Offline migry

  • Regular Contributor
  • *
  • Posts: 71
  • Country: gb
Re: Arrow DECA MAX 10 board for $37
« Reply #179 on: January 19, 2022, 10:33:04 pm »
I hope you are all enjoying using your dirt cheap DECA  :-+

I have used mine in a project to extract digital level video from an Atari STe, and then use the DECA to generate HDMI (since it has a nice easy HDMI transmitter chip to utilise).

The picture shows the video buffer and level translator (4 bits each of R, G and B), for which I had a PCB manufactured by JLCPCB. I used the wrong footprint for the buffers  |O

On the DECA is a "hat". A 20-way IDC cable with headers connects them. I couldn't get the built-in audio to work, so I added an external PCB/module.

The Verilog code was developed while it was still a mess of wires. But I was rewarded with a pixel-perfect 50Hz Atari medium res display. Much better than the jail-bar-ridden mess which came out of the composite video. I do not have an RGB (15kHz) monitor or TV (BTW).

Initially, scoping the signals, they were a horrendous mess. Lots of spikes everywhere. I output the received vsync and hsync on debug output pins, and they looked horrendous. I eventually figured it was all down to @@@@-poor high-speed (high slew rate) signal mis-management. As an experiment, as seen in the photo, I added 1k resistors into the hsync and vsync lines out from the Atari, and this fixed the syncs. The debug outputs from the FPGA were now clean.

I am now suffering from a new problem. I use the 32MHz clock, generated on the FPGA, to sample the incoming signals. Apart from vsync and hsync there is "display enable", or de. I use this to detect the start of active video. I am pretty sure that I am not sampling this signal cleanly with the 32MHz clock. It worked on the mess of wires because the code would be compiled differently and the delays would be different.

Oh BTW, the 32MHz clock generated by the DECA FPGA goes to the STe and is used as the video clock, so it is synchronous with the RGB and sync outputs back from the STe. This feature is not available on the standard Atari ST (note no 'e').

I simply want to use the input delay feature of the IO pins to tweak de. I know that this is a bad, one-off solution, but I do not plan to publish the code and the project is a one-off. I just need it to work. I need to tweak de by adding a small delay so that it is not transitioning when sampled by the 32MHz clock. Over time the picture stabilises, then jitters, then jitters badly; rinse and repeat. Adding finger capacitance changes the jitter, which makes me confident that the issue is simply a sampling point problem.

My question is, how do I use the MAX10 input buffer delay feature, and how do I code this in Verilog?

NOTE: I did some googling and found a solution for the Xilinx Spartan6, where you can specify directives to the compiler in comments (IOB).

I have read the PDF for the MAX10 I/O, and I do see what looks like a programmable delay for the input signal on DDR pins. I tried to find some example Verilog code, but my google-fu failed me once again.
 

Online BrianHG

  • Super Contributor
  • ***
  • Posts: 7661
  • Country: ca
Re: Arrow DECA MAX 10 board for $37
« Reply #180 on: January 19, 2022, 10:50:48 pm »
I simply want to use the input delay feature of the IO pins to tweak de. I know that this is a bad, one-off solution, but I do not plan to publish the code and the project is a one-off. I just need it to work. I need to tweak de by adding a small delay so that it is not transitioning when sampled by the 32MHz clock. Over time the picture stabilises, then jitters, then jitters badly; rinse and repeat. Adding finger capacitance changes the jitter, which makes me confident that the issue is simply a sampling point problem.

Get rid of the capacitors...  Input delay buffers operate in the +/-50 picosecond range, up to +/- a few nanoseconds.  Way too small and useless for 32MHz.  They were designed to correct for pin-to-pin and PCB routing delays on buses above 300MHz, way into the 900MHz range.

Here is what you should do:  Take in the 32MHz.  Multiply by 8 on the MAX10 PLL for an internal 256MHz.  Now sample the input at 256MHz and have your HDL code select 1 of the 8 phases to sample from, to get that 'sweet' 32MHz sampling spot.

You can have a user-set selection of which phase to use, or, if you want smart HDL, you can work out the sweet spot on the fly by looking for when the inputs transition and when the data has a 2-3 position clean, stable value.  (I used to sample the Amiga's DENISE 12-bit video output, and different revisions of the chip had different ns delay skews on the data output bus vs. its 14.31818MHz source clock.  A smart self-timing alignment is the best way to go, whereas in the old days we used a re-clocking PLL with a trimpot to phase-adjust our data capture clock to zero in on that sweet data sampling spot.)
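A minimal sketch of this oversample-and-pick-a-phase idea might look like the following. All module and signal names here are made up for illustration; this is not code from the project.

```verilog
// Hypothetical sketch: sample an asynchronous input at 256 MHz
// (8x the 32 MHz reference) and capture it once per 32 MHz period,
// at a user-selected one of the 8 phase positions.
module phase_picker (
    input  wire       clk256,     // 8x oversampling clock from the PLL
    input  wire       din,        // raw input pin (e.g. DE from the Atari)
    input  wire [2:0] phase_sel,  // chosen sampling phase, 0..7
    output reg        dout        // clean sample, updated at 32 MHz rate
);
    reg [2:0] phase_cnt = 3'd0;   // free-running 0..7 phase counter

    always @(posedge clk256) begin
        phase_cnt <= phase_cnt + 3'd1;
        if (phase_cnt == phase_sel)  // once per 32 MHz period,
            dout <= din;             // capture at the selected phase
    end
endmodule
```

The "smart HDL" variant would additionally keep the last 8 oversampled values in a shift register and look for a run of stable samples to pick `phase_sel` automatically.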

If you truly want to go all out, you can also use DDR input buffers, giving you an equivalent 512MHz sampling rate with 1 of 16 phase positions to select your source data from.

And don't forget about using 'Signal Tap' within Quartus, where you get a 256/512MHz logic analyzer of the input data plus your HDL code's registers, allowing you to capture and see what's actually happening on your inputs and in your HDL in real time.


I doubt you would need to, but if the ST's output is crap beyond belief, you may select a different 1-of-8 sampling phase for each input, though this is an extreme case of fixing a problem which is generated elsewhere.

An example is if the Atari ST has a different logic IC generating the video sync output compared to the 12-bit RGB data, and you have no choice but to correct for a different skew delay, where on an analog RGB monitor this skew would have no effect.
« Last Edit: January 19, 2022, 11:25:18 pm by BrianHG »
 

Offline RoGeorge

  • Super Contributor
  • ***
  • Posts: 6146
  • Country: ro
Re: Arrow DECA MAX 10 board for $37
« Reply #181 on: January 20, 2022, 12:23:27 am »
Nice project!  :-+

I haven't used my DECA recently, though there is a pile of projects for it on the bucket list.

Analog TV signals use 75 ohm impedance.  If that Atari output is for a standard video monitor, then you need to use 75 ohm impedance coax cables (preferably the same length for vertical and horizontal sync).  These 75 ohm coaxial cables (video cables work as well, often RCA-terminated coax like the ones for old VCRs) must be terminated by a 75 ohm resistor at the other end of the cable, in parallel with the input of the FPGA.  Without impedance matching, the signals will reflect back and forth in the cable, creating horrible spikes, echoes and waveform degradation.

I don't recall if the ADC inputs on the DECA board already have 50 ohm low impedance inputs; I suspect they do not.  In case they do have 50 ohms, then a series resistor of 25 ohms is needed at the FPGA side of the cable.

Series 1k could be a workaround, but that is expected to degrade high frequencies, so more jitter and less detailed pulse edges.  Proper 75 ohm impedance matching and 75 ohm coaxial cables should work better.
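A rough back-of-the-envelope, assuming a 75 ohm line and roughly 10pF of FPGA pin/trace capacitance (both assumed values), shows why the unterminated line rings and why the series 1k "takes the edge off":

```latex
% Reflection at an unterminated (high-impedance) far end of the line:
\Gamma = \frac{Z_L - Z_0}{Z_L + Z_0} \approx +1
  \quad \text{for } Z_L \gg Z_0 = 75\,\Omega
% i.e. nearly the full edge reflects back down the cable.
% A series 1k into ~10 pF of pin capacitance forms an RC low-pass:
f_c = \frac{1}{2\pi R C}
    \approx \frac{1}{2\pi \cdot 1\,\mathrm{k\Omega} \cdot 10\,\mathrm{pF}}
    \approx 16\,\mathrm{MHz}
```

That cutoff sits right around the 16MHz pixel rate, which is consistent with the observation that the 1k resistors clean up the syncs but would soften fast edges.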



TV signal synchronization can be very tricky when the signal source has an independent clock.  I once built a machine to overlay video subtitles, which immediately showed the same clock and sync problems you are talking about.  Back then I simply synchronized the subtitling computer by pausing its clock (it was a ZX Spectrum, not an Atari), so as to make the computer clock "driven" by the analog video signal coming from a VCR.

Not sure if your Atari setup allows this trick.  If not, whatever you do, there will be skew between the clocks; maybe buffering a full frame might fix it.  I have no idea how sensitive the HDMI standard is to time jitter.

Also, analog TVs were very tolerant regarding line jitter and frame jitter/frequency, and they locked onto the source signal over a much broader range than the broadcast TV standards specify.  Many computers back then were therefore far from compliant, so I wouldn't be surprised if some of the jitter comes from the Atari hardware itself, but let's hope it is not that.



I'm sure many will chip in with new tricks and ideas for many pages to come; video conversion is a tricky problem, and this new project of yours deserves its own topic.
« Last Edit: January 20, 2022, 12:32:22 am by RoGeorge »
 

Online BrianHG

  • Super Contributor
  • ***
  • Posts: 7661
  • Country: ca
Re: Arrow DECA MAX 10 board for $37
« Reply #182 on: January 20, 2022, 01:32:32 am »
Analog TV signals use 75 ohm impedance.  If that Atari output is for a standard video monitor, then you need to use 75 ohm impedance coax cables (preferably the same length for vertical and horizontal sync).  These 75 ohm coaxial cables (video cables work as well, often RCA-terminated coax like the ones for old VCRs) must be terminated by a 75 ohm resistor at the other end of the cable, in parallel with the input of the FPGA.  Without impedance matching, the signals will reflect back and forth in the cable, creating horrible spikes, echoes and waveform degradation.
It sounds like he is generating the video clock for the STe and sampling the binary data from the ST's display chip.  Otherwise, he would still get the jail bars and not have any jitter issues.

IE, if he makes the video clock, there cannot be any jitter on the HDMI output.
I have made plenty of video outputs on the DECA, now at 1080p, 720p, 480p, 960p, and a few other modes, with no jitter at all.  Maybe he isn't driving the HDMI output IC properly?
I never had to use any fancy delays.

Look at the ribbon cables on his PCB where the silkscreen says red/green/blue: there are 4 wires on each ribbon, ie 12-bit digital color.

There are also 3 wires labeled sync and DE.  My guess is that, just like the Amiga (IE: the sync signals toggle on the rising clock while the digital video out toggles on the falling clock, unless it is the ECS DENISE with the 28.63636MHz video out mode, where the output has all-new timing problems), the output timing of the syncs vs. the RGB color data has a fine but definite skew.  Using my trick of running the FPGA inputs at 8x frequency and sweet-spot selecting the RGB data and sync/de inputs from the ST provides a programmable, tunable delay line with ~4ns steps, allowing a perfectly clean, jitter-free capture at the chosen sample position.
« Last Edit: January 20, 2022, 01:45:22 am by BrianHG »
 

Offline migry

  • Regular Contributor
  • *
  • Posts: 71
  • Country: gb
Re: Arrow DECA MAX 10 board for $37
« Reply #183 on: January 20, 2022, 02:08:33 am »
Sorry if I wasn't clear, and sorry for not knowing how to quote properly.

"It sounds like he is generating the video clock for the STe and sampling the binary data from the ST's display chip."

Yes.

The STe version of the Atari allows the video chip (aka the shifter) to be driven from an external 32MHz clock.

The HDMI generation (which I have done many times on various different hobbyist FPGA boards and found amazingly interesting) is given below; I chose a 50Hz mode which needs a 54MHz clock. I use a Dell monitor as my display.

Code: [Select]
// FORMAT_1440x576p_50Hz
// FORMAT 29,30  == 1440 x 576 Progressive @ 50Hz
// pixel clock =  54.000MHz
// aspect ratio = 4:3 & 16:9
// horiz = 32.0us => 31.250kHz
// frame = 20.0ms => 50Hz

The line frequencies and pixel count work out perfectly at 50Hz, but unfortunately not for 60Hz (although I found something close, the Atari VSYNC was 58Hz or thereabouts).
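The numbers in the timing comment above can be cross-checked from the pixel clock alone. As a sketch, with illustrative parameter names (not from the actual project):

```verilog
// Sanity check of the 1440x576p@50Hz timing comment above:
localparam PIX_CLK_HZ = 54_000_000; // pixel clock
localparam H_ACTIVE   = 1440;       // active pixels per line
localparam V_ACTIVE   = 576;        // active lines per frame
localparam H_TOTAL    = 1728;       // 54 MHz / 31.25 kHz line rate
localparam V_TOTAL    = 625;        // 31.25 kHz / 50 Hz (PAL line count)
// line rate  = PIX_CLK_HZ / H_TOTAL = 31.25 kHz (32.0 us per line)
// frame rate = line rate / V_TOTAL  = 50 Hz     (20.0 ms per frame)
```

The 625 total lines are the classic PAL count, which is why the 50Hz mode divides out exactly while 60Hz does not.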

The prototype, which used a plug-in breadboard and lots of "dupont" wires, eventually worked flawlessly. Each set of 4 digital RGB outputs from the "shifter", which go through resistors to generate an analogue voltage, is buffered by my 74LVC8T245 voltage translator/buffers and then input to the FPGA. Same for VSYNC, HSYNC and DE. Only the 32MHz external clock goes from the FPGA back to the Atari (there is a pin on the 13-pin video connector for this). I had two PCBs made to tidy up the setup, but I didn't consider issues related to having a bus running at this speed. Anyway, the glitch problem is fixed with 1k resistors which just "take the edge off", so to speak, working with the input capacitance of the FPGA pins.

It's been a while, but the digital parallel video rate is 16Mb/s for medium res and 8Mb/s for low res.

In order to find the first pixel of the top active video line, I use the DE (display enable) input. This input is sampled by the 32MHz clock. I am pretty sure the current configuration has a bad sampling point for DE, and that the posedge of the 32MHz clock is too close to where DE transitions. I simply need to find a way to delay DE in order to get a clean sample. Also, the Atari has a feature (bug) where it can come up in one of 4 clock positions relative to DE.

Just FYI, the digital video (with a 15.625kHz sync rate) is read into a memory, then each scan line is output twice at the HDMI rate of 31.25kHz.

As long as I sample the RGB input busses and DE "cleanly" the picture is rock solid.

As a side note, I made several attempts to use the DECA's audio codec and failed miserably. So I used a fairly dumb analogue-to-I2S converter board instead. So the HDMI output, thanks to the HDMI chip on the DECA, carries video and audio.

The problem is that I don't know how to utilise the I/O features of the MAX10 FPGA. I currently use them in a rather dumb way. I am looking for any insight, or pointers to example code, from which I can learn how to properly use the advanced(?) features of the I/O, just like the DDR controller already described in this thread, where I assume the clock needs aligning to the data window, which is more or less the same problem I face.

Please dumb down your replies!  ;D
 

Online BrianHG

  • Super Contributor
  • ***
  • Posts: 7661
  • Country: ca
Re: Arrow DECA MAX 10 board for $37
« Reply #184 on: January 20, 2022, 02:40:01 am »
In Quartus, use the IP Catalog and select Basic Functions / I/O / GPIO Lite Intel FPGA IP.
Choose a Verilog or VHDL file name and configure your IO port.
You can set the port register to DDR and look at the documentation there.
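For reference, the DDR input option in that IP wraps what older Quartus versions expose as the `altddio_in` megafunction. A minimal instantiation might look like this; check the port list against the megafunction user guide for your Quartus version, and note that the surrounding signal names (`pin_in`, `clk`, `d_rise`, `d_fall`) are made up here:

```verilog
// Sketch of a 1-bit DDR input register using Altera's altddio_in
// megafunction (the primitive behind the GPIO Lite DDR option).
altddio_in #(
    .width (1)
) ddr_in_inst (
    .datain    (pin_in),   // the physical input pin
    .inclock   (clk),      // sampling clock
    .dataout_h (d_rise),   // value captured on the rising edge
    .dataout_l (d_fall)    // value captured on the falling edge
);
```

With the 256MHz sampling clock suggested in this thread, the rising/falling pair gives the equivalent 512MHz sample stream mentioned earlier.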

You will not find the IO timing de-skew features here.
Such fine timing is useless at 32MHz, as the adjustable skew is in +/-50ps delay steps.
(Do you know how fine 50ps is?)

To constrain your IO port to specific timing, you will need to create an .sdc file.  This may be out of your league for now, but again, this constraint file is for +/- 0.5 to a few nanoseconds.
(Your video clock is 16MHz; do you know how many nanoseconds that is?)

My recommendation, running your inputs clocked at 256MHz (8x your current 32MHz reference clock, 16x the speed of your video output data), is your best solution.  Create a software-adjustable pipe delay, one for your syncs and DE, another for your video data, and operate there in standard Verilog.
(With a 256MHz input data clock, do you know how many positions you may take from that Atari ST source 16MHz video output?)

The super complicated +/-50ps alignment delay won't do a thing for your 16MHz-clocked DE signal coming from your Atari ST.  If your Atari's video were clocked at 500MHz, then I would first say your PCBs and wiring need improving, and then you would want to use the super complicated +/-50ps alignment delay capabilities in the advanced IO features of the MAX10 to compensate for signal timing.
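The "software-adjustable pipe delay" suggested above could be as simple as a shift register clocked at 256MHz with a selectable tap, each tap adding about 3.9ns. A sketch, with illustrative names only:

```verilog
// Adjustable pipeline delay clocked at 256 MHz: each tap delays the
// input by one 256 MHz period (~3.9 ns).  Instantiate once for the
// sync/DE group and once for the 12-bit RGB data.
module tap_delay #(
    parameter WIDTH = 3            // e.g. vsync, hsync, DE
)(
    input  wire             clk256,
    input  wire [WIDTH-1:0] din,
    input  wire [2:0]       tap_sel, // delay select, 0..7 taps
    output wire [WIDTH-1:0] dout
);
    reg [WIDTH-1:0] pipe [0:7];
    integer i;

    always @(posedge clk256) begin
        pipe[0] <= din;
        for (i = 1; i < 8; i = i + 1)
            pipe[i] <= pipe[i-1];    // shift the pipeline along
    end

    assign dout = pipe[tap_sel];     // pick the desired delay tap
endmodule
```

Sweeping `tap_sel` until the picture is stable is the "software" tuning step; the smart variant would sweep it automatically.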


Q: How did you divide your output 54MHz into the ST's 16MHz?
« Last Edit: January 20, 2022, 02:55:56 am by BrianHG »
 

Offline migry

  • Regular Contributor
  • *
  • Posts: 71
  • Country: gb
Re: Arrow DECA MAX 10 board for $37
« Reply #185 on: January 20, 2022, 11:09:03 pm »
So Brian, thank you for taking the time to read and answer my questions. You have made suggestions, and I will explore those which make sense to help me towards a solution. I haven't used anything except the basic I/O pad, so I really ought to find some learning material to understand more about these special I/O pins (features?) and how to instantiate them. I've found a few useful PDFs, but nothing regarding delay primitives.

Quote
(Do you know how fine 50ps is?)
(Your video clock is 16MHz, do you know how many nanoseconds this is?)
(With a 256MHz input data clock, do you know how many positions you may take from that Atari ST source 16MHz video output?)
This may be out of your league for now...

Is there really any need to be so insulting?


Quote
Q: How did you divide your output 54MHz into the ST's 16MHz?

Code: [Select]
PLL_HDMI pll_hdmi_inst (
  .inclk0 ( MAX10_CLK2_50 ),
  .c0     ( w_pll_hdmi    ) // 54MHz
);

PLL_ATARI pll_atari_inst (
  .inclk0 ( w_pll_hdmi  ),
  .c0     ( w_clk_atari ) // 32MHz
);

The numbers work out in that: 54 * 16 / 27 = 32.
 

Online BrianHG

  • Super Contributor
  • ***
  • Posts: 7661
  • Country: ca
Re: Arrow DECA MAX 10 board for $37
« Reply #186 on: January 20, 2022, 11:56:16 pm »

Quote
(Do you know how fine 50ps is?)
(Your video clock is 16MHz, do you know how many nanoseconds this is?)
(With a 256MHz input data clock, do you know how many positions you may take from that Atari ST source 16MHz video output?)
This may be out of your league for now...

Is there really any need to be so insulting?

No insult intended.  I was just trying to convey the vast gulf in timing between a 16/32MHz system from the days of 15-25ns PLDs and an FPGA which can easily do 400MHz: when vendors talk about IO timing, the available adjustments are for correcting timing inside that tiny period, where simple coding cannot accommodate the required tiny delays.

You are working in a world where the kind of timing adjustment you need is done in HDL code, not in the realm of adjusting an IO's delay by +/- a few hundred picoseconds.

For your 'PLL_ATARI pll_atari_inst': if you are outputting that 32MHz on an IO directly, then unless it is a specific dedicated PLL output, timing isn't guaranteed.  Your output will be better quality if you run the PLL at 64MHz and make an output at that clock with ' out <= !out; ', producing a registered logic output at half the frequency.


I still say make your PLL generate the 256MHz; from that, generate the 32MHz out (IE count 8 clocks per period), and select an adjustable position within which to sample, 1 sample every 16 clocks, making your adjustable delayed input for the DE.
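The registered-output idea scales to the 256MHz case too. A sketch, assuming a `clk256` PLL output (names illustrative): a flip-flop toggling every 4 cycles of 256MHz gives an 8-cycle period, i.e. 32MHz, and because the edge comes from an ordinary register it can be placed in the IO cell for repeatable timing.

```verilog
// Clean, registered 32 MHz output derived from the 256 MHz PLL clock.
reg [1:0] cnt   = 2'd0;
reg       clk32 = 1'b0;

always @(posedge clk256) begin
    cnt <= cnt + 2'd1;
    if (cnt == 2'd3)
        clk32 <= !clk32;   // toggle every 4 clocks -> 256/8 = 32 MHz
end
```

The same 256MHz counter then provides the phase reference for the adjustable DE sampling described above.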
« Last Edit: January 21, 2022, 12:01:19 am by BrianHG »
 

Offline normi

  • Regular Contributor
  • *
  • Posts: 75
  • Country: 00
Re: Arrow DECA MAX 10 board for $37
« Reply #187 on: January 24, 2022, 01:01:37 pm »
The only option available on the Arrow site is a quote; it says only 2 in stock, but I doubt they have any available. When was the last time these were in stock at the $37 price?
Did a quote request but am waiting for a response.
 
The following users thanked this post: Miti

Offline lintweaker

  • Contributor
  • Posts: 23
  • Country: nl
Re: Arrow DECA MAX 10 board for $37
« Reply #188 on: January 25, 2022, 08:19:59 am »
The only option available on the Arrow site is a quote; it says only 2 in stock, but I doubt they have any available. When was the last time these were in stock at the $37 price?
Did a quote request but am waiting for a response.
I used the quote option a few days ago and got a response: they do not have any stock currently.
 

Offline ArsenioDev

  • Regular Contributor
  • *
  • Posts: 236
  • Country: us
    • DiscountMissiles: my portfolio and landing page
Re: Arrow DECA MAX 10 board for $37
« Reply #189 on: April 25, 2022, 04:45:04 pm »
Yeah, the stock on these has totally dried up, plus the FPGA silicon shortage is BRUTAL.
I snagged one for $56 on ebay recently as an entry into the soft silicon world.
 

Online BrianHG

  • Super Contributor
  • ***
  • Posts: 7661
  • Country: ca
Re: Arrow DECA MAX 10 board for $37
« Reply #190 on: August 14, 2022, 01:15:45 am »
:scared:  Can somebody tell me which YOYO decided to wire the TLV320AIC3254 audio codec's line-out jack to the codec's line-out pins instead of the headphone-out pins?  I mean, the entire difference is that the line out's A-weighted dynamic range is 100dB vs. the headphone out's 99dB.  Absolutely every other spec is the same, except the headphone pins can drive a 16 ohm load, IE: headphones directly.  Their decision to use the codec's line out, designed to drive a 10k load, means I now need to find separate audio cables and a headphone amp just to use this tiny board.  I mean, couldn't they think a little, or read the data sheet?

Maybe they only wanted people with a suite of analog audio equipment to accompany this DECA board just to hear any sound.
« Last Edit: August 14, 2022, 01:18:16 am by BrianHG »
 

Online BrianHG

  • Super Contributor
  • ***
  • Posts: 7661
  • Country: ca
Re: Arrow DECA MAX 10 board for $37
« Reply #191 on: August 14, 2022, 01:47:30 am »
Sounds like a bad design decision, surely.  Though I wouldn't put much trust in the analog audio quality / SNR / reliability / ESD protection of what's intended to be a cheap FPGA board, regardless of which interface they used.
I'd worry about frying the FPGA / board when plugging in the audio cable or something, unless proven otherwise by CE testing and schematic review.

Anyway, though highly inconvenient vs. a correct PCB design, surely there has to be some Chinese gadget, basically a 3.5mm stereo line-in, through a buffer op-amp / line driver, to a 3.5mm stereo line / headphone out, with micro-USB power input, for like $10/5pcs?  I'd probably buy or design/build/use something like that just to add some EMC filtering, ESD protection and a "jack saver", lessening how much I'd physically handle / plug into the "unprotected"/"naked" FPGA board to make an audio connection, and similar for HDMI, ethernet switch, etc.

I mean, really.  I have a project where I just want to demonstrate some audio to someone.  Now I need to carry a pile of audio equipment just to listen, instead of just headphones.  It was merely the difference of moving 2 pins over to the left on the IC...

That audio IC would need a serious zap before it made its way far enough through to reach the FPGA and damage that IC.  Otherwise, even my home PC, with an audio codec chip on the motherboard, must be in great danger every time I touch my headphones.
 

Offline mopplayer

  • Newbie
  • Posts: 5
  • Country: tw
Re: Arrow DECA MAX 10 board for $37
« Reply #192 on: December 05, 2022, 12:40:37 pm »
Wow, that's interesting; people love to do something on an EOL board and, worse still, ES silicon.

An EOL product means you have no support, and you will solve problems on your own.

ES silicon means there are bugs in it and no guarantee, and the C6G (not I6G) device is still not available on the market, so that device is discontinued and only usable for fun (or education?)

You can imagine why Intel Core "ES" devices are so cheap, and why there are still "QS" devices.

Arrow is a commercial company; they realized all that and released them at a low price to get rid of them.

To be honest, I will not waste money on that electronic waste, even though it's only $39. :-//

If you want to use a MAX10 device, the DE10-Lite platform is more reliable and used to develop products nowadays.
 

Offline RoGeorge

  • Super Contributor
  • ***
  • Posts: 6146
  • Country: ro
Re: Arrow DECA MAX 10 board for $37
« Reply #193 on: December 05, 2022, 03:08:22 pm »
To be honest, I will not waste money on that electronic waste, even though it's only $39. :-//

If you want to use a MAX10 device, the DE10-Lite platform is more reliable and used to develop products nowadays.

It's been out of stock for a very long time anyway, though $39 was a very good price for a MAX10 devboard with the biggest/fastest MAX10 FPGA and all those peripherals.  The one you like is $140 instead of $40, and how do you know it is more "reliable", since you don't have a DECA board for comparison?

Online BrianHG

  • Super Contributor
  • ***
  • Posts: 7661
  • Country: ca
Re: Arrow DECA MAX 10 board for $37
« Reply #194 on: December 05, 2022, 07:29:24 pm »

If you want to use a MAX10 device, the DE10-Lite platforum is more reliable and used to develop products nowaday.
What if you want 1600megabytes per second a digital 24bit 1080p multilayerd graphics included HD embedded sound, with a 512megabyte display buffer?

That DE10-Lite uses obsolete discontinued SDRam.  A technology if you are learning to use will develop nothing but cheap old slow bad habit skills, never learning how to use advanced high speed ports or using .sdc time constraints files to operate those super high speed DDR IO ports properly.

Yes, the C6 50k MAX10s exist.  The ES was just an advanced engineering sample.
The C8, I8, C7, I7, and C6 speed grades all function the same, and you can get identical functionality from each.
Also, it is an FPGA; who cares which exact versions are available?
« Last Edit: December 05, 2022, 07:45:30 pm by BrianHG »
 

Offline mopplayer

  • Newbie
  • Posts: 5
  • Country: tw
Re: Arrow DECA MAX 10 board for $37
« Reply #195 on: December 26, 2022, 02:28:31 pm »

If you want to use a MAX10 device, the DE10-Lite platform is more reliable and is still used to develop products nowadays.
What if you want 1600 megabytes per second, digital 24-bit 1080p multilayered graphics with embedded HD audio, and a 512-megabyte display buffer?

That DE10-Lite uses obsolete, discontinued SDRAM.  Learning on that technology develops nothing but cheap, old, slow habits: you never learn how to use advanced high-speed ports, or how to use .sdc timing-constraint files to operate those super-high-speed DDR IO ports properly.

Yes, the C6 50k MAX10s exist.  The ES was just an advanced engineering sample.
The C8, I8, C7, I7, and C6 speed grades all function the same, and you can get identical functionality from each.
Also, it is an FPGA; who cares which exact versions are available?

Not really.

I only consider availability, reliability, and consistency, so an ES device is not an option.

That's why Intel also discontinued the MAX 10 FPGA Development Kits. So where is the commercial C6 device? Oh... there is no ES device either.
C6~C8 are speed grades; a high-speed application will also take that into account.

Yeah, just for fun, that's OK.

Why do you only look at the SDRAM/DDR framebuffer?
I do not need SDRAM/DDR; there is always a bandwidth bottleneck there. Besides, the device is not only for video applications.

So the DE10-Lite is better for students and hobbyists; that is the truth.

Besides MAX10, there are still Cyclone V and even Cyclone IV E; they are not much more expensive and offer more resources. The DE0-Nano is an example, and I have recommended that board as well.

I have used other, non-ES devices for my projects for a long time, and those ES boards are cheap in my country. They do not even need to be $39.

« Last Edit: December 26, 2022, 02:31:14 pm by mopplayer »
 

Offline Wiljan

  • Regular Contributor
  • *
  • Posts: 225
  • Country: dk
Re: Arrow DECA MAX 10 board for $37
« Reply #196 on: February 17, 2023, 09:28:37 am »
Hi
I was involved in a project with a MIPI camera, and then I remembered that the DECA has a MIPI CSI-2 port on board, so I wanted to play a bit with a camera connected to the FPGA.

The DECA MIPI port has a 30-pin connector and was originally made for an AR0833 camera, which you can't find anymore in the DECA layout.

So I have an IMX219 (RPi V2.1) cam using a 15-pin FFC connector, with 3V3, I2C, enable, 1x CSI clock lane, and 2x CSI data lanes.

I soldered some thin wires onto the diff-pair D-PHY termination resistors (see page 15 of the DECA schematic) and ran them to a small 15-pin FFC (made from an FFC-to-FFC connection board cut in half).

The I2C was just connected to the GPIO header, along with GND and 3V3.

After some searching on the internet, I found a place where the 640x480 I2C setup was used in an FPGA implementation:
https://purisa.me/blog/mipi-camera-progress/

I used imx219.sv from https://github.com/hdl-util/mipi-ccs for the I2C, which basically implements the 640x480 part of the Linux driver, and it works fine on the DECA with the camera.

I also added camera.sv from https://github.com/hdl-util/mipi-csi-2

I expected to get "frame_start", "frame_end", "line_start", "line_end", and "image_data_enable" signals.
Those signals are not steady  :(

When measuring the MIPI clock lane I see it is sending in bursts, and the same goes for the two data lanes.

My scope is a 200 MHz one, so the signal gets very weak above 200 MHz.

In the IMX219 datasheet I can see what the registers do and what the FPGA sets them to.

The camera specs are:
Max. 30 frame/s in all-pixel scan mode
Pixel rate: 280 Mpixel/s (all-pixels mode)
180 frame/s @720p with 2x2 analog (special) binning, 60 frame/s @1080p with V-crop
Data rate: max. 755 Mbps/lane (@4 lanes), 912 Mbps/lane (@2 lanes)

Total number of pixels: 3296 (H) × 2512 (V), approx. 8.28 Mpixels
Number of effective pixels: 3296 (H) × 2480 (V), approx. 8.17 Mpixels
Number of active pixels: 3280 (H) × 2464 (V), approx. 8.08 Mpixels

The cam module has a local 24 MHz oscillator, and there are a few PLLs in the cam to generate the right pixel clock and MIPI clock.

There is a register, 0x030d, which sets the "PLL output system multiplier value"; if I lower it to 0x20, I get 128 MHz on the MIPI clock, the clock becomes continuous, and I get frame start and end.
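The measured frequencies can be reproduced with a small sketch. The assumptions are mine, from one reading of the IMX219 datasheet's output PLL: EXCK = 24 MHz, the output-PLL pre-divider comes from register 0x0305 (set to 3 in the listing below), the multiplier from register 0x030d, and the MIPI clock lane (DDR) toggles at half the per-lane bit rate:

```python
# Sketch of how the MIPI clock frequency follows from reg 0x030d.
# Assumptions: EXCK = 24 MHz, pre-divider = reg 0x0305 = 3, and the
# DDR clock lane runs at half the per-lane bit rate.
EXCK_MHZ = 24
PREPLLCK_OP_DIV = 0x03  # reg 0x0305

def mipi_clk_mhz(pll_op_mpy: int) -> float:
    bit_rate = EXCK_MHZ / PREPLLCK_OP_DIV * pll_op_mpy  # Mbps per lane
    return bit_rate / 2  # clock lane toggles at half the bit rate

for mpy in (0x10, 0x20, 0x40):
    print(f"0x030d = 0x{mpy:02x} -> {mipi_clk_mhz(mpy):.0f} MHz")
```

This predicts 64, 128, and 256 MHz for 0x10, 0x20, and 0x40, which matches the ~63/128/256 MHz noted in the code comments.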

So my question:

If the camera has a total area of 3280x2464 = 8 Mpixels/frame and only 640x480 = 0.3 Mpixels/frame is set as the region of interest:

Can that be clocked out on the MIPI bus with a lower clock?

When you have the first line of 640 pixels, do those 640 need to be clocked out at the speed of the full 3280, or can they be clocked out more slowly?
I understand this would require the cam to hold one line of data; the datasheet mentions a FIFO in the cam, but I can't tell whether it holds only the active pixels of a line or not.

The cam datasheet says 912 Mbps/lane (@2 lanes), and for LVDS mode the MAX10 datasheet says 800 Mbps.

That's why I want to reduce the speed of the MIPI clock lane but still get the 640x480.
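For what it's worth, the payload bandwidth a 640x480 window actually needs is tiny compared to the 912 Mbps/lane maximum. A rough estimate, assuming RAW10 (10 bits/pixel), 30 fps, and 2 data lanes, and ignoring CSI-2 packet headers/ECC/CRC and blanking overhead:

```python
# Payload-only lane-rate estimate for a cropped/binned window.
# Assumptions: RAW10 = 10 bits/pixel, CSI-2 overhead and blanking ignored.
def lane_rate_mbps(width: int, height: int, bits_per_px: int,
                   fps: int, lanes: int) -> float:
    return width * height * bits_per_px * fps / lanes / 1e6

rate = lane_rate_mbps(640, 480, 10, 30, 2)
print(f"{rate:.2f} Mbps/lane -> ~{rate / 2:.0f} MHz DDR clock")
```

This gives about 46 Mbps/lane (a ~23 MHz DDR clock), so in principle there is plenty of room to slow the link down and stay well inside the MAX10's LVDS limit, provided the sensor's FIFO can decouple readout speed from link speed.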

Let me hear what you think.

Code: [Select]
{1'b0, 16'h0100, 8'h00}, // Stop Streaming
 //--Access command sequence Seq. No. Address (Hex) data---
 {1'b0, 16'h30eb, 8'h05}, // init seq
 {1'b0, 16'h30eb, 8'h0c}, // init seq
 {1'b0, 16'h300a, 8'hff}, // init seq
 {1'b0, 16'h300b, 8'hff}, // init seq
 {1'b0, 16'h30eb, 8'h05}, // init seq
 {1'b0, 16'h30eb, 8'h09}, // init seq
 //--Output Set-up Registers--
 {1'b0, 16'h0114, 8'h01},  //CSI_lane_mode 0: Reserved, 1: 2-Lane, 2: Reserved, 3: 4-Lane
 {1'b0, 16'h0128, 8'h00}, //DPHY_CTRL MIPI Global timing setting 0: auto mode, 1: manual mode
 {1'b0, 16'h012a, 8'h18}, //EXCK_FREQ[15:8] RW INCK frequency [MHz] default 0x0C // 24 = 0x18
 {1'b0, 16'h012b, 8'h00}, //EXCK_FREQ[7:0] RW INCK frequency [MHz]
 {1'b0, 16'h0162, 8'h0d}, // LINE_LENGTH_A[15:8]  0x0d78 = 3448 pixels
 {1'b0, 16'h0163, 8'h78}, // LINE_LENGTH_A[7:0]
//----offset X-----------
 {1'b0, 16'h0164, 8'h03}, // X_ADD_STA_A[11:8] x_addr_start 0x03e8 = 1000
 {1'b0, 16'h0165, 8'he8}, // X_ADD_STA_A[7:0]
 {1'b0, 16'h0166, 8'h08}, // X_ADD_END_A[11:8] x_addr_end 0x08e7 = 2279
 {1'b0, 16'h0167, 8'he7}, // X_ADD_END_A[7:0]
//----offset Y-----------
 {1'b0, 16'h0168, 8'h02}, // Y_ADD_STA_A[11:8] y_addr_start 0x02f0 = 752
 {1'b0, 16'h0169, 8'hf0}, // Y_ADD_STA_A[7:0]
 {1'b0, 16'h016a, 8'h06}, // Y_ADD_END_A[11:8] y_addr_end 0x06af = 1711
 {1'b0, 16'h016b, 8'haf}, // Y_ADD_END_A[7:0]
//----output size----------
 {1'b0, 16'h016c, 8'h02}, // x_output_size[11:8] 0x0280 = 640
 {1'b0, 16'h016d, 8'h80}, // x_output_size[7:0]
 {1'b0, 16'h016e, 8'h01}, // y_output_size[11:8] 0x01e0 = 480
 {1'b0, 16'h016f, 8'he0}, // y_output_size[7:0]
 
 {1'b0, 16'h0170, 8'h01},
 {1'b0, 16'h0171, 8'h01},
 {1'b0, 16'h0174, 8'h03},
 {1'b0, 16'h0175, 8'h03},

 //---Clock Set-up----------
 {1'b0, 16'h0301, 8'h05},
 {1'b0, 16'h0303, 8'h01},
 {1'b0, 16'h0304, 8'h03},
 {1'b0, 16'h0305, 8'h03},
 {1'b0, 16'h0306, 8'h00},
 {1'b0, 16'h0307, 8'h39},
// {1'b0, 16'h0307, 8'h2b},
 {1'b0, 16'h030b, 8'h01},
 {1'b0, 16'h030c, 8'h00},
// {1'b0, 16'h030d, 8'h72},
 {1'b0, 16'h030d, 8'h40},
// {1'b0, 16'h030d, 8'h10},  // 0x10=63Mhz ok , 0x20 =128Mhz ok, 0x40 = 256Mhz
 //---Test Pattern Registers----
 {1'b0, 16'h0624, 8'h06}, //1640
 {1'b0, 16'h0625, 8'h68},
 {1'b0, 16'h0626, 8'h04},  //1232
 {1'b0, 16'h0627, 8'hd0},
 
 {1'b0, 16'h455e, 8'h00},
 {1'b0, 16'h471e, 8'h4b},
 {1'b0, 16'h4767, 8'h0f},
 {1'b0, 16'h4750, 8'h14},
 {1'b0, 16'h4540, 8'h00},
 {1'b0, 16'h47b4, 8'h14},
 {1'b0, 16'h4713, 8'h30},
 {1'b0, 16'h478b, 8'h10},
 {1'b0, 16'h478f, 8'h10},
 {1'b0, 16'h4793, 8'h10},
 {1'b0, 16'h4797, 8'h0e},
 {1'b0, 16'h479b, 8'h0e},
 {1'b0, 16'h0100, 8'h01} // Start streaming
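Reading the X/Y address registers above, they appear to describe a 1280x960 window centered on the 3280x2464 active array, which the binning registers 0x0174/0x0175 (= 0x03) then reduce to the 640x480 output. This is my interpretation of the register comments, not an official derivation, but a quick check reproduces the exact register values:

```python
# Check that the ROI registers correspond to a centered 1280x960 window
# on the IMX219's 3280x2464 active array (interpretation, not official).
ACTIVE_W, ACTIVE_H = 3280, 2464

def centered_window(win_w: int, win_h: int):
    x0 = (ACTIVE_W - win_w) // 2
    y0 = (ACTIVE_H - win_h) // 2
    # end addresses are inclusive, hence the -1
    return x0, x0 + win_w - 1, y0, y0 + win_h - 1

print(centered_window(1280, 960))  # (1000, 2279, 752, 1711)
# Matches X_ADD_STA=0x03e8, X_ADD_END=0x08e7, Y_ADD_STA=0x02f0, Y_ADD_END=0x06af
```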




 

Offline Wiljan

  • Regular Contributor
  • *
  • Posts: 225
  • Country: dk
Re: Arrow DECA MAX 10 board for $37
« Reply #197 on: February 23, 2023, 10:09:18 am »
Some progress: I had some success with this clock setting for the RPi V2.1 cam (IMX219).
Code: [Select]
//---Clock Set-up----------
 {1'b0, 16'h0301, 8'h05},
 {1'b0, 16'h0303, 8'h01},
 {1'b0, 16'h0304, 8'h03},
 {1'b0, 16'h0305, 8'h03},
 {1'b0, 16'h0306, 8'h00},

// {1'b0, 16'h0307, 8'h10},
 {1'b0, 16'h0307, 8'h39},
// {1'b0, 16'h0307, 8'h2b},

 {1'b0, 16'h030b, 8'h01},
 {1'b0, 16'h030c, 8'h00},

 // {1'b0, 16'h030d, 8'h72},
 {1'b0, 16'h030d, 8'h20},
// {1'b0, 16'h030d, 8'h10},  // 0x10=63Mhz ok , 0x20 =128Mhz ok, 0x40 = 256Mhz

So I have camera.sv running with the 2x d_phy_receiver.sv, plus an I2C master to set the cam's I2C registers.

Using SignalTap I get these two different data captures out of camera.sv and a clean frame start/end signal;
one image is with the lamp off, and the other is with a lamp shining on the camera.

Now I have to process the data into RGB and then try to feed it into the BrianHG_DDR3 RAM so it will appear on the HDMI output.
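For the RGB step, a minimal approach is to collapse each 2x2 Bayer cell into one RGB pixel (averaging the two greens). The sketch below assumes an RGGB pattern; the IMX219's actual Bayer order depends on the flip registers, so verify the order against your setup before trusting the colors:

```python
# Minimal Bayer-to-RGB sketch: one output pixel per 2x2 cell.
# Assumes RGGB ordering (top-left = R); IMX219's real order depends
# on the flip registers, so check before relying on this.
def bayer_rggb_to_rgb(raw, width, height):
    """raw: row-major list of pixel values; width/height must be even."""
    rgb = []
    for y in range(0, height, 2):
        row = []
        for x in range(0, width, 2):
            r  = raw[y * width + x]
            g1 = raw[y * width + x + 1]
            g2 = raw[(y + 1) * width + x]
            b  = raw[(y + 1) * width + x + 1]
            row.append((r, (g1 + g2) // 2, b))
        rgb.append(row)
    return rgb

# 2x2 frame R=100, G=50, G=70, B=30 -> one pixel
print(bayer_rggb_to_rgb([100, 50, 70, 30], 2, 2))  # [[(100, 60, 30)]]
```

In the FPGA this would be a line buffer plus a small pipeline rather than nested loops, but the arithmetic per output pixel is the same.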




« Last Edit: February 23, 2023, 10:11:05 am by Wiljan »
 

Online BrianHG

  • Super Contributor
  • ***
  • Posts: 7661
  • Country: ca
Re: Arrow DECA MAX 10 board for $37
« Reply #198 on: July 04, 2023, 01:04:44 am »
This is where most of the $37 DECA boards went...
DECA at $200 on AliExpress
Funny how the store just happened to open at the same time as the $37 special...
 

