Author Topic: Embedded Graphics Interface GUI  (Read 2422 times)


Offline bentomoTopic starter

  • Contributor
  • Posts: 37
Embedded Graphics Interface GUI
« on: February 17, 2020, 02:17:18 pm »
I'm working on a personal project that uses an FPGA to convert YCbCr to RGB and control the sync timing for a 640x480 LCD. I would like to reuse this display and a couple of buttons for a graphics user interface. I've got things that I'd like to control at run time like sound, display adjustment, battery status etc...
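For reference, the YCbCr-to-RGB step can be modelled in C with the common BT.601 integer approximation. This is only a sketch for sanity-checking hardware output; the coefficient set below is one standard choice and not necessarily what the OP's converter implements:

```c
#include <stdint.h>
#include <assert.h>

/* Clamp an intermediate result into the 8-bit RGB range. */
static uint8_t clamp8(int v) { return v < 0 ? 0 : (v > 255 ? 255 : (uint8_t)v); }

/* Studio-swing BT.601 YCbCr -> full-range RGB using integer (8.8 fixed
 * point) coefficients, i.e. the usual approximation of:
 *   R = 1.164*(Y-16) + 1.596*(Cr-128)
 *   G = 1.164*(Y-16) - 0.392*(Cb-128) - 0.813*(Cr-128)
 *   B = 1.164*(Y-16) + 2.017*(Cb-128)
 */
void ycbcr_to_rgb(uint8_t y, uint8_t cb, uint8_t cr,
                  uint8_t *r, uint8_t *g, uint8_t *b)
{
    int c = (int)y - 16, d = (int)cb - 128, e = (int)cr - 128;
    *r = clamp8((298 * c + 409 * e + 128) >> 8);
    *g = clamp8((298 * c - 100 * d - 208 * e + 128) >> 8);
    *b = clamp8((298 * c + 516 * d + 128) >> 8);
}
```

The same multiply-add structure maps directly onto FPGA DSP blocks, one per colour channel.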

Currently, my design has no frame buffer and I do not have any kind of external DRAM controller.

What I was thinking is I could insert a small BRAM buffer and have the internal CPU populate that based on a few parameters, then the LCD controller simply pulls from that buffer (and scales to minimize ram usage) in addition to the YCbCr channel.
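The "pull from a small buffer and scale" idea amounts to mapping each output pixel back to a source pixel. A minimal nearest-neighbour sketch in C (the buffer dimensions are made-up example values; with integer ratios the divide collapses to a shift, which is what the LCD controller would do in fabric):

```c
#include <stdint.h>
#include <assert.h>

/* Nearest-neighbour mapping from a 640x480 screen position back into a
 * smaller overlay buffer.  SRC_W/SRC_H are hypothetical dimensions
 * chosen so the 4:1 ratio reduces the divide to a >>2 in hardware. */
enum { SRC_W = 160, SRC_H = 120, DST_W = 640, DST_H = 480 };

uint8_t overlay_sample(uint8_t buf[SRC_H][SRC_W], int x, int y)
{
    int sx = x * SRC_W / DST_W;   /* == x >> 2 for these dimensions */
    int sy = y * SRC_H / DST_H;   /* == y >> 2 for these dimensions */
    return buf[sy][sx];
}
```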

This is probably the simplest method I could think of because then I can handle all the text generation and primitives in firmware. The GUI doesn't need to be fast, just usable.

I'm wondering if there are any open-source projects out there that do something like this? Or maybe there are some industry standards for embedded display graphics? I have a feeling that embedded graphics depend heavily on the available hardware, and there's probably not much available for standards?

This is a personal project so I'm looking to balance some learning with actually getting the project done.  ^-^

I've been looking at high and low level projects as examples.

What I think is high level: https://littlevgl.com/
What I think is low level: https://www.academia.edu/7652720/Design_of_VGA_Controller_using_VHDL_for_LCD_Display_using_FPGA

I'm not new to FPGAs, but I am new to embedded graphics. Most of my graphics background was on full Linux workstations using easy things like java and python.

Any direction to save me some time would be very much appreciated! Cheers!
 

Online rstofer

  • Super Contributor
  • ***
  • Posts: 9932
  • Country: us
Re: Embedded Graphics Interface GUI
« Reply #1 on: February 17, 2020, 05:04:40 pm »
I'm working on a personal project that uses an FPGA to convert YCbCr to RGB and control the sync timing for a 640x480 LCD. I would like to reuse this display and a couple of buttons for a graphics user interface. I've got things that I'd like to control at run time like sound, display adjustment, battery status etc...
I'm no help on this but if the images are static, I would use MATLAB to do the conversion.  There is stuff all over the Internet including this app note by Xilinx

https://www.xilinx.com/support/documentation/application_notes/xapp931.pdf  There is a link to the design files...
Quote
Currently, my design has no frame buffer and I do not have any kind of external DRAM controller.

What I was thinking is I could insert a small BRAM buffer and have the internal CPU populate that based on a few parameters, then the LCD controller simply pulls from that buffer (and scales to minimize ram usage) in addition to the YCbCr channel.
You will almost certainly need a frame buffer as you move toward a GUI.  Dual-port BRAM will work well because the application can read/write from one side and the display controller can read from the other side.
Quote
This is probably the simplest method I could think of because then I can handle all the text generation and primitives in firmware. The GUI doesn't need to be fast, just usable.

I'm wondering if there are any open-source projects out there that do something like this? Or maybe there are some industry standards for embedded display graphics? I have a feeling that embedded graphics depend heavily on the available hardware, and there's probably not much available for standards?

This is a personal project so I'm looking to balance some learning with actually getting the project done.  ^-^

I've been looking at high and low level projects as examples.

What I think is high level: https://littlevgl.com/
What I think is low level: https://www.academia.edu/7652720/Design_of_VGA_Controller_using_VHDL_for_LCD_Display_using_FPGA
I couldn't get to the academia article but that littlevgl project looks nice.  But what to do with it?  Unless you implement a core for which there is a C compiler, you're in for a lot of pain.  I'm in the Xilinx camp so I guess I would drop in a MicroBlaze core and use the provided tools to deal with the code.  I haven't done any of this so I have no idea how difficult it might be.  Or, maybe one of the open source RISC-V implementations, if I knew for a fact that there was a decent C compiler available.

ETA:  I wonder how Zynq plays into this.  Having an ARM core is nice, there are ARM compilers all over the place.  I don't know anything about the device so I wonder how much FPGA fabric is available.  Still, a nice alternative to consider.

It will run Linux; I'm not sure if that is a help or not.

The chip itself works for oscilloscopes but I don't know if there is a Linux kernel involved or not.

« Last Edit: February 17, 2020, 05:11:47 pm by rstofer »
 

Offline OwO

  • Super Contributor
  • ***
  • Posts: 1250
  • Country: cn
  • RF Engineer.
Re: Embedded Graphics Interface GUI
« Reply #2 on: February 17, 2020, 05:15:06 pm »
I'm pretty sure you don't need a framebuffer to implement a basic UI;
if you do the math, with a tiny 320x240 display, you need 320*240*16bpp ≈ 1.2 Mbit for a framebuffer. Most MCUs that run a UI don't have anywhere near that much memory ;)
All you need to do is render polygons on the fly and store shapes in memory. The CPU could "render" the screen to some triangles, while the FPGA generates video in realtime by testing whether each pixel is inside a triangle.
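The per-pixel test described above can be sketched with the standard edge-function (half-plane) method. A software model in C (accepting both windings is one possible design choice, not necessarily what a given implementation would do):

```c
#include <stdbool.h>
#include <assert.h>

/* Signed area of the parallelogram spanned by edge (a->b) and point p;
 * the sign says which side of the edge p lies on. */
static int edge_fn(int ax, int ay, int bx, int by, int px, int py)
{
    return (bx - ax) * (py - ay) - (by - ay) * (px - ax);
}

/* A pixel is inside the triangle when it is on the same side of all
 * three edges; accepting both sign conventions makes vertex winding
 * irrelevant.  In an FPGA the three edge functions can be updated
 * incrementally (adds only) as the raster position advances. */
bool in_triangle(int x0, int y0, int x1, int y1, int x2, int y2,
                 int px, int py)
{
    int e0 = edge_fn(x0, y0, x1, y1, px, py);
    int e1 = edge_fn(x1, y1, x2, y2, px, py);
    int e2 = edge_fn(x2, y2, x0, y0, px, py);
    return (e0 >= 0 && e1 >= 0 && e2 >= 0) ||
           (e0 <= 0 && e1 <= 0 && e2 <= 0);
}
```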
Email: OwOwOwOwO123@outlook.com
 

Online rstofer

  • Super Contributor
  • ***
  • Posts: 9932
  • Country: us
Re: Embedded Graphics Interface GUI
« Reply #3 on: February 17, 2020, 06:38:12 pm »
Littlevgl doesn't need a bitmap-type frame buffer; it draws objects, and they are transferred to the display adapter via DMA.  I didn't read all of the documentation, but it seems able to work on an ARM device with relatively little RAM.

The Xilinx Artix XC7A100T FPGA has 4860 Kb of BlockRAM and 1188 Kb of Distributed RAM.  I think I might limit the color depth to 8 bits for something like an HMI device.  640*480*8 => 2,457,600 bits or 2400 Kb - half the size of BlockRAM.  I could either expand the color depth or have two buffers.  If I even had to have a bitmap.  Or I could leave well enough alone and just use the rest of BlockRAM for the core memory.  In any event, I would try to keep the memory internal to the FPGA.

https://www.xilinx.com/support/documentation/selection-guides/cost-optimized-product-selection-guide.pdf#A7  Page 4

This board also has 128 MB of DDR2 and there is a VHDL component to make it look like static RAM:

https://store.digilentinc.com/nexys-a7-fpga-trainer-board-recommended-for-ece-curriculum/

I'm not sure I see a clean way to get a complete HMI running on an FPGA.  Certainly, it can be done, but it seems to me like one of the newer ARM chips would be a better way to go.  Some are already set up for HMI with a touch screen display, etc.

There's a free version of Embedded Wizard  https://www.embedded-wizard.de/

 

Offline bentomoTopic starter

  • Contributor
  • Posts: 37
Re: Embedded Graphics Interface GUI
« Reply #4 on: February 17, 2020, 07:00:10 pm »
Great responses guys! Thank you,

A little more detail:

I use Zynq devices in my day job, and while I think they're among the best on the FPGA market currently, they come with a price tag to match. I'm most likely going to be using a Lattice device with a RISC-V CPU, probably the MachXO3 or ECP5 series (<$10 on Digi-Key vs $45 for the smallest Zynq). I've got an environment for a picorv32 that works in sim and in hardware, so I'm probably sticking with that to keep things simple and cheap while not reaching for another MCU.

I'm pretty sure you don't need a framebuffer to implement a basic UI;
if you do the math, with a tiny 320x240 display, you need 320*240*16bpp ≈ 1.2 Mbit for a framebuffer. Most MCUs that run a UI don't have anywhere near that much memory ;)
All you need to do is render polygons on the fly and store shapes in memory. The CPU could "render" the screen to some triangles, while the FPGA generates video in realtime by testing whether each pixel is inside a triangle.
While I can imagine there are plenty of tricks for rendering on the fly without a "frame buffer", a 200x100 4bpp buffer is probably good enough for my needs, so an internal BRAM should do the trick.
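For scale: a 200x100 4bpp buffer packs two pixels per byte, so 200*100/2 = 10,000 bytes (~80 kbit), an easy fit for internal block RAM. A sketch of the nibble packing the firmware side might use (putting even pixels in the low nibble is an arbitrary assumption; it just has to match the LCD controller's read side):

```c
#include <stdint.h>
#include <assert.h>

/* 200x100 4bpp overlay buffer, two pixels per byte. */
enum { OVL_W = 200, OVL_H = 100 };
static uint8_t ovl[OVL_W * OVL_H / 2];

/* Write one 4-bit pixel: odd pixel indices go in the high nibble. */
void ovl_set(int x, int y, uint8_t color4)
{
    int i = y * OVL_W + x;
    if (i & 1)
        ovl[i >> 1] = (uint8_t)((ovl[i >> 1] & 0x0F) | ((color4 & 0x0F) << 4));
    else
        ovl[i >> 1] = (uint8_t)((ovl[i >> 1] & 0xF0) | (color4 & 0x0F));
}

/* Read one 4-bit pixel back. */
uint8_t ovl_get(int x, int y)
{
    int i = y * OVL_W + x;
    return (i & 1) ? (uint8_t)(ovl[i >> 1] >> 4) : (uint8_t)(ovl[i >> 1] & 0x0F);
}
```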


I'm working on a personal project that uses an FPGA to convert YCbCr to RGB and control the sync timing for a 640x480 LCD. I would like to reuse this display and a couple of buttons for a graphics user interface. I've got things that I'd like to control at run time like sound, display adjustment, battery status etc...
I'm no help on this but if the images are static, I would use MATLAB to do the conversion.  There is stuff all over the Internet including this app note by Xilinx

https://www.xilinx.com/support/documentation/application_notes/xapp931.pdf  There is a link to the design files...

Quote
This is probably the simplest method I could think of because then I can handle all the text generation and primitives in firmware. The GUI doesn't need to be fast, just usable.

I'm wondering if there are any open-source projects out there that do something like this? Or maybe there are some industry standards for embedded display graphics? I have a feeling that embedded graphics depend heavily on the available hardware, and there's probably not much available for standards?

This is a personal project so I'm looking to balance some learning with actually getting the project done.  ^-^

I've been looking at high and low level projects as examples.

What I think is high level: https://littlevgl.com/
What I think is low level: https://www.academia.edu/7652720/Design_of_VGA_Controller_using_VHDL_for_LCD_Display_using_FPGA
I couldn't get to the academia article but that littlevgl project looks nice.  But what to do with it?  Unless you implement a core for which there is a C compiler, you're in for a lot of pain.  I'm in the Xilinx camp so I guess I would drop in a MicroBlaze core and use the provided tools to deal with the code.  I haven't done any of this so I have no idea how difficult it might be.  Or, maybe one of the open source RISC-V implementations, if I knew for a fact that there was a decent C compiler available.

ETA:  I wonder how Zynq plays into this.  Having an ARM core is nice, there are ARM compilers all over the place.  I don't know anything about the device so I wonder how much FPGA fabric is available.  Still, a nice alternative to consider.

It will run Linux; I'm not sure if that is a help or not.

The chip itself works for oscilloscopes but I don't know if there is a Linux kernel involved or not.

That's a good point that I should have brought up. The GUI would be simple enough that I'd like to stick with targeting bare metal, not Linux. If I had a Zynq I would probably just spin through the tools and have a command-line debug interface ready to go, but I think that's a little overkill for my uses. Maybe if I were an embedded GUI developer in my day job I would use littlevgl, but I agree, implementing a compiler for custom hardware that doesn't exist yet would be a pretty big bear. That would definitely be a killer for a "spare time" project. Thank you for the app note pointer though; that looks very interesting.

The academia article just goes over generating simple ASCII characters to be sent to a VGA controller. Not exactly what I'm looking for, but I just wanted to highlight that by low level I meant almost no software abstraction of the hardware.
 

Online asmi

  • Super Contributor
  • ***
  • Posts: 2794
  • Country: ca
Re: Embedded Graphics Interface GUI
« Reply #5 on: February 17, 2020, 07:12:28 pm »
Where is the video signal coming from, and in what form? Is it a parallel "RGB" bus, or some more sophisticated protocol?
If there is a way to get that video data in the form of an AXI Stream bus, I'd use the IP provided by Vivado for conversion to RGB, as well as the IPs for reading from and writing to a framebuffer. For the framebuffer I'd use a smallish DDR2 memory chip (saving one DC-DC converter, as DDR2 can reuse the 1.8 V rail the FPGA itself already requires). Something like 32Mx16 is fairly cheap, yet provides more than enough capacity for triple buffering, along with enough bandwidth to read and write the memory at the same time and handle any kind of frame composition.
« Last Edit: February 17, 2020, 08:11:15 pm by asmi »
 

Offline bentomoTopic starter

  • Contributor
  • Posts: 37
Re: Embedded Graphics Interface GUI
« Reply #6 on: February 18, 2020, 01:25:40 pm »
The actual frame buffer and YCbCr data stream come from an external device. The frame buffer is also on said device, and I have no control over it. I already have a YCbCr to RGB converter working in hardware. I can see how having my own frame buffer could solve my issue, but that's not really my challenge either. My post was more about looking around to see if there are any standards for embedded video, but I don't think many things exist, and if they do, they're certainly not a standard.

I'll most likely end up implementing my own text/box renderer using an internal frame buffer and controlling it with software. I might be able to use some real GPU tricks too like transforming and stretching instead of direct bitmapping, and that's where the fun math starts. But I don't need to get too fancy.  ::)
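A text/box renderer of this kind mostly reduces to blitting glyph rows from a font ROM into the buffer. A minimal byte-per-pixel sketch in C (the single 8x8 glyph, the buffer size, and the function names are all made-up examples; a real design would index a full font ROM by character code):

```c
#include <stdint.h>
#include <assert.h>

/* Byte-per-pixel framebuffer and one hypothetical 8x8 glyph ('T'). */
enum { FB_W = 200, FB_H = 100 };
static uint8_t fb[FB_H][FB_W];

static const uint8_t glyph_T[8] = {
    0xFF, 0x18, 0x18, 0x18, 0x18, 0x18, 0x18, 0x00
};

/* Blit one 1bpp glyph into the framebuffer: each set bit becomes an
 * opaque foreground pixel; clear bits leave the background alone. */
void draw_glyph(const uint8_t rows[8], int x, int y, uint8_t fg)
{
    for (int r = 0; r < 8; r++)
        for (int c = 0; c < 8; c++)
            if (rows[r] & (0x80u >> c))
                fb[y + r][x + c] = fg;
}
```

Box drawing is the same loop without the font lookup, and the transform/stretch tricks mentioned above would just change how (x, y) is computed before the store.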
 

Online asmi

  • Super Contributor
  • ***
  • Posts: 2794
  • Country: ca
Re: Embedded Graphics Interface GUI
« Reply #7 on: February 18, 2020, 02:34:44 pm »
The actual frame buffer and YCbCr data stream come from an external device. The frame buffer is also on said device, and I have no control over it. I already have a YCbCr to RGB converter working in hardware. I can see how having my own frame buffer could solve my issue, but that's not really my challenge either. My post was more about looking around to see if there are any standards for embedded video, but I don't think many things exist, and if they do, they're certainly not a standard.
Well, in the Xilinx world there is a thing called "AXI Stream video", which is a convention for streaming video over the AXI Stream bus, and all the video-related IPs they provide comply with it. So you can mix and match those IPs as you see fit, or chain them into a pipeline, and it will "just work". A lot of these IPs are provided for free, including some fairly sophisticated ones like the video mixer, which allows blending several layers into a single video stream. But if you're into Zynq, you probably already know this. Not sure whether you can call it a standard in a global sense, but it certainly is one within their ecosystem.

Offline OwO

  • Super Contributor
  • ***
  • Posts: 1250
  • Country: cn
  • RF Engineer.
Re: Embedded Graphics Interface GUI
« Reply #8 on: February 18, 2020, 03:45:22 pm »
Nah it's more fun to implement that stuff yourself  ;) Plus I don't like being locked to one particular FPGA vendor and try to keep all my designs trivial to port to any new vendor.
 

Online asmi

  • Super Contributor
  • ***
  • Posts: 2794
  • Country: ca
Re: Embedded Graphics Interface GUI
« Reply #9 on: February 18, 2020, 04:31:27 pm »
Nah it's more fun to implement that stuff yourself  ;)
This is good as long as your goal is to have fun, and not to deliver a product on a tight timeline.

Plus I don't like being locked to one particular FPGA vendor and try to keep all my designs trivial to port to any new vendor.
I've heard this thrown around a lot, but have you ever actually done it? I know my designs are always built around a specific board with a specific part installed on it, so I do whatever it takes to get it to work. I don't see myself ever moving to Altera parts, for example; they are too greedy.

Offline OwO

  • Super Contributor
  • ***
  • Posts: 1250
  • Country: cn
  • RF Engineer.
Re: Embedded Graphics Interface GUI
« Reply #10 on: February 18, 2020, 05:07:52 pm »
Yes. I started with Altera many years ago, and when it came time to make my own boards I had to go with X because A chips are hard to get. All of my existing VHDL modules "just worked" without any porting effort. I generally avoid using vendor IP, and device-specific functionality (PLLs, I/O interfacing) is only allowed in the top-level module (these need to be redone with a board change anyway). Using IP or primitive instantiation in a core module is a no-no; even when I need, e.g., a specific DSP48E1 cascading structure, I will infer it by writing HDL that exactly represents the DSP48 logic instead of instantiating it. In my FFT core the multiplier module is pluggable, and I have several implementations, each optimized for a different vendor. If you use the wrong vendor variant (e.g. the DSP48 version on Altera) it will still work, but with lower Fmax.

In product designs I can literally just swap out the FPGA for a Gowin or Anlogic one with less than an hour of code changes (it will be mainly PCB routing effort), and this is extremely important in that it gives you real leverage when negotiating prices with X or A for a high volume product.
 

Online asmi

  • Super Contributor
  • ***
  • Posts: 2794
  • Country: ca
Re: Embedded Graphics Interface GUI
« Reply #11 on: February 19, 2020, 02:19:59 am »
In product designs I can literally just swap out the FPGA for a Gowin or Anlogic one with less than an hour of code changes (it will be mainly PCB routing effort), and this is extremely important in that it gives you real leverage when negotiating prices with X or A for a high volume product.
How many high volume products with FPGA have you shipped?

Offline OwO

  • Super Contributor
  • ***
  • Posts: 1250
  • Country: cn
  • RF Engineer.
Re: Embedded Graphics Interface GUI
« Reply #12 on: February 19, 2020, 03:58:25 am »
I work for a contractor, so we don't ship products, but customers usually demand vendor flexibility (using generic parts whenever possible, and as few single-sourced parts as possible). No lock-in to X parts will usually be in the contract. In other words, product OEMs really do consider this a top priority, and IMO it's not hard to achieve, provided you don't get lazy and start using IP gratuitously like the vendors want you to.
« Last Edit: February 19, 2020, 04:00:43 am by OwO »
 

Online asmi

  • Super Contributor
  • ***
  • Posts: 2794
  • Country: ca
Re: Embedded Graphics Interface GUI
« Reply #13 on: February 19, 2020, 03:13:47 pm »
I work for a contractor, so we don't ship products, but customers usually demand vendor flexibility (using generic parts whenever possible, and as few single-sourced parts as possible). No lock-in to X parts will usually be in the contract. In other words, product OEMs really do consider this a top priority, and IMO it's not hard to achieve, provided you don't get lazy and start using IP gratuitously like the vendors want you to.
It's not about being lazy, it's all about cost. Engineers' time over here is very expensive, so when I give them a choice of "chip family A only, for X$ and a couple of weeks" vs "vendor-agnostic, for 10X$ and a couple of months", they always pick the first option. Extra investment into engineering only pays off when the volume is high and the product's lifetime is long enough to "outlive" the original chip.

Offline bentomoTopic starter

  • Contributor
  • Posts: 37
Re: Embedded Graphics Interface GUI
« Reply #14 on: February 19, 2020, 06:07:41 pm »
Yes. I started with Altera many years ago, and when it came time to make my own boards I had to go with X because A chips are hard to get. All of my existing VHDL modules "just worked" without any porting effort. I generally avoid using vendor IP, and device-specific functionality (PLLs, I/O interfacing) is only allowed in the top-level module (these need to be redone with a board change anyway). Using IP or primitive instantiation in a core module is a no-no; even when I need, e.g., a specific DSP48E1 cascading structure, I will infer it by writing HDL that exactly represents the DSP48 logic instead of instantiating it. In my FFT core the multiplier module is pluggable, and I have several implementations, each optimized for a different vendor. If you use the wrong vendor variant (e.g. the DSP48 version on Altera) it will still work, but with lower Fmax.

In product designs I can literally just swap out the FPGA for a Gowin or Anlogic one with less than an hour of code changes (it will be mainly PCB routing effort), and this is extremely important in that it gives you real leverage when negotiating prices with X or A for a high volume product.

I generally like to target this too; I try to infer BRAMs and DSPs wherever possible, my main motivation being the ability to switch between Lattice and Xilinx easily. But do you have situations where you must use a primitive lower in the hierarchy, like an asynchronous crossing for example? Do you try to infer the async registers and use vendor-specific constraints, or do you use Xilinx primitives like the xil_cdc blocks? I feel like in some situations you would need to create a wrapper and use a #define depending on which vendor you're targeting.
 

Offline OwO

  • Super Contributor
  • ***
  • Posts: 1250
  • Country: cn
  • RF Engineer.
Re: Embedded Graphics Interface GUI
« Reply #15 on: February 20, 2020, 03:56:33 am »
Clock domain crossing isn't a problem: you just use a chain of flip-flops and give them descriptive names (e.g. asyncCDCTarget) so that when they come up in timing analysis you can easily mark them as a false path. On Xilinx you might also add the ASYNC_REG attribute:
Code: [Select]
attribute ASYNC_REG : string;
attribute ASYNC_REG of greyCDCSyncAsyncTarget: signal is "TRUE";
This is optional and prevents the registers from being packed into an SRL16. Other vendors ignore the attribute, and since you can add multiple attributes to a net, you can support all vendors' attributes simultaneously.

The only time I've used something that can't be inferred is for clock gating (BUFGCE). For that I defined a "building blocks" library that contained a wrapper. Pretty much every vendor has a BUFGCE equivalent, so I would simply swap in a different wrapper to target Altera.

Some vendor specific primitives lead to more optimized designs, such as the SRL16, and in those cases I'll develop several variants of a basic building block. One example was a shallow FIFO, which has two variants, a SRL16 version and a LUTRAM version. Only on Xilinx would the SRL16 version be used, although it will still synthesize on other vendors because it's inferred.
 

Offline SiliconWizard

  • Super Contributor
  • ***
  • Posts: 15246
  • Country: fr
Re: Embedded Graphics Interface GUI
« Reply #16 on: February 20, 2020, 02:16:03 pm »
I work for a contractor, so we don't ship products, but customers usually demand vendor flexibility (using generic parts whenever possible, and as few single-sourced parts as possible). No lock-in to X parts will usually be in the contract. In other words, product OEMs really do consider this a top priority, and IMO it's not hard to achieve, provided you don't get lazy and start using IP gratuitously like the vendors want you to.
(...) Extra investment into engineering only pays off when the volume is high and the product's lifetime is long enough to "outlive" the original chip.

I don't quite agree with the "only" part of your statement. There are certainly other cases to consider here. Your above statement seems to only deal with ONE specific product!

One key thing here is reusability. Whether you work as a contractor, or directly in the company designing the product, there may be (in practice, very often IME) at least some part of the design that can be reused in future designs/other products, and that can largely justify some extra investment at some point - future developments can then be much faster and cost a lot less. If you keep designing one-shot solutions, it ends up being a lot less effective in the long run, with often less consistency.

Situations in which this doesn't apply are when ALL your designs are completely different with nothing in common - that's extremely rare IME, even when you're contracting! There's always something to reuse. This can also be a decision point if the deadline on a given project is much too short. But every time you can afford some initial extra time, it almost always pays off over time.

Sure you can also reuse things while using third-party IPs, but that's more limited in the long term, and progressively building your own IPs is IME a good investment. Anyway, the key idea here is: reusability across projects, while keeping complete control over the maximum number of "critical" blocks you're using.

 

