Author Topic: FPGA VGA Controller for 8-bit computer  (Read 424721 times)


Online BrianHG

  • Super Contributor
  • ***
  • Posts: 7733
  • Country: ca
Re: FPGA VGA Controller for 8-bit computer
« Reply #3175 on: January 26, 2022, 12:16:15 pm »
Quote
The blitter doesn't take pixel locations, it only takes memory addresses? ???

I've got a command for Blitter Copy Source Pointer and Blitter Copy Destination Pointer amongst others - neither take pixel locations, they're both memory pointers?  In fact, I only have memory pointers as a way of referencing the screen with the GPU at all???
The blitter base is a memory address, and since you are blitting from the display screen to the display screen, the two base addresses should match.

Remember, you have the:
Code: [Select]
SET_PAGET <src/dest> <base_address_hex> <width> <depth>    > Sets PAGET's (pixel address generator) source/dest   memory address, image width, color depth in bits/pixel = (1/2/4/8/16).

For the source and the destination, set the display's base address, width and colour depth.

Say, for example: address 200, width 720, 8 bits per pixel for both source and destination.

Now, set:
Code: [Select]
BLIT POS   <px> <py> <width> <height>  > Sets the source image top left pixel position (px,py) and the copy (width,height) in pixels.
Use 0,16,720,480 so you end up selecting a blit source beginning 16 lines down and running to line 496, 0-720 across.
You can also use 0,16,720,464 if you will box-fill the bottom text row in black to clear it, unless your screen is truly 720x496, where the extra 16 lines have already been cleared to black.

Next:
Code: [Select]
DRAW PIXEL <X>   <Y>   <c>                                          > Draws a dot at coordinates [XY], c=color(0..255)
Use 0,0,0.

This will take the screen beginning 16 lines down and paste it at 0,0; in other words, it moves the screen 16 lines up.

Don't forget:
Code: [Select]
BLIT CONFIG  <ena>  <mask> <h-centp> <mirror> <v-centp> <flip> <r90>  <r45>
Turn off the mask, <h-centp> and <v-centp>; turn off everything except <ena>.
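
Putting those four commands together, here is the whole scroll as a host-side C sketch.  gpu_send() is a hypothetical stand-in for however your host pushes one ASCII command line into the GPU's command FIFO, and the exact 'src'/'dest' selector tokens are assumptions - use whatever your command parser expects:
Code: [Select]
#include <stdio.h>

/* Hypothetical transport: in the real system this would push one ASCII
   command line into the GPU's command FIFO; here it just prints it.   */
static void gpu_send(const char *cmd) { puts(cmd); }

/* Scroll the 720x480, 8 bpp screen at (hex) base address 200 up by 16
   lines, exactly as described above.                                  */
void scroll_up_16(void)
{
    gpu_send("SET_PAGET src  200 720 8");    /* source = the screen      */
    gpu_send("SET_PAGET dest 200 720 8");    /* destination = the screen */
    gpu_send("BLIT CONFIG 1 0 0 0 0 0 0 0"); /* <ena> on, all else off   */
    gpu_send("BLIT POS 0 16 720 480");       /* copy source from (0,16)  */
    gpu_send("DRAW PIXEL 0 0 0");            /* paste at (0,0): screen
                                                moves up 16 lines        */
}

int main(void) { scroll_up_16(); return 0; }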
« Last Edit: January 26, 2022, 12:19:56 pm by BrianHG »
 

Online BrianHG

  • Super Contributor
  • ***
  • Posts: 7733
  • Country: ca
Re: FPGA VGA Controller for 8-bit computer
« Reply #3176 on: January 26, 2022, 12:23:44 pm »
Without my above instructions, how have you been blitting the text up until now?
It must be a killer to work out the beginning base address for each letter.
 

Offline nockieboy (Topic starter)

  • Super Contributor
  • ***
  • Posts: 1812
  • Country: england
Re: FPGA VGA Controller for 8-bit computer
« Reply #3177 on: January 26, 2022, 02:28:49 pm »
Quote
Without my above instructions, how have you been blitting the text up until now?
It must be a killer to work out the beginning base address for each letter.

Here's how I've been doing the scroll:
  • Address in memory is calculated for start of font row 2 and stored in ROW2_ADR when GPU is initialised
  • At start of scroll routine, set raster width & bpp for source and destination to width of window and correct bpp for mode
  • Set blitter copy width/height to window width and window height minus tile/font height
  • Set source pointer to ROW2_ADR
  • Set destination pointer to start of screen memory
  • Blit

 

Online BrianHG

  • Super Contributor
  • ***
  • Posts: 7733
  • Country: ca
Re: FPGA VGA Controller for 8-bit computer
« Reply #3178 on: January 26, 2022, 05:01:31 pm »
Well, I hope you now understand that this was a roundabout way to do it.
Otherwise, how did you expect to do horizontal scrolling?
Even the original base address did not allow odd address values, meaning you could not set a horizontal pixel offset of 1 or 3 pixels in 8-bit mode, let alone 1 or 2 or 3... pixels in 4, 2 or 1 bpp video modes.

I mean, if you wanted to scroll the screen to the left by 1 pixel in a 1 bpp screen mode, how would you do that when the base address for the first 8-16 pixels is identical?
« Last Edit: January 26, 2022, 05:41:07 pm by BrianHG »
 

Online BrianHG

  • Super Contributor
  • ***
  • Posts: 7733
  • Country: ca
Re: FPGA VGA Controller for 8-bit computer
« Reply #3179 on: January 28, 2022, 02:56:55 am »
Quote
Hi,

I still can't get scrolling to work again since the recent changes.  I am obviously missing something very obvious, but I gave up in frustration yesterday and thought a clear head would help this morning, but it hasn't. |O

Scrolling is currently overwriting the HW_REGs, no matter what address format I use to set the destination and source pointers (i.e. normal or >>8, I've tried both).  What am I doing wrong?

Here's a breakdown of the current scroll routine (values for 720x480x8 mode):

* Check FIFO isn't full before proceeding
* Clear X[0-3] and Y[0-3] registers
* Turn off blitter features with command 0001
* Set Y[3] to 0
* Set X[3] to 0x50
* Command 7B00 sets source address to Y[3]X[3] (0x000050), or address of start of screen space >>8 (it should be 0x5000 normally)
* Command 7C00 sets destination address to the same as above using Y[3]X[3] without setting/changing their values
* Y[3] set to 0
* X[3] set to 680 - width of Window[0] (this is screen width 720 minus 40 pixels for the borders)
* Command 7207 sets destination raster width to X[3], with low byte set to bpp for the screen mode
* Set Y[3] to tile height (height of font - 16 pixels)
* Set X[3] to 0
* Command 7700 sets source image offset to 0,16 (to skip top row of window)
* Set Y[3] to window height minus tile height (this will be 440-16) - MINUS 1 for 0-based register(?)
* Set X[3] to window width 680 - MINUS 1 for 0-based register(?)
* Command 7500 sets blitter copy width & height to above values to copy the window, minus one text row
* Command 0100 blits source to destination (the offset portion of the screen to the top at 0,0)
* Clear the bottom row (this isn't the issue at the moment so I won't detail this section)

What am I doing wrong?  Is this still overcomplicated?  :-//

Ok, I simulated a screen text CR/LF in the 'GPU_GEOMETRY_Testbench' and the saved test bitmaps seem to look fine.

I've attached the latest testbench as it now has the new 4GB addressable space source code.  However, I set modelsim to cap the screen memory to 16 megabytes, otherwise it will crash badly.

Enter modelsim and change to the testbench directory.
In the transcript, type:
Code: [Select]
do setup.do
do test_blitter.do
Wait 10 seconds for the sim to complete.

Read the 'GEO_tb_Blitter.txt' to see the commands I sent.
(CR/LF begins at line 145 which sets the blitter to match the destination output screen specs.)
(Lines 146,147,148 are optional if they have already been set accordingly.)
(Lines 150,151 perform the CR/LF.)
Read the 'GEO_tb_command_results.txt' to see what was actually being sent to the GEOFF.

Look at:
GEO_print_before_crlf_blit.bmp
GEO_print_after_crlf_blit.bmp
GEO_print_after_clear_boxfill.bmp

These show what's happening to the display after the commands are sent.

If you like, you can change all the coordinates and screen sizes in the 'GEO_tb_Blitter.txt' to match your screen and pixel sizes as well as your screen address, and see what happens after typing 'do test_blitter.do' once again in the transcript.  Note that higher resolutions mean a longer simulation time for the CR/LF, as the sim needs to simulate all the hardware copying all those pixels in simulated memory.
« Last Edit: January 28, 2022, 05:26:20 am by BrianHG »
 
The following users thanked this post: nockieboy

Online BrianHG

  • Super Contributor
  • ***
  • Posts: 7733
  • Country: ca
Re: FPGA VGA Controller for 8-bit computer
« Reply #3180 on: January 28, 2022, 05:08:43 am »
Here is a GEOFF patch for the GPU.  All the files in the .zip belong in the GPU's 'SV' folder.  The patch just makes the new 4GB DDR3 addressing match the source files in the above Modelsim simulation.  There should be no change in functionality.
 

Online BrianHG

  • Super Contributor
  • ***
  • Posts: 7733
  • Country: ca
Re: FPGA VGA Controller for 8-bit computer
« Reply #3181 on: January 28, 2022, 12:52:54 pm »
 :phew: Ok, here is the new 2 page block diagram 'BrianHG_GFX_VGA_Window_System.pdf' and 'BrianHG_GFX_VGA_Window_System.txt' documentation for developers.

Details here...
« Last Edit: January 28, 2022, 01:31:51 pm by BrianHG »
 

Offline nockieboy (Topic starter)

  • Super Contributor
  • ***
  • Posts: 1812
  • Country: england
Re: FPGA VGA Controller for 8-bit computer
« Reply #3182 on: January 28, 2022, 03:08:00 pm »
Quote
Ok, I simulated a screen text CR/LF in the 'GPU_GEOMETRY_Testbench' and the saved test bitmaps seem to look fine.

I've attached the latest testbench as it now has the new 4GB addressable space source code.  However, I set modelsim to cap the screen memory to 16 megabytes, otherwise it will crash badly.

Enter modelsim and change to the testbench directory.
In the transcript, type:
Code: [Select]
do setup.do
do test_blitter.do
Wait 10 seconds for the sim to complete.

Read the 'GEO_tb_Blitter.txt' to see the commands I sent.
(CR/LF begins at line 145 which sets the blitter to match the destination output screen specs.)
(Lines 146,147,148 are optional if they have already been set accordingly.)
(Lines 150,151 perform the CR/LF.)
Read the 'GEO_tb_command_results.txt' to see what was actually being sent to the GEOFF.

Look at:
GEO_print_before_crlf_blit.bmp
GEO_print_after_crlf_blit.bmp
GEO_print_after_clear_boxfill.bmp

These show what's happening to the display after the commands are sent.

If you like, you can change all the coordinates and screen sizes in the 'GEO_tb_Blitter.txt' to match your screen and pixel sizes as well as your screen address, and see what happens after typing 'do test_blitter.do' once again in the transcript.  Note that higher resolutions mean a longer simulation time for the CR/LF, as the sim needs to simulate all the hardware copying all those pixels in simulated memory.

Perfect.  Got it working now.  I seem to recall you mentioning something about the order in which certain registers are set for the blitter, though I might be making that up - my memory is terrible thanks to an accident involving a mountain bike, my head and some tree roots years ago.  There didn't seem to be a lot wrong with the commands I was sending other than the order in a couple of cases, but once I'd sorted that and got single-line scrolling working, I needed to look at mass output (typing HELP, for example, produces about 3 screens worth of text in Mode 0) and it does look as though I was flooding the GPU's command FIFO as well; adding an extra FIFO status check has cleared up some random graphical issues with repeated scrolling.
 

Online BrianHG

  • Super Contributor
  • ***
  • Posts: 7733
  • Country: ca
Re: FPGA VGA Controller for 8-bit computer
« Reply #3183 on: January 28, 2022, 07:02:19 pm »
Quote
Perfect.  Got it working now.
:-+
Quote
  I seem to recall you mentioning something about the order in which certain registers are set for the blitter, though I might be making that up - my memory is terrible thanks to an accident involving a mountain bike, my head and some tree roots years ago.  There didn't seem to be a lot wrong with the commands I was sending other than the order in a couple of cases, but once I'd sorted that and got single-line scrolling working, I needed to look at mass output (typing HELP, for example, produces about 3 screens worth of text in Mode 0)
  ??? Is mode 0 640x480, or 1920x1080?  If the latter, then my god, how big is your documentation?
Quote
and it does look as though I was flooding the GPU's command FIFO as well; adding an extra FIFO status check has cleared up some random graphical issues with repeated scrolling.
Yes, in super-hires, blitting a full screen CR/LF with the current old-fashioned pixel-writer driving the DDR3 command bus, it will take around 1/60th of a second, maybe 1/30th of a second for 1080p.

2 ways around this:

A. Use a tiled-font text mode and still use the blitter and box fill, but blit and fill (actually now only a line) screen data which is one 16/32-bit word of ASCII data for every 8x16 pixels on display.

Approximate calculation, comparing bytes moved per 8x16 character cell (128 bytes of pixels at 8 bpp) against one packed character word:
For 32-bit character mode: 128 bytes vs 4 bytes = 32:1, or 32 times faster.
(32-bit blits probably won't work, as we never expected to go that far with the 8-bit GPU, but you can cheat this one with 16-bit or 8-bit blits by setting the screen width to 2x or 4x the character width when driving the GEOFF.)
For 16-bit character mode: 128 bytes vs 2 bytes = 64:1, or 64 times faster.  (16-bit blits should work, I hope.)
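
Spelling that arithmetic out as a quick sanity check:
Code: [Select]
#include <stdio.h>

/* Bytes moved per 8x16 character cell: a full bitmap cell at 8 bpp
   versus one packed character word in a tiled text mode.            */
int main(void)
{
    const int cell_bytes   = 8 * 16;  /* 128 bytes of pixels per cell   */
    const int char32_bytes = 32 / 8;  /*   4 bytes per 32-bit character */
    const int char16_bytes = 16 / 8;  /*   2 bytes per 16-bit character */

    printf("32-bit text mode: %d times faster\n", cell_bytes / char32_bytes);
    printf("16-bit text mode: %d times faster\n", cell_bytes / char16_bytes);
    return 0;
}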

B. Make the new memory co-processing engine which can move ~750 megabytes a second, and clear/write/set around 1500 megabytes a second.  Probably at least 15x the current speed.
« Last Edit: January 28, 2022, 07:10:44 pm by BrianHG »
 

Offline nockieboy (Topic starter)

  • Super Contributor
  • ***
  • Posts: 1812
  • Country: england
Re: FPGA VGA Controller for 8-bit computer
« Reply #3184 on: January 28, 2022, 07:58:34 pm »
Quote
  I seem to recall you mentioning something about the order in which certain registers are set for the blitter, though I might be making that up - my memory is terrible thanks to an accident involving a mountain bike, my head and some tree roots years ago.  There didn't seem to be a lot wrong with the commands I was sending other than the order in a couple of cases, but once I'd sorted that and got single-line scrolling working, I needed to look at mass output (typing HELP, for example, produces about 3 screens worth of text in Mode 0)
  ??? Is mode 0 640x480, or 1920x1080?  If the latter, then my god, how big is your documentation?

Mode 0 = 720x480, Mode 1 = 1280x720, Mode 2 = 1920x1080.  All modes use a window with a 20-pixel border around its edges, so take 40 pixels off those dimensions for the actual display area.

It is three pages in Mode 0.  Less than 16KB as it fits in a ROM bank. ;)

Quote
and it does look as though I was flooding the GPU's command FIFO as well; adding an extra FIFO status check has cleared up some random graphical issues with repeated scrolling.
Yes, in super-hires, blitting a full screen CR/LF with the current old-fashioned pixel-writer driving the DDR3 command bus, it will take around 1/60th of a second, maybe 1/30th of a second for 1080p.

Scrolling in 1080p is noticeably slower.

Quote
2 ways around this:

A. Use a tiled-font text mode and still use the blitter and box fill, but blit and fill (actually now only a line) screen data which is one 16/32-bit word of ASCII data for every 8x16 pixels on display.

Approximate calculation, comparing bytes moved per 8x16 character cell (128 bytes of pixels at 8 bpp) against one packed character word:
For 32-bit character mode: 128 bytes vs 4 bytes = 32:1, or 32 times faster.
(32-bit blits probably won't work, as we never expected to go that far with the 8-bit GPU, but you can cheat this one with 16-bit or 8-bit blits by setting the screen width to 2x or 4x the character width when driving the GEOFF.)
For 16-bit character mode: 128 bytes vs 2 bytes = 64:1, or 64 times faster.  (16-bit blits should work, I hope.)

B. Make the new memory co-processing engine which can move ~750 megabytes a second, and clear/write/set around 1500 megabytes a second.  Probably at least 15x the current speed.

B is looking like a far more interesting route. :-+  I don't really want to go back down the tile font text mode route again as I can easily do text and graphics on the screen currently, but yes I know that text mode is fast.  I'm going to get around to writing a tile mode demo soon though, as it'll be useful for countless applications in gaming etc.

Still have a remaining bug to squash in mode-switching.  I'm still sounding this one out at the moment, but it appears to happen if the GEOFF draws a shape in a higher-resolution mode, outside the bounds of a lower-resolution mode, and I then switch to that lower-resolution mode.  It shouldn't make any difference to anything, but it causes screen corruption on switching to the lower mode, and causes the GPU and/or the whole system to lock up after a few characters are typed into the console.  Like I said, I'm still exploring this issue at the moment, so I can't clearly define the conditions that cause it (though what I've described above is pretty much it), and it's most likely an issue with my code, but at the moment I can't see an obvious cause for it.
 

Online BrianHG

  • Super Contributor
  • ***
  • Posts: 7733
  • Country: ca
Re: FPGA VGA Controller for 8-bit computer
« Reply #3185 on: January 28, 2022, 08:14:16 pm »
Quote
B is looking like a far more interesting route. :-+  I don't really want to go back down the tile font text mode route again as I can easily do text and graphics on the screen currently,
??? You cannot place 1 text window above or below a graphics window?

This way, when CR/LF scrolling, you may separately decide whether to scroll the graphics image in tandem as well.  Well, I guess you can still do this with 2 graphics layers, but the picture data cannot be separated from the text data if they are on the same layer.
 

Offline nockieboy (Topic starter)

  • Super Contributor
  • ***
  • Posts: 1812
  • Country: england
Re: FPGA VGA Controller for 8-bit computer
« Reply #3186 on: January 28, 2022, 11:08:57 pm »
Quote
B is looking like a far more interesting route. :-+  I don't really want to go back down the tile font text mode route again as I can easily do text and graphics on the screen currently,
??? You cannot place 1 text window above or below a graphics window?

Well I suppose I could - perhaps I'm a little too fixated on duplicating the display of my old 8-bit computer.  I'm also trying to shake the mindset that I'm short on video memory space.  When the opportunity to shave 10-20 bytes off of some critical code is a big thing, it's hard for me to switch to the mindset that I've got megabytes of DDR3 RAM sitting around doing nothing.

Quote
This way, when CR/LF scrolling, you may separately decide whether to scroll the graphics image in tandem as well.  Well, I guess you can still do this with 2 graphics layers, but the picture data cannot be separated from the text data if they are on the same layer.

Indeed, the possibilities are endless; but whilst a text layer and a graphics layer would be trivial for the GPU and DDR3 RAM, the driver for both would take up a significant portion of my system's limited available resources.  My DMI has to fit within a 16KB ROM bank - that includes the GPU driver as well - and I'm down to my last 1KB of space for testing and development, and that's with some features turned off to make room for the GPU driver.  I'll probably develop the tile system for CP/M, as that gives me a little more room to manoeuvre.

Hopefully not getting too far ahead regarding the co-processor, but I've started reading about CPU design in case it helps me understand what's going on later.
 

Online BrianHG

  • Super Contributor
  • ***
  • Posts: 7733
  • Country: ca
Re: FPGA VGA Controller for 8-bit computer
« Reply #3187 on: January 28, 2022, 11:21:33 pm »
Well, with the 'OPTION B' memory processor for manipulating pixels and such, each instruction will be either 64 or 128 bits.  So, we are going to eat through those tiny 16KB banks in a flash just to store the set of instructions / functions you wish to access.

Though, the multi-coordinate drawing functions like a quadrilateral will still exceed 128 bits, as you will have 8x16-bit coordinates, and then you still need a 32-bit color and the display address it belongs to.
 

Online BrianHG

  • Super Contributor
  • ***
  • Posts: 7733
  • Country: ca
Re: FPGA VGA Controller for 8-bit computer
« Reply #3188 on: January 29, 2022, 03:00:03 am »
Ok, I'm now working on a weird overflow bug in my DDR3 controller that shows up when you really load it beyond 95% bandwidth.  It will occasionally glitch out on the read channel, then recover a frame later.

It's another annoying once-in-a-blue-moon bug to track down.
 

Offline nockieboy (Topic starter)

  • Super Contributor
  • ***
  • Posts: 1812
  • Country: england
Re: FPGA VGA Controller for 8-bit computer
« Reply #3189 on: January 29, 2022, 08:04:11 am »
Quote
Well, with the 'OPTION B' memory processor for manipulating pixels and such, each instruction will be either 64 or 128 bits.  So, we are going to eat through those tiny 16KB banks in a flash just to store the set of instructions / functions you wish to access.

Though, the multi-coordinate drawing functions like a quadrilateral will still exceed 128 bits, as you will have 8x16-bit coordinates, and then you still need a 32-bit color and the display address it belongs to.

Naïve question: do the instructions have to be so many bits wide?

The 16KB bank issue relates solely to my DMI test/development environment on the uCOM, which I use because it's quick to develop for, get into and update with changes.  If I'm using CP/M, the program can (in theory) be any size, but without bank swapping or magical waving of hands, I have about 46KB to play with.  I could take over the entire system with a game and have the full 64KB to play with. With bank swapping and careful programming, I have megabytes to play with (albeit in 16KB chunks).

Quote
Ok, I'm now working on a weird overflow bug in my DDR3 controller that shows up when you really load it beyond 95% bandwidth.  It will occasionally glitch out on the read channel, then recover a frame later.

It's another annoying once-in-a-blue-moon bug to track down.

That's all I seem to have been doing recently.  I feel your pain.
 

Offline DiTBho

  • Super Contributor
  • ***
  • Posts: 3915
  • Country: gb
Re: FPGA VGA Controller for 8-bit computer
« Reply #3190 on: January 29, 2022, 10:53:16 am »
Isn't this project too *complex* to be a single-person hobby?
Honestly, I am lost here; too many things, too many pages for my little free time.

Guys, you have my moral support  :-+
The opposite of courage is not cowardice, it is conformity. Even a dead fish can go with the flow
 

Online BrianHG

  • Super Contributor
  • ***
  • Posts: 7733
  • Country: ca
Re: FPGA VGA Controller for 8-bit computer
« Reply #3191 on: January 29, 2022, 09:29:00 pm »
Quote
Well, with the 'OPTION B' memory processor for manipulating pixels and such, each instruction will be either 64 or 128 bits.  So, we are going to eat through those tiny 16KB banks in a flash just to store the set of instructions / functions you wish to access.

Though, the multi-coordinate drawing functions like a quadrilateral will still exceed 128 bits, as you will have 8x16-bit coordinates, and then you still need a 32-bit color and the display address it belongs to.

Naïve question: do the instructions have to be so many bits wide?

Huh?  Count the coordinates.  A quadrilateral has 4 pairs of XY coordinates, and each pair has 16 bits for X and 16 bits for Y.  That's 32 bits * 4 pairs = 128 bits per quad, not counting the 32-bit color and the filled/not-filled flag.  Now, we haven't yet decided how this will be stored in DDR3 RAM, as your current old design relies on the Z80 to copy this data to the GEOFF port and still uses the old-school 12-bit X&Y coordinate system, which cannot support windows larger than the display resolution.  Remember, you want to switch over from having the Z80 send everything to the GEOFF in 16-bit chunks to a system where a pre-compiled string of commands is stored in DDR3 as microcode, and the Z80 just tells the new microcode system to run the code at a DDR3 address, letting the GPU do all the graphics work on its own.  (This means a lot of home-made assembly programming to engineer all your graphics routines.)

The current DDR3 controller is optimized to read and write 128-bit chunks at a time, and it does substantially better as you enlarge each read or write to multiples of 128 bits.  If you ever upgrade the PCB to a 256-bit DDR3 system, then 256 bits becomes the optimum burst chunk size.
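
To make that bit-count concrete, here is one hypothetical packing of a filled-quad command in C - my own illustration, not the project's actual (undecided) format:
Code: [Select]
#include <stdint.h>

/* Hypothetical packing of a quad command - an illustration of the
   bit-count above, not an actual decided format.                   */
typedef struct {
    uint16_t x[4], y[4];    /* 4 XY pairs at 16 bits each = 128 bits   */
    uint32_t color;         /* 32-bit color                            */
    uint32_t dest_addr;     /* which display buffer in DDR3 to draw to */
    uint32_t flags;         /* opcode, filled/outline, spare bits      */
} quad_cmd_t;               /* 28 bytes = 224 bits: already spills
                               past one 128-bit DDR3 burst word       */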
 

Online BrianHG

  • Super Contributor
  • ***
  • Posts: 7733
  • Country: ca
Re: FPGA VGA Controller for 8-bit computer
« Reply #3192 on: January 29, 2022, 09:33:36 pm »
Quote
Isn't this project too *complex* to be a single-person hobby?
Honestly, I am lost here; too many things, too many pages for my little free time.

Guys, you have my moral support  :-+
I'm sure nockieboy has done his share.  Because of the DDR3 and the new window system, plus my earlier contributions, I have contributed over 25K lines of HDL code, not counting extras like the RS232 hex editor and debugger.  But, yes, properly doing such a project would have required a team of around 8-12 people beginning with a 25k-gate FPGA from day one, instead of the extra effort involved in squeezing so much into a 6k-gate FPGA at the beginning.
 

Offline nockieboy (Topic starter)

  • Super Contributor
  • ***
  • Posts: 1812
  • Country: england
Re: FPGA VGA Controller for 8-bit computer
« Reply #3193 on: January 29, 2022, 11:18:08 pm »
Quote
Well, with the 'OPTION B' memory processor for manipulating pixels and such, each instruction will be either 64 or 128 bits.  So, we are going to eat through those tiny 16KB banks in a flash just to store the set of instructions / functions you wish to access.

Though, the multi-coordinate drawing functions like a quadrilateral will still exceed 128 bits, as you will have 8x16-bit coordinates, and then you still need a 32-bit color and the display address it belongs to.

Naïve question: do the instructions have to be so many bits wide?

Huh?  Count the coordinates.  A quadrilateral has 4 pairs of XY coordinates, and each pair has 16 bits for X and 16 bits for Y.  That's 32 bits * 4 pairs = 128 bits per quad, not counting the 32-bit color and the filled/not-filled flag.  Now, we haven't yet decided how this will be stored in DDR3 RAM, as your current old design relies on the Z80 to copy this data to the GEOFF port and still uses the old-school 12-bit X&Y coordinate system, which cannot support windows larger than the display resolution.  Remember, you want to switch over from having the Z80 send everything to the GEOFF in 16-bit chunks to a system where a pre-compiled string of commands is stored in DDR3 as microcode, and the Z80 just tells the new microcode system to run the code at a DDR3 address, letting the GPU do all the graphics work on its own.  (This means a lot of home-made assembly programming to engineer all your graphics routines.)

The current DDR3 controller is optimized to read and write 128-bit chunks at a time, and it does substantially better as you enlarge each read or write to multiples of 128 bits.  If you ever upgrade the PCB to a 256-bit DDR3 system, then 256 bits becomes the optimum burst chunk size.

Ah, I misunderstood and thought you just meant the instruction itself, not the data that goes with it.  Remind me - we're aiming for a memory implementation, aren't we?  Where the data and some instructions are written to GPU memory and the GPU gets on with it?

Quote
Isn't this project too *complex* to be a single-person hobby?
Honestly, I am lost here; too many things, too many pages for my little free time.

Guys, you have my moral support  :-+
I'm sure nockieboy has done his share.  Because of the DDR3 and the new window system, plus my earlier contributions, I have contributed over 25K lines of HDL code, not counting extras like the RS232 hex editor and debugger.  But, yes, properly doing such a project would have required a team of around 8-12 people beginning with a 25k-gate FPGA from day one, instead of the extra effort involved in squeezing so much into a 6k-gate FPGA at the beginning.

This most definitely started out as a single-person hobby.  I designed and built the uCOM myself, then decided I'd try to wrap my head around building some sort of video output - hence this thread.  What happened thereafter is well documented in the thread's starting pages; basically, BrianHG has lent his considerable knowledge and skill with FPGAs and taught me a heck of a lot about them in the process.  I'd never really heard of FPGAs before starting this thread, and we started with a 6k-gate FPGA because it was cheap and in a package that I felt my soldering skills could stretch to (a QFP-144).  As BrianHG points out, going straight to a 25k-gate BGA package would have skipped a significant diversion at the start, but I feel starting simple and building from there makes it easier for people (who have the time and willpower) to follow.  ;)

The GPU project has pretty much outgrown the uCOM project that birthed it - you'd have a hard time arguing that the development and technology going into the GPU has any place being hosted by an 8-bit system - but at the same time the GPU can be used with ANY home-built system (be it 8-, 16- or more bits).

I am, now more than ever, along for the ride, as what BrianHG is doing is well beyond my capabilities and even understanding for the most part.  I provide some hardware corroboration that the GPU is working as intended with a host system and offer feedback on the design and features where appropriate, but this is a public project intended to help anyone who wants to do more than just print text to the screen.  And I'm really enjoying it, as I'm learning about things and doing stuff that I couldn't even have dreamed of when I was a kid.
 

Online BrianHG

  • Super Contributor
  • ***
  • Posts: 7733
  • Country: ca
Re: FPGA VGA Controller for 8-bit computer
« Reply #3194 on: January 29, 2022, 11:31:24 pm »
Quote
Ah, I misunderstood and thought you just meant the instruction itself, not the data that goes with it.  Remind me - we're aiming for a memory implementation, aren't we?  Where the data and some instructions are written to GPU memory and the GPU gets on with it?

The data and instructions will be the same thing, i.e. a single bitstream of data can draw objects as well as perform other GPU instructions besides the GEOFF functions, and so generate a drawing.  The complexity of the support structure managing all of this will be up to you to develop.  If done right, drawing most geometric objects should be akin to copying a file to DDR3 and setting a pointer to execute that drawing - one of many - with a master routine table calling each drawing in a list.  Within each drawing, I would recommend setting up some system-default register pointers which the drawing can call, like the screen base address and resolution, window drawing limits, and scale/rotate coordinates.  (Scale and rotate, including floating-point support, assume you want to add such modules to the GEOFF, but we are a long way off from that.)
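
Purely as an illustration of that 'file plus pointer' idea - nothing here is a decided format - a DDR3-resident drawing list could be as simple as this C sketch:
Code: [Select]
#include <stdint.h>

/* Purely illustrative - no format has been decided.  A 'drawing' is a
   stream of 128-bit command words sitting in DDR3, terminated by a
   RETURN opcode; a master table holds the start address of each one.  */
typedef struct { uint32_t w[4]; } cmd128_t;   /* one 128-bit command word */

typedef struct {
    uint32_t drawing_addr[256];   /* DDR3 start address of each drawing  */
} master_table_t;

/* On the host side, "run drawing #n" then becomes a single small write
   telling the microcode engine to fetch and execute at drawing_addr[n]. */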

Ok, let's see something interesting with what you've got now.
« Last Edit: January 29, 2022, 11:33:38 pm by BrianHG »
 

Offline nockieboy (Topic starter)

  • Super Contributor
  • ***
  • Posts: 1812
  • Country: england
Re: FPGA VGA Controller for 8-bit computer
« Reply #3195 on: February 01, 2022, 04:33:37 pm »
Here's a little something.  It's really simple and doesn't use tiling, but it does show parallax scrolling over 4 layers (5 if you include the normal text layer in the background):



Took some messing around with the PDI/SDI swap settings - I think a little more info on how they work would be handy.  If I get more time in the future, I'll make a start on something a little more technically impressive in tile mode.

The CP/M program is less than 130 lines of assembly and took me about ten minutes.  Converting and setting the images up in the GPU RAM and tweaking the PDI/SDI layers took a couple of hours.  :-[
« Last Edit: February 01, 2022, 04:35:51 pm by nockieboy »
 
The following users thanked this post: voltsandjolts, BrianHG, Ted/KC9LKE

Online BrianHG

  • Super Contributor
  • ***
  • Posts: 7733
  • Country: ca
Re: FPGA VGA Controller for 8-bit computer
« Reply #3196 on: February 01, 2022, 08:46:46 pm »
Quote
Here's a little something.  It's really simple and doesn't use tiling, but it does show parallax scrolling over 4 layers (5 if you include the normal text layer in the background):
Nice.  :-+
Quote
Took some messing around with the PDI/SDI swap settings - I think a little more info on how they work would be handy.  If I get more time in the future, I'll make a start on something a little more technically impressive in tile mode.
I know it may be a little confusing.  There is a technical limitation due to the way the SDI video layers come out sequentially compared to the PDI layers, and I tried to code it as simply as possible.  I recommend opening up all the layers, drawing a screen-sized big fat white 0 through 15 with a black outline on each layer, and playing with the settings to see what happens.

As you can see on page 2 of my .pdf documents, the bitwise 'XOR' means that for each PDI sequence you can swap layers 0/1 and 2/3, or swap 0&1 with 2&3, or reverse all 4 layers.  The same goes for 4/5/6/7, 8/9/10/11 and 12/13/14/15.  Then, once those layers have been optionally swapped around, you have the vertical equivalent swap.  If I were to do it any other way, like assigning each layer to an arbitrary target layer, you would need a full byte per layer, meaning 64 control bytes to address the 64 SDI layer swaps plus another 64 bytes for the 64 PDI layer swaps, with a ton of routing inside the FPGA.
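
For what it's worth, here is my reading of that XOR remap as a little C model - an illustration of the idea, not the actual RTL.  Mask 1 swaps 0/1 and 2/3, mask 2 swaps 0&1 with 2&3, and mask 3 reverses each group of 4:
Code: [Select]
#include <stdio.h>

/* Toy model of the XOR-style layer remap: a small mask is XORed onto
   each layer index.  Bit 0 swaps pairs 0/1 and 2/3, bit 1 swaps 0&1
   with 2&3, and both bits together reverse each group of 4.          */
int main(void)
{
    const unsigned swap_mask = 0x3;              /* reverse each group of 4 */
    for (unsigned layer = 0; layer < 16; layer++)
        printf("layer %2u -> %2u\n", layer, layer ^ swap_mask);
    return 0;
}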

This means that if you are making a game where the character can move vertically in front of and behind a few parallax scrolling layers, you will just have to strategically space the parallax layer numbers so that the player lands on the second or third position of the vertical PDI swap variable, allowing you to move your player vertically among the parallax background layers without having to change those layer-swap variables.
Quote
The CP/M program is less than 130 lines of assembly and took me about ten minutes.  Converting and setting the images up in the GPU RAM and tweaking the PDI/SDI layers took a couple of hours.  :-[
Well, scrolling 4 windows means nothing more than incrementing 4 'CMD_win_bitmap_x_pos' 16-bit integers at a different pace for each window.  This can even be done in BASIC with around 10 lines of code.  As for the graphics, yes, converting and dealing with the palettes and making use of the transparency is a massive chore, especially if you haven't engineered any tools for the job.

Well, if you made the graphics in Photoshop using translucency and saved those images as uncompressed 32-bit .tiff, like that Mandalorian 512x512x32bit image you uploaded a while back, all you would have to do is strip out the .tiff header, open your 4 windows as 32-bit layers in the GPU with the correct bitmap width setting, and copy the raw data to the correct beginning address.  No palettes, just perfect 32-bit true color with transparency data.

Looking at the graphic data having dithering patterns in the background, I'm assuming you took these images from somewhere else?  Otherwise, with a fresh 256-color palette per window layer, or even 16.7M colors per layer, dithering should not have been needed.
« Last Edit: February 01, 2022, 09:09:42 pm by BrianHG »
 

Offline nockieboy (Topic starter)

  • Super Contributor
  • ***
  • Posts: 1812
  • Country: england
Re: FPGA VGA Controller for 8-bit computer
« Reply #3197 on: February 01, 2022, 09:29:29 pm »
Quote
As for the graphics, yes, converting and dealing with the palettes and making use of the transparency is a massive chore, especially if you haven't engineered any tools for the job.

Thanks - another idea for a tool I can make; something that will open a binary file and allow me to change values throughout the file, so I can reassign pixels to custom palette entries. :-+ :-/O

EDIT: Or I could just not re-invent the wheel and use Find/Replace in my hex editor of choice. ::)

On the subject of palettes, how can I make a particular layer use a particular palette?  With five layers, each referenced its own palette when I would have preferred to point them all at the same palette.

Quote
Looking at the graphic data having dithering patterns in the background, I'm assuming you took these images from somewhere else?  Otherwise, with a fresh 256-color palette per window layer, or even 16.7M colors per layer, dithering should not have been needed.

Yes, for familiarity and ease I just downloaded some Creative Commons assets from here: https://ansimuz.itch.io/mountain-dusk-parallax-background

I did think about using some 32-bit images with transparency, but I wanted to keep the sizes small and use something more era-appropriate for a test run. ;)
« Last Edit: February 01, 2022, 09:51:00 pm by nockieboy »
 

Offline nockieboy (Topic starter)

  • Super Contributor
  • ***
  • Posts: 1812
  • Country: england
Re: FPGA VGA Controller for 8-bit computer
« Reply #3198 on: February 01, 2022, 09:47:45 pm »
The only way I can get the graphics into the GPU currently is via the RS232_debugger.  I couldn't access the actual graphics in the GPU's DDR3 RAM from the uCOM if I wanted to - the first layer starts at the very end of the GPU RAM that the uCOM can access within its 512KB window.

As a minor side project, I'm going to start on an MMU for the uCOM's memory interface to the GPU's DDR3.  I'm just starting to think about it and could do it one of two ways that I can see at the moment:
  • I could page 512KB banks of DDR3 into the uCOM's memory window. Maybe the easiest way to do this would be to control which 512KB bank of DDR3 is visible to the uCOM via another IO port, which would give the uCOM access to 256x512KB banks - enough to access 128MB of DDR3 if I use only one additional IO port.
  • The other way would be to keep the lowest 16KB of DDR3 (the HW_REGS and palettes) always visible to the uCOM, and instead of IO, use a memory location/s in that first 16KB to control which 496KB chunk is available in the rest of the 512KB window.  That sounds complicated, but would give access to unlimited memory.
Option 1 seems the most sensible way to do it.  It would involve some minor additional code in Bridgette (the Z80 interface) to add some extra address lines to its DDR3 pathway and control the upper bits with an additional IO-port value.  Is there an easier/better way to do this?  Can you see any issues that I am unaware of or haven't thought of yet?
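
In address terms, option 1 boils down to a one-line mapping.  Here's a quick C sketch of my assumption of what that extra Bridgette logic would do (bank_reg being the hypothetical extra IO-port register, not actual code):
Code: [Select]
#include <stdint.h>

/* Sketch of option 1's address mapping - an assumption, not Bridgette's
   actual code.  An 8-bit bank register (written through one spare IO
   port) supplies the top address bits; the Z80's offset into its 512KB
   window supplies the rest.  512KB = 2^19, so the bank shifts up 19.   */
static uint8_t bank_reg;                  /* set via the extra IO port  */

uint32_t ddr3_addr(uint32_t z80_offset)   /* z80_offset: 0..0x7FFFF     */
{
    return ((uint32_t)bank_reg << 19) | (z80_offset & 0x7FFFF);
}
/* 256 banks x 512KB = 128MB of DDR3 reachable from one extra IO port. */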
 

Online BrianHG

  • Super Contributor
  • ***
  • Posts: 7733
  • Country: ca
Re: FPGA VGA Controller for 8-bit computer
« Reply #3199 on: February 01, 2022, 09:52:37 pm »
Quote
On the subject of palettes, how can I make a particular layer use a particular palette?  With five layers, each referenced its own palette when I would have preferred to point them all at the same palette.
I'm sorry, but it is hard-wired.
If I were to share palettes, every PDI layer channel would still have its own palette.
You will have to wait for the memory-copy function, store 1 master palette in DDR3, and send the command to copy that reference palette into all the destination palettes once per V-sync or so.
Actually, you can already do this using the blitter.
Store your palette anywhere in DDR3 so long as its base address begins on a 128-byte boundary.

When copying, you are basically blitting a single 1024x1-pixel, 8-bit bitmap to the destination base address: paste it at coordinates 0x0 to fill layer 0's palette, paste it again at 0x1 to fill the palette for layer 1, and again at 0x2 for layer 2.  To fill the palettes of all layers, just draw a vertical blit line from 0x0 to 0x15.  You can also blit from palette to palette just by defining the blit source image at the address of palette 0 (8-bit, 1024 pixels wide) and drawing a line from 0x1 to 0x15.  And you can blit a portion of a palette by defining the X coordinates in your copy.

You can also stack the palette vertically by making the blit source 4 8-bit pixels wide and using one Y coordinate per color index #.
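
If it helps, here's that recipe written out as a host-side C sketch.  gpu_send() is the same hypothetical FIFO transport as in the earlier scroll sketch, and the two hex addresses are placeholders (your master palette on a 128-byte boundary, and the layer-palette base in GPU RAM):
Code: [Select]
#include <stdio.h>

/* Same hypothetical gpu_send() FIFO transport as the scroll sketch. */
static void gpu_send(const char *cmd) { puts(cmd); }

/* Copy a 1024-byte master palette (placeholder hex address 100000,
   on a 128-byte boundary) into all 16 layer palettes, treating the
   palette block (placeholder hex address 1000) as a 1024-wide 8-bit
   bitmap with one layer's palette per Y line.                       */
void broadcast_palette(void)
{
    char cmd[40];
    gpu_send("SET_PAGET src  100000 1024 8"); /* master palette strip   */
    gpu_send("SET_PAGET dest 1000   1024 8"); /* stacked layer palettes */
    gpu_send("BLIT CONFIG 1 0 0 0 0 0 0 0");  /* blitter on, rest off   */
    gpu_send("BLIT POS 0 0 1024 1");          /* source: 1024x1 strip   */
    for (int y = 0; y < 16; y++) {            /* paste once per layer   */
        snprintf(cmd, sizeof cmd, "DRAW PIXEL 0 %d 0", y);
        gpu_send(cmd);
    }
}

int main(void) { broadcast_palette(); return 0; }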
« Last Edit: February 01, 2022, 09:58:42 pm by BrianHG »
 

