Author Topic: FPGA VGA Controller for 8-bit computer  (Read 416294 times)


Online BrianHG

  • Super Contributor
  • ***
  • Posts: 7660
  • Country: ca
Re: FPGA VGA Controller for 8-bit computer
« Reply #1200 on: July 03, 2020, 09:59:16 am »
Fixing the 'Data_mux' module bug.
If you remember, in the Z80_bridge we had to make the read and write requests going into the data_mux 2 clock cycles wide before we could solve the Z80 R&W errors.
See here: https://www.eevblog.com/forum/fpga/fpga-vga-controller-for-8-bit-computer/msg2859700/#msg2859700

Anyways, the problem had to be solved 'properly' for the geometry pixel writer.
Step 1, simulate the original 'data_mux' with single clock cycle pulsed read and write requests from both ports A&B.
(Attachment 1015544-0)
The green boxes and lines denote a read or write request, and the successful results.
The red boxes and lines denote a read or write request, but failure to execute a proper response.

Next, the re-write.  The new 'data_mux_v2' with the same simulation stimulus:
(Attachment 1015548-1)
As you can see, all the same read & write requests now properly execute the desired response.  In fact, as an improvement, right at 230ns I placed 2 consecutive read requests simultaneously on both port inputs, for a total of 4 consecutive reads.  The new data_mux_v2 caches the 4 reads, interleaves the read addresses to the GPU ram and returns all 4 addressed data results to the correct port.
(You can sort of see why we had problems earlier.  And also how adding another port for the geometry pixel writer would have been a nightmare if I didn't patch this one up properly...)
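Roughly, the routing idea is this (a very stripped-down sketch, not the actual data_mux_v2 code - the real module also queues simultaneous/back-to-back requests in the fifo_3word_0_latency, and the exact pipeline alignment against READ_CLOCK_CYCLES is glossed over here; all names below are made up):
Code: [Select]
// Sketch only: every read issued to the single GPU RAM port drags a
// 'which port asked' tag down a delay line as deep as the RAM's read
// latency, so the returned data can be steered back to the right requester.
module read_tag_sketch #(parameter int READ_CLOCK_CYCLES = 2)(
    input  logic       clk,
    input  logic       issue_read,   // a read address was sent to the GPU RAM this clock
    input  logic       issue_port,   // 0 = port A requested it, 1 = port B
    input  logic [7:0] ram_data,     // data returned READ_CLOCK_CYCLES later
    output logic [7:0] data_a,
    output logic       rdy_a,
    output logic [7:0] data_b,
    output logic       rdy_b
);
    // one tag per clock of RAM read latency
    logic [READ_CLOCK_CYCLES-1:0] tag_valid = '0;
    logic [READ_CLOCK_CYCLES-1:0] tag_port  = '0;

    always_ff @(posedge clk) begin
        // age the tags in step with the RAM's read pipeline
        tag_valid <= {tag_valid[READ_CLOCK_CYCLES-2:0], issue_read};
        tag_port  <= {tag_port [READ_CLOCK_CYCLES-2:0], issue_port};

        // the oldest tag says which port owns the data arriving this clock
        rdy_a <= tag_valid[READ_CLOCK_CYCLES-1] && !tag_port[READ_CLOCK_CYCLES-1];
        rdy_b <= tag_valid[READ_CLOCK_CYCLES-1] &&  tag_port[READ_CLOCK_CYCLES-1];
        if (tag_valid[READ_CLOCK_CYCLES-1]) begin
            if (!tag_port[READ_CLOCK_CYCLES-1]) data_a <= ram_data;
            else                                data_b <= ram_data;
        end
    end
endmodule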

Attached are the Quartus Simulation test benches for both the original and the new data_mux 2:1 ram port emulator.

@nockieboy, step 1: change the data_mux to the data_mux_v2 in your GPU project, using my working simulation source code.  You will also need to add my 'fifo_3word_0_latency.sv' source code, located in the simulation project, to your GPU project as well.

Step 2, test the Z80 and RS232 interface.

If working, then go to step #3: change the Z80_bridge back to its original intended mode of operation.

Step 3, fix these lines in 'Z80_bridge.sv' so that the gpu_rd_req & gpu_wr_ena pulses are now 1 clock cycle wide, like so:
Code: [Select]
   // GPU RAM FLAGS
   gpu_wr_ena     <= Write_GPU_RAM && ~last_Z80_WR; // (data_mux bug fixed, pulse for only 1 clock, not 2) - Pulse the GPU ram's write enable only once after the Z80 write cycle has completed
   gpu_rd_req     <= Read_GPU_RAM  && ~last_Z80_RD; // (data_mux bug fixed, pulse for only 1 clock, not 2) - Pulse the read request only once at the beginning of a read cycle.

And test the Z80 access + RS232 Debugger access.  Report results please.
« Last Edit: July 03, 2020, 10:07:58 am by BrianHG »
 
The following users thanked this post: nockieboy

Offline nockieboyTopic starter

  • Super Contributor
  • ***
  • Posts: 1812
  • Country: england
Re: FPGA VGA Controller for 8-bit computer
« Reply #1201 on: July 03, 2020, 11:07:13 am »
Okay, I've done some testing with the new mux_v2 module.  Mostly everything seems to work fine - however, my PS/2 keyboard no longer works properly.  After about the second keypress, the screen is filled with :) faces and then blanks.

Initially I thought the keyboard might be damaged, as I managed to drop it on the floor earlier ::), but a test using the old mux module showed it was working okay.  I'm still getting the odd duplication of some key presses, which I've been meaning to look at, but on the whole it's usable.

This indicates there's still an issue with the IO read function in the z80_bridge, which is probably being affected by the changes in the mux_v2 module, making the keyboard unusable.  I think I'm going to need help debugging that.  However, whilst testing with the serial console providing input to the system, I didn't experience any errors accessing or modifying GPU RAM.

EDIT:  I'm going to try again, but with the Stage 3 changes to the z80_bridge as well.
« Last Edit: July 03, 2020, 11:13:03 am by nockieboy »
 

Offline nockieboyTopic starter

  • Super Contributor
  • ***
  • Posts: 1812
  • Country: england
Re: FPGA VGA Controller for 8-bit computer
« Reply #1202 on: July 03, 2020, 11:22:44 am »
No, doesn't seem to be working at all with the stage 3 modification.  The screen is badly corrupted, randomly each time I reset the system, so it looks like writes to the GPU RAM are being corrupted and it can't set up the display properly.  That's before I can even get to using the keyboard.  :-\

EDIT:  The RS232_debugger is reading and writing the GPU RAM fine, however.
« Last Edit: July 03, 2020, 12:35:41 pm by nockieboy »
 

Online BrianHG

  • Super Contributor
  • ***
  • Posts: 7660
  • Country: ca
Re: FPGA VGA Controller for 8-bit computer
« Reply #1203 on: July 03, 2020, 02:26:13 pm »
Please try setting just the read to 1 pulse and the write enable back to 2 pulses, to make sure the new 1-shot reads are OK.

If the single-clock write presents the bug, upload a copy of the latest version for me to take a look at.  This time around, I think we may be latching the Z80 data bus itself too early and I want to try something.

Also, please specify how the Z80 software keyboard is accessed.  I would like to know how it relates to 'writing' to GPU ram, and then how it relates to reading the GPU ram.  IE - when does the keyboard access use the 'gpu_wr_ena' and  'gpu_rd_req' flags.
 

Offline nockieboyTopic starter

  • Super Contributor
  • ***
  • Posts: 1812
  • Country: england
Re: FPGA VGA Controller for 8-bit computer
« Reply #1204 on: July 03, 2020, 03:03:37 pm »
Please try setting just the read to 1 pulse and the write enable back to 2 pulses, to make sure the new 1-shot reads are OK.

Yes, that's working.

If the single-clock write presents the bug, upload a copy of the latest version for me to take a look at.  This time around, I think we may be latching the Z80 data bus itself too early and I want to try something.

No problem - latest z80_bridge.sv attached.

Also, please specify how the Z80 software keyboard is accessed.  I would like to know how it relates to 'writing' to GPU ram, and then how it relates to reading the GPU ram.  IE - when does the keyboard access use the 'gpu_wr_ena' and  'gpu_rd_req' flags.

Hmm.. well that's the thing, the keyboard doesn't touch the GPU RAM (at least not in the z80_bridge). Here's the process:

1) A key is pressed
2) ps2_keyboard_to_ascii.vhd converts the keycode to an ASCII code, after filtering 'break' codes, shifts, caps, etc.
3) PS2_RDY goes HIGH to signal valid char data in z80_bridge.
4) The char is latched into the PS2_CHAR register (line 289 in z80_bridge)

That's it for z80_bridge.  Currently I'm not using interrupts, so the char sits in PS2_CHAR until it's overwritten with another character or the Z80 initiates an IO read cycle addressing the PS/2 IO port:

1) IO_DATA_RL goes HIGH on detection of an IO cycle addressing the PS2 port (240) (lines 312-317)
2) After a short delay, IO_DATA_RL goes HIGH, causing the bidirectional data bus to switch to output (TO Z80) and placing the char onto the data bus (lines 320-327)
3) At the end of the IO cycle, PS2_CHAR is cleared to zero (important as the Z80 software takes a zero value as no char, otherwise it would repeatedly think the same key has been pressed)
4) The bidirectional data bus switches back to TO FPGA and goes hi-Z

So all I'm using to store the PS/2 character code is an 8-bit register.  Once the Z80 has received a valid character it does whatever it needs to with it and may send it back to the GPU RAM as a character to display (echoing the key press on the screen).
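In code terms it boils down to something like this (a simplified reconstruction of the flow described above, not the actual Z80_bridge.sv - PS2_RDY and PS2_CHAR are the real names from the posts, the IO-cycle decode signals here are just placeholders):

Code: [Select]
// Simplified reconstruction -- not the real Z80_bridge.sv.
module ps2_io_sketch (
    input  logic       gpu_clk,
    input  logic       PS2_RDY,              // 1-clock strobe from ps2_keyboard_to_ascii
    input  logic [7:0] PS2_ASCII,            // converted character from that module
    input  logic       io_rd_port240,        // placeholder: Z80 IO read cycle, port 240
    input  logic       io_rd_done,           // placeholder: end of that IO cycle
    output logic [7:0] z80_data_out,         // value driven back to the Z80
    output logic       z80_data_oe           // 1 = 245 buffer turned around TO Z80
);
    logic [7:0] PS2_CHAR = 8'h00;            // 0x00 means 'no character' to the Z80 software

    always_ff @(posedge gpu_clk) begin
        if (PS2_RDY) PS2_CHAR <= PS2_ASCII;  // latch the char when one arrives

        if (io_rd_port240) begin
            z80_data_out <= PS2_CHAR;        // put the char on the bus for the Z80 to read
            z80_data_oe  <= 1'b1;
        end else begin
            z80_data_oe  <= 1'b0;            // bus back to hi-Z / TO FPGA
            if (io_rd_done) PS2_CHAR <= 8'h00;  // clear so the same key isn't reported twice
        end
    end
endmodule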

 

Online BrianHG

  • Super Contributor
  • ***
  • Posts: 7660
  • Country: ca
Re: FPGA VGA Controller for 8-bit computer
« Reply #1205 on: July 03, 2020, 03:15:47 pm »
Not enough - I need the full project you are working with...
 

Online BrianHG

  • Super Contributor
  • ***
  • Posts: 7660
  • Country: ca
Re: FPGA VGA Controller for 8-bit computer
« Reply #1206 on: July 03, 2020, 03:26:15 pm »
Give this line a try:
Code: [Select]
// This will make sure the Write_GPU_RAM is high for 3 consecutive clocks.  The output 'gpu_wr_ena' will still only pulse for 1 clock cycle.
// Think of this like a noise removal for the bus write command and address with extra time for the Z80 data bus to stabilize before accepting and writing the data to the GPU
gpu_wr_ena     <= (Write_GPU_RAM && last_Z80_WR && last_Z80_WR2) && ~last_Z80_WR3; // (data_mux bug fixed, pulse for only 1 clock, not 2) - Pulse the GPU ram's write enable only once after the Z80 write cycle has completed
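For reference, last_Z80_WR / last_Z80_WR2 / last_Z80_WR3 here would be progressively delayed samples of the decoded write request in the GPU clock domain - assuming something like (not shown in the post, and the clock name may differ in the actual project):
Code: [Select]
// Assumed generation of the delayed samples used above.
always_ff @(posedge gpu_clk) begin
    last_Z80_WR  <= Write_GPU_RAM;   // 1 GPU clock old
    last_Z80_WR2 <= last_Z80_WR;     // 2 clocks old
    last_Z80_WR3 <= last_Z80_WR2;    // 3 clocks old
end
With that chain, the expression above only goes true on the single clock where the write request has been seen high for three consecutive samples while the fourth (oldest) sample is still low - so you get the settling/deglitch time, yet gpu_wr_ena still pulses for just one clock.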
« Last Edit: July 03, 2020, 03:55:59 pm by BrianHG »
 

Offline nockieboyTopic starter

  • Super Contributor
  • ***
  • Posts: 1812
  • Country: england
Re: FPGA VGA Controller for 8-bit computer
« Reply #1207 on: July 03, 2020, 03:44:36 pm »
Hmm.. well, aside from the background colour being dark yellow instead of black and the text getting corrupted (but only when the screen scrolls up), it's a bit of an improvement.

Using the memory editor, I can see that there are random read errors appearing in the data.  I'm viewing the 2nd page of GPU RAM, which is empty (all zeros), and every time I read that page I'm getting between 0 and 3 errors - they're coming up as 0x7Es, not that it means much, I guess.

Perhaps we need to delay the Write_GPU_RAM for a little longer?

EDIT:  If it helps, I found that putting data onto the Z80's data bus too early in the IO cycle meant I got garbage when trying to read a value from the GPU via IO.  The memory cycle timing is tighter, but perhaps it's the same issue?
« Last Edit: July 03, 2020, 03:50:46 pm by nockieboy »
 

Online BrianHG

  • Super Contributor
  • ***
  • Posts: 7660
  • Country: ca
Re: FPGA VGA Controller for 8-bit computer
« Reply #1208 on: July 03, 2020, 04:04:31 pm »
Now, keeping the 3 clock verified write request, test 2 new things.

A) On the data_mux, change the 'READ_CLOCK_CYCLES' to 3.

Then:
B) gpu_rd_req     <= (Read_GPU_RAM  &&  last_Z80_RD  &&  last_Z80_RD2) && ~last_Z80_RD3 ;

Then:
C) On the data_mux, change the 'READ_CLOCK_CYCLES' back to 2.
 

Offline nockieboyTopic starter

  • Super Contributor
  • ***
  • Posts: 1812
  • Country: england
Re: FPGA VGA Controller for 8-bit computer
« Reply #1209 on: July 03, 2020, 04:56:57 pm »
Now, keeping the 3 clock verified write request, test 2 new things.

A) On the data_mux, change the 'READ_CLOCK_CYCLES' to 3.

Then:
B) gpu_rd_req     <= (Read_GPU_RAM  &&  last_Z80_RD  &&  last_Z80_RD2) && ~last_Z80_RD3 ;

Then:
C) On the data_mux, change the 'READ_CLOCK_CYCLES' back to 2.

Okay, I did A & B together and tested, works fine.  Made the change in C, tested - works fine.
 

Online BrianHG

  • Super Contributor
  • ***
  • Posts: 7660
  • Country: ca
Re: FPGA VGA Controller for 8-bit computer
« Reply #1210 on: July 03, 2020, 05:05:42 pm »
Ok, so far, so good.  Stay with 'C'.  I found some errors when simulating the data_mux_v2 in 'Timing' mode instead of 'Functional' mode.  I will post an update of the newer 'data_mux_v2' later tonight.

At this point, we know that single-cycle write & read req pulses work.
We also seem to need to really delay/slow down the Z80 data by 20-30ns to get proper results.

Q: Does your keyboard work fine?
 

Online BrianHG

  • Super Contributor
  • ***
  • Posts: 7660
  • Country: ca
Re: FPGA VGA Controller for 8-bit computer
« Reply #1211 on: July 03, 2020, 05:23:38 pm »
Here is update #1 for 'data_mux_v2.sv'.
You will need to regenerate/update the symbol in Quartus for the new 'ZERO_LATENCY' parameter to appear.
For now, keep it at 0.  The setting of 1 has some issues when simulating in 'Timing Mode'.  The 0 latency bug is in the new 'FIFO_3word_0_latency.sv' which I will update soon.

 
The following users thanked this post: nockieboy

Online BrianHG

  • Super Contributor
  • ***
  • Posts: 7660
  • Country: ca
Re: FPGA VGA Controller for 8-bit computer
« Reply #1212 on: July 03, 2020, 08:36:38 pm »
Ok, all fixed up.  Both functional simulation and timing simulation give matching correct results with all configurations.

You will need to create/update the symbol so that the new parameter 'REGISTER_GPU_PORT' shows up.
When on, this parameter improves the FMAX, but adds a 1 clock delay to the read_req's.
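In other words, REGISTER_GPU_PORT just inserts one extra flip-flop stage between the mux logic and the GPU ram port - a shorter combinational path, so a better Fmax, at the cost of one extra clock of read latency.  A generic illustration only (not the actual data_mux_v2 code, names made up):
Code: [Select]
module reg_port_sketch #(parameter bit REGISTER_GPU_PORT = 1)(
    input  logic        clk,
    input  logic [19:0] addr_in,      // address from the mux logic
    input  logic        rd_ena_in,
    output logic [19:0] addr_out,     // address presented to the GPU RAM
    output logic        rd_ena_out
);
    generate
        if (REGISTER_GPU_PORT) begin : g_reg    // extra FF stage: better Fmax, +1 clock latency
            always_ff @(posedge clk) begin
                addr_out   <= addr_in;
                rd_ena_out <= rd_ena_in;
            end
        end else begin : g_comb                 // straight through: no added latency
            always_comb begin
                addr_out   = addr_in;
                rd_ena_out = rd_ena_in;
            end
        end
    endgenerate
endmodule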

If you get read errors, increase the READ_CLOCK_CYCLES to 3.

You can test the code with REGISTER_GPU_PORT = 1 or 0, verifying that the FMAX of C0 > 125MHz.  With REGISTER_GPU_PORT = 0, if your FMAX is above 125 now, I bet it won't be once we add the geometry unit.

New simulation results in 'timing mode'.  Though messier, you can see the reads match the green functional simulation above.  Before, the above code gave crap results when simulating in 'timing mode'.  This is a sign that your design may exhibit meta-stability issues or may function erratically from 1 FPGA to the next.

(Attachment 1016044-0)

All files attached below.
Don't forget to copy over the 'fifo_3word_0_latency.sv' as well.
Damn that was a lot of work........
« Last Edit: July 03, 2020, 08:54:01 pm by BrianHG »
 
The following users thanked this post: nockieboy

Offline nockieboyTopic starter

  • Super Contributor
  • ***
  • Posts: 1812
  • Country: england
Re: FPGA VGA Controller for 8-bit computer
« Reply #1213 on: July 03, 2020, 11:58:41 pm »
Just thought I'd check this quickly before I go offline for the night.  Works perfectly.  :-+

I've still got duplicate keys appearing when I type, but they're about as frequent/possibly slightly less frequent than before.  More an annoyance than a critical problem.

No read errors at all when I view the GPU RAM memory pages.  No artefacts or errors in the display setup, so it looks like RD/WR are working nicely now. :-+

I'm not sure I know exactly what I'm doing checking Fmax, but I've looked at the Timing Analyzer->Slow 1200mV 85C Model->Fmax Summary, and it's showing 4 clocks:

    Fmax           Restricted Fmax  Clock Name
1: 77.68 MHz    77.68 MHz          inst17|altpll_component|auto_generated|pll1|clk[2]
2: 139.18 MHz  139.18 MHz        inst17|altpll_component|auto_generated|pll1|clk[0]
3: 268.6 MHz    268.6 MHz          inst17|altpll_component|auto_generated|pll1|clk[1]

(and a 919.12 MHz Fmax for the PS/2 keyboard_to_ascii debounce circuit)

So hopefully all working well now (apart from the IO read cycle / PS2_CHAR register output).   ;D
 

Online BrianHG

  • Super Contributor
  • ***
  • Posts: 7660
  • Country: ca
Re: FPGA VGA Controller for 8-bit computer
« Reply #1214 on: July 04, 2020, 12:22:39 am »

I'm not sure I know exactly what I'm doing checking Fmax, but I've looked at the Timing Analyzer->Slow 1200mV 85C Model->Fmax Summary, and it's showing 4 clocks:

    Fmax           Restricted Fmax  Clock Name
1: 77.68 MHz    77.68 MHz          inst17|altpll_component|auto_generated|pll1|clk[2]
2: 139.18 MHz  139.18 MHz        inst17|altpll_component|auto_generated|pll1|clk[0]
3: 268.6 MHz    268.6 MHz          inst17|altpll_component|auto_generated|pll1|clk[1]

(and a 919.12 MHz Fmax for the PS/2 keyboard_to_ascii debounce circuit)

So hopefully all working well now (apart from the IO read cycle / PS2_CHAR register output).   ;D

This is your 'pll1', 'inst17' you have in your project.  It has 4 clock outputs in use.  clk[0,1,2].  (don't worry about the missing c[3], it just feeds 1 flipflop data input, it's actually not clocking anything so it is not reported)


     I calculated the frequencies of the clock outs clk [0,1,2] based on the PLL settings in the window.  Now, in the timing report, are all the reported fastest possible clocks of the GPU for each clk[ # ] domain fast enough to operate without flaw according to the actual clock speeds we are running those sections at? 
« Last Edit: July 04, 2020, 12:30:45 am by BrianHG »
 

Online BrianHG

  • Super Contributor
  • ***
  • Posts: 7660
  • Country: ca
Re: FPGA VGA Controller for 8-bit computer
« Reply #1215 on: July 04, 2020, 12:32:48 am »
I've still got duplicate keys appearing when I type, but they're about as frequent/possibly slightly less frequent than before.  More an annoyance than a critical problem.
Upload the current Quartus project so I can take a look.
 

Offline nockieboyTopic starter

  • Super Contributor
  • ***
  • Posts: 1812
  • Country: england
Re: FPGA VGA Controller for 8-bit computer
« Reply #1216 on: July 04, 2020, 11:20:12 am »
This is your 'pll1', 'inst17' you have in your project.  It has 4 clock outputs in use.  clk[0,1,2].  (don't worry about the missing c[3], it just feeds 1 flipflop data input, it's actually not clocking anything so it is not reported)
(Attachment Link)

     I calculated the frequencies of the clock outs clk [0,1,2] based on the PLL settings in the window.  Now, in the timing report, are all the reported fastest possible clocks of the GPU for each clk[ # ] domain fast enough to operate without flaw according to the actual clock speeds we are running those sections at?

Thanks - I figured that was the case, just wasn't sure I was looking at the right report in the Timing Reports section - there's so many.  Yes, looks like we're above the minimum clock speeds for the domains we're using.  clk[0] requires 125 MHz, has an Fmax of 139.18 MHz, and clk[1] requires 250 MHz, has an Fmax of 268.6 MHz, so looking good so far, though the margins are slim.
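(Working those numbers out: (139.18 - 125) / 125 ≈ 11% of slack on the 125 MHz domain and (268.6 - 250) / 250 ≈ 7% on the 250 MHz one, so 'slim' is about right, particularly for the 250 MHz clock.)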

I've still got duplicate keys appearing when I type, but they're about as frequent/possibly slightly less frequent than before.  More an annoyance than a critical problem.
Upload the current Quartus project so I can take a look.

Current project attached.  To my untrained mind, it looks like it's a timing problem with the Z80_bridge when it puts the data for the ASCII char onto the Z80's data bus.  I did some experimentation with it a week or so ago and arrived at the code and timings you'll see in the project by trial and error. :-//

The following lines in Z80_bridge.sv are relevant to what's going on:

287-299 - latches the ASCII char into PS2_CHAR register when PS2_READY goes HIGH for one cycle.  (Ignore the INT_TYP conditional - it's always LOW until I get the interrupts up and running).
309-339 - handles putting the PS2_CHAR data onto the Z80 data bus during an IO read cycle.  PS2_CHAR should be 0x00, unless it has valid ASCII data from the ps2_keyboard_to_ascii.vhd module.  That way, the Z80 reads the PS2 IO port (240) and ignores it if it gets 0x00 back, otherwise it receives an ASCII char and acts upon it.
 

Online BrianHG

  • Super Contributor
  • ***
  • Posts: 7660
  • Country: ca
Re: FPGA VGA Controller for 8-bit computer
« Reply #1217 on: July 04, 2020, 01:38:28 pm »
How's the ellipse and filled ellipse coming?
After that, we have triangle & 4 vector polygon, then filled versions.
For these 2, we will have to update the line function so you can run 4 of them.

Either 4 independent (a for loop can build 4 engines with 4 sets of arrays), or 1 engine with 4 banks of registers which will use less gates, but 1 line coordinate will be computed at a time.

These will be based on, but replace, the current line.

 

Online BrianHG

  • Super Contributor
  • ***
  • Posts: 7660
  • Country: ca
Re: FPGA VGA Controller for 8-bit computer
« Reply #1218 on: July 04, 2020, 01:47:54 pm »
Current project attached.
ZERO_LATENCY should be = 1.
Please make it 1 and give it a try.

1 means reqs take immediate action.  0 means an additional clock cycle is waited until the req is considered.
 

Online BrianHG

  • Super Contributor
  • ***
  • Posts: 7660
  • Country: ca
Re: FPGA VGA Controller for 8-bit computer
« Reply #1219 on: July 04, 2020, 04:32:30 pm »
Ok, I cleaned up the 'geometry_xy_plotter.sv'.  The big change is I moved the absolute value size of dx&dy to lines 237&238.

Now, the setup for drawing the line is down to 1 clock.  This is as efficient as my box algorithm every time it increases the Y axis.  This is important because we will be getting rid of my old box algorithm in place of running 3 line algorithms.

For now, skip the ellipse and let's expand the line to support 4 different lines being drawn, with a pre-setup geo_shape which calls the lines in sequence to generate a:

1) basic line
2) line-to (same as line, but automatically updates x[0] & y[0] with x[1] and y[1] after the line is drawn - see the sketch after this list)
3) box/filled box
4) triangle/filled triangle
5) 4 coordinate polygon.
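
For item 2, 'line-to' really just means copying the end point back into the start point once the line engine reports it has finished - something like this (sketch only, with hypothetical names, not the actual plotter code):
Code: [Select]
module lineto_sketch (
    input  logic               clk,
    input  logic               line_done,      // hypothetical 'line finished' strobe
    input  logic               cmd_is_lineto,  // hypothetical command decode
    input  logic signed [11:0] x1, y1,         // end point of the line just drawn
    output logic signed [11:0] x0, y0          // start point used by the next command
);
    always_ff @(posedge clk)
        if (line_done && cmd_is_lineto) begin
            x0 <= x1;   // the end point becomes the new start point,
            y0 <= y1;   // so consecutive line-to commands chain together
        end
endmodule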

Later, we will add an additional variable control to the line algorithm to generate a Bézier curve / arc.
Once done, calling ellipse will just call #1, #3 or #5 with the right arc radius to generate an ellipse, or an ellipse at any angle.

Then, your geometry engine is done.  We will save the blitter copy for when the pixel memory writer is working and you successfully run the current geometry shapes.

I attached the patched line .sv & updated Quartus sim project.
 

Offline nockieboyTopic starter

  • Super Contributor
  • ***
  • Posts: 1812
  • Country: england
Re: FPGA VGA Controller for 8-bit computer
« Reply #1220 on: July 04, 2020, 05:10:27 pm »
How's the ellipse and filled ellipse coming?

Haven't had a chance to make a start on them yet - hopefully I've got a bit of time now to make some progress.

After that, we have triangle & 4 vector polygon, then filled versions.
For these 2, we will have to update the line function so you can run 4 of them.

Either 4 independent (a for loop can build 4 engines with 4 sets of arrays), or 1 engine with 4 banks of registers which will use less gates, but 1 line coordinate will be computed at a time.

These will be based on, but replace, the current line.

Sounds good.  With 10 MAGGIEs, the current build is taking up 85% of total logic elements in the EP4CE10, with 62% of memory bits, so there's still room.  It's taking over 4 minutes to build the project, though.  Those extra MAGGIEs seem to have slowed the build down a lot. :o

ZERO_LATENCY should be = 1.
Please make it 1 and give it a try.

1 means reqs take immediate action.  0 means an additional clock cycle is waited until the req is considered.

Made it 1 and have just been testing it - all looks very good.  It's probably luck or psychological, but it seems I'm getting fewer keyboard errors.  They're still there, but seem less frequent.

Ok, I cleaned up the 'geometry_xy_plotter.sv'.  The big change is I moved the absolute value size of dx&dy to lines 237&238.

Thank you - will download it and give it a try in a sec.
 

Online BrianHG

  • Super Contributor
  • ***
  • Posts: 7660
  • Country: ca
Re: FPGA VGA Controller for 8-bit computer
« Reply #1221 on: July 04, 2020, 10:54:07 pm »
The following lines in Z80_bridge.sv are relevant to what's going on:

287-299 - latches the ASCII char into PS2_CHAR register when PS2_READY goes HIGH for one cycle.  (Ignore the INT_TYP conditional - it's always LOW until I get the interrupts up and running).
309-339 - handles putting the PS2_CHAR data onto the Z80 data bus during an IO read cycle.  PS2_CHAR should be 0x00, unless it has valid ASCII data from the ps2_keyboard_to_ascii.vhd module.  That way, the Z80 reads the PS2 IO port (240) and ignores it if it gets 0x00 back, otherwise it receives an ASCII char and acts upon it.

Darn that Z80_bridge has turned into a mess, however, there are valid reasons why.
Do you have a simulation setup?  If so, post it here...

We're going to re-do most of it in a better way.

Step #1 will be to denounce and isolate any Z80 read request trigger point and Z80 write request trigger point.
The bus debounce and transaction delay will be built into these two 1-shot strobes.  These will have a programmable # of debounce/deglitch GPU clock cycles.  This is where you also take care of interrupts.
Currently, these are only done for the GPU ram with gpu_rd_req & gpu_wr_ena, then you have a bunch of other if's and delays for the other possible devices like the keyboard.

Step #2, the 1 shot strobes will acquire the address and data being sent from Z80 to FPGA.

Step #3, Based on address and request direction, all possible read or write data ports will be set here.

Step #4, the 1 shot Z80 read req will change the 245 IO direction until the read req terminates.

We will use a separate data_in and data_out register for the Z80's data bus.
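
Roughly what I have in mind for step #1 is something like this (sketch only, made-up names): sample the decoded Z80 strobe in the GPU clock domain and fire a single-clock pulse once it has been stable for a programmable number of samples.
Code: [Select]
module z80_strobe_1shot #(parameter int DEGLITCH_CLOCKS = 3)(
    input  logic clk,          // GPU clock
    input  logic z80_strobe,   // decoded Z80 read or write request (async to clk)
    output logic one_shot      // 1-clock pulse once the request is considered valid
);
    logic [DEGLITCH_CLOCKS:0] history = '0;

    always_ff @(posedge clk) begin
        history  <= {history[DEGLITCH_CLOCKS-1:0], z80_strobe};
        // last DEGLITCH_CLOCKS samples all high, but not yet acknowledged
        one_shot <= (&history[DEGLITCH_CLOCKS-1:0]) && !history[DEGLITCH_CLOCKS];
    end
endmodule
The address/data capture (step #2), port decoding (step #3) and the 245 direction control (step #4) would then all key off 1-shot pulses like this one.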
 

Offline nockieboyTopic starter

  • Super Contributor
  • ***
  • Posts: 1812
  • Country: england
Re: FPGA VGA Controller for 8-bit computer
« Reply #1222 on: July 04, 2020, 11:43:04 pm »
Darn that Z80_bridge has turned into a mess, however, there are valid reasons why.
Do you have a simulation setup?  If so, post it here...

Yes, it has become a bit messy since I have added stuff for testing and trying to get IO RD/WR from the Z80 working.  I guess I should housekeep the code more often.  :-[

Test setup for the Z80_bridge attached.  This is what I used in the other thread when I was testing the IO cycle.

What do you mean by 'denounce'?  Not sure on the terminology here.
 

Online BrianHG

  • Super Contributor
  • ***
  • Posts: 7660
  • Country: ca
Re: FPGA VGA Controller for 8-bit computer
« Reply #1223 on: July 05, 2020, 12:40:41 am »
Darn that Z80_bridge has turned into a mess, however, there are valid reasons why.
Do you have a simulation setup?  If so, post it here...

Yes, it has become a bit messy since I have added stuff for testing and trying to get IO RD/WR from the Z80 working.  I guess I should housekeep the code more often.  :-[

Test setup for the Z80_bridge attached.  This is what I used in the other thread when I was testing the IO cycle.

What do you mean by 'denounce'?  Not sure on the terminology here.
:palm: DE-Bounce.  Bloody auto-correct.
In other words, give the Z80 bus command lines time to settle before considering the address and data coming in, then decide if a transaction is required and if data should be returned to the Z80.
« Last Edit: July 05, 2020, 01:32:24 am by BrianHG »
 

Online BrianHG

  • Super Contributor
  • ***
  • Posts: 7660
  • Country: ca
Re: FPGA VGA Controller for 8-bit computer
« Reply #1224 on: July 05, 2020, 01:16:34 am »
Ok, I converted the GPU ram port from 8bit into a dynamic 16bit version.  This means I modified and adapted the following sections of the GPU:

sixteen_port_gpu_ram gpu_RAM.sv
vid_osd_generator.sv
GPU.bdf
data_mux_v2.sv

I also had to modify the compiler settings to ensure the compiler would meet the timing for the GPU ram, as with the old settings a few address & write_enable bits were arriving a little late.  Though this was there in your existing version, it probably didn't cause any problems, as the silicon usually performs better than the compiler's report, which assumes the worst possible conditions for the set operating specs.

Let me know if the attached project works.  If so, you will need to switch to that project from now on.

Also, why do you have your main system memory set shy of 32k and your palette memory set right after?
Quartus still wasted the Cyclone's ram on the missing bytes up to 32k in the main system memory, and then placed the separate palette memory in that shared address space.  Also, why is the main system memory an odd number of bytes?  You should have used 32768 for main ram and 32768 for the offset of the palette memory.  But you could have gone even a bit larger on the main system memory.  See here:



With the attached project and my settings, you now have 41KB - just enough memory for a 320x240 16 color screen plus a 256 character 8x8 pixel font.  Or a 640x240 4 color, or 640x480 2 color screen.  Even 160x240 @ 256 colors, or 160x120 @ 65536 colors.
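(Quick check on that: 320 x 240 x 4 bits = 38,400 bytes for the screen, plus 256 x 8 = 2,048 bytes for the font - about 40.5KB in total, so it only just fits inside the 41KB.)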

It should compile and function.
« Last Edit: July 05, 2020, 06:49:53 am by BrianHG »
 

