The bus frequency is less important than the edge rate. Adding source termination limits the frequency but also adds considerable delay even at low frequencies.
Well, the buses in question are the 68k (at ~8MHz) and the 6502 (at ~1.7MHz) - Given that some of this is 40-year-old hardware, I'll be grateful if the signals aren't sinusoids...
It's probably safer to leave things as they are - my eyes just lit up at the opportunity to get rid of the need for 11 direction pins... That's a lot of pin real-estate when you've only got 60-odd to start with...
This is interesting, as it's common to want to connect a 5V chip like this to an FPGA. Out of interest, why so many direction pins if only the data bus changes direction? I haven't tried it, but I wonder if auto-direction level shifters like the TXB0108 would work for something like this, to save direction pins. Has anyone had success?
Well, in my case it's not just the data-bus that's changing direction
The DECA board for $37 has resurrected an old pet project that fell by the wayside a while back, to add standardized "expansion slots" to old computers via their exposed bus signals. I'm going to start with the Atari XL/XE line, but it seems to me that the ST could work just as well using its cartridge port, and the C64 seems like a good candidate too. I know nothing about the Amiga internals, but maybe that too..
The idea is to monitor the A, D, and control signals, and to provide a couple of general capabilities:
- Provide "memory apertures" which allow banking in of external RAM (I was originally going for 8MB of SDRAM, but now it's 512 MB of DDR3!) on a page-by-page basis. Define a memory descriptor which the host can write to, consisting of page-start, page-stop and DDR-offset, and take over the bus to return the correct value from DDR if an access falls within that range. An extension of this would add stride, x, y, w, h and allow off-screen bitmaps to be easily manipulated from the host
- Monitoring the bus will let me detect (apart from perhaps the Amiga, with its FastRAM) the accesses the computer makes to draw the screen. I was originally planning on a dumb frame buffer (or handing off to an STM32H7), but then BrianHG helped out Nockieboy with an 8-bit GPU, so the plan now is to render to a 16-bit frame buffer using Brian's DDR controller and output over the HDMI on the DECA board. I'll also be able to offer high-resolution, full-color graphics to old computers as standard - it's impressive how many filled ellipses Nockieboy's Z80 is throwing around the screen - and MAGGIEs could be useful for the hardware sprites that older systems had in them
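To make the memory-descriptor idea in the first bullet concrete, here's a quick Python model of the lookup the FPGA would do on each bus access - the field names (page_start, page_stop, ddr_offset) and the 256-byte page size are just my assumptions for illustration, not anything final:

```python
# Toy model of the memory-aperture lookup described above.
# Field names and page size are assumptions, not a real design.

PAGE_SIZE = 256  # 6502-style 256-byte pages

class Aperture:
    def __init__(self, page_start, page_stop, ddr_offset):
        self.page_start = page_start    # first host page mapped (inclusive)
        self.page_stop = page_stop      # last host page mapped (inclusive)
        self.ddr_offset = ddr_offset    # base address of the region in DDR3

    def translate(self, host_addr):
        """Return the DDR address for host_addr, or None if unmapped."""
        page = host_addr // PAGE_SIZE
        if self.page_start <= page <= self.page_stop:
            return self.ddr_offset + (host_addr - self.page_start * PAGE_SIZE)
        return None

# Example: map host pages 0x40..0x7F ($4000..$7FFF) to DDR at 1 MiB
ap = Aperture(0x40, 0x7F, 0x100000)
```

When `translate()` returns an address, the FPGA would claim the cycle and serve the data from DDR; when it returns None, the host's own memory responds as normal.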
Given the first of those above, I can provide memory-mapped access to expansion slots - I'm planning on 4 or 6 of them at the moment, probably using RP2040s essentially as intelligent bi-directional io-expanders-with-buffers because they're so damn cheap. The idea is that the card in the slot (probably using PCIe x1 for the physical connector) will talk over SPI or maybe even UART (or both - basically interfaces that every modern microcontroller has) to the RP2040(s), which will then talk to the FPGA over a far faster bus. It's possible I might allow "bus masters" so slots can talk to each other without involving the FPGA, but probably not. Once the data is in the FPGA, it's accessible to the host via a memory aperture for the slot.
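As a sketch of how little framing a slot transaction over that SPI/UART link actually needs, here's a toy Python model of one possible wire format - the opcode values and field widths are made up for illustration, not any real protocol:

```python
import struct

# Hypothetical wire format for a slot transaction between the RP2040
# bridge and the card: 1 opcode byte, 24-bit address, optional data byte.
# All values here are assumptions, purely to show the shape of the thing.

OP_WRITE = 0x01
OP_READ  = 0x02

def pack_write(addr, data):
    """Pack a write: opcode, 24-bit big-endian address, 8-bit data."""
    return struct.pack(">B3sB", OP_WRITE, addr.to_bytes(3, "big"), data)

def pack_read(addr):
    """Pack a read request: opcode plus 24-bit address, no data."""
    return struct.pack(">B3s", OP_READ, addr.to_bytes(3, "big"))

def unpack(frame):
    """Decode a frame back into (opcode, addr, data-or-None)."""
    op = frame[0]
    addr = int.from_bytes(frame[1:4], "big")
    data = frame[4] if op == OP_WRITE else None
    return op, addr, data
```

Five bytes per write, four per read - easy for any microcontroller to generate, and trivial for the RP2040 to translate onto the faster FPGA-side bus.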
It also means I can expose the other devices on the DECA card (SD-card for "hard drive", Ethernet, USB) as built-in "slots" in the same manner. Once ethernet is working, it'd be possible to have 'virtual' slots, which connect to a socket on the network and you can run a program on a server which provides a 'peripheral' to the host.
Anyway, to (finally, sorry) answer your question... To take over the bus, and possibly even DMA directly into the host computer's internal RAM, there are several other signals that I want to control, which let me /HALT the processor on demand or signal a bogus /REFresh command. About the only signal I've not found a possible use for controlling is the host computer CLK
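The takeover itself is basically a tiny state machine - assert /HALT, wait for the CPU to actually let go of the bus, do the transfer, release. A rough Python model of just the ordering (host-specific timing omitted, state and event names are mine):

```python
from enum import Enum, auto

# Rough model of the bus-takeover handshake: the order of operations
# only. Real timing (setup/hold around /HALT, DMA handshaking) is
# host-specific and not modeled here.

class BusState(Enum):
    IDLE = auto()            # CPU owns the bus
    HALT_ASSERTED = auto()   # /HALT driven low, waiting for CPU
    BUS_OWNED = auto()       # FPGA drives address/data/control

def step(state, event):
    """Advance the takeover state machine; unknown events are ignored."""
    transitions = {
        (BusState.IDLE, "assert_halt"): BusState.HALT_ASSERTED,
        (BusState.HALT_ASSERTED, "cpu_released_bus"): BusState.BUS_OWNED,
        (BusState.BUS_OWNED, "release_halt"): BusState.IDLE,
    }
    return transitions.get((state, event), state)
```

The point of modeling it this way is that out-of-sequence events simply leave the state alone, which is also roughly what you want the RTL to do.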
I've had bad experiences at work with auto-direction level shifters - I'm not an EE, I'm an aging physicist-cum-software-engineer, but I worked with EEs for several years in Platform Architecture, which is basically prototyping, and none of them had nice things to say about them. Complaints included inadequate drive strength, signals not being pulled to where you want them to go, speed issues, etc.
I figured it was easier, especially with old hardware - none of this new-fangled CMOS stuff, we're NMOS baby, and we like it that way - to have the variants with power on both sides and a dedicated direction pin. I don't mind wasting the company's money on experiments but I'm a lot more persnickety about my own [grin]