Some old school instruments showing how it's done (HP 3325A and Fluke 8506a)
SilverSolder:
--- Quote from: garrettm on February 08, 2021, 03:16:43 am ---
--- Quote from: SilverSolder on February 08, 2021, 02:58:59 am ---Re binary mode: 3 bytes is enough for an ADC reading (21 bits plus a sign bit).
It seems the 500 readings per second literally consists of taking an ADC reading and pumping it out on the interface, with no processing at all!
If we can get that to work, we could do the averaging externally as discussed.
[Edit] 3 bytes times 500 readings per second is 1500 bytes per second, probably requiring a minimum baud rate of something like 19,200 to transmit? I don't think 9,600 sounds like enough.
--- End quote ---
I like the idea of taking the raw output from the ADC. And again, the high speed mode assumes doing all the math remotely, i.e. calibration corrections, scaling, offset, etc. are performed by the PC.
A 3-byte binary payload with <CR><LF> termination gives 5 bytes per reading. Assuming 8N1 serial frames, that's 10 bits per byte. So 5[bytes/reading]*10[bits/byte]*500[readings/s] = 25000 bits per second minimum. So a modded Bit Serial Interface module with 19200 baud could in theory transmit 384 readings/s.
Assuming Fluke's specifications aren't a broken promise, I think we have a new eevblog forum challenge! Who can get their 8505/6A to read the fastest! Bonus points awarded for achieving 500 readings/s.
--- End quote ---
From the manual: The HighSpeed Reading mode provides a shortened 3-byte binary two's complement format response representing the input to the DMM's A/D Converter. Speeds up to 500 readings per second are possible in this mode of operation.
Also: The response data from the DMM will be in 3-byte format, as shown below, for each voltage or current reading.
I don't think there are any record separator characters... no CR-LF! Basically, your application has to keep track of the count...
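Something along these lines on the host side would do the framing by count (a minimal Python sketch using pyserial; the byte order, sign extension, and port settings are my assumptions, not from the manual):
Code:
# Sketch only: frame a continuous stream of 3-byte two's complement readings
# that has no record separators.  MSB-first byte order, sign extension to
# 24 bits, and the port settings are assumptions, not from the 8506A manual.
import serial  # pyserial

def decode_24bit(b0, b1, b2):
    # Combine three bytes (assumed MSB first) into a signed 24-bit integer.
    value = (b0 << 16) | (b1 << 8) | b2
    if value & 0x800000:        # top bit set -> negative two's complement value
        value -= 1 << 24
    return value

with serial.Serial("COM3", 19200, timeout=1) as port:   # port name is a placeholder
    buffer = bytearray()
    while True:
        buffer += port.read(64)
        # No CR-LF to resynchronize on, so consume exactly 3 bytes per reading.
        while len(buffer) >= 3:
            print(decode_24bit(buffer[0], buffer[1], buffer[2]))  # raw, un-scaled ADC counts
            del buffer[:3]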
SilverSolder:
I have found a much better disassembler for the 8080 machine code, called "DASMx".
Also, one of the universe's most closely guarded secrets was revealed: where does an 8080 begin executing code when it starts? (Answer: it starts at address 0000h, with no reset jump vectors or anything like that - but the 8080 manual assumed everyone knew that... maybe all processors did back in the day.)
So, I think the attached disassembly is quite reasonable, and probably accurate at least for the first few steps of processing on startup. It might be possible to glean how it figures out whether it is an 8505A or an 8506A, for example.
It looks like they were pressed for space and might have used "creative" coding to do a lot with a little - this is something that causes headaches for disassemblers, so really "cracking the code" would likely require an experienced 8080 assembly programmer to take a look.
joeqsmith:
--- Quote from: garrettm on February 08, 2021, 02:54:32 am ---
--- Quote from: joeqsmith on February 08, 2021, 02:35:06 am ---[...]
40 ASCII character readings per second would be 2.8 readings a second with 14 characters, but I was seeing data much faster than this. It's odd they would use the term Readings, as they defined that to mean display updates in the manual. I wouldn't be surprised if they turn off the display in these high speed modes. Still, I don't buy it. The reason is that I never see the sync running that fast with no data being transferred. Then again, maybe a reading in this context is the rate at which the ADC is making decisions, not how fast the display is being updated.
[...]
--- End quote ---
I interpret "ASCII character readings" to mean one 6.5-digit reading sent as a 13-character string with <CR><LF> tacked on, for a total of 15 chars per "reading". Hitting this rate could require command J to suppress <LF>, though, reducing it to 14 chars per reading.
--- End quote ---
Using your interpretation, it's close to what I was able to achieve, with or without the serial interface running.
--- Quote ---A 3-byte binary payload with <CR><LF> termination gives 5 bytes per reading. Assuming 8N1 serial frames, that's 10 bits per byte. So 5[bytes/reading]*10[bits/byte]*500[readings/s] = 25000 bits per second minimum. So a modded Bit Serial Interface module with 19200 baud could in theory transmit 384 readings/s.
--- End quote ---
I wasn't able to get the high speed mode to work with the serial port; however, with the async mode only sending 5 bytes, as I show in the one scope shot, it should still reach the 384 readings/sec. Even using this mode, the 40 readings/sec seems to be a stretch, as the serial port does not appear to be the limiting factor.
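Just to restate the link-speed ceiling as arithmetic (assuming 8N1 framing, so 10 bits on the wire per byte; the byte counts are the ones discussed above):
Code:
# Upper bound on readings/s imposed by the serial link alone, assuming 8N1 (10 bits/byte).
def max_readings_per_s(baud, bytes_per_reading):
    return baud / (10 * bytes_per_reading)

print(max_readings_per_s(19200, 5))   # 3-byte binary + CR/LF  -> 384.0
print(max_readings_per_s(19200, 3))   # bare 3-byte binary     -> 640.0
print(max_readings_per_s(9600, 15))   # 13-char ASCII + CR/LF  -> 64.0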
Thank you both for the feedback on the programmer. I hadn't realized there were clones of it. My old programmer could test the old CMOS and TTL parts. Not knowing if the parts were good, I would just plug them in. It could normally ID the parts (if they were good), which also proved useful. I did manage to blow one of the drivers once, but they used a fairly common part that was easy enough to change.
While I sort out what to get to replace mine, I'll borrow one and program up some PROMs to try. I'm not holding out hope that I will see any gains in speed, but maybe some of the other strange problems I see have been addressed.
garrettm:
Using the 1 kV range, zero off, and internal triggering at the DMM, and a 10 second gate time at the counter, I got the following results using the Scan Advance.
S0: 19.5 Hz, 51.3 ms / reading (51.3 ms / sample)
S1: 18.0 Hz, 55.6 ms / reading (27.8 ms / sample)
S2: 16.5 Hz, 60.6 ms / reading (15.2 ms / sample)
S3: 14.0 Hz, 71.4 ms / reading (8.9 ms / sample)
S4: 10.9 Hz, 91.7 ms / reading (5.7 ms / sample)
...
S10: 0.234 Hz, 4273 ms / reading (4.2 ms / sample)
S11: 0.117 Hz, 8547 ms / reading (4.2 ms / sample)
If you factor in the number of ADC samples per reading, you can see that we converge on an average of about 4.2 ms per sample from the ADC. From what SilverSolder has said, the meter takes 4 samples per mains cycle when in synchronous mode. That implies 4.17 ms per sample with a 60 Hz AC mains and correlates with my rough measurements.
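The convergence is easy to reproduce from the measured frequencies, taking 2^n samples per reading for mode Sn (a quick Python check, using the same rounded figures as above):
Code:
# Reproduce the ms/reading and ms/sample figures above: period = 1/f, samples = 2**n for Sn.
measured_hz = {0: 19.5, 1: 18.0, 2: 16.5, 3: 14.0, 4: 10.9, 10: 0.234, 11: 0.117}

for n, f in measured_hz.items():
    period_ms = 1000.0 / f
    print(f"S{n}: {period_ms:.1f} ms/reading, {period_ms / 2**n:.1f} ms/sample")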
So Fluke isn't completely full of BS. However, the meter appears to add a variable delay between groups of sequential readings used for the average. The question is, how, or even if, we can minimize this internal delay.
I assume the Controller averages the raw sequential ADC readings and then applies the range scaling, calibration corrections, and any math functions afterward. I doubt this processing accounts for the observed delay.
Using 4.16... ms as the digitizing time required for each individual sample and multiplying this by the number of sequential samples for each mode, we obtain the total digitizing time. Subtracting this from the measured reading periods above, we obtain the following delays.
S0: 47.13 ms (51.3 - 4.16...*2^0)
S1: 47.27 ms (55.6 - 4.16...*2^1)
S2: 43.93 ms (60.6 - 4.16...*2^2)
S3: 38.07 ms (71.4 - 4.16...*2^3)
S4: 25.03 ms (91.7 - 4.16...*2^4)
...
S10: 6.33 ms (4273 - 4.16...*2^10)
S11: 13.67 ms (8547 - 4.16...*2^11)
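The same overhead figures fall out directly if you take 4 samples per 60 Hz mains cycle as the per-sample time (another quick check against the measured periods listed further up):
Code:
# Inferred per-reading overhead: measured period minus (sample time * 2**n samples).
T_SAMPLE_MS = 1000.0 / (60 * 4)   # 4 samples per 60 Hz mains cycle = 4.1666... ms

periods_ms = {0: 51.3, 1: 55.6, 2: 60.6, 3: 71.4, 4: 91.7, 10: 4273, 11: 8547}

for n, period in periods_ms.items():
    print(f"S{n}: {period - T_SAMPLE_MS * 2**n:.2f} ms overhead")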
SilverSolder:
--- Quote from: joeqsmith on February 07, 2021, 10:55:04 pm ---[...] So I looked at the schematic and I notice U24 routes A13. Then we see U23 ties pin 26 high. Looking at my original manual, it is also routed like this. They must have planned on adding some features to U24 at one point and decided to support the 27128. [...]
--- End quote ---
The disassembler found three jumps outside the confines of the 16K available (0000h-3FFFh):
Ignoring branch outside ROM to 40D8 at address 25A2
Ignoring branch outside ROM to 40E9 at address 25B9
Ignoring branch outside ROM to 40CB at address 2555
So perhaps there are some special versions of this meter with extra functionality? Slowly, light is beginning to dawn on the disassembly effort, thanks to help from @retiredfeline in a separate thread on 8080 assembly coding.
[Edit] Another thing: The code appears to start in U24 (at address 0000 in that EPROM), and continues into U23. A 'gotcha'! :D