Author Topic: Agilent 54831D modernising  (Read 86787 times)

0 Members and 1 Guest are viewing this topic.

Offline HowardlongTopic starter

  • Super Contributor
  • ***
  • Posts: 5315
  • Country: gb
Agilent 54831D modernising
« on: November 07, 2016, 04:33:19 pm »
Folks

I've spent the past few weeks on and off investigating getting my old PC-based 54831D scope modernised to a Skylake processor, and although it's not yet a completed project, I thought I'd put some notes down as I've seen a handful of posts on the subject of updating these scopes. The aim is to have a reasonably current OS, and improve the scope's general performance, particularly when dealing with large memory acquisitions and doing decodes, which is my typical use case for this instrument. If there are any corrections or other constructive comments please feel free to add them.

Why? To be completely honest I've burned far too much time and resources on this project, I thought it'd be half a day or so. I'll now never see the RoI, but I've been like a dog with a bone on it, and I don't really need to do it: for day to day use there are better instruments at my disposal for most things, but occasionally the 54831D still earns its place on the bench. So it's because you can, not because you need to do it, a bit like climbing Everest.

If you're considering typing a response along the lines of "why would you do that?", I have the above pre-baked answers to save your typing effort  ;)

Overview

Here is the Agilent 5483xx family:

Model   Bandwidth   Channels
54830D   600MHz   2+16
54831D   600MHz   4+16
54832D   1GHz      4+16
54833D   1GHz      2+16
54830B   600MHz   2
54831B   600MHz   4
54832B   1GHz      4
54833A   1GHz      2

These are all 4GSa/s scopes, dropping to 2GSa/s when using channels that share the same ADC. For example, on the 4-channel scopes, using ch 1+2 or 3+4 results in a maximum sampling rate of 2GSa/s on each channel, but using ch 1+3, 1+4, 2+3 or 2+4 gives you 4GSa/s on each channel. They max out at 128Mpts memory.
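The ADC-sharing rule is easy to get wrong when picking channels, so here is a minimal sketch of it (Python, purely illustrative; the `ADC_PAIRS` grouping and the `max_rate_gsa` helper are my own names, derived from the 4-channel behaviour described above):

```python
# Channels 1+2 share one ADC and 3+4 share the other on the 4-channel
# models, so enabling both channels of a pair halves the per-channel rate.
ADC_PAIRS = [{1, 2}, {3, 4}]

def max_rate_gsa(active_channels):
    """Per-channel sample rate in GSa/s for a set of active channels."""
    for pair in ADC_PAIRS:
        if pair <= set(active_channels):  # both channels of a shared ADC in use
            return 2
    return 4

print(max_rate_gsa({1, 3}))  # 4: channels on different ADCs
print(max_rate_gsa({1, 2}))  # 2: channels share an ADC
```

In other words, for two-channel work always pick one channel from each pair.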

You can't update a 2ch scope to a 4ch scope, or a DSO to an MSO, as required front panel features will be missing (sounds obvious, but I mention it here to avoid any doubt).

It's a Windows XP Pro PC-based scope from about 2002, and was sold until about 2005. It's woefully underpowered nowadays in terms of computing power, using an old 1GHz P3 with 512MB RAM and a PCI (not PCIe) motherboard, but the acquisition electronics remain solid, although devotees of high update rates and intensity-graded displays will be largely disappointed. To display the waveforms quickly enough on this old hardware it uses the concept of VGA overlay, which hands over a rectangular section of the screen to hardware external to the graphics card. There were a number of similar scopes and even LAs from the era, and some of the really early ones were Windows 98 based with even less RAM.

The unit I have was based on the "latest" Motorola VP22 motherboard. This maxes out with the 1GHz P3 and 512MB RAM as far as I've been able to fathom in terms of my own testing and what I could find out from Googling.

The UI is limited on the front panel. You frequently need to use the Windows UI to access things, though rarely the keyboard, so I keep a wireless USB mouse under the desk on an under-desk keyboard slide: that way it doesn't take up bench space, and you don't even need to slide out the keyboard, you just operate the mouse from under the desk.

First, the easy upgrades

Update to an SSD, either PATA (if you can find one) or SATA with a SATA/PATA adapter (there are no SATA ports on the motherboard). Some really new SATA drives I couldn't get to boot. If you use a SATA/PATA adapter, keep in mind that the internal CD-ROM is configured as an IDE slave. While there are concerns about XP and TRIM, I've been using the same SSD for a couple of years now without any problems.

The 600MHz scopes can be updated to 1GHz by removing a resistor on the acquisition board. https://www.eevblog.com/forum/testgear/54831b-upgrade-to-54832b-possible/

There are a few other easy upgrades knocking around too if you do a bit of judicious Googling.

Tools of the trade

To back up, move and resize the disk I used Acronis True Image 2013 and 2016. If you choose to do the imaging on the scope itself, be aware that the 640x480 screen becomes a limitation; there is a fix: https://kb.acronis.com/content/3836 In general though, you'll find it quicker to back up and restore images using a more modern PC booted with Acronis TI, with assorted USB 3.0 to SATA/PATA adapters if you have them available. I am sure there are other methods; this is what I used.

I also found I had to fix the restored disk's MBR a few times after imaging; for that I used Paragon Disk Manager. It appears that the Smart Boot Manager deployed with a virgin scope doesn't much like resizing partitions. The password Agilent used to configure it is "agilent" (lower case, no quotes), but rather than fight with it I just blew away the MBR. The problem is then that if you need to get into the recovery partition, you have some work to do, but then you did keep the original disk, right?  :P

USB 2.0 PCI adapter. The VP22 motherboard only supports USB 1.0 onboard. I used a SATA/USB 2.0 combo board that I had, but I couldn't boot a SATA drive from it on this board, although the card's ROM was recognised at POST. I also tried a USB 3.0 PCI card (rare on PCI!), but there's limited point to this as the motherboard's PCI bus maxes out at barely over 1Gbps. Also consider that the PCI bus is a shared resource, and some of it will be demanded by the scope app, so take care over what you place on this bus.

PCIe bootable SATA/PATA card - handy because Skylake motherboards only rarely support IDE booting.

SATA CD/DVD drive - Skylake only supports USB 3.0 out of the box, so your OS needs USB 3.0 drivers built in; the default USB 2.0 drivers won't work on Skylake. You can slipstream them, but I've had very limited success with this (on an X99 board with an i7-6800K!).

PCI to PCIe and PCIe to PCI riser adapters.

Upgrading the OS

Once, maybe 25-30 years ago, I considered that I knew my way around a motherboard and PCs in general pretty well. Those were the days when frankly there wasn't really much to know: arm yourself with a screwdriver and Bob's your uncle. As time has gone on, PC architecture has become layer upon layer of functionality, with some things being completely disposed of and becoming obsolete, while others have become so ingrained it's nigh on impossible to remove them, like the PS/2 port for example, essential if you can't even get a USB stack up, let alone Bluetooth of course.

So much of this is me getting up to speed with today's PC tech, although I've been doing it on and off pretty much since the mid 80s when the PC was born.

I tried doing fresh OS installs (Windows Vista and Windows 7), and although the OS installs worked, I couldn't get the scope app to work afterwards. There's some Sentinel service running, and I don't know how much that has to do with it, but it's generally for copy protection of one sort or another. There are also some other bits and pieces you need to find, such as drivers, the IO libraries and the app itself.

So I chose to do in-place upgrades. To upgrade in place from XP to Windows 7 you need to do this via Vista. The only in-place upgrade sequence from XP Pro is XP Pro -> Vista Ultimate -> Windows 7 Ultimate. You can't go straight from XP Pro to W7.

I could get the existing OS to run on a Skylake H110 motherboard by using Acronis Universal Restore to add the XP SATA drivers to the build (driver here: http://www.win-raid.com/t11f23-Modded-Intel-AHCI-and-RAID-Drivers-digitally-signed.html using the 32-bit 11.2 version, which covers Skylake). Often I also needed to do an XP repair (with media slipstreamed with the aforementioned drivers using nLite) to fix ACPI BSODs, pressing F5 when the F6 prompt is offered and selecting the Pentium 4 ACPI option later on. I sometimes found that the slipstreamed restored image needed activating, which is irritating to say the least, as there's no network set up, so it's a manual telephone process. To add insult to injury, the activation process itself is broken, and you need to do a manual install of IE8 in safe mode to work around it (go figure).

Unless you're a current hardcore PC engineer who makes slipstreamed bootable USB thumb drives in their sleep, I recommend installing from CD/DVD rather than trying to make up bootable USB drives, which add to the general confusion, although I accept that in the longer term they'd be more convenient. For bootable USB drives for things like Paragon Disk Manager and Acronis, I used SanDisk Extreme USB 3.0 thumb drives as they're super fast.

Upgrading from XP to Vista can be done on the scope's VP22 board, and this works, although of course it's horrendously slow. One thing to note is that the upgrade initially refuses to proceed, complaining that .NET needs to be uninstalled, but no officially documented way of removing it seemed to succeed (way to go M$). In the end I simply renamed the C:\Windows\System32\WindowsPowerShell directory and re-ran the upgrade. The scope app itself is fine running on the VP22 board under Vista. Upgrading to Windows 7, though, failed and rolled back each time I tried it on this board.

I had no luck doing an in-place Vista upgrade on a ported XP image on a Skylake board, it failed each time. Instead I restored the working VP22 Vista image and retrofitted the Skylake AHCI drivers using Acronis Universal Restore. I could then do a Windows 7 upgrade on the Skylake board.

I have had success running the scope fully up to Vista on the VP22 board, and also with a ported XP on a Skylake board. The ported XP suffered because it lacked working drivers for things like USB (at least I've not been able to get them to work yet), and the ACPI mismatch means it won't properly turn off or reboot under software control.

Practical logistical problems

The IO shield on these scopes is milled or punched out of the chassis; it's not replaceable. So the choices are (a) find a compatible board [I failed], (b) hack a hole for an IO shield, or (c) use assorted risers. If you're brave, then sure, go for (b). Considering this scope still has substantial resale value in its factory state, I preferred option (c). More on this later, I am sure.

I've tried other options such as mini ITX and DTX boards with PCIe to multiple-PCI daughter boards in place of where the mATX board would present its cards. This hasn't been too successful; the daughter boards tended not to work, or only worked very intermittently. I strongly suspect this is a power-related problem, although they are supplied with their own Molex or SATA power connector. At first the USB 3.0 male-male cables used to send the PCIe data were the suspects, but now I'm not so sure.

The preferred method now is to go back to using individual PCI to PCIe adapter risers off the mATX board and fabricate a frame internally.

Drivers...

Regarding drivers: as well as the Agilent drivers, there are also the drivers for the CT65550 graphics card pair. There's only one PCI acquisition/GPIB interface board, but it requires about eight drivers. Most of the Agilent drivers are not properly exposed because the .inf files are missing from the deployed build. They can, however, be found on the scope's RECOVERY partition inside a Ghost image. I opened this up inside an old VM with Ghost 9.0 installed and managed to find a bunch of .infs for all the devices. Here are the relevant .inf and .sys files I found:

A0014602.inf
Adobe.inf
Adobe.sys
agave.inf
agave.sys
agbridge.sys
agt357.inf
agt82341.sys
agt82350.sys
agt82357.sys
agtgpib.inf
faroacq.inf
Faroacq.sys
mesa.inf
Mesa.sys
Phramacq.inf
phramacq.sys
tstone.inf
tstone.sys
Zeum.inf
Zeum.sys

I located all of these and placed them in a single directory to ease later installation from the Device Manager applet.
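Gathering the files into one directory can be scripted; a minimal sketch in Python (the `collect_drivers` function and both example paths below are my own hypothetical names, not anything shipped with the scope):

```python
import shutil
from pathlib import Path

def collect_drivers(source, dest, patterns=("*.inf", "*.sys")):
    """Recursively copy driver files matching `patterns` from `source` into `dest`."""
    source, dest = Path(source), Path(dest)
    dest.mkdir(parents=True, exist_ok=True)
    copied = []
    for pattern in patterns:
        for f in source.rglob(pattern):
            shutil.copy2(f, dest / f.name)  # copy2 keeps file timestamps
            copied.append(f.name)
    return sorted(copied)

# e.g. collect_drivers(r"D:\GhostExtract", r"C:\AgilentDrivers")
```

You can then point Device Manager's "Have Disk..." browse dialog at that single folder for each unknown device.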

The CT65550 graphics card driver is in c:\WinXPFiles\CT65550. I am still having trouble getting this to install properly and consistently on newer operating systems; it seems quite flaky when you override the default VGA driver.

Once you have all the above installed, you need to run the postsysprep.bat batch file in the C:\PciFilter directory. Hopefully when you reboot, all will work!

Interesting...

More recently, a couple of days ago, I went full throttle and did an in-place Windows 7 install on a Skylake motherboard, on top of a Vista that was suffering slow screen updating in the scope app, with maybe one graticule update per second or so. Interestingly, the updated screen graticule looked like it had intensity grading, something I've never seen on this scope (other than with the limited, very slow persistence-based blue-trace MegaZoom option).

At this point I thought I'd take a punt and remove the CT65550 cards, and see what the on board Intel 530 graphics did with an external monitor, wondering how (or if) this works with the same software in later scopes like the 8000 series which don't use the VGA overlay technique, assuming that this was what was happening. Well, it seemed at first to work really well, screen updates were at least as good as before and there was definitely intensity grading now. I could also increase the resolution, with 1024x768 being the maximum that the application would show.

Then I took a look at what the scope thought it was with Help->About. It seemed it now thought it was an MSO8104A!

All this was short-lived though, because I tried the scope on equivalent time rather than real time using the built in calibrator, and the graticule was just gobbledegook. The app crashed soon afterwards too, so I strongly suspect in its attempt to be something it isn't it broke something. Reinstalling the old VP22 board in its virgin state restored everything to normality.

So the effort now is to try to get it back to thinking the Windows 7/Skylake combo is a 54831D/54832D with the CT65550 with the integrated flat panel.

Results so far

Running the ported XP image on a $60 Skylake micro ATX H110 board with the three PCI cards on individual PCIe->PCI risers performed extraordinarily well with a G4500T 2C2T Pentium 3.0GHz/3MB cache 35W processor; however, those risers mean the boards won't fit upright in the chassis. For a short time I also tried a 2C2T G3900 Celeron 2.8GHz/2MB cache 51W processor and a 2C4T i3-6320 3.9GHz/4MB cache 51W processor. Boot times were pretty impressive, under 20s to get the scope screen up in the case of the i3, and not far behind with the other two processors. This being a single-threaded app, the application start-up times all closely reflected the processor speed. The first time I started the scope app I assumed something must have failed because it came up so quickly!

There is a limit to the processor you can put in this scope without an additional power supply, which is why I went for the 35W G4500T: it seemed a reasonable selection based on price and single-thread performance for a 35W part.

Tips for buying 5483xx scopes (for later further expansion I am sure)

  • Make sure it has the VP22 motherboard, with Windows XP installed.
  • Probes, in particular the digital probes for the MSO, are fairly rare eBay finds: they're not the same as those used on other lower end Agilent/Keysight MSOs.
« Last Edit: November 07, 2016, 09:10:09 pm by Howardlong »
 
The following users thanked this post: tv84, tonykara, coromonadalix, wd5jfr, Leho1986

Offline lukier

  • Supporter
  • ****
  • Posts: 634
  • Country: pl
    • Homepage
Re: Agilent 54831D modernising
« Reply #1 on: November 07, 2016, 05:14:17 pm »
Love this type of thread. I too often get ancient stuff and go to great lengths to modernize it (my most recent one was upgrading the insides of a TLA704 to make it a TLA715).

I think the biggest issue is this silly VGA overlay that requires ancient CT65550 graphic card tightly coupled to the OS, drivers and the scope app.

I wonder why Agilent designed it the way they did. I could see another possibility: treat the oscilloscope graticule / intensity grading output similarly to a camera video stream. Back in the days of these scopes it would have been, for example, a Bt848 PCI video grabber/tuner: grab a frame, send it over PCI to the app, and then the app can display it onto a Win32/DirectX/OpenGL surface. It has a small overhead of buffer copies, but it is much simpler, and modernization with a PCIe to PCI adapter should be possible.

The MSO8104A glitch sounds promising, maybe it is worth pursuing further. I could imagine (just guessing, I don't have the scope) that providing a phony DLL (that the scope app tries to use) instead of the real driver might stop the crashes.
 
The following users thanked this post: Leho1986

Offline HowardlongTopic starter

  • Super Contributor
  • ***
  • Posts: 5315
  • Country: gb
Re: Agilent 54831D modernising
« Reply #2 on: November 07, 2016, 05:59:45 pm »
Love this type of thread. I too often get ancient stuff and go to great lengths to modernize it (my most recent one was upgrading the insides of a TLA704 to make it a TLA715).

I think the biggest issue is this silly VGA overlay that requires ancient CT65550 graphic card tightly coupled to the OS, drivers and the scope app.

I wonder why Agilent designed it the way they did. I could see another possibility: treat the oscilloscope graticule / intensity grading output similarly to a camera video stream. Back in the days of these scopes it would have been, for example, a Bt848 PCI video grabber/tuner: grab a frame, send it over PCI to the app, and then the app can display it onto a Win32/DirectX/OpenGL surface. It has a small overhead of buffer copies, but it is much simpler, and modernization with a PCIe to PCI adapter should be possible.

The MSO8104A glitch sounds promising, maybe it is worth pursuing further. I could imagine (just guessing, I don't have the scope) that providing a phony DLL (that the scope app tries to use) instead of the real driver might stop the crashes.

As I understand it the next gen 8000 series don't use VGA overlay, favouring more conventional (for today) graphics card methods instead. I am pretty sure upgrading the OS and motherboard on these will be a piece of cake, barring the I/O shield nonsense of course.

I remember doing a project in the mid 90s on Windows 95, and getting high-speed, simple machine-rendered screen blits to the screen at video speed was indeed a complete nightmare. That project failed pretty quickly because it was realised that what we were trying to achieve simply wasn't economically viable with the day's technology, and that technology gap surprised me at the time: almost nothing seemed to have moved on in PC graphics from when I was writing Windows display drivers in the late 80s.

I had a look to see if the correct sized 8.4" XGA screens are available in case I get the non-CT65550 version to work, and it appears they are; Mitsubishi seem to be the manufacturer, but I'd need to check the mountings are equivalent. There also appear to be HDMI/DVI to LVDS adapters for them for about $25, so it'd be a simple matter to interface to a modern motherboard. I'm not sure the touch interfaces are as easy to source, though. I did have a brief look for a ready-made touch screen monitor that I could take apart, but I doubt many are XGA. In addition, there are fewer and fewer available in a 4:3 aspect ratio these days.
 

Offline Wuerstchenhund

  • Super Contributor
  • ***
  • Posts: 3088
  • Country: gb
  • Able to drop by occasionally only
Re: Agilent 54831D modernising
« Reply #3 on: November 08, 2016, 06:22:39 am »
I've spent the past few weeks on and off investigating getting my old PC-based 54831D scope modernised to a Skylake processor, and although it's not yet a completed project, I thought I'd put some notes down as I've seen a handful of posts on the subject of updating these scopes. The aim is to have a reasonably current OS, and improve the scope's general performance, particularly when dealing with large memory acquisitions and doing decodes, which is my typical use case for this instrument.

I doubt you'll see much improvement from a PC board upgrade; after all, these scopes use a proprietary ASIC architecture which determines the scope's performance, not the PC part. Windows is merely used as a UI and display layer on these scopes. A while ago I did a few tests upgrading my Infiniium DSO8064A, although all under Windows XP, and overall scope performance did not change one bit with a faster CPU (general Windows and application performance did, but it was hardly worth it). Again, this was expected, as the scope performance is decided by the proprietary ASICs (i.e. MegaZoom) and not by the PC platform.

I think a better project would be to try to get the DSO9000A software (which is newer and supports more serial decode standards) to work on the old Infiniium 54800/DSO8000 Series.
« Last Edit: November 08, 2016, 06:33:53 am by Wuerstchenhund »
 

Offline HowardlongTopic starter

  • Super Contributor
  • ***
  • Posts: 5315
  • Country: gb
Re: Agilent 54831D modernising
« Reply #4 on: November 08, 2016, 08:37:12 am »
I've spent the past few weeks on and off investigating getting my old PC-based 54831D scope modernised to a Skylake processor, and although it's not yet a completed project, I thought I'd put some notes down as I've seen a handful of posts on the subject of updating these scopes. The aim is to have a reasonably current OS, and improve the scope's general performance, particularly when dealing with large memory acquisitions and doing decodes, which is my typical use case for this instrument.

I doubt you'll see much improvement from a PC board upgrade; after all, these scopes use a proprietary ASIC architecture which determines the scope's performance, not the PC part. Windows is merely used as a UI and display layer on these scopes. A while ago I did a few tests upgrading my Infiniium DSO8064A, although all under Windows XP, and overall scope performance did not change one bit with a faster CPU (general Windows and application performance did, but it was hardly worth it). Again, this was expected, as the scope performance is decided by the proprietary ASICs (i.e. MegaZoom) and not by the PC platform.

I think a better project would be to try to get the DSO9000A software (which is newer and supports more serial decode standards) to work on the old Infiniium 54800/DSO8000 Series.

While I realise boot time has no importance to you, in this instance the unit boots to the scope screen in under 20s, down from about 2 minutes. My workflow doesn't extend to having a scope switched on from 9am to 5pm with a half-hour break for coffee while I wait for it to warm up; I switch it on as I need it and accept that there might be some minor cal aberrations during the warm-up time.

There is also the possibility from early indications that with the improved processing power and integrated GPU it introduces intensity grading, something not available on the original 54830 series. That's before measuring things like decode performance, the main reason for starting this.

Anyway, as I stated at the head of the OP, it's now a project "because you can" rather than having a real need that offsets the investment in time.

One problem with your 9000 suggestion: I don't have one. If I did, I'm sure I'd be doing what you suggest. If I saw one for a bargain price I might well do that, but they're at a totally different price point to the 54830 series, unless of course you can point to somewhere where they're on offer for about £2k.
 

Offline Wuerstchenhund

  • Super Contributor
  • ***
  • Posts: 3088
  • Country: gb
  • Able to drop by occasionally only
Re: Agilent 54831D modernising
« Reply #5 on: November 08, 2016, 08:59:14 am »
While I realise boot time has no importance to you, in this instance the unit boots to the scope screen in under 20s, down from about 2 minutes.

Well, I got from almost 2mins to <35s just by replacing the slow 5400rpm laptop hard drive in my DSO8064A with an SSD. But then, the Intel board in that scope already comes with SATA, so it's already reasonably fast, and I appreciate that this isn't the case for the old Socket 7 and Socket 370 boards in the Infiniium 54800 Series. However, a cheap SiL SATA PCI adapter + SSD should solve that problem for the 54800 Series (I know that some of the cheap PCI SATA cards have problems booting in these old boards, especially those with a Marvell controller).

Quote
My workflow doesn't extend to having a scope switched on from 9am to 5pm, with a half hour break for coffee while I wait for it to warm up, I switch it on as I need it and accept that there might be some minor cal aberrations during the warm up time.

At home I also don't have my scopes switched on all the time but only when I need them. I still don't care if booting requires 30s or 1min, as I usually use the boot time to prepare the required probing.

Quote
There is also the possibility from early indications that with the improved processing power and integrated GPU it introduces intensity grading, something not available on the original 54830 series. That's before measuring things like decode performance, the main reason for starting this.

Intensity grading is available on the DSO80/80k Series because it doesn't rely on that ancient C&T graphics card and hardware overlay, and I doubt that the 54800 can be made to work reliably with software overlay, at least not without modifying the scope app (which will expect either hardware overlay on the C&T graphics card for a 54800 Series scope, or real DSO8k hardware for a DSO8k).

Quote
One problem with your 9000 suggestion, I don't have one, if I did I am sure I'd be doing what you suggest. If I saw one for a bargain price I might well do that, but they're in a totally different price point to the 54830 series, unless of course you can point to somewhere where they're on offer for about £2k.

You don't need to buy a DSO9k scope to download the DSO9k software. From a quick look I assume the scope application should work on the older Infiniiums (well, at least on the DSO8k, which is much closer to the DSO9k than to the old 54800 Series), however the software download lacks the hardware drivers for the older scopes.
« Last Edit: November 08, 2016, 09:32:27 am by Wuerstchenhund »
 

Offline Wuerstchenhund

  • Super Contributor
  • ***
  • Posts: 3088
  • Country: gb
  • Able to drop by occasionally only
Re: Agilent 54831D modernising
« Reply #6 on: November 08, 2016, 09:21:15 am »
I wonder why Agilent designed this the way it is.

Simple, because when the Infiniium 54800 Series was designed back in the '90s hardware overlay was the only way to get live video onto a computer monitor without requiring specialized hardware or fast processors. Don't forget the Infiniium 54800 Series came out in late 1997, with early versions running Windows 95 (subsequently updated to Windows98 and later WindowsXP). Which makes the architecture more than 20 years old.

Quote
I could see another possibility - to treat oscilloscope graticules / intensity grading output similarly to a camera video stream. Back in the days of these scopes it would be, for example, Bt848 PCI video grabber/tuner. Grabs a frame and sends over PCI to the app and then the app can display that onto Win32/DirectX/OpenGL surface. It has small overhead of buffer copies, but it is much simpler and the modernization with a PCIe to PCI adapter should be possible.

That's how the Infiniium 8000, the successor to the old 54800 Series, does it, as well as subsequent scopes (i.e. the DSO9k). It's standard software overlay that pretty much any GPU made since the late '90s supports.

BTW, on LeCroy scopes, the scope app uses Direct3D to draw the waveforms on screen (no overlay).

Quote
The MSO8104A glitch sounds promising, maybe it is worth pursuing further. I could imagine (just guessing, I don't have the scope) that providing a phony DLL (that the scope app tries to use) instead of the real driver might stop the crashes.

I doubt that this helps; the DSO8k may look a lot like the 54800 Series on the outside, but on the inside it's quite different (and a lot more reliable than the 54800 Series, which was plagued by various hardware issues). The scope app probably just checks for the presence of the C&T graphics card to determine whether it is running on a 54800 Series scope or a DSO8k, and with the card missing will believe it's running on a DSO8k. It still expects to find DSO8k hardware, though.
« Last Edit: November 08, 2016, 09:32:56 am by Wuerstchenhund »
 

Online Jwalling

  • Supporter
  • ****
  • Posts: 1517
  • Country: us
  • This is work?
Re: Agilent 54831D modernising
« Reply #7 on: November 08, 2016, 10:32:08 am »
Somebody has already done an O/S upgrade on one of these:
https://www.eevblog.com/forum/buysellwanted/%28urgent%29-agilent-54831d-mso-oscilloscope-4a16dig-new-motherboard-windows-7/

A pity he didn't share his experience.




Jay

System error. Strike any user to continue.
 

Offline lukier

  • Supporter
  • ****
  • Posts: 634
  • Country: pl
    • Homepage
Re: Agilent 54831D modernising
« Reply #8 on: November 08, 2016, 11:11:35 am »
Simple, because when the Infiniium 54800 Series was designed back in the '90s hardware overlay was the only way to get live video onto a computer monitor without requiring specialized hardware or fast processors. Don't forget the Infiniium 54800 Series came out in late 1997, with early versions running Windows 95 (subsequently updated to Windows98 and later WindowsXP). Which makes the architecture more than 20 years old.

1997 is two years after Windows 95, which introduced DirectX (and AFAIR OpenGL software rendering shipped with Win95 as well; NT had OpenGL even before DirectX, as NT used to be multi-platform and MS had a cooperation with SGI). The Bt848 TV tuner chipset also came out in 1997, so transferring full PAL video and displaying it live on the PC screen was definitely possible in 1997 (I had a Bt878 tuner a year or two later, on a P120MMX).

PCI bandwidth is 133 MB/s, which is quite a bit: at the 54800's screen resolution of 640x480 you could stream RGB video at around 150 fps, so there is plenty of bandwidth. The only issue is blitting the buffers onto the screen.
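As a quick sanity check of those numbers (Python; assumes uncompressed 24-bit RGB and the theoretical 133 MB/s peak, ignoring any bus overhead):

```python
# Raw frame size for 640x480 24-bit RGB, and the frame rate that the
# theoretical 133 MB/s PCI peak would allow for uncompressed streaming.
WIDTH, HEIGHT, BYTES_PER_PIXEL = 640, 480, 3
PCI_BW = 133e6  # bytes/s, theoretical 32-bit/33MHz PCI peak

frame_bytes = WIDTH * HEIGHT * BYTES_PER_PIXEL
fps = PCI_BW / frame_bytes
print(f"{frame_bytes} bytes/frame -> {fps:.0f} fps")  # 921600 bytes/frame -> 144 fps
```

So roughly 144 fps at the theoretical bus peak, close to the 150 fps quoted, before allowing for any real-world bus overhead.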
 

Offline Wuerstchenhund

  • Super Contributor
  • ***
  • Posts: 3088
  • Country: gb
  • Able to drop by occasionally only
Re: Agilent 54831D modernising
« Reply #9 on: November 08, 2016, 11:49:17 am »
1997 is two years after Windows 95, which introduced DirectX (and AFAIR OpenGL software rendering shipped with Win95 as well; NT had OpenGL even before DirectX, as NT used to be multi-platform and MS had a cooperation with SGI). The Bt848 TV tuner chipset also came out in 1997, so transferring full PAL video and displaying it live on the PC screen was definitely possible in 1997 (I had a Bt878 tuner a year or two later, on a P120MMX).

That is all well and nice, but the point you're missing is that a product like a scope isn't developed in a day, and for a scope as complex as the Infiniium 54800 Series, brought to market in 1997, development would have started closer to 1994, a time when Windows 95 was still in beta (plus DirectX didn't even make it into the first Windows 95 release but was only added later, and everything before DirectX 5 was absolute crap anyway). There was no software OpenGL in Windows 9x (OpenGL support had to come with the graphics drivers), and Windows NT would have been way too resource-hungry (plus NT up to 3.5/3.51 was pretty slow in regards to graphics, which only got better with NT 4.0, where the gfx driver moved to Ring 0, but NT 4 came out in 1996 and was still a resource hog). Not a lot of choice for HP.

The other thing is that HP, like any T&M vendor who wants to be able to support its products for a very long time, was looking for components that would be available for an extended period, something that wasn't and still isn't exactly common with PC parts. The C&T 65550 series of GPUs found wide use in various embedded applications back then, and was available on standard graphics cards, which is very likely why HP settled on it for their design.

Quote
PCI bandwidth is 133 MB/s, which is quite a bit: at the 54800's screen resolution of 640x480 you could stream 24-bit RGB video at nearly 150 fps, so there is plenty of bandwidth. The only issue is blitting the buffers onto the screen.

PCI32/33MHz bandwidth is 133MB/s only in theory; even on good modern boards the real-world BW is closer to 120MB/s, and back in the old days of the Pentium, P2 and AMD K6 the PCI throughput was closer to 70MB/s due to slow chipsets, often combined with a 25MHz bus clock instead of 33MHz.
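As a quick sanity check, here's the arithmetic in Python (a back-of-envelope sketch only: it assumes uncompressed 24-bit frames and zero bus protocol overhead, which real PCI never achieves):

```python
# Back-of-envelope: how many full 640x480 24-bit RGB frames per second
# a given sustained PCI throughput could move (no overhead assumed).

FRAME_BYTES = 640 * 480 * 3  # 921,600 bytes per 24-bit frame

def max_fps(throughput_mb_s: float) -> float:
    """Frames per second at a sustained throughput in MB/s (1 MB = 1e6 B)."""
    return throughput_mb_s * 1e6 / FRAME_BYTES

print(round(max_fps(133)))  # theoretical PCI32/33MHz: ~144 fps
print(round(max_fps(120)))  # good modern board:       ~130 fps
print(round(max_fps(70)))   # late-90s chipsets:       ~76 fps
```

So the "150 fps" figure quoted above is roughly the theoretical ceiling, and the real-world late-90s numbers land closer to half that.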

Also, for a scope an update rate of 150 updates/s isn't exactly stellar. The Infiniium 54800 Series isn't fast, but they manage a lot more than 150 updates per second. Hardware overlay meant the waveform data didn't have to go through the slow PCI bus.

I'm not a great fan of the 54800 Series (mainly due to their poor reliability), and some of the design decisions appear silly today, but the reality is that, when HP designed the 54800 Series, there wasn't a lot they could have done better. Had they waited, say, 3 years or so, the situation would have been different, but not for a scope designed when the Infiniium 54800 was. After all, the Infiniium 54800s were amongst the first Windows-based scopes (together with a Windows scope from Nicolet).

As stated before, if you want to see an implementation that makes use of DirectX then have a look at LeCroy's X-Stream scopes. It's a CPU-centric architecture (i.e. the PC CPU does all scope calculations, not proprietary ASICs as on other brands' scopes) where no overlay is involved, which produces higher update rates and makes the scope app work like any other Windows app, i.e. it is independent of the screen resolution, can be used fullscreen or windowed, freely moved around the desktop, and is independent of which display it runs on in a multi-monitor setup. But LeCroy started a lot later than HP (the first X-Stream scopes appeared in late 2001/early 2002; development started around 1999).
« Last Edit: November 08, 2016, 11:52:52 am by Wuerstchenhund »
 

Offline lukier

  • Supporter
  • ****
  • Posts: 634
  • Country: pl
    • Homepage
Re: Agilent 54831D modernising
« Reply #10 on: November 08, 2016, 12:20:36 pm »
Also, for a scope an update rate of 150 updates/s isn't exactly stellar. The Infiniium 54800 Series isn't fast, but they manage a lot more than 150 updates per second. Hardware overlay meant the waveform data didn't have to go through the slow PCI bus.

I didn't mean transferring every trigger acquisition over the PCI bus, just the rendered result produced by the hardware (e.g. what Rigol does in their FPGA: trigger, acquire, render with intensity/persistence, using fast Cypress SRAM as a framebuffer). Then just send the contents of this framebuffer over PCI, like a webcam or the previously mentioned TV tuner does; 30 fps would be perfectly fine (the LCD panel cannot display more anyway, and modern ones do 60 Hz).

With modern PCIe buses, on lower-end scopes (<15 GSa/s) it might be possible to DMA the data straight from the ADC/sample memory to, say, a GPU, process it there with CUDA/OpenCL (math, filters) and generate OpenGL output (persistence/grading etc.) in no time (GTX 1080 device memory bandwidth = 320 GB/s, 8 TFLOPS performance).
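A rough feasibility check in Python (the per-link bandwidth figures are approximate usable throughputs I'm assuming for illustration, not exact spec values, and no headroom for protocol overhead or other traffic is included):

```python
# Can a PCIe link sustain streaming raw ADC samples to a GPU?

PCIE_GB_S = {            # approximate usable per-direction bandwidth, GB/s
    "gen2 x8": 4.0,
    "gen3 x8": 7.9,
    "gen3 x16": 15.8,
}

def link_ok(sample_rate_gsa: float, bits: int, link: str) -> bool:
    """True if the link can carry the raw ADC stream (no headroom)."""
    stream_gb_s = sample_rate_gsa * bits / 8  # GSa/s * bits -> GB/s
    return PCIE_GB_S[link] >= stream_gb_s

print(link_ok(15, 8, "gen3 x16"))  # 15 GB/s stream just fits in ~15.8 GB/s
print(link_ok(15, 8, "gen2 x8"))   # nowhere near enough
```

So a 15 GSa/s 8-bit stream sits right at the edge of a full Gen3 x16 link, which is why this only looks plausible for the lower-end sample rates.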

As stated before, if you want to see an implementation that makes use of DirectX then have a look at LeCroy's X-Stream scopes. It's a CPU-centric architecture (i.e. the PC CPU does all scope calculations, not proprietary ASICs as on other brands' scopes) where no overlay is involved, which produces higher update rates and makes the scope app work like any other Windows app, i.e. it is independent of the screen resolution, can be used fullscreen or windowed, freely moved around the desktop, and is independent of which display it runs on in a multi-monitor setup. But LeCroy started a lot later than HP (the first X-Stream scopes appeared in late 2001/early 2002; development started around 1999).

I'm currently saving for a WavePro 7000(A) following your recommendations on this forum, but it will take a while, as these scopes are expensive and have to be imported from eBay US. Also, some sellers confuse the later -A models with the previous ones, marketing the old ones as the -A. From the specs the differences seem minimal (the -A adds XXL memory, AFAIR, and USB ports on the front); is that true?

I really like LeCroy's approach to PC-oscilloscope integration, as it can benefit from the advances in computing that have happened since the 7000A was released. I hope when I get one I'll be able to put in a mini-ITX Skylake board like Howardlong did, a PCIe-to-PCI adapter for LeCroy's card interfacing to the acquisition board, dxl's fixed front panel driver, and an SSD of course, and the scope will get a massive boost.

The only thing that bothers me with LeCroy is that they were late to the MSO game. I think some later models had some funny USB adapters for MSO and only very recent ones have it built in. But I have a separate big logic analyzer (Tek TLA), so MSO is not that critical.
 

Offline Wuerstchenhund

  • Super Contributor
  • ***
  • Posts: 3088
  • Country: gb
  • Able to drop by occasionally only
Re: Agilent 54831D modernising
« Reply #11 on: November 08, 2016, 01:47:24 pm »
Also, for a scope an update rate of 150 updates/s isn't exactly stellar. The Infiniium 54800 Series isn't fast, but they manage a lot more than 150 updates per second. Hardware overlay meant the waveform data didn't have to go through the slow PCI bus.

I didn't mean transferring every trigger acquisition over the PCI bus, just the rendered result produced by the hardware (e.g. what Rigol does in their FPGA: trigger, acquire, render with intensity/persistence, using fast Cypress SRAM as a framebuffer). Then just send the contents of this framebuffer over PCI, like a webcam or the previously mentioned TV tuner does; 30 fps would be perfectly fine (the LCD panel cannot display more anyway, and modern ones do 60 Hz).

Sure, today. Even that cheap FPGA-plus-SRAM combination is already a lot more capable than anything you could find as reasonably priced PC components back in 1994/1995.

Quote
With modern PCie buses on the lower end scopes (<15 GSPS) it might be possible to DMA the data straight from the ADC/sample memory let's say to a GPU and with CUDA/OpenCL process it (math, filters) and generate OpenGL output (persistance/grading etc) in no time (GTX1080 device memory bandwith = 320 GB/s, 8 TFLOPs performance).

Indeed, even for faster sample rates (e.g. 40GSa/s 8bit, or 20GSa/s 12bit) that should already be possible today. GPU processing would be ideal for waveform analysis and signal processing (and you wouldn't even need a high-end GPU for it). Sadly, scope manufacturers aren't there yet.

Quote
I'm currently saving for a WavePro 7000(A) following your recommendations on this forum, but it will take a while, as these scopes are expensive and have to be imported from eBay US.

Yes, they are. Actually the prices have increased quite a bit over the last couple of years.

Quote
Also some sellers confuse the later -A models with the previous ones, marketing the old ones as the -A. From the specs it seems the differences are minimal (-A adds XXL memory AFAIR and USB ports on the front), is that true?

Yes, XXL (96Mpts) was only available with the WP7kA, although it was introduced quite late.

Quote
I really like LeCroy's approach to PC-oscilloscope integration, as it can benefit from the advances in computing that have happened since the 7000A was released. I hope when I get one I'll be able to put in a mini-ITX Skylake board like Howardlong did, a PCIe-to-PCI adapter for LeCroy's card interfacing to the acquisition board, dxl's fixed front panel driver, and an SSD of course, and the scope will get a massive boost.

That's true. Mine is waiting for an upgrade as well, although I won't go as far as Skylake (I'll probably stick to one of the earlier Core i3s on an industrial mainboard with an LVDS port for the internal display). Should be pretty much plug & play ;)

Quote
The only thing that bothers me with LeCroy is their late to the game MSO. I think some later models had some funny USB adapters for MSO and only very recent ones have MSO. But I have a separate big logic analyzer (Tek TLA) so having MSO is not that critical.

Yes, if you want MSO then these scopes aren't a good choice (in that case I'd probably look at an Agilent MSO8000A, which is a good scope and much better than the Infiniium 54800D). LeCroy was pretty late to the MSO game; there was an MSO option (called MS-32) for these older LeCroy scopes, an external box connecting to the scope via USB, which was bought in and was, plain and simple, a piece of crap. Also, once the option is enabled on the scope, the scope will not work without the box attached. Later scopes (WRXi/WSXs and up) use different MSO boxes (MS-250 and MS-500) which connect via L-Bus and work great, but still cost an arm and a leg 2nd hand.
 

Offline HowardlongTopic starter

  • Super Contributor
  • ***
  • Posts: 5315
  • Country: gb
Re: Agilent 54831D modernising
« Reply #12 on: November 08, 2016, 01:54:18 pm »
Somebody has already done an O/S upgrade on one of these:
https://www.eevblog.com/forum/buysellwanted/%28urgent%29-agilent-54831d-mso-oscilloscope-4a16dig-new-motherboard-windows-7/

A pity he didn't share his experience.

I did contact the individual via PM. Rather than giving suggestions, he replied tersely with a bill for $300, and proceeded to tell me I hadn't tried very hard. The conversation ended there as far as I was concerned.
 

Offline HowardlongTopic starter

  • Super Contributor
  • ***
  • Posts: 5315
  • Country: gb
Re: Agilent 54831D modernising
« Reply #13 on: November 08, 2016, 02:10:48 pm »
The other thing to note about ancient graphics cards adhering roughly to the EGA/VGA standard was that they were useless for bitblt because of the layered way they were accessed through the limited memory window; they were far from a simple memory map, in fact. This particularly affected non-byte-aligned continuous-colour RMW operations, typical in bitblt.

It's been a very long time since I wrote graphics drivers for VGA, I'd estimate over 25 years ago, and I don't know how much the CT65550 depended on VGA's graphics modes and the limited memory window, but I do know VGA was indeed a right royal pain in the arse to write drivers for!

I did an experiment watching a YouTube video on the CT65550 on a Skylake board, and I estimated it ran at roughly 15fps full screen. With the VP22 motherboard it's about 2fps.
 

Online Jwalling

  • Supporter
  • ****
  • Posts: 1517
  • Country: us
  • This is work?
Re: Agilent 54831D modernising
« Reply #14 on: November 08, 2016, 02:52:44 pm »
Somebody has already done an O/S upgrade on one of these:
https://www.eevblog.com/forum/buysellwanted/%28urgent%29-agilent-54831d-mso-oscilloscope-4a16dig-new-motherboard-windows-7/

A pity he didn't share his experience.

I did contact the individual via PM. Rather than giving suggestions, he replied tersely with a bill for $300, and proceeded to tell me I hadn't tried very hard. The conversation ended there as far as I was concerned.

Shithead.  :--
Hopefully someone here will get a chance to return the favor someday, when he needs help.  >:D

EDIT: And you even helped him in the past. https://www.eevblog.com/forum/testgear/dsox2000-and-3000-series-licence-have-anyone-tried-to-hack-that-scope/msg893788/#msg893788
« Last Edit: November 08, 2016, 02:57:49 pm by Jwalling »
Jay

System error. Strike any user to continue.
 

Offline lukier

  • Supporter
  • ****
  • Posts: 634
  • Country: pl
    • Homepage
Re: Agilent 54831D modernising
« Reply #15 on: November 08, 2016, 03:13:19 pm »
Shithead.  :--

S....d indeed.

But it means that a major modernization is possible somehow.

I doubt this guy built the kind of DIY hardware solution that might be possible. In the 54830 service guide's theory of operation it says the LCD is driven by the Display Board, and the ancient graphics card (marked A11) just feeds this custom Display Board (A12). One could imagine a piece of hardware that uses a modern GPU, an LVDS-to-parallel converter and an FPGA to convert the parallel video to the format produced by the CT65550 / expected by the Display Board.

But that's a lot of work.

I suspect the guy spent a lot of time with IDA or a similar disassembler to figure out various things (including how to enable options) and somehow managed to get it running (maybe the MSO8104A glitch idea with some phony mock DLL).

Edit:
Maybe not even that:
http://www.ebay.ca/itm/Agilent-54831D-MSO-4an-16dig-wUpgraded-Motherboard-cpu-2-3-2GHz-RAM-SSD-ALL-OPTS-/291821246325?hash=item43f1e6d775:g:OKgAAOSwARZXi-PL

The Asus P5Q-VM DO is not that modern; it has 3 PCI slots, so I guess the cards stayed the same. Note also from the description:
"The scope on Windows Vista was fully functional. The scope app had problems with the graphics card on Windows 7. Some functions were not working correctly, but the OS was running fine. I didn't found a good driver for Chips&Tech 65550 (VEN_102C DEV_00E0) graphics card  for Windows 7. The newest version which was tested was 2.51c."

"I solved the issue with the scope application on Windows 7. Now it's running fine. The scope is running on Windows 7. All functions are fully functional.
The issue was virtual memory. I needed to set the size manualy. Now it's working."

Note that the system is 32-bit; apparently the old drivers crash on x64.

Meh, E6700 is OK but I wouldn't call that lightning fast, especially with the original GPU  :-\
« Last Edit: November 08, 2016, 03:26:13 pm by lukier »
 

Offline HowardlongTopic starter

  • Super Contributor
  • ***
  • Posts: 5315
  • Country: gb
Re: Agilent 54831D modernising
« Reply #16 on: November 08, 2016, 05:03:38 pm »
Shithead.  :--

S....d indeed.

But it means that a major modernization is possible somehow.

I doubt this guy built the kind of DIY hardware solution that might be possible. In the 54830 service guide's theory of operation it says the LCD is driven by the Display Board, and the ancient graphics card (marked A11) just feeds this custom Display Board (A12). One could imagine a piece of hardware that uses a modern GPU, an LVDS-to-parallel converter and an FPGA to convert the parallel video to the format produced by the CT65550 / expected by the Display Board.

But that's a lot of work.

I suspect the guy spent a lot of time with IDA or a similar disassembler to figure out various things (including how to enable options) and somehow managed to get it running (maybe the MSO8104A glitch idea with some phony mock DLL).

I am pretty sure it was another forum member that spent a lot of time with IDA ;-)

Quote
Edit:
Maybe not even that:
http://www.ebay.ca/itm/Agilent-54831D-MSO-4an-16dig-wUpgraded-Motherboard-cpu-2-3-2GHz-RAM-SSD-ALL-OPTS-/291821246325?hash=item43f1e6d775:g:OKgAAOSwARZXi-PL

That's the one. There is a slightly more recent version knocking about which, if you look closely, shows the floppy drive replaced with an IO panel and the IO shield hacked out.

Quote
Asus P5Q-VM DO is not that modern, it has 3 PCI slots so I guess the cards stayed the same. Note also from the description:
"The scope on Windows Vista was fully functional.

I have one of those mobos here and was going to install it, but it's a tight fit, and when I realised (doh) that I needed to hack an IO shield out, all bets were off I'm afraid. It would also give you a somewhat gentler move from IDE/PATA to SATA/AHCI as well as more compatible ACPI.

Quote
The scope app had problems with the graphics card on Windows 7. Some functions were not working correctly, but the OS was running fine. I didn't found a good driver for Chips&Tech 65550 (VEN_102C DEV_00E0) graphics card  for Windows 7. The newest version which was tested was 2.51c."

"I solved the issue with the scope application on Windows 7. Now it's running fine. The scope is running on Windows 7. All functions are fully functional.
The issue was virtual memory. I needed to set the size manualy. Now it's working."

I wonder if the screen update is fast? He mentions a problem with the C&T driver; that was indeed a problem I ran into when the CT65550 wasn't detected.

Quote
Note that the system is 32 bit, apparently the old drivers crash on x64.

I didn't even consider x64 a starter; there are about ten proprietary drivers to install, all 32-bit.

Quote
Meh, E6700 is OK but I wouldn't call that lightning fast, especially with the original GPU  :-\

There is a limit to the TDP draw with the internal PSU; I'd say 65W in the case of the E6700 is pushing it (the stock P3 is circa 33W), although the application is very largely single-threaded. You could install your own SFF PSU, I guess, but that would have to somehow work with the acquisition board. Although I agree it's not the fastest CPU, I am pretty sure an E6700 is a significant improvement over the P3.
 

Offline Wuerstchenhund

  • Super Contributor
  • ***
  • Posts: 3088
  • Country: gb
  • Able to drop by occasionally only
Re: Agilent 54831D modernising
« Reply #17 on: November 08, 2016, 05:19:04 pm »
Shithead.  :--

S....d indeed.

Well, he tried to sell a hacked scope, which alone says a lot about the person.  :palm:

Quote
But it means that a major modernization is possible somehow.

I doubt this guy did some DIY hardware solution that might be possible. In the 54830 service guide, theory of operation, it says the driving for the LCD is done by the Display Board, and the ancient graphics card (marked A11) just feeds this custom Display Board (A12). One could imagine some piece of hardware that uses modern GPU, LVDS to parallel converter and an FPGA to convert the parallel video to the format produced by CT65550 / expected by the Display Board.

That wouldn't help. You need a hardware mixer to overlay the waveform image onto the graphics content, and that needs to happen in the GPU. Older graphics cards (e.g. the Radeon 7000/VE and GeForce 2 MX) had a pin connector for hardware overlay, but that is gone from newer cards. And on-board graphics pretty much always lacked the hw overlay inputs.

Once you've found a card old enough to support hw overlay (which will very likely be AGP, not PCIe), you only have to connect the overlay data output to the overlay input of the graphics card. That doesn't help with performance, though, as the update rate and the amount (or lack) of intensity grading are determined by the waveform engine.

Quote
I suspect the guy spent a lot of time with IDA or similar disassembler to figure out various things (including how to enable options) and somehow managed to get it running (maybe the MSO8104A glitch idea with some phony mock-DLL).

Well, hacking the options is somewhat an open secret on this forum, so no surprise he managed that.

Quote
Edit:
Maybe not even that:
http://www.ebay.ca/itm/Agilent-54831D-MSO-4an-16dig-wUpgraded-Motherboard-cpu-2-3-2GHz-RAM-SSD-ALL-OPTS-/291821246325?hash=item43f1e6d775:g:OKgAAOSwARZXi-PL

Asus P5Q-VM DO is not that modern, it has 3 PCI slots so I guess the cards stayed the same. Note also from the description:
"The scope on Windows Vista was fully functional. The scope app had problems with the graphics card on Windows 7. Some functions were not working correctly, but the OS was running fine. I didn't found a good driver for Chips&Tech 65550 (VEN_102C DEV_00E0) graphics card  for Windows 7. The newest version which was tested was 2.51c."

Not surprising regarding the driver, considering that there are no new drivers for that ancient graphics chipset since Windows95.

Quote
"I solved the issue with the scope application on Windows 7. Now it's running fine. The scope is running on Windows 7. All functions are fully functional.
The issue was virtual memory. I needed to set the size manualy. Now it's working."

Note that the system is 32 bit, apparently the old drivers crash on x64.

64bit Windows is unlikely to work, if only because there are no 64bit drivers for the scope hardware (and the antique C&T GPU).

Quote
Meh, E6700 is OK but I wouldn't call that lightning fast, especially with the original GPU  :-\

Considering the difficulties caused by the hardware mixing of desktop and waveform display I'd say that is probably as good as it gets with these scopes.
 

Offline lukier

  • Supporter
  • ****
  • Posts: 634
  • Country: pl
    • Homepage
Re: Agilent 54831D modernising
« Reply #18 on: November 08, 2016, 05:26:56 pm »
That wouldn't help. You need a hardware mixer to overlay the waveform image onto the graphics content, and that needs to happen in the GPU. Older graphics cards (e.g. the Radeon 7000/VE and GeForce 2 MX) had a pin connector for hardware overlay, but that is gone from newer cards. And on-board graphics pretty much always lacked the hw overlay inputs.

Once you've found a card old enough to support hw overlay (which will very likely be AGP, not PCIe), you only have to connect the overlay data output to the overlay input of the graphics card. That doesn't help with performance, though, as the update rate and the amount (or lack) of intensity grading are determined by the waveform engine.

I thought (based on the diagram in the service manual) that it was the other way around: the PC generates whatever image it normally generates (Windows desktop, scope app etc.) and feeds this video (a parallel video bus?) to the "Display Board", which replaces specific pixels (RGB colour 10,9,9) with the hardware-rendered waveform. See this post:

https://www.eevblog.com/forum/testgear/who-uses-lecroy-scopes-with-xstream-software/msg450337/#msg450337

So an LVDS -> parallel -> FPGA -> "Display Board" chain might work, but probably only at 640x480 anyway. Still, a lot of work.

The LCD is also connected to the display board, not the GFX card. But as I've said, I don't have the scope, I'm just guessing based on the service manual.

Considering the difficulties caused by the hardware mixing of desktop and waveform display I'd say that is probably as good as it gets with these scopes.

I guess that too. A bit of a bummer. But even this motherboard is an improvement, with SATA, one PCIe slot (for USB 3.0, for example), plenty of USB and a somewhat faster CPU.
 

Offline Berni

  • Super Contributor
  • ***
  • Posts: 4922
  • Country: si
Re: Agilent 54831D modernising
« Reply #19 on: November 08, 2016, 05:52:18 pm »
Very interesting project you have there. I had a look at these scopes before, but reading through various documentation online it quickly became apparent that all the weird oldschool hardware would probably put up a fight. I was considering trying to score one on the cheap to upgrade into a proper modern scope.

However, I do have a DSO9000 here that I got the same idea with. I already have a small-form-factor industrial motherboard that should fit in it while supporting anything socket 1150, so up to the latest Haswell stuff. On top of that it also has an LVDS video output connector that should work with the display in it.

The process of upgrading that should be pretty smooth, as the only connection to the actual scope part is through two SATA cables that carry the signals of a PCIe 1x bus. So all that's needed to make it work is a converter PCB that turns one PCIe slot into two SATA connectors with the right pinout. The software side of things is easy since these scopes run Windows 7 from the factory (I hope they didn't 'upgrade' from Win 7 yet). On this scope I would expect a significant speed boost from a modern CPU, as all of the waveform rendering is done on the CPU. A little bit of it could be happening on the GPU since it uses DirectX, but I assume that's mostly to get things onto the screen quickly.

And yes, the scope software for Infiniium scopes installs just fine on a regular PC, where it can be used to open waveform files and look through them. The installation process apparently detects it's running on a scope, since when installing it on the actual scope it really went to town and barfed Keysight branding all over the OS, replacing the wallpapers, login screens etc.

I had the guts of the Infiniium software open in IDA Pro for other reasons, but it has probably changed way too much to be applicable to the version you run. As far as I remember, the list of scope hardware info inside the code did not include scopes this old, but it's hard to be sure since a lot of code names show up in there (the newer scopes all seem to be code-named after dinosaurs).
 

Offline Wuerstchenhund

  • Super Contributor
  • ***
  • Posts: 3088
  • Country: gb
  • Able to drop by occasionally only
Re: Agilent 54831D modernising
« Reply #20 on: November 08, 2016, 06:11:12 pm »
That wouldn't help. You need a hardware mixer to overlay the waveform image onto the graphics content, and that needs to happen in the GPU. Older graphics cards (e.g. the Radeon 7000/VE and GeForce 2 MX) had a pin connector for hardware overlay, but that is gone from newer cards. And on-board graphics pretty much always lacked the hw overlay inputs.

Once you've found a card old enough to support hw overlay (which will very likely be AGP, not PCIe), you only have to connect the overlay data output to the overlay input of the graphics card. That doesn't help with performance, though, as the update rate and the amount (or lack) of intensity grading are determined by the waveform engine.

I thought (based on the diagram in the service manual) that it was the other way around: the PC generates whatever image it normally generates (Windows desktop, scope app etc.) and feeds this video (a parallel video bus?) to the "Display Board", which replaces specific pixels (RGB colour 10,9,9) with the hardware-rendered waveform.

No, that's not the case:
https://en.wikipedia.org/wiki/Hardware_overlay

HW overlay happens in the GPU: some area of video memory is reserved for streaming data from an external device (e.g. a TV card, or the waveform engine of an Infiniium 54800 scope), and the GPU replaces ("overlays") pixels of a specific key colour with data from that reserved memory.
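The mechanism is essentially colour keying. A toy sketch of the idea (illustrative only; a real GPU does this per scanline in hardware, and the key colour (10,9,9) is the value quoted from the service manual discussion above):

```python
# Toy model of colour-key hardware overlay: wherever the desktop frame
# shows the key (stencil) colour, the pixel streamed from the external
# source (here standing in for the waveform engine) is shown instead.

KEY = (10, 9, 9)  # stencil colour, per the service manual discussion

def overlay(desktop, stream, key=KEY):
    """Replace key-coloured desktop pixels with the streamed pixels."""
    return [s if d == key else d
            for d, s in zip(desktop, stream)]

desktop = [(255, 255, 255), KEY, KEY, (0, 0, 0)]
stream  = [(1, 1, 1), (2, 2, 2), (3, 3, 3), (4, 4, 4)]
print(overlay(desktop, stream))
# -> [(255, 255, 255), (2, 2, 2), (3, 3, 3), (0, 0, 0)]
```

This is also why painting the UI in the stencil colour (the trick mentioned below) makes the overlay image show through it.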

Quote
See this post:

https://www.eevblog.com/forum/testgear/who-uses-lecroy-scopes-with-xstream-software/msg450337/#msg450337

Lukas' post just confirms that. Because it all happens in the GPU, the trick he describes of giving the UI the stencil colour (which results in parts of the UI being replaced by the overlay image) can work.

Quote
So an LVDS -> Parallel -> FPGA -> "Display Board" might work, but probably 640x480 only anyway.

No, it wouldn't. You need a GPU with hardware overlay support (which pretty much means 10+ year old AGP or PCI cards), and then to convert the output to LVDS for the LCD panel.

Quote
LCD is also connected to the display board, not the GFX card.

Very likely just to convert the digital interface of the antique C&T 65550 to LVDS for the TFT panel.

But as I've said, I don't have the scope, just guessing based on the service manual.

Quote
Considering the difficulties caused by the hardware mixing of desktop and waveform display I'd say that is probably as good as it gets with these scopes.

I guess that too. A bit of a bummer. But even this motherboard is an improvement, with SATA, one PCIe slot (for USB 3.0, for example), plenty of USB and a somewhat faster CPU.

But is it really worth it when the scope performance is not improved? I'm not sure I could be bothered to spend the time and money for so little improvement, at least if I had a fully working scope. On a scope with a defective mainboard, maybe (but even then I'd probably just look for a cheap drop-in replacement).
 

Online Jwalling

  • Supporter
  • ****
  • Posts: 1517
  • Country: us
  • This is work?
Re: Agilent 54831D modernising
« Reply #21 on: November 09, 2016, 11:02:12 am »

Not surprising regarding the driver, considering that there are no new drivers for that ancient graphics chipset since Windows95.


Those scopes came with Windows XP, unless they were sold to the military.
Jay

System error. Strike any user to continue.
 

Offline Wuerstchenhund

  • Super Contributor
  • ***
  • Posts: 3088
  • Country: gb
  • Able to drop by occasionally only
Re: Agilent 54831D modernising
« Reply #22 on: November 09, 2016, 11:12:31 am »

Not surprising regarding the driver, considering that there are no new drivers for that ancient graphics chipset since Windows95.


Those scopes came with Windows XP, unless they were sold to the military.

That is not true. The original Infiniiums were sold with Windows 95 and shortly afterwards Windows 98 on the general market (there was no Windows XP in 1998, and there weren't any special military versions of any Infiniium scope). The move to Windows XP only came later (around 2002), in an attempt to move away from Windows 9x, which also brought improved stability.

The WindowsXP driver for the C&T 65550 is pretty much a repackaged NT4 driver.
 

Online Jwalling

  • Supporter
  • ****
  • Posts: 1517
  • Country: us
  • This is work?
Re: Agilent 54831D modernising
« Reply #23 on: November 09, 2016, 12:00:19 pm »

Not surprising regarding the driver, considering that there are no new drivers for that ancient graphics chipset since Windows95.


Those scopes came with Windows XP, unless they were sold to the military.

That is not true. The original Infiniiums were sold with Windows 95 and shortly afterwards Windows 98 on the general market (there was no Windows XP in 1998, and there weren't any special military versions of any Infiniium scope).

Regarding Windows, that's true for the first-generation 54810A, 15A, 20A, 25A, 35A and 45A. The N versions were for the military (US Navy), but there was no hardware difference. The N versions could not come from Malaysia, only the US or Singapore. However, on the second-generation scopes (54831B etc.), the M versions for the military (US Marines) came with Windows 98, while the rest shipped with Windows XP. Once again, the military versions could not come from Malaysia.


The move to WindowsXP only came later (around 2002) in an attempt to move away from Windows9x, which also brought improved stability.

No argument there!

The WindowsXP driver for the C&T 65550 is pretty much a repackaged NT4 driver.

That I'm not so sure about...
Jay

System error. Strike any user to continue.
 

Offline HowardlongTopic starter

  • Super Contributor
  • ***
  • Posts: 5315
  • Country: gb
Re: Agilent 54831D modernising
« Reply #24 on: November 09, 2016, 12:06:11 pm »
FWIW, although of limited value, here is some documentation on the CT65550. I'm not suggesting trying to hack this; it's just for idle interest: it's almost 20 years old!

http://vesta.informatik.rwth-aachen.de/ftp/pub/Linux/suse/people/eich/.forMats/010177007.pdf

http://www.chinaeds.com/zl/Laoli%5CC%5CChips%5Cpo52_4.pdf
 

