In my experience Altium really doesn't like sleep or hibernation. I almost never turn off my PC (except a couple of times a year for cleaning, or occasionally for a component installation/removal); I just let it sleep/hibernate, and whenever it wakes up there is close to a 50/50 chance that Altium will be messed up and need a restart. OrCAD PCB Editor, by contrast, often stayed open for over a month through sleep and hibernation and never had any problems. None of the issues I had with Altium were caused by the PC itself, and Windows is obviously not the problem either, since OrCAD manages to run just fine for a very long time.
I don't think it is hibernation; I think it is switching video cards and/or outputs. I have put my desktop into hibernation for months without a restart, no issues. I've got a laptop with two video cards, and doing that causes constant crashes and errors.
That is a hardware/driver issue. If one monitor is on the embedded graphics and the other is on the discrete GPU, it does indeed cause problems. If you hibernate and close the lid (but keep running an external display through a docking station), the system is thrown from one graphics driver to another. If the two drivers don't support the same feature set in terms of hardware acceleration, your application can crash (and not only Altium...).
When the application starts, it interrogates the hardware it runs on and decides to use one GPU library. At startup the JIT engine 'compiles' this library in and the application starts. If you now all of a sudden switch graphics hardware, your program is using a graphics library that was not made for that hardware! Ideally this should still work if you are using only the basic, common functions of a graphics card, but once you use the accelerators, all bets are off. "Here, run this heavily optimized code for an ATI card (using ATI-specific libraries) on an Nvidia" simply doesn't work. You cannot expect an application to dynamically switch hardware configurations; it is an unreasonable scenario. For removable hardware like USB-to-Ethernet dongles the OS has a mechanism, but a graphics card is not a 'removable' device.
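The probe-once-at-startup behavior described above can be sketched roughly like this. This is a toy illustration only; the function and backend names are invented for the example and are not Altium's (or any driver's) real internals:

```python
# Toy model: an app probes the GPU once at startup, caches a vendor-specific
# acceleration path, and then assumes the hardware never changes underneath it.

def probe_gpu():
    """Stand-in for querying the active graphics adapter at startup
    (e.g. reading a renderer/vendor string from the driver)."""
    return "nvidia"

def select_backend(vendor):
    """Pick an acceleration path once, based on the probed vendor."""
    backends = {
        "nvidia": "nv_accelerated_path",
        "amd": "amd_accelerated_path",
    }
    # Unknown hardware falls back to the basic, common software path.
    return backends.get(vendor, "generic_software_path")

# The decision is made once and cached for the entire session.
BACKEND = select_backend(probe_gpu())

def draw(active_vendor):
    """If the OS silently swaps the GPU (hibernate, dock/undock), the cached
    backend no longer matches the hardware actually present."""
    if BACKEND != "generic_software_path" and BACKEND != select_backend(active_vendor):
        raise RuntimeError(f"backend {BACKEND!r} running on {active_vendor!r} hardware")
    return f"frame rendered via {BACKEND}"
```

The point of the sketch: nothing re-runs `probe_gpu()` mid-session, so a hibernate/resume that lands the window on different graphics hardware leaves the app calling an acceleration path built for a card that is no longer there.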
If you run Altium on such a system and drag the window from one screen to another (or park it straddling the two), you will only be using the common Windows graphics library; no acceleration will be started. If you start Altium on the real GPU, acceleration launches; drag it to the other screen and acceleration ceases for the remainder of the session. Do this from the OS side (hibernation, turning the graphics card off) without signalling the application, and all bets are off.
That's why, on the ZBook I use, the embedded graphics is turned off; it is disabled in the BIOS, so only the Quadro shows up. Two monitors are on the dedicated DisplayPorts (docking station), and one is on the Thunderbolt port, which switches to Mini DisplayPort mode.
I've seen all of the issues mentioned here and, like I said before, my standard answer is: get a real computer. In the sense: get a workstation-grade machine with fixed hardware and none of these 'dynamically changing' or 'emulated' (Wine, Parallels, Boot Camp) environments. You are running CAD software, not video games. I've had engineers mucking around with Altium on a 13-inch MacBook... complaining it is slow. Well, duh -facepalm-. Besides, you have to be insane to run it on a 13-inch screen. Back in the DOS days the standard was already 14 inch.
What's next? Altium for iPhone? If we are going that route, I also want it on flip phones with 320x240 screens like the Motorola RAZR (it has a second 64x64 display: dual screen! Schematic on one, PCB on the other). (No, sorry, we will not support BlackBerry, don't ask. Eagle already has that market.)
How about the Nokia 3310? That certainly needs to be supported! Such a ubiquitous platform!
Maybe we should run it on HD44780 'graphics cards'. That's a known standard that's been around for ages. Let's go for the 2x8 character configuration; they are cheap on eBay, less than a dollar each. Let's hook up a few hundred of them to an Arduino and see if we can make it run Altium. We already did Doom, so how hard can it be?