Ok, I plugged the DP-to-HDMI adapter into the computer's DP port, with the HDMI cable running straight to the second monitor. Still nothing, even though the computer detected it. I went through all the resolution and refresh-rate settings that I knew worked before, back when I used the old computer with its HDMI output. Still a blank screen.
Guess how I eventually got it to work…..
I swapped the displays around (in display settings) between primary and secondary… made the DP-HDMI monitor primary and the VGA-connected monitor secondary (before, the VGA-connected monitor was primary and the DP one was secondary). Voila! It worked!!!! Then I switched them BACK…. Guess what… They STILL worked!!!!
I haven’t tried the HDMI-over-Ethernet boxes yet [EDIT: Now I have, and they work fine], but now that I’ve seen the symptoms I’m pretty sure everything will be good. It seems I needed to swap primary/secondary in Windows to “trigger” something in the software, because when I switched them back to the original configuration it still worked. Strange or what?
Maybe it’s a programming error in Windows or the Intel display drivers. Something has to initialize the output, and maybe that only happens when you make the other monitor primary; after that first time the system is set up properly, so you can switch back and forth (swap primary/secondary as many times as you want) and it will be fine.
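For anyone who hits the same thing and would rather script the kick than click through Settings every time, here’s a minimal sketch. It rests on my guess above (an assumption, not confirmed) that the swap works because it forces Windows to re-apply the display topology; SetDisplayConfig with SDC_TOPOLOGY_EXTEND is a real, documented Win32 call and is the same operation DisplaySwitch.exe /extend performs. The file name is just whatever you want to call it.

    /* Re-apply the "extend" display topology.
     * Assumption: the primary/secondary swap fixes things because Windows
     * re-applies the topology, so doing that directly may have the same
     * effect. SetDisplayConfig(... SDC_TOPOLOGY_EXTEND | SDC_APPLY) is
     * what DisplaySwitch.exe /extend does under the hood.
     * Build (MSVC): cl kick_display.c user32.lib
     */
    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        /* Topology flags require NULL path/mode arrays and zero counts. */
        LONG rc = SetDisplayConfig(0, NULL, 0, NULL,
                                   SDC_TOPOLOGY_EXTEND | SDC_APPLY);
        if (rc != ERROR_SUCCESS) {
            fprintf(stderr, "SetDisplayConfig failed with error %ld\n", rc);
            return 1;
        }
        printf("Extend topology re-applied.\n");
        return 0;
    }

Or skip the compiler entirely and run DisplaySwitch.exe /extend from a Run box. Whether either of these actually re-initializes the output the same way the manual swap did is untested on my setup.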
I can see how this stupidity could waste a lot of time and money if someone chased it down the rabbit hole. I ended up returning the active DP-HDMI adapters for passive ones (which were cheaper anyway), but this could have gone sideways really fast… buying discrete video cards, rewiring the cabling to the ceiling monitor, etc… wow, all over a software bug that I discovered by accident?!?