Author Topic: [MOVED] Hacking NVidia Cards into their Professional Counterparts  (Read 1218301 times)


Offline Soulnight

  • Contributor
  • Posts: 28
Re: Passive 3D with dual projection and Nvidia 3D vision
« Reply #450 on: July 21, 2013, 11:21:12 am »
The real Quadro K5000 is not good enough for 3D games (and is expensive), and I really think that a modded GTX 680 (or GTX 670) is the perfect solution to get the functionality of the Quadro AND the 3D gaming performance of the GTX 680!

How exactly do you figure a K5000 isn't good enough when a GTX 680 is? Their specs are nearly identical.

How? Pretty simple, with comparisons:

http://www.xbitlabs.com/articles/graphics/display/nvidia-quadro-k5000_8.html#sect0

http://www.videocardbenchmark.net/gpu.php?gpu=Quadro+K5000

The GTX 680 is about 1.5× faster than the K5000 for games... ;)

And here are benchmarks of a GTX 680 modded into a K5000... Performance doesn't change with the mod (which is good for me and bad for others...).
https://www.eevblog.com/forum/projects/hacking-nvidia-cards-into-their-professional-counterparts/msg210155/#msg210155
« Last Edit: July 21, 2013, 11:49:13 am by Soulnight »
 

Offline Jager

  • Contributor
  • Posts: 30
Re: Passive 3D with dual projection and Nvidia 3D vision
« Reply #451 on: July 21, 2013, 01:40:22 pm »
The real Quadro K5000 is not good enough for 3D games (and is expensive), and I really think that a modded GTX 680 (or GTX 670) is the perfect solution to get the functionality of the Quadro AND the 3D gaming performance of the GTX 680!

How exactly do you figure a K5000 isn't good enough when a GTX 680 is? Their specs are nearly identical.

I am pretty sure that a Quadro card gives frame-synced dual output from one card...

I'm pretty sure all Nvidia cards do, going back at least to the 8xxx series (I am running an IBM T221 DG1 off an 8800GT with two DVI outputs, and that monitor supposedly requires genlocked inputs).
Vsync is a subtly different issue, and again, I'm pretty sure all Nvidia cards do run vsynced across separate outputs if they are told to run multiple screens off a single frame buffer. Again, on a T221 DG1, ATI cards (a 4850; I cannot use 5xxx+ cards since they only come with a single DL-DVI output) produce tearing along the middle of the screen (genlocked but not vsynced), but Nvidia cards (tried with an 8800GT, various 4xx, 580 and 680 cards, quadrified and vanilla, and a Quadro 2000) do not produce the tearing artifact - which implies they all run multiple screens vsynced. So provided you configure your setup correctly, it should work just fine.
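For the Linux side of this (gordan mentions running Linux as well later in the thread), the "multiple screens off a single frame buffer" arrangement is roughly what the proprietary nvidia X driver calls TwinView. A minimal xorg.conf sketch - the identifier and output names below are made up, and option spellings vary between driver generations, so treat this as illustrative only:

```
Section "Device"
    Identifier "nvidia0"                  # hypothetical identifier
    Driver     "nvidia"
    Option     "TwinView" "true"          # one frame buffer, one vsync, two outputs
    Option     "MetaModes" "DFP-0: 1920x2400 +0+0, DFP-1: 1920x2400 +1920+0"
EndSection
```

With both halves scanned out of the same frame buffer there is a single vblank, which is why no tearing appears along the seam.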

Plus, I would like to go SLI with two GTX 680s modded into Quadro K5000s. But I've read that Quadros only support SLI in certain certified "complete workstations" from Dell etc... However, it may still be possible to do it, since they won't be true K5000s.

I didn't think Quadro cards came with SLI bridge ports. My Quadro 2000 certainly doesn't.

One thing is for sure, though, two separate cards won't be providing genlocked output, which means they won't be vsynced either.

Have you done any gaming with that monitor? What's certain is that dual-projection stereo 3D is out of sync on the GeForce line when using TriDef (up to one frame out of sync), and GeForce doesn't even support dual-output S3D via 3D Vision, while Quadros do, and there it's synced. Nvidia offers no 2-screen Surround that provides a synced signal, so TriDef can't be used in SBS mode. Maybe the GeForce drivers enable sync when an IBM T221 DG1 is connected, like they are going to do with those Sharp-based 4K monitors (I think the Asus one is already supported).

The K5000 is capable of synced dual projection with up to 2-way SLI without needing a Quadro Sync card; it can output a synced Mosaic across up to 8 displays. >> http://www.nvidia.com/object/quadro-scalable-visualization-solutions.html

AMD Eyefinity puts out a synced signal for expanded screens in fullscreen apps (3840x1080, for example).

Edit: corrected link
« Last Edit: July 21, 2013, 01:46:28 pm by Jager »
 

Offline gordan

  • Frequent Contributor
  • **
  • Posts: 277
  • Country: 00
    • Tech Articles
Re: Hacking NVidia Cards into their Professional Counterparts
« Reply #452 on: July 21, 2013, 05:19:28 pm »
How? Pretty simple, with comparisons:

I wouldn't be so quick to disregard the K5000's gaming performance. The only thing the 680 has on its side is higher clock speeds, but they are not _that_ much higher. And the GPU core is the same.

Have you done some gaming with that monitor?

Yes - I'm running Crysis in a VM on a quadrified GTX480 (Quadro 6000) on my 2nd T221, and there is no tearing whatsoever. Nor was there any tearing on my other T221 with the 8800GT card, but there is massive tearing with the Radeon 4850.

What's certain is that dual-projection stereo 3D is out of sync on the GeForce line when using TriDef (up to one frame out of sync), and GeForce doesn't even support dual-output S3D via 3D Vision, while Quadros do, and there it's synced. Nvidia offers no 2-screen Surround that provides a synced signal, so TriDef can't be used in SBS mode. Maybe the GeForce drivers enable sync when an IBM T221 DG1 is connected, like they are going to do with those Sharp-based 4K monitors (I think the Asus one is already supported).

I suspect this is a different use case. The difference is that in my setup I am using a single large frame buffer to produce one large desktop across two separate "screens" (which just happen to form a single TFT panel).
In the case of 3D stuff, the frame buffer is probably not the same. Can you configure it as one large "desktop" with one set of 3D images being displayed on columns 1-1920 and the other set on columns 1921-3840?
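To make the column split described above concrete: if the driver exposes one 3840x2400 frame buffer, side-by-side passive stereo just means rendering the left-eye image into the first 1920 columns and the right-eye image into the last 1920. A minimal pure-Python sketch of that layout (illustrative only - this is not how the driver actually composites):

```python
# One scanline of a 3840-wide spanned desktop, split into two eye views.
W = 3840
HALF = W // 2  # 1920 columns per "screen"/projector

def compose_sbs(left_pixel, right_pixel):
    """Build one scanline: left-eye image in columns 0..1919,
    right-eye image in columns 1920..3839."""
    return [left_pixel] * HALF + [right_pixel] * HALF

row = compose_sbs("L", "R")
# Each projector then scans out its own half of the buffer.
```

Since both halves come out of the same buffer on the same vblank, the two projector feeds stay genlocked by construction.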

The K5000 is capable of synced dual projection with up to 2-way SLI without needing a Quadro Sync card; it can output a synced Mosaic across up to 8 displays. >> http://www.nvidia.com/object/quadro-scalable-visualization-solutions.html

The only way I can see you getting synced and genlocked frames is by all outputs to screens coming off a single card, i.e. traditional SLI. That allows you to do processing on two cards but the monitor signal all comes out of a single frame buffer and a single vsync. Any more than that requires an external genlock device (QuadroSync mentioned on that page is a separate hardware device).

AMD Eyefinity puts out a synced signal for expanded screens in fullscreen apps (3840x1080, for example).

I wouldn't want to even remotely take a chance on an ATI card for the next decade - too much ongoing, persistent disappointment in pretty much every aspect of both drivers and hardware over the past 3 years. It's going to take an enormous improvement before I would deem their products fit for any non-trivial purpose (i.e. single card with a single screen).
« Last Edit: July 21, 2013, 05:21:33 pm by gordan »
 

Offline Soulnight

  • Contributor
  • Posts: 28
Re: Hacking NVidia Cards into their Professional Counterparts
« Reply #453 on: July 21, 2013, 05:30:37 pm »
How? Pretty simple, with comparisons:

I wouldn't be so quick to disregard the K5000's gaming performance. The only thing the 680 has on its side is higher clock speeds, but they are not _that_ much higher. And the GPU core is the same.


Higher clock speeds and better power delivery... and €350 for the GTX 680 against €1500 for the K5000.
For gaming purposes, given that the modded GTX 680 has the Quadro functionality, I do not see any advantage to the real K5000... ^-^
 

Offline gordan

  • Frequent Contributor
  • **
  • Posts: 277
  • Country: 00
    • Tech Articles
Re: Hacking NVidia Cards into their Professional Counterparts
« Reply #454 on: July 21, 2013, 06:06:12 pm »
How? Pretty simple, with comparisons:

I wouldn't be so quick to disregard the K5000's gaming performance. The only thing the 680 has on its side is higher clock speeds, but they are not _that_ much higher. And the GPU core is the same.


Higher clock speeds and better power delivery... and €350 for the GTX 680 against €1500 for the K5000.
For gaming purposes, given that the modded GTX 680 has the Quadro functionality, I do not see any advantage to the real K5000... ^-^

Depends on what you want to use it for. The evidence thus far shows that at least some of the GL functionality seems to be disabled at the hardware level (certain GL primitives are missing; see the dump of glxinfo for a quadrified GTS450 vs. a real Quadro 2000 a few pages back).
 

Offline Jager

  • Contributor
  • Posts: 30
Re: Hacking NVidia Cards into their Professional Counterparts
« Reply #455 on: July 21, 2013, 06:26:33 pm »
I suspect this is a different use case. The difference is that in my setup I am using a single large frame buffer to produce one large desktop across two separate "screens" (which just happen to form a single TFT panel).
In the case of 3D stuff, the frame buffer is probably not the same. Can you configure it as one large "desktop" with one set of 3D images being displayed on columns 1-1920 and the other set on columns 1921-3840?

There is no way to do a single wide "desktop" across two monitors and use it for gaming. That is where Nvidia Surround is needed, and it does not support 2 monitors. I can of course do a wide desktop, or clone one to the other. Sharp-based 4K monitors use the same large-frame trick (1920x2160+1920x2160) as your monitor to achieve 4K at 60Hz, so maybe there is a driver-enabled "mosaic/surround" mode for your setup too. The Sharp 4K is not supported yet and therefore doesn't work, but the ASUS (a rebranded Sharp) works with the latest drivers, so the driver looks for a connected ASUS and enables "mosaic/surround" support for it.

The only way I can see you getting synced and genlocked frames is by all outputs to screens coming off a single card, i.e. traditional SLI. That allows you to do processing on two cards but the monitor signal all comes out of a single frame buffer and a single vsync. Any more than that requires an external genlock device (QuadroSync mentioned on that page is a separate hardware device).

The link says otherwise: with the K5000 you can do synced Mosaic across up to 8 screens with SLI OR with Quadro Sync. You can also use (a limited set of) outputs from 2 cards when doing Surround with GeForces (600 series and up), and Surround is always synced.
 

Offline Soulnight

  • Contributor
  • Posts: 28
Re: Passive 3D with dual projection and Nvidia 3D vision
« Reply #456 on: July 21, 2013, 06:31:01 pm »
Depends on what you want to use it for. The evidence thus far shows that at least some of the GL functionality seems to be disabled at the hardware level (certain GL primitives are missing; see the dump of glxinfo for a quadrified GTS450 vs. a real Quadro 2000 a few pages back).

As I already said, I need this Quadro function:

Hello everybody,

I'm new on this forum and I would like to thank everybody (and especially gnif) for the GeForce GTX 680 to Quadro K5000 mod.

I'm a gamer. I don't want more performance, just one more (Quadro) functionality.

I play games in 3D with the help of Nvidia 3D Vision on my monitor (input signal: 1080p 3D at 120Hz).
I would like to do the same but with passive 3D dual projection. I therefore need the Quadro K5000 option that enables passive stereo with 2 projectors connected to the graphics card. Here is the link to the procedure for setting up passive stereo with a Quadro.

http://nvidia.custhelp.com/app/answers/detail/a_id/3012/~/how-to-configure-passive-or-dual-pipe-stereo-with-quadro-cards-in-windows-7.

Could someone please confirm that this option is available on a GTX 680 modded into a Quadro K5000?

Thank you,
Soulnight  ;)
 

Offline Jager

  • Contributor
  • Posts: 30
Re: Hacking NVidia Cards into their Professional Counterparts
« Reply #457 on: July 21, 2013, 06:39:55 pm »
Looks like the 670 PCB and the K5000's are exactly the same. The components are the same, etc. However, the 3D Vision Pro connector is missing. I wonder if soldering one on would give that option too... The other difference is that the K5000 has its power leads soldered to the board, while the 670 has connectors soldered on instead.
 

Offline Soulnight

  • Contributor
  • Posts: 28
Re: Hacking NVidia Cards into their Professional Counterparts
« Reply #458 on: July 21, 2013, 06:42:41 pm »
Looks like the 670 PCB and the K5000's are exactly the same. The components are the same, etc. However, the 3D Vision Pro connector is missing. I wonder if soldering one on would give that option too... The other difference is that the K5000 has its power leads soldered to the board, while the 670 has connectors soldered on instead.

Yes, but what for? 3D Vision Pro is for use with professional applications, and those do need the performance of a true K5000. ;)
 

Offline Jager

  • Contributor
  • Posts: 30
Re: Hacking NVidia Cards into their Professional Counterparts
« Reply #459 on: July 21, 2013, 06:53:59 pm »
Looks like the 670 PCB and the K5000's are exactly the same. The components are the same, etc. However, the 3D Vision Pro connector is missing. I wonder if soldering one on would give that option too... The other difference is that the K5000 has its power leads soldered to the board, while the 670 has connectors soldered on instead.

Yes, but what for? 3D Vision Pro is for use with professional applications, and those do need the performance of a true K5000. ;)

That's true, I don't need it, but someone might like to use RF glasses. It should work with any 3D Vision compatible device as well, and can be used for gaming like normal 3D Vision. IR was not good for my setup (flashes etc., and it needed a looong USB cable), but I'm not using it anymore because the W1070 is DLP-Link, and of course the goal is passive 3D...
 

Offline Soulnight

  • Contributor
  • Posts: 28
Re: Hacking NVidia Cards into their Professional Counterparts
« Reply #460 on: July 21, 2013, 07:02:56 pm »
Looks like the 670 PCB and the K5000's are exactly the same. The components are the same, etc. However, the 3D Vision Pro connector is missing. I wonder if soldering one on would give that option too... The other difference is that the K5000 has its power leads soldered to the board, while the 670 has connectors soldered on instead.

Yes, but what for? 3D Vision Pro is for use with professional applications, and those do need the performance of a true K5000. ;)

That's true, I don't need it, but someone might like to use RF glasses. It should work with any 3D Vision compatible device as well, and can be used for gaming like normal 3D Vision. IR was not good for my setup (flashes etc., and it needed a looong USB cable), but I'm not using it anymore because the W1070 is DLP-Link, and of course the goal is passive 3D...

I'm sorry to tell you this, but the BenQ W1070 (a great projector) is not so good for the Omega filters. It's a DLP: great! But the throw ratio is not long enough (max 1.5:1 at full zoom) and you will have color uniformity problems with it. Motormann advises a throw ratio of minimum 1.5:1 and ideally 2:1.

Source: http://www.avsforum.com/t/1407101/official-omega-3d-passive-projection-system-thread/150#post_22927789

I think that if you cannot max out the zoom with your W1070, you should buy 2 cheap H9500s (2D), which have vertical and horizontal lens shift and are altogether better than the W1070.
 

Offline Jager

  • Contributor
  • Posts: 30
Re: Hacking NVidia Cards into their Professional Counterparts
« Reply #461 on: July 21, 2013, 07:14:31 pm »
Looks like the 670 PCB and the K5000's are exactly the same. The components are the same, etc. However, the 3D Vision Pro connector is missing. I wonder if soldering one on would give that option too... The other difference is that the K5000 has its power leads soldered to the board, while the 670 has connectors soldered on instead.

Yes, but what for? 3D Vision Pro is for use with professional applications, and those do need the performance of a true K5000. ;)

That's true, I don't need it, but someone might like to use RF glasses. It should work with any 3D Vision compatible device as well, and can be used for gaming like normal 3D Vision. IR was not good for my setup (flashes etc., and it needed a looong USB cable), but I'm not using it anymore because the W1070 is DLP-Link, and of course the goal is passive 3D...

I'm sorry to tell you this, but the BenQ W1070 (a great projector) is not so good for the Omega filters. It's a DLP: great! But the throw ratio is not long enough (max 1.5:1 at full zoom) and you will have color uniformity problems with it. Motormann advises a throw ratio of minimum 1.5:1 and ideally 2:1.

Source: http://www.avsforum.com/t/1407101/official-omega-3d-passive-projection-system-thread/150#post_22927789

I think that if you cannot max out the zoom with your W1070, you should buy 2 cheap H9500s (2D), which have vertical and horizontal lens shift and are altogether better than the W1070.

No problem maxing out the zoom; it's such a short throw anyway. I will mod the filters inside in the future too, so that's not an issue. Lens shift is a bit of an issue, but it's just enough. Room temperature is another :) but this is all for another thread...
 

Offline Soulnight

  • Contributor
  • Posts: 28
Re: Hacking NVidia Cards into their Professional Counterparts
« Reply #462 on: July 21, 2013, 08:48:04 pm »
@ Jager: we should talk about filters and projectors on the appropriate forum on avs. See you there!
And I'm really eager to read your modding results!  :D
 

Offline gordan

  • Frequent Contributor
  • **
  • Posts: 277
  • Country: 00
    • Tech Articles
Re: Hacking NVidia Cards into their Professional Counterparts
« Reply #463 on: July 21, 2013, 09:39:38 pm »
I suspect this is a different use case. The difference is that in my setup I am using a single large frame buffer to produce one large desktop across two separate "screens" (which just happen to form a single TFT panel).
In the case of 3D stuff, the frame buffer is probably not the same. Can you configure it as one large "desktop" with one set of 3D images being displayed on columns 1-1920 and the other set on columns 1921-3840?

There is no way to do a single wide "desktop" across two monitors and use it for gaming. That is where Nvidia Surround is needed, and it does not support 2 monitors. I can of course do a wide desktop, or clone one to the other.

I can tell you right now that you are completely wrong. I am running XP x64 with a T221 which the GPU sees as two separate, independent 1920x2400 monitors. You go into the Nvidia control panel and tell it to stretch the desktop across both monitors (horizontal span). This gives me a single, seamless 3840x2400 desktop, and the 3840x2400 mode is available in games too (as I explained earlier, I am running Crysis on this, and have used it with a real Quadro 2000, a GTS450, a quadrified GTS450 (Quadro 2000), a GTX470, a quadrified GTX470 (Quadro 5000), a GTX480, a quadrified GTX480 (Quadro 6000), a GTX580, a quadrified GTX580 (Quadro 7000), and a GTX680 (not modified into a Quadro yet; I will be modifying it into a Grid K2 in the next few days)).

It works just fine, it always has. Which is much more than I can say for ATI cards.

I make no statement about the lesser versions of Windows and their driver capabilities (Vista, 7 and 8), but this works perfectly (and with normal, unmodified GeForce cards) on XP/XP64.

Sharp-based 4K monitors use the same large-frame trick (1920x2160+1920x2160) as your monitor to achieve 4K at 60Hz, so maybe there is a driver-enabled "mosaic/surround" mode for your setup too. The Sharp 4K is not supported yet and therefore doesn't work, but the ASUS (a rebranded Sharp) works with the latest drivers, so the driver looks for a connected ASUS and enables "mosaic/surround" support for it.

Don't know about the Sharp/Asus monitor, but the T221 works without any special driver coordination. Two discrete monitors work the same way - you can stretch the desktop across both, and that mode is then available to games.

The only way I can see you getting synced and genlocked frames is by all outputs to screens coming off a single card, i.e. traditional SLI. That allows you to do processing on two cards but the monitor signal all comes out of a single frame buffer and a single vsync. Any more than that requires an external genlock device (QuadroSync mentioned on that page is a separate hardware device).

The link says otherwise: with the K5000 you can do synced Mosaic across up to 8 screens with SLI OR with Quadro Sync. You can also use (a limited set of) outputs from 2 cards when doing Surround with GeForces (600 series and up), and Surround is always synced.

The key words being "with Quadro Sync": this is not a piece of software, it is a separate, external genlock device costing more than a GTX680 card will cost you. Without Quadro Sync, I am pretty sure that this works using the SLI trick, with all the screens driven off the same card; the secondary card ends up being used only for processing, not for generating screen output. By all means tell me I'm wrong if you have actually tried this setup, but in the absence of any evidence to the contrary, this is by far the most likely possibility.
 

Offline Jager

  • Contributor
  • Posts: 30
Re: Hacking NVidia Cards into their Professional Counterparts
« Reply #464 on: July 21, 2013, 11:00:29 pm »
Gordan, you do not need the Quadro Sync add-on card for 2-GPU SLI and synced Mosaic across up to 8 displays with the K5000; the Quadro Sync card is only needed if you have even more K5000s installed. Other Quadros need the Sync card when more than one is installed. All this info you can find on the Nvidia site. You can also use outputs from the second card, just like SLI Surround and SLI stereo-3D Surround with GeForces (600 series and up, and only 3 monitors supported). Surround is always synced with GeForces too. Surround is just Mosaic for the desktop.

Look at the table "Number of synchronized displays/projectors from a single system with NVIDIA® Mosaic technology"
here >>> http://www.nvidia.com/object/quadro-scalable-visualization-solutions.html

Well, I think XP may have this expanded-desktop thing working. It is NOT working in Windows 7. There is no way to game at 3840x1080 in Win7 with two screens WITH a GeForce. You need Nvidia Surround for multi-monitor gaming, and even though it has been requested several times, Nvidia is not adding support for 2-monitor Surround; they feel no one likes to play with a bezel in the middle...

EDIT: It was actually the Fermi era that introduced 3D Vision Surround, and that needed SLI for the extra outputs so 3 monitors could be used. With Keplers you can do it with a single card. Fermi-based Quadros seem to be able to do 4-monitor Mosaic in SLI without needing the Sync card.
« Last Edit: July 21, 2013, 11:30:30 pm by Jager »
 

Offline gordan

  • Frequent Contributor
  • **
  • Posts: 277
  • Country: 00
    • Tech Articles
Re: Hacking NVidia Cards into their Professional Counterparts
« Reply #465 on: July 21, 2013, 11:12:06 pm »
Maybe Windows 7 is deficient in this way - I wouldn't know. If it is, why would you want to use it?
All I can say is that XP64 and Linux both work wonderfully with this setup. :)
 

Offline Jager

  • Contributor
  • Posts: 30
Re: Hacking NVidia Cards into their Professional Counterparts
« Reply #466 on: July 21, 2013, 11:49:51 pm »
Maybe Windows 7 is deficient in this way - I wouldn't know. If it is, why would you want to use it?
All I can say is that XP64 and Linux both work wonderfully with this setup. :)

Windows 7 is good for most of the stuff I do. The problem is more on Nvidia's side this time. I just hope the Quadro hack will fix things :). After all, there is no point in buying an "underclocked" Kepler for gaming that cannot even do SLI without a certified system...
 

Offline hanyuefeng

  • Newbie
  • Posts: 3
Re: Hacking NVidia Cards into their Professional Counterparts
« Reply #467 on: July 22, 2013, 07:53:30 am »
Hi everyone,

I'm new to this area and have been tracking this thread for a while.

My situation is that I have an IBM T221 and a GTX 590; however, it doesn't support Mosaic mode and so cannot give me 3840x2400 decently, thus I had to buy another card - an ATI - which has an analogous feature.

I see there are methods of modding resistors or soft-straps to convert a GeForce card into a Quadro one, and I know the 450 and 600 series can be done that way. But I don't know how to deal with my idle 590, given my limited knowledge; could anyone provide some detailed info on it?

Any help would be much appreciated.
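For reference, the resistor/soft-strap mods discussed in this thread work by changing the PCI device ID the card reports, so the driver treats it as the professional model. On Linux you can check what ID a card currently reports by reading sysfs. A minimal sketch - the ID-to-name table is illustrative, using IDs commonly quoted for these cards, so double-check against an authoritative PCI ID list:

```python
from pathlib import Path

# Illustrative mapping only - verify against an authoritative PCI ID list.
KNOWN_IDS = {
    0x1180: "GeForce GTX 680",
    0x11ba: "Quadro K5000",
}

def identify(device_id: int) -> str:
    """Map a PCI device ID to a marketing name, if we know it."""
    return KNOWN_IDS.get(device_id, f"unknown (0x{device_id:04x})")

def scan_nvidia_cards(sysfs="/sys/bus/pci/devices"):
    """Yield (address, name) for NVIDIA devices (vendor 0x10de) found in sysfs."""
    for dev in Path(sysfs).glob("*"):
        try:
            vendor = int((dev / "vendor").read_text(), 16)
            device = int((dev / "device").read_text(), 16)
        except OSError:
            continue
        if vendor == 0x10DE:
            yield dev.name, identify(device)
```

Running `scan_nvidia_cards()` before and after a strap mod is a quick way to see whether the card has actually started reporting the Quadro ID.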
 

Offline gordan

  • Frequent Contributor
  • **
  • Posts: 277
  • Country: 00
    • Tech Articles
Re: Hacking NVidia Cards into their Professional Counterparts
« Reply #468 on: July 22, 2013, 03:20:00 pm »
My situation is that I have an IBM T221 and a GTX 590; however, it doesn't support Mosaic mode and so cannot give me 3840x2400 decently, thus I had to buy another card - an ATI - which has an analogous feature.

Don't know what you're talking about - I've been running a T221 successfully at a seamless single-desktop resolution of 3840x2400 with Nvidia cards from an 8800GT through various 4xx series cards to a GTX580 and a GTX680, before and after quadrifying the cards, without any issues at all. No modification is required to get Nvidia cards working perfectly with T221 monitors, at least in XP x64 and Linux (the only two OSes I use).

My experience with ATI cards and the T221 has been utterly dire. The most recent hardware I've had moderate success with on the T221 has been the Radeon 4xxx series cards, with Windows drivers of that approximate generation (2011). This is because of the following:

1) No ATI card I have ever found later than the 4xxx series has more than a single DL-DVI output. If you are using DL-DVI->LFH60 adapters (google for "Cirthix T221" and you'll find it), you need 2x DL-DVI ports.

2) If you are using 4x SL-DVI ports, you are even more out of luck, because no ATI card comes with quad DVI ports. Nvidia cards don't either. You could potentially get away with running two cards on a DG5/DGP T221, but the DG1 and DG3 require genlocked outputs (allegedly - I've never tested this myself with multiple cards on my DG3; perhaps I should), which won't be the case with outputs from 2 cards.

3) If you are using 2x SL-DVI and want more than 25Hz, you won't be able to achieve it with ATI cards, because any recent ATI driver will flat out refuse to do any mode except what the monitor reports via EDID. The driver will even ignore custom monitor .inf drivers. If you use an Nvidia card (or the open-source radeon Xorg driver on Linux, which supports custom modelines), you can actually achieve about 33Hz with 2x SL-DVI outputs.

4) Nvidia cards happily and reliably push 165MHz on each SL-DVI output (the maximum the standard supports). My Radeon 4850 produces strange blue artifacts when pushing 165MHz; I had to drop it down to 160MHz.

5) ATI cards produce pretty obvious tearing between the two halves of the screen. This is visible in all applications (e.g. when dragging an application window around the center of the screen, and very obvious in full-screen games at 3840x2400 or when playing back HD video). Nvidia cards do not suffer from this - they seem to use a single large framebuffer, so there are no sync issues.

It is all these issues with ATI cards, which a T221 makes particularly obvious, that are the main reason why I consider ATI cards unfit for purpose.
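The 33Hz figure in points 3) and 4) falls straight out of the link budget: vertical refresh is pixel clock divided by the full timing (active plus blanking) in both directions. A quick sketch - the reduced-blanking totals below are illustrative CVT-style values for a 1920x2400 half of the T221, not the exact modeline gordan used:

```python
def refresh_hz(pixel_clock_hz: float, h_total: int, v_total: int) -> float:
    """Vertical refresh implied by a pixel clock and a full timing
    (active pixels plus blanking, horizontal and vertical)."""
    return pixel_clock_hz / (h_total * v_total)

# One 1920x2400 half with reduced blanking (illustrative totals).
H_TOTAL, V_TOTAL = 2080, 2437

print(round(refresh_hz(165e6, H_TOTAL, V_TOTAL), 1))  # SL-DVI limit: ~32.6 Hz
print(round(refresh_hz(160e6, H_TOTAL, V_TOTAL), 1))  # the 4850 at 160 MHz: ~31.6 Hz
```

This also shows why dropping from 165MHz to 160MHz costs roughly 1Hz of refresh, as mentioned later in the thread.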

The only reason I am using a 4850 at all is that I have a slightly broken motherboard PCIe slot with what seems to be a failed PCIe lane #1. ATI cards seem to be clever enough to go into x8 mode and use the 2nd 8 lanes, while all my other cards (GPU and otherwise) don't do that and thus don't get detected in that slot. This is the only good thing I can say about ATI cards.
« Last Edit: July 22, 2013, 04:06:28 pm by gordan »
 

Offline hanyuefeng

  • Newbie
  • Posts: 3
Re: Hacking NVidia Cards into their Professional Counterparts
« Reply #469 on: July 22, 2013, 07:21:41 pm »
Thanks for your reply, gordan.

My model is the DG3, and I have 2 special DL-DVI cables bought from a Japanese seller; each one has an EDID chip socket on it, so you can easily control the refresh rate by installing a different chip. However, I don't want two separate 1920x2400@60Hz screens; I want them combined into one 3840x2400@60Hz display, which is not something my 590 will provide. That is why I want ATI Eyefinity, or the Nvidia Mosaic that comes only with Quadro cards, to meet my requirement.
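Since those cables select the refresh rate by swapping the EDID chip, it may help to see what such a chip actually encodes. A sketch of decoding the first fields of an EDID detailed timing descriptor - the sample bytes below are constructed for illustration (165MHz, 1920x2400 with small blanking), not dumped from a real T221 EDID:

```python
def parse_dtd(dtd: bytes):
    """Decode pixel clock, active sizes and implied refresh from the first
    8 bytes of an EDID detailed timing descriptor."""
    pclk_hz  = (dtd[0] | dtd[1] << 8) * 10_000        # stored in 10 kHz units, LE
    h_active = dtd[2] | (dtd[4] >> 4) << 8            # byte 4: high nibbles
    h_blank  = dtd[3] | (dtd[4] & 0x0F) << 8
    v_active = dtd[5] | (dtd[7] >> 4) << 8            # byte 7: high nibbles
    v_blank  = dtd[6] | (dtd[7] & 0x0F) << 8
    refresh  = pclk_hz / ((h_active + h_blank) * (v_active + v_blank))
    return pclk_hz, h_active, v_active, round(refresh, 1)

# Constructed example: 165 MHz, 1920+160 x 2400+37.
sample = bytes([0x74, 0x40, 0x80, 0xA0, 0x70, 0x60, 0x25, 0x90])
print(parse_dtd(sample))  # → (165000000, 1920, 2400, 32.6)
```

A chip programmed with a lower pixel clock in this descriptor is exactly how those cables drop the per-input refresh rate.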

By the way, gordan, have you successfully modded any 500-series GeForce card into a Quadro? Any experience?

Thank you very much.
« Last Edit: July 22, 2013, 07:25:48 pm by hanyuefeng »
 

Offline gordan

  • Frequent Contributor
  • **
  • Posts: 277
  • Country: 00
    • Tech Articles
Re: Hacking NVidia Cards into their Professional Counterparts
« Reply #470 on: July 22, 2013, 08:04:09 pm »
OK, with a DG3 you cannot use the Cirthix adapter, because the input circuitry is slightly different. As far as I can tell, on the DG5 you can do this because it accepts 1920x2400@48Hz in interlaced format, so one half of the DL carries the odd lines and the other the even lines. On the DG3 this won't work.

The only thing you can really do on the DG3 is use either 2x SL-DVI for 32-33Hz or 4x SL-DVI for more. I haven't tried feeding 4x SL-DVI into my DG3, but if it is true that the signal needs to be genlocked, you are pretty much out of luck - all you can really do is get an old Nvidia NVS card with 2x LFH60 ports and use that to drive the monitor. What you _might_ be able to do is drive it using 2x SL-DVI + HDMI (as another SL-DVI) + DP (with a passive SL-DVI adapter). That might actually give you genlocked 4x SL-DVI outputs. The only problem is that if you try this with an ATI card, you will find that the vast majority of them will only let you use up to 3 outputs simultaneously, which shoots down the idea of using ATI cards unless you get one of the older Eyefinity models.

Note: the most you will ever get out of a T221 with custom modes (so forget ATI, since they won't do custom modes, as I explained earlier) is 55Hz using 2x DL-DVI inputs (only possible on the DG5). If you are using 4x SL-DVI it will probably be less than 55Hz max, because each input has to have its own blanking, which eats into the bandwidth. By default the DG5 supports up to 48Hz, and I'm finding this perfectly fine even for gaming, so I never bothered to "overclock" it to 55Hz. If you decide to try that, you might want to consider using thermal epoxy to glue some heatsinks onto the FPGA chips inside the T221.

I use my DG3 for general-purpose desktop work and am driving it with 2x SL-DVI @ 32Hz (with an Nvidia card I can get 33Hz out of it - with ATI I have to knock the clock rate down from 165MHz to 160MHz, which results in 1Hz less refresh).

I have never used Mosaic on these; I never needed it. In XP64 I just go into the Nvidia control panel, configure multiple displays, and set horizontal span. It just works. It is possible that they decided to try to extort more money out of customers by removing this feature in Vista and Windows 7 - I never tried it on those.

And yes, a GTX580 can be modified into a Quadro 7000. I am currently selling mine if you are interested:
http://www.ebay.co.uk/itm/281138321842
 

Offline hanyuefeng

  • Newbie
  • Posts: 3
Re: Hacking NVidia Cards into their Professional Counterparts
« Reply #471 on: July 24, 2013, 03:35:49 pm »
I see your point, gordan. However, I already have an idle 590, so I'm curious about your work on the 580, even though it's not my card - could you share some details as a reference for modding my 590?

Thanks. ;)
 

Offline Jager

  • Contributor
  • Posts: 30
 

Offline Soulnight

  • Contributor
  • Posts: 28
Re: Hacking NVidia Cards into their Professional Counterparts
« Reply #473 on: July 25, 2013, 09:44:33 am »
K6000 Quadro. http://nvidianews.nvidia.com/Releases/NVIDIA-Unveils-New-Flagship-GPU-for-Visual-Computing-9e3.aspx. 780/TITAN hack needed :)

YES! A hack of the GTX 780 would indeed be needed... A lot more horsepower for playing with 3D Vision at 1080p! Anyone?

How are you doing, Jager? I'm waiting for you to mod your GTX 670 and tell me that it works for passive 3D dual projection with 3D Vision, you know :)
Have you bought your second BenQ W1070 yet?
 

Offline Jager

  • Contributor
  • Posts: 30
Re: Hacking NVidia Cards into their Professional Counterparts
« Reply #474 on: July 25, 2013, 12:07:52 pm »
K6000 Quadro. http://nvidianews.nvidia.com/Releases/NVIDIA-Unveils-New-Flagship-GPU-for-Visual-Computing-9e3.aspx. 780/TITAN hack needed :)

YES! A hack of the GTX 780 would indeed be needed... A lot more horsepower for playing with 3D Vision at 1080p! Anyone?

How are you doing, Jager? I'm waiting for you to mod your GTX 670 and tell me that it works for passive 3D dual projection with 3D Vision, you know :)
Have you bought your second BenQ W1070 yet?

Another W1070 has arrived at the local post office; I will collect it today. I have been trying to find those resistors among old stuff I have, but no luck, so I need to wait for the eBay resistors.
 You can get a 780 for €550 and overclock it to close to TITAN performance, or even better. There are also BIOS mods that disable the boost feature and force the clocks to, say, 1200 or whatever. Maybe this will give a boost to those pro programs that don't use the correct power state (if there is any such issue) on modded Quadros.
 

