EEVblog Electronics Community Forum

General => General Chat => Topic started by: gnif on March 15, 2013, 03:05:16 pm

Title: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: gnif on March 15, 2013, 03:05:16 pm
Hi All,

I did originally post this on the nvidia forums but they have silently deleted it  :--, obviously they do not like what I have found becoming public  >:D.

Update: The GTX680 has been hacked also (https://www.eevblog.com/forum/projects/hacking-nvidia-cards-into-their-professional-counterparts/msg207550/#msg207550)

Firstly I will give a bit of history for those that are unaware. NVidia has for a long time had two ranges of cards: the GeForce for the gaming market, the Quadro for the professional market, and more recently the Tesla range for high end parallel computing stuff. As I am sure most of you would be aware, it is cheaper to manufacture a single chip and cripple it in some way for different product lines than it is to make different silicon for every product.

In the past it has been possible to convert GeForce cards into Quadros if you could find what they call 'hardware straps' on the board and change them. These straps control the PCI Device ID that the card reports to the computer, and as such, what the drivers will allow the card to do. Recently nVidia changed the way this all works and it has not been possible for quite a few generations of cards, until someone on the nVidia forums discovered that a GeForce 4xx card could be turned into its higher end counterpart by changing the hardware strap values by means of an undocumented override in the EEPROM. They were quick to disable this by changing the drivers to look at only the hardware straps for the PCI ID.

I own an NVidia GTX 690 which I bought for two reasons: gaming, and a multi monitor setup for work. NVidia made it very clear that this card would drive up to 3 screens in 2D, which it does quite nicely  :-+... under windows  :--! The tight asses have decided that if you want this feature under Linux you have to get a Quadro, which has Mosaic support  :palm:. So naturally I decided to look at how to mod the card, as the price difference is over $1000 between the GTX 690 and the Quadro K5000 (same GPU) and, get this... the K5000 is only a single GPU and clocked some 25-30% slower than the gaming card, what a joke :-DD.

What NVidia has done is change the way it handles the straps: instead of just pulling them high or low to control the switches as they did previously, they are now read as analogue values. The scheme is as follows:

When pulling high:

5K   = 8
10K = 9
15K = A
20K = B
25K = C
30K = D
35K = E
40K = F

When pulling low I expect this to be the same but for 7 - 0, however I did not test this as the nibbles of the device ID I was targeting are all >= 8.

There are two tiny SMD resistors on the board, one for each nibble of the PCI Device ID byte. Originally the GTX 690 has a device id of 0x1188, so to become a Quadro K5000 this has to be changed to 0x11BA, which equates to 20K and 15K resistors. If you wanted to change it to a Tesla K10, you would want to change it to 0x118F, which equates to 5K and 40K resistors.
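
To save doing the nibble maths by hand, here is a rough Python sketch of the mapping (just my own illustration, not a tool; it only covers the verified pull-high range 0x8-0xF, and the example device IDs are the ones above):

    def nibble_to_pullup(nibble):
        # Return the pull-up resistor (in kilohms) for a strap nibble 0x8-0xF.
        # Only the pull-high range was verified; nibbles 0-7 (pulled low) were not tested.
        if not 0x8 <= nibble <= 0xF:
            raise ValueError("only pull-high nibbles (0x8-0xF) were verified")
        return (nibble - 0x7) * 5   # 0x8 -> 5K, 0x9 -> 10K, ... 0xF -> 40K

    def straps_for_device_id(device_id):
        # Split the low byte of the target PCI device ID into its two strap resistors.
        low_byte = device_id & 0xFF
        return nibble_to_pullup(low_byte >> 4), nibble_to_pullup(low_byte & 0xF)

    print(straps_for_device_id(0x11BA))  # Quadro K5000 -> (20, 15)
    print(straps_for_device_id(0x118F))  # Tesla K10    -> (5, 40)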

This will only change the rear GPU on the GTX 690, I am yet to identify the resistors to change for the front one. I would also wager a bet that the new NVidia Titan can be upgraded into the Tesla K20 using the same method.

Anyway, enough with the description, here are the photos of what to change:
(https://files.spacevs.com/gtx%20690%20back.jpg)
(https://files.spacevs.com/gtx%20690%20mod.jpg)

And the results:
(https://files.spacevs.com/quatro.png)

(https://files.spacevs.com/k10.png)

Edit:
For those that are just spewing trash in the HaD comments without doing a little research... the parts are identical; changing the Device ID just makes the binary blob advertise the additional features to the system and enables them. It does NOT affect the clock speeds (they are configured by the BIOS, which we are not touching), and it will not make the card faster for general day-to-day work unless you are using specialised software that takes advantage of these 'professional' features.

And stock, the GTX690 is clocked FASTER than the K5000 and the Tesla K10, so you are getting a faster card in comparison, not making the GTX690 faster.

I repeat, this does NOT make your GTX 6XX card faster, nor does it make it slower.

Donations:
To give credit to those that have donated and to keep things fair, the following are the current donations I have received to go towards NVidia hardware:
Total: $1070
Target: $1000 (EVGA GTX 690)
Remaining: $-70

Thank you everyone, the card has been replaced, I will see about getting a mod of my 690 up here soon
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: Strada916 on March 15, 2013, 03:55:17 pm
Great work there gnif. Have they sorted out the "Your video card has stopped responding."  |O  errors? I thought for ages it was my system, until 6 months ago when I started to work for a new company who have all Lenovo computers. These machines too gave this error message. This was all under windows7. Sorry I digress, great work there nonetheless.  :-+
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gnif on March 15, 2013, 03:58:04 pm
Great work there gnif. Have they sorted out the "Your video card has stopped responding."  |O  errors? I thought for ages it was my system, until 6 months ago when I started to work for a new company who have all Lenovo computers. These machines too gave this error message. This was all under windows7. Sorry I digress, great work there nonetheless.  :-+

Thanks :) Apparently it is fixed in the later drivers, but I doubt it, I had to go to a 6 month old driver to get away from all sorts of issues with SLI and surround vision under windows.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: amyk on March 15, 2013, 10:52:53 pm
Do you mean the "front" one, are the strapping resistors really closer to the other GPU, or is your computer back to front (http://www.electricstuff.co.uk/backwardspc.html)?

Carefully comparing the PCBs (or suitable images thereof) of the 690 with the K5000 might yield useful results.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gnif on March 15, 2013, 10:59:33 pm
Do you mean the "front" one, are the strapping resistors really closer to the other GPU, or is your computer back to front (http://www.electricstuff.co.uk/backwardspc.html)?

Carefully comparing the PCBs (or suitable images thereof) of the 690 with the K5000 might yield useful results.

Sorry, to be clear, when looking at a card I have always seen the rear connectors as the front of the card. To be specific, it is GPU2 that can be changed. Comparing to a K5000 will not help as it does not have two GPUs on it. Comparing to a Tesla K10 may help as they use the same reference design.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: airthimble on March 16, 2013, 01:37:53 am
gnif,

I saw your post on the cuda developer forums, which got deleted, then I tracked your username to overclock.net and finally made my way here. I am interested in exploring this with some of the single gpu cards. I am curious as to how you found this mod, were you able to get the schematics for the nvidia reference design?
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: Ed.Kloonk on March 16, 2013, 03:24:05 am
Ole Linus has had a bit to say in regards to Nvidia. And he's dead right. A while back Nvidia was the shit for Linux with its tolerable binary blob drivers. But once they got a rep for being Linux friendly, they seemed to do everything possible to back away from that position. Maybe M$ scared them?

Further, I have been watching the history of the Amiga computer and the demise of Commodore. How great hardware combined with a great user community can still be destroyed by a few business-men fools.

IMO, nvidia is following in Commodore's footsteps and will probably end up just as bankrupt also.

Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: olsenn on March 16, 2013, 03:29:00 am
You've got balls for taking a soldering iron to a GTX-690!
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: Kaluriel on March 16, 2013, 03:33:23 am
I didn't even realise nVidia was doing this. Years ago it used to just be broken parts that were disabled, and now they're just purposely disabling parts. For shame
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gnif on March 16, 2013, 08:42:46 am
gnif,

I saw your post on the cuda developer forums, which got deleted, then I tracked your username to overclock.net and finally made my way here. I am interested in exploring this with some of the single gpu cards. I am curious as to how you found this mod, were you able to get the schematics for the nvidia reference design?

No, no schematic, what I did was look for resistors that looked like they had an alternative position, have a look at the photos and you will see what I mean. Any that I suspected of being a strap, I used a meter to check if the resistor was connected to ground or 3.3V directly, and looked at where the general traces were going in the area. If they went towards the GPU and connected to one of the rails it was a pretty good bet that it was a hard strap.

Ole Linus has had a bit to say in regards to Nvidia. And he's dead right. A while back Nvidia was the shit for Linux with its tolerable binary blob drivers. But once they got a rep for being Linux friendly, they seemed to do everything possible to back away from that position. Maybe M$ scared them?

I watched that a few days ago, couldn't agree more with him.

You've got balls for taking a soldering iron to a GTX-690!

No, just anger at NVidia for not clearly advertising that surround does not work under linux.

I didn't even realise nVidia was doing this. Years ago it used to just be broken parts that were disabled, and now they're just purposely disabling parts. For shame

Agreed, NVidia are the Microsoft of the hardware industry.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gnif on March 16, 2013, 09:06:16 am
I would wager a bet that these are the straps on the EVGA 670 and 680 (they use the same PCB), and I would say the top set is more likely to contain the PCI Device ID straps. The lower set I would only investigate if I couldn't find it in the top set, as they look less like straps.

A quick and simple way I found to test was to take a 2.2K resistor, tack it onto the board without removing the existing resistor, and pull it to ground or +3.3V. Make a DOS bootable USB device and throw nvflash onto it, then running 'nvflash -a' will show you what the current ID is without having to boot all the way into windows/linux.
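
If you are already booted into Linux anyway, the ID the card currently reports can also be read straight out of sysfs with no vendor tools. A rough Python sketch (standard sysfs paths only, nothing nvflash specific):

    import glob, os

    # List the vendor/device IDs of all NVidia (vendor 0x10DE) PCI devices.
    # Handy for confirming what ID the card reports after moving a strap resistor.
    for dev in sorted(glob.glob("/sys/bus/pci/devices/*")):
        with open(os.path.join(dev, "vendor")) as f:
            vendor = int(f.read().strip(), 16)
        if vendor != 0x10DE:
            continue
        with open(os.path.join(dev, "device")) as f:
            device = int(f.read().strip(), 16)
        print("%s: vendor=0x%04X device=0x%04X" % (os.path.basename(dev), vendor, device))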
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gnif on March 16, 2013, 09:10:33 am
And these might be them on the EVGA GTX 660
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: toster on March 17, 2013, 10:34:39 pm
Hum.. what happens if I get the device id wrong? If the driver doesn't work with the chip that's on the card? Does it simply fall back to "software rendering" or can something worse happen?

I have a 9600GT I could mess around with, however the chip is G94b. It looks like the counterpart is Quadro FX 1800 which has a G94. The missing 'b' bothers me.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: Zibri on March 19, 2013, 04:41:31 am
hmm very interesting...
I wonder if a similar approach can enable disabled cores on x70 cards to make them x80 (660/670 >> 680   or 560/570 >> 580).
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gnif on March 19, 2013, 05:18:37 am
Hum.. what happens if I get the device id wrong? If the driver doesn't work with the chip that's on the card? Does it simply fall back to "software rendering" or can something worse happen?

I have a 9600GT I could mess around with, however the chip is G94b. It looks like the counterpart is Quadro FX 1800 which has a G94. The missing 'b' bothers me.

Windows will just start up in VGA mode as it does not know the card; there is no risk to the card other than botching your soldering.

hmm very interesting...
I wonder if a similar approach can enable disabled cores on x70 cards to make them x80 (660/670 >> 680   or 560/570 >> 580).

Unknown, you would have to test it. I do know however that you can change the device ID to make a card become a 670, but I do not know if it disables/enables cores.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: SeanB on March 19, 2013, 06:25:16 am
Normally the parts that fail full QC are used to make the lower spec units, but often as process yield improves the rejects become fewer, so they start using full spec devices instead, and just disable them. Normally this is done with a chip level probe test and zener zapping an internal fuse. You could in the old days change certain Intel and AMD parts back, as they had either a pin strapping or a jumper on top of the case that was cut to set options.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: HLA-27b on March 19, 2013, 08:22:16 am
Here is where all the traffic is coming from. In case you were wondering.

http://www.reddit.com/r/linux_gaming/comments/1aj3n3/not_cool_nvidia/ (http://www.reddit.com/r/linux_gaming/comments/1aj3n3/not_cool_nvidia/)

And thanks OP for the brilliant writeup.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: cloudscapes on March 19, 2013, 10:42:03 am
It's on hackaday as well, gratz!

http://hackaday.com/2013/03/18/hack-removes-firmware-crippling-from-nvidia-graphics-card/ (http://hackaday.com/2013/03/18/hack-removes-firmware-crippling-from-nvidia-graphics-card/)

This has been going on for a while. I remember a decade ago you could turn some of the better Geforce 2's into a Quadro 2. I also heard years ago that Quadros, though better at workstation stuff like CAD and 3D packages, might offer inferior performance in gaming. Like they trade off speed for greater precision when going Geforce->Quadro, or something like that. A higher definition z-buffer, or something. I'm unsure if that's still the case, if it ever was.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: Zibri on March 19, 2013, 01:33:37 pm
hmm sure, but I bet such "fuses" are outside the chip.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: Omnipotent_Galaxy on March 19, 2013, 03:12:41 pm
hi gnif,

I am wondering if the same method will be useful on mobility GeForce cards, since I own a Clevo GTX675MX card based on GK104 and I am hoping to get to a K4000m, which has the same CUDA cores.

Though some players have announced successful soft-conversions from 675MX to K4000m via VBIOS or other means, I am still searching for a way to make the conversion stable long-term.

Anyway, the price of a K4000m with warranty is maybe its worst part, and many users are using ES cards or other cards from factories without warranty... What a pity.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: EEVblog on March 19, 2013, 03:15:53 pm
Two Hack-A-Days from GNIF in almost as many days!  :-+
What will he hack next?  :-//

Dave.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: Omnipotent_Galaxy on March 19, 2013, 03:21:32 pm
hmm very interesting...
I wonder if a similar approach can enable disabled cores on x70 cards to make them x80 (660/670 >> 680   or 560/570 >> 580).

Some GTX x70/x60 cards do use the same CUDA core as the GTX x80 cards, but there are likely to be some problems with the 'hidden' parts of the core anyway.

This generation of NV cards uses the GK104 core from 576sp cards (mobility ones like the K3000m) up to 1344sp cards. It is hard to imagine the power supply of a GTX660Ti being enough for a GTX680...

Though with great expectations, Kepler is not like Fermi... a GTX560Ti 448sp was just a dream maybe?
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: jerry507 on March 19, 2013, 03:42:26 pm
It wasn't quite clear to me, did you just turn your 690 into a K5000? I understand that you wanted the multiple monitor support for linux, but you said you also bought it for gaming. The K5000 is significantly worse in that respect, is there something about this hack that preserves that?

I suppose it's a question of priorities.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gnif on March 19, 2013, 04:25:45 pm
Two Hack-A-Days from GNIF in almost as many days!  :-+
What will he hack next?  :-//

Dave.

Yay, and looks like my topic has blown out our total number of users on-line 5 fold, makes me glad I went to the trouble to tune the site a bit more :).

HaD have missed one thing though, they said the 690 is a K5000 PCB, etc... it isn't, it is a Tesla K10.

As for those that are just spewing trash on HaD without doing a little research... the parts are identical; changing the Device ID just makes the binary blob advertise the additional features to the system and enables them. It does NOT affect the clock speeds (they are configured by the BIOS, which we are not touching), and it will not make the card faster for general day-to-day work unless you are using specialised software that takes advantage of these 'professional' features.

And stock, the GTX690 is clocked FASTER than the K5000 and the Tesla K10, so you are getting a faster card in comparison, you are not making the GTX690 faster.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: HLA-27b on March 19, 2013, 11:50:48 pm
Quote
14:45    (Read 54112 times)
14:46    (Read 54322 times)
3896 Guests are viewing this topic

0.0
I think you've just made the server impervious to the notorious internet_kiss_of_death !!
 :-+
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: cloudscapes on March 19, 2013, 11:53:17 pm
hack-a-day: the best way to instantly blow your bandwidth limit if you're using a smaller provider.  ;D
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gnif on March 19, 2013, 11:54:41 pm
Quote
14:45    (Read 54112 times)
14:46    (Read 54322 times)
3896 Guests are viewing this topic

0.0
I think you've just made the server impervious to the notorious internet_kiss_of_death !!
 :-+

That was the idea :). For those that are interested, this thread has generated 3,316,841 hits as of writing this post. We peaked at 5x the highest ever concurrent users on the forums and only saw a peak server load of 0.8 out of 8.0. Just goes to show what a properly configured server with Nginx is capable of :).
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: BravoV on March 20, 2013, 12:00:02 am
Just look at the numbers for "Most online today" and "Most online ever" at the bottom of this forum front page, they are equal.  ;)
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gnif on March 20, 2013, 12:01:27 am
Just look at the numbers for "Most online today" and "Most online ever" at the bottom of this forum front page, they are equal.  ;)

I am eager to see the daily stats for today, just waiting for them to get generated.

Edit: I wonder if we will hit 100K reads, has to be some kind of server record :).
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: BravoV on March 20, 2013, 12:08:39 am
The current forum stat page shows that this thread is at rank #8 in the Top Ten Topics (by views), will it dethrone the legendary tinhead's "Hantek - Tekway - DSO hack - get 200MHz bw for free" thread ?  :o
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: Neo_Moucha on March 20, 2013, 12:17:26 am
gnif: congrats on modifying the card, but do you think you can make both GPU cores K5000 too?
I am interested in this in Windows, would be great if you or somebody else would successfully modify the GTX 680 card :)

In the past I was always changing Geforce to Quadro because of the performance in AutoCAD/3DS Max - right up until the last generation it was possible with the "softquadro" technique using RivaTuner (forum.guru3d.com). Nowadays the performance in these applications is almost equal for both the geforce and quadro models, but the additional features available in the driver for Quadro models are always very interesting...

Looking forward to seeing some development in this topic
Cheers!
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gnif on March 20, 2013, 12:19:49 am
gnif: congrats on modifying the card, but do you think you can make both GPU cores K5000 too?
I am interested in this in Windows, would be great if you or somebody else would successfully modify the GTX 680 card :)

In the past I was always changing Geforce to Quadro because of the performance in AutoCAD/3DS Max - right up until the last generation it was possible with the "softquadro" technique using RivaTuner (forum.guru3d.com). Nowadays the performance in these applications is almost equal for both the geforce and quadro models, but the additional features available in the driver for Quadro models are always very interesting...

Looking forward to seeing some development in this topic
Cheers!

Neo,

The only thing preventing me from doing this to the other GPU is the risk of killing my card; you see, the risk was minimal for the initial hack as the layout of the board hinted at where the straps are, but there are no layout hints for GPU1. If someone was to donate one I would be willing to track down the straps for GPU1 also.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: Neo_Moucha on March 20, 2013, 12:26:47 am
Unfortunately at the moment I am not able to donate the card :)
But I can donate up to $50, maybe you could collect the money, we just need another 19 people willing to donate $50...
Or for starters - a GTX 680 for about half the price - only 9 people... :)
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gnif on March 20, 2013, 12:28:53 am
Unfortunately at the moment I am not able to donate the card :)
But I can donate up to $50, maybe you could collect the money, we just need another 19 people willing to donate $50...
Or for starters - a GTX 680 for about half the price - only 9 people... :)

If there are enough people willing to do this I will sort out a way to donate. A 680 can be modded also, but it will not help with the 690 as the layout of the PCB is very different to accommodate the 2nd GPU.

That said I would be willing to track down straps for any of the GTX 670 and upwards cards if I had them also. Cards based on the NVidia reference design are the best for this as they are more common across brands, the GTX690 I have here is pretty much exactly the reference card that NVidia released.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: Neo_Moucha on March 20, 2013, 12:31:22 am
OK, let me know :) Paypal is no problem for me...
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gnif on March 20, 2013, 12:41:11 am
OK, let me know :) Paypal is no problem for me...

Ok, I have been contacted out of band by a few others who have been keen to donate (that was fast), so here is the link:

https://sourceforge.net/donate/?user_id=2785077

I will also update the initial post with this information.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: Neo_Moucha on March 20, 2013, 12:45:17 am
Neo_Moucha: Donate: done!  ;)
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gnif on March 20, 2013, 12:47:57 am
Neo_Moucha: Donate: done!  ;)

Thank you very much :). If the funds can be raised to get the card, I will offer to mod the cards of anyone who donates $50 or more once the straps can be identified; you will just have to cover the cost of shipping the card to AU and back.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: Neo_Moucha on March 20, 2013, 12:50:17 am
Great!  :-+
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: HLA-27b on March 20, 2013, 01:05:02 am
This promises to be very interesting. I'm gonna get me some popcorn.

Gnif, anything you do will help move us toward better open source Linux drivers and farther away from the closed source binary blobs.

I think what you discovered is the exact reason why those bastards won't release anything in the open.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gnif on March 20, 2013, 01:08:14 am
This promises to be very interesting. I'm gonna get me some popcorn.

Gnif, anything you do will help move us toward better open source Linux drivers and farther away from the closed source binary blobs.

I think what you discovered is the exact reason why those bastards won't release anything in the open.

Possibly, I have been thinking about how they could prevent this in the future and there are some methods they could use, but I will not mention them here simply because I do not want to feed them information :). Needless to say though, if they use them, then the next generation of cards will be impossible to mod without resorting to hacking the binary drivers, which without a doubt runs into legality issues.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: HLA-27b on March 20, 2013, 01:24:58 am
Quote
Possibly, I have been thinking about how they could prevent this in the future and there are some methods they could use, but I will not mention them here simply because I do not want to feed them information :). Needless to say though, if they use them, then the next generation of cards will be impossible to mod without resorting to hacking the binary drivers, which without a doubt runs into legality issues.

Same as jailbreaking a phone I guess...
Anyway, they could have played nicely with Linux from the beginning, but no....

edit: can't spell
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gnif on March 20, 2013, 01:30:40 am
Quote
Possibly, I have been thinking about how they could prevent this in the future and there are some methods they could use, but I will not mention them here simply because I do not want to feed them information :). Needless to say though, if they use them, then the next generation of cards will be impossible to mod without resorting to hacking the binary drivers, which without a doubt runs into legality issues.

Same as jailbreking a phone I guess...
Anyway, they could have played nicely with Linux from the beginning, but no....

Yeah, I hate big companies that think all Linux users are no-money burnouts who like to hack things. One day they are going to realise that Linux is attracting all the professional programmers and engineers, and is very quickly pulling general users away from windows and mac, especially with the recent move by Steam to release for Linux due to the crappy APIs in Windows 8.

By not supporting the Linux users, because there is such a high percentage of engineers/developers using it, when things don't work the way they should... out come the reverse engineering tools, and all that R&D they spent on securing their product goes out the window. I wonder if they look at projects like Nouveau and realise that these open drivers will eventually be able to ignore the device ID and enable the professional features regardless?
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: tony on March 20, 2013, 05:23:54 am
Have you looked at the front of the PCB? http://i.imgur.com/WJZqGyl.jpg (http://i.imgur.com/WJZqGyl.jpg)

The same 8-pin soic that is under the resistors you changed is on the lower-right of the other GPU and surrounded by a few resistors and unpopulated spots. It looks pretty promising to me.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: Zibri on March 20, 2013, 05:36:54 am
Do you see the straps here, anywhere?

http://www.ixbt.com/video3/images/gf110/gtx580-scan-back.jpg (http://www.ixbt.com/video3/images/gf110/gtx580-scan-back.jpg)

I wonder what a GTX580 can become :) (if anything useful)
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gnif on March 20, 2013, 07:28:59 am
Have you looked at the front of the PCB? http://i.imgur.com/WJZqGyl.jpg (http://i.imgur.com/WJZqGyl.jpg)

The same 8-pin soic that is under the resistors you changed is on the lower-right of the other GPU and surrounded by a few resistors and unpopulated spots. It looks pretty promising to me.

Agreed, but as stated, this could be completely unrelated to it, it could be controlling voltage and modding could potentially kill the GPU.

Do you see the straps here, anywhere?

http://www.ixbt.com/video3/images/gf110/gtx580-scan-back.jpg (http://www.ixbt.com/video3/images/gf110/gtx580-scan-back.jpg)

I wonder what a GTX580 can become :) (if anything useful)

Nothing stands out, they may be on the front of the card.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: Neo_Moucha on March 20, 2013, 07:54:03 am
Do you see the straps here, anywhere?

http://www.ixbt.com/video3/images/gf110/gtx580-scan-back.jpg (http://www.ixbt.com/video3/images/gf110/gtx580-scan-back.jpg)

I wonder what a GTX580 can become :) (if anything useful)

I don't think there is either a Quadro or a Tesla card equivalent for any GF110 GPU card
see http://en.wikipedia.org/wiki/Nvidia_Quadro (http://en.wikipedia.org/wiki/Nvidia_Quadro) and http://en.wikipedia.org/wiki/Comparison_of_Nvidia_graphics_processing_units (http://en.wikipedia.org/wiki/Comparison_of_Nvidia_graphics_processing_units)
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: Zibri on March 20, 2013, 08:53:08 am
Do you see the straps here, anywhere?

http://www.ixbt.com/video3/images/gf110/gtx580-scan-back.jpg (http://www.ixbt.com/video3/images/gf110/gtx580-scan-back.jpg)

I wonder what a GTX580 can become :) (if anything useful)

Nothing stands out, they may be on the front of the card.

http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_580/images/front_full.jpg (http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_580/images/front_full.jpg)

Oh, by the way, hats off for your findings so far!
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gnif on March 20, 2013, 09:02:23 am
Do you see the straps here, anywhere?

http://www.ixbt.com/video3/images/gf110/gtx580-scan-back.jpg (http://www.ixbt.com/video3/images/gf110/gtx580-scan-back.jpg)

I wonder what a GTX580 can become :) (if anything useful)

Nothing stands out, they may be on the front of the card.

http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_580/images/front_full.jpg (http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_580/images/front_full.jpg)

Oh, by the way, hats off for your findings so far!

Thanks,

I still can not see anything immediately obvious, and I doubt there will be anything as it seems that Neo_Moucha is correct, there was never a Quadro GF110 card.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: mamalala on March 20, 2013, 10:04:27 am
Yeah, I hate big companies that think all Linux users are no-money burnouts who like to hack things. One day they are going to realise that Linux is attracting all the professional programmers and engineers, and is very quickly pulling general users away from windows and mac, especially with the recent move by Steam to release for Linux due to the crappy APIs in Windows 8.

Funny thing is that if you look at the internet in general, Linux has already won the game. On the desktop it still has a long way to go, but it is getting there. My guess is that the mobile market will give the desktop quite a boost. Android is one thing, but then we also have the QT library, which is cross-platform and gaining quite some traction.

Then there is the movement towards open formats in general. Stuff like Amazon deleting or denying access to eBooks that customers bought makes them more aware of the issues involved. Also, look at all these set-top boxes and "intelligent" TVs using Linux that are popping up. All that proprietary DRM crap is now showing its true face to those who have these things, giving them more incentive to favour open formats.

That said, my first Linux was Yggdrasil, and that is now 20 years ago. Never looked back! Sure, I still have a Windows, but it is a Win 2000 running in a VirtualBox. And that will go soon as well, since MPLab X is now sufficiently usable on Linux (I used the VM only for old MPLab stuff I do/did for customers).

Too bad that I have no use for your hack, since I'm simply not using that line of cards. I am not gaming, my focus is on desktop stuff, and for that my old card does just fine (dual-head, of course).

Anyways, great stuff you are doing here, keep it up!

Greetings,

Chris
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: decaf on March 20, 2013, 10:11:07 am
Is it possible to do this with a kernel hack? By just changing the ID, so the system and also nvidia's driver see a Quadro ID and behave that way.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gnif on March 20, 2013, 10:48:51 am
Is it possible to do this with a kernel hack? By just changing the ID, so the system and also nvidia's driver see a Quadro ID and behave that way.

I wish - I tried this first, but the binary blob interrogates the card directly, ignoring the kernel-reported ID.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: Neo_Moucha on March 20, 2013, 05:48:49 pm
Is it possible to do this with a kernel hack? By just changing the ID, so the system and also nvidia's driver see a Quadro ID and behave that way.

this was possible in the past using "softquadro" with RivaTuner software, see forum.guru3d.com
but nvidia blocked it in newer drivers...
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gnif on March 20, 2013, 06:11:28 pm
Is it possible to do this with a kernel hack? By just changing the ID, so the system and also nvidia's driver see a Quadro ID and behave that way.

this was possible in the past using "softquadro" with RivaTuner software, see forum.guru3d.com
but nvidia blocked it in newer drivers...

No, it was not modding the kernel, it was overriding the software straps by writing to the register on the GPU directly.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: moisyes on March 20, 2013, 10:41:56 pm
Have you looked at the front of the PCB? http://i.imgur.com/WJZqGyl.jpg (http://i.imgur.com/WJZqGyl.jpg)

The same 8-pin soic that is under the resistors you changed is on the lower-right of the other GPU and surrounded by a few resistors and unpopulated spots. It looks pretty promising to me.

Agreed, but as stated, this could be completely unrelated to it, it could be controlling voltage and modding could potentially kill the GPU.

Do you see the straps here, anywhere?

http://www.ixbt.com/video3/images/gf110/gtx580-scan-back.jpg (http://www.ixbt.com/video3/images/gf110/gtx580-scan-back.jpg)

I wonder what a GTX580 can become :) (if anything useful)

Nothing stands out, they may be on the front of the card.
Gnif, would it be possible to follow the tracks from the second processor's pins the same way they are routed to the resistors, to find them anyway?
I have a GTS 450 too and after searching the web a lot, I couldn't find any hardmod information, I think for obvious reasons. Do you have any information on how I could hack my 2GB GTS 450 into a Quadro one? I use linux and need to hardmod instead of softmod, for which there is abundant windows information. Thank you in advance.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: Neo_Moucha on March 20, 2013, 10:47:31 pm
this should be theoretically possible - change it into Quadro 2000 but only for older models before 2011? (GF106 based)
I have one real Quadro 2000 in my possession, so I can scan it and we can look for differences...

Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: ihasmario on March 21, 2013, 12:11:41 am
Does this apply to lesser cards such as the 670?

Superb work.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: mungewell on March 21, 2013, 03:03:57 am
Have you looked at the front of the PCB? http://i.imgur.com/WJZqGyl.jpg (http://i.imgur.com/WJZqGyl.jpg)

The same 8-pin soic that is under the resistors you changed is on the lower-right of the other GPU and surrounded by a few resistors and unpopulated spots. It looks pretty promising to me.

Agreed, but as stated, this could be completely unrelated to it, it could be controlling voltage and modding could potentially kill the GPU.

I'd agree that the placement of the links is approximately correct. Given the coding (resistor values) this would be another hint that you've found the right ones.

Given that you've confirmed the resistors which control GPU1, you could X-ray the board to find out the ball that these go into and then trace from that ball on the other GPU - just looking at surface layers might give you enough.

If anyone has a dead card you could even 'heat gun' the GPUs off to buzz out tracking.

Anyway, congratulations on your find. Look forward to using the 670 as a Quadro for some future 3D work under Linux ;-).
Simon
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gnif on March 21, 2013, 08:47:15 am
Gnif, would it be possible to follow the tracks from the second processor's pins the same way they are routed to the resistors, to find them anyway?
I have a GTS 450 too and after searching the web a lot, I couldn't find any hardmod information, I think for obvious reasons. Do you have any information on how I could hack my 2GB GTS 450 into a Quadro one? I use linux and need to hardmod instead of softmod, for which there is abundant windows information. Thank you in advance.

Not easy, they are a 6+ layer PCB.

Does this apply to lesser cards such as the 670?

Superb work.

Read back through the thread to find the answer.

this should be theoretically possible - change it into Quadro 2000 but only for older models before 2011? (GF106 based)
I have one real Quadro 2000 in my possession, so I can scan it and we can look for differences...

That would be interesting, but it may yield nothing, as the resistors may not be in different positions, just different values.

I'd agree that the placement of the links is approximately correct. Given the coding (resistor values) this would be another hint that you've found the right ones.

Given that you've confirmed the resistors which control GPU1, you could X-ray the board to find out the ball that these go into and then trace from that ball on the other GPU - just looking at surface layers might give you enough.

If anyone has a dead card you could even 'heat gun' the GPUs off to buzz out tracking.

Anyway, congratulations on your find. Look forward to using the 670 as a Quadro for some future 3D work under Linux ;-).
Simon

Thanks, and yeah, if I had an X-ray machine I would have already done it :). I am able to remove the chip with my SMD workstation, but I do not have a dead card to try it on, and I am not likely to get one as everyone that has a dead card will still have warranty since they are still pretty recent cards.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: ihasmario on March 21, 2013, 12:57:58 pm
Quote
Read back through the thread to find the answer.

Are you interested in (borrowing) a 670 to test your suspicions you mentioned? I am local enough (WA though), and my 670 struggles to run my screens because they're incapable of NVIDIA mosaic, so it won't be missed too dearly if it goes wrong.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gnif on March 21, 2013, 01:01:19 pm
Quote
Read back through the thread to find the answer.

Are you interested in (borrowing) a 670 to test your suspicions you mentioned? I am local enough (WA though), and my 670 struggles to run my screens because they're incapable of NVIDIA mosaic, so it won't be missed too dearly if it goes wrong.

I would be willing to have a go at the card, but you must understand there is a risk of damaging the card in order to find the correct straps. And if it is possible, what would you want it to become? A 680, Quadro K5000 or Tesla K10?
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: ihasmario on March 21, 2013, 01:04:56 pm
Quote
Read back through the thread to find the answer.

Are you interested in (borrowing) a 670 to test your suspicions you mentioned? I am local enough (WA though), and my 670 struggles to run my screens because they're incapable of NVIDIA mosaic, so it won't be missed too dearly if it goes wrong.

I would be willing to have a go at the card, but you must understand there is a risk of damaging the card in order to find the correct straps. And if it is possible, what would you want it to become? A 680, Quadro K5000 or Tesla K10?

I don't have a preference between quadro or Tesla, because I assume both can perform mosaic. Essentially the problem is that I use T221s, which always identify as two screens, meaning that true multi-monitor support is impossible with NVIDIA consumer cards.

I'll send a PM.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gnif on March 21, 2013, 01:09:14 pm
Quote
Read back through the thread to find the answer.

Are you interested in (borrowing) a 670 to test your suspicions you mentioned? I am local enough (WA though), and my 670 struggles to run my screens because they're incapable of NVIDIA mosaic, so it won't be missed too dearly if it goes wrong.

I would be willing to have a go at the card, but you must understand there is a risk of damaging the card in order to find the correct straps. And if it is possible, what would you want it to become? A 680, Quadro K5000 or Tesla K10?

I don't have a preference between quadro or Tesla, because I assume both can perform mosaic. Essentially the problem is that I use T221s, which always identify as two screens, meaning that true multi-monitor support is impossible with NVIDIA consumer cards.

I'll send a PM.

Wow, that is an impressive LCD, I am still using 3x Dell 2100FP from 2005. As for Mosaic, I have not checked if a Tesla works with it, as the Tesla K10 never had video outputs on it, so I am not sure. IMO it would be better to go for the Quadro K5000 as it better meets the requirement, and it is less likely an NVidia update would break video output on modded Tesla cards.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: amyk on March 21, 2013, 08:56:45 pm
Given that you've confirmed the resistors which control GPU1, you could X-ray the board to find out the ball that these go into and then trace from that ball on the other GPU - just looking at surface layers might give you enough.
I was thinking of this too. Someone here has xrayed some boards (https://www.eevblog.com/forum/chat/apollo-saturn-v-computer-logic-reverse-engineered-with-working-model/), maybe you could ask her for advice?
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: emoose on March 22, 2013, 07:27:18 am
Hi there, thanks for this :)

I just wrote up a pretty detailed post about how this thread made me look at my 3GB GTX 660 Ti's BIOS and find it was similar to a GTX 670s (and different to 2GB GTX 660 Ti's), but because I couldn't read the CAPTCHA I pressed request another, which refreshed the page and lost the post  >:(

Basically my 660 Ti's BIOS is almost the same as a 670's: it uses the same board and SKU numbers (20040005) but has extra code inserted which uses the normal 660 Ti board number (20040001, the 670 has similar code at a different address, but uses the 20040005 board number instead), maybe to emulate/downgrade it to a 660?

After seeing this I pressed further and compared 670 2GB vs 670 4GB, and then mapped the values that were different onto my 660 Ti's BIOS (the addresses were a bit different but it wasn't hard to locate them), they matched with the 670 4GB :o

This started to make me think they might have just crippled 4GB 670s into 3GB 660 Ti's, until I opened up my card and found 6x2Gb chips (H5GQ2H24AFR, which is 1.5GB? Maybe I read the datasheet wrong, or the rest were on the back...) There were 2 unfilled spaces though, so I'm guessing it hasn't got the full 4GB :(

The datasheet of those chips mentions that they're 256-bit, but the 660 Ti is reported as only being 192-bit... Maybe flashing the 670 BIOS over would enable the full bandwidth? I'd be willing to try it but I'm worried that the BIOS might do a check against the hardware device ID... I'd guess not since you can change the HW and it still works, but that could be down to a combined Quadro/Tesla/Geforce BIOS... Any info about this would be great!

Also any info about bad flashes would be great too, the only things I can find about them are from BIOS modders, not crossflashing :-\ I'm scared that the wrong RAM config/HW device ID/other stupid check might throw off the card from even being detected in nvflash...

If anyone wants to look further:

660 Ti 3GB BIOS: http://www.techpowerup.com/vgabios/127140/EVGA.GTX660Ti.3072.120806.html (http://www.techpowerup.com/vgabios/127140/EVGA.GTX660Ti.3072.120806.html)
660 Ti 2GB BIOS: http://www.techpowerup.com/vgabios/127242/EVGA.GTX660Ti.2048.120910.html (http://www.techpowerup.com/vgabios/127242/EVGA.GTX660Ti.2048.120910.html)

670 2GB BIOS: http://www.techpowerup.com/vgabios/125688/EVGA.GTX670.2048.120807.html (http://www.techpowerup.com/vgabios/125688/EVGA.GTX670.2048.120807.html)
670 4GB BIOS: http://www.techpowerup.com/vgabios/126722/EVGA.GTX670.4096.120712.html (http://www.techpowerup.com/vgabios/126722/EVGA.GTX670.4096.120712.html)
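
If anyone wants to repeat the comparison, a plain byte-for-byte diff of two ROM dumps is enough to find the offsets that differ, something along the lines of this Python sketch (just a sketch; the filenames are placeholders for ROMs saved from the links above):

    def diff_roms(path_a, path_b):
        # Print every offset where the two BIOS dumps differ, plus the byte values.
        with open(path_a, "rb") as fa, open(path_b, "rb") as fb:
            a, b = fa.read(), fb.read()
        for offset in range(min(len(a), len(b))):
            if a[offset] != b[offset]:
                print("0x%06X: %02X -> %02X" % (offset, a[offset], b[offset]))
        if len(a) != len(b):
            print("note: file sizes differ (%d vs %d bytes)" % (len(a), len(b)))

    diff_roms("GTX670.2048.rom", "GTX670.4096.rom")   # placeholder filenames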
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gnif on March 22, 2013, 08:07:33 am
Hi there, thanks for this :)

I just wrote up a pretty detailed post about how this thread made me look at my 3GB GTX 660 Ti's BIOS and find it was similar to a GTX 670s (and different to 2GB GTX 660 Ti's), but because I couldn't read the CAPTCHA I pressed request another, which refreshed the page and lost the post  >:(

Basically my 660 Ti's BIOS is almost the same as a 670's: it uses the same board and SKU numbers (20040005) but has extra code inserted which uses the normal 660 Ti board number (20040001, the 670 has similar code at a different address, but uses the 20040005 board number instead), maybe to emulate/downgrade it to a 660?

After seeing this I pressed further and compared 670 2GB vs 670 4GB, and then mapped the values that were different onto my 660 Ti's BIOS (the addresses were a bit different but it wasn't hard to locate them), they matched with the 670 4GB :o

This started to make me think they might have just crippled 4GB 670s into 3GB 660 Ti's, until I opened up my card and found 6x2Gb chips (H5GQ2H24AFR, which is 1.5GB? Maybe I read the datasheet wrong, or the rest were on the back...) There were 2 unfilled spaces though, so I'm guessing it hasn't got the full 4GB :(

The datasheet of those chips mentions that they're 256-bit, but the 660 Ti is reported as only being 192-bit... Maybe flashing the 670 BIOS over would enable the full bandwidth? I'd be willing to try it but I'm worried that the BIOS might do a check against the hardware device ID... I'd guess not since you can change the HW and it still works, but that could be down to a combined Quadro/Tesla/Geforce BIOS... Any info about this would be great!

Also any info about bad flashes would be great too, the only things I can find about them are from BIOS modders, not crossflashing :-\ I'm scared that the wrong RAM config/HW device ID/other stupid check might throw off the card from even being detected in nvflash...

If anyone wants to look further:

660 Ti 3GB BIOS: http://www.techpowerup.com/vgabios/127140/EVGA.GTX660Ti.3072.120806.html (http://www.techpowerup.com/vgabios/127140/EVGA.GTX660Ti.3072.120806.html)
660 Ti 2GB BIOS: http://www.techpowerup.com/vgabios/127242/EVGA.GTX660Ti.2048.120910.html (http://www.techpowerup.com/vgabios/127242/EVGA.GTX660Ti.2048.120910.html)

670 2GB BIOS: http://www.techpowerup.com/vgabios/125688/EVGA.GTX670.2048.120807.html (http://www.techpowerup.com/vgabios/125688/EVGA.GTX670.2048.120807.html)
670 4GB BIOS: http://www.techpowerup.com/vgabios/126722/EVGA.GTX670.4096.120712.html (http://www.techpowerup.com/vgabios/126722/EVGA.GTX670.4096.120712.html)

No worries :). I do not think that the ram configuration is stored in the BIOS at all as it is needed to be known before the GPU even reads the BIOS from the EEPROM. In earlier versions it was based on the hard straps, I do not see any reason why they would have changed this.

As for the RAM size, from what I can see, that module is 2Gb which is 0.25GB per chip * 6 = 1.5GB total. This is very odd unless I am also reading it wrong if you say that the card should have 3GB of RAM. Can you have a real close look at the card to be doubly sure that the part number you provided is correct? Also, you did count the chips on both sides of the PCB?
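
(To spell the arithmetic out: 2Gb per chip is 0.25GB, so 6 chips is only 1.5GB; if there is a second matching set of 6 chips on the back of the PCB that would make 12 * 0.25GB = 3GB, which would line up with the advertised capacity.)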

As for seeing cards with less ram accessible than is physically installed, I highly doubt this would ever occur, the cost saving to the mfg is too large to just disable/hide/waste the additional RAM.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: emoose on March 22, 2013, 09:01:16 am
No worries :). I do not think that the ram configuration is stored in the BIOS at all as it is needed to be known before the GPU even reads the BIOS from the EEPROM. In earlier versions it was based on the hard straps, I do not see any reason why they would have changed this.

Well that's good news then :D, I was scared of flashing the 4GB BIOS in case it messed up because of the missing 1GB, and the 2GB one had me scared because of some different values in spots that matched between the 3GB 660 and 4GB 670, if that's true then the 4GB one is sounding even better :)

As for the RAM size, from what I can see, that module is 2Gb which is 0.25GB per chip * 6 = 1.5GB total. This is very odd unless I am also reading it wrong if you say that the card should have 3GB of RAM. Can you have a real close look at the card to be doubly sure that the part number you provided is correct? Also, you did count the chips on both sides of the PCB?

I took some pictures of the chips while I had it open:
http://imgur.com/FrJ5tJr (http://imgur.com/FrJ5tJr)
http://imgur.com/VrQhuB8 (http://imgur.com/VrQhuB8)
Haven't checked the other side yet because of some metal grille covering it, I can see some sort of thermal paste coming through the holes of it though. I'll take it out later and have a check for definite.

As for seeing cards with less ram accessible than is physically installed, I highly doubt this would ever occur, the cost saving to the mfg is too large to just disable/hide/waste the additional RAM.
Ah yeah that's true, probably should have thought of that before I took it apart ;D

Think I might wait and buy another 3GB 660 Ti when I get money, don't really want to go without if it messed up :-\
If you get hold of a 670 can you note down the resistors? Might come in handy if the BIOS does some sort of check.
Edit: nevermind, just saw that it would be 5K and 10K, bit slow today :palm:
I'll pull the card out and take a peek under the grille now

Edit2: Yep, the other 6 were there, along with another 2 unfilled spaces :( Oh well, here's some pictures:
http://imgur.com/a/TolyJ#8 (http://imgur.com/a/TolyJ#8)
Now check out
http://images.anandtech.com/doci/5818/GeForce_GTX_670_B-1.jpg (http://images.anandtech.com/doci/5818/GeForce_GTX_670_B-1.jpg)
Look similar?  ;D
This is a normal 2GB 660 Ti (best pic I could find): http://rwlabs.com/images/articles/evga/660ti_superclock/08.JPG (http://rwlabs.com/images/articles/evga/660ti_superclock/08.JPG)
Seems like this board might have a good chance :D

Edit3: Looking around led me to this spot for the resistors:
http://imgur.com/cOZg9PM (http://imgur.com/cOZg9PM)
Which seems to have a different config to the 670: http://www.ixbt.com/video3/images/ref/gtx670-scan-back.jpg (http://www.ixbt.com/video3/images/ref/gtx670-scan-back.jpg)
I admit I'm no good when it comes to hardware though, so I'm probably wrong about this... Don't really have any idea how I'd find out where they are, this was just me checking for similarity :-[

Edit4: Remembered you saying about when it's pulled in a different direction it's a different set of values...
http://imgur.com/8SKKD1w (http://imgur.com/8SKKD1w)
90% sure this is the spot now, hope you can tell us more :D
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: nhp12345 on March 22, 2013, 02:40:40 pm
Hi there,
Nice job gnif  8)
Could you show me your modified card's benchmark results with 3DS Max, 3DMark... or some game benchmarks?!  :-DMM I really want to see how it performs.
Thanks a lot and good luck in your further tweaking!  :-+
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gnif on March 22, 2013, 02:43:46 pm
Hi there,
Nice job gnif  8)
Could you show me your modified card's benchmark results with 3DS Max, 3DMark... or some game benchmarks?!  :-DMM I really want to see how it performs.
Thanks a lot and good luck in your further tweaking!  :-+

I would love to when I get some more time. The only place you may see an improvement in performance is CUDA stuff as only one core has been modified. Once the other GPU is modified the results will be much more interesting.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: amyk on March 22, 2013, 05:53:43 pm
Edit4: Remembered you saying about when it's pulled in a different direction it's a different set of values...
http://imgur.com/8SKKD1w (http://imgur.com/8SKKD1w)
90% sure this is the spot now, hope you can tell us more :D
Either by coincidence or just reuse/modification of an existing PCB, the ID resistors seem to always be next to one of the heatsink mounting holes.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gnif on March 22, 2013, 05:59:15 pm
Edit4: Remembered you saying about when it's pulled in a different direction it's a different set of values...
http://imgur.com/8SKKD1w (http://imgur.com/8SKKD1w)
90% sure this is the spot now, hope you can tell us more :D
Either by coincidence or just reuse/modification of an existing PCB, the ID resistors seem to always be next to one of the heatsink mounting holes.

Could be, but I doubt it: when pulling high the value starts at 8, not 9, and both cards should in theory have the resistor in the same location, just with a different value. It is more likely to do with the amount of RAM installed or some other configurable feature of the GPU.

Good spotting though, this resistor would now be one of my last to test if the others did not bear any results as it seems less likely to be the one for the device ID strapping.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: AlphaC on March 23, 2013, 11:26:01 am
gnif, you're awesome  :)

I have a few questions though.

Does this allow the modded card to get full performance of a Quadro in workstation applications? If yes, have you tried it on specviewperf11 and speccapc? (http://www.spec.org/benchmarks.html (http://www.spec.org/benchmarks.html))

A GK104 GTX 680 4GB is the same as a K5000  ; GK107 GTX 650 2GB is the same as the K2000. The K4000 is iffy, the GK106 GTX 650 Ti /Boost has the same CUDA Cores though it doesn't have 3GB VRAM. I believe the Boost version is the direct correlation since it has 192-bit memory bus, but it's not out yet.

GeForce GTX 650 Ti Boost    0x11C3 http://www.techpowerup.com/gpudb/2059/NVIDIA_GeForce_GTX_650_Ti_Boost.html (http://www.techpowerup.com/gpudb/2059/NVIDIA_GeForce_GTX_650_Ti_Boost.html)
GeForce GTX 650 Ti    0x11C6 http://www.techpowerup.com/gpudb/1188/NVIDIA_GeForce_GTX_650_Ti.html (http://www.techpowerup.com/gpudb/1188/NVIDIA_GeForce_GTX_650_Ti.html)
vs
Quadro K4000    0x11FA http://www.techpowerup.com/gpudb/1841/NVIDIA_Quadro_K4000.html (http://www.techpowerup.com/gpudb/1841/NVIDIA_Quadro_K4000.html)

GeForce GTX 650    0x0FC6 http://www.techpowerup.com/gpudb/894/NVIDIA_GeForce_GTX_650.html (http://www.techpowerup.com/gpudb/894/NVIDIA_GeForce_GTX_650.html)
vs
Quadro K2000    0x0FFE http://www.techpowerup.com/gpudb/1838/NVIDIA_Quadro_K2000.html (http://www.techpowerup.com/gpudb/1838/NVIDIA_Quadro_K2000.html)
or
Quadro K2000D    0x0FF9 http://www.techpowerup.com/gpudb/2021/NVIDIA_Quadro_K2000D.html (http://www.techpowerup.com/gpudb/2021/NVIDIA_Quadro_K2000D.html)


From NVIDIA's Linux Drivers, http://www.nvidia.com/object/linux-display-amd64-310.40-driver.html (http://www.nvidia.com/object/linux-display-amd64-310.40-driver.html)
http://us.download.nvidia.com/XFree86/Linux-x86_64/310.40/README/index.html (http://us.download.nvidia.com/XFree86/Linux-x86_64/310.40/README/index.html) ; http://us.download.nvidia.com/XFree86/Linux-x86_64/310.40/README/supportedchips.html (http://us.download.nvidia.com/XFree86/Linux-x86_64/310.40/README/supportedchips.html)
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gnif on March 23, 2013, 01:42:16 pm
gnif, you're awesome  :)

I have a few questions though.

Does this allow the modded card to get full performance of a Quadro in workstation applications? If yes, have you tried it on specviewperf11 and speccapc? (http://www.spec.org/benchmarks.html (http://www.spec.org/benchmarks.html))

A GK104 GTX 680 4GB is the same as a K5000; a GK107 GTX 650 2GB is the same as the K2000. The K4000 is iffy; the GK106 GTX 650 Ti/Boost has the same CUDA core count, though it doesn't have 3GB of VRAM. I believe the Boost version is the direct counterpart since it has a 192-bit memory bus, but it's not out yet.

GeForce GTX 650 Ti    0x11C3 http://www.techpowerup.com/gpudb/2059/NVIDIA_GeForce_GTX_650_Ti_Boost.html (http://www.techpowerup.com/gpudb/2059/NVIDIA_GeForce_GTX_650_Ti_Boost.html)
GeForce GTX 650 Ti    0x11C6 http://www.techpowerup.com/gpudb/1188/NVIDIA_GeForce_GTX_650_Ti.html (http://www.techpowerup.com/gpudb/1188/NVIDIA_GeForce_GTX_650_Ti.html)
vs
Quadro K4000    0x11FA http://www.techpowerup.com/gpudb/1841/NVIDIA_Quadro_K4000.html (http://www.techpowerup.com/gpudb/1841/NVIDIA_Quadro_K4000.html)

GeForce GTX 650    0x0FC6 http://www.techpowerup.com/gpudb/894/NVIDIA_GeForce_GTX_650.html (http://www.techpowerup.com/gpudb/894/NVIDIA_GeForce_GTX_650.html)
vs
Quadro K2000    0x0FFE http://www.techpowerup.com/gpudb/1838/NVIDIA_Quadro_K2000.html (http://www.techpowerup.com/gpudb/1838/NVIDIA_Quadro_K2000.html)
or
Quadro K2000D    0x0FF9 http://www.techpowerup.com/gpudb/2021/NVIDIA_Quadro_K2000D.html (http://www.techpowerup.com/gpudb/2021/NVIDIA_Quadro_K2000D.html)


From NVIDIA's Linux Drivers, http://www.nvidia.com/object/linux-display-amd64-310.40-driver.html (http://www.nvidia.com/object/linux-display-amd64-310.40-driver.html)
http://us.download.nvidia.com/XFree86/Linux-x86_64/310.40/README/index.html (http://us.download.nvidia.com/XFree86/Linux-x86_64/310.40/README/index.html) ; http://us.download.nvidia.com/XFree86/Linux-x86_64/310.40/README/supportedchips.html (http://us.download.nvidia.com/XFree86/Linux-x86_64/310.40/README/supportedchips.html)

Thanks :).

At the moment, since only GPU2 can be modded, benchmarking apps like these won't help any, since you need the Quadro as the primary card. I do know, however, that in the GTX 4xx series the conversion using the BIOS mod to change the PCI Device ID did indeed give an enormous performance gain. Until I get either a GTX 680 or another GTX 690 to work on and find the straps for, there is little more I can do. As soon as the funds have been raised and I can find the straps for GPU1, I will do a heap of benchmarking and post all the results on these forums.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: olsenn on March 25, 2013, 01:48:41 am
I just discovered that this post is being covered on Tomshardware, haha: http://www.tomshardware.com/news/Nvidia-GTX-690-Quadro-K5000,21656.html (http://www.tomshardware.com/news/Nvidia-GTX-690-Quadro-K5000,21656.html)
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: amigo on March 25, 2013, 02:23:29 am
gnif,

Truly exemplary work!

Would you please post your card's brand and full model # as obviously not all cards are based on the reference design?

I'd also like other people to post their brand/model #s as well if they've successfully done a similar mod because those of us who might be looking at the mod need to know what exactly to look for.

Thanks.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gnif on March 25, 2013, 09:18:06 am
I just discovered that this post is being covered on Tomshardware, haha: http://www.tomshardware.com/news/Nvidia-GTX-690-Quadro-K5000,21656.html (http://www.tomshardware.com/news/Nvidia-GTX-690-Quadro-K5000,21656.html)

Awesome! A little annoyed though that they did not link to this thread.
Edit: Hats off to TomsHardwareGuide for updating the article promptly with this information.

Would you please post your card's brand and full model # as obviously not all cards are based on the reference design?

It is the EVGA GeForce GTX 690 (04G-P4-2690-KR).

Does anyone know what a GTX 660 Ti would mod into, or if it's even moddable?

Please read through the thread for the answer; unless someone donates one to the cause, we will never know.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: amigo on March 25, 2013, 10:57:32 am
Does anyone know what a GTX 660 Ti would mod into, or if it's even moddable?

The GTX 660 Ti is based on the GK104, so *possibly* a Quadro K5000 or Tesla K10, as both share the same chip as the GTX.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: bdx on March 25, 2013, 11:37:00 am
Gnif,
I've been keeping up with this thread... I ran across this while researching how I might pass-through my k20 or 660 ti(s) to a virtual machine using esxi 5.1 and horizon view 5.2. I will donate a gtx 660 ti to the cause. How can I get this to you?

bdx
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: amigo on March 25, 2013, 11:56:48 am
I ran across this while researching how I might pass-through my k20 or 660 ti(s) to a virtual machine using esxi 5.1 and horizon view 5.2.

I know it's off topic but I am also interested in passing hardware directly to the VMs. You probably know this already but if not, in order to do it you need to have a vt-d capable CPU and motherboard as well as BIOS support enabled.

Here's a thread of interest http://forums.mydigitallife.info/threads/33730-VT-d-enabled-motherboards-and-CPUs-for-Paravirtualization (http://forums.mydigitallife.info/threads/33730-VT-d-enabled-motherboards-and-CPUs-for-Paravirtualization)
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: bdx on March 25, 2013, 12:34:49 pm
Amigo-
Thanks for the heads up....
I have been attempting this type of configuration for a year+ now. I've had success passing through ATI cards. Concerning Nvidia, I have had success (though it is a true pain) by following the methods described here: http://communities.vmware.com/message/2036345 (http://communities.vmware.com/message/2036345)
I am currently trying to configure Horizon View 5.2 for vDGA using Nvidia cards.
If you wish to discuss this further please contact me privately per topic of this thread.
Thanks
bdx
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gnif on March 25, 2013, 01:18:42 pm
Gnif,
I've been keeping up with this thread... I ran across this while researching how I might pass-through my k20 or 660 ti(s) to a virtual machine using esxi 5.1 and horizon view 5.2. I will donate a gtx 660 ti to the cause. How can I get this to you?

bdx

Hi bdx, thanks for the vote of support, I will PM you details:

I ran across this while researching how I might pass-through my k20 or 660 ti(s) to a virtual machine using esxi 5.1 and horizon view 5.2.

I know it's off topic but I am also interested in passing hardware directly to the VMs. You probably know this already but if not, in order to do it you need to have a vt-d capable CPU and motherboard as well as BIOS support enabled.

Here's a thread of interest http://forums.mydigitallife.info/threads/33730-VT-d-enabled-motherboards-and-CPUs-for-Paravirtualization (http://forums.mydigitallife.info/threads/33730-VT-d-enabled-motherboards-and-CPUs-for-Paravirtualization)

I have done this with Xen, but like amigo, never with NVidia, only ATI. Have a look here: http://wiki.xen.org/wiki/Xen_PCI_Passthrough (http://wiki.xen.org/wiki/Xen_PCI_Passthrough). Otherwise please create a new thread, as this is off topic here (it may be off topic for this entire forum, as this forum is electronics related).
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: verybigbadboy on March 25, 2013, 07:10:07 pm
Successfully modified:
Zotac GT640 2G to NVIDIA GRID K1 (https://www.eevblog.com/forum/projects/hacking-nvidia-cards-into-their-professional-counterparts/msg213332/#msg213332)
Palit GTX 650 2G to NVIDIA GRID K1 (https://www.eevblog.com/forum/projects/hacking-nvidia-cards-into-their-professional-counterparts/msg235610/#msg235610)
SPARKLE GeForce GTS 450  1GB to Quadro 2000 (https://www.eevblog.com/forum/projects/hacking-nvidia-cards-into-their-professional-counterparts/msg210854/#msg210854)

EVGA GTX670 (https://www.eevblog.com/forum/projects/hacking-nvidia-cards-into-their-professional-counterparts/msg210798/#msg210798) by blanka (https://www.eevblog.com/forum/index.php?action=profile;u=66164)
EVGA GTX670 good pictures (https://www.eevblog.com/forum/projects/hacking-nvidia-cards-into-their-professional-counterparts/msg217534/#msg217534) by shlomo.m (https://www.eevblog.com/forum/index.php?action=profile;u=71518)




Hello,

I managed to find the resistors responsible for the PCI ID on the GTX 680 2GB graphics card (GV-N680OC-2GD), Device ID: 10DE 1180

Below you can find a list of IDs that I ran successfully:
gtx 670, Device Id: 10DE 1189 with 1536 cores.
tesla k10, Device Id: 10DE 118F
quadro k5000, Device Id: 10DE 11BA
vgx grid k2, Device Id: 10DE 11BF
not tested:
gtx770 Device Id 0x1184
gtx660 ti Device Id 0x1183

Also, I succeeded in running the driver on every modification.
My goal was to be able to do GPU passthrough, and I achieved it. Only the VGX GRID K2 and Tesla K10 support this technology. The Quadro K5000 works too, but only until I reboot the virtual machine for the first time.
I ran GPU passthrough using KVM with a bit of mouse clicking in virt-manager. My system config is an i5-3570, GA-Z77MX-D3H + GPU.
Moreover, I tested stability on a virtual machine with Windows 7 using FurMark. Everything works perfectly; the virtual machine can be rebooted as many times as you want without rebooting the host.
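
For anyone who would rather script the passthrough than click through virt-manager, the "Add Hardware -> PCI Host Device" step boils down to attaching a hostdev element to the guest definition. Below is only a rough sketch of the same thing via the libvirt Python bindings; the guest name and PCI address are made-up examples, not anything taken from this post.

Code:
# Hypothetical sketch only: attach a PCI GPU (assumed to sit at 01:00.0) to an
# existing KVM/libvirt guest (assumed name "win7"), i.e. the scripted
# equivalent of the virt-manager clicks described above.
import libvirt

HOSTDEV_XML = """
<hostdev mode='subsystem' type='pci' managed='yes'>
  <source>
    <address domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
  </source>
</hostdev>
"""

conn = libvirt.open('qemu:///system')    # connect to the local system hypervisor
dom = conn.lookupByName('win7')          # guest name is an assumption
# Persist the device in the guest config; it is picked up on the next boot.
dom.attachDeviceFlags(HOSTDEV_XML, libvirt.VIR_DOMAIN_AFFECT_CONFIG)
conn.close()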

Here are a couple of tests:
Tesla on the virtual machine:  file: https://www.eevblog.com/forum/projects/hacking-nvidia-cards-into-their-professional-counterparts/?action=dlattach;attach=42263 (https://www.eevblog.com/forum/projects/hacking-nvidia-cards-into-their-professional-counterparts/?action=dlattach;attach=42263)
Tesla on the host: file: https://www.eevblog.com/forum/projects/hacking-nvidia-cards-into-their-professional-counterparts/?action=dlattach;attach=42265 (https://www.eevblog.com/forum/projects/hacking-nvidia-cards-into-their-professional-counterparts/?action=dlattach;attach=42265)

NVIDIA Quadro K5000 http://www.ozone3d.net/benchmarks/furmark_score_180.php?id=fe897bb5eef07ca3e8265832340895df (http://www.ozone3d.net/benchmarks/furmark_score_180.php?id=fe897bb5eef07ca3e8265832340895df)
NVIDIA GRID K2 http://www.ozone3d.net/benchmarks/furmark_score_180.php?id=0c875302f22cf53d2d58faad5eb65a1e (http://www.ozone3d.net/benchmarks/furmark_score_180.php?id=0c875302f22cf53d2d58faad5eb65a1e)

And here comes the most exciting part: the resistor locations
(https://www.eevblog.com/forum/projects/hacking-nvidia-cards-into-their-professional-counterparts/?action=dlattach;attach=43392;image)

Let's consider an example of the modification.

Initially the device ID is 1180:
third symbol = 8
4th symbol = 0

Resistor 1 is responsible for the third symbol. Initially it's 25k on the GTX 680. I tried replacing it with 10k and got A; deleting the resistor gives B. Works, but in a weird way :).
Resistor 2 is responsible for the 4th symbol, range 8-F. Tested values: 10k = 9, 15k = A.
Resistor 3 is responsible for the 4th symbol, range 0-7. It is originally 5k on the GTX 680.
If you use the second resistor, the third one has to be removed, and vice versa.

Summary
GPU Name       | Resistor 0 (3rd symbol) | Resistor 1 (3rd symbol) | Resistor 2 (4th symbol, 8-F) | Resistor 3 (4th symbol, 0-7)
GTX 660 Ti     | none | 25k  | none | 20k
GTX 670        | none | 25k  | 10k  | none
GTX 680        | none | 25k  | none | 5k
GTX 770        | none | 25k  | none | 25k
Tesla K10      | none | 25k  | 40k  | none
Quadro K5000   | 40k  | none | 15k  | none
GRID K2        | 40k  | none | 40k  | none
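
Purely as a convenience, the summary above can also be turned into a quick lookup to see exactly which pads change between any two configurations. This is just the table transcribed as posted; nothing in it is any more verified here than it is above.

Code:
# Resistor cheat sheet transcribed from the summary table above.
# None means the pad is left unpopulated.
STRAPS = {
    #  target        (R0,    R1,    R2,    R3)
    "GTX 660 Ti":   (None,  "25k", None,  "20k"),
    "GTX 670":      (None,  "25k", "10k", None),
    "GTX 680":      (None,  "25k", None,  "5k"),
    "GTX 770":      (None,  "25k", None,  "25k"),
    "Tesla K10":    (None,  "25k", "40k", None),
    "Quadro K5000": ("40k", None,  "15k", None),
    "GRID K2":      ("40k", None,  "40k", None),
}

def diff(current, target):
    """Print which of the four strap resistors change between two configurations."""
    for i, (a, b) in enumerate(zip(STRAPS[current], STRAPS[target])):
        if a != b:
            print(f"Resistor {i}: {a or 'none'} -> {b or 'none'}")

diff("GTX 680", "Quadro K5000")
# Resistor 0: none -> 40k
# Resistor 1: 25k -> none
# Resistor 2: none -> 15k
# Resistor 3: 5k -> none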

upd:
removing resistor 1 may cause random ID changes after reboot :) I will update post after i solve it.
upd April 07, 2013, 07:15:34 AM:
status: testing, works fine, tested about 10 reboots and poweroffs.
upd April 09, 2013, 05:53:00 PM:
works fine, tested many reboots, looks like 40k resistor fixed issue.
April 13, 2013, 07:12:24 AM:
works fine. ;)

upd1 from gamezr2ez
Quote
I was able to successfully modify my card to a Grid K2.

Something that was interesting was I kept getting kernel panics with the 40k resistors. After some experimenting I found a stable solution for this card.

Resistor 0: None
Resistor 1: None
Resistor 2: 100k
Resistor 3: None

My card is an Asus GTX680. I know they build their own PCB layout and my PCB was slightly different than yours (different spacing, same location). I am guessing that may have something to do with it, but I am still a bit confused as to how the resistors directly affect the ID anyway. This was my first time working with SMD components so I may have messed something up, who knows? It works, that is what matters.

Thoughts:
I wonder what happens if we take a GTX 670, modify the ID to a 680, and then upload a 680 BIOS. Will it unlock cores?
You may need to compare some other resistances too.
upd: It is impossible due to the different GK104 variants:
gtx 680 gk104-400
gtx 670 gk104-325
gtx 660ti gk104-300
I think the processor is laser-cut.
Interesting: what if someone buys a GK104-400 from Alibaba ($105) and replaces a GK104-300, for example ;)

Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gnif on March 25, 2013, 07:25:27 pm
Hello,

I managed to find the resistors responsible for the PCI ID on the GTX 680 2GB graphics card, Device ID: 10DE 1180

Great work! That is one less card to try to identify :).

I wonder what happens if we take a GTX 670, modify the ID to a 680, and then upload a 680 BIOS. Will it unlock cores?

A user is kindly donating one to the cause to see what we can do with it.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: amigo on March 26, 2013, 12:46:11 am
verybigbadboy,

Great job. Please tell us the brand and exact model of the card?

I know I sound like I'm repeating myself, but there's a reason for this. Some manufacturers use reference NV designs in their boards, while others change them.

When they use reference design, we can then pinpoint exactly the location of resistors to change across multiple boards because they are all based on the same reference. :)
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: amigo on March 26, 2013, 01:08:47 am
Further to my post, let me show you what I mean:

(http://i.imgur.com/aZ9hz63.jpg)

I matched your photo composite. Can you guess what board this comes from? :)

Inno3D GTX 670.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: amigo on March 26, 2013, 01:39:11 am
Ok well why not take it all the way...

(http://i.imgur.com/ExD8pE1.jpg)

I took the Inno3D board photos and flipped the back side vertically, then matched it to the front side.

I marked the REFERENCE for alignment and then PADS where you can see the vias coming through and connecting to the other side.

This should be self explanatory, if not, basically it shows how those resistors are related to each other, front to back. I guess when they laid the board out for whatever reason they did not put them all on the same side.

I do not have any cards yet to test, but I would postulate that you can make any GK104 series (Kepler) into Quadro or Tesla or VGX. Of course the issue of the GK104 chip's fuses being burnt on GTX cards would prevent exact specification match to Quadro/Tesla/VGX. But something is better than nothing. :)
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: ErikTande on March 26, 2013, 06:15:57 am
I have a GTX 660 Ti that I would be willing to submit as a guinea pig. If someone has the ability and desire to attempt this with a GTX 660 Ti card, send me a PM. I'll ship the card as long as you ship it back :-+
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: amigo on March 26, 2013, 06:38:57 am
I have a GTX 660 Ti that I would be willing to submit as a guinea pig.   Is this something that gnif or someone else could walk me through over skype?

*edit* nevermind, it looks a little too involved for me to handle.  I would need to send the card to someone else.

If someone with the ability to attempt this wants a card, send me a PM.   I'll ship a 660 Ti as long as you ship it back  :-+

You edited your message while I was preparing an image for you...

If you have a steady hand and decent tools to move the resistors, here's where I think the resistors are on the 660 Ti (of course I could be terribly wrong :) )

Left: Quadro K5000, right: GTX 660 Ti
(http://i.imgur.com/UsHpSG5.jpg)
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: emoose on March 26, 2013, 06:51:22 am
I have a GTX 660 Ti that I would be willing to submit as a guinea pig.   Is this something that gnif or someone else could walk me through over skype?

*edit* nevermind, it looks a little too involved for me to handle.  I would need to send the card to someone else.

If someone with the ability to attempt this wants a card, send me a PM.   I'll ship a 660 Ti as long as you ship it back  :-+

You edited your message while I was preparing an image for you...

If you have a steady hand and decent tools to move the resistors, here's where I think the resistors are on the 660 Ti (of course I could be terribly wrong :) )

Left: Quadro K5000, right: GTX 660 Ti
(http://i.imgur.com/UsHpSG5.jpg)
Same place that I marked earlier... good to know I wasn't off track  :-+
Now I'm just hoping that the 670 gnif is getting uses the same board  8)

Maybe we should setup a board number -> resistor location table somewhere?
Seems a lot of different cards use the same board, would make it easier for people to see if their card is supported yet.
Board number is pretty easy to find too, most times it's written on the card (like here (http://www.ixbt.com/video3/images/ref/gtx680-scan-back.jpg) it's written as <BOARDNUMBER>.rom)
Another way to get it is looking with GPU-Z, the BIOS version field usually has it (it does for my card at least)

Does anyone know what a GTX 660 Ti would mod into, or if it's even moddable?
Well it looks like some 660 Ti's share the same board as the 670, so it might be possible to convert them over and get full use of the 256-bit memory interface... But for all we know these 660-on-670 cards might be some sort of binned hardware with 660 firmware put on to "cripple" them into not using the damaged parts :/ I'm thinking it should work though because the RAM chips themselves are 256-bit.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: amigo on March 26, 2013, 07:14:32 am
660 Ti will turn into K5000, because the board designs are exactly the same, except that the K5000 has the full-feature chip (1536/128/32 Shaders/TMUs/ROPs) instead of a possibly crippled/damaged version (1344/112/24 Shaders/TMUs/ROPs).

Perhaps the resistors on the back of the board underneath the GPU set the configuration, but we need really good close-up photos of the boards to see the differences. It's a long shot, but we've got nothing to lose. :)

My guesstimate is that the Device ID resistors are usually located on the back side in-between U504 and U505 for smaller boards (GTX 660, K5000); on the larger boards (GTX 670, 680) they are also around there as well as around U1 and Y1 on the front side.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: ErikTande on March 26, 2013, 07:27:03 am

Perhaps the resistors on the back of the board underneath the GPU set the configuration, but we need really good close-up photos of the boards to see the differences. It's a long shot, but we've got nothing to lose. :)

I'll take pics as soon as I get home from work.  It will be in about 4 hours from now.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: emoose on March 26, 2013, 07:27:12 am
660 Ti will turn into K5000, because the board designs are exactly the same, except that the K5000 has the full-feature chip (1536/128/32 Shaders/TMUs/ROPs) instead of a possibly crippled/damaged version (1344/112/24 Shaders/TMUs/ROPs).
Hmm, do we know where that limitation exists though? It seems that my 660 Ti's board is the same number as a 670s, which should mean that it uses the same kind of chip. Here's some info I was editing into my other post before I saw yours:


I'm hoping the limit is in the BIOS, so I can just reflash and swap resistors and hey presto a ghetto 670 %-B
Probably not likely but it's worth a shot...
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: amigo on March 26, 2013, 07:39:40 am
See that's the thing, unless the board numbers match, most manufacturers design their own GTX 670 boards because they are higher-end items and consumers demand better power supply design and better components.

With regards to getting the missing computing units, that's neither here nor there. The chips (chip die, silicon) used are all the same (cheaper to manufacture) but they could have factory burnt fuses (inside GPUs) that disable computing units; on-board limitations (like resistors); bios restrictions (least likely imho  because it would be a huge "fail", but I could be wrong) or simply be damaged GPUs that did not pass QA for higher boards.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: emoose on March 26, 2013, 08:06:23 am
See that's the thing, unless the board numbers match, most manufacturers design their own GTX 670 boards because they are higher-end items and consumers demand better power supply design and better components.

With regards to getting the missing computing units, that's neither here nor there. The chips (chip die, silicon) used are all the same (cheaper to manufacture) but they could have factory burnt fuses (inside GPUs) that disable computing units; on-board limitations (like resistors); bios restrictions (least likely imho  because it would be a huge "fail", but I could be wrong) or simply be damaged GPUs that did not pass QA for higher boards.

I'd figured as much, really though I'm not that interested in making the card more powerful, just hoping to get the improved memory bandwidth... The card performs pretty well until it comes to memory-intensive stuff, then it starts to make me wonder why I didn't pay the extra £50 for the 670 :(

I'd like to think that they'd handle the memory controller stuff in the BIOS, can't really see them making it dependent on e-fuses when it's all down to the number of address lanes on the board, why make it take the time to check some fuses when you know exactly how many lanes the hardware supports (or is crippled to)? That'd just be wasting space in the BIOS...

At least that's how I'd think it out, but then again I'm not a for-profit company... Only way to find out is experimenting  O0 (or if any insiders from NVIDIA feel like helping out, go ahead :P)

Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: amigo on March 26, 2013, 08:17:40 am
I think the most interesting bits are making these cards support hardware virtualization which brings the whole field down to the Earth, for all hobbyists who do not have $10k+ to spend on Nvidia VGX solutions. :)

And why should it be any other way: true innovation and progress in the last 100 years had always come from individual people (a lone nut in a garage) and not from some huge research facility attached to some corporation. The latter generally appropriate (steal? :) ) innovation, make improvements, then patent it and basically lock it away with high prices from the people it was originally intended for.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: winjet1 on March 26, 2013, 08:48:00 am
I would be extremely interested to learn how verybigbadboy was able to convert his 680 into a VGX/GRID K1 (i.e. what to solder, what resistors to use and where to get them, etc.).  That would open up a lot of things to the home-virtualization crowd.

I'd even give a little funding for an "idiot's guide to turning your 680 into a VGX".

Just sayin.........
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: amigo on March 26, 2013, 09:04:47 am
My 20040005 660Ti's BIOS seems to have code to emulate/disguise/downgrade itself to a 20040001 board internally, possibly to make drivers work properly (although gpu-z still picks up the proper number)

By the way, how do you know that your BIOS is downgrading your card, perhaps you have some pointers to look at?
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: verybigbadboy on March 26, 2013, 09:07:03 am
I would be extremely interested to learn how verybigbadboy was able to convert his 680 into a VGX/GRID K1

Just remove resistors 1 and 3, shown in the picture in my first post, and you will get a GRID K2.

Also, I am now trying to modify a GTS 450 into a Quadro 2000, but I have a problem getting the 4th symbol. I think it is possible to modify almost all NVIDIA cards which have counterparts. The GTS 450 has a similar way of setting up the device ID.

I think the GT200 series can be modified too; I looked at a dead GT240 and I think I know where the right resistors are.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: amigo on March 26, 2013, 09:10:19 am
I would be extremely interested to learn how verybigbadboy was able to convert his 680 into a VGX/GRID K1 (i.e. what to solder, what resistors to use and where to get them, etc.).  That would open up a lot of things to the home-virtualization crowd.

I'd even give a little funding for an "idiot's guide to turning your 680 into a VGX".

Just sayin.........

Both verybigbadboy and I have posted enough information for anyone to mod a card based on the reference designs.

His images referring to locations and values as well as the images I posted of GTX 670 apply to GTX 680 as well, unless you have a different board. But then again you did not specify what brand/model you have?

Surely if you come here you must know where to get SMT components? :)
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: natiss88 on March 26, 2013, 09:21:50 am
yeah...! just registered, only to follow this thread..

first of all, congratulations to gnif and others who are working on this.
as a cad user "with small budget", this thread is very interesting.

second, modding a 680 to a k5000 will be great. i've read verybigbadboy's post, but it's a little bit confusing. i personally don't need k5000 performances, i could go with a k2000 (650) or a k4000 (650 ti).

p.s.:
i can somewhat help with a quadro 600, (it's a fermi, i know, but it can be useful for someone) if someone has a gt430 i can compare the pcbs (i have a pny 430, but it's not reference pcb).

Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: amigo on March 26, 2013, 09:35:00 am
yeah...! just registered, only to follow this thread..

second, modding a 680 to a k5000 will be great. i've read verybigbadboy's post, but it's a little bit confusing. i personally don't need k5000 performances, i could go with a k2000 (650) or a k4000 (650 ti).

Welcome aboard.

You can only get K5000, if modded, because K2000 is based on GK107 and not GK104 GPU.

Further more, you will gain access to what I believe are locked-out high-end features of drivers (and/or applications) because your card will physically identify as something else.

If you compare raw specs, K5000 is clocked much lower than GTX 680 and if you choose to flash the BIOS you will probably reduce performance you have now, but perhaps gain more stability. That is one of the selling points of the high-end visualization cards.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: emoose on March 26, 2013, 09:37:06 am
My 20040005 660Ti's BIOS seems to have code to emulate/disguise/downgrade itself to a 20040001 board internally, possibly to make drivers work properly (although gpu-z still picks up the proper number)

By the way, how do you know that your BIOS is downgrading your card, perhaps you have some pointers to look at?
I'm not 100% sure about what it's doing. Comparing a 670 BIOS to my 660 Ti's showed that the part of the BIOS where the versions/board numbers/SKUs live was almost the same, but lower down in the code there was another reference to the board number, which in the 670 BIOS was 20040005 (as expected) but 20040001 in my 660 Ti's:

Where it shows the version number/board number/SKU (http://i.imgur.com/H0sKwE8.png)
Where the other reference to the board number is (http://i.imgur.com/SOPQOvZ.png)
(left is 660Ti, right is 670)

I think the second reference might be used for when the driver asks the card what board it is, although I'm not even sure if the driver actually does this  :-\
Still, I find it strange that it'd use the board number for the normal 660Ti when the BIOS itself is meant for the 20040005 model. Makes me think it might be doing more than just masquerading the board number...
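
If anyone wants to repeat this comparison on their own dump, one rough way is simply to pull the printable strings out of the ROM and look for board/project numbers. A minimal sketch follows; the filename is just an example and the patterns only cover the strings shown in this thread.

Code:
# Rough sketch: list printable ASCII strings in a VGA BIOS image and highlight
# anything that looks like a board/project number ("2004-0001") or the
# "P2004 SKU ..." sign-on line. The filename is an assumption.
import re, sys

path = sys.argv[1] if len(sys.argv) > 1 else "gtx660ti.rom"
rom = open(path, "rb").read()

# Treat any run of 6+ printable characters as a string.
for m in re.finditer(rb"[\x20-\x7e]{6,}", rom):
    s = m.group().decode("ascii")
    if re.search(r"\d{4}-\d{4}|P\d{4}\s+SKU", s):
        print(f"0x{m.start():06x}: {s}")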
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: natiss88 on March 26, 2013, 09:56:30 am

Welcome aboard.

You can only get K5000, if modded, because K2000 is based on GK107 and not GK104 GPU.

Further more, you will gain access to what I believe are locked-out high-end features of drivers (and/or applications) because your card will physically identify as something else.

If you compare raw specs, K5000 is clocked much lower than GTX 680 and if you choose to flash the BIOS you will probably reduce performance you have now, but perhaps gain more stability. That is one of the selling points of the high-end visualization cards.

hi. thank you for the reply. i know that "raw specs" are worse on quadro, but i need quadro capability. just for CAD, not other features.

now i use a 580 and a quadro 600. so, no 680 atm.
and the 600 is sometimes slow (but i don't have the money to afford a superior card).

would you kindly explain to me why the gk107 can't be modded?
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: amigo on March 26, 2013, 10:03:51 am
Where it shows the version number/board number/SKU (http://i.imgur.com/H0sKwE8.png)
Where the other reference to the board number is (http://i.imgur.com/SOPQOvZ.png)
Somehow I am not sure those numbers are actually board number/version, looking at other boards that do not have those numbers.

Have you also checked against an older/newer version of the BIOS if those numbers repeat or are different on the same model line?
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: amigo on March 26, 2013, 10:06:06 am
would you kindly explain to me why the gk107 can't be modded?
GK104 GPU is in the GTX 660ti, 670, 680, 690 models.

GK107 GPU is in the GTX 650.

Can't turn one chip into another. But you could buy a GTX 650 and *possibly* mod it to a K2000, based on this thread's findings. :)
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: emoose on March 26, 2013, 10:09:57 am
Where it shows the version number/board number/SKU (http://i.imgur.com/H0sKwE8.png)
Where the other reference to the board number is (http://i.imgur.com/SOPQOvZ.png)
Somehow I am not sure those numbers are actually board number/version, looking at other boards that do not have those numbers.

Have you also checked against an older/newer version of the BIOS if those numbers repeat or are different on the same model line?
EVGA doesn't like to release BIOS updates :( Maybe because so many of their cards use so many different boards :palm:
Other 660 Ti's use the 20040001 board number in the header, where it is in http://i.imgur.com/H0sKwE8.png (http://i.imgur.com/H0sKwE8.png)

You can see the difference here
Normal 2GB 660Ti: http://www.techpowerup.com/vgabios/127242/EVGA.GTX660Ti.2048.120910.html (http://www.techpowerup.com/vgabios/127242/EVGA.GTX660Ti.2048.120910.html)
20040005 3GB 660Ti: http://www.techpowerup.com/vgabios/127140/EVGA.GTX660Ti.3072.120806.html (http://www.techpowerup.com/vgabios/127140/EVGA.GTX660Ti.3072.120806.html)
4GB 670: http://www.techpowerup.com/vgabios/126722/EVGA.GTX670.4096.120712.html (http://www.techpowerup.com/vgabios/126722/EVGA.GTX670.4096.120712.html)

Edit: there's an extra bit in the header which I forgot to mention, just above http://i.imgur.com/H0sKwE8.png (http://i.imgur.com/H0sKwE8.png)
On 3GB 660Ti/670 it's "3GK104 P2004 SKU 0005 VGA BIOS (HWDIAG)"
On 2GB 660Ti it's "3GK104 P2004 SKU 0001 VGA BIOS (HWDIAG)"

Edit2: Just got some NVFLASH output:
Quote
NVIDIA Firmware Update Utility (Version 5.118)

Adapter: GK1xx                (10DE,1183,3842,3663) H:--:NRM B:01,PCI,D:00,F:00

The display may go *BLANK* on and off for up to 10 seconds during access to the
EEPROM depending on your display adapter and output device.

Identifying EEPROM...
EEPROM ID (C8,4012) : GD GD25Q20 2.7-3.6V 2048Kx1S, page
Reading adapter firmware image...
Image Size            : 182272 bytes
Version               : 80.04.4B.00.60
~CRC32                : 8AB5DABA
OEM String            : NVIDIA
Vendor Name           : NVIDIA Corporation
Product Name          : GK104 Board - 20040005
Product Revision      : Chip Rev
Device Name(s)        : GK1xx
Board ID              : E11D
PCI ID                : 10DE-1183
Subsystem ID          : 3842-3663
Hierarchy ID          : Normal Board
Chip SKU              : 300-0
Project               : 2004-0001
CDP                   : N/A
Build Date            : 07/09/12
Modification Date     : 08/06/12
Sign-On Message       : GK104 P2004 SKU 0005 VGA BIOS (HWDIAG)
Guess we can see where that 20040001 is used: Project: 2004-0001
I got a feeling that's in there for more than just NVFLASH output though...
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: natiss88 on March 26, 2013, 10:29:41 am
would you kindly explain to me why the gk107 can't be modded?
GK104 GPU is in the GTX 660ti, 670, 680, 690 models.

GK107 GPU is in the GTX 650.

Can't turn one chip into another. But you could buy a GTX 650 and *possibly* mod it to a K2000, based on this thread's findings. :)

yes, i know that... read my first post, page #7. :-)
maybe i have been a little confusing, sorry.

i'm trying to compare 650s to k2000/4000...but it's hard to find decent photos.
plus, being a "mechanic" doesn't help. :-)

if i fail, i could save € and go for a 660ti. it will be a castrated k5000, but "something is better than nothing"..
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: ErikTande on March 26, 2013, 11:31:07 am
Here's my high rez image of the back of a GeForce GTX 660 Ti:

http://www.eriktande.com/nvidia_geforce_gtx_660_ti.jpg (http://www.eriktande.com/nvidia_geforce_gtx_660_ti.jpg)
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: amigo on March 26, 2013, 12:08:21 pm
Here's my high rez image of the back of a GeForce GTX 660 Ti:

http://www.eriktande.com/nvidia_geforce_gtx_660_ti.jpg (http://www.eriktande.com/nvidia_geforce_gtx_660_ti.jpg)

It looks like a standard board for 660 Ti... http://www.ixbt.com/video3/images/ref/gtx660ti-scan-back.jpg (http://www.ixbt.com/video3/images/ref/gtx660ti-scan-back.jpg)
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: pigeond on March 26, 2013, 01:10:18 pm
Hi gnif and all,

I know it's old, but I'm wondering if it's possible to do the same for the GTX 460?

I have two GV-N460OC-1GI. I'm a Xen-er and I had been through the pain trying to get them to work with VGA passthrough. I ended up taking the easy way and bought some ATI cards instead. Would be nice to make those GTXs useful again.

I could upload some photos of the card if needed.

Thanks!
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: eos on March 26, 2013, 01:41:40 pm
I managed to find the resistors responsible for the PCI ID on the GTX 680 2GB graphics card (GV-N680OC-2GD), Device ID: 10DE 1180
I have its 670 sibling, GV-N670OC-2GD. Based on the pictures (attached), it's the same card (except for the resistors, I guess).
Since I'm interested in the same - vSGA - I'd be willing to try modding it.

Below you can find a list of IDs that I ran successfully:
gtx 670, Device Id: 10DE 1189 with 1536 cores.
tesla k10, Device Id: 10DE 118F
quadro k5000, Device Id: 10DE 11BA
vgx grid k2, Device Id: 10DE 11BF
Since the K5000 wasn't stable, I'd go with Tesla K10.
In this case just one - 4th - symbol needs to be changed: from 9 to F.
And that means only the second resistor has to be replaced (both, 9 and F are in the 8-F range covered by the second resistor).

Resistor 2 is responsible for the 4th symbol, range 8-F. Tested values: 10k = 9, 15k = A.
Resistor 3 is responsible for the 4th symbol, range 0-7. It is originally 5k on the GTX 680.
If you use the second resistor, the third one has to be removed or be 40k, and vice versa.
And here is my question:
Making the 4th symbol an F means 40K resistor (in place of the 10k, a "9" symbol)
But based on the quoted text, that is the same as no resistor, aka see what the 3rd has to say.

Do I over analyse it?

Any comments would be appreciated.


Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: verybigbadboy on March 26, 2013, 05:30:31 pm
And here is my question:
Making the 4th symbol an F means 40K resistor (in place of the 10k, a "9" symbol)
But based on the quoted text, that is the same as no resistor, aka see what the 3rd has to say.

Do I over analyse it?

Any comments would be appreciated.

I just removed 2 and 3 to get F symbol.

you may also remove them, or you may try to put 40k  "in place of the 10k, a "9" symbol"
I think there is no difference.

Looks like "F" is default value for 4 symbol.
and "B" is default  value for 3 symbol.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: nindza on March 26, 2013, 08:26:31 pm
Hi, guys. I'm admiring your work. And since I'm a complete rookie - what do you think I can do with my GTX 660 2GB that has a GK106 GPU? Does that mean I can maybe mod it to a Quadro K4000, also GK106? The device ID is 10DE-11C0 and the Quadro K4000 ID is xxxx-11FA. Thanks for your thoughts!
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: natiss88 on March 26, 2013, 09:22:05 pm
Hi, guys. I'm admiring your work. And since I'm a complete rookie - what do you think I can do with my GTX 660 2GB that has a GK106 GPU? Does that mean I can maybe mod it to a Quadro K4000, also GK106? The device ID is 10DE-11C0 and the Quadro K4000 ID is xxxx-11FA. Thanks for your thoughts!

yes, maybe..
but quadro k4000 is more like 650Ti boost than 660 (same gk106, but different specs).

you can see : http://en.wikipedia.org/wiki/Comparison_of_Nvidia_graphics_processing_units (http://en.wikipedia.org/wiki/Comparison_of_Nvidia_graphics_processing_units) with nvidia card comparison.
it's very well done.

p.s.: i've seen a 660 pcb, but i couldn't find the "resistor pattern" i've seen in gk104 based card, but i'm a rookie as you. maybe it's in the front http://images.anandtech.com/doci/6276/GTX660PCB.jpg (http://images.anandtech.com/doci/6276/GTX660PCB.jpg) right to the red dot (lower left angle of gpu core)
i think that we have to wait if 660ti -> k5000 works well..
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: nindza on March 26, 2013, 10:06:02 pm
Hi, guys. I'm admiring your work. And since I'm a complete rookie - what do you think I can do with my GTX 660 2GB that has a GK106 GPU? Does that mean I can maybe mod it to a Quadro K4000, also GK106? The device ID is 10DE-11C0 and the Quadro K4000 ID is xxxx-11FA. Thanks for your thoughts!

yes, maybe..
but quadro k4000 is more like 650Ti boost than 660 (same gk106, but different specs).

you can see : http://en.wikipedia.org/wiki/Comparison_of_Nvidia_graphics_processing_units, (http://en.wikipedia.org/wiki/Comparison_of_Nvidia_graphics_processing_units,) with nvidia card comparison.
it's very well done.

p.s.: i've seen a 660 pcb, but i couldn't find the "resistor pattern" i've seen in gk104 based card, but i'm a rookie as you. maybe it's in the front http://images.anandtech.com/doci/6276/GTX660PCB.jpg, (http://images.anandtech.com/doci/6276/GTX660PCB.jpg,) right to the red dot (lower left angle of gpu core)
i think that we have to wait if 660ti -> k5000 works well..

Thank you, yes, we should be patient. I can't seem to open your pic link, could you update it please? tnx
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: moisyes on March 26, 2013, 11:05:34 pm
I would be extremely interested to learn how verybigbadboy was able to convert his 680 into a VGX/GRID K1 (i.e. what to solder, what resistors to use and where to get them, etc.).  That would open up a lot of things to the home-virtualization crowd.

I'd even give a little funding for an "idiot's guide to turning your 680 into a VGX".

Just sayin.........

Both verybigbadboy and I have posted enough information for anyone to mod a card based on the reference designs.

His images referring to locations and values as well as the images I posted of GTX 670 apply to GTX 680 as well, unless you have a different board. But then again you did not specify what brand/model you have?

Surely if you come here you must know where to get SMT components? :)
Would you have a Quadro 2000 image? I own a GTS 450 2GB and I want to hardmod it into a Quadro. If you can help me with it in any way, thank you so much.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: natiss88 on March 26, 2013, 11:33:37 pm
Hi, guys. I'm admiring your work. And since I'm a complete rookie - what do you think I can do with my GTX 660 2GB that has a GK106 GPU? Does that mean I can maybe mod it to a Quadro K4000, also GK106? The device ID is 10DE-11C0 and the Quadro K4000 ID is xxxx-11FA. Thanks for your thoughts!

yes, maybe..
but quadro k4000 is more like 650Ti boost than 660 (same gk106, but different specs).

you can see : http://en.wikipedia.org/wiki/Comparison_of_Nvidia_graphics_processing_units, (http://en.wikipedia.org/wiki/Comparison_of_Nvidia_graphics_processing_units,) with nvidia card comparison.
it's very well done.

p.s.: i've seen a 660 pcb, but i couldn't find the "resistor pattern" i've seen in gk104 based card, but i'm a rookie as you. maybe it's in the front http://images.anandtech.com/doci/6276/GTX660PCB.jpg, (http://images.anandtech.com/doci/6276/GTX660PCB.jpg,) right to the red dot (lower left angle of gpu core)
i think that we have to wait if 660ti -> k5000 works well..

Thank you, yes, we should be patient. I can't seem to open your pic link, could you update it please? tnx

the links are ok now.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: amigo on March 27, 2013, 12:14:06 am
Would you have a Quadro 2000 image? I own a GTS 450 2GB and I want to hardmod it into a Quadro. If you can help me with it in any way, thank you so much.
Maybe this?
http://h20000.www2.hp.com/bizsupport/TechSupport/SoftwareDescription.jsp?lang=en&cc=us&prodTypeId=0&prodSeriesId=3718668&prodNameId=3718669&swEnvOID=4060&swLang=13&mode=2&swItem=wk-104548-1 (http://h20000.www2.hp.com/bizsupport/TechSupport/SoftwareDescription.jsp?lang=en&cc=us&prodTypeId=0&prodSeriesId=3718668&prodNameId=3718669&swEnvOID=4060&swLang=13&mode=2&swItem=wk-104548-1)

Although it seems GTS 450 is a GF106 only in OEM, v2 and v3 are GF116...
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: eos on March 27, 2013, 12:34:52 am
I just removed 2 and 3 to get F symbol.

you may also remove them, or you may try to put 40k  "in place of the 10k, a "9" symbol"
I think there is no difference.

Looks like "F" is default value for 4 symbol.
and "B" is default  value for 3 symbol.
Thank you, verybigbadboy.

Will start looking for the soldering iron...:)

Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: victorngcm on March 27, 2013, 01:20:17 am
And here is my question:
Making the 4th symbol an F means 40K resistor (in place of the 10k, a "9" symbol)
But based on the quoted text, that is the same as no resistor, aka see what the 3rd has to say.

Do I over analyse it?

Any comments would be appreciated.

I just removed 2 and 3 to get F symbol.

you may also remove them, or you may try to put 40k  "in place of the 10k, a "9" symbol"
I think there is no difference.

Looks like "F" is default value for 4 symbol.
and "B" is default  value for 3 symbol.

Hi,
Thanks to gnif and to you for your findings.
I would like to build a new workstation and I am going to buy two GTX 680s and mod one of them to a Quadro K5000 and one to a Tesla K10.
Before this I would like to confirm the modding, as two of them may cost over a thousand dollars.

For a 680 to K10....ID "1180" to "118F"
Only one symbol has to be changed, and you said only removing resistors 2 and 3 is OK... no change to resistor 1, right?

For a 680 to K5000....ID "1180" to "11BA"
To get the third symbol from "8" to "B"....I need to change the no. 1 resistor to 20K, right?
And for the Symbol "0" to A....
Sorry for this one... I don't understand... Forgive my foolishness... Would you mind telling me how to do that?

Thanks very much
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: winjet1 on March 27, 2013, 01:29:57 am
I would be extremely interested to learn how verybigbadboy was able to convert his 680 into a VGX/GRID K1 (i.e. what to solder, what resistors to use and where to get them, etc.).  That would open up a lot of things to the home-virtualization crowd.

I'd even give a little funding for an "idiot's guide to turning your 680 into a VGX".

Just sayin.........

Both verybigbadboy and I have posted enough information for anyone to mod a card based on the reference designs.

His images referring to locations and values as well as the images I posted of GTX 670 apply to GTX 680 as well, unless you have a different board. But then again you did not specify what brand/model you have?

Surely if you come here you must know where to get SMT components? :)


In being completely truthful, modding a graphics card into a virtualized graphics card is what caught my eye.  Aside from basic knowledge all my mental stock is in IT (which is why I asked very basic questions) :)  I am extremely grateful for the instructions thus far.  I'm sure this thread will attract a lot of IT, VMware, and CAD enthusiasts.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: amigo on March 27, 2013, 01:31:21 am
For a 680 to K10....ID "1180" to "118F"
Only one symbol has to be changed, and you said only removing resistors 2 and 3 is OK... no change to resistor 1, right?

For a 680 to K5000....ID "1180" to "11BA"
To get the third symbol from "8" to "B"....I need to change the no. 1 resistor to 20K, right?
And for the Symbol "0" to A....

I believe you need to put a 15K for resistor 2.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: amigo on March 27, 2013, 01:33:22 am
In being completely truthful, modding a graphics card into a virtualized graphics card is what caught my eye.  Aside from basic knowledge all my mental stock is in IT (which is why I asked very basic questions) :)  I am extremely grateful for the instructions thus far.  I'm sure this thread will attract a lot of IT, VMware, and CAD enthusiasts.
I just wonder if there will be a backlash from Nvidia if people start selling modded versions of cards.

I don't think they care if a few enthusiasts mod their card or not, but when someone commercializes on it they would probably take notice.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: winjet1 on March 27, 2013, 01:46:06 am
In being completely truthful, modding a graphics card into a virtualized graphics card is what caught my eye.  Aside from basic knowledge all my mental stock is in IT (which is why I asked very basic questions) :)  I am extremely grateful for the instructions thus far.  I'm sure this thread will attract a lot of IT, VMware, and CAD enthusiasts.
I just wonder if there will be a backlash from Nvidia if people start selling modded versions of cards.

I don't think they care if a few enthusiasts mod their card or not, but when someone commercializes on it they would probably take notice.

Seems similar to the situation where cellular companies would lock down the bootloaders of smartphones preventing people from unlocking extra software features and modders created ROMs to unlock them.  It didn't prevent people from buying the high-end phones, and it made people who weren't going to buy, purchase a mid-range phone upgrade due to that fact.

I was never going to be able to afford a $2k virtualized graphics card.  I didn't plan on buying a $500 card from Nvidia this week either, but now they are up at least one high-end card purchase.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: moisyes on March 27, 2013, 04:12:07 am
Would you have a Quadro 2000 image? I own a GTS 450 2GB and I want to hardmod it into a Quadro. If you can help me with it in any way, thank you so much.
Maybe this?
http://h20000.www2.hp.com/bizsupport/TechSupport/SoftwareDescription.jsp?lang=en&cc=us&prodTypeId=0&prodSeriesId=3718668&prodNameId=3718669&swEnvOID=4060&swLang=13&mode=2&swItem=wk-104548-1 (http://h20000.www2.hp.com/bizsupport/TechSupport/SoftwareDescription.jsp?lang=en&cc=us&prodTypeId=0&prodSeriesId=3718668&prodNameId=3718669&swEnvOID=4060&swLang=13&mode=2&swItem=wk-104548-1)
Thank you, Amigo.

Although it seems GTS 450 is a GF106 only in OEM, v2 and v3 are GF116...
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: eos on March 27, 2013, 06:04:44 am
I just wonder if there will be a backlash from Nvidia...
Of course there will. I think the logic will change.
But most likely not before the next generation cards arrive.

Selling modded cards will be simply illegal...
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: amigo on March 27, 2013, 06:17:57 am
Selling modded cards will be simply illegal...

You obviously haven't checked eBay then... :D
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: natiss88 on March 27, 2013, 07:02:29 am
I just wonder if there will be a backlash from Nvidia...
Of course there will. I think the logic will change.
But most likely not before the next generation cards arrive.


Selling modded cards will be simply illegal...

i think they will be forced to modify the gpu core to completely avoid mods.. but it will be expensive.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: eos on March 27, 2013, 10:56:22 am
i think they will be forced to modify the gpu core to completely avoid mods.. but it will be expensive.
I don't think so.
Even Intel, owning all its fabs, doesn't do that.
They will make modding harder and bricking easier... That will do it.

You obviously haven't checked eBay then... :D
I meant as a business model.
For example: building a business on selling Hackintoshes...
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: eos on March 27, 2013, 02:43:57 pm
Looks like "F" is default value for 4 symbol.
and "B" is default  value for 3 symbol.
In other words, if I remove the resistors in all positions - 1, 2 and 3 - I'll get an 11BF part, aka a GRID K2.

Now my last question: does going GRID (or Tesla) disable the outputs?
Or will that happen only after the appropriate BIOS is installed (if it can be installed)?

Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: InitialDriveGTR on March 27, 2013, 05:33:01 pm
I've never been shy of doing hardware mods as easy as changing out resistors. Anyways, I just turned my EVGA GTX 670 FTW into a K10... I simply removed resistor 2 as per verbigbadboy. Will post back in a bit on the question of whether or not those cores got enabled. I still need a K10 BIOS, so if anyone with access to a real K10, it would be a huge help if you could share that, as I highly doubt I will find it on the internet...

Edit: used GPU-Z

(http://i324.photobucket.com/albums/k359/InitialDriveGTR/k10.gif)

Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: natiss88 on March 27, 2013, 08:19:13 pm
i think they will be forced to modify the gpu core to completely avoid mods.. but it will be expensive.
I don't think so.
Even Intel, owning all its fabs, doesn't do that.
They will make modding harder and bricking easier... That will do it.

what i intended is exactly what intel does to differentiate cpus..
for example, 3770-3770k.
they can make the same silicon, but something changes inside.

we only have to wait and see...
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: natiss88 on March 27, 2013, 08:23:15 pm
... I just turned my EVGA GTX 670 FTW into a K10 ...

glad to know that everything worked for you.
have you tried some benchmarks/tests to see if you gain something?
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: verybigbadboy on March 27, 2013, 09:28:46 pm
Looks like "F" is default value for 4 symbol.
and "B" is default  value for 3 symbol.
In other words, if I remove the resistors in all positions - 1, 2 and 3 - I'll get an 11BF part, aka a GRID K2.

Now my last question: does going GRID (or Tesla) disable the outputs?
Or will that happen only after the appropriate BIOS is installed (if it can be installed)?
Hi,
Tesla disables the outputs.
GRID does not disable the outputs.
I have not tried to change the BIOS.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: amigo on March 27, 2013, 11:20:38 pm
I've never been shy of doing hardware mods as easy as changing out resistors. Anyways, I just turned my EVGA GTX 670 FTW into a K10... I simply removed resistor 2 as per verbigbadboy. Will post back in a bit on the question of whether or not those cores got enabled. I still need a K10 BIOS, so if anyone with access to a real K10, it would be a huge help if you could share that, as I highly doubt I will find it on the internet...

Please do some tests first before replacing the BIOS because you might not get that much of a difference changing ROMs, actually it might degrade your performance due to more conservative settings for the high-end line.

I think it's the features that have become enabled that makes the difference, for example unlocking the virtualization pathway, which driver and applications check/look for.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: eos on March 28, 2013, 12:16:41 am
Hi,
Tesla disables the outputs.
GRID does not disable the outputs.
I have not tried to change the BIOS.
Thank you, verybigbadboy.

And thanks to gnif for showing us the way.

My soldering skills leave much to be desired but I just love the chance to "stick it to the man"...:)

Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: victorngcm on March 28, 2013, 12:39:22 am
I've never been shy of doing hardware mods as easy as changing out resistors. Anyways, I just turned my EVGA GTX 670 FTW into a K10... I simply removed resistor 2 as per verbigbadboy. Will post back in a bit on the question of whether or not those cores got enabled. I still need a K10 BIOS, so if anyone with access to a real K10, it would be a huge help if you could share that, as I highly doubt I will find it on the internet...

Edit: used GPU-Z

(http://i324.photobucket.com/albums/k359/InitialDriveGTR/k10.gif)

Yeah ~~
I can only find the K5000's BIOS... is this one correct?
http://www.techpowerup.com/vgabios/129867/NVIDIA.QuadroK5000.4096.120817.html (http://www.techpowerup.com/vgabios/129867/NVIDIA.QuadroK5000.4096.120817.html)
Then use NVFlash to flash the BIOS onto the card?

I am still working on finding a K10 one...
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gnif on March 28, 2013, 12:45:32 am
I've never been shy of doing hardware mods as easy as changing out resistors. Anyway, I just turned my EVGA GTX 670 FTW into a K10... I simply removed resistor 2 as per verybigbadboy. Will post back in a bit on the question of whether or not those cores got enabled. I still need a K10 BIOS, so if anyone with access to a real K10 could share it, that would be a huge help, as I highly doubt I will find it on the internet...

Edit: used GPU-Z

(http://i324.photobucket.com/albums/k359/InitialDriveGTR/k10.gif)

Yeah ~~
I can only find the K5000's BIOS... is this one correct?
http://www.techpowerup.com/vgabios/129867/NVIDIA.QuadroK5000.4096.120817.html (http://www.techpowerup.com/vgabios/129867/NVIDIA.QuadroK5000.4096.120817.html)
Then use NVFlash to flash the BIOS onto the card?

I'm still working on finding a K10 BIOS...

You are much better off getting your original BIOS and using a hex editor to update its device ID, then using the KGB voltage mod tool to fix the checksum; don't bother with the voltage mod stuff. I highly doubt that the BIOS controls the number of cores available; this will either be another hardware strap or burnt-out fuses in the GPU.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: winjet1 on March 28, 2013, 01:13:28 am
I've never been shy of doing hardware mods as easy as changing out resistors. Anyway, I just turned my EVGA GTX 670 FTW into a K10... I simply removed resistor 2 as per verybigbadboy. Will post back in a bit on the question of whether or not those cores got enabled. I still need a K10 BIOS, so if anyone with access to a real K10 could share it, that would be a huge help, as I highly doubt I will find it on the internet...

Please do some tests before replacing the BIOS, because you might not get much of a difference changing ROMs; actually it might degrade your performance due to the more conservative settings of the high-end line.

I think it's the features that become enabled that make the difference, for example unlocking the virtualization pathway, which the driver and applications check for.

So the hardware mod opens up those virtualization pathways, not a BIOS re-flash (or edit)?
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: victorngcm on March 28, 2013, 01:18:24 am
I've never been shy of doing hardware mods as easy as changing out resistors. Anyway, I just turned my EVGA GTX 670 FTW into a K10... I simply removed resistor 2 as per verybigbadboy. Will post back in a bit on the question of whether or not those cores got enabled. I still need a K10 BIOS, so if anyone with access to a real K10 could share it, that would be a huge help, as I highly doubt I will find it on the internet...

Edit: used GPU-Z



Yeah ~~
I can only find the K5000's BIOS... is this one correct?
http://www.techpowerup.com/vgabios/129867/NVIDIA.QuadroK5000.4096.120817.html (http://www.techpowerup.com/vgabios/129867/NVIDIA.QuadroK5000.4096.120817.html)
Then use NVFlash to flash the BIOS onto the card?

I'm still working on finding a K10 BIOS...

You are much better off getting your original BIOS and using a hex editor to update its device ID, then using the KGB voltage mod tool to fix the checksum; don't bother with the voltage mod stuff. I highly doubt that the BIOS controls the number of cores available; this will either be another hardware strap or burnt-out fuses in the GPU.

gnif, thanks for your inspiration!!
I understand what you are saying about the BIOS, but a hex editor and the KGB voltage mod tool are beyond my knowledge... Seems I may need to stop here and wait for the experts |O

However, if I use a GTX 680 4GB and hard-mod it to a K5000, the hardware config is the same. Would I get the Quadro functions after I install the K5000 driver?
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gnif on March 28, 2013, 01:27:41 am
gnif, thanks for your inspiration!!
I understand what you are saying about the BIOS, but a hex editor and the KGB voltage mod tool are beyond my knowledge... Seems I may need to stop here and wait for the experts |O

However, if I use a GTX 680 4GB and hard-mod it to a K5000, the hardware config is the same. Would I get the Quadro functions after I install the K5000 driver?

No worries. Yes, that is the entire point of this mod.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: amigo on March 28, 2013, 01:44:52 am
I understand what you are saying about the BIOS, but a hex editor and the KGB voltage mod tool are beyond my knowledge... Seems I may need to stop here and wait for the experts |O

(http://i.imgur.com/K2cetme.jpg)

I've highlighted for you in blue the portion of the ROM file (in this case an EVGA GTX 680 4GB) that contains the Device ID to be changed. The bytes are in little-endian order (least significant byte first, as on Intel platforms), so DE 10 80 11 translates to 10DE (NV Vendor ID) 1180 (GTX 680 Device ID).

The red highlight just shows the beginning of the actual VBIOS image; the sequence 55 AA is the header, so you know you are in the right section, besides seeing all the text around there, too. :)

Once you've done the editing (any hex editor will do, e.g. HxD), get the KGB tool (https://www.dropbox.com/s/fsxyvofr1idazhm/kgb_0.6.2.zip (https://www.dropbox.com/s/fsxyvofr1idazhm/kgb_0.6.2.zip)) to fix the new ROM image's checksum (very important).

Also, always remember to back up your original ROM image first. :)
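
For anyone who would rather script this than poke around in a hex editor by hand, here is a rough Python sketch of the same byte swap. The file names are just examples, it assumes a ROM you have already saved with nvflash, and it only patches the ID bytes - you still need to fix the checksum with KGB or Kepler BIOS Tweaker exactly as described above.

[code]
#!/usr/bin/env python3
# Rough sketch of the edit described above: find the little-endian
# vendor/device ID bytes (DE 10 80 11 = 10DE:1180, GTX 680) in a saved ROM
# dump and rewrite the device ID to the K5000's 11BA. File names are
# examples only; save your own ROM first and keep the backup safe.

OLD_ID = bytes.fromhex("DE108011")   # 10DE:1180, GTX 680
NEW_ID = bytes.fromhex("DE10BA11")   # 10DE:11BA, Quadro K5000

with open("gtx680_backup.rom", "rb") as f:
    rom = bytearray(f.read())

# The actual VBIOS image starts at the 55 AA signature mentioned above.
start = rom.find(b"\x55\xAA")
if start < 0:
    raise SystemExit("No 55 AA signature found - is this really a ROM dump?")

hits = rom.count(OLD_ID)
print(f"VBIOS image starts at offset 0x{start:X}, found {hits} ID occurrence(s)")

patched = rom.replace(OLD_ID, NEW_ID)

with open("gtx680_as_k5000.rom", "wb") as f:
    f.write(patched)

print("Patched ROM written - now fix the checksum with KGB / Kepler BIOS Tweaker")
[/code]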
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: victorngcm on March 28, 2013, 02:08:26 am
I understand what you are saying about the BIOS, but a hex editor and the KGB voltage mod tool are beyond my knowledge... Seems I may need to stop here and wait for the experts |O

(http://i.imgur.com/K2cetme.jpg)

I've highlighted for you in blue the portion of the ROM file (in this case an EVGA GTX 680 4GB) that contains the Device ID to be changed. The bytes are in little-endian order (least significant byte first, as on Intel platforms), so DE 10 80 11 translates to 10DE (NV Vendor ID) 1180 (GTX 680 Device ID).

The red highlight just shows the beginning of the actual VBIOS image; the sequence 55 AA is the header, so you know you are in the right section, besides seeing all the text around there, too. :)

Once you've done the editing (any hex editor will do, e.g. HxD), get the KGB tool (https://www.dropbox.com/s/fsxyvofr1idazhm/kgb_0.6.2.zip (https://www.dropbox.com/s/fsxyvofr1idazhm/kgb_0.6.2.zip)) to fix the new ROM image's checksum (very important).

Also, always remember to back up your original ROM image first. :)

Woooo!!!
I feel so small in all this... You guys are awesome.
So, should I change the "DE 10 80 11" of the GTX 680 to "DE 10 BA 11", which is the K5000's ID?
What else do I need to do? Sorry, I am just a newbie at this.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: amigo on March 28, 2013, 02:20:27 am
So, should I change the "DE 10 80 11" of the GTX 680 to "DE 10 BA 11", which is the K5000's ID?
What else do I need to do? Sorry, I am just a newbie at this.

Correct. You need to make sure your checksum is valid before flashing. Even if you work it out manually (256 possibilities), you still need to verify it in a BIOS editor like KGB, Kepler BIOS Editor (http://rghost.net/43828722 (http://rghost.net/43828722)) or Kepler BIOS Tweaker (http://ul.to/5vuhxe60 (http://ul.to/5vuhxe60)).

Then do some tests before and after changing the BIOS. Actually, you should have run some tests before modding, too.

The real credit goes to gnif and verybigbadboy for the original disclosure of their findings; the rest is application of existing knowledge. :)
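
For the curious, the "256 possibilities" above refers to the fact that a legacy PCI option ROM is considered valid when all of its bytes sum to 0 modulo 256, so only one byte ever needs adjusting. Here is a rough, hedged sketch; treat the simple sum-to-zero rule as an assumption for these particular images, which is exactly why you should still verify the result in KGB or Kepler BIOS Tweaker.

[code]
# Generic legacy option-ROM rule only: every byte of the image must sum to
# 0 mod 256, so after patching the device ID you adjust one free byte
# (classically the last byte of the image) to restore that. NVIDIA images
# may carry additional checks, hence the advice to verify with a BIOS editor.

def fix_option_rom_checksum(image: bytearray) -> bytearray:
    """Adjust the final byte so the whole image sums to 0 modulo 256."""
    image[-1] = (256 - sum(image[:-1]) % 256) % 256
    assert sum(image) % 256 == 0
    return image
[/code]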
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: eos on March 28, 2013, 02:58:33 am
Summary table...

Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gnif on March 28, 2013, 03:11:21 am
I want to make a few things clear.

YOU DO NOT need to modify your BIOS at this point in time; the driver DOES NOT check the Device ID in the BIOS. Unless NVidia smartens up and starts checking both, there is no need to do this.

To unlock your card's additional features, modifying the resistors alone is enough.

A GTX 690 is two GK104 GPUs connected via a PCIe bridge/expansion chip; they present to the computer as if you had two separate cards installed.
A Tesla K10 is identical, but has larger, ECC RAM chips.

Thus, you can mod ANY single-GPU card into a Tesla K10 if you can change its Device ID, but obviously you will only get the single device.

In theory, 2x GTX 680s in SLI converted to K10s would perform faster than a GTX 690 fully converted into a K10, as they would have less latency due to the lack of the PCIe bridge/expansion chip found on the GTX 690.

Changing the card's device ID does not make the driver try to use additional RAM on the card; the RAM amount is also configured by hardware straps on the board.

Flashing the BIOS to that of the card you turned it into is NOT a good idea, as you will start to use the memory and GPU timings contained in the other BIOS, either causing instabilities, bricking your card, or just giving a performance loss. If you insist on changing your BIOS to match the Device ID that it has been modded to, mod the BIOS on your own card as per THESE (https://www.eevblog.com/forum/projects/hacking-nvidia-cards-into-their-professional-counterparts/msg208490/#msg208490) instructions, and be sure to KEEP A BACKUP.

As a side note, the two GPUs on the GTX 690 have independent BIOSes, and they are DIFFERENT; if you decide to mod the Device ID in the BIOS you need to do each one independently.

But again I say, there is currently NO need to mod your BIOS at all.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gnif on March 28, 2013, 03:42:23 am
I would also like to post a BIG THANKS to everyone who has donated so far; we are at $300 with $700 remaining at the time of this post.

I plan to video the modification of the GTX 690 with before and after benchmarks, which will be posted on YouTube. I will also record the process of finding the hardware straps for the other GPU through deductive reasoning and simple testing that is relatively safe for the hardware. So if you want to see this, throw a little into the pool so I can get a card I can safely do this on.

Once we have all the information figured out I will trawl through this thread and compile everything into a single post, making it easy to reference.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: beaker7 on March 28, 2013, 03:46:49 am
Summary table...

K20x is PCI-E 2.0 x16.  I've got one here.  Last minute change by nVidia.

I would imagine the k10 and k20 are as well.

Hoping to turn some Titans into K20x
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: InitialDriveGTR on March 28, 2013, 04:39:28 am
Well right now I have no idea how to actually test the K10, nor to see what hardware is enabled/disabled, as GPU-Z has a lot of missing information. What resistor values would I need for a GTX 680?
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: amigo on March 28, 2013, 04:48:20 am
Well right now I have no idea how to actually test the K10, nor to see what hardware is enabled/disabled, as GPU-Z has a lot of missing information. What resistor values would I need for a GTX 680?
You can't just turn a GTX 670 into a GTX 680 (I'm presuming you are talking about your GTX 670, unless you also have a 680). You could change the Device ID, but that will not bring out the rest of the features.

As gnif and I pointed out in previous posts, the number of shading units etc. would be set up either in another hardware strap or in burnt-out fuses in the GPU. The former might be changeable, while the latter most probably is not.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gnif on March 28, 2013, 04:55:43 am
Well right now I have no idea how to actually test the K10, nor to see what hardware is enabled/disabled, as GPU-Z has a lot of missing information. What resistor values would I need for a GTX 680?

Why would you want to turn a < 680 into a 680? It will not unlock any features, nor make it faster. It would be like sticking a Ferrari badge on your bike.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: amigo on March 28, 2013, 05:02:09 am
....It would be like sticking a ferrari badge on your bike.
Those of us who used to put fancy car badges on our bicycles resent that analogy. :)
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: eos on March 28, 2013, 06:17:54 am
K20x is PCI-E 2.0 x16.  I've got one here.  Last minute change by nVidia.
Fixed.

Can you check Vendor ID and DeviceID?
May I ask what you paid for it?
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: ErikTande on March 28, 2013, 06:28:41 am
If I understand this right, I should be able to turn my GeForce GTX 660 Ti into a Quadro K5000.   Here's a picture of my exact card:

http://www.eriktande.com/nvidia_geforce_gtx_660_ti.jpg (http://www.eriktande.com/nvidia_geforce_gtx_660_ti.jpg)

Can anyone point out what exactly needs to be changed? I'm going to try to take it to a local shop and have them give it a shot, but I need to know exactly what to tell them.

 >:D
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: eos on March 28, 2013, 06:46:17 am
Can anyone point out what exactly needs to be changed?
This is the million dollar question: nobody knows (yet).

At this point only two cards (and their PCB clones) have this question answered. Everything else is guesses at best.

See the posts by gnif and verybigbadboy.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: ErikTande on March 28, 2013, 06:53:13 am
Can anyone point out what exactly needs to be changed?
This is the million dollar question: nobody knows (yet).

At this point only two cards (and their PCB clones) have this question answered. Everything else is guesses at best.

See the posts by gnif and verybigbadboy.

Ah ok, got it.    Is the only way to know by testing it out?   Cuz I'm willing to sacrifice my card.  :-+
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: eos on March 28, 2013, 06:58:52 am
Is the only way to know by testing it out?
Yes, but at this point the resistors to be modded (aka in charge of the Device ID) on this board haven't been identified yet.

If you are ready to lose the card, send it to gnif (ask him first).

Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: amyk on March 28, 2013, 07:43:26 am
If I understand this right, I should be able to turn my GeForce GTX 660 Ti into a Quadro K5000.   Here's a picture of my exact card:

http://www.eriktande.com/nvidia_geforce_gtx_660_ti.jpg (http://www.eriktande.com/nvidia_geforce_gtx_660_ti.jpg)

Can anyone point out what exactly needs to be changed? I'm going to try to take it to a local shop and have them give it a shot, but I need to know exactly what to tell them.

 >:D
Look to the right of the top right heatsink mounting hole. Same pattern there. That would be my guess.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: Mawson on March 28, 2013, 08:22:21 am
I've never been shy of doing hardware mods as easy as changing out resistors. Anyway, I just turned my EVGA GTX 670 FTW into a K10... I simply removed resistor 2 as per verybigbadboy. Will post back in a bit on the question of whether or not those cores got enabled. I still need a K10 BIOS, so if anyone with access to a real K10 could share it, that would be a huge help, as I highly doubt I will find it on the internet...

Edit: used GPU-Z

(http://i324.photobucket.com/albums/k359/InitialDriveGTR/k10.gif)

Yeah ~~
I can only find the K5000's BIOS... is this one correct?
http://www.techpowerup.com/vgabios/129867/NVIDIA.QuadroK5000.4096.120817.html (http://www.techpowerup.com/vgabios/129867/NVIDIA.QuadroK5000.4096.120817.html)
Then use NVFlash to flash the BIOS onto the card?

I'm still working on finding a K10 BIOS...

You are much better off getting your original BIOS and using a hex editor to update its device ID, then using the KGB voltage mod tool to fix the checksum; don't bother with the voltage mod stuff. I highly doubt that the BIOS controls the number of cores available; this will either be another hardware strap or burnt-out fuses in the GPU.

Hi guys, first post here! :)

I was interested in the possibility of unlocking the 670 as well, but some research on OCN has led me to believe that the chips are laser-cut during production, making it impossible.

On another note, since I am quite interested in having a 680 with a short PCB, how challenging would it be to swap the chip from a full 680 with one from a 670? And more importantly, would the end product function?
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: amigo on March 28, 2013, 08:37:24 am
On another note, since I am quite interested in having a 680 with a short pcb, how challenging would it be to swap out the chip from a full 680 with one from a 670? And more importantly, would the end product function?
Hah, that would be the ultimate hack. You would need to source the chip first, then remove the original GPU, clean the pads, reball the new chip and mount it, all in full BGA glory. Perhaps if you have access to the chip and the BGA equipment it is theoretically doable. But then there might be other resistors to adjust as well, the ROM to flash, etc.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gnif on March 28, 2013, 08:55:42 am
I was interested in the possibility of unlocking the 670 as well, but some research on OCN has led me to believe that the chips are laser-cut during production, making it impossible.

Not true: a GK104 is a GK104 is a GK104; the card (or, pure speculation, maybe fuses in the silicon) configures its identification and capabilities. These idiots who keep saying that this mod is 'luck of the draw' as to whether it works or not don't understand the technology as well as they think they do. The only confirmed difference between chips is the speed binning, which is not what this mod changes.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: InitialDriveGTR on March 28, 2013, 09:15:40 am
Well, it won't hurt to try. The way I see it, if NVidia got the silicon manufacturing process down to the point where defective units were not common enough to keep up with the demand for 670s, they may have used units that pass the qualifications for a 680 on the 670 boards. I work for an equipment design group, and most of the assembly lines we make have a production output of about 1 unit every 8 1/2 seconds, with a failure rate of about 1 unit for every ~6000 units (and every unit is tested for quality control). Some of the lines we have made are designed to change product features on the fly, usually producing enough of one product to fill the retail demand for the next quarter, then changing the settings for a version with more/fewer features. I can't say what this product is, but I assure you many of you have one, and frankly I'd be screwed if I told you that you can upgrade to the better device with a firmware flash and a couple of pins on the CPU tied low.

But the point is, I want to see if all NVidia did was disable stuff with this method for the GPUs that are 680 spec but intended for 670 boards.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: bdx on March 28, 2013, 10:25:53 am
Could someone clarify the difference between the resistor = "symbol" concept and what has been described by vbbb below, where he removes two resistors to get a symbol?
 
And here is my question:
Making the 4th symbol an F means a 40K resistor (in place of the 10k, a "9" symbol).
But based on the quoted text, that is the same as no resistor, aka see what the 3rd has to say.

Am I overanalysing it?

Any comments would be appreciated.

I just removed 2 and 3 to get the F symbol.

You may also remove them, or you may try to put 40k "in place of the 10k, a "9" symbol";
I think there is no difference.

Looks like "F" is the default value for the 4th symbol,
and "B" is the default value for the 3rd symbol.

I have a GTX 660 Ti that I would be willing to submit as a guinea pig. Is this something that gnif or someone else could walk me through over Skype?

*edit* Never mind, it looks a little too involved for me to handle. I would need to send the card to someone else.

If someone with the ability to attempt this wants a card, send me a PM. I'll ship a 660 Ti as long as you ship it back  :-+

You edited your message while I was preparing an image for you...

If you have a steady hand and decent tools to move the resistors, here's where I think the resistors are on the 660 Ti (of course I could be terribly wrong :) )

Left: Quadro K5000, right: GTX 660 Ti
(http://i.imgur.com/UsHpSG5.jpg)

Based on what I can gather from your photos of the Quadro, what's been discussed in this thread, and the resistance measurements I've taken on the EVGA GTX 660 Ti, I have a feeling the device ID is determined by how much current gets through the circuit, i.e. the magnitude of Req. Something like DEVICE ID = i = V/Req, where Req = R1+R2+R3+R4+...+Rn...

Here is a list of the resistor values from the 660 Ti mapped according to my drawings below (hopefully this will be of some use)...
R1 = 40k
R2 = 20k
R3 = 5k
R4 = 5k
R5 = 45k
R6 = 33 ohms
R7 = 33 ohms
R8 = 2k
R9 = 2k
R10 = 45k
R11 = 10k
R12 = 10k

Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: vacaloca on March 28, 2013, 10:38:59 am
K20x is PCI-E 2.0 x16.  I've got one here.  Last minute change by nVidia.

I would imagine the k10 and k20 are as well.

Hoping to turn some Titans into K20x
The K20 is PCI-E 2.0 x16, had one until I got a Titan instead, didn't really need the extra enterprise features for my purposes.

I don't want to mess with my Titan at the moment though... my soldering skills aren't that great, anyway. I could probably have one of my friends give it to a soldering tech at his company to do the work though, lol.

I'm seeing if I can get a 660 Ti relatively cheap (~$200) to convert to a K10 or K5000 in the short term, though. If I do get one, I'd be willing to make it a guinea pig -- Edit: Judging by the mapping that was just posted above, it might be figured out very quickly!
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gnif on March 28, 2013, 10:40:02 am
Could someone clarify the difference between the resistor = "symbol" concept, and what has been described by vbbb, below where he removes two resistors to get a symbol?

The terminology he used is incorrect. You are adjusting the two nibbles of the last byte of the device ID, so each resistor represents one nibble of the device ID. See the original post.

How did you measure these resistors: in-circuit, or did you remove them?

Edit: According to the values provided, R2 or R3 could be for the first nibble, as they are 5K, which matches up with '8'. For values lower than 8 we do not know, as they have not been mapped, but if the 5k per value is consistent when pulling low, I would expect it to be 20-25K, which R2 seems to fit.
Title: Re: Hacking NVidia Cards into their Professional Counterparts AKA GnifMod
Post by: coffeegeek on March 28, 2013, 11:01:16 am
Hi all,

Isn't it about time this mod gets a name? In honor of its author, I propose "GnifMod" or "ModiGnified".

Any other ideas?
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: bdx on March 28, 2013, 11:20:13 am
I took the measurements with the resistors in-circuit on the card. I know this is frowned upon, but my measured values match those of the resistors when they are detached from the board, so I assume it to be an accurate form of measurement. Would you concur?
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gnif on March 28, 2013, 11:21:56 am
I took the measurements with the resistors in-circuit on the card. I know this is frowned upon, but my measured values match those of the resistors when they are detached from the board, so I assume it to be an accurate form of measurement. Would you concur?

Better than nothing, but your measurements can be skewed by surrounding components (i.e. the GPU).
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: eos on March 28, 2013, 11:25:39 am
R1 = 40k
R2 = 20k
R3 = 5k
R4 = 5k
R5 = 45k
R6 = 33 ohms
R7 = 33 ohms
R8 = 2k
R9 = 2k
R10 = 45k
R11 = 10k
R12 = 10k
Based on those numbers, R1, R2 and R3 could be playing the roles of resistors 3, 2 and 1 in vbbb's post here
https://www.eevblog.com/forum/projects/hacking-nvidia-cards-into-their-professional-counterparts/msg207550/#msg207550 (https://www.eevblog.com/forum/projects/hacking-nvidia-cards-into-their-professional-counterparts/msg207550/#msg207550)

R1 is 40K, meaning R2 is setting the 4th symbol: 20K, a 3 (but it could be a B).
R3 defines the 3rd symbol: 5K, an 8 (but it could be a 0).

So the ID is 83 (if you have a good reason to exclude 03, 8B and 0B).
Exactly what the 660 Ti is: 1183.

Just a guess!
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gnif on March 28, 2013, 11:31:17 am
It does not make sense to me that a third resistor is involved; I do not understand why it is thought that removing the 40K resistor you mentioned is also required. The hard strap scheme will not change across the entire GK104 series.
Title: Re: Hacking NVidia Cards into their Professional Counterparts AKA GnifMod
Post by: eos on March 28, 2013, 11:36:01 am
Isn't it about time this mod gets a name? In honor of its author, I propose "GnifMod" or "ModiGnified".

Any other ideas?
How about cGNIFit, aka significant...
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: reefjunkie on March 28, 2013, 12:25:07 pm
Hi-
I'm getting ready to pull the trigger on TWO 4GB GTX 680 (Gigabyte GV-N680OC-4GD) cards and will be modding them to a K5000 & a K10.
I spend half my time gaming and running normal programs and half my time on CAD and graphics programs. From the benchmarks I have seen ( http://www.xbitlabs.com/articles/graphics/display/nvidia-quadro-k5000_8.html#sect0 (http://www.xbitlabs.com/articles/graphics/display/nvidia-quadro-k5000_8.html#sect0) ), a K5000 is 3 - 4 times as fast as a 2GB GTX 680 running CAD programs, and a 2GB GTX 680 is about 1.5 times as fast as a K5000 running games.
So, what do I do...
1.) SLI the GTX 680s together. It will be great for games but so-so for CAD.
2.) Mod both cards to a K5000 & K10 (or two K5000s). This will be blazing for CAD and rendering but so-so for games.
-OR-
3.) Build "daughter cards" for each and be able to switch the resistors between 680 and K5000 & K10. Also, should I dual-boot Win 7 and put the GTX 680 drivers and gaming programs on one install, and the Quadro drivers and CAD on the other?  :-//

If the "daughter cards" work, could I just put both the GTX 680 and Quadro drivers on the same boot partition, and will Win 7 know which ones to use depending on what I have the cards set to?

Thanks.

Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gnif on March 28, 2013, 12:52:52 pm
Hi-
I'm getting ready to pull the trigger on TWO 4GB GTX 680 (Gigabyte GV-N680OC-4GD) cards and will be modding them to a K5000 & a K10.
I spend half my time gaming and running normal programs and half my time on CAD and graphics programs. From the benchmarks I have seen ( http://www.xbitlabs.com/articles/graphics/display/nvidia-quadro-k5000_8.html#sect0 (http://www.xbitlabs.com/articles/graphics/display/nvidia-quadro-k5000_8.html#sect0) ), a K5000 is 3 - 4 times as fast as a 2GB GTX 680 running CAD programs, and a 2GB GTX 680 is about 1.5 times as fast as a K5000 running games.
So, what do I do...
1.) SLI the GTX 680s together. It will be great for games but so-so for CAD.
2.) Mod both cards to a K5000 & K10 (or two K5000s). This will be blazing for CAD and rendering but so-so for games.
-OR-
3.) Build "daughter cards" for each and be able to switch the resistors between 680 and K5000 & K10. Also, should I dual-boot Win 7 and put the GTX 680 drivers and gaming programs on one install, and the Quadro drivers and CAD on the other?  :-//

If the "daughter cards" work, could I just put both the GTX 680 and Quadro drivers on the same boot partition, and will Win 7 know which ones to use depending on what I have the cards set to?

Thanks.

Personally I would go for two K5000s. The benchmarks that show the K5000 being slower for games only do so because the real K5000 is clocked slower than the GTX 680, so by keeping them both as K5000s in SLI you will get the best of both worlds. Both cards use the same driver; there is no separate 'Quadro' driver anymore.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: natiss88 on March 28, 2013, 07:16:39 pm
gnif, just a piece of advice...

I think you should put all the "successful mod guides" on the first page,
just to impose some order and clean things up.

Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: amyk on March 28, 2013, 10:05:47 pm
I don't know if this has been mentioned before but would x-raying help to nondestructively identify which balls the ID straps are connected to?
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: verybigbadboy on March 28, 2013, 10:22:06 pm
Hi again. I am now trying to modify a Palit GTS 450 Green 1GB into a Quadro 2000.

I successfully changed the device ID, but the card does not want to start. I need help.
Can someone create high-res scans or photos of a Quadro 2000 card?

Also, I found that the relevant resistors are always placed near a SOIC-8; this IC always has pins 3, 7 and 8 connected together.

Looking at the GTX 690 pictures, I think this IC is located on the front side, between the left GPU and the PCIe connector.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: Neo_Moucha on March 28, 2013, 10:30:19 pm
I will try to make the Quadro 2000 scans tonight.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: raven on March 29, 2013, 02:06:31 am
I guess a TITAN cannot be modded into a Quadro card, only a Tesla? Not until the K6000 comes out, I would think. Also, can someone post benchmarks of the K5000 mod? :)

It would be really interesting to compare SPEC benchmarks with a K5000.

Also, what Tesla and Quadro unlock is a feature called "out of core" rendering for Iray, which is available in Nvidia Design Garage for people to test.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: natiss88 on March 29, 2013, 02:21:56 am
I guess a TITAN cannot be modded into a Quadro card, only a Tesla? Not until the K6000 comes out, I would think. Also, can someone post benchmarks of the K5000 mod? :)

It would be really interesting to compare SPEC benchmarks with a K5000.

+1
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: ray78 on March 29, 2013, 03:37:34 am
First off thanks to all contributing to this project!  :-+

file: (https://www.eevblog.com/forum/projects/hacking-nvidia-cards-into-their-professional-counterparts/?action=dlattach;attach=42267)

I just ordered a GV-N680OC-2GD. I want to set it to a K5000, so the value needed is 11BA instead of 1180. I'll have to use 15k for resistor 2 and remove resistor 3. What value will be needed for resistor 1?
Another question, without having seen the card yet... does "front" in the image mean you have to remove the fan, or is it located somewhere easily accessible?
Thanks
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: eos on March 29, 2013, 03:56:52 am
... does "front" in the image mean you have to remove the fan
Yes.

(http://www.overclockers.ru/images/lab/2012/10/21/1/05_big.jpg)
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: verybigbadboy on March 29, 2013, 05:00:06 am
I just ordered a GV-N680OC-2GD. I want to set it to a K5000, so the value needed is 11BA instead of 1180. I'll have to use 15k for resistor 2 and remove resistor 3. What value will be needed for resistor 1?

Hello ray78,

I have added it to the summary table in my first post.
Summary:
GPU Name       Resistor 1   Resistor 2   Resistor 3
GTX 670        25k          10k          none
GTX 680        25k          none         5k
Tesla K10      25k          none         none
Quadro K5000   none         15k          none
GRID K2        none         none         none
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: Neo_Moucha on March 29, 2013, 05:51:27 am
OK, for all who are interested in hi-res pictures of the Quadro 2000 card, here they are:

www.moucha.net/temp/Quadro2000.7z (http://www.moucha.net/temp/Quadro2000.7z)

Please note that it is 61.3 MB - two hi-res TIFF photos, front and back.

As soon as everyone who asked for it has downloaded it, I will delete it.

Enjoy.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: ray78 on March 29, 2013, 06:10:50 am
@verybigbadboy and eos: great, and thanks a lot  :-+
Okay... I'm a little bit of a noob :-DD but I just searched for a 15k resistor and asked myself which power rating I should choose?
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: mungewell on March 29, 2013, 07:07:40 am
It might be useful to 'correct' the terminology regarding the resistor naming, to make it clear which resistor is for which high/low nibble...

In terms of performing the mod, it's easiest just to remove a resistor and not have to solder anything down. Is there any problem with just making everything a GRID K2?

Do we yet have confirmation that the 660 Ti or 670 will work with the driver when 'told' to report a different PCI ID? (Asking as the number of cores etc. is different from what they spoof... I assume that the amount of memory is actually reported/probed, so that would not be an issue.)

Simon
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: eos on March 29, 2013, 07:29:01 am
Is there any problem with just making everything a GRID K2?
Not really.
It's just that the GRID is the least known/used flavor of GK104,
and likely the most expensive (if it is even sold to anybody but server makers).
Future driver updates might treat it differently...
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: InitialDriveGTR on March 29, 2013, 10:05:42 am
I have my GTX 670 FTW set up as a K2 right now; the video outputs work, but unfortunately the disabled cores remain disabled  :-\

But:

(http://i324.photobucket.com/albums/k359/InitialDriveGTR/GridK2.gif)

Gonna go see if it will still run my surround setup.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: eos on March 29, 2013, 12:23:57 pm
I have my GTX 670 FTW set up as a K2 right now; the video outputs work...
Thanks.
That's exactly what I want to accomplish.

Did you just remove resistors 1 & 2?
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gnif on March 29, 2013, 02:21:53 pm
Hi all,

I decided to have a go at finding the straps for GPU 1 on my card, with both success and failure as the result. I was able to locate them and modify the GTX 690 into a dual-GPU Quadro K5000, but I made the stupid mistake of running it without a heatsink on the bridge chip that sits between the two GPUs while testing. The chip quickly died from overheating when I got excited and let Linux boot into the graphical environment, and there goes my $1000 video card for the greater good; as such, donations are now more important than ever to replace this card.

I am now running on a semi-faulty GT220 (random lockups) and an AMD Radeon X300 to get my triple head working, but as you can imagine this is a very buggy configuration.

(https://www.eevblog.com/forum/projects/hacking-nvidia-cards-into-their-professional-counterparts/?action=dlattach;attach=42485)

Also, that SOIC that sits near the straps is, I believe, the EEPROM.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: amigo on March 29, 2013, 02:33:13 pm
Thank you for taking this mod all the way, and sorry to hear your 690 died in the process. :(

The chip nearby is the EEPROM, a GigaDevice 25Q20 (http://www.gigadevice.com/WebPage/PageDetail.php?PageId=127&WebPageTypeId=98&WebPageTypeId2=151&WebPageTypeId3=134 (http://www.gigadevice.com/WebPage/PageDetail.php?PageId=127&WebPageTypeId=98&WebPageTypeId2=151&WebPageTypeId3=134))

That might be interesting to note; perhaps the Device ID straps are always located around it?
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gnif on March 29, 2013, 02:40:50 pm
Thank you for taking this mod all the way, and sorry to hear your 690 died in the process. :(

The chip nearby is the EEPROM, a GigaDevice 25Q20 (http://www.gigadevice.com/WebPage/PageDetail.php?PageId=127&WebPageTypeId=98&WebPageTypeId2=151&WebPageTypeId3=134 (http://www.gigadevice.com/WebPage/PageDetail.php?PageId=127&WebPageTypeId=98&WebPageTypeId2=151&WebPageTypeId3=134))

That might be interesting to note; perhaps the Device ID straps are always located around it?

I would not be surprised, as one of these straps (the top one) connects through to pin 6, which is SCLK according to the datasheet for that part. Pin 8 also matches up as GND, but pin 3 looks like it is VCC, not WP, as it is permanently tied high on this card. I do not think it is a match, but it is pretty close.

So in theory you can find one of the straps by searching for resistors that are connected to pin 6.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: amigo on March 29, 2013, 03:14:06 pm
It seems this might be a new invention at NVidia for the 6xx series, looking at the GTX 570 and its EEPROM (red box), or it is just a coincidence that the Device ID straps are near the EEPROM on some 6xx series cards.

(http://i.imgur.com/1baLvED.jpg)
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: verybigbadboy on March 29, 2013, 05:54:15 pm
It seems this might be a new invention at NVidia for the 6xx series, looking at the GTX 570 and its EEPROM (red box), or it is just a coincidence that the Device ID straps are near the EEPROM on some 6xx series cards.

It's not only the 600 series: one resistor has been located around the EEPROM since the 8000 series. I checked 8600, 240 and 450 cards.
On the GTS 450, however, the second one is like a sum of two resistors on different sides near the processor.
The GTS 450 ID is 0x0DC4.
The resistors are 10k + a "switchable high-low" 20k = 4. I mean 10k = 1, 20k = 3, 1 + 3 = 4 ;)

The easiest way to find them is to check every resistor for 5k, 10k, ... 40k resistances and look at how they are connected to the unpopulated ("unsoldered") pads.

I'm not sure about the 7000 series, but I think it is too old to be worth modifying :)

Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: Neo_Moucha on March 29, 2013, 08:30:25 pm
gnif: man, this is sad! (but great work) - patience is everything... Is it all dead, or is there a chance to replace the damaged chip with one from another damaged card?
Next month I will donate another $50 to you.

People: c'mon, donate at least 10 bucks... everything counts!
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gnif on March 29, 2013, 10:37:39 pm
gnif: man, this is sad! (but great work) - patience is everything... Is it all dead, or is there a chance to replace the damaged chip with one from another damaged card?
Next month I will donate another $50 to you.

People: c'mon, donate at least 10 bucks... everything counts!

Getting a dead card would be next to impossible, and replacing the chip would require equipment I do not have, so no, not really viable. And thanks for the support, you have no idea how much it is appreciated.

It seems like someone heard you; I just received a $250 donation, WOW :scared: Over half way there now.  :-+
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: InitialDriveGTR on March 30, 2013, 06:17:20 am
I have all the stuff for BGA reworking; if you manage to find a replacement chip I can swap it in for you.

When designing circuits such as graphics cards, it's usually good practice for the engineers in charge of the board design to group components by their function, e.g. power control circuitry at the back, farthest away from the GPU. So it makes sense that NVidia (who I have always thought make very elegant circuit designs) would group the GPU hardware straps near the EEPROM.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: Neo_Moucha on March 30, 2013, 06:36:17 am
It seems like someone heard you; I just received a $250 donation, WOW :scared: Over half way there now.  :-+

Great! :D
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: bdx on March 30, 2013, 06:49:57 am
I removed resistors R2 and R3 on the GTX 660 Ti and now have a K10 (device ID 118F)!! What does this say about which resistors need to be removed to give this card the device ID of 11BF (GRID K2)?
From what I can gather it seems I might need to remove R1... but R1 is 40k, which would be the same as if it were removed.
Any words of wisdom?
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: verybigbadboy on March 30, 2013, 07:35:37 am
I removed resistors R2 and R3 on the GTX 660 Ti and now have a K10 (device ID 118F)!! What does this say about which resistors need to be removed to give this card the device ID of 11BF (GRID K2)?
From what I can gather it seems I might need to remove R1... but R1 is 40k, which would be the same as if it were removed.
Any words of wisdom?

Hi bdx,
I think you should trace pin 6 from the EEPROM and you will find the correct resistor.

Also, the GTX 660 Ti may have a "soft strap". You can check that by erasing your EEPROM with nvflash. Do not forget to save the BIOS first.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: bdx on March 30, 2013, 09:00:25 am
By removing R1, R2 and R3, I have turned a GTX 660 Ti into a GTX 680.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: eos on March 30, 2013, 09:05:10 am
I removed resistors R2 and R3 on the GTX 660 Ti and now have a K10 (device ID 118F)!!
Since the 660 Ti is still in the guessing game, you should remove resistors one by one.

Most likely, by removing R2 you changed the last symbol from 3 to F, aka 1183 -> 118F. That is the Tesla K10.
R3 probably has nothing to do with the Device ID. Can you solder R3 back on and check again?
R4 could be the resistor of interest... replaced by a 20K resistor, aka "B"...
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: verybigbadboy on March 30, 2013, 09:14:18 am
I have stopped my investigation into modifying the Palit GTS 450 Green (OEM) into a Quadro 2000. I think this OEM version, with only 144 cores, is impossible to modify. But I think my findings can help someone who wants to modify another card.

The GTS 450 device ID is 0x0DC4.
The Quadro 2000 ID is 0x0DD8.

Let me explain how the device ID is set.
The 4th symbol's value is calculated from resistors 1, 2 and 3.
Initially, the resistances and their values are:
Index   Meaning     Resistance   Value
1       shift       10k          1
2       value 0-7   20k          3
3       value 8-F   none         none

The resulting value equals the sum of 1 and (2 or 3). If the value exceeds the range of resistor 2 or 3, it is clamped to the top of that range.
Examples:
If 1 = 5k, 2 = none, 3 = 5k, then the result is 8.
If 1 = 5k, 2 = 25k, 3 = none, then the result is 4.

Setting the 3rd symbol:
After checking different cards, I found that the 3rd symbol is always set by resistors connected to pin 6 of the EEPROM.

Index   Meaning   Resistance
4       value C   35k
5       value D   none

Setting the 3rd symbol was difficult for me because this card has soft straps (they are new to me).
I tried every resistor value for the 5th resistor, and these are my results:
4 = none, 5 = 5k...20k: the device ID is 0x0DD8, but the card does not want to start because it is unable to read the EEPROM (erasing it still works fine ;) ).
4 = none, 5 = 25k...40k: the device ID is 0x0DC8; the card works but has the wrong ID :(
After I cleared the EEPROM with nvflash, the device ID became 0x0DD8.
Then I took the soft straps from the Quadro BIOS and uploaded them to the EEPROM together with the original BIOS.
Now the card starts with the 0x0DD8 device ID and works fine - until the drivers are installed :). When Windows tries to enable 3D I always get a blue screen ;). Tested with every value for the 5th resistor, and I also tried removing other resistors.

I also tried uploading the Quadro BIOS. It does not work at all; random lines appear ;).
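
To make the arithmetic above easier to play with, here is a rough Python sketch of the scheme as described: position 1 acts as a shift, position 2 covers values 0-7, position 3 covers values 8-F, and the result is the clamped sum. The 5k-per-step spacing is inferred from the examples and stock values given above, so treat it as an assumption rather than a verified rule for every board.

[code]
# Sketch of the nibble calculation described above for the GTS 450 strap.
# Resistances are in kilo-ohms; None means the position is not populated.
# The 5k-per-step mapping (5k -> 0, 10k -> 1, ..., and 5k -> 8 in the
# "high" position) is inferred from the stock values and examples above.

def step(resistance_k):
    """5k -> 0, 10k -> 1, 15k -> 2, ... (None = not fitted)."""
    return None if resistance_k is None else round(resistance_k / 5) - 1

def strap_nibble(r1_k, r2_k, r3_k):
    shift = step(r1_k) if r1_k is not None else 0
    if r2_k is not None:                  # low range, values 0-7
        value = min(step(r2_k), 7)
    elif r3_k is not None:                # high range, values 8-F
        value = min(8 + step(r3_k), 0xF)
    else:
        value = 0xF                       # nothing fitted: defaults to "F"
    return min(shift + value, 0xF)

# The examples above:
print(hex(strap_nibble(5, None, 5)))    # 0x8
print(hex(strap_nibble(5, 25, None)))   # 0x4
# Stock GTS 450 (0x0DC4): 10k shift + 20k in position 2 = 1 + 3 = 4
print(hex(strap_nibble(10, 20, None)))  # 0x4
[/code]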
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: eos on March 30, 2013, 10:11:10 am
When comparing Tesla K10 and GRID K2 (from the virtualization application perspective) there is very little info to go by.
In terms of specs, both are the pro version of the (underclocked) GTX 690 or two GTX 680 on one PCB.

One vendor announced to carry these cards was Dell. Here is the PE R720, which has such an option:
http://configure.us.dell.com/dellstore/config.aspx?oc=bectj4&model_id=poweredge-r720&c=us&l=en&s=bsd&cs=04 (http://configure.us.dell.com/dellstore/config.aspx?oc=bectj4&model_id=poweredge-r720&c=us&l=en&s=bsd&cs=04)
The GRID K2 is 15% more expensive than Tesla K10 for some reason...
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gnif on March 30, 2013, 10:13:32 am
The GRID K2 is 15% more expensive than Tesla K10 for some reason...

Niche market that NVidia want to squeeze some extra cash out of.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gnif on March 30, 2013, 10:23:59 am
I have all the stuff for BGA reworking, if you manage to find a replacement chip I can swap it in for you

Thanks for the offer, I am not certain that it is the point of failure, but it the only thing there that would have failed like this, voltages on the surrounding circuitry all seem to checkout and the bridge chip does not even show up in lspci or via nvflash any more, so I think the GPUs are fine, but the bridge chip is dead. Apparently the same chip is used on some ASUS motherboards, if I can locate one I will be in contact :).
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: eos on March 30, 2013, 10:57:00 am
Niche market that NVidia want to squeeze some extra cash out of.
They definitely will do their best...

Maybe it's because a Tesla system needs another NV video card when used in a Windows environment (no outputs).
http://nvidia.custhelp.com/app/answers/detail/a_id/2144/kw/tesla/related/1 (http://nvidia.custhelp.com/app/answers/detail/a_id/2144/kw/tesla/related/1)

GRID in its native form also doesn't have outputs
http://www.nvidia.ca/object/grid-boards.html (http://www.nvidia.ca/object/grid-boards.html)
But modding a GTX670 to a GRID keeps them live, i.e. no additional card needed...
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gnif on March 30, 2013, 11:10:43 am
GRID in its native form also doesn't have outputs
http://www.nvidia.ca/object/grid-boards.html (http://www.nvidia.ca/object/grid-boards.html)
But modding a GTX670 to a GRID keeps them live, i.e. no additional card needed...

Same for Tesla K10.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: amyk on March 30, 2013, 07:11:11 pm
Sorry to hear your card died (I suppose a PCIe switch chip needing a heatsink is a little strange in and of itself), but maybe now you can remove the GPUs and see which balls the ID resistors are connected to?
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: Neo_Moucha on March 30, 2013, 09:05:24 pm
........Apparently the same chip is used on some ASUS motherboards, if I can locate one I will be in contact :).

That sounds promising; I even have a "dead" ASUS motherboard (failed BIOS update, no boot, otherwise all OK). Let me think what model it is... I think a P5K PRO or something like that.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gnif on March 30, 2013, 11:00:26 pm
........Apparently the same chip is used on some ASUS motherboards, if I can locate one I will be in contact :).

That sounds promising; I even have a "dead" ASUS motherboard (failed BIOS update, no boot, otherwise all OK). Let me think what model it is... I think a P5K PRO or something like that.

The chip the card uses is a PEX8747 (see: http://www.plxtech.com/products/expresslane/pex8747 (http://www.plxtech.com/products/expresslane/pex8747)). I read somewhere that some motherboards are using it to expand the number of PCIe slots.

Edit: The ASUS P8Z77-V Premium uses it, and it seems it is not using a heatsink! Perhaps I have misdiagnosed the fault; I will have another go tomorrow and check things over to see if I missed anything obvious.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: Neo_Moucha on March 30, 2013, 11:06:25 pm
OK, so probably only on newer motherboards... :(
http://www.anandtech.com/show/6170/four-multigpu-z77-boards-from-280350-plx-pex-8747-featuring-gigabyte-asrock-ecs-and-evga/31 (http://www.anandtech.com/show/6170/four-multigpu-z77-boards-from-280350-plx-pex-8747-featuring-gigabyte-asrock-ecs-and-evga/31)
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: amigo on March 31, 2013, 10:06:48 am
I'm going to order me some thin film resistors for this mod (0402 imperial/1005 metric) but I'm split between Vishay and Panasonic.

While I'm at it I'm ordering an assortment of values (5K-40K), but getting Vishay is generally twice the price of Panasonic.

Obviously I want to keep the cost low so would Panasonic be adequate in this case?
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: Fsck on March 31, 2013, 10:24:59 am
Panasonic's a good brand. They'll be more than enough assuming you've selected adequate ratings.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: amigo on March 31, 2013, 12:02:13 pm
Thanks for the heads-up.

The best I can get is 0.1% but that's about as low as they go and at 25ppm/C they should be stable even when the board gets hot.

Although in this application I doubt that'll serve any purpose after the system boots up and Device ID is acquired.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: poorchava on March 31, 2013, 12:03:34 pm
I'm usually using Yageo, Vishay and Royal Ohm (whichever i can get cheapest). Never actually had any problem with them.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gnif on March 31, 2013, 12:50:18 pm
I am using Vishay, but through testing I found that the values can be +/- 2K and it still works, so I do not think you need to worry too much.

Also, the straps are read every time the video driver loads; if you are using Linux, this means every time you log out/log in, or on Windows, every time you get a 'driver is not responding' error and it recovers.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: reefjunkie on March 31, 2013, 04:51:01 pm
Hi-
Well, I ended up getting two EVGA 04G-P4-3687-KR GeForce 4GB GTX 680s, core clock 1084 MHz and boost clock 1150 MHz.
The boards are the same as the GV-N680OC-2GD except mine are 4GB. I modded both of them to Quadro K5000 (thanks, old PlayStation 3, for the resistors  :-DD).
I ran the latest Nvidia 314.22 drivers and Quadro 311.35. It seems the 314.22 drivers are a little bit better, so I'm using those.
I did some benchmarking to compare the cards before and after the mods.
                                                   GTX 680 #1   GTX 680 #2   K5000  #1   K5000 #2
3DMARK 11                                      9022                   8987           9077           9016
Passmark 8 (3D Graphics Mark)     6044                   6091           6025          5996
PCMark Vantage (Gaming)             19336                 18956         18880        16177
PhysX                                         10158-166 fps   10003-165 fps    10176-167 fps   10123-166 fps
SPECviewperf 11
   Catia-03                                       6.05                       5.98               5.9             10.20
   Ensight-04                                  32.20                     32.23              32.20         32.27
   Lightwave-01                             13.23                      12.84            13.14           13.22
   Maya-03                                     12.77                     12.73             12.86           12.85
   Proe-05                                       0.96                      1.00                  1.00           0.99
   Sw-02                                         11.09                    11.37                11.36         12.78
   Tcvis-02                                        1.01                      1.17               1.02             1.02   
   Snx-01                                         3.42                      3.37                 3.40            3.42
As you can see all the scores between stock and modded cards are about the same. The problem is with the SPECviewperf 11 scores. This is  the benchmark for Graphic and CAD programs. This is what the Quadro cards were made  for. The scores for the modded K5000 should be MUCH higher. Take a look here.
http://www.xbitlabs.com/articles/graphics/display/nvidia-quadro-k5000_4.html (http://www.xbitlabs.com/articles/graphics/display/nvidia-quadro-k5000_4.html)

It looks to me that just because the computer thinks it’s a Quadro K5000 does not mean that it will act like a K5000.
I even tried this benchmark with the Quadro drivers and got the same results. Hopefully it's just a driver issue and not a hardware issue.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: Neo_Moucha on March 31, 2013, 07:39:31 pm
Did you try the VGA BIOS mod with nvflash?
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: natiss88 on March 31, 2013, 08:10:44 pm

I ran the latest Nvidia 314.22 drivers and Quadro 311.35. It seems the 314.22 drivers are a little bit better, so I'm using those.
I did some benchmarking to compare the cards before and after the mod


Please, can you benchmark the cards with the Quadro drivers?

P.S. Happy Easter to everyone.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: victorngcm on March 31, 2013, 08:34:23 pm


It looks to me that just because the computer thinks it’s a Quadro K5000 does not mean that it will act like a K5000.
I even tried this benchmark with the Quadro drivers and got the same results. Hopefully it's just a driver issue and not a hardware issue.

Thanks for your test!
I would like to make a suggestion as well... there are some other drivers which can be found on Nvidia's homepage, like 191.66 Quadro for Win 7... would you mind trying them as well?

That's very useful information from your test.
I just got my Gigabyte 4GB 680 last night and will start modding in the coming days.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: niopio on March 31, 2013, 09:08:22 pm
1) Please can somebody check if this hack gives you 10-bit color output? Thanks!

2) What's the progress on flashing the Quadro BIOS (and overclocking it) onto the GTX cards?
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: ss_march on March 31, 2013, 09:51:42 pm
Hi-
Well, I ended up getting two EVGA 04G-P4-3687-KR GeForce 4GB GTX 680s, core clock 1084 MHz and boost clock 1150 MHz.
The boards are the same as the GV-N680OC-2GD except mine are 4GB. I modded both of them to Quadro K5000 (thanks, old PlayStation 3, for the resistors  :-DD).
I ran the latest Nvidia 314.22 drivers and Quadro 311.35. It seems the 314.22 drivers are a little bit better, so I'm using those.
I did some benchmarking to compare the cards before and after the mods.
Thank you for the tests.
Here I made one photo showing how to edit the stock ROM with HxD (people's comments here helped) and fix the checksum with Kepler BIOS Tweaker (http://www.file-upload.net/download-7309019/KeplerBiosTweaker.exe.html).
And one photo with my guess for the GTX 660 Ti.

Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: NiHaoMike on April 01, 2013, 12:35:52 am
Maybe someone could put a pin header in place of those resistors and have a play with some resistance substitution boxes? Combine that with a live USB designed to boot quickly and automatically run lspci, and it shouldn't take too long to figure things out.
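
Something along those lines is easy to script. Here is a rough sketch that reads the IDs straight out of sysfs on a Linux live system instead of parsing lspci output; the "known IDs" table only lists the device IDs that have come up in this thread, everything else is just printed as unknown.

[code]
#!/usr/bin/env python3
# Print the PCI device ID every NVIDIA (vendor 0x10DE) device currently
# reports, e.g. from a live USB after changing a strap resistor. Uses the
# standard Linux sysfs files rather than parsing lspci output.
import glob
import os

KNOWN = {  # device IDs mentioned in this thread, for reference only
    0x1180: "GTX 680", 0x1183: "GTX 660 Ti", 0x118F: "Tesla K10",
    0x11BA: "Quadro K5000", 0x11BF: "GRID K2",
}

for dev in sorted(glob.glob("/sys/bus/pci/devices/*")):
    with open(os.path.join(dev, "vendor")) as f:
        vendor = int(f.read(), 16)
    if vendor != 0x10DE:
        continue
    with open(os.path.join(dev, "device")) as f:
        device = int(f.read(), 16)
    name = KNOWN.get(device, "unknown / not in this thread")
    print(f"{os.path.basename(dev)}  10de:{device:04x}  ({name})")
[/code]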
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: Marius3D on April 01, 2013, 01:38:24 am
Hi-
Well I ended up getting Two EVGA 04G-P4-3687-KR GeForce 4GB GTX 680. Core Clock 1084mhz and Boost Clock 1150mhz.
The boards are the same as the GV-N680OC-2GD except mine are 4GB. I modded both of them to Quadro K5000 (Thanks old Playstation 3 for the resistors  :-DD)
I ran the latest Nvidia 314.22 drivers and Quadro 311.35. It seems the 314.22 drivers are a little bit better so I'm using those.
I did some benchmarking to compare the cards before and after the mods.
                                   GTX 680 #1       GTX 680 #2       K5000 #1         K5000 #2
3DMARK 11                          9022             8987             9077             9016
Passmark 8 (3D Graphics Mark)      6044             6091             6025             5996
PCMark Vantage (Gaming)            19336            18956            18880            16177
PhysX                              10158-166 fps    10003-165 fps    10176-167 fps    10123-166 fps
SPECviewperf 11
   Catia-03                        6.05             5.98             5.90             10.20
   Ensight-04                      32.20            32.23            32.20            32.27
   Lightwave-01                    13.23            12.84            13.14            13.22
   Maya-03                         12.77            12.73            12.86            12.85
   Proe-05                         0.96             1.00             1.00             0.99
   Sw-02                           11.09            11.37            11.36            12.78
   Tcvis-02                        1.01             1.17             1.02             1.02
   Snx-01                          3.42             3.37             3.40             3.42
As you can see, all the scores between the stock and modded cards are about the same. The problem is with the SPECviewperf 11 scores. This is the benchmark for graphics and CAD programs, which is what the Quadro cards were made for. The scores for the modded K5000 should be MUCH higher. Take a look here.
http://www.xbitlabs.com/articles/graphics/display/nvidia-quadro-k5000_4.html (http://www.xbitlabs.com/articles/graphics/display/nvidia-quadro-k5000_4.html)

It looks to me like just because the computer thinks it's a Quadro K5000 doesn't mean it will act like a K5000.
I even tried this benchmark with the Quadro drivers and got the same results. Hopefully it's just a driver issue and not a hardware issue.


Check the profile menus in the NVidia Control Panel. If you have GeForce-like profile menus with games in them, then the mod is useless; if you have pro 3D apps, we are getting somewhere. I remember when I soft-modded an 8800 GTS and an 8800 GTX: sometimes the pro apps in NVCP didn't show up and I had no performance gain. The same happened when I did a BIOS mod on an 8800GT/9800GT to an FX 3700; the performance gain came only when I had the right profile in NVCP.

I would also like to donate, but because of Easter I can't (bank holidays). I would also like to buy a card to mod myself. Currently I have a GTX 580 in my system, and as far as I know it can only be modded into a Tesla M2090, since that is the only pro card with the GF110. I also have on my desk a reference PNY GTX 465, which can be modded to a Quadro 5000 since both have the GF100.


I was really interested in modding NVidia cards to Quadro a few years ago. I work in Maya, Mudbox, ZBrush, 3D Coat etc. Now I am very interested again ;D.

This is what I did in the past

http://translate.google.com/translate?hl=en&sl=ro&tl=en&u=http%3A%2F%2Fwww.mygarage.ro%2Fghiduri-si-tutoriale%2F29854-geforce-g80-g92-mod-quadro-fx-tutorial-si-rezultate.html (http://translate.google.com/translate?hl=en&sl=ro&tl=en&u=http%3A%2F%2Fwww.mygarage.ro%2Fghiduri-si-tutoriale%2F29854-geforce-g80-g92-mod-quadro-fx-tutorial-si-rezultate.html)

The thread is in Romanian (my native language as well), so I tried to give a translated version. Anyway, one picture is worth 1000 words, right ;D? Hope you can see them.

Hope I didn't bore you.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: reefjunkie on April 01, 2013, 04:38:01 am
Here is a list of drivers I have tried and I get the same benchmarks from all of them:
NVIDIA DESKTOP DRIVER
314.22-desktop-win8-win7-winvista-64bit-english-whql

QUADRO DRIVERS
314.07-quadro-tesla-win8-win7-winvista-64bit-international-whql
311.35-quadro-tesla-win8-win7-winvista-64bit-international-whql  (PERFORMANCE DRIVER)
311.35-quadro-tesla-win8-win7-winvista-64bit-international-whql  (ODE DRIVER)
307.45-Performance quadro-tesla-win8-win7-winvista-64bit-international-whql   (CATIA PERFORMANCE DRIVER)

I'll keep trying.......
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gnif on April 01, 2013, 04:42:25 am
It should be noted that this mod was originally performed not to get a high-performance Quadro or Tesla card, but to unlock additional features such as Mosaic support, which does indeed work.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: Neo_Moucha on April 01, 2013, 04:59:05 am
reefjunkie:
It is important to completely uninstall the previous NVidia drivers from the system before installing another version.
Have you done that?
If not, check Driver Sweeper from Guru3D: http://www.guru3d.com/content-page/guru3d-driver-sweeper.html (http://www.guru3d.com/content-page/guru3d-driver-sweeper.html)

Also, as somebody already stated, if you can find application profiles in the driver settings instead of game profiles, the driver should be fine. Can you check that?
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: reefjunkie on April 01, 2013, 05:29:10 am
I will check... Thanks for the tip.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: blanka on April 01, 2013, 06:25:04 pm
Really, thanks for the efforts of gnif and verybigbadboy.
However, since I have an EVGA GTX 670 with the same PCB layout as the GTX 660 Ti,
I had to work out the modification by myself, and here is the result.
For the 4th digit, as everyone already knows, it is controlled by the positions of resistors 1 and 2.
Depending on which card you have, you can remove resistor 1 and set it to Tesla (40K), GRID K2 (40K) or Quadro (15K) with resistor 2.
The 3rd digit is the tricky part.
The low byte is resistor 4, on the top side of the PCB.
You don't need to do anything for a Tesla K10.
However, if you want to change it to a Quadro K5000 or GRID K2,
you need to remove resistor 4 and install resistor 3 "MANUALLY", since there is no longer a place for resistor 3 on the PCB of the GTX 670 and GTX 660 Ti.
As you can see in my attached bottom-side photo of the "rework",
you need to connect EEPROM pin 6 through a 20K ohm resistor pulled up to VCC.
My rework is quite ugly but it works fine!
Please be careful and modify your card at your own risk!!

Summary
GPU Name        Resistor 1 / 0-7, 4th byte   Resistor 2 / 8-f, 4th byte   Resistor 3 / 3rd byte (high)   Resistor 4 / 3rd byte (low)
GTX 660 Ti      20K                          none                         none                           25K
GTX 670         none                         10K                          none                           25K
Tesla K10       none                         40K                          none                           25K
Quadro K5000    none                         15K                          20K                            none
GRID K2         none                         40K                          20K                            none
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: verybigbadboy on April 01, 2013, 09:41:02 pm
Hi all again ;)
I successfully modified a SPARKLE SXS4501024D5SNM GeForce GTS 450 1GB to a Quadro 2000.
The board is a reference NVidia GTS 450: http://www.ixbt.com/video3/images/ref/gts450-scan-back.jpg (http://www.ixbt.com/video3/images/ref/gts450-scan-back.jpg)
Update:
GPU passthrough works fine.

Initial values are:
index   meaning               resistance
1       3rd byte value D      none
2       3rd byte value C      35k
3       4th byte values 8-f   none
4       4th byte values 0-7   25k

device / resistors table

device name    R1     R2     R3     R4
GTS 450        none   35k    none   25k
Quadro 2000    35k    none   5k     none

furmark: http://www.ozone3d.net/benchmarks/furmark_192_score.php?id=120616 (http://www.ozone3d.net/benchmarks/furmark_192_score.php?id=120616)
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: InitialDriveGTR on April 02, 2013, 04:14:58 am
........Apparently the same chip is used on some ASUS motherboards, if I can locate one I will be in contact :).

That sounds promising; I even have a "dead" ASUS motherboard (failed BIOS update, no boot, otherwise all OK). Let me think what model it is... I think a P5K PRO or something like that.

The chip the card uses is a PEX8747 (see: http://www.plxtech.com/products/expresslane/pex8747 (http://www.plxtech.com/products/expresslane/pex8747)). I read somewhere that some boards use it to expand the number of PCIe slots.

Edit: The ASUS P8Z77-V Premium uses it, and it seems it is not using a heatsink! Perhaps I have misdiagnosed the fault; I will have another go tomorrow and check things on it to see if I missed anything obvious.

Submitted a quote for the part through Avnet on one of our BOMs... If it's < $50 I'll get it on order.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: victorngcm on April 02, 2013, 05:39:01 am
It should be noted that this mod was originally performed not to get a high-performance Quadro or Tesla card, but to unlock additional features such as Mosaic support, which does indeed work.

So if I mod it to a K5000... does it support Maximus? Or if I mod it to a Tesla K10... will it function like a Tesla card?
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: smiddereens on April 02, 2013, 10:32:36 am
Has anybody had any success modifying a GT 640 (GK107-based) to a GRID K1 or Quadro K2000?
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: winjet1 on April 03, 2013, 07:04:01 am
Hey I know this is off topic so please don't flame.

I was able to, by the grace of god and not my soldering skills, change my 680 into a GRID K2 (mini).  If anyone is doing this for virtualization reasons check this VMware thread out for help.  I can now report that I am sharing my 680 among multiple Virtual Machines.

http://communities.vmware.com/thread/415887?start=30&tstart=0 (http://communities.vmware.com/thread/415887?start=30&tstart=0)

Please everyone else donate something if this has helped you out!
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: InitialDriveGTR on April 03, 2013, 01:53:19 pm
Hey I know this is off topic so please don't flame.

I was able to, by the grace of god and not my soldering skills, change my 680 into a GRID K2 (mini).  If anyone is doing this for virtualization reasons check this VMware thread out for help.  I can now report that I am sharing my 680 among multiple Virtual Machines.

http://communities.vmware.com/thread/415887?start=30&tstart=0 (http://communities.vmware.com/thread/415887?start=30&tstart=0)

Please everyone else donate something if this has helped you out!

Been using mine with Windows Server 2012's Hyper-V RemoteFX. It's pretty cool being able to remote desktop in to a virtual machine and then play GTA IV on a 2007 Macbook Pro lol
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gnif on April 03, 2013, 01:59:43 pm
Hey I know this is off topic so please don't flame.

I was able to, by the grace of god and not my soldering skills, change my 680 into a GRID K2 (mini).  If anyone is doing this for virtualization reasons check this VMware thread out for help.  I can now report that I am sharing my 680 among multiple Virtual Machines.

http://communities.vmware.com/thread/415887?start=30&tstart=0 (http://communities.vmware.com/thread/415887?start=30&tstart=0)

Please everyone else donate something if this has helped you out!

Been using mine with Windows Server 2012's Hyper-V RemoteFX. It's pretty cool being able to remote desktop in to a virtual machine and then play GTA IV on a 2007 Macbook Pro lol

This is pretty sweet! I will have to have a go at it as I hate rebooting into windows for the odd game. I had never heard of the GRID K2 nor what it could do until members mentioned it in this thread.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: amigo on April 03, 2013, 02:30:19 pm
I've been researching soft straps for the past few days and there's a way to change the Device ID without soldering.

The problem is that I could not find full information about the strap bits, so what I have been able to piece together so far is that you can change the last two digits to a certain extent.

If you have a 0x1180 (GTX 680) you can go up to 0x119F (range: 1180-119F), basically you can change bits 0-4.

I do not know if/where bit 5 is, to take it above 9F into the As and Bs for the 3rd character. I am not sure that bit is even present in the soft straps, but seeing there's a resistor for it, I'm hoping it must be somewhere in there...

Anyone with some insight into soft straps, bit 5 and beyond please post. :)
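To put some quick arithmetic behind that (my own illustration, not taken from any strap documentation): five writable bits give 32 reachable IDs, and the Quadro K5000 ID (0x11BA) needs bit 5 on top of that, which is exactly why the range stops at 0x119F:
Code:
# Arithmetic behind the 0x1180-0x119F claim above (illustration only).
base = 0x1180                                  # GTX 680
reachable = [base | b for b in range(0x20)]    # bits 0-4 -> 32 possible IDs
print(hex(min(reachable)), hex(max(reachable)))    # 0x1180 0x119f

k5000 = 0x11BA                                 # Quadro K5000 device ID
print(bool(k5000 & 0x20))                      # True: needs bit 5, out of reach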
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gnif on April 03, 2013, 02:43:46 pm
I've been researching soft straps for the past few days and there's a way to change the Device ID without soldering.

The problem is that I could not find full information about the strap bits, so what I have been able to piece together so far is that you can change the last two digits to a certain extent.

If you have a 0x1180 (GTX 680) you can go up to 0x119F (range: 1180-119F), basically you can change bits 0-4.

I do not know if/where bit 5 is, to take it above 9F into the As and Bs for the 3rd character. I am not sure that bit is even present in the soft straps, but seeing there's a resistor for it, I'm hoping it must be somewhere in there...

Anyone with some insight into soft straps, bit 5 and beyond please post. :)

Did you actually test this or is it based on the soft-strap information documented here:
https://github.com/pathscale/envytools/blob/master/hwdocs/pstraps.txt

I tried this first and had no success; Linux would ignore them, and we know that in previous generations the NVidia driver would compare the soft straps to the hard straps and, if they differed, enable 'unstable code' that was designed to cause random hardware faults.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: InitialDriveGTR on April 03, 2013, 03:22:16 pm

This is pretty sweet! I will have to have a go at it as I hate rebooting into windows for the odd game. I had never heard of the GRID K2 nor what it could do until members mentioned it in this thread.

What the GRID K2 cards are essentially aimed at is letting a person connect from a system such as an ultrabook, whose hardware is not capable of very high-end graphics, to a server, and get much higher performance than the local hardware can deliver by itself. You can connect to a virtual machine hosted on a server with a GRID K2 and use the discrete graphics card in things such as SolidWorks. Where I work, our engineering department's servers have GRID K2 cards; we used to use Dell desktops/laptops with high-end Quadro cards, but now, instead of dropping 3-4K on a laptop that might fail after a year or two, everyone gets a cheap laptop configured to use a virtual machine. This works very well on our gigabit Ethernet network too.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gnif on April 03, 2013, 03:28:31 pm

This is pretty sweet! I will have to have a go at it as I hate rebooting into windows for the odd game. I had never heard of the GRID K2 nor what it could do until members mentioned it in this thread.

What the GRID K2 cards are essentially aimed at is letting a person connect from a system such as an ultrabook, whose hardware is not capable of very high-end graphics, to a server, and get much higher performance than the local hardware can deliver by itself. You can connect to a virtual machine hosted on a server with a GRID K2 and use the discrete graphics card in things such as SolidWorks. Where I work, our engineering department's servers have GRID K2 cards; we used to use Dell desktops/laptops with high-end Quadro cards, but now, instead of dropping 3-4K on a laptop that might fail after a year or two, everyone gets a cheap laptop configured to use a virtual machine. This works very well on our gigabit Ethernet network too.

So does that mean that multiple users can share the same video card across multiple VMs? or just a single VM? And how well does it work with games, etc? It would be nice to be able to share my high end card out for my daughter to use instead of having to buy her a high end card also.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: InitialDriveGTR on April 03, 2013, 04:33:54 pm

This is pretty sweet! I will have to have a go at it as I hate rebooting into windows for the odd game. I had never heard of the GRID K2 nor what it could do until members mentioned it in this thread.

What the GRID K2 cards are essentially aimed at is letting a person connect from a system such as an ultrabook, whose hardware is not capable of very high-end graphics, to a server, and get much higher performance than the local hardware can deliver by itself. You can connect to a virtual machine hosted on a server with a GRID K2 and use the discrete graphics card in things such as SolidWorks. Where I work, our engineering department's servers have GRID K2 cards; we used to use Dell desktops/laptops with high-end Quadro cards, but now, instead of dropping 3-4K on a laptop that might fail after a year or two, everyone gets a cheap laptop configured to use a virtual machine. This works very well on our gigabit Ethernet network too.

So does that mean that multiple users can share the same video card across multiple VMs? or just a single VM? And how well does it work with games, etc? It would be nice to be able to share my high end card out for my daughter to use instead of having to buy her a high end card also.


Edit: I think this is the kind of off-topic that deserves its own thread:

https://www.eevblog.com/forum/projects/things-to-do-with-a-proffesional-nvidia-graphics-card/ (https://www.eevblog.com/forum/projects/things-to-do-with-a-proffesional-nvidia-graphics-card/)
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: athanor on April 03, 2013, 08:48:21 pm
Hi everyone, new to this forum so go easy on me !!

Great work on the mods that have been done on the GK104 chip - I want to have a look at my GK110 chip now  :-/O .. maybe I should wait for the K6000 card for the device ID + drivers, but I might settle for trying a K20X.

I have attached some pictures of the EVGA Titan card - I'm thinking of looking at the resistors near what I think is the EEPROM - am I on the right track?
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gnif on April 03, 2013, 08:57:44 pm
Hi everyone, new to this forum so go easy on me !!

Great work on the mods that have been done on the GK104 chip - I want to have a look at my GK110 chip now  :-/O .. maybe I should wait for the K6000 card for the device ID + drivers, but I might settle for trying a K20X.

I have attached some pictures of the EVGA Titan card - I'm thinking of looking at the resistors near what I think is the EEPROM - am I on the right track?

The resistor from pin 6 looks exactly like a hardware strap, and if you follow the trace you can see the alternate position for it, so yeah, I think that's them, but only testing will tell.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: amigo on April 03, 2013, 11:33:46 pm
Did you actually test this or is it based on the soft-strap information documented here:
https://github.com/pathscale/envytools/blob/master/hwdocs/pstraps.txt

I tried this first and had no success; Linux would ignore them, and we know that in previous generations the NVidia driver would compare the soft straps to the hard straps and, if they differed, enable 'unstable code' that was designed to cause random hardware faults.

I did test this on a smaller card (8600GT) as a proof of concept. It is a very delicate operation, because setting the wrong straps will hose the card (I know - I did it).

To recover from a bad flash, you need to short the CE# and Vss pins on the card's EEPROM during boot, then unshort them before running nvflash.

I am going to conduct a few more tests just to be sure.

The references I used to collect the needed information were the link you posted above; this thread: https://devtalk.nvidia.com/default/topic/489965/cuda-programming-and-performance/gtx480-to-c2050-hack-or-unlocking-tcc-mode-on-geforce/1 and a couple of other places for random other details.

I've also looked at dozens of ROMs, comparing their soft strap configurations and whatnot. Most manufacturers do not modify the Device ID in their ROMs, but here and there you can find them doing it (ASUS for example). You just need to find the same card model with different Device IDs (usually models with various VRAM configurations)...
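As a side note on comparing saved ROMs: every PCI expansion ROM carries a standard "PCIR" data structure holding the vendor and device ID the image advertises, so a few lines of Python can pull that out of a dump (my own sketch; note this is only the ID embedded in the ROM image - the ID the card actually reports on the bus still comes from the straps discussed above):
Code:
#!/usr/bin/env python
# Sketch: read the vendor/device ID from a saved VBIOS image's PCI data
# structure (the "PCIR" block defined by the PCI expansion-ROM standard).
import struct
import sys

def rom_ids(path):
    with open(path, "rb") as f:
        rom = f.read()
    if rom[0:2] != b"\x55\xAA":
        raise ValueError("not a PCI expansion ROM image")
    pcir = struct.unpack_from("<H", rom, 0x18)[0]      # pointer to PCIR block
    if rom[pcir:pcir + 4] != b"PCIR":
        raise ValueError("PCIR signature not found")
    vendor, device = struct.unpack_from("<HH", rom, pcir + 4)
    return vendor, device

if __name__ == "__main__":
    v, d = rom_ids(sys.argv[1])
    print("ROM advertises %04x:%04x" % (v, d))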
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gnif on April 03, 2013, 11:36:16 pm
Did you actually test this or is it based on the soft-strap information documented here:
https://github.com/pathscale/envytools/blob/master/hwdocs/pstraps.txt

I tried this first and had no success; Linux would ignore them, and we know that in previous generations the NVidia driver would compare the soft straps to the hard straps and, if they differed, enable 'unstable code' that was designed to cause random hardware faults.

I did test this on a smaller card (8600GT) as a proof of concept. It is a very delicate operation, because setting the wrong straps will hose the card (I know - I did it).

To recover from a bad flash, you need to short the CE# and Vss pins on the card's EEPROM during boot, then unshort them before running nvflash.

I am going to conduct a few more tests just to be sure.

The references I used to collect the needed information were the link you posted above; this thread: https://devtalk.nvidia.com/default/topic/489965/cuda-programming-and-performance/gtx480-to-c2050-hack-or-unlocking-tcc-mode-on-geforce/1 and a couple of other places for random other details. I've also looked at dozens of ROMs, comparing their soft strap configurations and whatnot.

I can confirm that changing the software straps does not change the device ID in the 6 series.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: amigo on April 03, 2013, 11:38:59 pm
I can confirm that changing the software straps does not change the device ID in the 6 series.

The fellow from that NVidia forum post modded a 480 and 580 with this process. I wonder if NVidia caught up to this in 6xx...

EDIT: If you change the Device ID in the straps, obviously you need to change the ROM as well...
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gnif on April 03, 2013, 11:40:00 pm
I can confirm that changing the software straps does not change the device ID in the 6 series.

The fellow from that NVidia forum post modded a 480 and 580 with this process. I wonder if NVidia caught up to this in 6xx...

That is what I said: they did, and it does not work in the 6 series. I spent many hours testing this method before resorting to hardware hacking.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: amigo on April 03, 2013, 11:45:04 pm

That is what I said: they did, and it does not work in the 6 series. I spent many hours testing this method before resorting to hardware hacking.

Humbug....well, Fedex is coming today with my 0402 resistors and some larger EEPROMs. :D
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gnif on April 03, 2013, 11:46:11 pm

That is what I said: they did, and it does not work in the 6 series. I spent many hours testing this method before resorting to hardware hacking.

Humbug....well, Fedex is coming today with my 0402 resistors and some larger EEPROMs. :D

Why the larger EEPROM? are you going to try and install the quadro/tesla bios?
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: amigo on April 03, 2013, 11:57:56 pm
Why the larger EEPROM? are you going to try and install the quadro/tesla bios?

Yes, since I have a GTX 680 4GB, its shader/memory configuration is the same as the higher-model cards.

I hoped for the soft straps because I wanted a way of finding a resistor configuration that combined with the soft straps could take a card from GTX to Quadro and back just by changing the soft straps.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gnif on April 03, 2013, 11:59:25 pm
I hoped for the soft straps because I wanted a way of finding a resistor configuration that combined with the soft straps could take a card from GTX to Quadro and back just by changing the soft straps.

You could always hotglue some dip-switches to it and wire them up with wire-wrap.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: amigo on April 04, 2013, 12:05:34 am
You could always hotglue some dip-switches to it and wire them up with wire-wrap.

Meh. I suppose if I want both worlds it'll have to come down to routing wires across the board.

I thought that would've been a messy solution, so I didn't order any dip-switches...and now it bites me back. :)
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: verybigbadboy on April 05, 2013, 10:58:27 pm
Hello all again ;) I have good news.

I successfully modified a
Zotac PCI-E NV ZT-60206-10L GT640 Synergy 2G 128bit DDR3 900/1600 DVI*2+mHDMI RTL
to an NVIDIA GRID K1. It is working fine; passthrough works too. BUT the device ID modification is only possible after a BIOS modification. The BIOS modification is needed only for specific vendors.

Update:
myweb found the resistor locations for the Asus GT640-1GD3-L; no BIOS modification is needed. Pic attached to the post.

Benchmark:
http://www.ozone3d.net/benchmarks/furmark_192_score.php?id=121229 (http://www.ozone3d.net/benchmarks/furmark_192_score.php?id=121229)

I also tested the NVIDIA Quadro K600; it works.
http://www.ozone3d.net/benchmarks/furmark_192_score.php?id=121226 (http://www.ozone3d.net/benchmarks/furmark_192_score.php?id=121226)

I have not tested the NVIDIA Quadro K2000, but I think it will work too.

About the modification:
all resistors are on the back side.
http://www.techpowerup.com/reviews/Zotac/GeForce_GT_640/images/back_full.jpg (http://www.techpowerup.com/reviews/Zotac/GeForce_GT_640/images/back_full.jpg)

Device IDs:
GT640          0FC1
GTX650         0FC6
GRID K1        0FF2
Quadro K600    0FFA
Quadro K2000   0FFE

Initial values are:
index   meaning                 resistance
1       3rd byte, values C/D    25k
2       3rd byte, value F       none
3       4th byte, values 0-7    10k
4       4th byte, values 8-f    none

device name    R1     R2     R3     R4
GT 640         25k    none   10k    none
GTX 650        25k    none   35k    none
Quadro K600    none   40k    none   15k
GRID K1        none   40k    15k    none
K2000          none   40k    none   35K

And you should use an unlocked BIOS. The unlocked BIOS was removed from this post because it is a copyrighted work.
To unlock the BIOS you need to change the masks and update the checksum:
0000000010: 08 E2 00 00 00 04 00 00 | 02 10 10 82 FF C3 FF 2F
0000000020: 00 04 00 80
to
0000000010: 08 E2 00 00 00 04 00 00 | 02 10 10 82 FF FF FF 7F
0000000020: 00 00 00 80
and update the BIOS checksum.

It is not possible to change the device ID without a BIOS modification for this card.

It may not be necessary to change the BIOS if you are using another vendor's GT640 card (not a Zotac).
To check whether your BIOS is good for the hard mod, do the following steps:

1. Create a bootable DOS flash drive with the nvflash.exe tool.
2. Save the BIOS with nvflash --save yourgt640biosname.rom (the name should be 8 characters max).
3. Create a BIOS backup.
4. Open the BIOS in any hex editor and check the values at
0000000010: ?? ?? ?? ?? ?? ?? ?? ?? | ?? ?? ?? ?? FF FF FF 7F;  ?? - any value is good
0000000020: 00 00 00 80
If the values match, your BIOS is good and you don't need to change anything; otherwise continue with step 5.
5. Change the values so they match the ones shown in step 4.
6. Update the checksum. I do it with the NiBiTor tool: just open the BIOS ROM and save it. It produces a lot of warnings, but that is OK. (A scripted sketch of steps 5-6 follows after this list.)
7. Flash the BIOS back to the card.
Now you can change the values with the resistors ;)
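For reference, here is a rough Python sketch of steps 5 and 6 (my own illustration, not a tool from this thread). It assumes the mask bytes sit at the file offsets shown in the Zotac dump above (they differ per vendor, so verify against your own dump), that the image follows the standard PCI option-ROM rule of all bytes summing to zero mod 256, and that the pad byte is the last byte of the image; NiBiTor and Kepler Bios Tweaker know the real layout, so prefer them for anything you actually intend to flash:
Code:
#!/usr/bin/env python
# Rough sketch of the mask edit + checksum fix described above.
# Offsets and "old" byte values are from the Zotac dump shown earlier and
# will differ for other vendor BIOSes - check your own dump first.

def patch(rom, offset, old, new):
    # Refuse to write if the bytes on disk are not what we expect.
    if rom[offset:offset + len(old)] != old:
        raise ValueError("unexpected bytes at 0x%X, aborting" % offset)
    rom[offset:offset + len(new)] = new

def fix_checksum(rom):
    # Assumption: standard PCI option-ROM checksum (all bytes of the image,
    # whose length is at offset 0x02 in 512-byte units, sum to 0 mod 256),
    # padded via the last byte of the image.
    size = rom[0x02] * 512
    rom[size - 1] = 0
    rom[size - 1] = (-sum(rom[:size])) & 0xFF

with open("original.rom", "rb") as f:          # placeholder file names
    rom = bytearray(f.read())

patch(rom, 0x1C, bytes.fromhex("FFC3FF2F"), bytes.fromhex("FFFFFF7F"))
patch(rom, 0x20, bytes.fromhex("00040080"), bytes.fromhex("00000080"))
fix_checksum(rom)

with open("modified.rom", "wb") as f:
    f.write(rom)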
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gnif on April 05, 2013, 11:07:34 pm
Hello all again ;) I have good news.

I successfully modified
Zotac PCI-E NV ZT-60206-10L GT640 Synergy 2G 128bit DDR3 900/1600 DVI*2+mHDMI RTL
to an NVIDIA GRID K1. It is working fine; passthrough works too. BUT it is only possible after a BIOS modification. The BIOS modification is needed only for specific vendors.

Great work!

And you should use the unlocked BIOS gt640om.rom. I attached it to the post.
The original BIOS is gt640ori.rom. I changed the masks and updated the checksum.

Please remove the BIOS from here, it is a copyrighted work and could bring NVidia down on this forum.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: Neo_Moucha on April 06, 2013, 01:32:33 am

upd:
removing resistor 1 may cause random ID changes after reboot :) I will update post after i solve it


Any news about this? :)
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gnif on April 06, 2013, 01:36:44 am

upd:
removing resistor 1 may cause random ID changes after reboot :) I will update post after i solve it


Any news about this? :)

I have a donated 680 coming in a few days; I will see if I can track this down. All my SMD components arrived as well, so I can do a professional rework now :).

Note: donated for testing, not for keeps :(
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: Neo_Moucha on April 06, 2013, 01:38:10 am
this is great news gnif!!! :)
I have a GTX 680 ready for making the mod after it is resolved >D
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: verybigbadboy on April 06, 2013, 04:05:11 am

upd:
removing resistor 1 may cause random ID changes after reboot :) I will update post after i solve it


Any news about this? :)

Just removing the resistor is not a good idea. The value on the pin may change randomly because the pin is not connected; it needs to be connected with a 40k resistor. I have not tried to find the right place for it yet, because it only happened once, after a kernel panic & hard reset ;), and rebooting again solved it.
It is still an open issue.

Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: smiddereens on April 07, 2013, 05:50:02 am
I successfully modified
Zotac PCI-E NV ZT-60206-10L GT640 Synergy 2G 128bit DDR3 900/1600 DVI*2+mHDMI RTL
to an NVIDIA GRID K1. It is working fine; passthrough works too. BUT the device ID modification is only possible after a BIOS modification. The BIOS modification is needed only for specific vendors.

Congratulations on the mod! I'm interested in doing the same thing with my GT 640 but it appears that its PCB is laid out quite differently from yours. It is an EVGA 02G-P4-2645-KR, a photo of its back is attached.

Does this mean that I would need to remove the cooling unit to reach the necessary resistors?
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: verybigbadboy on April 07, 2013, 06:53:28 am
Does this mean that I would need to remove the cooling unit to reach the necessary resistors?

Yes.

GT 640 front pic:
http://www.hdd.com.pl/zdjecia/106545/VGA/NVD/VGAEVGNVD0260/2.jpg (http://www.hdd.com.pl/zdjecia/106545/VGA/NVD/VGAEVGNVD0260/2.jpg)
I just looked at your card and I think U10 is the EEPROM. You need to check the resistors around it.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: jnowak1054 on April 08, 2013, 12:47:03 pm
Looking at the new GTX 650 Ti Boost, it looks to be almost identical to the Quadro K4000 apart from the memory (same processor and CUDA core count). Would it be a hardware-only mod, or would it also require a BIOS or soft-strap mod to be done properly?
http://www.newegg.com/Product/Product.aspx?Item=N82E16814133485 (http://www.newegg.com/Product/Product.aspx?Item=N82E16814133485)
http://www.newegg.com/Product/Product.aspx?Item=N82E16814130909 (http://www.newegg.com/Product/Product.aspx?Item=N82E16814130909)
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: verybigbadboy on April 08, 2013, 08:09:45 pm
Looking at the new GTX 650 Ti Boost, it looks to be almost identical to the Quadro K4000 apart from the memory (same processor and CUDA core count). Would it be a hardware-only mod, or would it also require a BIOS or soft-strap mod to be done properly?
It is impossible to say until someone gets the BIOS from the card.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: ks6g10 on April 09, 2013, 12:18:42 am
For those who have changed the "model" of their cards, does this enable the use of nvidia-smi options, and does it enable the use of higher CUDA compute capabilities?

E.g. going from compute capability 3.0 (GTX 670) to 3.5 (K20) would enable funnel shift as described in http://stackoverflow.com/questions/12767113/funnel-shift-what-is-it (http://stackoverflow.com/questions/12767113/funnel-shift-what-is-it)

Thank you.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: myweb on April 09, 2013, 07:22:21 am
Dear All,
I have an Asus GT640-1GD3-L card and I would like to make it recognizable as a K2000 in order to have working VGA passthrough in Xen (currently the guest Windows recognizes the card as a GT640, but shows error 43).
Please find below photos of the Asus GT640-1GD3-L:
http://www.overclockers.ru/images/lab/2012/12/24/1/15_ASUS_back_big.jpg (http://www.overclockers.ru/images/lab/2012/12/24/1/15_ASUS_back_big.jpg)
http://www.overclockers.ru/images/lab/2012/12/24/1/18_ASUS_PCB_big.jpg (http://www.overclockers.ru/images/lab/2012/12/24/1/18_ASUS_PCB_big.jpg)
Could you please specify which resistors on the photo should be replaced?
Is there a software method to make the Asus GT640-1GD3-L recognizable as a K2000?
I have already checked the BIOS:
Code:
010: 08 e2 00 00 00 04 00 00 02 10 10 82 ff ff ff 7f
020: 00 00 00 80 0e 10 10 82 ff ff ff 7f 00 00 00 80
Looks like the BIOS is correct.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: airthimble on April 09, 2013, 01:30:17 pm
For those who have changed the "model" of their cards, does this enable the use of nvidia-smi options, and does it enable the use of higher CUDA compute capabilities?

E.g. going from compute capability 3.0 (GTX 670) to 3.5 (K20) would enable funnel shift as described in http://stackoverflow.com/questions/12767113/funnel-shift-what-is-it (http://stackoverflow.com/questions/12767113/funnel-shift-what-is-it)

Thank you.

From what I understand this wouldn't be possible: the 670 uses the GK104 chip while the K20 uses the GK110, and I don't think the 670 has the hardware to support these features.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: verybigbadboy on April 09, 2013, 05:37:04 pm
Hello myweb,
Dear All,
I have an Asus GT640-1GD3-L card and I would like to make it recognizable as a K2000 in order to have working VGA passthrough in Xen (currently the guest Windows recognizes the card as a GT640, but shows error 43).
The Quadro K2000 does not support GPU passthrough.
http://wiki.xen.org/wiki/Xen_VGA_Passthrough_Tested_Adapters (http://wiki.xen.org/wiki/Xen_VGA_Passthrough_Tested_Adapters)
http://hcl.xensource.com/GPUPass-throughDeviceList.aspx (http://hcl.xensource.com/GPUPass-throughDeviceList.aspx)
You need to modify it to a GRID K1.

Please find below photos of the Asus GT640-1GD3-L:
http://www.overclockers.ru/images/lab/2012/12/24/1/15_ASUS_back_big.jpg (http://www.overclockers.ru/images/lab/2012/12/24/1/15_ASUS_back_big.jpg)
http://www.overclockers.ru/images/lab/2012/12/24/1/18_ASUS_PCB_big.jpg (http://www.overclockers.ru/images/lab/2012/12/24/1/18_ASUS_PCB_big.jpg)
Could you please specify which resistors on the photo should be replaced?
It is hard to say where the resistors are located without an ohmmeter.
I am not sure, but I think they are near the big capacitors, and I think the top SOP-8 IC is the EEPROM. The resistors are located on the front and back near empty resistor pads. If you have an ohmmeter you can try to find them yourself.

Is there a software method to make the Asus GT640-1GD3-L recognizable as a K2000?
I have already checked the BIOS:
Code:
010: 08 e2 00 00 00 04 00 00 02 10 10 82 ff ff ff 7f
020: 00 00 00 80 0e 10 10 82 ff ff ff 7f 00 00 00 80
Looks like the BIOS is correct.
Your BIOS is fine. You only need to change the resistors.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: seneelya on April 10, 2013, 04:10:50 am
Hello everyone!
First of all, I'm very happy about your great and very important work!

I have been following this topic since the very beginning and I'm very, very interested in this genius modification, because it is the only possible way for me (and not only for me) to obtain a "REAL"   ;) K5000 in the near future.
But as I see it, only one performance test in professional software (SPECviewperf 11) has been done, and with very sad results.

So is there hope of unlocking the professional OpenGL performance features at some point in the future? Or maybe one of the hard + soft modifications has already done it? I mean GT 640 -> K2000 or similar.

I have a Quadro 4000 at work, so if I can do something non-destructive with it for the Great Future =) I will do it for you.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: myweb on April 10, 2013, 06:14:04 am
The Quadro K2000 does not support GPU passthrough.
http://wiki.xen.org/wiki/Xen_VGA_Passthrough_Tested_Adapters (http://wiki.xen.org/wiki/Xen_VGA_Passthrough_Tested_Adapters)
http://hcl.xensource.com/GPUPass-throughDeviceList.aspx (http://hcl.xensource.com/GPUPass-throughDeviceList.aspx)
You need to modify it to a GRID K1.
I am confused: the information provided via the mentioned links confirms that the K2000 works with Xen passthrough. Please correct me if I have understood the information incorrectly.

It is hard to say where the resistors are located without an ohmmeter.
I am not sure, but I think they are near the big capacitors, and I think the top SOP-8 IC is the EEPROM. The resistors are located on the front and back near empty resistor pads. If you have an ohmmeter you can try to find them yourself.
Yes, I have an ohmmeter - I will try to find them, but I need a starting region (the place where the needed resistors could be located, from your point of view). Please find below detailed photos of the Pm25LD020 and the area around it, plus the back side (it would be nice if you could highlight the resistors that I should check first):
https://dl.dropbox.com/u/52618061/IMG_0249.JPG
https://dl.dropbox.com/u/52618061/IMG_0250.JPG
https://dl.dropbox.com/u/52618061/IMG_0251.JPG
https://dl.dropbox.com/u/52618061/IMG_0253.JPG
https://dl.dropbox.com/u/52618061/IMG_0254.JPG
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: verybigbadboy on April 10, 2013, 05:04:37 pm
The Quadro K2000 does not support GPU passthrough.
http://wiki.xen.org/wiki/Xen_VGA_Passthrough_Tested_Adapters (http://wiki.xen.org/wiki/Xen_VGA_Passthrough_Tested_Adapters)
http://hcl.xensource.com/GPUPass-throughDeviceList.aspx (http://hcl.xensource.com/GPUPass-throughDeviceList.aspx)
You need to modify it to a GRID K1.
I am confused: the information provided via the mentioned links confirms that the K2000 works with Xen passthrough. Please correct me if I have understood the information incorrectly.
I think you are mixing up the Quadro 2000 and the Quadro K2000; they are different cards.
Please read the links again ;)

Yes, I have an ohmmeter - I will try to find them, but I need a starting region (the place where the needed resistors could be located, from your point of view). Please find below detailed photos of the Pm25LD020 and the area around it, plus the back side (it would be nice if you could highlight the resistors that I should check first):
https://dl.dropbox.com/u/52618061/IMG_0249.JPG (https://dl.dropbox.com/u/52618061/IMG_0249.JPG)
https://dl.dropbox.com/u/52618061/IMG_0250.JPG (https://dl.dropbox.com/u/52618061/IMG_0250.JPG)
https://dl.dropbox.com/u/52618061/IMG_0251.JPG (https://dl.dropbox.com/u/52618061/IMG_0251.JPG)
https://dl.dropbox.com/u/52618061/IMG_0253.JPG (https://dl.dropbox.com/u/52618061/IMG_0253.JPG)
https://dl.dropbox.com/u/52618061/IMG_0254.JPG (https://dl.dropbox.com/u/52618061/IMG_0254.JPG)

So I traced pin 6 of the EEPROM from the photos, and I think:
R532 is R1 and should be 25k
R558 is R2
from https://www.eevblog.com/forum/projects/hacking-nvidia-cards-into-their-professional-counterparts/msg213332/#msg213332 (https://www.eevblog.com/forum/projects/hacking-nvidia-cards-into-their-professional-counterparts/msg213332/#msg213332)

So you need to find the R3 and R4 locations: look for a 10k resistor with an empty pad nearby.
Just unsolder each 10k resistor step by step, test the PCI device ID, and solder the resistor back.
It may help to look at the picture in my post; in my case there were 5k and 10k resistors near R3 and R4.
Be careful not to unsolder the 10k resistors that are connected to the FETs.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: silicman on April 10, 2013, 10:28:48 pm
I've got my PNY GTX 670. Can I follow your guide "mod GTX 680 (https://www.eevblog.com/forum/projects/hacking-nvidia-cards-into-their-professional-counterparts/msg207550/#msg207550)" to turn it into a Quadro?
I'm not an electrician, I just like this kind of work, guys. I think it's a bit risky, but that's OK. I love your work.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: myweb on April 11, 2013, 07:22:24 am
I think you are mixing up the Quadro 2000 and the Quadro K2000; they are different cards.
Please read the links again ;)
Yes, you are right :)

So I traced pin 6 of the EEPROM from the photos, and I think:
R532 is R1 and should be 25k
R558 is R2
from https://www.eevblog.com/forum/projects/hacking-nvidia-cards-into-their-professional-counterparts/msg213332/#msg213332 (https://www.eevblog.com/forum/projects/hacking-nvidia-cards-into-their-professional-counterparts/msg213332/#msg213332)

So you need to find the R3 and R4 locations: look for a 10k resistor with an empty pad nearby.
Just unsolder each 10k resistor step by step, test the PCI device ID, and solder the resistor back.
It may help to look at the picture in my post; in my case there were 5k and 10k resistors near R3 and R4.
Be careful not to unsolder the 10k resistors that are connected to the FETs.
You are right again: R532 is 25K and it's R1, R558 is R2.
R3 and R4 are the resistors near the mounting hole.
As a result my video card is recognized as an Nvidia G1.
The guest Windows also recognizes it as an Nvidia G1, but shows the same error: "Windows has stopped this device because it has reported problems. (Code 43)" :(. I use Ubuntu 13.04 (Beta), Xen 4.2.1, an ASRock Z77 Pro4 and a Core i5-3470.
Could you please help me solve the issue?
verybigbadboy, could you please specify the software versions you used to get VGA passthrough working on the GT640 (modified to GRID K1)?

Thank you in advance!
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: verybigbadboy on April 11, 2013, 07:15:49 pm
You are right again: R532 is 25K and it's R1, R558 is R2.
R3 and R4 are the resistors near the mounting hole.
I attached your photo with the resistors marked; can you check the marks? I would like to add this photo to the GT640 post.

The guest Windows also recognizes it as an Nvidia G1, but shows the same error: "Windows has stopped this device because it has reported problems. (Code 43)" :(. I use Ubuntu 13.04 (Beta), Xen 4.2.1, an ASRock Z77 Pro4 and a Core i5-3470.
Could you please help me solve the issue?
verybigbadboy, could you please specify the software versions you used to get VGA passthrough working on the GT640 (modified to GRID K1)?
I think it is an NVidia driver issue.
Can you try removing the NVidia GeForce drivers and installing the Quadro drivers afterwards?
Also, can you check whether the card works correctly without Xen?

pc: debian 6 xen 4.2
home pc: gentoo, kernel 3.7.10, qemu 1.4.0 + libvirt and virt-manager for config.

Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: myweb on April 11, 2013, 09:32:00 pm
I attached your photo with the resistors marked; can you check the marks? I would like to add this photo to the GT640 post.
Yes, sure, I will check this evening at home.
I think it is an NVidia driver issue.
Can you try removing the NVidia GeForce drivers and installing the Quadro drivers afterwards?
Also, can you check whether the card works correctly without Xen?
Could you please specify which version of the NVidia driver you used when checking VGA passthrough?
I had installed the NVidia GeForce drivers on the VM before I made the resistor modifications. When the modified video card was installed, Windows said that the device driver was not found, so I downloaded and installed the Quadro drivers. Installing the Quadro drivers seems to have uninstalled the GeForce drivers (I do not see the GeForce driver in Add/Remove Programs).
OK, I will check whether the card works on Ubuntu without Xen. Also, could you please clarify how the GT640 (GRID K1 mod) should work with the DVI and HDMI outputs?


pc: debian 6 xen 4.2
home pc: gentoo, kernel 3.7.10, qemu 1.4.0 + libvirt and virt-manager for config.
Did you compile Xen from source and apply the patches for NVidia passthrough support, or did you use Xen 4.2 from the repository?
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: verybigbadboy on April 12, 2013, 02:05:10 am
Could you please specify which version of the NVidia driver you used when checking VGA passthrough?
I had installed the NVidia GeForce drivers on the VM before I made the resistor modifications. When the modified video card was installed, Windows said that the device driver was not found, so I downloaded and installed the Quadro drivers. Installing the Quadro drivers seems to have uninstalled the GeForce drivers (I do not see the GeForce driver in Add/Remove Programs).
OK, I will check whether the card works on Ubuntu without Xen. Also, could you please clarify how the GT640 (GRID K1 mod) should work with the DVI and HDMI outputs?
I tested the DVI outputs; they work fine.

pc: debian 6 xen 4.2
home pc: gentoo, kernel 3.7.10, qemu 1.4.0 + libvirt and virt-manager for config.
Did you compile Xen from source and apply the patches for NVidia passthrough support...
Yes, but I think it should work without patches.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: myweb on April 12, 2013, 07:36:56 am
I attached your photo with the resistors marked; can you check the marks? I would like to add this photo to the GT640 post.
The photo is correct.
I successfully enabled the GT640 (GRID K1 mod) with the Nvidia and Nouveau drivers as a secondary card under Ubuntu WITHOUT Xen.
The Windows guest under Xen was booted from the Ubuntu live image, and the video output automatically switched to the GT640 (GRID K1 mod) HDMI (the colour has some hue, but nothing significant).
Windows 7 x64 SP1 still can't work with the GT640 (GRID K1 mod) - I have removed all drivers and installed the latest one downloaded from the Nvidia site: Error 43  |O

verybigbadboy, did you check VGA passthrough before making the resistor modification?
Are there any ideas on how to make Windows work with the Nvidia GT640 (GRID K1 mod) via Xen VGA passthrough?

Thank you in advance.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: Saibot on April 12, 2013, 11:19:41 pm
verybigbadboy, thank you so much for all the great information! :) I have now modified a GTX 680 into a GRID K2 following your instructions, and I am running it in my VMware View 5.2 environment and it is working perfectly.

But it would be great if I could use a card with 3 or 4GB of memory; I don't want to buy a card if I am not sure which resistors to change. Do you know of any such card?
/Saibot
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: SeSl on April 13, 2013, 08:28:11 pm
Hi, everybody! Thanks for an interesting topic.
The most interesting to me is the "mod GTX 680 (https://www.eevblog.com/forum/projects/hacking-nvidia-cards-into-their-professional-counterparts/msg207550/#msg207550)" to Quadro K5000. Right now I do not have any graphics card based on the GK104 chip, so I am free in my choice. The most promising option I see is a GTX 680 with 4GB of memory; for a K5000 it is important to have the largest possible amount of onboard memory.

Only one such card has had a successful modding experience so far: the EVGA 04G-P4-3687-KR GeForce 4GB GTX 680 mod to K5000 (https://www.eevblog.com/forum/projects/hacking-nvidia-cards-into-their-professional-counterparts/msg210155/#msg210155). Unfortunately, reefjunkie did not describe the details of this mod. Possibly the PCB was the same as on the GV-N680OC-2GD.
I am considering the following devices:

Gigabyte GV-N680OC-4GD
photo#1reverse side (http://www.nix.ru/autocatalog/gigabyte/video/147201_2258_draft.jpg)
photo#2 reverse side (http://www.hardwareluxx.de/images/stories/galleries/reviews/2012/gigabyte-680oc/gigabyte-680-2.jpg)
photo#3 reverse side (http://www.hardwareluxx.de/images/stories/galleries/reviews/2012/gigabyte-680oc/gigabyte-680-5.jpg)
photo#4 reverse side (http://www.ixbt.com/video3/images/gk104-10/gigabyte-gtx680-scan-back.jpg)
photo front side (http://www.ixbt.com/video3/images/gk104-10/gigabyte-gtx680-scan-front.jpg)
Unlike the GV-N680OC-2GD, Y1 has been repositioned to the rear side, and the connectors and resistors are different. I could not find their locations on my own.

ZOTAC GTX 680 4GB [ZT-60103-10P]
back side (http://www.nix.ru/autocatalog/zotac/134525_2258_draft.jpg)
front side (http://www.easycom.com.ua/data/video/1304062052/img/07_ZOTAC_GeForce_GTX_680_AMP_Edition_Dual_Silencer.JPG)
The video card is similar to the GV-N680OC-2GD, so the mod would probably be similar. The board kept the reference design. It is strange that nobody has chosen it for the experiment.

EVGA GeForce GTX 680 FTW+ 4GB [04G-P4-3687-KR]
I have not found pictures of the PCB. The mod done by reefjunkie is described here (https://www.eevblog.com/forum/projects/hacking-nvidia-cards-into-their-professional-counterparts/msg210155/#msg210155), however there is no description or photos of the device before and after the mod. If possible, reefjunkie, please give more details.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: amigo on April 13, 2013, 11:25:57 pm
Looking at the new GTX 650 Ti Boost, it looks to be almost identical to the Quadro K4000 apart from the memory (same processor and CUDA core count). Would it be a hardware-only mod, or would it also require a BIOS or soft-strap mod to be done properly?
It is impossible to say until someone gets the BIOS from the card.

A ROM BIOS for the K4000 is available here: http://www.techpowerup.com/vgabios/130511/NVIDIA.QuadroK4000.3072.120813.rom (http://www.techpowerup.com/vgabios/130511/NVIDIA.QuadroK4000.3072.120813.rom)
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: amigo on April 13, 2013, 11:37:14 pm
There's also the PNY GTX 680 4GB - I have one: http://www.amazon.com/PNY-GeForce-Graphics-Cards-VCGGTX6804XPB/dp/B009MQTVTE (http://www.amazon.com/PNY-GeForce-Graphics-Cards-VCGGTX6804XPB/dp/B009MQTVTE) - which appears basically identical to the EVGA one.

Not yet modded to a K5000; I'm thinking of putting in a DIP switch so I can switch between GTX 680 and K5000... :)
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: amigo on April 13, 2013, 11:44:30 pm
Hi, everybody! Thanks for an interesting topic.
The most interesting to me is the "mod GTX 680 (https://www.eevblog.com/forum/projects/hacking-nvidia-cards-into-their-professional-counterparts/msg207550/#msg207550)" to Quadro K5000. Right now I do not have any graphics card based on the GK104 chip, so I am free in my choice. The most promising option I see is a GTX 680 with 4GB of memory; for a K5000 it is important to have the largest possible amount of onboard memory.

Only one such card has had a successful modding experience so far: the EVGA 04G-P4-3687-KR GeForce 4GB GTX 680 mod to K5000 (https://www.eevblog.com/forum/projects/hacking-nvidia-cards-into-their-professional-counterparts/msg210155/#msg210155). Unfortunately, reefjunkie did not describe the details of this mod. Possibly the PCB was the same as on the GV-N680OC-2GD.

I am considering the following devices:

Gigabyte GV-N680OC-4GD
ZOTAC GTX 680 4GB [ZT-60103-10P]
EVGA GeForce GTX 680 FTW+ 4GB [04G-P4-3687-KR]

The Gigabyte board is not a reference design; they have redesigned it (look, it has 6+8 pin power connectors).

Zotac has a board with 6+6, as do EVGA and PNY (which I believe are identical).

I would probably go with EVGA or PNY (I have the latter).

I believe that moving resistors around might be only half the process of getting the true high-end features (SPECviewperf).

The system *will* detect the new hardware, but it is possible that the ROM BIOS also plays a role in the configuration it presents to the NV system driver. Thus the ROM needs to be modded/flashed, or alternatively the NV system driver hacked. Who's up for it? :)
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: eos on April 14, 2013, 12:51:33 am
EVGA GeForce GTX 680 FTW+ 4GB [04G-P4-3687-KR]
I have not found pictures of the PCB.
Here is one (taken from http://www.evga.com/forums/tm.aspx?m=1664376&mpage=1 (http://www.evga.com/forums/tm.aspx?m=1664376&mpage=1))
http://i18.photobucket.com/albums/b147/ArcticSilver/EVGA%20GTX%20680%20FTW%204Gb%20with%20Accelero%20Twin%20Turbo%20II/IMAG0212.jpg (http://i18.photobucket.com/albums/b147/ArcticSilver/EVGA%20GTX%20680%20FTW%204Gb%20with%20Accelero%20Twin%20Turbo%20II/IMAG0212.jpg)
It's identical to the 2GB one and I think this is the reason reefjunkie didn't elaborate on what was done.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: eos on April 14, 2013, 01:30:40 am
Here is an article about a utility to modify power states (P0, P3, P8 and P12):
http://www.overclock.net/t/964370/howto-dual-monitor-downclocking-fix-for-nvidia-cards (http://www.overclock.net/t/964370/howto-dual-monitor-downclocking-fix-for-nvidia-cards)
It seems to apply to any NV card...

EDIT
Here is an interesting writeup about nvidia-smi:
http://microway.com/hpc-tech-tips/2011/12/nvidia-smi_control-your-gpus/ (http://microway.com/hpc-tech-tips/2011/12/nvidia-smi_control-your-gpus/)
A couple of things:
- it is (supposed to) work only on professional cards
- power state selection is semi-automatic

If the virtual GPU is passed to the VM in the P12 state (2D), most likely no advantage from the modding will be realized.
That might explain why some people see better benchmarks and some don't.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: victorngcm on April 14, 2013, 02:28:58 am
Hi, everybody! Thanks for an interesting topic.
The most interesting to me is the "mod GTX 680 (https://www.eevblog.com/forum/projects/hacking-nvidia-cards-into-their-professional-counterparts/msg207550/#msg207550)" to Quadro K5000. Right now I do not have any graphics card based on the GK104 chip, so I am free in my choice. The most promising option I see is a GTX 680 with 4GB of memory; for a K5000 it is important to have the largest possible amount of onboard memory.

Only one such card has had a successful modding experience so far: the EVGA 04G-P4-3687-KR GeForce 4GB GTX 680 mod to K5000 (https://www.eevblog.com/forum/projects/hacking-nvidia-cards-into-their-professional-counterparts/msg210155/#msg210155). Unfortunately, reefjunkie did not describe the details of this mod. Possibly the PCB was the same as on the GV-N680OC-2GD.
I am considering the following devices:

Gigabyte GV-N680OC-4GD
photo #1 reverse side (http://www.nix.ru/autocatalog/gigabyte/video/147201_2258_draft.jpg)
photo#2 reverse side (http://www.hardwareluxx.de/images/stories/galleries/reviews/2012/gigabyte-680oc/gigabyte-680-2.jpg)
photo#3 reverse side (http://www.hardwareluxx.de/images/stories/galleries/reviews/2012/gigabyte-680oc/gigabyte-680-5.jpg)
photo#4 reverse side (http://www.ixbt.com/video3/images/gk104-10/gigabyte-gtx680-scan-back.jpg)
photo front side (http://www.ixbt.com/video3/images/gk104-10/gigabyte-gtx680-scan-front.jpg)
Unlike the GV-N680OC-2GD, Y1 has been repositioned to the rear side, and the connectors and resistors are placed differently. I have not been able to find their location on my own.

ZOTAC GTX 680 4GB [ZT-60103-10P]
back side (http://www.nix.ru/autocatalog/zotac/134525_2258_draft.jpg)
front side (http://www.easycom.com.ua/data/video/1304062052/img/07_ZOTAC_GeForce_GTX_680_AMP_Edition_Dual_Silencer.JPG)
This card is similar to the GV-N680OC-2GD, so the mod would likely be similar. The board keeps the reference design. It is strange that nobody has chosen it for the experiment yet.

EVGA GeForce GTX 680 FTW+ 4GB [04G-P4-3687-KR]
I have not found any pictures of the PCB. The mod executed by reefjunkie is described here (https://www.eevblog.com/forum/projects/hacking-nvidia-cards-into-their-professional-counterparts/msg210155/#msg210155), however there is no description or photos of the card before and after the mod. If possible, reefjunkie, please give more details.


So... is it not possible to mod a Gigabyte GTX 680 4GB to a K5000?
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: SeSl on April 14, 2013, 03:14:34 am
EVGA GeForce GTX 680 FTW+ 4GB [04G-P4-3687-KR]
I have not found any pictures of the PCB.
Here is one (taken from http://www.evga.com/forums/tm.aspx?m=1664376&mpage=1 (http://www.evga.com/forums/tm.aspx?m=1664376&mpage=1))
http://i18.photobucket.com/albums/b147/ArcticSilver/EVGA%20GTX%20680%20FTW%204Gb%20with%20Accelero%20Twin%20Turbo%20II/IMAG0212.jpg (http://i18.photobucket.com/albums/b147/ArcticSilver/EVGA%20GTX%20680%20FTW%204Gb%20with%20Accelero%20Twin%20Turbo%20II/IMAG0212.jpg)
It's identical to the 2GB one and I think this is the reason reefjunkie didn't elaborate on what was done.

Thank you! Now I see.

The Gigabyte board is not a reference design; they have redesigned it (note it has 6+8-pin power connectors).
Zotac has a 6+6-pin board, as do EVGA and PNY (which I believe are identical).

Zotac - yes (6+6), but the EVGA has an 8+6 power design (http://www.evga.com/articles/00669/#GTX680FTW). There is also a variety of 680-series cards (http://www.evga.com/Products/ProductList.aspx?type=0&family=GeForce+600+Series+Family&chipset=GTX+680) made by EVGA. How do I find out which of them is identical to the reference design? In Russia two 4GB variants are sold: the EVGA GTX 680 04G-P4-3687-KR and the EVGA GTX 680 04G-P4-2686-KR.
I have excluded the PNY version; it is not possible to buy one anywhere near me. So Zotac vs EVGA is the question. Why has nobody tried the Zotac yet? On the other hand, the 04G-P4-3687-KR has already been tested in this mod, so it is possibly less risk.


So... is it not possible to mod a Gigabyte GTX 680 4GB to a K5000?
I have not said that. For me it is simply a difficult task.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: shlomo.m on April 14, 2013, 07:37:35 am
I can confirm the mod done by blanka.
https://www.eevblog.com/forum/projects/hacking-nvidia-cards-into-their-professional-counterparts/msg210798/#msg210798 (https://www.eevblog.com/forum/projects/hacking-nvidia-cards-into-their-professional-counterparts/msg210798/#msg210798)
But I pimped it a little bit.

670GTX to K5000 works!

R4 on the front side.
R1, R2, R3 on the bottom side.

The K5000 works absolutely stably for me, but there is no performance increase in SPECviewperf. I tested with a few different Quadro drivers.

Summary
GPU Name        R1 (0-7, 4th byte)   R2 (8-F, 4th byte)   R3 (3rd, high)   R4 (3rd, low)
GTX 660 Ti      20K                  none                 none             25K
GTX 670         none                 10K                  none             25K
Tesla K10       none                 40K                  none             25K
Quadro K5000    none                 15K                  40K              none
GRID K2         none                 40K                  40K              none

I flashed it (EVGA 670GTX 2GB 915MHz) with the K5000 bios from techpowerup.
"nvflash.exe -4 -5 -6 K5000.rom" had to be used because of different subsystem and board id.

It started with minor pixel errors but booted into win7.
After driver installation and reboot win7 didn't start anymore.
Flashing it back worked without problems.
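
For anyone planning the same swap, here is a minimal sketch (Python, purely illustrative) that encodes the summary table above as a lookup and lists which resistor positions differ between two IDs; the values come straight from the table, and the file and function names are made up:

Code: [Select]
# strap_lookup.py -- illustrative only; values copied from the summary table above.
# None means "leave that resistor position unpopulated".

STRAPS = {
    "GTX 660 Ti":   {"R1": "20K", "R2": None,  "R3": None,  "R4": "25K"},
    "GTX 670":      {"R1": None,  "R2": "10K", "R3": None,  "R4": "25K"},
    "Tesla K10":    {"R1": None,  "R2": "40K", "R3": None,  "R4": "25K"},
    "Quadro K5000": {"R1": None,  "R2": "15K", "R3": "40K", "R4": None},
    "GRID K2":      {"R1": None,  "R2": "40K", "R3": "40K", "R4": None},
}

def changes(source, target):
    """Return the resistor positions that differ between two IDs."""
    a, b = STRAPS[source], STRAPS[target]
    return {r: (a[r], b[r]) for r in a if a[r] != b[r]}

if __name__ == "__main__":
    # Example: going from a GTX 670 to a Quadro K5000
    for r, (old, new) in changes("GTX 670", "Quadro K5000").items():
        print(f"{r}: {old or 'empty'} -> {new or 'empty'}")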
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: eos on April 14, 2013, 08:25:24 am
To those that successfully modded their cards and run benchmarks:

Could you download the NVIDIA Inspector from here
http://www.majorgeeks.com/NVIDIA_Inspector_d6630.html (http://www.majorgeeks.com/NVIDIA_Inspector_d6630.html)
and check what power state it is in when running the benchmark.

You might want to add the benchmark to the list of programs that force the card into the P0 state
http://www.overclock.net/t/964370/howto-dual-monitor-downclocking-fix-for-nvidia-cards (http://www.overclock.net/t/964370/howto-dual-monitor-downclocking-fix-for-nvidia-cards)
and run the benchmark again.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: verybigbadboy on April 14, 2013, 08:50:47 am
and check what power state it is in when running the benchmark.

P0
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: eos on April 14, 2013, 09:37:03 am
P0
Have you noticed improvements when running SPECviewperf (where it is supposed to be most visible)?
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: SeSl on April 14, 2013, 03:35:30 pm
I can confirm the mod done by blanka.
https://www.eevblog.com/forum/projects/hacking-nvidia-cards-into-their-professional-counterparts/msg210798/#msg210798 (https://www.eevblog.com/forum/projects/hacking-nvidia-cards-into-their-professional-counterparts/msg210798/#msg210798)
But I pimped it a little bit.

670GTX to K5000 works!

R4 on the front side.
R1, R2, R3 on the bottom side.

The K5000 works absolutely stably for me, but there is no performance increase in SPECviewperf. I tested with a few different Quadro drivers.

Summary
GPU Name        R1 (0-7, 4th byte)   R2 (8-F, 4th byte)   R3 (3rd, high)   R4 (3rd, low)
GTX 660 Ti      20K                  none                 none             25K
GTX 670         none                 10K                  none             25K
Tesla K10       none                 40K                  none             25K
Quadro K5000    none                 15K                  40K              none
GRID K2         none                 40K                  40K              none

I flashed it (EVGA 670GTX 2GB 915MHz) with the K5000 bios from techpowerup.
"nvflash.exe -4 -5 -6 K5000.rom" had to be used because of different subsystem and board id.

It started with minor pixel errors but booted into win7.
After driver installation and reboot win7 didn't start anymore.
Flashing it back worked without problems.

Accept my congratulations!
It is of course sad that the tests do not show a performance increase. For me this is the important point: I need the card for working with CATIA, and in the case of a negative result I will be forced to buy a real K5000 :-[ Apparently the same goes for many readers of this thread. In any case, thanks for the work done!
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: WillV on April 14, 2013, 04:11:15 pm
Was wondering if there was anyone who has successfully got the EVGA GT 640 cards working?  I have the 4gb model of the one mentioned earlier and haven't been able to track down the area needed.  I've attached two pics of mine with the cooler removed.  The backside is identical to the pic posted earlier.  Any ideas or help is greatly appreciated.  I am also looking into this for a VMWare vSGA solution like others have mentioned.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: verybigbadboy on April 14, 2013, 06:05:13 pm
Was wondering if there was anyone who has successfully got the EVGA GT 640 cards working?
Hi WillV,
Can you take photos near the U10 SOP-8 IC please, and post a link to the back side?
Thank you.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: WillV on April 14, 2013, 06:31:41 pm
Does this help?  I'll go ahead and post my pic of the back as well. 
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: verybigbadboy on April 14, 2013, 06:45:39 pm
Does this help?  I'll go ahead and post my pic of the back as well.
Sorry, but the trace is located under U10; it is impossible to find it without an ohmmeter. You need to find the resistor (and the empty resistor pad) connected to pin 6 of U10.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: WillV on April 14, 2013, 06:50:04 pm
Ah, that gives me a starting point at least.  I'll see what I can find.  Thank you.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: myweb on April 15, 2013, 07:01:31 am
Dear All,

Could you please help me to make working configuration:
Core i5 i5-3470
Asrock z77 pro4
ASUS GT640-1GD3-L (mod Grid K1)
ASUS HD7750-DCSL-1GD5

Xen 4.2.1
Kubuntu 13.04 (Beta)
3.8.0-17-generic #27-Ubuntu SMP Sun Apr 7 19:39:35 UTC 2013 x86_64 x86_64 x86_64 GNU/Linux

Both video cards work without Xen.
Neither video card works in DomU.
I tried these DomU guests: Windows 7, Windows 8, Kubuntu 13.04 (Beta).

I get a BSOD during boot with the ASUS HD7750-DCSL-1GD5 under Windows 7 and 8: atikmpag.sys.
I tried to install only the drivers (that is impossible with the drivers from the AMD website: Windows could not find atikmpag.sys, so I installed the drivers from the ASUS site).
Windows 8 has a built-in driver.

I use the xorg-edgers PPA under Ubuntu - exactly the same setup that works without Xen.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: zuluriney on April 15, 2013, 11:40:23 pm
Hi,
first of all, great work :)

I will be getting a Zotac ZT-60106-10P GTX 680 with 4GB RAM soon-ish and proceed to start testing..
The goal is a K5000.

Does anyone have experience with Zotac cards? Do they keep the reference layout?
If yes, do I need to remove the cooler, or are the resistors on the back side?

Thanks to everyone ^^

Cheers
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: eos on April 16, 2013, 04:25:16 am
I will be getting a Zotac ZT-60106-10P GTX 680 with 4GB RAM soon-ish and proceed to start testing..
The goal is a K5000.
Why don't you get a card that is known to be moddable?
https://www.eevblog.com/forum/projects/hacking-nvidia-cards-into-their-professional-counterparts/msg210155/#msg210155 (https://www.eevblog.com/forum/projects/hacking-nvidia-cards-into-their-professional-counterparts/msg210155/#msg210155)
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: blanka on April 16, 2013, 10:29:16 pm
I can confirm the mod done by blanka.
https://www.eevblog.com/forum/projects/hacking-nvidia-cards-into-their-professional-counterparts/msg210798/#msg210798 (https://www.eevblog.com/forum/projects/hacking-nvidia-cards-into-their-professional-counterparts/msg210798/#msg210798)
But I pimped it a little bit.

670GTX to K5000 works!

R4 on the front side.
R1, R2, R3 on the bottom side.

The K5000 works absolutely stably for me, but there is no performance increase in SPECviewperf. I tested with a few different Quadro drivers.

Summary
GPU Name        R1 (0-7, 4th byte)   R2 (8-F, 4th byte)   R3 (3rd, high)   R4 (3rd, low)
GTX 660 Ti      20K                  none                 none             25K
GTX 670         none                 10K                  none             25K
Tesla K10       none                 40K                  none             25K
Quadro K5000    none                 15K                  40K              none
GRID K2         none                 40K                  40K              none

I flashed it (EVGA 670GTX 2GB 915MHz) with the K5000 bios from techpowerup.
"nvflash.exe -4 -5 -6 K5000.rom" had to be used because of different subsystem and board id.

It started with minor pixel errors but booted into win7.
After driver installation and reboot win7 didn't start anymore.
Flashing it back worked without problems.

Hi, shlomo:

Thanks for your update. I am just too lazy to fine-tune my workaround for those resistors.
The reason your Windows can't start up is that you used the original "K5000" firmware.
DO NOT USE ANY OTHER FIRMWARE UNLESS IT IS FOR THE SAME BOARD LAYOUT.
The original K5000 firmware is for a 4096MB board with the GTX 680 layout.
Since we are using a GTX 670/GTX 660 Ti, keeping the original firmware and modifying only the PCI Device ID is enough.
Please be aware that EVGA's firmware has two places that contain its Device ID.
Use a hex editor, search for 8911 (the GTX 670 device ID 0x1189, byte-swapped) and change it to BA11.
Then use KeplerBIOSTweaker (or any other utility you like) to fix the checksum.
This will make the board run as a K5000 smoothly, without any problem, since you haven't really changed the firmware at all!!!
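
To make the hex-edit step concrete, here is a minimal sketch (Python, illustrative only) of the byte swap described above. It assumes the ROM is a plain binary dump of the card's own firmware, the file names are placeholders, and the checksum still has to be fixed with KeplerBIOSTweaker afterwards:

Code: [Select]
# patch_devid.py -- illustrative sketch of the device ID swap described above.
# The PCI device ID is stored little-endian, so 0x1189 (GTX 670) appears in the
# ROM as the byte pair 89 11 and 0x11BA (Quadro K5000) as BA 11.

OLD_ID = bytes.fromhex("8911")   # GTX 670, as it appears in a hex editor
NEW_ID = bytes.fromhex("BA11")   # Quadro K5000

with open("gtx670_backup.rom", "rb") as f:       # your own firmware dump
    rom = f.read()

print(f"Occurrences of the old device ID: {rom.count(OLD_ID)}")  # EVGA ROMs carry two

with open("gtx670_as_k5000.rom", "wb") as f:
    f.write(rom.replace(OLD_ID, NEW_ID))

# The ROM checksum is now invalid -- fix it with KeplerBIOSTweaker (or a similar
# tool) before flashing the patched file back.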
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: victorngcm on April 18, 2013, 11:14:02 am
I can confirm the mod done by blanka.
https://www.eevblog.com/forum/projects/hacking-nvidia-cards-into-their-professional-counterparts/msg210798/#msg210798 (https://www.eevblog.com/forum/projects/hacking-nvidia-cards-into-their-professional-counterparts/msg210798/#msg210798)
But I pimped it a little bit.

670GTX to K5000 works!

R4 on the front side.
R1, R2, R3 on the bottom side.

The K5000 works absolutely stably for me, but there is no performance increase in SPECviewperf. I tested with a few different Quadro drivers.

Summary
GPU Name        R1 (0-7, 4th byte)   R2 (8-F, 4th byte)   R3 (3rd, high)   R4 (3rd, low)
GTX 660 Ti      20K                  none                 none             25K
GTX 670         none                 10K                  none             25K
Tesla K10       none                 40K                  none             25K
Quadro K5000    none                 15K                  40K              none
GRID K2         none                 40K                  40K              none

I flashed it (EVGA 670GTX 2GB 915MHz) with the K5000 bios from techpowerup.
"nvflash.exe -4 -5 -6 K5000.rom" had to be used because of different subsystem and board id.

It started with minor pixel errors but booted into win7.
After driver installation and reboot win7 didn't start anymore.
Flashing it back worked without problems.

Hi, shlomo:

Thanks for your update. I am just too lazy to fine-tune my workaround for those resistors.
The reason your Windows can't start up is that you used the original "K5000" firmware.
DO NOT USE ANY OTHER FIRMWARE UNLESS IT IS FOR THE SAME BOARD LAYOUT.
The original K5000 firmware is for a 4096MB board with the GTX 680 layout.
Since we are using a GTX 670/GTX 660 Ti, keeping the original firmware and modifying only the PCI Device ID is enough.
Please be aware that EVGA's firmware has two places that contain its Device ID.
Use a hex editor, search for 8911 (the GTX 670 device ID 0x1189, byte-swapped) and change it to BA11.
Then use KeplerBIOSTweaker (or any other utility you like) to fix the checksum.
This will make the board run as a K5000 smoothly, without any problem, since you haven't really changed the firmware at all!!!

My problem is that I have a 4GB Gigabyte GTX 680... which is not the same as the EVGA... I can't find the correct resistors on the board...
Could someone help please?
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: amigo on April 18, 2013, 11:39:44 am
My problem is that I have a 4GB Gigabyte GTX 680... which is not the same as the EVGA... I can't find the correct resistors on the board...
Could someone help please?

It should not make that much difference. What is the brand and model # of your card?
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: victorngcm on April 18, 2013, 12:00:56 pm
My problem is that I have a 4GB Gigabyte GTX 680... which is not the same as the EVGA... I can't find the correct resistors on the board...
Could someone help please?

It should not make that much difference. What is the brand and model # of your card?

It is a Gigabyte GTX 680 4GB.
I couldn't find the Y1 element on the front, only on the back.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: amigo on April 18, 2013, 12:46:53 pm
It is a Gigabyte GTX 680 4GB.
I couldn't find the Y1 element on the front, only on the back.

Like this? http://www.overclockzone.com/articles/zolkorn/2012/12/gigabyte_gtx_680_windforce/IMG_9112.JPG (http://www.overclockzone.com/articles/zolkorn/2012/12/gigabyte_gtx_680_windforce/IMG_9112.JPG)

Right below the rightmost memory chip is the quartz oscillator Y1, and then below it, to the right, there could be an EEPROM.

Technically, the area around the EEPROM is where all the resistors were found on most cards, so perhaps some of those are what you are looking for.

You need to do some tracing and probing with a DMM to see what leads to the EEPROM. Also, the resistors could be on the front side directly above the area where the EEPROM is on the back.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: solarbot on April 22, 2013, 07:21:37 pm
Hello folks - I have just come across your very cool thread, thanks for sharing.

I'm very keen to give this a go and have been searching for specific info on how to do this to an ASUS GTX 660 DirectCU II but without luck - so I wondered if I could run the question/image below past y'all to see if anyone else has done this or can spot the resistors in question from afar?  I'm still waiting for my card so I have nothing in front of me at the moment, but I can hopefully post some decent marked-up images soon.  Wondering if anyone has produced a reference for the different cards - successes and fails?

Cheers for now :-)

(http://rog.asus.com/wp-content/uploads/2012/09/GTX-660-card-underside.jpg)

from:

http://rog.asus.com/161872012/graphics-cards-2/graphics-go-nifty-on-route-660-asus-geforce-gtx-660-directcu-ii-top-unboxed/  (http://rog.asus.com/161872012/graphics-cards-2/graphics-go-nifty-on-route-660-asus-geforce-gtx-660-directcu-ii-top-unboxed/)
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: lightsol on April 23, 2013, 06:01:37 am
You will probably need to take off the front cooler as well, because that's where they sit ;)
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: solarbot on April 24, 2013, 08:05:20 pm
Morning,

My ASUS GTX 660 DirectCU II arrived just now so I took it straight apart :-) - Thanks lightsol, you are spot on by the looks of it, the resistors are on the GPU side :-)

Below are the photos with resistor values - wondering if anyone can confirm that I am looking in the right place, and perhaps which resistors I need to change/add/remove to make the most of the ASUS GTX 660 DirectCU II as a Quadro card - i.e. what is the most we can get out of this baby :-)

In case it helps the datasheet for the Serial Flash IC is here: http://www.mxic.com.tw/QuickPlace/hq/PageLibrary4825740B00298A3B.nsf/h_Index/3F21BAC2E121E17848257639003A3146/$File/MX25L2006E,%203V,%202Mb,%20v1.3.pdf (http://www.mxic.com.tw/QuickPlace/hq/PageLibrary4825740B00298A3B.nsf/h_Index/3F21BAC2E121E17848257639003A3146/$File/MX25L2006E,%203V,%202Mb,%20v1.3.pdf)

Might I also ask if anyone knows what size these resistors are.... 0603 or 0402 ?

Cheers.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: solarbot on April 25, 2013, 12:30:58 am
Hi,

I was wondering if the info was available to allow us less able folk to work the resistors out ourselves - it might be in this thread already or written up somewhere else but I haven't found it yet?

I guess with the info below we would be able to track down the fix on a card by card basis?

- Which resistors are connected to which pin on the Serial Flash chip - e.g. R1 connected to pin 1 on the Serial Flash chip.
- A list of resistor values that equate to device IDs - e.g. R1@10K, R2@25K & R3@1R = 11BA
- A list of Device IDs that reference card names - e.g. 11BA = Quadro K5000
- A list of GTX cards & what they metamorphose into - e.g. a GTX 680 into a Quadro K5000
(a rough skeleton for organizing such a reference is sketched at the end of this post)

The pinout for the MX25L2006E IC is below.

Any pointers appreciated :-)
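
As a very rough starting point, a sketch of how such a reference could be organized (Python, hypothetical structure; only the two device IDs confirmed earlier in this thread from the firmware edit are filled in, everything else is left as a placeholder to be collected card by card):

Code: [Select]
# mod_reference.py -- hypothetical skeleton for the card-by-card reference above.

DEVICE_IDS = {
    0x1189: "GeForce GTX 670",   # from the "8911" value quoted for the firmware edit
    0x11BA: "Quadro K5000",      # from the "BA11" value quoted for the firmware edit
    # ... further IDs to be collected and confirmed per card
}

# (donor card, target card, notes on the strap resistors)
CONFIRMED_MODS = [
    ("GeForce GTX 670", "Quadro K5000", "see the R1-R4 summary table earlier in the thread"),
]

def card_name(device_id):
    return DEVICE_IDS.get(device_id, f"unknown (0x{device_id:04X})")

print(card_name(0x11BA))   # -> Quadro K5000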
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: vacaloca on April 26, 2013, 08:08:03 am
This question is for those of you that have already modded to a Tesla K10, Quadro 4000 or Quadro 5000, or for that matter either GRID K1 or K2 variants:

Which (if any) settings are you able to change using nvidia-settings? See below:
http://microway.com/hpc-tech-tips/2011/12/nvidia-smi_control-your-gpus/ (http://microway.com/hpc-tech-tips/2011/12/nvidia-smi_control-your-gpus/)
Are you able to enable ECC memory settings, for example? Please try to change as many settings as you can test. :)

In Windows installations, nvidia-smi is (usually) present in %ProgramFiles%\NVIDIA Corporation\NVSMI for those that are looking for it.

There still seems to be debate as to how performance differs after a modification is done, and I'm surprised that hasn't been looked into -- virtualization is a great feature, but others are more interested in CAD performance (Quadro mods) or seeing if other nvidia-smi settings can be changed... like these:

Code: [Select]
    -e,   --ecc-config=         Toggle ECC support: 0/DISABLED, 1/ENABLED
    -p,   --reset-ecc-errors=   Reset ECC error counts: 0/VOLATILE, 1/AGGREGATE
    -c,   --compute-mode=       Set MODE for compute applications:
                                0/DEFAULT, 1/EXCLUSIVE_THREAD,
                                2/PROHIBITED, 3/EXCLUSIVE_PROCESS
    -dm,  --driver-model=       Enable or disable TCC mode: 0/WDDM, 1/TCC
    -fdm, --force-driver-model= Enable or disable TCC mode: 0/WDDM, 1/TCC
                                Ignores the error that display is connected.
          --gom=                Set GPU Operation Mode:
                                    0/ALL_ON, 1/COMPUTE, 2/LOW_DP
    -ac   --application-clocks= Specifies <memory,graphics> clocks as a
                                    pair (e.g. 2000,800) that defines GPU's
                                    speed in MHz while running applications on a GPU.
    -rac  --reset-application-clocks
                                Resets the application clocks to the default value.
    -pl   --power-limit=        Specifies maximum power management limit in watts.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: Kniteman77 on April 26, 2013, 05:13:59 pm
So upon reading through this thread I have a couple questions.

It appears that you CAN unlock a GTX 680 into a full K5000, with the same pipelines, added features and performance boost.

However, it appears you CANNOT unlock a GTX 670 into a K5000 with the pipelines and performance boost.

Is there any information to the contrary on that I'm missing?

I'm trying to build a video editing MacPro and I'm looking for the best single GPU card I can throw in.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on May 01, 2013, 05:31:25 am
Has anybody tried to modify a Titan into a Grid K2?
I'd like the performance of the extra shaders over the GTX680, but I specifically need it for Xen VGA passthrough. And I'm not sure what the driver will do when it sees a card that claims to be a Grid K2 with an extra thousand shaders.

Alternatively, has anyone modified a Gainward Phantom 4GB GTX680? Is the PCB a straight reference design?
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: allyman on May 01, 2013, 07:58:58 am
First of all, great work on discovering this.

Has anyone gotten NVIDIA Mosaic to work on the modified cards? I modified my EVGA GTX 670 to a Quadro K5000 successfully, however with Quadro driver rev 314.07 or 311.50 there is no Mosaic support in the NVIDIA control panel.  Do I have to enable it somehow?
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: moisyes on May 03, 2013, 01:41:31 am
Hi all,

I decided to have a go at finding the straps for GPU 1 on my card, with both success and failure as the result. I was able to locate them and modify the GTX690 to be a dual core Quadro K5000, but I made the stupid mistake of running it without a heatsink on the bridge chip in the middle of the two while testing. The chip quickly died from overheating when I got excited and let Linux boot into the graphical environment, and there goes my $1000 video card for the greater good, and as such donations are now more important than ever to replace this card.

I am now running on a semi faulty GT220 (random lockups) and an AMD Radeon X300 to get my triple head working, but as you can imagine this is a very buggy configuration.

Thank you very much for your work, gnif. I work at a desktop with a GTX 690, using Blender for architectural rendering. After finding these posts I advised him and he purchased a Zotac GTX 680, and I was able (thank God) to mod it to a Quadro K5000. I'm testing it and will take on the GTX 690 for modding too. Will I need any extra heatsink to prevent what happened to your hero card? In your opinion, would it be better for my use to mod it to a dual Quadro K5000 or to a K10? Thanks very much in advance, gnif.

(https://www.eevblog.com/forum/projects/hacking-nvidia-cards-into-their-professional-counterparts/?action=dlattach;attach=42485)

Also that SOIC that sits near the straps I believe is the EEPROM.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: amigo on May 03, 2013, 08:03:36 am
You do not need an extra heat sink, just make sure you put your original one back in place.

The reason gnif's 690 went up in smoke is because in haste he did not put the heat sink back on, so learn from his painful example.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gamezr2ez on May 05, 2013, 08:44:35 pm

Might I also ask if anyone knows what size these resistors are.... 0603 or 0402 ?

Cheers.

Judging from the pictures, these have got to be 0402. I am planning on modding my GTX 680 when I get a chance.

I would like to know this ahead of time if someone knows the answer.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: justanothercanuck on May 06, 2013, 02:18:11 pm
Wow, I didn't know people were still doing this...  I guess I'm a little late here, I should really browse the forums more.  :P

I remember people used to do this so they could run 3D animation and video editing software that wouldn't run on the desktop cards...  I'm surprised they are skimping on the Linux drivers though...  Kind of sad really, as they're all I would recommend for Linux systems, since the ATI drivers were an absolute hellhole for the past 10 years.  What can you do I suppose...  :-//
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gamezr2ez on May 06, 2013, 07:24:00 pm
I remember people used to do this so they could run 3D animation and video editing software that wouldn't run on the desktop cards...  I'm surprised they are skimping on the Linux drivers though...  Kind of sad really, as they're all I would recommend for Linux systems, since the ATI drivers were an absolute hellhole for the past 10 years. What can you do I suppose...  :-//

I would have to agree with you there. They were bad, but slowly getting better, though they still have some major problems.

One area where AMD/ATI shines is virtualization. I can pass a 7970 through to a Xen guest with relative ease and get native performance within the VM, useful for gaming no doubt. I believe AMD even worked to help build the code that Xen uses for the gpu passthrough.

In any case, I do not recommend nVidia on linux anymore. I did buy a gtx680 just to do this though.... mmmm FLReset.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on May 08, 2013, 04:56:38 am

One area where AMD/ATI shines is virtualization. I can pass a 7970 through to a Xen guest with relative ease and get native performance within the VM, useful for gaming no doubt. I believe AMD even worked to help build the code that Xen uses for the gpu passthrough.

In any case, I do not recommend nVidia on linux anymore. I did buy a gtx680 just to do this though.... mmmm FLReset.

Both Nvidia and ATI have issues once you start straying from the basics. All you can do most of the time is prioritize what you need to not be broken and pick the card which works for that specific environment. Just try getting things working reliably in both Linux and Windows with something like an IBM T221 and you'll find there are a number of pitfalls if you haven't done your research properly to begin with. Throw Xen virtualization into the mix and the number of complexities multiplies.

Regarding Xen virtualization, I haven't tried Nvidia yet (my Quadro 2000 for testing is in the post), but I sincerely hope the experience is less appalling than with the ATI. Granted, ATI cards almost work whereas desktop Nvidia cards don't work at all with VGA passthrough without a whole raft of extra Xen patches, but the experience is poor at best. The longest I've gotten out of my 6450 card I've been testing with is about 10 minutes before something odd happens and the driver decided to reset the card - at which point the VM crashes, and the only way you'll get it to boot again without BSOD-ing before the login screen is by rebooting the host. If you have more than one CPU you'll only get to the guest's login screen if you disable IRQ balancing. All in all, good enough for a demo, but absolutely not good enough for anything meaningful.

My plan is to test whether using a Quadro 2000 (whose driver officially supports VGA passthrough) makes for a workable experience before I spend 4x as much on a GTX 680 to modify into a Quadro K5000 or a Grid K2. Ideally I'd rather like to get a Titan and see what happens if I mod its device ID to read as a K5000, but as far as I can tell nobody has ever reported trying it, and I'd hate to end up with a Titan that I cannot use for its intended purpose.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gamezr2ez on May 08, 2013, 08:34:20 am
Regarding Xen virtualization, I haven't tried Nvidia yet (my Quadro 2000 for testing is in the post), but I sincerely hope the experience is less appalling than with the ATI. Granted, ATI cards almost work whereas desktop Nvidia cards don't work at all with VGA passthrough without a whole raft of extra Xen patches, but the experience is poor at best. All in all, good enough for a demo, but absolutely not good enough for anything meaningful.

The patches are only 5 files, about 100 lines of code in total. They are just there to read the BIOS from an extracted file rather than from the card at runtime, as well as a few other things that Xen can't pull dynamically, unlike with AMD cards. It is fairly basic code, nothing fancy.

As far as only "good enough for a demo", I will have to disagree. You may have just had a poor experience and been unfortunate enough to have an uncooperative motherboard and graphics card. I can attest to the fact that the passthrough is fairly stable once it is set up properly (that's the hard part). I had it running for 2 weeks as a gaming VM and it never had a hiccup with an older 5670 of mine. It was impressive! That said, I wouldn't put this into a production environment without a lot more testing.

My plan is to test whether using a Quadro 2000 (whose driver officially supports VGA passthrough) makes for a workable experience before I spend 4x as much on a GTX 680 to modify into a Quadro K5000 or a Grid K2. Ideally I'd rather like to get a Titan and see what happens if I mod its device ID to read as a K5000, but as far as I can tell nobody has ever reported trying it, and I'd hate to end up with a Titan that I cannot use for its intended purpose.

Keep in mind that the Quadro series does not support FLReset. It is probably not a good idea to use that for passthrough if you plan to start and stop the VM. It will work just fine, but Xen/linux won't be able to reset the card upon VM reboot. If you have to reboot the VM you'll still need to reboot the entire machine. You may have crashes or performance degradation otherwise.

The above also applies to the K5000 if you plan to modify a Titan, there will be no FLReset.

Good luck!
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on May 08, 2013, 04:46:03 pm
Regarding Xen virtualization, I haven't tried Nvidia yet (my Quadro 2000 for testing is in the post), but I sincerely hope the experience is less appalling than with the ATI. Granted, ATI cards almost work whereas desktop Nvidia cards don't work at all with VGA passthrough without a whole raft of extra Xen patches, but the experience is poor at best. All in all, good enough for a demo, but absolutely not good enough for anything meaningful.

The patches are only 5 files, about 100 lines of code in total. They are just there to read the BIOS from an extracted file rather than from the card at runtime, as well as a few other things that Xen can't pull dynamically, unlike with AMD cards. It is fairly basic code, nothing fancy.

I'm not saying it's a big deal, but it's unstable and not merged into the mainline last time I checked, which means that if you care about such niceties as package management, especially on a stable (e.g. EL6) rather than unstable bleeding edge (e.g. Fedora) distribution, you have to hunt around for the specific versions that the patches are against or versions that the patches apply against cleanly (and work). And considering that some of these things don't work too well even when extensively tested, I don't fancy my chances much with something that has been only tested by a handful of individuals.

As far as only "good enough for a demo", I will have to disagree. You may have just had a poor experience and been unfortunate enough to have an uncooperative motherboard and graphics card. I can attest to the fact that the passthrough is fairly stable once it is set up properly (that's the hard part). I had it running for 2 weeks as a gaming VM and it never had a hiccup with an older 5670 of mine. It was impressive! That said, I wouldn't put this into a production environment without a lot more testing.

It's not just about the motherboard or the card being uncooperative. Sure, having to disable irq balancing took a while to think of to try (google didn't find much obvious mention of it, and neither did anyone mention it on the threads I posted on xen-users). But if things work for 10 minutes, they should stay working and not randomly crash after that. And since the BSODs always refer to a device reset failure/timeout, it looks like the problem is mainly related to the driver not knowing how to reset the card properly. (Also see below re: FLreset.)

My plan is to test whether using a Quadro 2000 (whose driver officially supports VGA passthrough) makes for a workable experience before I spend 4x as much on a GTX 680 to modify into a Quadro K5000 or a Grid K2. Ideally I'd rather like to get a Titan and see what happens if I mod its device ID to read as a K5000, but as far as I can tell nobody has ever reported trying it, and I'd hate to end up with a Titan that I cannot use for its intended purpose.

Keep in mind that the Quadro series does not support FLReset. It is probably not a good idea to use that for passthrough if you plan to start and stop the VM. It will work just fine, but Xen/linux won't be able to reset the card upon VM reboot. If you have to reboot the VM you'll still need to reboot the entire machine. You may have crashes or performance degradation otherwise.

The above also applies to the K5000 if you plan to modify a Titan, there will be no FLReset.

I know that they, too, lack FLreset, but I am not all that convinced that FLreset is really necessary. Sure, it makes it a little easier for the driver to do its job, but think about this at a low level, like an embedded engineer, for a moment. At the lowest level it comes down to setting registers on the device. Unless the card is poorly engineered and buggy (e.g. it drops off the bus in a questionable, un-re-attachable and uncontactable state), the driver should always be able to set the registers to whatever they need to be to get the card to a known, initialized state, without even any help from the card's BIOS. FLreset is a nicety that means your driver doesn't have to handle the initialization of the hardware itself, but it doesn't strike me at all as a necessity to get something like this working properly.

I'll know one way or the other soon enough.

Edit: Having tried a Quadro 2000, the experience thus far is that it is even more unstable than using an ATI card. Most disappointing. I guess I won't be wasting my time modifying a GTX into a Quadro.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gamezr2ez on May 10, 2013, 02:25:46 pm
I know that they, too, lack FLreset, but I am not all that convinced that FLreset is really necessary. Sure, it makes it a little easier for the driver to do its job, but think about this at a low level, like an embedded engineer, for a moment. At the lowest level it comes down to setting registers on the device. Unless the card is poorly engineered and buggy (e.g. it drops off the bus in a questionable, un-re-attachable and uncontactable state), the driver should always be able to set the registers to whatever they need to be to get the card to a known, initialized state, without even any help from the card's BIOS. FLreset is a nicety that means your driver doesn't have to handle the initialization of the hardware itself, but it doesn't strike me at all as a necessity to get something like this working properly.

I'll know one way or the other soon enough.

Edit: Having tried a Quadro 2000, the experience thus far is that it is even more unstable than using an ATI card. Most disappointing. I guess I won't be wasting my time modifying a GTX into a Quadro.


Today I have found the time to test and successfully modify my GTX 680.

I tested a K5000 and Grid K2. To my surprise neither of them supported FLReset! At least not according to `lspci`. This means I cannot issue a reset to the card through the kernel. Despite that, it worked fine. It would appear you are correct about FLReset not being necessary.
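
For reference, a small sketch (Python, illustrative; it assumes a Linux host with lspci on the PATH, enough privileges to read the capability list, and a placeholder bus address) showing one way to check the FLReset flag with lspci:

Code: [Select]
# check_flr.py -- illustrative: does a PCI device advertise Function Level Reset?
# In "lspci -vv" the DevCap line shows "FLReset+" when FLR is supported, "FLReset-" otherwise.
import subprocess

def supports_flr(bdf="01:00.0"):   # bus:device.function of the GPU -- placeholder
    out = subprocess.run(["lspci", "-vv", "-s", bdf],
                         capture_output=True, text=True, check=True).stdout
    return "FLReset+" in out

if __name__ == "__main__":
    print("FLR supported" if supports_flr() else "FLR not supported")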

The Grid K2 was the first I tested, but both cards behaved identically in my testing, with one small difference after rebooting the VM.

I compiled xen-4.2.1 (stable) without any patches. I did a standard VGA passthrough. With this method the real video card is actually a secondary video card in the system. I installed the appropriate drivers and rebooted the VM. Upon reboot everything worked as desired! It was quite easy compared to the previous mess I had been through trying to get this to work.

The "K5000" card had performance issues upon VM reboot, but it did not blue screen. I ejected the K5000 using Windows "Safely Remove Device" feature. After it came back up the performance degradation was gone and it performed perfect as far as I could test it.

Thanks go to verybigbadboy and gnif!

The Xen portion of this is a bit offtopic, so if anyone has any questions on that feel free to send me a PM.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: Kniteman77 on May 12, 2013, 07:26:53 pm
I'm looking at getting this (http://www.newegg.com/Product/Product.aspx?Item=N82E16814130799) card.

It seems like most of the 680s people are modding are the 2GB version; how can I tell whether I'll be able to mod this 4GB version?

I'm still working my way through the thread, it's quite long. :-/
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: vacaloca on May 13, 2013, 10:41:16 am
I posted this a while back but no one addressed it:
https://www.eevblog.com/forum/projects/hacking-nvidia-cards-into-their-professional-counterparts/msg223546/#msg223546 (https://www.eevblog.com/forum/projects/hacking-nvidia-cards-into-their-professional-counterparts/msg223546/#msg223546)

The TL;DR version is: does modifying any particular card into a Tesla K10, Quadro 4000 or Quadro 5000, or for that matter either of the GRID K1 or K2 variants, enable nvidia-smi support for changing the settings listed in the link above, e.g. ECC/TCC support, application clocks, power limit?

Note that not all the settings may work with a particular (transformed) card. If anyone could try to modify each of the settings for their modified card I would very much appreciate it!
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gamezr2ez on May 13, 2013, 01:47:33 pm
I posted this a while back but no one addressed it:
https://www.eevblog.com/forum/projects/hacking-nvidia-cards-into-their-professional-counterparts/msg223546/#msg223546 (https://www.eevblog.com/forum/projects/hacking-nvidia-cards-into-their-professional-counterparts/msg223546/#msg223546)

The TL;DR version is: does modifying any particular card into a Tesla K10, Quadro 4000 or Quadro 5000, or for that matter either of the GRID K1 or K2 variants, enable nvidia-smi support for changing the settings listed in the link above, e.g. ECC/TCC support, application clocks, power limit?

Note that not all the settings may work with a particular (transformed) card. If anyone could try to modify each of the settings for their modified card I would very much appreciate it!

I have a GTX 680 modified to a Grid K2 passed through to a Windows 7 x64 xen VM. I am running the nVidia quadro/tesla/grid drivers version 320.00

Here is a pastebin of my nvidia-smi output (http://pastebin.com/8tj3M6wi)

ECC would have to be supported by the RAM, which they wouldn't install in a consumer grade card. The power features and other things I would _assume_ are also added hardware bits that physically don't exist on the card.

I am no expert on this subject or with nvidia-smi, though. If you would like me to try something else, I will. I may have just not used the correct commands.

After reading this entire thread, can I conclude the following?
- a GTX 680 can be fairly easily modded into a K5000 / K10 / GRID K2 by changing some ID resistors
- this results in additional features (like GPU passthrough for VMs and Mosaic support)
- but no performance gain for pro apps (SPECviewperf 11)

Or has anyone (Gnif, VeryBigBadBoy, ReefJunkie, etc.) discovered
how to actually boost the OpenGL performance of a GTX680 ?
For many self-employed pro-users like me that would be truly awesome!

The performance boost you are asking about would have to come from a disabled feature or something to that effect. As far as I am aware, there are no disabled features related to performance. This mod just removes the arbitrary limitations placed on NVIDIA cards that have no purpose other than to get the user to upgrade to a professional card to gain access to those features.

Also, in the case of the Grid K2, the real card supports VGX. I have not gotten that to work with this card, sadly. I haven't spent too much time with it yet. I just want to see it in action and play with it a little.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: vacaloca on May 14, 2013, 12:30:24 am
I have a GTX 680 modified to a Grid K2 passed through to a Windows 7 x64 xen VM. I am running the nVidia quadro/tesla/grid drivers version 320.00

Here is a pastebin of my nvidia-smi output (http://pastebin.com/8tj3M6wi)

ECC would have to be supported by the RAM, which they wouldn't install in a consumer grade card. The power features and other things I would _assume_ are also added hardware bits that physically don't exist on the card.

I am no expert on this subject or with nvidia-smi, though. If you would like me to try something else, I will. I may have just not used the correct commands.
Thanks for the output. I'll give a bit of background. Under Linux, there are some card settings that cannot be read, and to my knowledge there isn't any sort of application that provides the equivalent control/monitoring of the Windows NVIDIA Inspector and EVGA Precision X/MSI Afterburner applications. At least with nvidia-smi support, monitoring under Linux would be possible if the settings are reported, however at least for your converted Grid K2, it doesn't seem like anything else is reported, boo. (i.e. no data for GPU Utilization)

The other major thing for me would be TCC support -- it turns off the card outputs and doesn't have the overhead of driving a display when you're running computational CUDA codes under Windows, for example.

Can you try these? If there was no change, nvidia-smi should say 'not supported'. If nvidia-smi acknowledges that it's able to change any of the settings, that would be great. :)

Code: [Select]
nvidia-smi -e 1
nvidia-smi -dm 1
nvidia-smi -fdm 1
nvidia-smi --gom=0
nvidia-smi -ac 2000,800
nvidia-smi -pl 250

Also, if anyone else can try these with a converted Quadro or Tesla to confirm those cards behave the same way, that'd be awesome too. (nvidia-smi doesn't explicitly state full support for GRID cards, just Tesla/Quadro)
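
For anyone willing to test, a small sketch (Python, illustrative; it assumes nvidia-smi is on the PATH, or in %ProgramFiles%\NVIDIA Corporation\NVSMI under Windows as noted above) that runs the commands listed above in one pass and flags which settings come back as unsupported:

Code: [Select]
# smi_probe.py -- illustrative helper that runs the nvidia-smi checks listed above.
import subprocess

COMMANDS = [
    ["nvidia-smi", "-e", "1"],
    ["nvidia-smi", "-dm", "1"],
    ["nvidia-smi", "-fdm", "1"],
    ["nvidia-smi", "--gom=0"],
    ["nvidia-smi", "-ac", "2000,800"],
    ["nvidia-smi", "-pl", "250"],
]

for cmd in COMMANDS:
    result = subprocess.run(cmd, capture_output=True, text=True)
    output = (result.stdout + result.stderr).strip()
    first_line = output.splitlines()[0] if output else "(no output)"
    flag = "NOT SUPPORTED" if "not supported" in output.lower() else "see output"
    print(f"{' '.join(cmd):32s} {flag:14s} {first_line}")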
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gamezr2ez on May 14, 2013, 01:59:07 am
Also, if anyone else can try these with a converted Quadro or Tesla to confirm those cards behave the same way, that'd be awesome too. (nvidia-smi doesn't explicitly state full support for GRID cards, just Tesla/Quadro)

Quote
Supported products:
- Full Support
    - NVIDIA Tesla Line:
            S2050, C2050, C2070, C2075,
            M2050, M2070, M2075, M2090,
            X2070, X2090,
            K10, K20, K20X, K20Xm, K20c, K20m, K20s
    - NVIDIA Quadro Line:
            410, 600, 2000, 4000, 5000, 6000, 7000, M2070-Q
            K2000, K2000D, K4000, K5000, K6000
    - NVIDIA GRID Line:
            K1, K2, K340, K520

It does say it fully supports GRID K2. I can mod it over to a Tesla and check the difference (if any). I am not at home at the moment, but I will run your other commands to test it out.

My use of features beyond gaming is just some oclHashcat and some experimental x264 GPU stuff. Also some very rare 3D modeling stuff, not enough to care about performance. But if I am reading your post correctly, this could give added performance in those aspects, yes?

EDIT:
Code: [Select]
nvidia-smi -e 1
nvidia-smi -dm 1
nvidia-smi -fdm 1
nvidia-smi --gom=0
nvidia-smi -ac 2000,800
nvidia-smi -pl 250

All features came back as "not supported for GPU"
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: vacaloca on May 14, 2013, 03:46:40 am
It does say it fully supports GRID K2. I can mod it over to a Tesla and check the difference (if any). I am not at home at the moment, but I will run your other commands to test it out.

My use of features beyond gaming is just some oclHashcat and some experimental x264 GPU stuff. Also some very rare 3D modeling stuff, not enough to care about performance. But if I am reading your post correctly, this could give added performance in those aspects, yes?

Well, TCC mode is a Windows-exclusive setting. It disables the display outputs, so it would only be useful for running applications that don't depend on a display -- running CUDA code that just crunches numbers is one of those situations. Here is a bit more detail about it: http.developer.nvidia.com/ParallelNsight/2.1/Documentation/UserGuide/HTML/Content/Tesla_Compute_Cluster.htm (http://http.developer.nvidia.com/ParallelNsight/2.1/Documentation/UserGuide/HTML/Content/Tesla_Compute_Cluster.htm)

I guess nvidia-smi executables have different verbiage between versions or perhaps even architectures. Under Win 7 x64, 314.14 driver, nvidia-smi 4.314.14 version reports:

Code: [Select]
Supported products:
- Full Support
    - NVIDIA Tesla Line:
            S2050, C2050, C2070, C2075,
            M2050, M2070, M2075, M2090,
            X2070, X2090,
            K10, K20, K20X
    - NVIDIA Quadro Line:
            4000, 5000, 6000, 7000, M2070-Q, 600, 2000, 3000M and 410
    - NVIDIA GeForce Line:            None

which is why I thought GRID K2 wasn't supported fully. Thanks for testing, look forward to seeing if by any chance any of the mode changes work. :)
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: snakema on May 15, 2013, 08:28:23 pm
Hello,
after a Google search I found this forum, and I appreciated this thread.
Could anyone help me find the right card for the mod?
Thank you
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: lightsol on May 16, 2013, 03:39:36 am
Hello,
after a Google search I found this forum, and I appreciated this thread.
Could anyone help me find the right card for the mod?
Thank you

Grab the 680, that card has been modded the most so far :).
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: snakema on May 16, 2013, 06:24:56 pm
Hello,
after a Google search I found this forum, and I appreciated this thread.
Could anyone help me find the right card for the mod?
Thank you

Grab the 680, that card has been modded the most so far :).



Will it be a real Quadro K5000?
In performance, quality...?
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: amigo on May 17, 2013, 09:35:23 am
Will it be a real Quadro K5000?
In performance, quality...?

It will not be a real K5000 in performance and quality, but it should allow certain options not available on the gaming (GTX) series of cards, useful for virtualization etc.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: snakema on May 17, 2013, 05:59:17 pm
Will it be a real Quadro K5000?
In performance, quality...?

It will not be a real K5000 in performance and quality, but it should allow certain options not available on the gaming (GTX) series of cards, useful for virtualization etc.


Can you tell me which options are unlocked?
I want to know whether it would interest me or not.
Thanks
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on May 18, 2013, 03:59:30 am
Hello,
after a Google search I found this forum, and I appreciated this thread.
Could anyone help me find the right card for the mod?
Thank you

Grab the 680, that card has been modded the most so far :).



Will it be a real Quadro K5000?
In performance, quality...?

As far as I am aware, the only feature you won't get is the ECC memory. Everything else will be the same, except maybe the clock speeds. Quadro cards tend to be clocked a little more conservatively. GeForce cards tend to be pre-overclocked right to their thermal and stability limits, and occasionally beyond.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: eos on May 18, 2013, 04:40:48 am
As far as I am aware, the only feature you won't get is the ECC memory...
Nobody (so far) has demonstrated that a modded 670/680 card can score the same as Quadro/Tesla/Grid in specviewperf11.
And that is the single most important reason why the pro cards are 3-4 times the $$$ of the GTXs...
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: amigo on May 18, 2013, 11:13:12 am
Nobody (so far) has demonstrated that a modded 670/680 card can score the same as Quadro/Tesla/Grid in specviewperf11.
And that is the single most important reason why the pro cards are 3-4 times the $$$ of the GTXs...

You are absolutely right, and nobody will demonstrate it, because the cards will not score the same.

Many people fail to realize that Nvidia is also reading the Web and following what is going on with all these mods. They did not sit idly by during the 4xx to 5xx transition, where the soft-switch mod stopped working. And with the 6xx series they have most likely added more roadblocks to prevent any entrepreneurial people from causing them any further revenue loss.

Doing this for them is very easy, after all they are the ones who engineered the chips and so they have all the information needed. We are for the most part tapping in the dark, finding things out through trial and error.

I'd love to be proven wrong but as it stands, the facts point otherwise.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: seneelya on May 18, 2013, 08:08:49 pm
So one roadblock has been successfully passed - the hardware straps and driver installation. I think one more is left. It must be somewhere in the BIOS, perhaps the OpenGL features report. Maybe we need to compare it between an original Quadro and a modded card? Is there some difference present or not?
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: seneelya on May 18, 2013, 08:25:44 pm
Here is my AIDA64 report for a Quadro 4000.

Maybe it can somehow help if we compare features.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gamezr2ez on May 18, 2013, 08:27:50 pm
As far as I am aware, the only feature you won't get is the ECC memory. Everything else will be the same, except maybe the clock speeds. Quadro cards tend to be clocked a little more conservatively. GeForce cards tend to be pre-overclocked right to their thermal and stability limits, and occasionally beyond.

In fact, nothing changes. You get no performance benefit. You get no additional features save the added features provided by the driver itself. According to lspci no additional features were reported as available. According to nvidia-smi no additional features could be accessed.

The card is not faster or slower than the card it originally was.

The only reason to mod this is if you need to use the professional drivers for some reason.

This has all been covered multiple times (including the original post...).

Many people fail to realize that Nvidia is also reading the Web and following what is going on with all these mods. They did not sit idly by during the 4xx to 5xx transition, where the soft-switch mod stopped working. And with the 6xx series they have most likely added more roadblocks to prevent any entrepreneurial people from causing them any further revenue loss.

Doing this for them is very easy, after all they are the ones who engineered the chips and so they have all the information needed. We are for the most part tapping in the dark, finding things out through trial and error.

I do not have a Tesla K10 or a Grid K2 to compare, but in this case I do believe physical hardware bits are missing that would enable some of these features. I would love to have one to examine for a day or two just to look it over.

Then again, these could all just be features locked by the BIOS. I am not about to go flash a different BIOS just yet.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: eos on May 19, 2013, 12:38:52 am
You are absolutely right, and nobody will demonstrate it, because the cards will not score the same.
Oka-a-ay...
I'd love to be proven wrong...
So, what is it that you think? It can be done or not?

You can't have it both ways...
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: amigo on May 19, 2013, 01:24:24 am
Oka-a-ay...
???

So, what is it that you think? It can be done or not?

You can't have it both ways...
In principle, anything can be done, and I do have some ideas to explore; I am just waiting for a GTX card to arrive in the mail.

I opted to get a GTX 5xx because the interwebs say the 6xx series appears to be optimized for gaming rather than business applications, and is sub-par in performance compared to the 5xx.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on May 20, 2013, 05:38:51 pm
As far as I am aware, the only feature you won't get is the ECC memory...
Nobody (so far) has demonstrated that a modded 670/680 card can score the same as Quadro/Tesla/Grid in specviewperf11.
And that is the single most important reason why the pro cards are 3-4 times the $$$ of the GTXs...

Two points:
1) What software do you use that is particularly well approximated with specviewperf11?
2) How much of a difference are we talking about? 300% difference, roughly equivalent to the difference in the price tag? I doubt it.

I can understand that ECC might be required for some application where reliability and correctness of calculations is paramount (science/research headless number crunching), but in the vast majority of cases it isn't particularly useful. In virtually all cases, people who are interested in a cheap alternative to Quadro cards only want compatibility with their software (e.g. for a driver that handles VGA passthrough without requiring additional hypervisor patches). A few % of performance difference between a GeForce and an equivalent Quadro will make no difference to the vast majority of people. If anything, for most people the GeForce will be faster because it is clocked slightly higher.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on May 20, 2013, 05:42:07 pm
Oka-a-ay...
???

So, what is it that you think? It can be done or not?

You can't have it both ways...
In principle, anything can be done and I do have some ideas to explore, just waiting for a GTX card to arrive in the mail.

I opted to get a GTX 5xx because interwebs say 6xx series appears to be optimized towards gaming and not business applications, and is sub-par in performance to 5xx.

How exactly do you figure the use case in which 1536 shaders is not at least as good as 512 shaders? I'm pretty sure a GTX680 will outperform a GTX580 in every way possible.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: amigo on May 20, 2013, 11:54:32 pm
How exactly do you figure the use case in which 1536 shaders is not at least as good as 512 shaders? I'm pretty sure a GTX680 will outperform a GTX580 in every way possible.

Interwebs is your friend...why don't you do a search and find out. Bigger is not always better. ;)
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on May 21, 2013, 04:42:03 am
How exactly do you figure the use case in which 1536 shaders is not at least as good as 512 shaders? I'm pretty sure a GTX680 will outperform a GTX580 in every way possible.

Interwebs is your friend...why don't you do a search and find out. Bigger is not always better. ;)

And I'm asking you to cite a well informed source with scrutinizable empirical evidence. Surely you aren't about to claim that "it must be true because I read it on the internet".
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: lightsol on May 21, 2013, 05:14:03 am
How exactly do you figure the use case in which 1536 shaders is not at least as good as 512 shaders? I'm pretty sure a GTX680 will outperform a GTX580 in every way possible.

Interwebs is your friend...why don't you do a search and find out. Bigger is not always better. ;)

And I'm asking you to cite a well informed source with scrutinizable empirical evidence. Surely you aren't about to claim that "it must be true because I read it on the internet".

http://www.anandtech.com/show/5699/nvidia-geforce-gtx-680-review/17 (http://www.anandtech.com/show/5699/nvidia-geforce-gtx-680-review/17)

or if you dont believe that, then just google some more benches and tests.
Anyways, don't get offtopic please :)
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: amigo on May 21, 2013, 06:09:04 am
http://www.anandtech.com/show/5699/nvidia-geforce-gtx-680-review/17 (http://www.anandtech.com/show/5699/nvidia-geforce-gtx-680-review/17)

or if you dont believe that, then just google some more benches and tests.
Anyways, don't get offtopic please :)

I was going to suggest he does his own homework, as it's not right for any of us to do it for him. :)

There are many benchmarks in which 680 appears faster, but in actuality once you move into a real world and forget synthetic tests, the picture is totally different.

For example Adobe CS applications, Apple Motion and Final Cut Pro, DaVinci Resolve, etc. Then in the 3D land, most content creation packages perform better with older generations of NVidia chips such as 5xx, 4xx etc. They are closer to their equivalent Quadro chips and could potentially be modded to Quadro as well.

With 6xx, NVidia has clearly decided to separate gaming from business applications. They really want you to spend the big bucks on their pro-line (Quadro) for anything "serious" and so modding the 6xx will only take you so far. As I said before, they are not sitting idly either, waiting for modders to make their moves.

What's mind-boggling is that it appears both the 6xx and Quadro Kxxx chips come from the same factory line and the former are just crippled versions of the latter, or the ones that don't pass QA testing for the pro line. But we've already discussed that here, in the earlier posts...
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: SeSl on May 21, 2013, 02:06:14 pm
Good day to all!


What's mind boggling is that it appears both 6xx and Quadro Kxxx chips come from the same factory line and the former are just crippled versions of the later; or those that don't pass QA testing for pro-line. But, we've already discussed that here, in the earlier posts...

It's sad, but it seems to be true. Maybe they do chip selection. It is also possible that ECC is a marker of validity for the pro series.

Anyway, my EVGA 04G-P4-3687-KR GeForce GTX 680 4GB is on its way for the mod. I paid for it on Amazon a month ago; parcels to Russia take a long time. If nothing comes of it, it will be a waste of money and time. I do not need a gaming card as a gift, and I will have to pay for the K5000 anyway. The hope was that a collective brainstorm would give a positive result. Hope dies last.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: seneelya on May 21, 2013, 08:45:54 pm
Is anybody reading me?

Can we compare the feature reports of a real Quadro and a fake one?
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gamezr2ez on May 21, 2013, 11:24:49 pm
Small update:

I know most of you are trying to get professional cards out of consumer ones; I just want a card for passthrough so I can game "on Linux". In every test I have run, this GTX680 modded to a Grid K2 runs exactly the same as the GTX680, except for PhysX. PhysX doesn't work with the "professional" drivers, so I forced the consumer drivers to install. They work and run fine, and now PhysX will say it is enabled, but the card will not do any of the work. It offloads everything to the CPU while still reporting to the program that it is working. It's a bit strange and annoying. Going to mod it back to confirm my results, and then try a Tesla mod just in case that will support it (probably not).

Nvidia, if you are reading this, I just want virtualization FOR PLAYING GAMES. Seriously, AMD actually worked with the community on this one, why can't you just enable it? It clearly works fine.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: eos on May 22, 2013, 11:42:17 am
Two points:
1) What software do you use that is particularly well approximated with specviewperf11?
2) How much of a difference are we talking about? 300% difference, roughly equivalent to the difference in the price tag? I doubt it.
See this post
https://www.eevblog.com/forum/projects/hacking-nvidia-cards-into-their-professional-counterparts/msg210155/#msg210155 (https://www.eevblog.com/forum/projects/hacking-nvidia-cards-into-their-professional-counterparts/msg210155/#msg210155)
and the xbitlab link within
http://www.xbitlabs.com/articles/graphics/display/nvidia-quadro-k5000_4.html (http://www.xbitlabs.com/articles/graphics/display/nvidia-quadro-k5000_4.html)

The latter lists programs (software) and what specviewperf11 expects from pro cards vs. gaming GTXs.

Those should answer both your questions...
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: verybigbadboy on May 22, 2013, 07:43:58 pm
Hello all again :)
I have a new Palit GTX 650 2GB card and have successfully modified it into a Grid K1.
It looks like Palit uses a reference board.
The modification steps are the same as for the GT 640 (https://www.eevblog.com/forum/projects/hacking-nvidia-cards-into-their-professional-counterparts/msg213332/#msg213332). A BIOS modification is needed too.

A picture with the resistor locations is attached.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: vsgan on May 23, 2013, 03:20:17 am
Does the card actually work as NVIDIA GRID VGX after modification? e.g. vmware vsphere vgsa?
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: vsgan on May 23, 2013, 03:31:43 am
Small update:

I know most of you are trying to get professional cards out of consumer, I just want a card for passthrough so I could game "on linux". In every test I have run, this GTX680 modded to a Grid K2 runs exactly the same as the GTX680 except with Physx. Physx doesn't work with the "professional" drivers so I forced the consumer drivers to install. They work and run fine and now Physx will say it is enabled, but the card will not do any of the work. It offloads everything to the CPU while still reporting it is working to the program. Its a bit strange and annoying. Going to mod it back to confirm my results and trying a Telsa mod just in case that will support it (probably not).

Nvidia, if you are reading this, I just want virtualization FOR PLAYING GAMES. Seriously, AMD actually worked with the community on this one, why can't you just enable it? It clearly works fine.

How did you force the driver to install?
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on May 23, 2013, 04:56:12 am
Small update:

I know most of you are trying to get professional cards out of consumer, I just want a card for passthrough so I could game "on linux". In every test I have run, this GTX680 modded to a Grid K2 runs exactly the same as the GTX680 except with Physx. Physx doesn't work with the "professional" drivers so I forced the consumer drivers to install. They work and run fine and now Physx will say it is enabled, but the card will not do any of the work. It offloads everything to the CPU while still reporting it is working to the program. Its a bit strange and annoying. Going to mod it back to confirm my results and trying a Telsa mod just in case that will support it (probably not).

Nvidia, if you are reading this, I just want virtualization FOR PLAYING GAMES. Seriously, AMD actually worked with the community on this one, why can't you just enable it? It clearly works fine.

+1

I actually had more success with PhysX. My Win7 64-bit VM with a Quadro 2000 worked fine with PhysX once I installed the PhysX package in addition to the Quadro drivers. I haven't checked whether it was offloading onto the CPU or not, though, but games and GPU-Z all reported PhysX capability.

On XP64, however, that didn't work - no PhysX on the Quadro 2000, with or without the PhysX software installed.

In the end, I gave up on Nvidia and decided to save myself money, modding effort and time and just got an ATI card instead, because I, too, only wanted a decent GPU to game with in a VM without having to dual boot machine. At least until Steam has a better selection of Linux capable games and get their client software working without requiring bleeding edge glibc.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gamezr2ez on May 23, 2013, 05:42:33 am
Does the card actually work as NVIDIA GRID VGX after modification? e.g. vmware vsphere vgsa?

I tried a GTX 680 > Grid K2 and it did not work. I was sad.

How did you force the driver to install?

The Quadro K5000 is supported by the "consumer" drivers, but it will lock up on VM boot with those drivers.

The Grid K2 is not supported, but since my card is for all intents and purposes a GTX 680, the drivers should work fine. I just did some .inf edits to add the new strings for the card. The driver will not be a signed driver for that PCI ID, though. That doesn't affect performance, but it will throw warnings.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gamezr2ez on May 23, 2013, 05:50:27 am
I actually had more success with PhysX. My Win7 64-bit VM with a Quadro 2000 worked fine with PhysX once I installed the PhysX package in addition to the Quadro drivers. I haven't checked whether it was offloading onto the CPU or not, though, but games and GPU-Z all reported PhysX capability.

I may retract my CPU offloading comment later. After modding to a K5000 again, the PhysX score is the same. Both the Grid and the Quadro report supporting PhysX, but they both only give the same performance as my aging CPU.

Setting up a new windows install to test the card types to verify everything.

In the end, I gave up on Nvidia and decided to save myself money, modding effort and time and just got an ATI card instead, because I, too, only wanted a decent GPU to game with in a VM without having to dual boot machine. At least until Steam has a better selection of Linux capable games and get their client software working without requiring bleeding edge glibc.

I use a chroot for Steam on Linux so I don't break the rest of my system or have it all bleeding edge. Space isn't an issue nowadays and there is no performance penalty. Annoying, but workable.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: vsgan on May 23, 2013, 02:05:50 pm
I just did some inf edits to add the new strings for the card. The driver will not be a signed driver for that PCI ID though. Doesnt affect performance, but it will throw warnings.

Could you share the method for modifying the driver? I tried to add some strings for my hardware id but the driver still cannot install correctly. Thanks.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: verybigbadboy on May 23, 2013, 05:40:48 pm
Does the card actually work as NVIDIA GRID VGX after modification? e.g. vmware vsphere vgsa?
I tested vga passthrough with xen only. It works fine.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gamezr2ez on May 24, 2013, 12:55:04 am
Does the card actually work as NVIDIA GRID VGX after modification? e.g. vmware vsphere vgsa?
I tested vga passthrough with xen only. It works fine.

That is different from the vGPU offered by VGX. vGPU in VGX (http://www.nvidia.com/object/grid-vgx-software.html) did not work. I may play around with it more later to figure out if it is possible.

Update on PhysX offloading to the CPU:
Turns out this is a well-known issue. nVidia disables a bunch of stuff when it is not the primary graphics card in the OS. This is the case with typical VGA passthrough and Xen. I am trying to get primary passthrough to work now.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on May 24, 2013, 07:07:33 am

In the end, I gave up on Nvidia and decided to save myself money, modding effort and time and just got an ATI card instead, because I, too, only wanted a decent GPU to game with in a VM without having to dual boot machine. At least until Steam has a better selection of Linux capable games and get their client software working without requiring bleeding edge glibc.

I use a chroot for steam on linux so I dont break the rest of my system or have it all bleeding edge. Space isnt an issue now a days and there is no less performance. Annoying, but workable.

You could just install a newer glibc as required into a separate path and set LD_LIBRARY_PATH before invoking steam. LD_LIBRARY_PATH gets evaluated before ld.so.conf. No need for a full chroot. Just make sure your newer glibc isn't in one of the directories referenced in ld.so.conf or ld.so.conf.d.
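Something along these lines does the trick (just a rough sketch; /opt/steam-glibc is only an example path for wherever you unpack the newer glibc, and it assumes steam is on your PATH):

# Rough sketch: launch Steam against a newer glibc kept in a separate prefix.
# /opt/steam-glibc is only an example path - use wherever you installed it.
# LD_LIBRARY_PATH is evaluated before ld.so.conf, so the rest of the system
# keeps using the distro glibc.
import os
import subprocess

env = dict(os.environ)
env["LD_LIBRARY_PATH"] = "/opt/steam-glibc/lib:" + env.get("LD_LIBRARY_PATH", "")
subprocess.call(["steam"], env=env)  # assumes steam is on your PATH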
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: lightsol on May 28, 2013, 08:23:00 pm
Has anyone had luck with the actual VGX? It would be pretty damn nice if the GPU could be spread out to 10+ VMs at the same time.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gamezr2ez on May 29, 2013, 01:47:05 am
anyone have luck with the actual VGX? It would be pretty damn nice if the gpu could be spread out to 10+ VM's at the same time

I spent this past weekend working on that. Nothing.

I still have a few avenues to go down before I give up.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: winjet1 on May 29, 2013, 02:03:36 am
anyone have luck with the actual VGX? It would be pretty damn nice if the gpu could be spread out to 10+ VM's at the same time

Here's the issue........

If you are just passing a 680 -> VGX card to a VM, there's really nothing special there.  You are just doing a hardware passthrough and since 5.1, pass-through has been pretty stable.

If you are using a 680 -> VGX to do sVGA, then you are truly trying to use a VGX as intended. To use it here you have to enable a few things in VMware (see below).

http://communities.vmware.com/thread/415887?start=30&tstart=0 (http://communities.vmware.com/thread/415887?start=30&tstart=0)

Sadly, VMware's 3D drivers are really not up to par for running things like games. In addition, you can only farm out 256MB of RAM and 256MB of vRAM per VM (can you remember the last game you played on a 256MB card? Bet it wasn't Crysis 2).

Microsoft's Hyper-V and Citrix may be a better solution.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: vsgan on June 03, 2013, 07:19:00 am
Does any vendor other than VMware have a stable shared graphics acceleration solution at this point? Especially for *nix.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: lightsol on June 05, 2013, 02:52:20 am
Well from my experience RemoteFX is great, but it needs a lot of configuration
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gamezr2ez on June 05, 2013, 11:09:36 am
RemoteFX is great, seconded.

I have not tried this yet; I will set up a Server 2012 box tonight to test that. I will try it two ways, natively and then through Xen. That's right: Server 2012 with GPU passthrough, running VMs that use the GPU. How cool would that be? I am not familiar enough with Hyper-V to know if that will work inside of a Xen domU; I suspect not. Maybe it won't whine too much about the extensions it doesn't have access to.

According to an nVidia blog post about remoteFX (http://blogs.msdn.com/b/rds/archive/2010/07/08/more-partner-momentum-around-microsoft-remotefx-in-windows-server-2008-r2-sp1-beta.aspx), Quadro cards support remoteFX. This post was a few years prior to GRID cards, but I believe they should work too.

Will get back to the community on this.

PS: Through some testing and config verification, the PhysX offloading and poor GPU performance in passthrough turned out to be a problem with my motherboard not truly supporting Xen. I upgraded and everything is great. It is truly wonderful. (Plus I have an onboard iGPU to pass through now, yay!)
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: lightsol on June 05, 2013, 06:06:17 pm
Try with the just-released Server 2012 R2 preview; I heard that they upgraded the Hyper-V engine even more, so maybe there are some good RemoteFX upgrades as well :)
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: function on June 06, 2013, 10:59:39 am
I know that they, too, lack FLreset, but I am not all that convinced that FLreset is all that necessary. Sure, it makes it a little easier for the driver to do its job, but think about this at a low level like an embedded engineer for a moment. On the lowest level it comes down to setting registers on the device. Unless the card is poorly engineered and buggy (e.g. it drops off the bus in a questionable, un-re-attachable and uncontactable state), the driver should always be able to set the registers to whatever they need to be to get the card to a known, initialized state, without even any help from the card's BIOS. FLreset is a nicety that means your driver doesn't have to handle the initialization of the hardware itself, but it doesn't strike me at all as a necessity to get something like this working properly.

And I believe that is exactly the problem: GPU vendors do not want to reveal the initialization and setup of their hardware lest they end up revealing their IP. So the video BIOS does a lot of heavy lifting in bootstrapping the GPU. Intel's OpRegion is at least nice in that it allows for complete OS-based (as opposed to partly firmware-based) initialization.
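Incidentally, whether a particular card advertises FLR at all is easy to check from Linux without any vendor tools; a quick sketch (the PCI address is a placeholder - substitute your own card's from lspci):

# Check whether a PCI device advertises Function Level Reset.
# "0000:01:00.0" is a placeholder BDF - substitute your GPU's address
# (see "lspci | grep VGA"). Run as root so the capability list is visible.
import subprocess

bdf = "0000:01:00.0"
out = subprocess.check_output(["lspci", "-vv", "-s", bdf]).decode()
# lspci prints "FLReset+" in the PCIe DevCap block when FLR is advertised.
if "FLReset+" in out:
    print("FLR advertised")
elif "FLReset-" in out:
    print("FLR not advertised")
else:
    print("No PCIe capability info visible (try running as root)")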
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on June 06, 2013, 04:38:57 pm
I know that they, too, lack FLreset, but I am not all that convinced that FLreset is all that necessary. Sure, it makes it a little easier for the driver to do it's job, but think about this at a low level like an embedded engineer for a moment. On the lowest level it comes down to setting registers on the device. Unless the card is poorly engineered and buggy (e.g. it drops off the bus in a questionable, un-re-attachable and uncontactable state), the driver should always be able t o set the registers to whatever they need to be to get the card to a known, initialized state, without even any help from the card's BIOS. FLreset is a nicety that means your driver doesn't have to handle the initialization of the hardware itself, but it doesn't strike me at all as a necessity to get something like this working properly.

And I believe that is exactly the problem, GPU vendors do not want to reveal the initialization and setup of their hardware lest they end up revealing their IP. So the video bios does a lot of heavy lifting for bootstrapping the GPU. intel opregion is at least nice in that it allows for complete OS (in contrast to part firmware) based initialization.

I think you got that backwards. If they implemented FLR, they would need to reveal _less_ because the reset would be a single, standards defined call to reset the card without having to reveal _anything_ about the hardware. What using proprietary initialization does do, however, is make it more difficult for open source drivers to be written. This enables companies like Nvidia to charge you 5x the amount for the same hardware just for changing 2 resistors and half a byte of firmware to get access to the "pro" feature set of the driver.

It's not about protecting the IP - it's about protecting the high-margin revenue streams.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: function on June 10, 2013, 07:09:58 pm
I know that they, too, lack FLreset, but I am not all that convinced that FLreset is all that necessary. Sure, it makes it a little easier for the driver to do it's job, but think about this at a low level like an embedded engineer for a moment. On the lowest level it comes down to setting registers on the device. Unless the card is poorly engineered and buggy (e.g. it drops off the bus in a questionable, un-re-attachable and uncontactable state), the driver should always be able t o set the registers to whatever they need to be to get the card to a known, initialized state, without even any help from the card's BIOS. FLreset is a nicety that means your driver doesn't have to handle the initialization of the hardware itself, but it doesn't strike me at all as a necessity to get something like this working properly.

And I believe that is exactly the problem, GPU vendors do not want to reveal the initialization and setup of their hardware lest they end up revealing their IP. So the video bios does a lot of heavy lifting for bootstrapping the GPU. intel opregion is at least nice in that it allows for complete OS (in contrast to part firmware) based initialization.

I think you got that backwards. If they implemented FLR, they would need to reveal _less_ because the reset would be a single, standards defined call to reset the card without having to reveal _anything_ about the hardware. What using proprietary initialization does do, however, is make it more difficult for open source drivers to be written. This enables companies like Nvidia to charge you 5x the amount for the same hardware just for changing 2 resistors and half a byte of firmware to get access to the "pro" feature set of the driver.

It's not about protecting the IP - it's about protecting the high-margin revenue streams.

FLR, however, is only for resetting the device at the PCI bus level, not for actually bootstrapping it and loading firmware onto the ICs, DSPs, etc. on the board. E.g. PCI devices also support reset via a D3-D0 power-state transition, but that doesn't reinitialize the hardware after it has been reset either. I would also think that moving this complicated device initialization into the HDL would make board-specific customizations unnecessarily complex.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gpvecchi on June 11, 2013, 06:38:41 am
Hello! I need to convert my MSI GTX 680 Lightning into a GTX 770 Lightning, as I need a card to SLI with (680s are unavailable!)...
Flashing the BIOS doesn't change the device ID, so I suppose I need this hack... Any help, please?
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: verybigbadboy on June 11, 2013, 07:10:18 am
Hallo! I'd need to convert my MSI GTX 680 Lightning into a GTX 770 Lightning as I need a card to SLI (680 are unavailable!)...
Flashing the bios doesn't change device ID, so I suppose I need this hack... Any help, please?

Hello, you may try to change the device ID using the GTX 680 guide (https://www.eevblog.com/forum/projects/hacking-nvidia-cards-into-their-professional-counterparts/msg207550/#msg207550).
I have updated the resistor values for the GTX 770.
Thank you ;)
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: verybigbadboy on June 11, 2013, 07:20:24 am
Maybe it is off topic, but I have a question:
Did anyone try flashing the BIOS from a 670 onto a 680? Will it lock cores?
If we are unable to unlock cores, maybe we need to find out how to lock them? ;)

It would be really interesting to know the function of the nearby resistors. ;)
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gpvecchi on June 12, 2013, 05:57:26 am
Hallo! I'd need to convert my MSI GTX 680 Lightning into a GTX 770 Lightning as I need a card to SLI (680 are unavailable!)...
Flashing the bios doesn't change device ID, so I suppose I need this hack... Any help, please?

Hello, You may try to change device ID using gtx 680 guide (https://www.eevblog.com/forum/projects/hacking-nvidia-cards-into-their-professional-counterparts/msg207550/#msg207550)
I updated resistors values for gtx 770.
Thank you ;)
Great! Thank you very much! I just hope that the Lightning PCB doesn't differ in that spot...
EDIT: The PCB is very different... Could you please help me find resistor no. 3 in this hi-res image?
http://img823.imageshack.us/img823/9379/backfullb.jpg (http://img823.imageshack.us/img823/9379/backfullb.jpg)
Thanks!
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gamezr2ez on June 12, 2013, 03:45:25 pm
@gpvecchi

An image of the other side may be useful. I am assuming this is the "MSI GTX 680 Lightning" you spoke about previously.

R3 is possibly this one. (http://i.imgur.com/oKdeXET.jpg) Check to see if that is a 5k resistor. It should be if this is a GTX680. All the pieces seem to be there, just shifted around a bit.



@Anyone following GPU passthrough or GPU sharing

I moved this past weekend. It was a bit overly ambitious of me to try to test a new server setup before the move. I will attempt Hyper-V on Server 2012 this weekend.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gpvecchi on June 12, 2013, 07:23:57 pm
Yes, that's the PCB of the MSI GTX 680 Lightning (more power phases and military class components on custom PCB)...
This is the front side:
http://img405.imageshack.us/img405/8878/frontfulle.jpg (http://img405.imageshack.us/img405/8878/frontfulle.jpg)
Thank you!
P.S.: Does anyone know the difference between the GK104-400 of the 680 and the GK104-425 of the 770?
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: ret on June 12, 2013, 09:07:25 pm
Hi all!

First of all, thanks for all your hard work; I really appreciate the "jailbreaking" approach :)

Which card would you recommend for modding into a K1? I'd like to set up a lab to show off all the new 3D features introduced by Citrix - XenServer and vSphere support GPU virtualization, and this workaround would be awesome. Otherwise I'd have to spend ~$2,000 for a K1 card.

Thanks!
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: vgeek on June 13, 2013, 12:40:28 am
For those of you who would like to try these cards in vSphere and Horizon View 5.2, there is a new whitepaper from VMware that explains how to configure both vSGA (shared GPU) and vDGA (dedicated GPU):

http://www.vmware.com/files/pdf/techpaper/vmware-horizon-view-graphics-acceleration-deployment.pdf (http://www.vmware.com/files/pdf/techpaper/vmware-horizon-view-graphics-acceleration-deployment.pdf)
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: maxlapshin on June 13, 2013, 05:54:32 am
Is it possible to download nvidia bios under linux?
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on June 13, 2013, 04:25:07 pm
Bah. I just modified my GTX580 into a Quadro 7000 only to find out that my main/only reason for modifying it (gaming in a VM) is not applicable on this card - Quadro 7000 is not supported for VGA passthrough!  :palm:
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: verybigbadboy on June 13, 2013, 05:08:24 pm
Bah. I just modified my GTX580 into a Quadro 7000 only to find out that my main/only reason for modifying it (gaming in a VM) is not applicable on this card - Quadro 7000 is not supported for VGA passthrough!  :palm:

Hello :)
You may try to change the device ID to a Quadro 6000. :) I think it is strange to mod a 1-GPU GeForce card into a 2-GPU Quadro. ;)
Also, can you provide a picture with the resistor positions, please? :)

Thank you ;)
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on June 13, 2013, 10:09:44 pm
You may try to change device id to quadro 6000. :)

That was what I looked into first, but the device ID between the 580 and the 7000 can be adjusted by twiddling only the bottom 5 bits of the ID. To go all the way to the 6000 requires modifying the bottom 13 bits, which is more difficult.

I think it is strange to mod 1-GPU GeForce card to 2-GPU Quadro. ;)

You mean like modifying a GTX680 to a Grid K2? :)
Anyway, Quadro 7000 is a single-GPU card. I suspect you are confusing Quadro 7000 with QuadroPlex 7000. The latter is an external PCIe enclosure with two Q7000 cards in it. A Q7000 card is essentially identical spec to a GTX580, apart from having 2-4x the RAM on-board (6GB on Q7000 vs. 1.5-3GB on a 580).

Also can you provide picture with resistors positions please :)?

I'll see what I can do, but I lack a decent digital camera to do this with, and my cheap ass Android phone's camera barely has enough dots to make people's faces vaguely recognizable (as in recognizable as human), let alone surface mount components. :(
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: verybigbadboy on June 13, 2013, 11:59:51 pm
You may try to change device id to quadro 6000. :)

That was what I looked into first, but the device ID between 580 and 7000 can be adjusted by only twiddling the bottom 5 bits o fthe ID. To go all the way to the 6000 requires modifying the bottom 13 bits, which is more difficult.
You found one of the resistors - that's very good :) Maybe I can try to find the second one? ;)
I think it is possible to find a photo via Google; can you provide your card's name, please?

I looked at a BIOS from a 580. It is different from the 600-series BIOS ;) but I don't see any soft straps in it. It looks like the "and" and "or" masks allow any values. If I am right, it means you only need the hardware mod ;)
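For anyone wondering what those masks are: as I understand the soft-strap mechanism (my assumption, not something confirmed for this card), the BIOS entry combines with the hardware strap value roughly like this toy example:

# Toy illustration only: a soft-strap override is usually described as an
# AND mask plus an OR mask applied to the value read from the strap resistors.
def effective_strap(hw_strap, and_mask=0xFF, or_mask=0x00):
    return (hw_strap & and_mask) | or_mask

# Permissive masks (AND all-ones, OR zero) leave the resistors in control:
print(hex(effective_strap(0xA8)))                               # 0xa8
# A BIOS that forces the low nibble would use masks like these instead:
print(hex(effective_strap(0xA8, and_mask=0xF0, or_mask=0x0B)))  # 0xab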

thank you ;)
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on June 14, 2013, 12:09:38 pm
After my failure to get a GTX580 to do what I want, I got my hands on a cheap GTS450 (thanks for your investigation on those cards earlier on the thread). Successfully modified it to a Q2000.

Good news: Works fine for gaming in a Xen VM. :)
Bad news: There is absolutely no performance boost in SPECviewperf from modifying them to Quadros. I have a genuine Quadro 2000 and that utterly annihilates the modified GeForce card in SPECviewperf.

The thing is - I cannot figure out why. I went as far as flashing a Q2000 BIOS to a GTS450, hacked about with the BIOS, compared clocks, GPU-Z and CUDA-Z info, and absolutely everything I can see shows that the GTS450 should be about 50% faster because it has the same number of shaders and memory and is clocked about 50% higher.

What is the special ingredient that makes the drivers decide to use a non-crippled OpenGL renderer? They clearly identify the card as a quadro, otherwise it wouldn't work in the VM. Most perplexing...
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gamezr2ez on June 15, 2013, 12:30:52 am
What is the special ingredient that makes the drivers decide to use a non-crippled OpenGL renderer? They clearly identify the card as a quadro, otherwise it wouldn't work in the VM. Most perplexing...

I suppose I should ask. You are running the Quadro drivers, yes?

Also, try these performance tests under native Windows, not through Xen. You have to do some weird stuff to get full options under Xen.

Bah. I just modified my GTX580 into a Quadro 7000 only to find out that my main/only reason for modifying it (gaming in a VM) is not applicable on this card - Quadro 7000 is not supported for VGA passthrough!  :palm:

Neither is the K5000. (http://www.dwhynes.com/nvidia-quadro-k5000/) It still works. I would suggest using the GeForce drivers and checking that. When modding my card to the Quadro, I got different results when using the GeForce vs. the Quadro drivers.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on June 15, 2013, 03:02:24 am
What is the special ingredient that makes the drivers decide to use a non-crippled OpenGL renderer? They clearly identify the card as a quadro, otherwise it wouldn't work in the VM. Most perplexing...

I suppose I should ask. You are running the Quadro drivers, yes?

Of course.

Also, try these performance tests under native windows, not through Xen. You have to do some wierd stuff to get full options under Xen.

I am running them on bare metal Windows.

Bah. I just modified my GTX580 into a Quadro 7000 only to find out that my main/only reason for modifying it (gaming in a VM) is not applicable on this card - Quadro 7000 is not supported for VGA passthrough!  :palm:

Nether is the K5000. (http://www.dwhynes.com/nvidia-quadro-k5000/) It still works. I would suggest using the Geforce drivers and checking that. When modding to the Quadro on my card, it had different results when using the Geforce vs Quadro drivers.

I uninstalled the GeForce drivers - they are newer and clobber the Quadro drivers. WEIRD things happen when you have a GeForce and a Quadro in the same system. But putting in a Q2000 on its own makes SPECviewperf go fast. Pulling it and replacing it with a modified GTS450 makes it go slow. Same driver, same version, according to Device Manager.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on June 15, 2013, 04:56:40 am
Quick question guys - regarding the GTX690 mod - can the IDs of both GPUs be modified? I'm pondering unifying two gaming rigs into one: get a GTX690 and pass one GPU to each VM. One big box, 2 monitors, 2 keyboards, and 1x GTX690 -> Grid K2 seems like a neater solution than 2x GTX680s (that, and I'm a little short on PCIe slots for the other things needed in there).
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: free_electron on June 15, 2013, 06:39:43 am

What is the special ingredient that makes the drivers decide to use a non-crippled OpenGL renderer? They clearly identify the card as a quadro, otherwise it wouldn't work in the VM. Most perplexing...

Fuse bits in the core, disabled. Do you think NVidia is so stupid that it would simply be a few resistors on the outside?
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: amigo on June 15, 2013, 09:59:05 am
@gordan

I am in the same boat: I have a GTX 570 and found the necessary resistors for it to go to a Q7000 and to a Tesla C2075 or C2090, but there are no performance changes.

What have you tried doing to the GTX BIOS to adjust it to the new mod?

Flashing the card to the actual Quadro/Tesla BIOS does not work because of timings and other things, but these parameters can be used as transplants.

I would be interested if you have worked further on getting the upper 13 bits figured out for the Quadro 4/5/6000, if that's even an option?
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: amigo on June 15, 2013, 10:01:08 am
fusebits in the core. disabled. you think NVidia is so stupid that it would simply be a few resistors on the outside ?

That's an interesting hypothesis, but do you have anything to support it?
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: free_electron on June 15, 2013, 10:51:32 am
That's an interesting hypothesis, but do you have anything to support it?

Here's how it works:

The resistors are used to downgrade boards depending on what is installed. You can only go down, not up. The maximum capability of the GPU is set in the GPU itself. The resistors determine the downmix (memory, speed, cores, voltage, core voltage, RAM voltage).

You can set the resistors to a combination above what the core is specced at, but it will not enable. It is a logical OR of what the resistors say and what the fuse bits say, and the fuse bits have priority. Sometimes there are no fuse bits, as the chip simply does not have the extra features. One board layout can be used for multiple different GPUs in a family.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on June 15, 2013, 11:07:03 am
@free_electron - not always the case.

In the case of GTS450 vs. Quadro 2000, however, I believe I have just figured it out - it is actually a slightly different chip between the two; I missed a tiny detail in the spec readings.

GTS450 == GF106
Quadro 2000 == GF106GL

I'm guessing the "GL" is important.  :palm:

Never mind - I got the VM VGA passthrough functionality for free - that was the main (and only) thing I was after anyway. :)
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: amigo on June 15, 2013, 11:12:23 am
Here's how it works :

The resistors are used to downgrade boards depending on what is installed. You can only go down , not up. The max capability of the GPU is set in the GPU. The resistors determine the downmix (memory, speed, cores, voltage ,core voltage, ram voltage) .

You can set resistors to a combination above what the core is specced at but it will not enable. it is a logical OR of what the resistors say and what the fusebits say. Fusebits have priority. Sometimes there are no fusebits as the chip simply does not have the extra features. One board layout can be used for multiple different GPU's in a family.

Would you happen to know which features that relate to Quadro capability would be disabled in the core, though?

Or is it that the fuse bits only manage the features that exist in all chips from the same family (i.e. GF100 or GF110: GTX480/GTX570 vs. Quadro 4/5/6000 / Quadro 7000)? In that case, if the fuse bit is set the driver disables the feature, but the feature still exists in the chip itself.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: SeanB on June 15, 2013, 07:38:47 pm
Doing a Zener zap on bits during an on-wafer test is cheap. A basic function test determines whether the chip is fully good and usable for the top line; partially good and usable with some cores disabled (there will be a little logic to allow selective disabling of cores and then reorganising the remainder to appear contiguous), so it becomes a lower-spec chip; or DOA / failing in critical areas (the same thing really - if the cache logic or glue logic is faulty it will never work properly), so it is recycled. The zap pads are never bonded out to pads on the BGA. This might add only one second to the initial test, at the lowest-cost part of the process, before an expensive package is added to what would otherwise be a dead unit. Then you can slice and sort based on the paint dot on the chip, and grade them into bins as needed for assembly. Then you take the complete units, place them in a test jig for speed grading, and use resistors or links to enable features. That way you get different chip suffixes, as they are now different items. If you are lucky, they ran out of the low-spec chips while doing the run for your card and finished off with full-spec units, as they cost the same to make. That, though, is the luck of the draw: you could have two cards made on the same line on the same day that use different chips.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: amigo on June 15, 2013, 10:39:48 pm
The question still remains, though: what makes a Quadro a QUADRO?

There are no special fancy features distinguishing the two lines of chips - perhaps a slight core difference, if any, such as shaders / TMUs / ROPs. But that fits the selection process described above, so really the features are all there. So what decides whether something is a Quadro, and what does a Quadro do that a GeForce doesn't?

What I mean is that if you run SPECviewperf you will get two different results from the same line of chips (i.e. GF100) just because one is a GeForce and the other is a Quadro. What cripples the performance: the OpenGL driver, the BIOS, hard straps, fuses, ...?
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: jimseal on June 16, 2013, 04:22:05 am
Trying to do a proof of concept before investing in the real thing.  I have an ASUS GTX660TI-DC2T-2GD5 card and would like to mod it to a Quadro K5000 or some other GPU virtualization compatible card.  (Hyper-V 2012 ?).  The ASUS card seems different from the other images in this thread.  Here are some pictures of both sides, if someone would be so kind as to help me out:

Thank you,
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: verybigbadboy on June 16, 2013, 07:38:57 am
Trying to do a proof of concept before investing in the real thing.  I have an ASUS GTX660TI-DC2T-2GD5 card and would like to mod it to a Quadro K5000 or some other GPU virtualization compatible card.  (Hyper-V 2012 ?).  The ASUS card seems different from the other images in this thread.  Here are some pictures of both sides, if someone would be so kind as to help me out:

Thank you,
Hello, it looks like they rotated the GPU :)
Can you take a picture of the area shown in the attached pic?

I googled some photos, and I think I have found the locations of the resistors.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: free_electron on June 16, 2013, 05:35:44 pm
@free_electron - not always the case.

Like I said: the same PCB layout can hold multiple chips. Some are actually different, some are fuse bits.

A key element of Quadro is the drivers. If you've got a regular card and you experience a software glitch, nobody gives a rat's ass.
If you have a Quadro and there is a glitch with a particular piece of software, it will get fixed, and it will get fixed very quickly.
Quadros are workstation cards, meaning they are used to visualise data from engineering software: SolidWorks, Catia, Adobe... all the big stuff out there.

There are other elements to a Quadro. OpenGL runs in hardware on a Quadro, as opposed to software emulation on a non-Quadro. Quadros can have much more memory than other boards (which is important for massive polygon scenes, and no, SLI does not share memory...). Try loading a complete Catia design for the wing of an airplane, down to the last nut and bolt, then change the viewport. The Quadro will respond immediately; the GTX will curl up in a corner, choke, and may give you a screen refresh a few minutes later. Unworkable if you are designing.

As software is rewritten to take advantage of CUDA, the Quadro will lose its advantage. But right now 3D CAD software is, in a lot of cases, still driven by OpenGL or DirectX, and there the Quadro runs circles around the other boards.

In short: for gaming and video processing there is no advantage; stick with the GTX.
For anything involving massive vertex counts, the Quadro will chomp the GTX to bits. So if you are thinking of making the next Toy Story or Monsters Inc., or running things like 3ds Max, Catia and other high-end CAD software... Quadro is the way to go. Games have no advantage on a Quadro, as the number of polygons a game throws at the GPU is simply too small to notice an effect.

Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: jimseal on June 17, 2013, 05:21:16 am
More info on the 660Ti ASUS

Hope that helps...

Thanks
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: verybigbadboy on June 17, 2013, 07:34:32 am
More info on the 660Ti ASUS

Hope that helps...

Thanks

Thank you for the perfect photos ;)

You may use the resistor value table from the 680 (https://www.eevblog.com/forum/projects/hacking-nvidia-cards-into-their-professional-counterparts/msg207550/#msg207550)


Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: sw on June 17, 2013, 03:32:20 pm
Hello, I have an EVGA GeForce GTX 660Ti (03G-P4-3663-KR) that I would possibly like to try this on. I have read through this forum but still have a few questions. It seems this card can be made to show up as a Tesla K10, Quadro K5000 or Grid K2, according to a post on page 16. I game a lot, but I also use processor-intensive programs like Adobe CS6 and a few others that would seem to benefit from these higher-end cards. Which would be the best card for me to change the resistor values to, and will I lose PhysX support or anything similar by reporting this card to the computer as a higher model?
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on June 17, 2013, 05:32:31 pm
... will I loose PhysX support or anything similar by reporting this cards to the computer as a higher model?

I have found this to be somewhat hit-and-miss. GTS450 -> Q2000 appears to lose PhysX on XP-64, but not on Win7-64. GTX480->Q6000 seems to retain PhysX capability on XP-64. The only way to really know for sure is to try it and see, over and above going over the spec of all quadro cards on the Nvidia website and checking which ones they claim support PhysX.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: sw on June 17, 2013, 05:42:09 pm
OK, so should I use the resistor values on the EVGA GTX 660Ti (03G-P4-3663-KR) that have been shown for the 670 card, because the board is the same? And is anyone working on a way to make this information easier to view? That would be nice; if not, I may do that.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gamezr2ez on June 18, 2013, 06:39:47 am
... will I loose PhysX support or anything similar by reporting this cards to the computer as a higher model?

To add to what gordan said, my experience has been strictly with one card (GTX680) and Windows 7 64.

When I was running native Windows, I never lost PhysX under any scenario.

While running Windows under Xen, I lost PhysX when the card reported itself as a Quadro K5000 and I was using the Quadro (not the GeForce) drivers. While using the GeForce drivers it worked fine.

While running Windows under Xen, I lost PhysX when the card reported itself as a Grid K2; I was able to retain PhysX only when doing primary GPU passthrough.

Nvidia drivers seem to know they are running on Xen even with primary passthrough, and like to cause trouble (some missing options and features that are tricky to get back).
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: verybigbadboy on June 18, 2013, 08:00:22 am
There is no issue with PhysX for me.
I have a Grid K2 as a Xen domU with the Quadro/Grid drivers. I have no PhysX by default, because the Quadro drivers don't contain the "PhysX System Software". But it is easy to fix :) Just download the GeForce drivers and unpack them, then install the "PhysX System Software" from the "PhysX" folder and enjoy :)
I am able to run the Supersonic Sled demo from NVIDIA. :)

Maybe I am missing something? ;)
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gamezr2ez on June 18, 2013, 10:28:55 am
May be I am missing something? ;)

Bad luck. I'll give you some if you want it.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on June 18, 2013, 12:02:15 pm
When modding to the Quadro on my card, it had different results when using the Geforce vs Quadro drivers.

I take it this is the GTX680 -> K5000 mod you are talking about. How much of a difference do you actually see in specviewperf? A slight difference, or a ~5x difference in Catia?
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on June 19, 2013, 06:23:28 am
Just in case anyone is wondering, I just compared the output of
lspci -vvvv
between my genuine Q2000 and a modified GTS450, and there is no difference between them at all. The only things that come up on a diff are the different BAR addresses the card got mapped to. At least that excludes one thing that could be making it not quite behave like a real Quadro in terms of SPECviewperf, and we're back to the GF106 vs. GF106GL chip-wise.

Has anyone got a genuine Quadro 5000 or Quadro 6000 (or even Quadro 7000 - yes, I know, fat chance) that they could capture the output of lspci -vvvv on? We could then compare it to what my faux Quadros show.
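If anyone wants to contribute, something like the sketch below is all it takes to capture and diff the dumps (the two PCI addresses are placeholders for the genuine card and the modified one):

# Capture "lspci -vvvv" for two cards and print only the lines that differ.
# Both BDF addresses are placeholders - take the real ones from "lspci | grep VGA".
# Run as root so all capability blocks are included.
import difflib
import subprocess

def dump(bdf):
    return subprocess.check_output(["lspci", "-vvvv", "-s", bdf]).decode().splitlines()

real = dump("0000:01:00.0")    # genuine Quadro
modded = dump("0000:02:00.0")  # GeForce modified to report a Quadro ID
for line in difflib.unified_diff(real, modded, "real", "modded", lineterm=""):
    print(line)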
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: nporter on June 19, 2013, 11:14:24 pm
I have a Quadro 4000 and a Tesla C2075 if either of those would help in this or anything in the future.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on June 20, 2013, 01:35:42 am
I have a Quadro 4000 and a Tesla C2075 if either of those would help in this or anything in the future.

Hmm... Q4000 doesn't really have a straight GeForce equivalent, but a C2075 is essentially a GTX480 with 4x the RAM.
Give me a few days to re-mod my GTX480 into a pseudo-C2075 and see what I can come up with for comparing notes.
We'd need to find some kind of a benchmark to compare performance between them, though, since AFAIK Teslas have no video ports (or do they?), which means no SPECviewperf comparison, which is the only thing I've found that performs differently.

Edit:
Dismantled my Q2000 and GTS450s. There is a difference in the chip number.
Q2000: GF106-875
GTS450: GF106-250

Whether that means something conclusive or not remains to be seen.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: jimseal on June 20, 2013, 07:04:17 am
Thanks for your help so far,

Back concerning the ASUS GTX660TI-DC2T-2GD5

One more question please:

The config I am looking for (Tesla K10) asks for a 40 kOhm resistor,
and it seems that they are not that easy to find or are no longer sold, so the question is: will a 39 kOhm or a 46 kOhm one work?

Thanks to anyone who can answer with some experience on this detail.



Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gamezr2ez on June 20, 2013, 07:40:22 am
so question is will a 39KOhms or a 46KOhms one work?

39k should work. I don't actually use any 40k resistors, as the 40k caused me some problems; they are just a solution to the stability issues some have experienced.

Since you have an ASUS card in the same series as mine (I've got the 680 equivalent of that card), try without a 40k resistor and see if your ID changes randomly. If it doesn't change randomly and is the value you want, you can leave off the 40k ones.

EDIT: Clarified wording.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on June 21, 2013, 07:42:20 am
Interesting article on GeForce vs. Quadro performance:
http://doc-ok.org/?p=304 (http://doc-ok.org/?p=304)
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on June 22, 2013, 05:09:30 am
I think I might be getting somewhere with this.

Attached is the glxinfo output from a genuine Q2000 and a GTS450 modified to Q2000.

They are quite substantially different.

I wonder what differences lurk in the BAR0 registers and if they might be changeable...
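For anyone repeating the comparison, the interesting part is mostly the extension lists, which diff badly as raw text; here is a small sketch for set-comparing them (the file names are placeholders for saved glxinfo outputs):

# Compare the OpenGL/GLX extension sets reported by two saved glxinfo outputs.
# "q2000.txt" and "gts450.txt" are placeholder file names for dumps captured
# with "glxinfo > q2000.txt" on each card.
import re

def extensions(path):
    with open(path) as f:
        return set(re.findall(r"\bGLX?_[A-Za-z0-9_]+", f.read()))

real = extensions("q2000.txt")
modded = extensions("gts450.txt")
print("Only on the real Quadro:", sorted(real - modded))
print("Only on the modded card:", sorted(modded - real))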
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: Bishop on June 26, 2013, 06:09:32 am
Good evening
I have a Galaxy GeForce GTX 670 GC 2 GB GDDR5  67NPH6DV5ZVX and I would like to turn it into one of the workstation counterparts. I mainly want it for Xen passthrough to a second monitor domU with 3d/gaming capabilities. The dom0 will be used by another person for work with the iGPU. Is this possible? Do I have to remove or solder resistors? Removing resistors is an option but resoldering cables/resistors freaks me out!
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gamezr2ez on June 26, 2013, 05:27:37 pm
The dom0 will be used by another person for work with the iGPU. Is this possible?

Possible, yes. The i915 (Intel HD4000) will only work if you are using it for a text-only console. X has artifacting with the Xen kernel and the Intel driver when using the iGPU on dom0 (presumably Xen is at fault here; no solution in the works AFAIK).

I wouldn't recommend using dom0 as a workstation anyway. I passed the iGPU through to an HVM and used it that way. More secure, though a bit harder to manage dom0 when it is headless.

As for the resistors and such, I suggest reading the thread before attempting anything like this. The questions you posed are all answered.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on June 27, 2013, 05:32:50 am
I would have thought you'd still be able to use the iGPU with the dumb frame buffer, if performance isn't too much of a requirement.

Performance of accelerated drivers is often an issue on dom0 - Nvidia accelerated drivers get about 3fps in glxgears. ATI ones didn't work at all in Xen dom0 last time I checked (not that I would recommend ATI for anything these days - they have crippled and dumbed down their products to the point where Nvidia have me well and truly by the proverbials by being the only game in town for my use-cases).
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gamezr2ez on June 27, 2013, 06:43:21 am
nVidia and ATI/AMD proprietary and open source drivers all work with dom0 from reports I have read.

I have personal experience with AMD proprietary and open source working with dom0.

As far as I am aware, the _only_ crippling issue with video and dom0 is with the i915.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: Bishop on June 27, 2013, 07:48:02 am
The dom0 will be used by another person for work with the iGPU. Is this possible?

Possible, yes. The i915 (Intel HD4000) will only work if you are using it for a text only console. X has artifacting with the Xen kernel and intel driver when using the iGPU on dom0 (presumed Xen is at fault here, no solution in the works afaik).

I wouldn't recommend using dom0 as a workstation anyway. I passed the iGPU through to an hvm and used it that way. More secure, bit harder to manage dom0 when it is headless, though.

As for the resistors and such, I suggest reading the thread before attempting anything like this. The questions you posed are all answered.
Got it. What about passthrough on the NVidia GTX 670? I've been reading some material that says it is not possible and that only the workstation counterparts are able to do it. I am sorry, I'm just scared of bricking my card, as I'm currently a broke student in a 3rd-world country studying beginner's virtualization in a subpar laboratory  :-+
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: amigo on June 28, 2013, 12:27:44 am
I think I might be getting somewhere with this.

Attached is the glxinfo output from a genuine Q2000 and a GTS450 modified to Q2000.

They are quite substantially different.

I wonder what differences lurk in the BAR0 registers and if they might be changeable...

I'm not sure what version of NVidia drivers you are using, but as a hint try something older, perhaps from the 270.XX line.

With the original softmods back in the day, I believe NVidia eventually caught up and disabled the advanced features (i.e. Quadro) in the drivers.

Still trying to determine when it occurred, which requires testing many old versions of drivers.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on June 28, 2013, 06:15:27 am
I'm not sure what version of NVidia drivers you are using, but as a hint try something older, perhaps from the 270.XX line.

What would that prove?

With the original softmods back when, I believe that NVidia had caught up with that and disabled advanced features (ie. Quadro) in drivers.

How do you figure that? How does the driver know it's not a real Quadro? I'm using the latest, 319.23 on Linux. There are no separate Quadro and GeForce drivers on Linux, it's the same driver for both.

Still trying to determine when it occurred, which requires testing many old versions of drivers.

I suspect the functionality is laser cut out.
Then again, it could be there is extra strapping on the PCB (e.g. a cap across chip pins) that disables certain functionality, but this is hard to eyeball. All 3 GTS450 cards have markedly different cap arrangements under the GPU, which is different again from the real Q2000.

I don't really use apps that benefit from the extra GL primitives, so I'm not that interested in researching further, but if anyone else is planning to put some more work into it, I would suggest comparing the register values in BAR0 before the driver loads (reasonably easy to do in Linux: just mmap() the BAR and dump it out, and compare the real thing vs. the modified card). PMC should contain register values that imply which engines are enabled/disabled, and you _might_ be able to re-enable them by clobbering the values with those from a real Quadro. If you can figure it out, on Linux you could write a small program that gets executed before the binary driver loads and clobbers the values to convince the driver the card is the real deal.

I'm not a big Windows user so I'm not sure how that might work in Windows, but something similar might be possible.
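For anyone who wants to try it, here's a rough, untested sketch of the Linux side of what I mean - the PCI address (0000:01:00.0) and the 4 KiB length are only placeholders, substitute your own card's address from lspci and run it as root:

#include <stdio.h>
#include <stdint.h>
#include <fcntl.h>
#include <unistd.h>
#include <sys/mman.h>

int main(void)
{
    /* Placeholder PCI address - take your card's from lspci. */
    const char *bar = "/sys/bus/pci/devices/0000:01:00.0/resource0";
    const size_t len = 0x1000; /* the first 4 KiB of BAR0 is enough for a first diff */

    int fd = open(bar, O_RDONLY);
    if (fd < 0) { perror("open"); return 1; }

    volatile uint32_t *regs = mmap(NULL, len, PROT_READ, MAP_SHARED, fd, 0);
    if (regs == MAP_FAILED) { perror("mmap"); return 1; }

    /* Dump the registers as 32-bit words; run this against the real card and the
     * modified one and diff the two outputs. */
    for (size_t off = 0; off < len; off += 4)
        printf("0x%04zx: 0x%08x\n", off, (unsigned)regs[off / 4]);

    munmap((void *)regs, len);
    close(fd);
    return 0;
}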
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: amigo on June 28, 2013, 06:56:16 am
What would that prove?
It would prove that things have changed in the mean time.

People have soft-modded their GTX cards to Quadros in the past, remember RivaTuner? It had a soft-strap editor that would change PCI Id to match whatever other card.

Also, remember that fellow who posted on NVidia CUDA forums about modding vbios soft-straps on his GTX 480 to a Tesla.

How do you figure that? How does the driver know it's not a real Quadro? I'm using the latest, 319.23 on Linux. There are no separate Quadro and GeForce drivers on Linux, it's the same driver for both.
I might as well say the same thing back at you: How do *you* figure that?? Obviously you have not done anything past installing the latest drivers and drawing a conclusion based on a quick look at it.

I suspect the functionality is laser cut out.
Then again, it could be there is extra strapping on the PCB (e.g. a cap across chip pins) that disables certain functionality, but this is hard to eyeball. All 3 GTS450 cards have markedly different cap arrangements under the GPU, which is different again from the real Q2000.
Prove it.

Everyone assumes some kind of manufacturing process or what not involved to cripple GPUs. I suppose because this is an electronics forum that's inherent, but if you have not looked into other aspects, I would not draw conclusions so fast.

Actually I am somewhat positive that the crippling occurs on the software side because, unlike you, I tried an older driver which gave me a different result than the new one, AND I took a peek under the hood (inside the driver).
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gamezr2ez on June 28, 2013, 08:41:55 am
I am somewhat positive that crippling occurs on the software side

I would agree with you that this is a very strong possibility. In my experience so far, the driver tends to not want to allow certain features under Xen. I think this is mostly due to the way Xen presents the GPU to the OS (and whether you have a secondary GPU, even if it is disabled). The point here is that they have a documented history of crippling features within the software.

But I know more about software, so I would tend to believe it to be a software problem with a software solution.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on June 28, 2013, 05:08:23 pm
What would that prove?
It would prove that things have changed in the mean time.

Like what? Are you suggesting that newer drivers do more crippling and better faux-Quadro detection than the old ones?
Can you please list an old driver version that produces markedly better results in SPECviewperf tests on a pseudo-Quadro?

People have soft-modded their GTX cards to Quadros in the past, remember RivaTuner? It had a soft-strap editor that would change PCI Id to match whatever other card.

Also, remember that fellow who posted on NVidia CUDA forums about modding vbios soft-straps on his GTX 480 to a Tesla.

Soft-modding still works to a large extent. The only limitation is the strap bits that are exposed in software.
In the past three weeks I have soft-modded:
GTS450->Q2000
GTX470->Q5000
GTX480->Q6000
GTX580->Q7000
GTX680->Tesla K10

Q7000 and Tesla K10 aren't "MultiOS" capable, so the driver doesn't make them work with VGA passthrough, which makes the mod of limited usefulness, over and above enabling TCC feature in Windows.

Everything required to do this is pretty well documented by the nouveau project. There are also other things that are useful (and easy) to mod, for example the BAR1 size, to make it >= the size of the RAM on the card. This lets you take, say, a 4GB card and, when you're not gaming, map it as a really fast block device for some fast swap or whatever you might need.
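(As a quick sanity check on the BAR1 part - this is just an untested sketch and the PCI address is a placeholder - the size of the sysfs resource file should match the BAR size, so something like this tells you whether the change took:)

#include <stdio.h>
#include <sys/stat.h>

int main(void)
{
    /* Placeholder PCI address - take your card's from lspci; resource1 is BAR1. */
    struct stat st;
    if (stat("/sys/bus/pci/devices/0000:01:00.0/resource1", &st) != 0) {
        perror("stat");
        return 1;
    }
    printf("BAR1 is %lld MiB\n", (long long)(st.st_size >> 20));
    return 0;
}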

Kepler BIOSes are quite different, but having spent an hour looking at the dump out of my GTX680, I've worked out most of the bits relevant to this thread. The strapping on them is... odd. It's done in two places, but what is odd is that it is done the same way in two places.

The other odd thing is that the PCI device ID is set in two places, but it has a profound effect on the way the card is handled. For example, just changing the strapping for the device ID from GTX680 -> Tesla K10 works fine. But if you also change the PCI ID, at least in the primary location, the card no longer works properly - it gets detected as a standard unaccelerated VGA adapter, even though the Quadro driver is still running it, and the RAM on it shows up as 2990MB instead of 4096MB. Put the device IDs back to GTX680 (but keep the K10 strapping), and it works fine. I'm guessing there may be a checksum on the blocks containing the PCI IDs, but I didn't have a genuine Tesla K10 BIOS handy to cross-check against it last night.

Also, I'm not 100% sure, but I thought I saw a substantial boost in Crysis benchmark fps when running as a K10 - but I could have just been dreaming that. Will need to re-test later today when I am more properly caffeinated.

How do you figure that? How does the driver know it's not a real Quadro? I'm using the latest, 319.23 on Linux. There are no separate Quadro and GeForce drivers on Linux, it's the same driver for both.
I might as well say the same thing back at you: How do *you* figure that?? Obviously you have not done anything past installing the latest drivers and drawing a conclusion based on a quick look at it.

I'm guessing that on the basis that laser cutting chips is cheap and easy during production, and the technique is widely used.
Plus the fact that the GF106 in the Q2000 has a different part number etched into it compared to the GF106 in the GTS450 (sample size 3 for GTS450, 1 for Q2000).

I suspect the functionality is laser cut out.
Then again, it could be there is extra strapping on the PCB (e.g. a cap across chip pins) that disables certain functionality, but this is hard to eyeball. All 3 GTS450 cards have markedly different cap arrangements under the GPU, which is different again from the real Q2000.
Prove it.

Everyone assumes some kind of manufacturing process or what not involved to cripple GPUs. I suppose because this is an electronics forum that's inherent, but if you have not looked into other aspects, I would not draw conclusions so fast.

Actually I am somewhat positive that crippling occurs on the software side because, unlike you, I tried an older driver giving me different result than the new one, AND I took a peek under the hood (inside the driver).

Prove it. :)
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: amigo on June 28, 2013, 11:18:05 pm
Soft-modding still works to a large extent. The only limitation is the strap bits that are exposed in software.
In the past three weeks I have soft-modded:
GTS450->Q2000
GTX470->Q5000
GTX480->Q6000
GTX580->Q7000
GTX680->Tesla K10

Q7000 and Tesla K10 aren't "MultiOS" capable, so the driver doesn't make them work with VGA passthrough, which makes the mod of limited usefulness, over and above enabling TCC feature in Windows.

Everything required to do this is pretty well documented by the nouveau project. There are also other things that are useful (and easy) to mod, for example the BAR1 size, to make it >= the size of the RAM on the card. This lets you take, say, a 4GB card and, when you're not gaming, map it as a really fast block device for some fast swap or whatever you might need.

It seems to me you mistakenly compare ability to be detected as something else (spoofing) with real conversion of one to another. Perhaps in your application of this modification (Xen in Linux) it is sufficient to just spoof the wanted card but what I'm looking for is actually getting the Quadro performance instead.

I'd like to know how you modded the GTX 580 to a Q7000 with soft straps and had it work.

Because I've spent some time on a similar project using a GTX 570, and the only real way to do this was through changing hard straps (the driver ignores vbios soft-straps and the system either will not boot, boots to a black screen, or does not initialize the GUI). Some, if not all, Quadro vbioses actually check whether the ROM and board PCI ID match and lock out if they don't.

I am able to get a Q7000, C2075 through hard straps and the system actually boots and initializes. Then I am able to install the latest Quadro/Tesla driver, alas the performance is not there due to whatever limitations imposed (I'd say software).

Kepler BIOSes are quite different, but having spent an hour looking at the dump out of my GTX680, I've worked out most of the bits relevant to this thread. The strapping on them is... odd. It's done in two places, but what is odd is that it is done the same way in two places.

Why don't you share that information here? As it is not documented anywhere, it will help everyone else.

The other odd thing is that the PCI device ID is set in two places, but it has a profound effect on the way the card is handled. For example, just changing the strapping for the device ID from GTX680 -> Tesla K10 works fine. But if you also change the PCI ID, at least in the primary location, the card no longer works properly - it gets detected as a standard unaccelerated VGA adapter, even though the Quadro driver is still running it, and the RAM on it shows up as 2990MB instead of 4096MB. Put the device IDs back to GTX680 (but keep the K10 strapping), and it works fine. I'm guessing there may be a checksum on the blocks containing the PCI IDs, but I didn't have a genuine Tesla K10 BIOS handy to cross-check against it last night.

You might be finding the PCI ID from the VBIOS and the EFI sections, that's why there are two matches.

On newer Windows systems and Macs, EFI is used to init/boot the card and so the PCI ID in that section also has to be changed - needs to match hard straps. EFI code itself also checks for the Vendor ID, PCI ID and Board ID, so that has to be changed accordingly.

As far as I know the checksums exist for the entire ROM and for the HWINFO section (soft-straps), although it is true that the GK ROM format has changed and they've introduced wrappers for the old VBIOS. Obviously one needs to make sure the HWINFO checksum is correct and then adjust the entire ROM checksum.

You've also probably noticed a cryptographic signature at the end of the ROM which UEFI/Winblows uses to verify authenticity of hardware. In Linux that is irrelevant but thanks to M$, they've managed to create a deadlock and control all OEMs who produce motherboards with this whole signature issue (remember Linux booting problems until the whole UEFI issue was figured out).
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on June 29, 2013, 01:15:56 am
It seems to me you mistakenly compare ability to be detected as something else (spoofing) with real conversion of one to another. Perhaps in your application of this modification (Xen in Linux) it is sufficient to just spoof the wanted card but what I'm looking for is actually getting the Quadro performance instead.

Indeed, all the mods so far, hard or soft, have been purely about spoofing. I am not aware of any mod that will re-enable the missing part of the hardware GL implementation.

I'd like to know how did you mod the GTX 580 to Q7000 with soft straps and have it work?

The usual way. Change the PCI ID, re-calc the checksum, flash the card, change the straps, in that order. It "just works". Doesn't really gain you much, though. Mosaic app might work, but since Q7000 isn't MultiOS, it won't run in VGA passthrough mode.

Because, I've spent some time on a similar project using a GTX 570 and the only real way to do this was through changing hard straps (driver ignores vbios soft-straps and the system will not boot, will boot with a black screen or not initialize the GUI). Some, if not all, Quadro vbioses actually check if the ROM and board PCI ID match and also lock out.

I'm not using a Quadro BIOS, I'm using a modified version of the original BIOS. Switching to a Quadro BIOS would be a whole different can of worms to open.
I've not had any problems with it as a Q7000; the only reason I bother keeping it as that is so that I don't have the GeForce driver clashing with the Quadro driver on the same system when I boot into Windows (the other card is a GTX480 -> Q6000, which is what I use for VGA passthrough).

I am able to get a Q7000, C2075 through hard straps and the system actually boots and initializes. Then I am able to install the latest Quadro/Tesla driver, alas the performance is not there due to whatever limitations imposed (I'd say software).

I got to that point without heating up my soldering iron.

Why don't you share that information here, as it is not documented anywhere it will help everyone else?

Half of the info was already posted on this thread; the first (newly added) strap is at the beginning of the BIOS. This seems to be the same as the first 8 bytes of the normal strap, apart from a discrepancy in the first bit of the OR mask, IIRC. The old strap you can easily work out where it is - nvflash edits it correctly. There appears to be a 0x400 length header prepended to the BIOS (which includes the new strap), after which the BIOS is fairly similar to previous generations.

You might be finding the PCI ID from the VBIOS and the EFI sections, that's why there are two matches.

That seems quite plausible.

On newer Windows systems and Macs, EFI is used to init/boot the card and so the PCI ID in that section also has to be changed - needs to match hard straps. EFI code itself also checks for the Vendor ID, PCI ID and Board ID, so that has to be changed accordingly.

Hmm... I wonder if this could be stripped out to reduce the BIOS to the old, EFI-less state that is more open to modifying. It'd also make it a lot more similar to the previous BIOSes in terms of understanding what the various bits do.

As far as I know the checksums exist for the entire ROM and for the HWINFO section (soft-straps), although it is true that the GK ROM format has changed and they've introduced wrappers for the old VBIOS. Obviously one needs to make sure the HWINFO checksum is correct and then adjust the entire ROM checksum.

Prior to 6xx at least, the strap checksum is independent of the full checksum, IIRC (unless nvflash updates both when you use it with --straps, which I don't think it does).

You've also probably noticed a cryptographic signature at the end of the ROM which UEFI/Winblows uses to verify authenticity of hardware. In Linux that is irrelevant but thanks to M$, they've managed to create a deadlock and control all OEMs who produce motherboards with this whole signature issue (remember Linux booting problems until the whole UEFI issue was figured out).

I don't own any EFI motherboards, but I would have thought the whole EFI wrapping should be strippable out from the VBIOS. Once you skip past the 0x400 bytes of header, the rest of the BIOS is similar in terms of offsets of known areas to the Fermi BIOSes.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: amigo on June 29, 2013, 01:36:28 am
The usual way. Change the PCI ID, re-calc the checksum, flash the card, change the straps, in that order. It "just works". Doesn't really gain you much, though. Mosaic app might work, but since Q7000 isn't MultiOS, it won't run in VGA passthrough mode.
Ah, so you change the straps from the command line with nvflash.

I actually edit the straps while I'm editing the ROM. I find it easier because all I have to do afterwards is flash the ROM, although I do make sure that nvflash verifies my ROM first.

I got to that point without heating up my soldering iron.
For me, the Windows Quadro driver would not install if I did not change the hard straps. Matter of fact, in some cases I could not even get to the Windows GUI to install the driver.

Hmm... I wonder if this could be stripped out to reduce the BIOS to the old, EFI-less state that is more open to modifying. It'd also make it a lot more similar to the previous BIOSes in terms of understanding what the various bits do.
That should be doable, but you have to make sure that you include all ROM parts (minus the EFI). In Fermi cards, the ROM had two parts (one was the vbios and the other another device, I believe HDMI audio).

I don't own any EFI motherboards, but I would have thought the whole EFI wrapping should be strippable out from the VBIOS. Once you skip past the 0x400 bytes of header, the rest of the BIOS is similar in terms of offsets of known areas to the Fermi BIOSes.
IIRC, according to the PCI firmware spec the wrapping stuff ends up being ignored on older computers with a BIOS, until the 55 AA signature is detected. On the new UEFI, they will probably read the wrapper and then read the rest (55 AA onwards).
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on June 29, 2013, 02:33:31 am
Ah, so you change the straps from the command line with nvflash.

I actually edit the straps while I'm editing the ROM. I find it easier because all I have to do afterwards is flash the ROM, although I do make sure that nvflash verifies my ROM first.

I may be wrong but IIRC NiBiTor doesn't update the strap checksum, only the full checksum.

I got to that point without heating up my soldering iron.
For me, the Windows Quadro driver would not install if I did not change the hard straps. Matter of fact, in some cases I could not even get to the Windows GUI to install the driver.

Sounds strange, I've never seen an issue like that on any of the cards I modified, it always "just works".
Then again, modifying a GTX580 is a complete waste of time anyway. GTX480 is just as fast (in some cases faster due to dual DMA channels when modded to Quadro/Tesla), and is trivial to modify into a Q6000. I only modified my 580 because I already had it and I was seeing odd driver clashing when using a GeForce and a Quadro in the same system under Windows. One driver would end up driving both cards, usually the later one (the GeForce one has a higher version number).

Hmm... I wonder if this could be stripped out to reduce the BIOS to the old, EFI-less state that is more open to modifying. It'd also make it a lot more similar to the previous BIOSes in terms of understanding what the various bits do.
That should be doable, but you have to make sure that you include all ROM parts (minus the EFI). In Fermi cards, the ROM had two parts (one was the vbios and the other another device, I believe HDMI audio).

Indeed, but you get the same thing on Kepler, they have a HDMI audio device, too.

I don't own any EFI motherboards, but I would have thought the whole EFI wrapping should be strippable out from the VBIOS. Once you skip past the 0x400 bytes of header, the rest of the BIOS is similar in terms of offsets of known areas to the Fermi BIOSes.
IIRC, according to the PCI firmware spec the wrapping stuff ends up being ignored on older computers with a BIOS, until the 55 AA signature is detected. On the new UEFI, they will probably read the wrapper and then read the rest (55 AA onwards).

Handy, so you could effectively s/.*55AA// and strip out the EFI capability and defeat crypto. Nice. Presumably the trailing garbage would just get ignored then.
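Something like this (a quick, untested sketch; the file name is just an example) would at least show where each 55 AA image signature candidate sits in a ROM dump, so you can see how much wrapper sits in front of the legacy image:

#include <stdio.h>
#include <stdlib.h>
#include <stdint.h>

int main(int argc, char **argv)
{
    if (argc < 2) { fprintf(stderr, "usage: %s vbios.rom\n", argv[0]); return 1; }

    FILE *f = fopen(argv[1], "rb");
    if (!f) { perror("fopen"); return 1; }

    fseek(f, 0, SEEK_END);
    long size = ftell(f);
    rewind(f);

    uint8_t *rom = malloc(size);
    if (!rom || fread(rom, 1, size, f) != (size_t)size) { fclose(f); return 1; }
    fclose(f);

    /* Each PCI option ROM image (legacy or EFI) begins with the 55 AA signature;
     * stray matches mid-ROM are possible, so treat the output as candidates. */
    for (long i = 0; i + 1 < size; i++)
        if (rom[i] == 0x55 && rom[i + 1] == 0xAA)
            printf("possible image signature at offset 0x%06lX\n", (unsigned long)i);

    free(rom);
    return 0;
}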

The only question is whether there is an extra ID bit in Kepler soft straps. These are not yet fully documented. It could be one of the unknown bits (but it's not the unknowns next to ID bit 4, I tried those). The reason I say that is because until Kepler, all cards were modifiable using only soft-straps into any other card sporting the same GPU and memory type. And I only mention memory type because the GDDR3 GTS450 differed in more than just the last 5 bits of device ID, I had not seen such a case before.

What I'll probably do on my GTX680 is mod the resistor slot controlling the 3rd nibble by adding a DIP switch, so I can toggle that bit between 1 and 0. From there on, I can modify the GTX680 into any card sporting the GK104 chip using the switch plus the 5 soft-strapped bits.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: amigo on June 29, 2013, 04:21:48 am
I may be wrong but IIRC NiBiTor doesn't update the strap checksum, only the full checksum.
Who said anything about NiBiTor? :)

I edit everything by hand in a hex editor - I've done so many things now that I know where to look and what to do with it blindfolded.

Sounds strange, I've never seen an issue like that on any of the cards I modified, it always "just works".
Then again, modifying a GTX580 is a complete waste of time anyway. GTX480 is just as fast (in some cases faster due to dual DMA channels when modded to Quadro/Tesla), and is trivial to modify into a Q6000. I only modified my 580 because I already had it and I was seeing odd driver clashing when using a GeForce and a Quadro in the same system under Windows. One driver would end up driving both cards, usually the later one (the GeForce one has a higher version number).
I deliberately chose the GTX 570 because the GTX 4xx run much hotter than the 5xx, and any performance improvement they have over the 5xx is, I think, negligible. Although even the GTX 5xx run hot - heck, they are all crap due to the terrible NVidia design which makes the GPU run an additional 10-15C hotter if you plug two monitors into the card.

Handy, so you could effectively s/.*55AA// and strip out the EFI capability and defeat crypto. Nice. Presumably the trailing garbage would just get ignored then.

The only question is whether there is an extra ID bit in Kepler soft straps. These are not yet fully documented. It could be one of the unknown bits (but it's not the unknowns next to ID bit 4, I tried those). The reason I say that is because until Kepler, all cards were modifiable using only soft-straps into any other card sporting the same GPU and memory type. And I only mention memory type because the GDDR3 GTS450 differed in more than just the last 5 bits of device ID, I had not seen such a case before.
You can't defeat the signature but I think Linux will ignore it altogether so it does not matter if it's there. The only reason it would matter is to Windows.

Also, because of all the new crap added to the ROM, the EEPROM is bigger too - instead of 256KB it's 512KB now, which requires soldering when modding some cards.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on June 29, 2013, 04:47:36 am
I may be wrong but IIRC NiBiTor doesn't update the strap checksum, only the full checksum.
Who said anything about NiBiTor? :)

I edit everything by hand in a hex editor - I've done so many things now that I know where to look and what to do with it blindfolded.

Can you please share the strap checksum algorithm? How do you compute it after modifying the strap manually?

Sounds strange, I've never seen an issue like that on any of the cards I modified, it always "just works".
Then again, modifying a GTX580 is a complete waste of time anyway. GTX480 is just as fast (in some cases faster due to dual DMA channels when modded to Quadro/Tesla), and is trivial to modify into a Q6000. I only modified my 580 because I already had it and I was seeing odd driver clashing when using a GeForce and a Quadro in the same system under Windows. One driver would end up driving both cards, usually the later one (the GeForce one has a higher version number).
I deliberately chose the GTX 570 because the GTX 4xx run much hotter than the 5xx, and any performance improvement they have over the 5xx is, I think, negligible. Although even the GTX 5xx run hot - heck, they are all crap due to the terrible NVidia design which makes the GPU run an additional 10-15C hotter if you plug two monitors into the card.

Most of the reason why 580 runs cooler than the 480 is the power management that throttles certain operations. That plus they cut out the 2nd DMA engine. GF100 and GF110 are nearly identical silicon-wise.

Handy, so you could effectively s/.*55AA// and strip out the EFI capability and defeat crypto. Nice. Presumably the trailing garbage would just get ignored then.

The only question is whether there is an extra ID bit in Kepler soft straps. These are not yet fully documented. It could be one of the unknown bits (but it's not the unknowns next to ID bit 4, I tried those). The reason I say that is because until Kepler, all cards were modifiable using only soft-straps into any other card sporting the same GPU and memory type. And I only mention memory type because the GDDR3 GTS450 differed in more than just the last 5 bits of device ID, I had not seen such a case before.
You can't defeat the signature but I think Linux will ignore it altogether so it does not matter if it's there. The only reason it would matter is to Windows.

Also, because of all the new crap added to the ROM the EEPROM is bigger too- instead of 256KB it's 512KB now thus requires soldering when modding some cards.

I presume you speak of the 7xx series - My GTX680 ROM is 180KB.
As for Windows - the only reason for using that is disappearing with Steam acquiring Linux support.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: amigo on June 29, 2013, 11:14:15 am
Can you please share the strap checksum algorithm? How do you compute it after modifying the strap manually?
The algorithm is the same as for the ROM itself: 0x100 - (S & 0xFF), where S is the sum of the bytes from offset 0x58 to 0x6a (0x6a is always 0xA5, it's the HWINFO signature) in a standard vbios image. Offset 0x6b is then the checksum itself, which is not part of S.
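In other words, something along these lines (an untested sketch; the file name is just an example) recomputes and patches the soft-strap checksum - and the full-ROM checksum is the same sum-to-zero idea applied over the whole image:

#include <stdio.h>
#include <stdint.h>

int main(int argc, char **argv)
{
    if (argc < 2) { fprintf(stderr, "usage: %s vbios.rom\n", argv[0]); return 1; }

    FILE *f = fopen(argv[1], "r+b");
    if (!f) { perror("fopen"); return 1; }

    uint8_t rom[0x80];
    if (fread(rom, 1, sizeof rom, f) != sizeof rom) { fclose(f); return 1; }

    /* Sum the bytes from 0x58 to 0x6a inclusive (0x6a should read back as the 0xA5
     * HWINFO signature in a standard vbios image). */
    unsigned s = 0;
    for (int i = 0x58; i <= 0x6a; i++)
        s += rom[i];

    uint8_t csum = (uint8_t)(0x100 - (s & 0xFF));
    printf("signature byte 0x6a: 0x%02X, old checksum: 0x%02X, new checksum: 0x%02X\n",
           rom[0x6a], rom[0x6b], csum);

    /* Patch the checksum byte at offset 0x6b in place. */
    fseek(f, 0x6b, SEEK_SET);
    fwrite(&csum, 1, 1, f);
    fclose(f);
    return 0;
}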

I presume you speak of the 7xx series - My GTX680 ROM is 180KB.
Actually the GTX 680 ROM is over 200KB due to the EFI portion and all the other crap like the secure signature and new wrappers, etc. That is on the GTX 680 2GB and 4GB cards I've seen.

As for Windows - the only reason for using that is disappearing with Steam acquiring Linux support.
Hear, hear!
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gamezr2ez on June 29, 2013, 01:20:56 pm
Actually the GTX 680 ROM is over 200KB due to the EFI portion and all the other crap like the secure signature and new wrappers, etc. That is on the GTX 680 2GB and 4GB cards I've seen.

My Asus is 180k.

A GV-N680OC-2GD is 64k.

Of note, the Gigabyte one does not have UEFI GOP support; no space, I am guessing.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on June 29, 2013, 10:11:36 pm
@amigo
Thanks for the checksum algorithm.

@gamezr2ez
Thanks for the info regarding the GB card.

I might just grab a copy of that BIOS from the TPU database and see if I can work out which bits can thus be stripped out from the Asus BIOS to remove all the UEFI crap. I suspect that with the UEFI headers and the other stuff removed, the BIOS will in fact be parseable by all the old tools that work on the previous generations' BIOSes. The only issue is trying to acquire a cheap sacrificial 680 for the research (I don't want to brick my shiny 4GB Gainward Phantom).
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gamezr2ez on June 30, 2013, 12:24:44 pm
As far as I am aware there is no UEFI stuff in the ASUS BIOS by default. I believe they have released an unofficial-official hybrid BIOS, but by default that is all left out.

Side thought: why would you care about stripping anything out of the BIOS? It won't result in added performance.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: amigo on June 30, 2013, 12:39:09 pm
As far as I am aware there is no UEFI stuff in the ASUS BIOS by default. I believe they have released an unofficial-official hybrid BIOS, but by default that is all left out.

Side thought: why would you care about stripping anything out of the BIOS? It won't result in added performance.

If I'm correct, current vbios editing tools do not support the new structures and so trying to edit those ROMs is not possible at the moment. Stripping the new wrappers exposes the old format which these editors can recognize and use.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on July 01, 2013, 09:05:15 am
Has anybody found where the second set of device ID strap resistors is on the GTX690 yet? Or on the Titan?
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gamezr2ez on July 01, 2013, 10:02:43 am
Has anybody found where the second set of device ID strap resistors is on the GTX690 yet?

Pretty sure gnif lost his card finding those. Check the first post again.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on July 01, 2013, 04:27:06 pm
I did, but that hasn't been updated in months and I joined the thread later on. If there's still a funding gap and the research into the 2nd set of straps is ongoing, I'm happy to contribute but I haven't seen a post from gnif in the last few pages of the thread.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gamezr2ez on July 01, 2013, 04:50:29 pm
As per everything in the first post:

GPU 0 (https://www.eevblog.com/forum/projects/hacking-nvidia-cards-into-their-professional-counterparts/msg202901/#msg202901)
GPU 1 (https://www.eevblog.com/forum/projects/hacking-nvidia-cards-into-their-professional-counterparts/msg209227/#msg209227)
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: bdx on July 03, 2013, 04:00:15 pm
Does anyone have experience modding to the Tesla GRID K2, the NVS GRID K2, or the GeForce K2 rather than the generic K2?
How about the K340 or K520?
 :-//
What are the differences between these cards?

http://browse.feedreader.com/c/LaptopVideo2Go_News/358076787 (http://browse.feedreader.com/c/LaptopVideo2Go_News/358076787)

Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on July 03, 2013, 04:51:23 pm
Where did you find references to these cards? I can't find them in the supported list for the latest driver:
http://us.download.nvidia.com/XFree86/Linux-x86/319.23/README/supportedchips.html (http://us.download.nvidia.com/XFree86/Linux-x86/319.23/README/supportedchips.html)
Which means they are unlikely to exist (yet, at least).
Title: Passiv 3D with dual projection and Nvidia 3D vision
Post by: Soulnight on July 14, 2013, 01:25:48 am
Hello everybody,

I'm new on this forum and I would like to thank everybody (and especially gnif) for the mod of geforce GTX 680 to Quadro K5000.

I'm a gamer. I don't want more performance, just one (Quadro) functionality more.

I play games in 3D with the help of nvidia 3D vision on my monitor (input signal 1080p 3D 120Hz).
I would like to do the same but with passive 3D dual projection. I therefore need the Quadro K5000 option that enables passive stereo with 2 projectors connected to the graphics card. Here is the link to the procedure for passive stereo with a Quadro.

http://nvidia.custhelp.com/app/answers/detail/a_id/3012/~/how-to-configure-passive-or-dual-pipe-stereo-with-quadro-cards-in-windows-7. (http://nvidia.custhelp.com/app/answers/detail/a_id/3012/~/how-to-configure-passive-or-dual-pipe-stereo-with-quadro-cards-in-windows-7.)

Could someone please confirm that the option is available with a modded GTX 680 to quadro K5000?

Thank you,
Soulnight  ;)
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: myweb on July 18, 2013, 04:09:04 am
Dear All,

What inexpensive card would you recommend for trying the modifications? Is it correct that the most inexpensive card is the GT640?
Title: Re: Passiv 3D with dual projection and Nvidia 3D vision
Post by: Jager on July 18, 2013, 07:41:24 am
Hello everybody,

I'm new on this forum and I would like to thank everybody (and especially gnif) for the mod of geforce GTX 680 to Quadro K5000.

I'm a gamer. I don't want more performance but just one (quadro) functionnality more.

I play games in 3D with the help of nvidia 3D vision on my monitor (input signal 1080P 3D 120Hz).
I would like to do the same but with passive 3D dual projection. I need therefore the option of a quadro K5000 to activate the possibility of passive stereo with 2 projectors connected to the graphic card. Here is the link to the procedure to do passive stereo with a quadro.

http://nvidia.custhelp.com/app/answers/detail/a_id/3012/~/how-to-configure-passive-or-dual-pipe-stereo-with-quadro-cards-in-windows-7. (http://nvidia.custhelp.com/app/answers/detail/a_id/3012/~/how-to-configure-passive-or-dual-pipe-stereo-with-quadro-cards-in-windows-7.)

Could someone please confirm that the option is available with a modded GTX 680 to quadro K5000?

Thank you,
Soulnight  ;)

I'm in the same boat. I'm just trying to figure out the best way to get a dual projection setup with omega filters to work. There are a couple of methods to get frame-synced dual projection working. One is using TriDef SBS with AMD Eyefinity at 3840x1080, and the second method is using a Quadro. I want to use 3D Vision and I'm also ready to try to mod my 670 to a K5000 to get the Quadro features. I think 3D Vision dual projection will work with a modded card. I'm a bit unsure, however, whether a Quadro gives frame-synced dual output from one card...
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on July 19, 2013, 04:15:33 am
@myweb - what is the reason for your looking to modify a card? The level of performance and features you require will dictate which series you are best off with (e.g. 4xx series for Quadro x000 or 6xx series for Quadro Kx000 / Grid).
Title: Re: Passiv 3D with dual projection and Nvidia 3D vision
Post by: Soulnight on July 20, 2013, 08:03:06 pm
Hello everybody,

I'm new on this forum and I would like to thank everybody (and especially gnif) for the mod of geforce GTX 680 to Quadro K5000.

I'm a gamer. I don't want more performance but just one (quadro) functionnality more.

I play games in 3D with the help of nvidia 3D vision on my monitor (input signal 1080P 3D 120Hz).
I would like to do the same but with passive 3D dual projection. I need therefore the option of a quadro K5000 to activate the possibility of passive stereo with 2 projectors connected to the graphic card. Here is the link to the procedure to do passive stereo with a quadro.

http://nvidia.custhelp.com/app/answers/detail/a_id/3012/~/how-to-configure-passive-or-dual-pipe-stereo-with-quadro-cards-in-windows-7. (http://nvidia.custhelp.com/app/answers/detail/a_id/3012/~/how-to-configure-passive-or-dual-pipe-stereo-with-quadro-cards-in-windows-7.)

Could someone please confirm that the option is available with a modded GTX 680 to quadro K5000?

Thank you,
Soulnight  ;)

I'm on same boat. I'm just trying to figure best way to get Dual projection setup with omega filters to work. There is couple methods to get frame synced dual projection to work. One is using Tridef SBS with AMD eyefinity 3840x1080 and second method is using Quadro. I want to use 3D Vision and i'm also ready to try to mod my 670 to K5000 to get quadro features. I think 3D Vision dual projection will work with modded card. I'm bit unsure however if quadro gives framesynced dual output from one card...

I am happy that I am not alone!  :clap:

I also want to use the omega filters for the dual projector setup. And I think I will buy two ACER H9500 projectors (2D ONLY, and it costs just 850€, with lens shift and 1300 lumens calibrated!). I'm just worried about the 1:1 HDMI mapping problem, but that's another story...

I know the TriDef solution, but it's not ideal and doesn't work with all games... and you MUST use TriDef.
I would like to have a choice and to be able to play every game: hence the Quadro solution.
The real Quadro K5000 is not good enough for 3D games (and is expensive), and I really think that a modded GTX 680 (or GTX 670) is the perfect solution to get the functionality of the Quadro AND the 3D gaming performance of the GTX 680!

I am pretty sure that a quadro card gives framesynced dual output from one card...
See the link: http://nvidia.custhelp.com/app/answers/detail/a_id/3012/~/how-to-configure-passive-or-dual-pipe-stereo-with-quadro-cards-in-windows-7. (http://nvidia.custhelp.com/app/answers/detail/a_id/3012/~/how-to-configure-passive-or-dual-pipe-stereo-with-quadro-cards-in-windows-7.)

The real question is: does it work with a modded gtx 680 as well?

Plus, I would like to go SLI with 2 GTX 680s modded into Quadro K5000s. But I've read that Quadros only support SLI in chosen "complete workstations" from Dell etc... However, it may still be possible to do it since they won't be true K5000s.

Has someone succeeded in activating SLI with 2 GTX 680s modded into Quadro K5000s?

@ Jager: how far are you from testing the dual projector setup with the modded gtx 670? When?  ::)

Thank you!
Soulnight  ;)



Title: Re: Passiv 3D with dual projection and Nvidia 3D vision
Post by: Jager on July 20, 2013, 10:40:52 pm

I am happy that I am not alone!  :clap:

I also want to use the omega filters for the dual projector setup. And I think I will buy two projectors ACER h9500 (ONLY 2D and costs just 850€ with lens shift and 1300 Lumens calibrated!). I'm just worried about the 1:1 Hdmi Mapping problem but that's another story...

I know the tridef solution but it's not ideal and doesn't work with all the games...and you MUST use tridef.
I would like to have a choice and to be able to play every games: therefore the quadro solution.
The real quadro K5000 is not good enough for 3d games (and is expensive) and I really think that a modded gtx 680 (or gtx 670) is the perfect solution to get the functionnality of the quadro AND the 3D games performances of the GTX 680!

I am pretty sure that a quadro card gives framesynced dual output from one card...
See the link: http://nvidia.custhelp.com/app/answers/detail/a_id/3012/~/how-to-configure-passive-or-dual-pipe-stereo-with-quadro-cards-in-windows-7. (http://nvidia.custhelp.com/app/answers/detail/a_id/3012/~/how-to-configure-passive-or-dual-pipe-stereo-with-quadro-cards-in-windows-7.)

The real question is: does it work with a modded gtx 680 as well?

Plus, I would like to go sli with 2 modded GTX 680 into quadro K5000. But I've read that the quadro just support sli for choosen "complete Workstation" from dell etc... However it may still be possible to do it since they won't be true K5000.

Has someone succeeded in activating SLI with 2 GTX 680s modded into Quadro K5000s?

@ Jager: how far are you from testing the dual projector setup with the modded gtx 670? When?  ::)

Thank you!
Soulnight  ;)

Ordered resistors from eBay; those are quite tiny (1mm x 0.5mm). I have done some research and I think up to 2-GPU SLI is no problem for getting synced frames; above that the K5000 needs a Sync card, and it should work with modded ones too. After all, surround setups for desktops are synced. When I get those resistors (39K is the closest to 40K I found, hope it works) I'll do this mod immediately.
Title: Re: Passiv 3D with dual projection and Nvidia 3D vision
Post by: Soulnight on July 21, 2013, 04:04:55 am
Ordered resistors from ebay, those are quite tiny(1mm*0,5mm). I have done some researching and i think up to 2 GPU SLI is no problem to get synced frames, above that K5000 needs Sync card and it should work with modded ones too. After all surround setups for desktops are synced. When i get those resistors(39K is closest to 40K i found, hope it works) i do this mod immediately.

I hope it will work with a 39K resistor... How do you know the size of the resistors you must take? What are the references?

The problem for SLI with Quadro cards is that Nvidia doesn't enable the SLI function for Quadro cards UNLESS they are used in specific workstations that they have certified!

Here the link to nvidia sh**t:
http://www.nvidia.com/object/quadro_sli_compatible_systems.html (http://www.nvidia.com/object/quadro_sli_compatible_systems.html)

But I do hope that because they are "just" modded GTX 680 the false K5000 can still do sli WITHOUT working with a specific nvidia certified workstation...
Title: Re: Passiv 3D with dual projection and Nvidia 3D vision
Post by: Jager on July 21, 2013, 04:28:10 pm
Ordered resistors from ebay, those are quite tiny(1mm*0,5mm). I have done some researching and i think up to 2 GPU SLI is no problem to get synced frames, above that K5000 needs Sync card and it should work with modded ones too. After all surround setups for desktops are synced. When i get those resistors(39K is closest to 40K i found, hope it works) i do this mod immediately.

I hope it will work with a 39K resistor... How do you know the size of the resistors you must take? What are the references?

The problem for SLI with quadro cards is that Nvidia doesn't enable the SLI function for quadro cards UNLESS they are used in specific Workstation that they have certified!

Here the link to nvidia sh**t:
http://www.nvidia.com/object/quadro_sli_compatible_systems.html (http://www.nvidia.com/object/quadro_sli_compatible_systems.html)

But I do hope that because they are "just" modded GTX 680 the false K5000 can still do sli WITHOUT working with a specific nvidia certified workstation...

Damn, missed that one! This is bad news indeed :(. I hope it is fixable with some softmodding, or just works with a modded card. The 690 should be affected as well. As for the size of the resistors, if you mean physical size, I just measured those; the other info comes from this thread.

edit: There is HyperSLI for non-SLI motherboards, maybe this works if needed...

edit2: Seems that HyperSLI will NOT work with Quadros.
Title: Re: Passiv 3D with dual projection and Nvidia 3D vision
Post by: Soulnight on July 21, 2013, 05:15:30 pm
Damn missed that one! This is bad news indeed :(. I hope it is fixable with some softmodding/just work with modded one. 690 should be affected as well. As for size of resistor, if mean physical size, i just measured those and other info comes from this thread.

Yeah I mean the physical size of the resistors to use. Could someone post a link to "right" resistors on the internet? Thank you...
Title: Re: Passiv 3D with dual projection and Nvidia 3D vision
Post by: Jager on July 21, 2013, 05:33:56 pm
Damn missed that one! This is bad news indeed :(. I hope it is fixable with some softmodding/just work with modded one. 690 should be affected as well. As for size of resistor, if mean physical size, i just measured those and other info comes from this thread.

Yeah I mean the physical size of the resistors to use. Could someone post a link to "right" resistors on the internet? Thank you...

I ordered this set. Few extra ones i think :) >> http://www.ebay.com/itm/0402-SMD-SMT-Chip-Resistor-Assortment-Kit-170-value-x-20-pcs-component-pack-/200940135046?pt=LH_DefaultDomain_0&hash=item2ec8f72286 (http://www.ebay.com/itm/0402-SMD-SMT-Chip-Resistor-Assortment-Kit-170-value-x-20-pcs-component-pack-/200940135046?pt=LH_DefaultDomain_0&hash=item2ec8f72286)

edit:Wrong link
Title: Re: Passiv 3D with dual projection and Nvidia 3D vision
Post by: gordan on July 21, 2013, 08:42:30 pm
The real quadro K5000 is not good enough for 3d games (and is expensive) and I really think that a modded gtx 680 (or gtx 670) is the perfect solution to get the functionnality of the quadro AND the 3D games performances of the GTX 680!

How exactly do you figure a K5000 isn't good enough when a GTX680 is? The spec between them is near identical.

I am pretty sure that a quadro card gives framesynced dual output from one card...

I'm pretty sure all Nvidia cards do, going back at least to 8xxx series (I am running an IBM T221 DG1 off an 8800GT with two DVI outputs, and that monitor supposedly requires genlocked inputs).
Vsync is a subtly different issue, and again, I'm pretty sure all Nvidia cards do run vsynced across separate outputs if they are told to run multiple screens off the same frame buffer. Again, on a T221 DG1 ATI cards (4850; cannot use 5xxx+ since they only come with single DL-DVI outputs) produce tearing along the middle of the screen (genlocked but not vsynced), but Nvidia cards (tried with 8800GT, various 4xx, 580 and 680 cards, quadrified and vanilla, and a Quadro 2000) do not produce the tearing artifact - which implies they all run multiple screens vsynced. So provided you configure your setup correctly, it should work just fine.

Plus, I would like to go sli with 2 modded GTX 680 into quadro K5000. But I've read that the quadro just support sli for choosen "complete Workstation" from dell etc... However it may still be possible to do it since they won't be true K5000.

I didn't think Quadro cards came with SLI bridge ports. My Quadro 2000 certainly doesn't.

One thing is for sure, though, two separate cards won't be providing genlocked output, which means they won't be vsynced either.
Title: Re: Passiv 3D with dual projection and Nvidia 3D vision
Post by: Soulnight on July 21, 2013, 09:21:12 pm
The real quadro K5000 is not good enough for 3d games (and is expensive) and I really think that a modded gtx 680 (or gtx 670) is the perfect solution to get the functionnality of the quadro AND the 3D games performances of the GTX 680!

How exactly do you figure a K5000 isn't good enough when a GTX680 is? The spec between them is near identical.

How, pretty simple, with comparisons:

http://www.xbitlabs.com/articles/graphics/display/nvidia-quadro-k5000_8.html#sect0 (http://www.xbitlabs.com/articles/graphics/display/nvidia-quadro-k5000_8.html#sect0)

http://www.videocardbenchmark.net/gpu.php?gpu=Quadro+K5000 (http://www.videocardbenchmark.net/gpu.php?gpu=Quadro+K5000)

The GTX 680 is about 1.5x faster than the K5000 for games... ;)

And here are benchmarks of a GTX 680 modded into a K5000... The performance doesn't change with the mod (which is good for me and bad for others...).
https://www.eevblog.com/forum/projects/hacking-nvidia-cards-into-their-professional-counterparts/msg210155/#msg210155 (https://www.eevblog.com/forum/projects/hacking-nvidia-cards-into-their-professional-counterparts/msg210155/#msg210155)
Title: Re: Passiv 3D with dual projection and Nvidia 3D vision
Post by: Jager on July 21, 2013, 11:40:22 pm
The real quadro K5000 is not good enough for 3d games (and is expensive) and I really think that a modded gtx 680 (or gtx 670) is the perfect solution to get the functionnality of the quadro AND the 3D games performances of the GTX 680!

How exactly do you figure a K5000 isn't good enough when a GTX680 is? The spec between them is near identical.

I am pretty sure that a quadro card gives framesynced dual output from one card...

I'm pretty sure all Nvidia cards do, going back at least to 8xxx series (I am running an IBM T221 DG1 off an 8800GT with two DVI outputs, and that monitor supposedly requires genlocked inputs).
Vsync is a subtly different issue, and again, I'm pretty sure all Nvidia cards do run vsynced across separate outputs if they are told to run multiple screens off the same frame buffer. Again, on a T221 DG1 ATI cards (4850; cannot use 5xxx+ since they only come with single DL-DVI outputs) produce tearing along the middle of the screen (genlocked but not vsynced), but Nvidia cards (tried with 8800GT, various 4xx, 580 and 680 cards, quadrified and vanilla, and a Quadro 2000) do not produce the tearing artifact - which implies they all run multiple screens vsynced. So provided you configure your setup correctly, it should work just fine.

Plus, I would like to go sli with 2 modded GTX 680 into quadro K5000. But I've read that the quadro just support sli for choosen "complete Workstation" from dell etc... However it may still be possible to do it since they won't be true K5000.

I didn't think Quadro cards came with SLI bridge ports. My Quadro 2000 certainly doesn't.

One thing is for sure, though, two separate cards won't be providing genlocked output, which means they won't be vsynced either.

Have you done some gaming with that monitor? It's certain that dual-projection stereo 3D is out of sync with the GeForce line when using TriDef (up to one frame out of sync), and GeForce cards do not even support dual-output S3D via 3D Vision, but Quadros do and it's synced. There is no 2-screen Surround support from nvidia that provides a synced signal so that TriDef could be used in SBS mode. Maybe the GeForce drivers enable sync when an IBM T221 DG1 is connected, like those drivers are going to do with the Sharp-based 4K monitors (I think the Asus one is already supported).

The K5000 is capable of supporting synced dual projection with up to 2-way SLI without the need for a Quadro Sync card; it can output synced Mosaic across up to 8 displays. >> http://www.nvidia.com/object/quadro-scalable-visualization-solutions.html (http://www.nvidia.com/object/quadro-scalable-visualization-solutions.html)

AMD eyefinity puts out synced signal for expanded screens for fullscreen apps(3840x1080 for example).

Edit:correct link
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on July 22, 2013, 03:19:28 am
How, pretty simple, with comparisons:

I wouldn't be so quick to disregard the K5000's gaming performance. The only thing the 680 has on its side is higher clock speeds, but they are not _that_ much higher. And the GPU core is the same.

Have you done some gaming with that monitor?

Yes, running Crysis in a VM on a Quadrified GTX480 (Quadro 6000) on my 2nd T221 and there is no tearing whatsoever. Nor was there any tearing on my other T221 with the 8800GT card, but there is massive tearing visible with the Radeon 4850.

It's certain that dual-projection stereo 3D is out of sync with the GeForce line when using TriDef (up to one frame out of sync), and GeForce cards do not even support dual-output S3D via 3D Vision, but Quadros do and it's synced. There is no 2-screen Surround support from nvidia that provides a synced signal so that TriDef could be used in SBS mode. Maybe the GeForce drivers enable sync when an IBM T221 DG1 is connected, like those drivers are going to do with the Sharp-based 4K monitors (I think the Asus one is already supported).

I suspect this is a different use case. The difference is that in my setup I am using a single large frame buffer to produce one large desktop across two separate "screens" (which just happen to form a single TFT panel).
In the case of 3D stuff, the frame buffer is probably not the same. Can you configure it as one large "desktop" with one set of 3D images being displayed on columns 1-1920 and the other set on columns 1921-3840?

K5000 is capable to support synced dual projection with up to 2-way SLI without need of Quadro Sync card, it can output synced Mosaic up to 8 displays. >> http://www.nvidia.com/object/quadro-scalable-visualization-solutions.html (http://www.nvidia.com/object/quadro-scalable-visualization-solutions.html)

The only way I can see you getting synced and genlocked frames is by all outputs to screens coming off a single card, i.e. traditional SLI. That allows you to do processing on two cards but the monitor signal all comes out of a single frame buffer and a single vsync. Any more than that requires an external genlock device (QuadroSync mentioned on that page is a separate hardware device).

AMD eyefinity puts out synced signal for expanded screens for fullscreen apps(3840x1080 for example).

I wouldn't want to even remotely take a chance on an ATI card for the next decade - too much ongoing, persistent disappointment in pretty much every aspect of both drivers and hardware over the past 3 years. It's going to take an enormous improvement before I would deem their products fit for any non-trivial purpose (i.e. single card with a single screen).
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: Soulnight on July 22, 2013, 03:30:37 am
How, pretty simple, with comparisons:

I wouldn't be so quick to disregard the K5000's gaming performance. The only thing the 680 has on its side is higher clock speeds, but they are not _that_ much higher. And the GPU core is the same.


Higher clock speed and better power supply...and 350€ for the gtx 680 against 1500€ for the K5000.
For gaming purposes, when the modded GTX 680 does have the Quadro functionality, I do not see any advantage to the real K5000... ^-^
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on July 22, 2013, 04:06:12 am
How, pretty simple, with comparisons:

I wouldn't be so quick to disregard the K5000's gaming performance. The only thing the 680 has on its side is higher clock speeds, but they are not _that_ much higher. And the GPU core is the same.


Higher clock speed and better power supply...and 350€ for the gtx 680 against 1500€ for the K5000.
For gaming purposes, when the modded GTX 680 does have the quadro functionnality, I do not see any advantages for the real K5000... ^-^

Depends on what you want to use it for. Evidence thus far shows that at least some of the GL functionality seems to be disabled on hardware level (certain GL primitives are missing, see a dump of glxinfo for Quadrified GTS450 vs. a real Quadro 2000 a few pages back).
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: Jager on July 22, 2013, 04:26:33 am
I suspect this is a different use case. The difference is that in my setup I am using a single large frame buffer to produce one large desktop across two separate "screens" (which just happen to form a single TFT panel).
In the case of 3D stuff, the frame buffer is probably not the same. Can you configure it as one large "desktop" with one set of 3D images being displayed on columns 1-1920 and the other set on columns 1921-3840?

There is no way to do a single wide "desktop" across two monitors and use it for gaming. This is where nVidia Surround is needed, and it does not support 2 monitors. I can of course do a wide desktop or clone one to the other. Sharp-based 4K monitors use the same large-frame trick (1920x2160+1920x2160) as your monitor to achieve 4K 60Hz, so maybe there is driver-enabled "mosaic/surround" for your setup too. The Sharp 4K is not supported yet and therefore does not work, but the ASUS (rebranded Sharp) works with the latest drivers, so the driver looks up whether an ASUS is connected and enables "mosaic/surround" support for it.

The only way I can see you getting synced and genlocked frames is by all outputs to screens coming off a single card, i.e. traditional SLI. That allows you to do processing on two cards but the monitor signal all comes out of a single frame buffer and a single vsync. Any more than that requires an external genlock device (QuadroSync mentioned on that page is a separate hardware device).

The link says otherwise: with the K5000 you can do synced Mosaic up to 8 screens with SLI OR with Quadro Sync. You can use (limited) outputs from 2 cards when doing Surround with GeForces (600 series and up) too, and Surround is always synced.
Title: Re: Passive 3D with dual projection and Nvidia 3D vision
Post by: Soulnight on July 22, 2013, 04:31:01 am
Depends on what you want to use it for. Evidence thus far shows that at least some of the GL functionality seems to be disabled at the hardware level (certain GL primitives are missing; see a dump of glxinfo for a Quadrified GTS450 vs. a real Quadro 2000 a few pages back).

As I already said, I need this quadro function:

Hello everybody,

I'm new on this forum and I would like to thank everybody (and especially gnif) for the mod of geforce GTX 680 to Quadro K5000.

I'm a gamer. I don't want more performance, just one more (Quadro) feature.

I play games in 3D with the help of nvidia 3D vision on my monitor (input signal 1080P 3D 120Hz).
I would like to do the same but with passive 3D dual projection. I therefore need the Quadro K5000 option that enables passive stereo with 2 projectors connected to the graphics card. Here is the link to the procedure for setting up passive stereo with a Quadro.

http://nvidia.custhelp.com/app/answers/detail/a_id/3012/~/how-to-configure-passive-or-dual-pipe-stereo-with-quadro-cards-in-windows-7. (http://nvidia.custhelp.com/app/answers/detail/a_id/3012/~/how-to-configure-passive-or-dual-pipe-stereo-with-quadro-cards-in-windows-7.)

Could someone please confirm that this option is available on a GTX 680 modded into a Quadro K5000?

Thank you,
Soulnight  ;)
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: Jager on July 22, 2013, 04:39:55 am
Looks like the 670 PCB and the K5000 are exactly the same. The components are the same, etc. However, the 3D Vision Pro connector is missing. Wonder if soldering it on would give that option too... The other difference is that the K5000 has its power cords soldered down while the 670 has connectors soldered on the board.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: Soulnight on July 22, 2013, 04:42:41 am
Looks like the 670 PCB and the K5000 are exactly the same. The components are the same, etc. However, the 3D Vision Pro connector is missing. Wonder if soldering it on would give that option too... The other difference is that the K5000 has its power cords soldered down while the 670 has connectors soldered on the board.

Yes, but what for? 3D Vision Pro is for use with professional applications, and those do need the performance of a true K5000. ;)
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: Jager on July 22, 2013, 04:53:59 am
Looks like the 670 PCB and the K5000 are exactly the same. The components are the same, etc. However, the 3D Vision Pro connector is missing. Wonder if soldering it on would give that option too... The other difference is that the K5000 has its power cords soldered down while the 670 has connectors soldered on the board.

Yes, but what for? 3D Vision Pro is for use with professional applications, and those do need the performance of a true K5000. ;)

That's true, I don't need it, but someone might like to use RF glasses. It should work with any 3D Vision compatible device as well and can be used for gaming like normal 3D Vision. IR was not good for my setup (flashes etc., and it needed a looong USB cable), but I'm not using it anymore because the W1070 is DLP-Link and of course the goal is passive 3D...
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: Soulnight on July 22, 2013, 05:02:56 am
Looks like the 670 PCB and the K5000 are exactly the same. The components are the same, etc. However, the 3D Vision Pro connector is missing. Wonder if soldering it on would give that option too... The other difference is that the K5000 has its power cords soldered down while the 670 has connectors soldered on the board.

Yes, but what for? 3D Vision Pro is for use with professional applications, and those do need the performance of a true K5000. ;)

That's true, I don't need it, but someone might like to use RF glasses. It should work with any 3D Vision compatible device as well and can be used for gaming like normal 3D Vision. IR was not good for my setup (flashes etc., and it needed a looong USB cable), but I'm not using it anymore because the W1070 is DLP-Link and of course the goal is passive 3D...

I'm sorry to tell you this, but the BenQ W1070 (great projector) is not so good for the Omega filters. It's a DLP: great! But the throw ratio is not big enough (max 1.5:1 at full zoom) and you will have colour uniformity problems with it. Motormann advises a throw ratio of minimum 1.5:1 and ideally 2:1.

Source: http://www.avsforum.com/t/1407101/official-omega-3d-passive-projection-system-thread/150#post_22927789 (http://www.avsforum.com/t/1407101/official-omega-3d-passive-projection-system-thread/150#post_22927789)

I think that if you cannot max out the zoom with your W1070, you should buy 2 cheap H9500s (2D), which have vertical and horizontal lens shift and are altogether better than the W1070.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: Jager on July 22, 2013, 05:14:31 am
Looks like the 670 PCB and the K5000 are exactly the same. The components are the same, etc. However, the 3D Vision Pro connector is missing. Wonder if soldering it on would give that option too... The other difference is that the K5000 has its power cords soldered down while the 670 has connectors soldered on the board.

Yes, but what for? 3D Vision Pro is for use with professional applications, and those do need the performance of a true K5000. ;)

That's true, I don't need it, but someone might like to use RF glasses. It should work with any 3D Vision compatible device as well and can be used for gaming like normal 3D Vision. IR was not good for my setup (flashes etc., and it needed a looong USB cable), but I'm not using it anymore because the W1070 is DLP-Link and of course the goal is passive 3D...

I'm sorry to tell you this, but the BenQ W1070 (great projector) is not so good for the Omega filters. It's a DLP: great! But the throw ratio is not big enough (max 1.5:1 at full zoom) and you will have colour uniformity problems with it. Motormann advises a throw ratio of minimum 1.5:1 and ideally 2:1.

Source: http://www.avsforum.com/t/1407101/official-omega-3d-passive-projection-system-thread/150#post_22927789 (http://www.avsforum.com/t/1407101/official-omega-3d-passive-projection-system-thread/150#post_22927789)

I think that if you cannot max out the zoom with your W1070, you should buy 2 cheap H9500s (2D), which have vertical and horizontal lens shift and are altogether better than the W1070.

No problem maxing out the zoom, it's such a short throw anyway. I will mod those filters inside too in the future, so it's no issue. Lens shift is a bit of an issue but it's just enough. Room temperature is another :) but this is all for another thread...
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: Soulnight on July 22, 2013, 06:48:04 am
@ Jager: we should talk about filters and projectors on the appropriate forum on avs. See you there!
And I'm really eager to read your modding results!  :D
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on July 22, 2013, 07:39:38 am
I suspect this is a different use case. The difference is that in my setup I am using a single large frame buffer to produce one large desktop across two separate "screens" (which just happen to form a single TFT panel).
In the case of 3D stuff, the frame buffer is probably not the same. Can you configure it as one large "desktop" with one set of 3D images being displayed on columns 1-1920 and the other set on columns 1921-3840?

There is no way to do a single wide "desktop" across two monitors and use it for gaming. This is where nVidia Surround is needed, and it does not support 2 monitors. I can of course do a wide desktop or clone one to the other.

I can tell you right now that you are completely wrong. I am running XP x64 and have a T221 which the GPU sees as two separate, independent 1920x2400 monitors. You go to the Nvidia control panel and tell it to stretch the desktop across both monitors (horizontal span). This gives me a single, seamless 3840x2400 desktop, and the 3840x2400 mode is available in games too (as I explained earlier, I am running Crysis on this, and have used it with a real Quadro 2000, GTS450, Quadrified (2000) GTS450, GTX470, Quadrified GTX470 (5000), GTX480, Quadrified GTX480 (6000), GTX580, Quadrified GTX580 (7000) and a GTX680 (not modified into a Quadro yet, will be modifying it to a Grid K2 in the next few days)).

It works just fine, it always has. Which is much more than I can say for ATI cards.

I make no statement about the lesser versions of Windows and driver capability on them (Vista, 7 and 8) but this works perfectly (and with normal, unmodified GeForce cards) on XP/XP64.

Sharp 4K-based monitors use the same large-frame trick (1920x2160+1920x2160) as your monitor to achieve 4K at 60Hz, so maybe there is a driver-enabled "mosaic/surround" mode for your setup too. The Sharp 4K is not supported yet and therefore does not work, but the ASUS (a rebranded Sharp) works with the latest drivers, so the driver looks up whether an ASUS is connected and enables "mosaic/surround" support for it.

Don't know about the Sharp/Asus monitor, but T221 works without any special driver coordination. Two discrete monitors work the same - you can stretch the desktop across both and this mode is then available to games.

The only way I can see you getting synced and genlocked frames is by all outputs to screens coming off a single card, i.e. traditional SLI. That allows you to do processing on two cards but the monitor signal all comes out of a single frame buffer and a single vsync. Any more than that requires an external genlock device (QuadroSync mentioned on that page is a separate hardware device).

The link says otherwise: with the K5000 you can do synced Mosaic up to 8 screens with SLI OR with Quadro Sync. You can use (limited) outputs from 2 cards when doing Surround with GeForces (600 series and up) too, and Surround is always synced.

The key words being "with QuadroSync": this is not a piece of software, it is a separate, external genlock device costing more than a GTX680 card will cost you. Without QuadroSync, I am pretty sure that this works using the SLI trick and all the screens end up driven off the same card; the secondary card ends up being used only for processing, not for generating screen output. By all means tell me I'm wrong if you have actually tried this setup, but in the absence of any evidence to the contrary this is by far the most likely possibility.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: Jager on July 22, 2013, 09:00:29 am
Gordan, you do not need the Quadro Sync add-on card for 2-GPU SLI and synced Mosaic up to 8 displays with the K5000; if you have even more K5000s installed, then the Quadro Sync card is needed. Other Quadros need the Sync card when more than one is installed. You can find all this info on the nVidia site. You can also use outputs from the second card, just like SLI Surround and SLI stereo-3D Surround with GeForces (600 series and up, and only 3 monitors supported). Surround is always synced with GeForces too. Surround is just Mosaic for the desktop.

See the table "Number of synchronized displays/projectors from a single system with NVIDIA® Mosaic technology"
here >>> http://www.nvidia.com/object/quadro-scalable-visualization-solutions.html (http://www.nvidia.com/object/quadro-scalable-visualization-solutions.html)

Well, I think XP may have this expanded thing working. It is NOT working in Windows 7. There is no way to game at 3840x1080 in Win7 with two screens WITH a GeForce. You need nVidia Surround for multi-monitor gaming, and even though it has been requested several times, nVidia is not adding support for 2-monitor Surround; they feel no one likes to play with a bezel in the middle...

EDIT: It was actually the Fermi era that introduced 3D Vision Surround, and that needed SLI for the extra outputs so 3 monitors could be used. With Keplers you can do it with a single card. Fermi-based Quadros seem to be able to do 4-monitor Mosaic without the need for a sync card when in SLI.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on July 22, 2013, 09:12:06 am
Maybe Windows 7 is deficient in this way - I wouldn't know. If it is why would you want to use it?
All I can say is that XP64 and Linux both work wonderfully with this setup. :)
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: Jager on July 22, 2013, 09:49:51 am
Maybe Windows 7 is deficient in this way - I wouldn't know. If it is why would you want to use it?
All I can say is that XP64 and Linux both work wonderfully with this setup. :)

Windows 7 is good for most of the stuff I do. It's more on nVidia's side this time. I just hope that the Quadro hack will fix things :). After all, there is no point buying an "underclocked" Kepler for gaming that cannot even do SLI without a certified system...
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: hanyuefeng on July 22, 2013, 05:53:30 pm
Hi everyone,

I'm new to this area and have been following this thread for a while.

My situation is that I have an IBM T221 and a GTX 590; however, it doesn't support Mosaic mode and so cannot give me 3840x2400 decently, thus I had to buy another ATI card, which has an analogous feature.

I see there are methods to mod resistors or mod soft-straps to convert a GeForce card into a Quadro one, and I know the 450 and 600 series can be done that way. But with my limited knowledge I don't know how to deal with my idle 590; could anyone provide some detailed info on it?

Any help would be very much appreciated.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on July 23, 2013, 01:20:00 am
My situation is that I have an IBM T221 and a GTX 590; however, it doesn't support Mosaic mode and so cannot give me 3840x2400 decently, thus I had to buy another ATI card, which has an analogous feature.

Don't know what you're talking about, I've been running a T221 successfully at seamless single desktop resolution of 3840x2400 with Nvidia cards from 8800GT to various 4xx series cards, a GTX580 and a GTX680, before and after quadrifying the cards without any issues at all. No modification is required to get Nvidia cards working perfectly with T221 monitors, at least in XPx64 and Linux (the only two OS-es I use).

My experience with ATI cards and T221 has been utterly dire. The most recent hardware I've had moderate success with on T221 has been the Radeon 4xxx series cards, and windows drivers of that approximate generation (2011). This is because of the following:

1) No ATI cards I have ever found that are later than 4xxx series have more than a single DL-DVI output. If you are using DL-DVI->LFH60 adapters (google for Cirthix T221 and you'll find it) you need 2x DL-DVI ports.

2) If you are using 4x SL-DVI ports, you are even more out of luck because no ATI card comes with quad DVI ports. Nvidia cards don't either. You could potentially get away with running two cards on a DG5/DGP T221, but DG1 and DG3 require genlocked outputs (allegedly - never tested this myself with multiple cards on my DG3 - perhaps I should) which won't be the case with outputs from 2 cards.

3) If you are using 2xSL-DVI, and you want more than 25Hz, you won't be able to achieve it with ATI cards because any recent ATI driver will flat out refuse to do any mode except what the monitor reports via EDID. The driver will even ignore custom monitor .inf drivers. If you use an Nvidia card (or use the open source radeon Xorg driver on Linux which supports custom modelines) you can actually achieve about 33Hz with 2x SL-DVI outputs (a rough modeline sketch follows after this list).

4) Nvidia cards happily and reliably push 165MHz on each SL-DVI output (the maximum the standard supports). My Radeon 4850 produces strange blue artifacts when pushing 165MHz; I had to drop it down to 160MHz.

5) ATI cards produce pretty obvious tearing between the two halves of the screen. This is obvious in all applications (e.g. dragging an application window around the center of the screen, very obvious in full screen games at 3840x2400 or when playing back HD videos). Nvidia cards do not suffer from this - they seem to use a single large framebuffer so there are no sync issues.
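On point 3, a minimal sketch of the arithmetic, assuming the usual 165 MHz single-link limit and made-up blanking intervals (the actual timings gordan uses are not given in the post):

PIXEL_CLOCK_HZ = 165_000_000          # single-link DVI pixel clock limit
h = (1920, 1944, 1976, 2000)          # hdisp, hsync_start, hsync_end, htotal (guessed blanking)
v = (2400, 2403, 2409, 2450)          # vdisp, vsync_start, vsync_end, vtotal (guessed blanking)

print(PIXEL_CLOCK_HZ / (h[3] * v[3]))                             # ~33.7 Hz for one 1920x2400 half
print('Modeline "1920x2400_33" 165.00', *h, *v, '+hsync +vsync')  # same numbers as an Xorg modeline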

It is all these issues with ATI cards, which a T221 makes particularly obvious, that are the main reason why I consider ATI cards unfit for purpose.

The only reason I am using a 4850 at all is because I have a slightly broken motherboard PCIe slot that seems to have a failed PCIe lane #1. ATI cards seem to be clever enough to go into x8 mode and use the 2nd 8 lanes while all my other cards (GPU and otherwise) don't do that and thus don't detect in that slot. This is the only good thing I can say about the ATI cards.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: hanyuefeng on July 23, 2013, 05:21:41 am
Thanks for your reply, gordan.

My model is the DG3 and I have 2 special DL-DVI cables bought from a Japanese seller; each one has an EDID chip socket on it, so you can easily control the refresh rate by installing a different chip. However, I don't want two 1920x2400@60Hz halves that I have to combine into 3840x2400@60Hz, which is the only way my 590 provides it. That is why I want ATI Eyefinity or Nvidia Mosaic, which is only bundled with Quadro cards, to meet my requirement.

By the way, gordan, have you successfully modded any 500 series GeForce card into a Quadro? Any experience?

Thank you very much.
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on July 23, 2013, 06:04:09 am
OK, with a DG3 you cannot use the Cirthix adapter because the input circuitry is slightly different. As far as I can tell, on the DG5 you can do this because it accepts 1920x2400@48Hz in interlaced format, so one half of the DL carries odd lines, and the other even lines. On the DG3 this won't work.

The only thing you can really do on the DG3 is either use 2xSL-DVI for 32-33Hz or 4xSL-DVI for more. I haven't tried feeding 4xSL-DVI into my DG3, but if it is true that the signal needs to be genlocked you are pretty much out of luck - all you can really do is get an old Nvidia NVS card with 2xLFH60 ports and use that to drive the monitor. What you _might_ be able to do is drive it using 2xSL-DVI + HDMI (as another SL-DVI) + DP (with a passive SL-DVI adapter). That might actually give you genlocked 4xSL-DVI outputs. The only problem is that if you try this with an ATI card you will find that the vast majority of them will only let you use up to 3 outputs simultaneously, which shoots down the idea of using ATI cards unless you get one of the older eyefinity models.

Note: The most you will ever get out of a T221 with custom modes (so forget ATI since they won't do custom modes as I explained earlier) is 55Hz using 2xDL-DVI inputs (only possible on the DG5). If you are using 4xSL-DVI it will probably be less than 55Hz max because each input has to have separate blanking which will eat into the bandwidth. By default DG5 supports up to 48Hz, and I'm finding this is perfectly fine even for gaming, so I never bothered to "overclock it" to 55Hz. If you decide to try doing that you might want to consider using thermal epoxy to glue some heatsinks to the FPGA chips inside the T221.
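Back-of-the-envelope numbers behind that ceiling, assuming the usual 165 Mpx/s single-link DVI budget and a guessed ~8% blanking overhead (neither figure comes from the post itself):

SINGLE_LINK = 165e6             # pixels/s per SL-DVI link
budget = 4 * SINGLE_LINK        # 2x DL-DVI carries four links' worth of pixels
per_frame = 3840 * 2400 * 1.08  # visible pixels plus ~8% assumed blanking
print(budget / per_frame)       # ~66 Hz theoretical, so a 55Hz cap fits underneath it

With 4x SL-DVI the aggregate budget is the same, but each input carries its own blanking, which is consistent with the "less than 55Hz" point above.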

I use my DG3 for general purpose desktop work and am driving it with 2xSL-DVI @ 32Hz (with an Nvidia card I can get 33Hz out of it - with ATI I have to knock the clock rate from 165MHz down to 160MHz which results in 1Hz less refresh).

I have never used Mosaic on these, I never needed it. In XP64 I just go into the Nvidia control panel, configure multiple displays, and configure it for horizontal span. It just works. It is possible that they decided to try to extort more money out of customers by removing this feature on Vista and Windows 7 - I never tried it on those.

And yes, a GTX580 can be modified into a Quadro 7000. I am currently selling mine if you are interested:
http://www.ebay.co.uk/itm/281138321842 (http://www.ebay.co.uk/itm/281138321842)
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: hanyuefeng on July 25, 2013, 01:35:49 am
I see your point, gordan; however, I already have a 590 sitting idle, so I'm interested in your work on the 580 rather than in the card itself. Could you share some details as a reference for modding my 590?

Thanks. ;)
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: Jager on July 25, 2013, 06:43:16 pm
K6000 Quadro. http://nvidianews.nvidia.com/Releases/NVIDIA-Unveils-New-Flagship-GPU-for-Visual-Computing-9e3.aspx (http://nvidianews.nvidia.com/Releases/NVIDIA-Unveils-New-Flagship-GPU-for-Visual-Computing-9e3.aspx). 780/TITAN hack needed :)
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: Soulnight on July 25, 2013, 07:44:33 pm
K6000 Quadro. http://nvidianews.nvidia.com/Releases/NVIDIA-Unveils-New-Flagship-GPU-for-Visual-Computing-9e3.aspx (http://nvidianews.nvidia.com/Releases/NVIDIA-Unveils-New-Flagship-GPU-for-Visual-Computing-9e3.aspx). 780/TITAN hack needed :)

YES! A hack of the GTX 780 would be needed indeed... A lot more horsepower for playing with 3D Vision in 1080p! Someone?

How are you doing, Jager? I'm waiting for you to mod your GTX 670 and tell me that it works for passive 3D dual projection with 3D Vision, you know :)
Have you bought your second BENQ W1070 yet?
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: Jager on July 25, 2013, 10:07:52 pm
K6000 Quadro. http://nvidianews.nvidia.com/Releases/NVIDIA-Unveils-New-Flagship-GPU-for-Visual-Computing-9e3.aspx (http://nvidianews.nvidia.com/Releases/NVIDIA-Unveils-New-Flagship-GPU-for-Visual-Computing-9e3.aspx). 780/TITAN hack needed :)

YES! A hack of the GTX 780 would be needed indeed... A lot more horsepower for playing with 3D Vision in 1080p! Someone?

How are you doing, Jager? I'm waiting for you to mod your GTX 670 and tell me that it works for passive 3D dual projection with 3D Vision, you know :)
Have you bought your second BENQ W1070 yet?

Another W1070 has arrived at the local post office; I will collect it today. I have been trying to find those resistors among old stuff I have, but no luck, so I need to wait for the eBay resistors.
You can get a 780 for 550€ and overclock it to close to TITAN performance or even better. There are also BIOS mods that disable the boost feature and force the clocks to, say, 1200 or whatever. Maybe this will give a boost to those pro programs that do not use the correct power state (if there is any) of the modded Quadros.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on July 26, 2013, 03:37:04 am
K6000 looks pretty sweet, but the press release says 2880 cores. Titan only has 2688. Modding it might not work. It has been reported before that modifying a 144-shader GTS450 into a Quadro 2000 doesn't work, while I have successfully modified 192 shader versions into Quadro 2000s, which implies you have to have at least as many shaders and memory controllers for the modification to work.

Also the K series seems to not be supported for MultiOS VGA passthrough virtualization, and there is nothing listed in the press release about this being available on the K6000. Since one of the main benefits of modifying a GeForce into a Quadro is precisely for enabling VGA passthrough, the benefits from a Titan->K6000 mod seem much slimmer even if it does work.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: Jager on July 26, 2013, 05:16:50 am
K6000 looks pretty sweet, but the press release says 2880 cores. Titan only has 2688. Modding it might not work. It has been reported before that modifying a 144-shader GTS450 into a Quadro 2000 doesn't work, while I have successfully modified 192 shader versions into Quadro 2000s, which implies you have to have at least as many shaders and memory controllers for the modification to work.

Also the K series seems to not be supported for MultiOS VGA passthrough virtualization, and there is nothing listed in the press release about this being available on the K6000. Since one of the main benefits of modifying a GeForce into a Quadro is precisely for enabling VGA passthrough, the benefits from a Titan->K6000 mod seem much slimmer even if it does work.

GTX 670 to "K5000" seems to work so maybe this is doable.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: Soulnight on July 26, 2013, 05:19:49 am
K6000 looks pretty sweet, but the press release says 2880 cores. Titan only has 2688. Modding it might not work. It has been reported before that modifying a 144-shader GTS450 into a Quadro 2000 doesn't work, while I have successfully modified 192 shader versions into Quadro 2000s, which implies you have to have at least as many shaders and memory controllers for the modification to work.

Also the K series seems to not be supported for MultiOS VGA passthrough virtualization, and there is nothing listed in the press release about this being available on the K6000. Since one of the main benefits of modifying a GeForce into a Quadro is precisely for enabling VGA passthrough, the benefits from a Titan->K6000 mod seem much slimmer even if it does work.

GTX 670 to "K5000" seems to work so maybe this is doable.

What exactly is your GTX 670? Has someone already modded exactly your model, or do you hope to find which resistors you need to modify?
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: Jager on July 26, 2013, 05:38:01 am
K6000 looks pretty sweet, but the press release says 2880 cores. Titan only has 2688. Modding it might not work. It has been reported before that modifying a 144-shader GTS450 into a Quadro 2000 doesn't work, while I have successfully modified 192 shader versions into Quadro 2000s, which implies you have to have at least as many shaders and memory controllers for the modification to work.

Also the K series seems to not be supported for MultiOS VGA passthrough virtualization, and there is nothing listed in the press release about this being available on the K6000. Since one of the main benefits of modifying a GeForce into a Quadro is precisely for enabling VGA passthrough, the benefits from a Titan->K6000 mod seem much slimmer even if it does work.

GTX 670 to "K5000" seems to work so maybe this is doable.

What exactly is your GTX 670? Has someone already modded exactly your model, or do you hope to find which resistors you need to modify?

I have a reference design 670, the same PCB as the K5000 has. My resistors arrived too, and I will do some soldering tomorrow or the day after.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: Soulnight on July 26, 2013, 05:51:23 am
K6000 looks pretty sweet, but the press release says 2880 cores. Titan only has 2688. Modding it might not work. It has been reported before that modifying a 144-shader GTS450 into a Quadro 2000 doesn't work, while I have successfully modified 192 shader versions into Quadro 2000s, which implies you have to have at least as many shaders and memory controllers for the modification to work.

Also the K series seems to not be supported for MultiOS VGA passthrough virtualization, and there is nothing listed in the press release about this being available on the K6000. Since one of the main benefits of modifying a GeForce into a Quadro is precisely for enabling VGA passthrough, the benefits from a Titan->K6000 mod seem much slimmer even if it does work.

GTX 670 to "K5000" seems to work so maybe this is doable.

What exactly is your GTX 670? Has someone already modded exactly your model, or do you hope to find which resistors you need to modify?

I have a reference design 670, the same PCB as the K5000 has. My resistors arrived too, and I will do some soldering tomorrow or the day after.

You mean today!!!  :clap: And then you buy a gtx 780, you mod it, and you tell us how to do it. And then I follow your steps :)  ::)
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: Jager on July 26, 2013, 07:00:41 am
You mean today!!!  :clap: And then you buy a gtx 780, you mod it, and you tell us how to do it. And then I follow your steps :)  ::)

I will try the 670 to "K5000" mod without failing first. "K5000" SLI is the main target at the moment, but I think it will never work without a certified workstation. Maybe someone will do a HyperSLI for Quadro some day, but I don't think it's going to happen. I wish I had the skills to find out if 780 to K6000 is even possible :)
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: Soulnight on July 26, 2013, 08:47:10 am
You mean today!!!  :clap: And then you buy a gtx 780, you mod it, and you tell us how to do it. And then I follow your steps :)  ::)

I will try the 670 to "K5000" mod without failing first. "K5000" SLI is the main target at the moment, but I think it will never work without a certified workstation. Maybe someone will do a HyperSLI for Quadro some day, but I don't think it's going to happen. I wish I had the skills to find out if 780 to K6000 is even possible :)

And where are all the skilled people of this forum? All on vacation? Or did nvidia have them assassinated?  :wtf:
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: gamezr2ez on July 26, 2013, 09:22:21 am
And where are all the skilled people of this forum? All on vacation? Or did nvidia have them assassinated?  :wtf:

This is an Electronics board and there hasn't been any real electronics talk in the past 10 pages or more. Probably why this got moved to general chat. And while some cards may be able to have the straps identified based on the similarity to another confirmed card, to my knowledge no one here has a 780 to find the correct resistors to begin with.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: Jager on July 26, 2013, 04:10:09 pm
Well, I managed to order capacitors instead of resistors :palm:  |O Need to find a 40K somewhere; a 15K I already have...

PS. The link I posted earlier was wrong; I've edited it.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: Soulnight on July 26, 2013, 05:45:56 pm
Well, I managed to order capacitors instead of resistors :palm:  |O Need to find a 40K somewhere; a 15K I already have...

PS. The link I posted earlier was wrong; I've edited it.

Oh no!  :palm: You should go shopping in town today. It can't be that hard to find a small electronics shop with such resistors.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on July 26, 2013, 05:48:53 pm
I'm pretty sure 15K is all you actually need. Modify the 3rd nibble to change the ID from 0x1189 to 0x11A9. From there on you can use the 5 bits of the soft strap to switch between K5000 or GTX680M (which has the same spec as GTX670). It's what I'll be doing to my GTX680 this weekend. Make the hard-strap ID 0x11A0 (GTX680M) instead of 0x1180 (GTX680) by changing only the 3rd nibble resistor. Then from that I can soft-mod to GTX680MX (0x11A3, same spec as GTX680) for non-Quadro uses or a Grid K2 (0x11BF) for virtualization.

You may find it preferable to just strip all the UEFI crap out while you're BIOS hacking. I keep meaning to write up a BIOS modding guide for hacking most things from the 4xx series onward, but I've not had the time to do it in the past month. :(
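A quick check of the 5-bit reasoning above, using only the device IDs quoted in this post; a sketch, not a strap tool:

# The soft strap only reaches IDs that share everything above the bottom 5 bits
# with the hard-strapped ID, so compare each target against the 0x11A0 base.
BASE = 0x11A0                                          # GTX680M hard-strap target
for name, devid in [("GTX680MX", 0x11A3), ("Grid K2", 0x11BF), ("GTX680", 0x1180)]:
    reachable = (devid & 0xFFE0) == (BASE & 0xFFE0)    # same bits above bit 4?
    print(f"{name:9s} 0x{devid:04X} soft-strap reachable from 0x{BASE:04X}: {reachable}")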
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: Soulnight on July 26, 2013, 06:01:35 pm
I'm pretty sure 15K is all you actually need. Modify the 3rd nibble to change the ID from 0x1189 to 0x11A9. From there on you can use the 5 bits of the soft strap to switch between K5000 or GTX680M (which has the same spec as GTX670). It's what I'll be doing to my GTX680 this weekend. Make the hard-strap ID 0x11A0 (GTX680M) instead of 0x1180 (GTX680) by changing only the 3rd nibble resistor. Then from that I can soft-mod to GTX680MX (0x11A3, same spec as GTX680) for non-Quadro uses or a Grid K2 (0x11BF) for virtualization.

You may find it preferable to just strip all the UEFI crap out while you're BIOS hacking. I keep meaning to write up a BIOS modding guide for hacking most things from the 4xx series onward, but I've not had the time to do it in the past month. :(

I was wrong... There is still an expert among us!  :-+

And would you be so nice as to try for us (with, for example, 2 screens instead of a dual-projector stack) whether you can activate 3D Vision with a Quadro card when you have already activated the passive stereo option?

From the nvidia help, I understand it is possible, but I can't be sure until someone has tried it out:
http://nvidia.custhelp.com/app/answers/detail/a_id/3012/~/how-to-configure-passive-or-dual-pipe-stereo-with-quadro-cards-in-windows-7 (http://nvidia.custhelp.com/app/answers/detail/a_id/3012/~/how-to-configure-passive-or-dual-pipe-stereo-with-quadro-cards-in-windows-7)

You don't need a 3D Vision emitter to activate 3D Vision, since there is an emulator for this:
http://3dvision-blog.com/forum/viewtopic.php?f=9&t=997 (http://3dvision-blog.com/forum/viewtopic.php?f=9&t=997)

Thank you a lot! You would be the first on earth to answer this "simple" question ! 8) :clap:
Title: GTX 670 into quadro K5000 mod: complicated?
Post by: Soulnight on July 26, 2013, 10:32:16 pm
I can confirm the mod done by blanka.
https://www.eevblog.com/forum/projects/hacking-nvidia-cards-into-their-professional-counterparts/msg210798/#msg210798 (https://www.eevblog.com/forum/projects/hacking-nvidia-cards-into-their-professional-counterparts/msg210798/#msg210798)
But I pimped it a little bit.

670GTX to K5000 works!

R4 on the front side.
R1, R2, R3 on the bottom side.

The K5000 works absolutely stably for me, but shows no performance increase in SPECviewperf. I tested with a few different Quadro drivers.

Summary
GPU Name        R1 / 0-7, 4th nibble   R2 / 8-F, 4th nibble   R3 / 3rd (high)   R4 / 3rd (low)
GTX 660 Ti      20K                    none                   none              25K
GTX 670         none                   10K                    none              25K
Tesla K10       none                   40K                    none              25K
Quadro K5000    none                   15K                    40K               none
Grid K2         none                   40K                    40K               none

I flashed it (EVGA 670GTX 2GB 915MHz) with the K5000 bios from techpowerup.
"nvflash.exe -4 -5 -6 K5000.rom" had to be used because of different subsystem and board id.

It started with minor pixel errors but booted into win7.
After driver installation and reboot win7 didn't start anymore.
Flashing it back worked without problems.

Is it me, or does the mod made by blanka seem a lot more complicated? Do you need a cable like blanka used? Blanka also said you needed to add R3 manually since there was no place for it, but shlomo.m didn't seem to have this problem.

Here the link to the picture of shlomo.m:
https://www.eevblog.com/forum/chat/hacking-nvidia-cards-into-their-professional-counterparts/msg217534/#msg217534 (https://www.eevblog.com/forum/chat/hacking-nvidia-cards-into-their-professional-counterparts/msg217534/#msg217534)

Here the link for blanka mod with pictures as well:
https://www.eevblog.com/forum/projects/hacking-nvidia-cards-into-their-professional-counterparts/msg210798/#msg210798 (https://www.eevblog.com/forum/projects/hacking-nvidia-cards-into-their-professional-counterparts/msg210798/#msg210798)

What do you think?

ps: I would like to buy a GTX 670 instead of a GTX 680 because the performance is about the same even with 3D Vision and the GTX 670 costs 100€ less! In Germany, the GTX 670 is selling for 260€! Which model should I take?
What exactly was shlomo.m's model: EVGA 670GTX 2GB 915MHz  ???

Thank you!


Title: Re: GTX 670 into quadro K5000 mod: complicated?
Post by: Jager on July 26, 2013, 10:56:18 pm

Is it me, or does the mod made by blanka seem a lot more complicated? Do you need a cable like blanka used? Blanka also said you needed to add R3 manually since there was no place for it, but shlomo.m didn't seem to have this problem.

Here the link to the picture of shlomo.m:
https://www.eevblog.com/forum/chat/hacking-nvidia-cards-into-their-professional-counterparts/msg217534/#msg217534 (https://www.eevblog.com/forum/chat/hacking-nvidia-cards-into-their-professional-counterparts/msg217534/#msg217534)

Here the link for blanka mod with pictures as well:
https://www.eevblog.com/forum/projects/hacking-nvidia-cards-into-their-professional-counterparts/msg210798/#msg210798 (https://www.eevblog.com/forum/projects/hacking-nvidia-cards-into-their-professional-counterparts/msg210798/#msg210798)

What do you think?

ps: I would like to buy a GTX 670 instead of a GTX 680 because the performance is about the same even with 3D Vision and the GTX 670 costs 100€ less! In Germany, the GTX 670 is selling for 260€! Which model should I take?
What exactly was shlomo.m's model: EVGA 670GTX 2GB 915MHz  ???

Thank you!

I'm doing it like shlomo.m did. There are reference design PCBs, custom PCBs, and those that use the 680 PCB. So I think a reference/680 PCB is a safe bet.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: Soulnight on July 26, 2013, 11:10:07 pm
Is it going well?
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on July 29, 2013, 05:56:37 pm
OK, so I finally took a soldering iron to my GTX680 - and found that I changed the wrong resistor.  :palm:
Which made me re-read these posts:

https://www.eevblog.com/forum/chat/hacking-nvidia-cards-into-their-professional-counterparts/msg202901/#msg202901 (https://www.eevblog.com/forum/chat/hacking-nvidia-cards-into-their-professional-counterparts/msg202901/#msg202901)
https://www.eevblog.com/forum/chat/hacking-nvidia-cards-into-their-professional-counterparts/msg209227/#msg209227 (https://www.eevblog.com/forum/chat/hacking-nvidia-cards-into-their-professional-counterparts/msg209227/#msg209227)
https://www.eevblog.com/forum/chat/hacking-nvidia-cards-into-their-professional-counterparts/msg207550/#msg207550 (https://www.eevblog.com/forum/chat/hacking-nvidia-cards-into-their-professional-counterparts/msg207550/#msg207550)

It looks like there is an inconsistency in the findings, or at least an inconsistency in the way the ID numbers are set. Specifically, for the 4th nibble, the translation as per the first post works:

4th nibble, Resistor 2
5K   = 0
10K = 1
15K = 2
20K = 3
25K = 4
30K = 5
35K = 6
40K = 7

4th nibble, Resistor 3
5K   = 8
10K = 9
15K = A
20K = B
25K = C
30K = D
35K = E
40K = F
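Transcribing the two tables just above into one rule, as a sketch that only restates them: the nibble value steps by one per 5K, with resistor 2 giving the 0-7 half and resistor 3 giving the 8-F half.

def fourth_nibble(kohm, resistor):
    # kohm in 5..40 in 5K steps; resistor 2 -> 0x0-0x7, resistor 3 -> 0x8-0xF
    assert resistor in (2, 3) and kohm in range(5, 45, 5)
    return (kohm // 5 - 1) + (8 if resistor == 3 else 0)

print(hex(fourth_nibble(15, 3)))   # 0xa, matching the table
print(hex(fourth_nibble(40, 2)))   # 0x7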

That's all well and good, but it seems like a waste of time since the bottom 5 bits (4th nibble plus the last bit of the 3rd nibble) can actually be changed by modifying the soft-strap. So the real juicy part is modifying the 3rd nibble since this is required to change the part of the ID that cannot be manipulated using only the soft strap. So you can go from GTX680 -> Tesla K10 by only changing the soft strap, but you cannot change to a K5000 or K2.

But - the 3rd nibble translation table does not appear to be the same (for resistors 1 and 2). Specifically, 3rd post linked above, from verybigbadboy, states:

Summary
GPU Name       Resistor 0 / 3rd nibble   Resistor 1 / 3rd nibble   Resistor 2 / 8-F, 4th nibble   Resistor 3 / 0-7, 4th nibble
GTX 660 Ti     none                      25K                       none                           20K
GTX 670        none                      25K                       10K                            none
GTX 680        none                      25K                       none                           5K
GTX 770        none                      25K                       none                           25K
Tesla K10      none                      25K                       40K                            none
Quadro K5000   40K                       none                      15K                            none
Grid K2        40K                       none                      40K                            none

3rd symbol on K5000 and K2 is B; according to the original translation table, B should be:
Resistor 0: 20K resistor
Resistor 1: none

For the other cards listed the 3rd nibble is 8, which should be (5K, none).

Has anybody figured out what the full translation table here is?

I can sort of infer*:
25K=8
*30K=9
*35K=A
40K=B

but it is unclear why they are not set on the same resistor, and where exactly the division falls between the two. I'm going to hazard a guess (and test when I get some 35K resistors) that it is resistor 1 that needs to be modified, but that is only a guess at the moment.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: Jager on July 31, 2013, 08:20:57 pm
OK, did some soldering. The goal is to mod a GTX 670 to a K5000. I have a PNY reference design GTX670.
Did it like shlomo.m did here --> https://www.eevblog.com/forum/chat/hacking-nvidia-cards-into-their-professional-counterparts/msg217534/#msg217534 (https://www.eevblog.com/forum/chat/hacking-nvidia-cards-into-their-professional-counterparts/msg217534/#msg217534)

Used this table, but I was not able to find a 40K so I used 39K instead.

GPU Name        R1 / 0-7, 4th nibble   R2 / 8-F, 4th nibble   R3 / 3rd (high)   R4 / 3rd (low)
GTX 660 Ti      20K                    none                   none              25K
GTX 670         none                   10K                    none              25K
Tesla K10       none                   40K                    none              25K
Quadro K5000    none                   15K                    40K               none
Grid K2         none                   40K                    40K               none

My GTX670 shows up as a GRID K2, and after removing R3 / 3rd (high) it still shows up as a GRID K2. So 39K does not work after all? I will try 43K after I get those.

edit: none - none - none - none also gives the GRID K2 ID...

edit: Got it working with a potentiometer, but there seem to be no benefits to this mod. No way to enable dual passive 3D after all. No Mosaic either, and it seems this mod just allows you to install the Quadro drivers, to no benefit at all. The only Quadro feature that worked was nView, which you can "hack" without the hard mod. I tried several drivers but no help. The Mosaic utility reported no support... Maybe a BIOS mod is needed to get things working...
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on August 01, 2013, 09:40:33 am
Interesting, good work, Jager. I'm surprised that Mosaic doesn't work, it did on older cards. It looks like you only managed to achieve modifying it to Grid K2, though. The main advantage of modifying to a Grid card is VGA passthrough. It is quite plausible that Mosaic and 3D are not supported on the Grid series because they are specialist cards for virtualization. You might find it works when you change it to a K5000.

Note that if you have the card in Grid K2 mode (make sure it is reliably detected as such; I think you are supposed to have a 40K resistor rather than leaving it disconnected, or things can become problematic), you can modify it to a K5000 using a BIOS-only mod. Strip out the UEFI header (the first 1024 bytes, everything up to 0x400); there you'll find the AA55 header marking the beginning of the real BIOS. Then strip out the tail (UEFI crypto certs and a bunch of whitespace); trim it out and the resulting BIOS should be a little under 64KB. Then you can use most of the normal tools to edit it like before. Edit the device ID in the BIOS, re-calculate the checksum (using nibitor, or write a program to do it for you), nvflash it to the card and then use nvflash to change the straps to match the device ID, and you should be good to go.
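A minimal sketch of that trimming, under explicit assumptions: the legacy image starts at the 55 AA ROM signature found at or after offset 0x400, its length is the 512-byte-block count stored at offset 2 of that image, and the 8-bit checksum can be fixed by adjusting the image's last byte so the byte-sum comes out to zero. None of this has been checked against nibitor's output, and the device-ID edit itself is not shown, so verify the result before flashing anything:

def trim_vbios(dump: bytes) -> bytes:
    start = dump.index(b"\x55\xAA", 0x400)        # ROM signature after the stripped UEFI header
    length = dump[start + 2] * 512                # image length field, in 512-byte blocks (assumed)
    image = bytearray(dump[start:start + length])
    image[-1] = (image[-1] - sum(image)) & 0xFF   # force the 8-bit sum of the image to zero
    return bytes(image)

with open("gtx670_dump.rom", "rb") as src:        # file names are placeholders
    open("gtx670_trimmed.rom", "wb").write(trim_vbios(src.read()))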
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: Soulnight on August 01, 2013, 04:15:03 pm
Jager told me in a PM that he succeeded in having the GTX 670 show up as a K5000, but had no success with Mosaic...

@ Gordan: could you try to activate mosaic with a modded GTX 680?

Thank you a lot!  :-+
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: Soulnight on August 01, 2013, 04:55:27 pm
@ Jager: here: http://nvidia.custhelp.com/app/answers/detail/a_id/3012/~/how-to-configure-passive-or-dual-pipe-stereo-with-quadro-cards-in-windows-7. (http://nvidia.custhelp.com/app/answers/detail/a_id/3012/~/how-to-configure-passive-or-dual-pipe-stereo-with-quadro-cards-in-windows-7.)

They do not talk about mosaic, do they?  ???
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: Soulnight on August 01, 2013, 05:04:26 pm
It should be noted that this mod was originally performed not to get a high-performance Quadro or Tesla card; it was done to unlock additional features such as Mosaic support, which does indeed work.

It should work Jager...
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: Jager on August 01, 2013, 06:44:58 pm
Interesting, good work, Jager. I'm surprised that Mosaic doesn't work, it did on older cards. It looks like you only managed to achieve modifying it to Grid K2, though. The main advantage of modifying to a Grid card is VGA passthrough. It is quite plausible that Mosaic and 3D are not supported on the Grid series because they are specialist cards for virtualization. You might find it works when you change it to a K5000.

Note that if you have the card in Grid K2 mode (make sure it is reliably detected as such; I think you are supposed to have a 40K resistor rather than leaving it disconnected, or things can become problematic), you can modify it to a K5000 using a BIOS-only mod. Strip out the UEFI header (the first 1024 bytes, everything up to 0x400); there you'll find the AA55 header marking the beginning of the real BIOS. Then strip out the tail (UEFI crypto certs and a bunch of whitespace); trim it out and the resulting BIOS should be a little under 64KB. Then you can use most of the normal tools to edit it like before. Edit the device ID in the BIOS, re-calculate the checksum (using nibitor, or write a program to do it for you), nvflash it to the card and then use nvflash to change the straps to match the device ID, and you should be good to go.

Hi, I got it working as a K5000 using a pot at 40K.
The Mosaic utility did not allow me to set 3840x1080. I tried it with several drivers. There are no additional setup options in the nVidia control panel, so no "workstation" tree. I tried the Mosaic utility, but using "query lgpu" it shows supportmosaic=0 plus some other information. Whenever I try to enable Mosaic with "set rows=1 cols=2  out=0,0 out=0,1 res=3840,1080,60" it returns an "error flag not supported". Will do more testing later.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: Soulnight on August 01, 2013, 06:51:36 pm
Interesting, good work, Jager. I'm surprised that Mosaic doesn't work, it did on older cards. It looks like you only managed to achieve modifying it to Grid K2, though. The main advantage of modifying to a Grid card is VGA passthrough. It is quite plausible that Mosaic and 3D are not supported on the Grid series because they are specialist cards for virtualization. You might find it works when you change it to a K5000.

Note that if you have the card in Grid K2 mode (make sure it is reliably detected as such; I think you are supposed to have a 40K resistor rather than leaving it disconnected, or things can become problematic), you can modify it to a K5000 using a BIOS-only mod. Strip out the UEFI header (the first 1024 bytes, everything up to 0x400); there you'll find the AA55 header marking the beginning of the real BIOS. Then strip out the tail (UEFI crypto certs and a bunch of whitespace); trim it out and the resulting BIOS should be a little under 64KB. Then you can use most of the normal tools to edit it like before. Edit the device ID in the BIOS, re-calculate the checksum (using nibitor, or write a program to do it for you), nvflash it to the card and then use nvflash to change the straps to match the device ID, and you should be good to go.

Hi, I got it working as a K5000 using a pot at 40K.
The Mosaic utility did not allow me to set 3840x1080. I tried it with several drivers. There are no additional setup options in the nVidia control panel, so no "workstation" tree. I tried the Mosaic utility, but using "query lgpu" it shows supportmosaic=0 plus some other information. Whenever I try to enable Mosaic with "set rows=1 cols=2  out=0,0 out=0,1 res=3840,1080,60" it returns an "error flag not supported". Will do more testing later.

I know those are questions for dummies (sorry) but:
1) have you erased any trace of the previous geforce drivers?
2) Do you have 2 displays connected at the time you try to activate mosaic?
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: Jager on August 01, 2013, 06:58:15 pm
@ Jager: here: http://nvidia.custhelp.com/app/answers/detail/a_id/3012/~/how-to-configure-passive-or-dual-pipe-stereo-with-quadro-cards-in-windows-7. (http://nvidia.custhelp.com/app/answers/detail/a_id/3012/~/how-to-configure-passive-or-dual-pipe-stereo-with-quadro-cards-in-windows-7.)

They do not talk about mosaic, do they?  ???

There is no way to follow that guide, because the options it needs are simply not there. I have exactly the same options as with the GeForce drivers. The drivers do some additional checks and do not enable the options needed for stereo. Mosaic could have been a "workaround", and it would at least make it possible to use TriDef 3D.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: Jager on August 01, 2013, 07:01:08 pm
I know those are questions for dummies (sorry) but:
1) have you erased any trace of the previous geforce drivers?
2) Do you have 2 displays connected at the time you try to activate mosaic?

1. Yes
2. Of course, two identical projectors over DVI-to-HDMI, and that should be supported.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on August 01, 2013, 08:39:55 pm
I'll try it as soon as my GTX680 is re-modified. I am still waiting for the correct resistors. My last attempt went a bit wrong, I ended up modifying the ID from 0x1180 to 0x1182 instead of 0x11A0, due to an epic brainfart.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on August 02, 2013, 05:15:02 pm
Hmm... It would appear Nvidia no longer lists the Mosaic utility as available for anything older than Windows 7. :-/
I don't suppose anyone has a link to the XP64 version?
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: Jager on August 05, 2013, 11:00:43 pm
Vacation is over and I had a chance to look at two Quadro machines at work. One is Win7 with a Quadro 600 and the other is XP 32-bit with a Quadro 2000. Both machines have the Workstation tree and it is possible to choose different stereo modes in the Manage 3D Settings. The whole Manage 3D Settings page is different, as you can choose 3D profiles for professional programs. The WinXP machine does not have Mosaic setup in the Workstation tree, only "view system topology". It seems that changing those resistors does not affect which features get enabled. It only allows you to install the Quadro drivers, but those drivers appear to look the features up from the VBIOS and enable only the features the VBIOS supports, not those implied by the product name/ID. I did some BIOS hex editing and changed the subsystem ID to correspond to the K5000, but still no help. So some serious VBIOS hacking will be needed, and this is an area in which I have no experience. Maybe a K5000 BIOS could be modded to support these hard-modded cards?
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on August 06, 2013, 04:01:53 am
I don't know about the other cards, but I have tried flashing a Q2000 BIOS onto a GTS450. You have to doctor the device ID straps to override the hardware straps accordingly, but it works well enough to boot up. I never tested more thoroughly than that since I didn't see the point. If I planned to keep it I would have doctored all the clock speeds and memory timings to match the original BIOS. There was no effect on performance, over and above what was caused by the drop in clock speeds to Quadro levels.  I didn't notice any extra features, but I wasn't looking (was testing on XP).
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: Jager on August 06, 2013, 07:59:03 pm
Did some more testing. I flashed my GTX670 with the K5000 BIOS. The GTX670 shader count, memory size and board ID are different, so it was quite clear that this would not work. The interesting thing was that when I flashed back to the GTX670 BIOS, something was left behind from the K5000 BIOS. The size is now 129KB instead of the 96KB a GTX670 normally has; the K5000 BIOS is 221KB. When I first opened GPU-Z it recognized the GPU as GK104GT and the BIOS as "modified", but after some time without touching the BIOS it showed up correctly. The funny thing is that I now have <workstation> options in the nVidia control panel, but only "change ECC state", and looking at the BIOS with a hex editor clearly shows that those extra 33KB include the ECC option (this code sits after the normal 670 BIOS code). So it is now quite clear that a BIOS for the modded card is needed to get this thing actually behaving like a K5000; the soldering alone is just a name change without some hardcore BIOS editing.
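A trivial way to see exactly what was left behind is to diff the two dumps; a sketch only, with placeholder file names:

before = open("gtx670_original.rom", "rb").read()
after = open("gtx670_after_flashback.rom", "rb").read()
print(len(before), len(after))   # e.g. ~96KB vs ~129KB, as described above
diffs = [i for i in range(min(len(before), len(after))) if before[i] != after[i]]
print("first differing offsets:", [hex(i) for i in diffs[:8]])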
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: Soulnight on August 07, 2013, 06:47:40 am
Did some more testing. I flashed my GTX670 with the K5000 BIOS. The GTX670 shader count, memory size and board ID are different, so it was quite clear that this would not work. The interesting thing was that when I flashed back to the GTX670 BIOS, something was left behind from the K5000 BIOS. The size is now 129KB instead of the 96KB a GTX670 normally has; the K5000 BIOS is 221KB. When I first opened GPU-Z it recognized the GPU as GK104GT and the BIOS as "modified", but after some time without touching the BIOS it showed up correctly. The funny thing is that I now have <workstation> options in the nVidia control panel, but only "change ECC state", and looking at the BIOS with a hex editor clearly shows that those extra 33KB include the ECC option (this code sits after the normal 670 BIOS code). So it is now quite clear that a BIOS for the modded card is needed to get this thing actually behaving like a K5000; the soldering alone is just a name change without some hardcore BIOS editing.

And could you try to edit the BIOS with the right shader counts, etc., and flash it again?
Thx :) Your work is appreciated!
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on August 08, 2013, 03:12:00 am
That is indeed interesting - does the ECC toggle actually work? Does the memory amount shrink by 1/9? If so, any chance you could PM me a link where I could download the before+after BIOS for analysis?
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: Jager on August 08, 2013, 06:04:15 pm
That is indeed interesting - does the ECC toggle actually work? Does the memory amount shrink by 1/9? If so, any chance you could PM me a link where I could download the before+after BIOS for analysis?

The option does not work. I can enable it, but after a restart there is still a small star next to the checked checkbox saying it will be enabled after a restart. It definitely tries to enable it on boot, because the screen blanks in a way it normally doesn't.

I did some testing with TriDef and it seems the outputs in dual projection are synced after all; this was also on a card without mods. So I think it is nowadays native behaviour with GeForce too, up to 2x SLI. Now I just need to find another way to enable dual passive stereo, and buying a K6000 is not an option :)
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: Soulnight on August 08, 2013, 06:20:40 pm
TriDef doesn't support Nvidia SLI... You would instead need a beast: a GTX 780 :) Is it nice to play TriDef games in passive 3D at 1080p 60Hz?  :P How do you know it is synchronized between both projectors?
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: Jager on August 08, 2013, 08:51:40 pm
TriDef doesn't support Nvidia SLI... You would instead need a beast: a GTX 780 :) Is it nice to play TriDef games in passive 3D at 1080p 60Hz?  :P How do you know it is synchronized between both projectors?

To be honest, 1080p 60Hz is not that great because of the framerate. TriDef also supports dual projection only for DX9. Many of my favourite games do not work as well with 3D Vision + Helix mod. It is quite easy to tell when the 3D is out of sync; maybe I'll do a side-by-side video to confirm this...

Too bad the 670 to K5000 mod was a "fail", but I will keep looking. Maybe nVidia Inspector's stereo settings could help, and there are 3D Vision hacks etc...
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: Soulnight on August 08, 2013, 09:34:16 pm
TriDef doesn't support Nvidia SLI... You would instead need a beast: a GTX 780 :) Is it nice to play TriDef games in passive 3D at 1080p 60Hz?  :P How do you know it is synchronized between both projectors?

To be honest, 1080p 60 Hz is not that great because of the frame rate. TriDef also supports dual projection only for DX9, and many of my favourite games do not work well with 3D Vision + Helix mod either. It is quite easy to tell when 3D is out of sync; maybe I will do a side-by-side video to confirm this...

Too bad the 670-to-K5000 flash was a fail, but I will keep looking. Maybe NVIDIA Inspector's stereo settings could help, and there are 3D Vision hacks etc...

The best and only solution for 3D Vision dual-projector passive 3D is still this demultiplexer for $3000, which is 3D Vision ready for 120 Hz 1080p 3D gaming and works with any NVIDIA card:
http://www.mviewtech.com/listen.asp?ProdId=111025104433 (http://www.mviewtech.com/listen.asp?ProdId=111025104433)
http://www.dhgate.com/product/ma2p202-dvi-single-channel-active-to-passive/144319832.html (http://www.dhgate.com/product/ma2p202-dvi-single-channel-active-to-passive/144319832.html)
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on August 10, 2013, 12:13:24 am
That is indeed interesting - does the ECC toggle actually work? Does the memory amount shrink by 1/9? If so, any chance you could PM me a link where I could download the before+after BIOS for analysis?

The option does not work. I can enable it, but after a restart there is still a small star next to the checked checkbox saying it will be enabled after a restart. It definitely tries to enable on boot, because the screen blanks in a way it normally would not.

A before+after BIOS check would still be handy if it is different.
The other possibility, if you used a 3rd party VBIOS flash package, is that it also brought an FPGA blob with it and somehow flashed that onto the card via some less documented means, and when you downgraded the BIOS that wasn't reverted.

If all your video outputs work fine, I see no reason to not use a Quadro BIOS - but you will still need to modify the straps in the BIOS to match the hard-straps you need to override (if any) and you may also want to change the clock speeds and timings to the GeForce spec.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: Jager on August 13, 2013, 04:11:48 pm
That is indeed interesting - does the ECC toggle actually work? Does the memory amount shrink by 1/9? If so, any chance you could PM me a link where I could download the before+after BIOS for analysis?

The option does not work. I can enable it, but after a restart there is still a small star next to the checked checkbox saying it will be enabled after a restart. It definitely tries to enable on boot, because the screen blanks in a way it normally would not.

A before+after BIOS check would still be handy if it is different.
The other possibility, if you used a 3rd party VBIOS flash package, is that it also brought an FPGA blob with it and somehow flashed that onto the card via some less documented means, and when you downgraded the BIOS that wasn't reverted.

If all your video outputs work fine, I see no reason to not use a Quadro BIOS - but you will still need to modify the straps in the BIOS to match the hard-straps you need to override (if any) and you may also want to change the clock speeds and timings to the GeForce spec.

gordan, I sent a PM with the before/after BIOS. I use the latest nvflash (DOS) to flash the BIOS.

I also have an 8800GT lying around, so I flashed it to an "FX 3700" - same thing. With the modded 8800GT BIOS no Quadro features are actually enabled, just the name change here as well. Has any Quadro mod ever been made that actually enables Quadro features? How does NVIDIA actually check for these features? When doing 3D it is possible to change 3D modes that are not listed in the control panel by alt-tabbing and editing a registry key. Would it be possible to make an injector that enables the features? The ECC option I have now was somehow left behind in the BIOS, so maybe NVIDIA is just looking at some flags in the BIOS and enables the feature even when the card cannot actually do it. Many Quadro features are possible with GeForce anyway, as in Linux you can do Mosaic etc. with a GeForce too.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: jukk on August 13, 2013, 09:20:08 pm
I also have an 8800GT lying around, so I flashed it to an "FX 3700" - same thing. With the modded 8800GT BIOS no Quadro features are actually enabled, just the name change here as well. Has any Quadro mod ever been made that actually enables Quadro features? How does NVIDIA actually check for these features? When doing 3D it is possible to change 3D modes that are not listed in the control panel by alt-tabbing and editing a registry key. Would it be possible to make an injector that enables the features? The ECC option I have now was somehow left behind in the BIOS, so maybe NVIDIA is just looking at some flags in the BIOS and enables the feature even when the card cannot actually do it. Many Quadro features are possible with GeForce anyway, as in Linux you can do Mosaic etc. with a GeForce too.

Are you claiming that there is no gain in changing the ID from GeForce to Quadro? I thought it would at least unlock restricted driver features in Linux, namely Mosaic with more than two displays and 3D Vision, both of which are currently only possible with Quadro cards. That's why the author started this thread in the first place.

Edit: I see there is now support for Base Mosaic in the latest Linux drivers (not sure which versions). And on the 3D Vision comment (note to myself): it is a DirectX-only thing. Basically there is 3D support in Linux with OpenGL, but not using the NVIDIA driver; the Quadro 3D support seems to be something different. So actually you are right - there is no point in converting to Quadro for these features alone.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on August 16, 2013, 06:59:29 am
Got your PM. Will look at dissecting the files soon. I will reserve judgement until after I've had a look through them. My suspicion is currently leaning toward the driver just checking the device ID in the BIOS (as opposed to only the straps). On the K series the cards have UEFI crypto stuff in the BIOS, so it is more difficult to change anything in the BIOS other than the straps (unless you just completely rip out the UEFI headers and the trailing certs and other garbage). Since most of the hacks focus on the device ID (as opposed to what the BIOS thinks the device ID is), it is possible this is overlooked.

Nevertheless, a Quadro BIOS should be very easily modifiable to suit using the existing tools (or just manual hacking) - mainly just a case of tweaking clock speeds and timings. In some cases it will work out of the box: e.g. my GTS450 cards work just fine with a Quadro 2000 BIOS. Unfortunately, this does not restore the missing GL functionality. There are three possibilities here:

1) That functionality is laser-cut out of the GPU - no hope of restoring that.
2) Pin shorting on the GPU to disable the feature (e.g. caps across pins) - plausible, but I've not been able to establish a pattern here; all the GTS450 cards I have had were mutually different in that regard, and different from the Quadro 2000 as well.
3) Secondary firmware somewhere that initializes an FPGA somewhere to do the normally crippled GL stuff properly.

My suspicion is that 1) is the case since the Quadro and GeForce GPUs have different part numbers stamped on them, and this likely happens before they are fitted to the PCBs.

I don't think there is a Mosaic utility for Linux, but I could be wrong - you can set up multiple displays using xorg.conf anyway (e.g. when you're using an IBM T221). I don't remember seeing any driver options for adding spacing between monitors, but I haven't looked. I certainly haven't noticed any difference in features between a real Quadro 2000 and a fake one (other than the fact that one works for VGA passthrough in Xen and the other does not). And while we're on the subject, on a GTS450, modding to a Q2000 does improve performance of some SPEC tests (Maya goes up by about 40%, although it is still well short of a real Q2000 score on it). On a GTX580/Q7000, there is no difference in SPEC performance whatsoever before/after the mod (in both cases the mod consisted of strap and BIOS device ID changes, nothing else). In the GTS450 case a real Q2000 BIOS made no further improvement - in fact it made the scores worse due to the lower clock speeds being set.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: Jager on August 16, 2013, 05:41:23 pm
gordan, at least the latest beta GeForce drivers for Linux contain support for Mosaic. Here is a line from the driver notes: "Added support for configuring SLI Mosaic and Base Mosaic in the "X Server Display Configuration" page of nvidia-settings".

I think many of the Quadro features are supported at the hardware level on GeForces too. The Mosaic utility, for example, reports sync=1 for a vanilla GTX670 but not for a real Q2000 on XP.

Maybe some skilled coder could do injection/hook hacks to enable those missing features...
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on August 16, 2013, 11:55:30 pm
If the feature is indeed dependent on something in the BIOS, I think you're better off working on a BIOS mod based on the Quadro BIOS, and hoping it's not crippled in hardware in some obscure way (more obscure than the strap resistors).
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: arghalhuas on August 27, 2013, 05:46:02 am
Many thanks for the efforts of gnif and verybigbadboy.
However, I have an EVGA GTX670 with the same PCB layout as the GTX660Ti,
so I had to work out the modification myself; here is the result.
For the 4th digit, as everyone already knows, it is set by the positions of resistors 1 and 2.
Depending on which card you have, you remove resistor 1 and fit the Tesla (40K), Grid K2 (40K) or Quadro (15K) value on resistor 2.
The 3rd digit is the tricky part.
The low side of it is resistor 4, on the top side of the PCB.
You don't need to do anything for a Tesla K10.
However, if you want to change the card to a Quadro K5000 or Grid K2,
you need to remove resistor 4 and install resistor 3 MANUALLY, since there is no footprint for resistor 3 on the GTX670 and GTX660Ti PCB.
As you can see in my attached bottom-side photo of the "rework",
you need to connect EEPROM pin 6 through a 20K Ohm resistor pulled up to VCC.
My rework is quite ugly but it works fine!
Please be careful - modify your card at your own risk!!

Summary
GPU name        Resistor 1 (4th digit, 0-7)    Resistor 2 (4th digit, 8-F)    Resistor 3 (3rd digit, high)    Resistor 4 (3rd digit, low)
GTX 660 Ti      20K                            none                           none                            25K
GTX 670         none                           10K                            none                            25K
Tesla K10       none                           40K                            none                            25K
Quadro K5000    none                           15K                            20K                             none
Grid K2         none                           40K                            20K                             none

Should this work with a GTX 670MX? Can this one be turned into a K5000M?

I bought a laptop with a 120 Hz screen, which is compatible with 3D Vision Pro. The problem is that, with Linux, 3D Vision Pro only works with Quadro cards but not with GeForce ones. That is why I might be interested in trying to make a Quadro out of my GeForce...
Title: Re: Hacking NVidia Cards into their Professional Counterparts
Post by: mvrk on August 27, 2013, 11:42:44 pm

Yes, running Crysis in a VM on a Quadrified GTX480 (Quadro 6000) on my 2nd T221 and there is no tearing whatsoever. Nor was there any tearing on my other T221 with the 8800GT card, but there is massive tearing visible with the Radeon 4850.


Hi, I've got an NVIDIA GeForce GTX 480 and I would like to transform it into a Quadro 5000 so I can use it with VMware Horizon View vSGA... Can you post the how-to for modifying the GTX480 into a Quadro 6000, please? Or is it already on this forum and I missed it?
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: mrkrad on September 02, 2013, 12:13:15 am
Yeah, I second the sVGA driver for ESXi - can the driver be hacked? Also, how come the earlier FX 3800/4800/5800 (noted to have Multi-OS support) are not workable with the ESXi vSGA API-intercept protocol?

I'm guessing the Grid K1/K2 will be able to do real SR-IOV in the future, but it seems that the NVIDIA driver is doing API intercept with some sort of virtualization driver, similar to how they can combine multiple GPUs with the internal GPU.

Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: opoeta on September 04, 2013, 03:29:37 am
Good afternoon. I'm thinking of turning my GTX770 into a K5000, but from the suppliers here in Brazil I cannot find the exact resistors. For example, instead of 40K the closest value available here is 40.2K - can I use that resistor, or will the difference matter?
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: opoeta on September 04, 2013, 09:33:07 pm
Good morning. I successfully tested the mod on my GTX770, turning it into a K5000. As I could not find the right resistors, I used 50K ohm trimmers, like these: (http://www.huinfinito.com.br/383-494-large/trimpot-multivoltas-vertical-100k-carenagem-curta.jpg)
I set one to 15K and one to 40K; the board used was a Zotac GTX770 AMP!

Below are the board I tested on and the reference links. I will do more stability testing later. If anyone needs the BIOS, I can upload it.

http://www.zotacusa.com/geforce-gtx-770-zt-70303-10p.html (http://www.zotacusa.com/geforce-gtx-770-zt-70303-10p.html)
(http://www.zotacusa.com/media/catalog/product/cache/1/image/9df78eab33525d08d6e5fb8d27136e95/z/t/zt-70303-10p_image2.jpg)
(http://i.imgur.com/3X8pbpO.jpg)
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: Soulnight on September 08, 2013, 02:13:23 am
Nice mod on the GTX 770! Was it exactly like on the GTX 680?

And more importantly: what did you gain from making the mod? Do you have any new functionality?
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on September 08, 2013, 06:32:34 am
Small update guys.

I finally got around to playing a little more with my GTX680. Soldering 0402 components manually is an absolute bitch even with solder paste, a decent magnifying lamp, good eyes and steady hands.

Findings:

1) Removing both resistors (so all 0-4 resistor locations are empty) results in device ID of 0x11BF, i.e. Grid K2 - which is what I was aiming for anyway. From there on I can soft-mod to K5000 or GTX680MX if required (or anything else with IDs between 0x11A0 and 0x11BF).

2) In K2 mode, the card works for VGA passthrough on Xen. Sort of. Almost. It works fine at up to 1280x800. If I select any res higher than that, it fails. As far as I can tell, the monitor is told to go into sleep mode. Tested with 320.49 and 320.78 drivers. Has anyone else found this? I haven't done any BIOS modding yet, but did anyone else see a similar issue? Is this something Nvidia did in recent drivers to cripple modified cards when running in a VM? I tested the K2-ified card in another bare metal machine with the same monitors, and in all cases there it works fine. But on my VM host, when passed through to a VM, it works great up to and including 1280x800, and the screen just remains blank at higher resolutions. Talk about bizarre.

This is an interesting finding - my soft-Quadrified GTS450 (Q2000), GTX470 (Q5000), and GTX480 (Q6000) cards work just fine under the exact same conditions. I wonder if this is some kind of an obscure compatibility issue between Grid and Qx000 cards in the same machine since they have different size memory apertures - something could be getting confused.

Until I can get this resolved, modification of my GTX690 is on hold.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: gamezr2ez on September 08, 2013, 02:13:00 pm
1) Removing both resistors (so all 0-4 resistor locations are empty) results in device ID of 0x11BF, i.e. Grid K2 - which is what I was aiming for anyway. From there on I can soft-mod to K5000 or GTX680MX if required (or anything else with IDs between 0x11A0 and 0x11BF).

Did you add two 40k resistors in the correct locations? If you did not, this could be the cause.

verybigbadboy notes he has some stability problems when they are not on.
I do not have these stability problems (but I did add them on for good measure a month back).
You may be having problems because of this that neither one of us experienced. Try adding the resistors.

In K2 mode, the card works for VGA passthrough on Xen. Sort of. Almost. It works fine at up to 1280x800. If I select any res higher than that, it fails. As far as I can tell, the monitor is told to go into sleep mode. Tested with 320.49 and 320.78 drivers.

I am running Xen 4.2.2 with no patches (save a SLIC table I added in to activate Windows). The unofficial nVidia patches do not have to be used, but they did work for me if you wanted to do GPU passthrough without the Cirrus card. My current graphics driver is 320.00 (http://www.nvidia.com/object/quadro-tesla-grid-win8-win7-winvista-64bit-320.00-whql-driver.html). Both the GeForce and Quadro/Grid drivers give me the same performance. I have not upgraded to test the new ones. Try that revision and see if it helps.

Soldering 0402 components manually is an absolute bitch even with solder paste, a decent magnifying lamp, good eyes and steady hands.

Toaster oven is the way to go. Heat gun if you are in a hurry. I use the iron to touch up and desolder. Occasionally I will use it to do one 0402. Any more than that and it gets thrown into the toaster oven nowadays.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: karakarga on September 08, 2013, 04:52:54 pm
Hi to all, I have just become a member, and wish to ask!

The September issue of Turkey's Chip Computer Magazine quoted a "GTX670 to Quadro K5000" mod from the German Chip Magazine crew, with a PDF link at http://chip.tk/13W2MpS (http://chip.tk/13W2MpS)!

They used a Zotac 2GB GTX670 AMP Edition. But for resistor 0 they used 20K instead of 40K! :o

Which one is correct?
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on September 08, 2013, 08:39:38 pm
1) Removing both resistors (so all 0-4 resistor locations are empty) results in device ID of 0x11BF, i.e. Grid K2 - which is what I was aiming for anyway. From there on I can soft-mod to K5000 or GTX680MX if required (or anything else with IDs between 0x11A0 and 0x11BF).

Did you add two 40k resistors in the correct locations? If you did not, this could be the cause.

verybigbadboy notes he has some stability problems when they are not on.
I do not have these stability problems (but I did add them on for good measure a month back).
You may be having problems because of this that neither one of us experienced. Try adding the resistors.

I'm not convinced. The stability issues I have seen mentioned are related to the PCI device ID changing randomly. I am not seeing that - it is always 0x11BF. I am only seeing the 1280x800 limitation on my VM host; on bare metal in another machine it works fine.

In K2 mode, the card works for VGA passthrough on Xen. Sort of. Almost. It works fine at up to 1280x800. If I select any res higher than that, it fails. As far as I can tell, the monitor is told to go into sleep mode. Tested with 320.49 and 320.78 drivers.

I am running Xen 4.2.2 with no patches (save a SLIC table I added in to activate Windows). The unofficial nVidia patches do not have to be used, but they did work for me if you wanted to do GPU passthrough without the Cirrus card. My current graphics driver is 320.00 (http://www.nvidia.com/object/quadro-tesla-grid-win8-win7-winvista-64bit-320.00-whql-driver.html). Both the GeForce and Quadro/Grid drivers give me the same performance. I have not upgraded to test the new ones. Try that revision and see if it helps.

I am using Xen 4.3.0 and the same setup works fine with faux-Quadro 2000, 5000 and 6000 cards. I am using XP64. This is probably the big difference between my setup and everyone else's, but I have tried it on bare metal with XP64 and it works fine there.

Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: gamezr2ez on September 09, 2013, 04:50:13 am
Your "monitor is told to go into sleep mode" sounds like the card is locking up. I don't know what your setup is, what info can you get from the hypervisor when that happens? When my card would lockup just like you are describing, the ID would still read 0x11BF. It wasn't until I popped on a 100k resistor to R2 that it no longer locked up just like you are describing. You should try it instead of dismissing it.

As for using XP64, I haven't tested this card in that environment. A previous card that I had working with that a few years back required I use the `stdvga=1` option to get rid of the CIRRUS card before it would work. I would have screwy results otherwise. It would either not work at all, or would crash when I changed resolution or launched a full screen game. Try a Win7 VM.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on September 09, 2013, 05:47:39 am
The card doesn't crash/lock up - if I don't click the button to keep the new mode, it reverts back to the previous mode after 15 seconds, at which point it works again. And it works fine on a different machine (bare metal XP64, different motherboard).

I'll put some 40K resistors on it instead of leaving them off and see if it helps - stranger things have happened, so I'm not prepared to dismiss anything at this point.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: opoeta on September 09, 2013, 09:51:14 pm
Small update guys.

I finally got around to playing a little more with my GTX680. Soldering 0402 components manually is an absolute bitch even with solder paste, a decent magnifying lamp, good eyes and steady hands.

Findings:

1) Removing both resistors (so all 0-4 resistor locations are empty) results in device ID of 0x11BF, i.e. Grid K2 - which is what I was aiming for anyway. From there on I can soft-mod to K5000 or GTX680MX if required (or anything else with IDs between 0x11A0 and 0x11BF).

2) In K2 mode, the card works for VGA passthrough on Xen. Sort of. Almost. It works fine at up to 1280x800. If I select any res higher than that, it fails. As far as I can tell, the monitor is told to go into sleep mode. Tested with 320.49 and 320.78 drivers. Has anyone else found this? I haven't done any BIOS modding yet, but did anyone else see a similar issue? Is this something Nvidia did in recent drivers to cripple modified cards when running in a VM? I tested the K2-ified card in another bare metal machine with the same monitors, and in all cases there it works fine. But on my VM host, when passed through to a VM, it works great up to and including 1280x800, and the screen just remains blank at higher resolutions. Talk about bizarre.

This is an interesting finding - my soft-Quadrified GTS450 (Q2000), GTX470 (Q5000), and GTX480 (Q6000) cards work just fine under the exact same conditions. I wonder if this is some kind of an obscure compatibility issue between Grid and Qx000 cards in the same machine since they have different size memory apertures - something could be getting confused.

Until I can get this resolved, modification of my GTX690 is on hold.

gordan, could you explain here how you transformed your GTX4xx into a Quadro? I also saw that you turned your GTX580 into a Quadro 7000 - could you give us the method?
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on September 11, 2013, 01:23:21 am
opoeta, I'm writing up the process at the moment. I need to do a bit more testing and re-testing - I modified my cards a few months ago and I need to get them out of my production machine before I can re-test them to make sure that the writeup is correct - I wouldn't want to cause any inadvertent bricking. Once I've re-tested and written it up, I'll post a link here. Unfortunately, I have to get the GTX680 working first - that can then replace one of my 4xx series cards that I can then use to re-test the procedure.

I've been meaning to do this for the past month, but something more important always comes up just when I think I have a few hours put aside for GPU hacking. Apologies for the delay. :(
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on September 11, 2013, 05:09:56 am
It looks like the K6000 drivers and device ID are now available:
http://us.download.nvidia.com/XFree86/Linux-x86/319.49/README/supportedchips.html (http://us.download.nvidia.com/XFree86/Linux-x86/319.49/README/supportedchips.html)

Only a matter of time before somebody finds the correct resistors on a Titan to modify. gnif and verybigbadboy, I'm looking at you ;)
Anyone interested in doing this if there is a donation round to cover the cost of a sacrificial Titan?
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: Soulnight on September 11, 2013, 05:19:15 am
YEAH... but what for, if there are no additional functionalities available? No support for NVIDIA Mosaic, for example. Or did I miss something?
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on September 11, 2013, 06:22:52 am
Virtualization. It may or may not work - but it's worth a shot. For me, only the cards that are listed as Multi-OS have worked (Quadro 2000/5000/6000 and Grid K2), but not the Tesla K10 or Quadro 7000. Other people have reported that the K5000 works for them for virtualization (I have just confirmed this myself), so there is a reasonable chance that the K6000 will work too. You know - for when half of a K2 just isn't quite enough. :)
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: opoeta on September 12, 2013, 02:14:55 am
Waiting for the Q6000 tutorial.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on September 12, 2013, 06:30:54 am
For the record - putting the 40K resistors in positions 0 and 2 did NOT solve my problem of the card seemingly no longer being able to handle modes above 1280x800 when virtualized. It still works absolutely fine on a different bare metal machine at all resolutions and in 3D applications. When I try to set the resolution to anything above 1280x800, the monitor goes to sleep as if the input signal disappeared. And since I don't click the button to keep the new mode, the OS reverts back to the previous resolution, and the screen output comes back. Very strange.

Edit: And it gets weirder. If I plug in my old 17" VGA monitor that can do 1280x1024 - the card happily outputs 1280x1024 to that over VGA. Which makes me wonder if something bizarre is happening with the second DVI link in VM mode. I can test for that - my backup T221 is running on SL-DVI connections. Lo and behold, that comes up at 3840x2400@13Hz. So for some reason, when running virtualized, my faux Grid K2 refuses to run in DL-DVI mode on both of its ports. But when running on a bare metal machine, it works fine. W-T-F. This makes me wonder whether this is a Grid K2 "feature". Has anyone got DL-DVI working in a VM with a gridified GTX680?

Edit 2: I just soft-modded the card to a K5000. No change - for some reason, whenever the second DVI channel gets enabled, it all goes wrong.

Has anybody seen this issue before?
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: gamezr2ez on September 12, 2013, 11:40:39 am
The card doesn't crash/lock up - if I don't click the button to keep the new mode, it reverts back to the previous mode after 15 seconds, at which point it works again. And it works fine on a different machine (bare metal XP64, different motherboard).

Then the issue is most likely with XP64, Xen and GPU passthrough. XP as a whole was never meant for virtualization, and support for it is just hacked together. Try Windows 7 instead of an operating system that is older than Xen itself.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on September 12, 2013, 06:58:22 pm
I'll try Windows 7 for the sake of completeness, but unfortunately, using it is not an option for me, due to a complete lack of desktop spanning functionality (T221s show up as 2 or 4 discrete monitors). In XP this works fine, in Vista and later the functionality has been removed. Windows 8 allegedly adds it back, but suffering Metro on top of spending more on two Windows 8 licences than I spent on my GTX680 is something I am not prepared to do.

Another thing I might try is XP64 on bare metal on my VM machine - just to eliminate the possibility of some utterly bizarre motherboard-influenced issue.

My current workaround is to swap my primary and secondary T221s around, so my gaming VM is running 2xSL-DVI. That works around the DL-DVI not working, albeit by limiting the refresh to 25Hz (33Hz with a custom mode).

For my other VM connected to a standard 30" monitor, I might just have to jump ship back to ATI. :(

One of these days a monitor manufacturer will make something that actually beats a T221 on pixel count (>= 3840x2400, i.e. more than the current 4K screens) in comparable size (<= 24"). But for now, the 12 year old technology is still unbeaten.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: mrkrad on September 13, 2013, 12:33:11 am
I've got a dual DL-DVI Quadro 6000 BIOS ;)

It seems the original discussion on strapping the 4xx series to Quadro is all based on the post that shows how to strap bits 0,1,2,3,4, but not on actually flashing the Quadro BIOS.

Has anyone actually used this stuff with ESXi 5.5? Supposedly it supports Intel/AMD GPUs in the new version.

It seems like the problems are:

SR-IOV support (some have it?) / VT-d
FLR (Function Level Reset) - without it your VMs will crash the card upon reset (this happens a lot in Windows Vista/7/8)

Is VGX just SR-IOV/MR-IOV with FLR?

API intercept seems cool, but to actually pull off FLR + SR-IOV on NVIDIA would be the real trick!

It seems the Xen guys are light years ahead of everyone else as far as getting things to work - and AMD GPUs seem to have far superior support.

But if ESXi 5.5 can handle Intel HD and AMD GPUs, perhaps we need to just take a look at the new version?
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on September 13, 2013, 01:45:51 am
NONE of the modifications are based on flashing a Quadro BIOS onto the card. I did limited testing on this (Q2000 BIOS on a GTS450), and all that achieves is lowering the clock speeds on the GTS450 down to what they are on a Q2000. There is still some stuff in my TODO queue to investigate things like ECC support, though.

To do it properly, you should really change at least a few more things in addition to the straps, e.g. the record in the BIOS containing the PCI ID (at around position 0x18E on 4xx cards), as well as the board/card identification strings.
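
To make that concrete, here is a rough Python sketch of that kind of edit - patching the device ID at a given offset and re-fixing the trailing checksum byte (the byte that makes all bytes of the image sum to zero mod 256). The 0x18E offset is the one quoted above for 4xx cards and the target ID below is an assumption to illustrate the idea - verify both against your own dump before flashing anything, as a bad image can brick the card.

Code: [Select]
# Sketch only - offset and target ID are assumptions, double-check against your own dump.
import struct

def patch_device_id(rom: bytearray, offset: int, new_id: int) -> None:
    # PCI IDs are stored little-endian
    rom[offset:offset + 2] = struct.pack("<H", new_id)

def fix_checksum(rom: bytearray) -> None:
    # Assumes the checksum is the last byte of the dump; makes the byte sum 0 mod 256
    rom[-1] = 0
    rom[-1] = (-sum(rom)) & 0xFF

with open("gtx470.rom", "rb") as f:          # hypothetical input dump
    rom = bytearray(f.read())
patch_device_id(rom, 0x18E, 0x06D8)          # 0x06D8 = Quadro 5000 (assumed ID)
fix_checksum(rom)
with open("gtx470_q5000.rom", "wb") as f:
    f.write(rom)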

FLR is nowhere near as required as people indicate. If you pass the whole device (i.e. VGA + HDMI audio) it works fine, and with a Quadro (a real one or a modified GeForce) rebooting VMs works just fine, provided your motherboard BIOS and PCIe bridges aren't buggy (some careful research is needed there if you're looking to buy a suitable motherboard, otherwise you will spend weeks troubleshooting and writing hypervisor patches, and you're pretty much screwed if you use a closed-source hypervisor).

For the record - I have Quadro 2000, 5000, and 6000 cards (real and faux 2000, modified GeForce 5000 and 6000) and rebooting VMs works fine (most of the time - I have problems due to various hardware/firmware bugs that plague the EVGA SR-2). It is ATI cards that suffer from the rebooting crashes and performance degradation after reboots. No GPUs available today have FLR, including real Quadros or real FirePros. It is not needed if the BIOS and drivers are doing their job.

VGX, and the new Xen project that implements something similar, exposes a guest VM driver API that offloads the GPU tasks onto a real GPU shared between multiple VMs. Nothing to do with FLR whatsoever. It is essentially a virtualized GPU driver API designed to allow you to share GPU processing between multiple guests.

FWIW, I have had much better luck with Quadrified Nvidia cards for Xen virtualization than with ATI cards. ATI cards have far too many limitations (only a single DL-DVI port from the HD5xxx series onward), don't work properly with multi-monitor spanning in my experience (at least not on IBM T221s), and suffer from the VM reboot bugs in the BIOS and drivers (e.g. even if the VM doesn't crash on a reboot, the performance is degraded). Then again, I seem to have just hit an XP Quadro driver bug that breaks DL-DVI in a VM (but not on bare metal). So neither is perfect, but Nvidia just seems to suck a lot less than ATI (even though I may have to resort to using an ATI card for one of my VMs if I can't work around the DL-DVI problem on the GTX680-based K5000/K2).

On a separate note, device reset can actually be implemented in multiple different ways, not only via FLR. For example, Xen also supports a method using PCIe power management: putting the card into the powered-down state and bringing it back results in the device being reset (if it implements the PCIe power saving functionality properly).
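
As a rough illustration of what that looks like when driven by hand on Linux, here is a sketch that cycles a device through D3hot and back to D0 via its power management capability in config space. The BDF is a placeholder, it needs root, and whether the card actually resets depends on it implementing the PM functionality properly - treat it as an illustration, not a recipe.

Code: [Select]
# Sketch only - placeholder BDF; the device must honour the D3hot -> D0 transition.
import time

BDF = "0000:01:00.0"                                   # placeholder bus/device/function
CFG = "/sys/bus/pci/devices/%s/config" % BDF

def read8(f, off):
    f.seek(off)
    return f.read(1)[0]

def find_pm_cap(f):
    # Walk the standard capability list (pointer at 0x34) looking for cap ID 0x01 (PM)
    ptr = read8(f, 0x34) & 0xFC
    while ptr:
        cap_id, nxt = read8(f, ptr), read8(f, ptr + 1) & 0xFC
        if cap_id == 0x01:
            return ptr
        ptr = nxt
    raise RuntimeError("no power management capability found")

with open(CFG, "r+b", buffering=0) as f:
    pmcsr = find_pm_cap(f) + 4                         # PMCSR register, power state in bits 1:0
    cur = read8(f, pmcsr)
    for state in (0x3, 0x0):                           # D3hot, then back to D0
        f.seek(pmcsr)
        f.write(bytes([(cur & 0xFC) | state]))
        time.sleep(0.1)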
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: mrkrad on September 13, 2013, 07:24:35 am
http://gfxspeak.com/wp-content/uploads/2013/09/VGX-GPU-virtualization-Nvidia.png (http://gfxspeak.com/wp-content/uploads/2013/09/VGX-GPU-virtualization-Nvidia.png)

better example: http://on-demand.gputechconf.com/gtc/2013/presentations/S3501-NVIDIA-GRID-Virtualization.pdf (http://on-demand.gputechconf.com/gtc/2013/presentations/S3501-NVIDIA-GRID-Virtualization.pdf)

more:
http://www.nvidia.com/object/cloud-gaming-gpu-boards.html (http://www.nvidia.com/object/cloud-gaming-gpu-boards.html)



I thought what they are saying here is not API intercept?

API intercept doesn't give you CUDA, OpenCL or DirectX, right?

I'm still trying to figure out how they can manage fair-share loading of a video card, since a VM could potentially tear ass on a video card with true hardware virtualization.

Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on September 13, 2013, 08:54:43 am
I was under the impression that the VMware way involves passing a GPU to a VM. Grid K2 is a GTX690 i.e. two GTX680s. So you can "share" a card between two VMs, but in reality you are not sharing GPUs between VMs. Grid K1 is the same sort of thing, only it comes with 4 lower-spec GPUs (for passing to up to 4 VMs) rather than two high spec ones. In other words, you cannot split a Grid K2 more than 2 ways, and you cannot split a Grid K1 more than 4 ways.

The new Xen way is quite radically different (and far more ambitious), from what I gather, but I haven't really looked into it in great depth since it was only made public yesterday.

My original plan was to mod both halves of a GTX690 into a Grid K2 as per the hard-mod in this thread (you only have to hard-mod the first byte's resistor; the 2nd byte you can soft-mod), and pass one to each of my VMs (for me and my wife). Unfortunately, the DL-DVI issue with the GTX680 has put that plan on hold - I don't want to waste my time modifying a GTX690 if it's going to prove to be equally unusable (I need at least one DL-DVI working for my wife's 2560x1600 monitor). My plan B is to split the 690 into a GPU for the host and a GPU for my VM (I can live with a T221 running off 2xSL-DVI at 3840x2400@32Hz for gaming), and get something like an ATI 7970 for her VM (most (but not all) of the problems I've had with ATI cards are T221-related, and even though they do suffer issues with VM reboots, we hardly ever need to reboot our VMs - XP64 is extremely stable).
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: mrkrad on September 13, 2013, 10:12:50 am
right now you have API intercept - and coming soon will be virtualization.

Do you remember how the video card virtualization worked? The laptop ran on Intel (or a virtual GPU), then it would switch to AMD/NVIDIA but still output video from the Intel HD port! It is all based around SLI, and I'm guessing that if you really think about it you could SLI a software GPU and then add hardware assist (a real GPU, dVGA) or API intercept (sVGA), but you are still rocking the software driver (think of that as the route of the video stream).

So the software NVIDIA driver drops to 1% of the load and 99% goes to the hardware GPU (if available) - this way you can vMotion to another server without a graphics card and still continue gaming (just 30 fps down to 1 fps! lol).

Right now everyone does it this way. Then Hyper-V/Xen use hardware transcoding to offload (like Intel's Quick Sync), but it happens at the same time. I think this is Kepler technology.

The last step, which I have not seen in production anywhere, would be true SR-IOV + FLR + QoS to separate the tasks (if you think API intercept to a consumer video card is safe - not! It's easy to glitch and crash, or to access system RAM / other VRAM).

That's why I wish for true SR-IOV/FLR or MR-IOV (blade systems) - security. Stability as well, since if you crash one virtual function of the GPU you don't take out all the other VMs' video functions or crash the host.

Pretty much, if my VMware host crashes or loses power, it is completely rebuilt. Period. Full reinstall and format of all components. I seriously doubt that VMware ESXi using X Mosaic and API intercept is going to provide stability. Otherwise, what is the point of using Intel VT technology? We could do API intercept (binary translation) from the first days of virtualization, or "double-DOS" lol. Definitely not fast nor secure.


Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: mrkrad on September 13, 2013, 10:29:54 am
Well, there is one thing we haven't explored: Tesla cards can send to a Mellanox ConnectX-3 in 40GbE or 56Gb FDR IB mode directly, so it does not touch the CPU (GPUDirect). It had many flaws though, since mapping direct memory with virtualization is difficult (bypassing the CPU virtualization violates memory protection!).

GPUDirect could then output the video stream without compression to 10GbE (40GbE/56Gb IB) and you don't have all that lag. Given that you can get a 24-port 10GbE switch for $1200-1500 now and a cheap dual-port 10GbE NIC for $75, to me this would be the best way! I don't care about "WAN/PCoIP".

I am tempted to see if you could bridge a virtual USB GPU over 10GbE - that would be awesome! I think everyone is so focused on limited-bandwidth WAN and gigabit that they ignore the most direct path, which is to use a ton of raw bandwidth and run the GPU remotely with a PCIe-tunnel effect. It has to be possible!
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on September 13, 2013, 10:34:03 am
I can easily believe that you can do this over infiniband. Infiniband works using DMA, and in fact does RDMA. So it is ideally suited for doing precisely what you are describing, in the sense that you can map the GPU BARs via infiniband and use remote GPU number-crunching as if the GPU was local. And if you are already running infiniband for this, why bother with ethernet at all? If you can live with the 15 meter cable length limit, it's the way forward. And it's dirt cheap compared to 10Gb ethernet (not to mention 2-4x faster).
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: baconsteak on September 13, 2013, 12:24:35 pm
Has anyone ever heard of this working on a G92 9800GT? I want to use OpenGL flip 3D.

I have tried changing the straps in the BIOS to 061A with nvflash --straps 0x7FFE23C3 0X1000A804 0X7FFEFFFF 0X00010000
It's recognised as an FX3700 now and the drivers install, but the 3D options aren't there, so I don't think it's working properly.

Is this because the hard straps are different? How can I check the hard straps? Nvflash only shows the soft straps. Or is it something unrelated to the straps?
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on September 13, 2013, 07:24:44 pm
Are you sure this works on a genuine FX3700? I have a laptop with a genuine FX3700M in it, but I don't recall seeing any extra options in the settings compared to a GeForce 260M it replaced. If you tell me what exact options you expect to see, I can look for them and report back.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: mrkrad on September 13, 2013, 11:17:54 pm
Anyone care to make a JavaScript calculator for straps?

Perhaps you could upload the file; it would check the values, give you options (only valid ones) and then tell you which nvflash --straps command to use.

What's going to happen is endian typos killing cards, but if you can make a webpage to help folks, it would reduce the number of failures.

Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on September 14, 2013, 05:01:47 am
Not an unreasonable idea. Pity the NiBiTor authors disappeared (or were disappeared, depending on how conspiracy-theoretic you want to be) - shame it wasn't an open source project, or it could have easily been extended to cover more recent developments.

Having said that - there is an argument that anyone attempting this really should know what they are doing and know their way around a soldering iron, a calculator and an editor. Darwinian force is a good thing.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: mrkrad on September 14, 2013, 05:58:26 am
Do you know what's up with those Mac folks selling "Quadro" fakes on eBay? It seems they turn anything GeForce into a Quadro somehow?

I wonder if they are doing some other trick to fool the server? Almost every card they are selling is identified as a Quadro 4000.

Maybe they know how to reload the VBIOS from RAM or something?

I'm not trying to get into the business at all - if I wanted to make money I'd just hack Intel NICs for Mac (Thunderbolt case, or Mac Pro), since you could buy a $99 Intel 10GbE NIC and modify the SmallTree/ATTO drivers to accept the standard Intel NIC. That would make one rich, since the Mac people love paying $999 for a $99 pre-tested NIC lolololol.
----------

Does anyone have TRUE VGX working at all? Hardware, not API intercept? I would be glad to donate to the cause for one video card with a decent amount of RAM that would actually do Grid K1 duties.

But it would have to be SR-IOV + FLR and work with Xen.

I suspect that if someone can do true VGX I would buy 2 or 4! Seriously. I can't afford a Grid K1 - they cost $1400, and a K2 costs $1800!! Way too rich for my blood.

Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on September 14, 2013, 06:42:08 am
I'm surprised they are modding GeForce cards into Quadro 4000 cards - since there is no direct GeForce equivalent. Modding a GTS450 into a Quadro 2000, GeForce 470 into a Quadro 5000 and GeForce 480 into a Quadro 6000 is pretty trivial, on the other hand.

I suspect the Mac requirements, however, include having a UEFI BIOS, which is trickier, since as soon as you modify the BIOS the crypto signatures in the UEFI part no longer match and the Mac will probably reject the hardware. Depending on how the crypto stuff is implemented, I'm sure it is defeatable, though. Worst case, you can get a genuine Quadro UEFI BIOS (from a Mac Quadro), find a GeForce card on which the important bits of the strap match (EEPROM type, clock generator type), and hard-mod it with resistor changes - and voila, you have a Mac Quadro for the price of a GeForce, a couple of 0402 resistors, and a few minutes of extreme-precision work with soldering gear (easy with years of experience and decent equipment - solder removal braid, a pro-grade soldering kit with hot air - difficult otherwise). I would imagine somebody experienced at it could knock out a whole bunch of modified cards per hour (removing and refitting the heatsinks would likely be the most time-consuming part).

Personally, I'm working in the other direction - stripping UEFI crap from the 6xx BIOS so I can soft-mod it more easily, plus some other advantages such as having a BIOS smaller than 64KB, which makes it loadable for primary VGA passthrough in Xen for bootstrapping the GPU inside the VM (so you get boot console output on the external monitor, rather than in a VNC console).
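
For anyone curious, here is a rough sketch of what that stripping involves, assuming the dump follows the standard PCI option-ROM layout (a 0x55AA header per image, a pointer to the "PCIR" structure at offset 0x18, the image length in 512-byte units at PCIR+0x10, the code type at PCIR+0x14 and the "last image" bit at PCIR+0x15). The offsets are from the PCI firmware spec as I remember it and the file name is hypothetical - verify against a real dump, and note that if you keep only the legacy image you would also need to set its "last image" bit and re-fix the checksum.

Code: [Select]
# Sketch only - offsets assumed from the standard PCI option-ROM layout, verify before use.
import struct

def split_images(rom: bytes):
    off = 0
    while off + 2 <= len(rom) and rom[off:off + 2] == b"\x55\xaa":
        pcir = off + struct.unpack_from("<H", rom, off + 0x18)[0]   # pointer to "PCIR" struct
        length = struct.unpack_from("<H", rom, pcir + 0x10)[0] * 512
        code_type = rom[pcir + 0x14]                                # 0x00 = legacy x86, 0x03 = EFI
        last = bool(rom[pcir + 0x15] & 0x80)
        yield code_type, rom[off:off + length]
        if last:
            break
        off += length

with open("gtx680.rom", "rb") as f:                                 # hypothetical dump
    images = list(split_images(f.read()))
for code_type, img in images:
    print("image type 0x%02x, %d bytes" % (code_type, len(img)))
legacy_only = b"".join(img for t, img in images if t == 0x00)       # the sub-64KB part to keep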

$1800 for a Grid K2 sounds downright cheap - only 3x the price of a GTX690. Until recently the difference has been 5-8x. Seems we are having a sobering impact on their pricing strategy.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: mrkrad on September 14, 2013, 07:31:03 am
I was curious about that - you mentioned you did up a card with a real BIOS.

The Kepler 2000/4000/5000/6000 (K-series) seem to be the cards to do.

Do you think it's possible to find a 2000- or 4000-compatible Kepler GeForce and actually splice the BIOS?

The Nvidia website mentions GRID but also says Kepler can do some of the functions as well.

However, it is the opposite of how ESXi works (GRID or non-Kepler).

And remember, the old FX 3800/4800/5800 are Multi-OS.

I think it would be interesting to compare the features of the Multi-OS cards and their GeForce partners!


2xx/3xx are software

4xx are software plus straps

5xx are hardware resistor plus straps

6xx are hardware resistor plus straps.

It seems the 4xx are the easiest to do and cheap to source for those of us who can't solder.

6xx would then be the next choice, since Kepler has something that Fermi does not.

Do you have a suggestion for a cheaper 6xx-series card to try the mod on?


BTW, I was looking at pricing. The low-end Quadros piqued my interest. What about cross-flashing? Notice something in common in this list???

Quadro 410    GK107    28    PCIe 3.0 x16    512          1800    192:16:8          14.4    DDR3[54]    64          11.0    4.4    38    <-- $50
Quadro K600    GK107    28    PCIe 2.0 x16    1024    876    876    891(1782)    192:16:16    14.0    14.0    28.5    DDR3    128    336.38       11.0    4.4    41    6.3" Card <-- $70
Quadro K2000    GK107    28    PCIe 2.0 x16    2048    954    954    1000(4000)    384:32:16    15.2    30.5    64    GDDR5    128    732.67       11.0    4.4    51    7.97" Card <--- $210
Quadro K2000D    GK107    28    PCIe 2.0 x16    2048    954    954    1000(4000)    384:32:16    15.2    30.5    64    GDDR5    128    732.67       11.0    4.4    51    7.97" Card <--- $250
Quadro K4000    GK106    28    PCIe 2.0 x16    3072    810.5    810.5    1404(5616)    768:64:24    19.4    51.9    134.8    GDDR5    192    1244.93       11.0    4.4    80    9.5" Card <--- $350
Quadro K5000    GK104    28    PCIe 2.0 x16    4096    705.5    705.5    1350(5400)    1536:128:32    22.592    90.368    172.8    GDDR5    256    2168.832    90    11.0    4.4    122    10.5" Card <-- $800

Quadro K500M    GK107    28    PCIe 3.0 x16    1024    850    850    1600    192:16:8    6.8    13.6    12.8    DDR3    64    326.4    11.0    4.4    Yes    35    
Quadro K1000M    GK107    28    PCIe 3.0 x16    2048    850    850    1800    192:16:16    13.6    13.6    28.8    DDR3    128    326.4    11.0    4.4    Yes    45
Quadro K2000M    GK107    28    PCIe 3.0 x16    2048    745    745    1800    384:32:16    11.92    23.84    28.8    DDR3    128    572.16    11.0    4.4    Yes    55

NVS 510    GK107    28    PCI-Express ×16    2048          1800    384:32:16          28.5    GDDR3    128       11.0    4.4    35    4× miniDisplayPort <--  $150
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on September 14, 2013, 10:54:55 am
Quadro x000 / GeForce 4xx/5xx series cards are Fermi, not Kepler based. Quadro/Grid K series and GeForce 6xx and 7xx cards are Kepler based.

GeForce 8xxx/2xx/3xx/4xx/5xx are all fully modifiable using soft-straps to change the ID (to equivalent Quadro/Tesla cards).
6xx is, too, but the scope for change is limited. For example, you can soft-mod a GTX680 into a Tesla K10 without touching any resistors.
Nvidia obviously got smart to it and didn't add the extra ID bits to the soft strap, making the bits above the 5th hard-strap-only - hence why we have to hard-mod byte 3. Note that you can still adjust the least significant bit of the 3rd byte using a soft-mod, as per the previous generation cards.

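To put that constraint in concrete terms, here is a quick sketch (the 5-bit soft-strappable width is taken from the description above, and the example IDs are illustrative assumptions rather than verified values):

Code: [Select]
# Sketch only - assumes the low 5 bits of the device ID are soft-strappable, the rest hard-strapped.
SOFT_MASK = (1 << 5) - 1                         # 0x1F

def soft_moddable(hard_id: int, target_id: int) -> bool:
    # True if the target differs from the hard-strapped ID only in the soft bits
    return (hard_id & ~SOFT_MASK) == (target_id & ~SOFT_MASK)

# With the resistors removed the card hard-straps to 0x11BF, so anything in
# 0x11A0-0x11BF is reachable in software, while IDs outside that range are not.
print(soft_moddable(0x11BF, 0x11B8))             # True  - inside the range
print(soft_moddable(0x11BF, 0x1180))             # False - would need a hard-strap change
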
I have yet to see evidence of cross-flashing the whole BIOS achieving anything useful, but my research into that is not yet complete.
I wouldn't bother with the Quadro xxx (3-digit) and NVS cards; they aren't suitable for VGA passthrough, which is the main point of modding at the moment.
People have reported success in modding a GTX670 into a K5000, and if you look through the GPU tables on Wikipedia you should be able to find a suitable lower-end GeForce card that you might be able to modify into a Grid K1. Check the driver release appendix A for the device IDs supported by the drivers - this will also tell you what the device IDs of the various cards are, from which you can work out what is soft-moddable and what you'll need to mod the hard straps for. Interestingly, you can soft-mod a GTX680M/GTX680MX into a K5000/K2 without the need for hard-modding.

But yes, 4xx series cards are probably the best target for experimentation. Performance is pretty good even by today's standards, they are cheap on ebay, and can be soft-modded easily. And modding them enables the second DMA channel, so you actually get some quite tangible performance improvements in I/O heavy loads. They easily win in terms of cost/ease/benefit.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: mrkrad on September 14, 2013, 12:29:36 pm
Yeah, so my theory was to locate the cheap Quadro cards and mod them. Then maybe, while you are there, do up the GeForce as well.

So I was actually thinking the NVS510 could be modded into a K2000!

Besides, it is fun!

The Quadro 6000 was expensive, and I somewhat regret not getting a K6000/K5000 since they will support VGX (Grid). But I think I can probably use the Quadro 6000 for a few more months and sell it for the same price I paid.

Or trade it to someone else. Or keep it :)

How about trying to cross-flash a Titan to a Tesla or K6000? I mean, we want VGX at the end of the day - might as well go big!

Plus, if you can pull it off, folks would probably kick in $$ to get it done.

I'd donate my Quadro 6000 to someone who can pull off a Grid K1 that works [Grid VGX mode]. So far it seems only the GK110 has Grid/VGX, but clearly there are the other Grid gaming GPUs (K340/520) and the Grid K1/K2, so I'm guessing the K340 or 520 are variants of the Grid GPU.

Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: eddy02350 on September 14, 2013, 07:43:47 pm
Hi all
 
After modding my GTX670 to a K5000 by hard strapping, I tried to change some ID bits in the straps area too.
But lspci shows a K5000 GPU, while the driver finds a Quadro K3000M GPU.
Strange - perhaps soft modding only works for the driver?
(Excuse me, I don't speak English well.)




 
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: Mikhail80 on September 14, 2013, 09:09:24 pm
Good morning. I successfully tested the mod on my GTX770, turning it into a K5000. As I could not find the right resistors, I used 50K ohm trimmers, like these: (http://www.huinfinito.com.br/383-494-large/trimpot-multivoltas-vertical-100k-carenagem-curta.jpg)
I set one to 15K and one to 40K; the board used was a Zotac GTX770 AMP!

Below are the board I tested on and the reference links. I will do more stability testing later. If anyone needs the BIOS, I can upload it.

(http://i.imgur.com/3X8pbpO.jpg)
My congratulations!
But... dear colleagues, what about the SPECviewperf test?
https://www.eevblog.com/forum/chat/hacking-nvidia-cards-into-their-professional-counterparts/msg210155/#msg210155 (https://www.eevblog.com/forum/chat/hacking-nvidia-cards-into-their-professional-counterparts/msg210155/#msg210155)
If your "GeForce" and "Quadro" test speeds are the same, then there is a big question mark over whether anything was actually unlocked.

Could you post your SPECviewperf test results to back up your success?
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: mrkrad on September 14, 2013, 10:38:32 pm
Are you allowed to modify the straps code (BIOS/driver)? I assume the BIOS is partially x86 and then an FPGA (OpenCL/CUDA?) blob?

Just curious how the BIOS-to-PC, and whatever-to-GPU, path works!

Perhaps straps are like the microcode patches CPUs have to neuter bugs post-production? IDA Pro, anyone?

Or is the BIOS/driver all encrypted?
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on September 16, 2013, 02:12:08 am
@eddy02350:
When hard-modding, you have to check in the BIOS and make sure that the UEFI header strap doesn't override anything (i.e. the AND strap ID bits should all be 1, and the OR strap ID bits should all be 0). It is possible that your hard-strap works fine, but your soft-strap is locked, so modding the hard-strap won't touch (some of) the low 5 bits. The same goes for the straps in the BIOS part (as opposed to the UEFI header part).
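
As a rough illustration of that check (the mask names and the way the ID field is isolated are assumptions for illustration only - the actual strap layout has to come from your own dump):

Code: [Select]
# Sketch only - illustrates (hard & AND) | OR and the "transparent" check described above.
def effective_strap(hard_strap: int, and_mask: int, or_mask: int) -> int:
    return (hard_strap & and_mask) | or_mask

def id_bits_transparent(and_mask: int, or_mask: int, id_field_mask: int) -> bool:
    # For the hard-mod to show through, the ID bits must be all 1s in AND and all 0s in OR
    return (and_mask & id_field_mask) == id_field_mask and (or_mask & id_field_mask) == 0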

@mrkrad:
The driver has been scrambled ever since somebody modified it to make SLI work on non-Nvidia motherboards. Nvidia's response was to encrypt the drivers. Intel's response was to not give Nvidia a QPI licence, so they couldn't make motherboard chipsets for Core i and later CPUs. Just deserts and good riddance - doubly so since Nvidia motherboard chipsets were eyewateringly buggy and unreliable. But the scrambled drivers, unfortunately, stuck.

@opoeta:
Kudos for putting adjustable resistors on there. :)

@Mikhail80:
Nobody has yet been able to restore the missing GL primitives on GeForce cards. Cross-flashing the BIOS, in the one case where it actually works (Q2000 BIOS onto a GTS450), doesn't seem to achieve anything obviously useful in this regard; then again, the Q2000 and GTS450 are very, very similar, much more so than other GeForces are to their equivalent Quadros, with maybe the exception of the K5000/GTX680 4GB variants - in that they have the same amount of VRAM. I haven't tried flashing a full strap-adjusted K5000 BIOS onto my GTX680 yet - it is on my ever-growing TODO list. :(

In the case of the GF106 GPUs (GTS450/Q2000) I suspect the missing functionality is cut out of the GPUs before packaging, and if that is the case, the chances of restoring it are non-existent.

Note, however, that modifying a GTS450 into a Quadro 2000 does produce some performance benefits - Maya scores, although still far behind a real Quadro 2000, go up by around 40% after modifying the card. In the case of the GTX470/Q5000 and GTX480/Q6000 mods there are also other performance improvements to be gained, such as enabling the second DMA channel (DMA transfers become bidirectional rather than unidirectional), which can significantly boost DMA transfers to/from the GPU, depending on the workload you are throwing at it.

Unfortunately, the same improvements do not hold in the case of the GTX580/Q7000 mod - all results remain exactly the same on that. Other people have reported similarly unchanged results on later GPUs. There also appear to be no bidirectional DMA capabilities on the newer GPUs.

General update:

I have just spent a few hours messing about with GTX470/Q5000 BIOS modding, and one thing that seems quite certain at the moment is that a GTX470 will not work with a strap-modded Q5000 BIOS. The machine boots, but graphical output is corrupted when not in text mode (at least in Windows), and as far as I can make out GPU-Z isn't showing the card's capabilities properly (e.g. 0MB of RAM). I suspect a major part of this is down to the fact that a Q5000 has 2x the RAM of a GTX470, but I have not been able to establish where the memory size is stored in the BIOS by looking at things like very similar GTX580 cards that have the same BIOS version number but different RAM sizes (e.g. MSI produced GTX580s with 1.5 and 3GB of RAM which have the same BIOS version numbers). I narrowed the BIOS differences down to 5 possible locations (I think) by excluding things like the board ID, boot string, and checksum-related differences, but it is not at all obvious whether the memory size is even encoded in the BIOS, let alone how.

If anyone cares or is interested in investigating further, the full hex diff between the 1.5 and 3 GB card variants is here:

Code:
$ diff <(xxd MSI.GTX580.1536.110715.rom) <(xxd MSI.GTX580.3072.110504.rom)
4c4
< 0000030: 0100 0000 c000 8d4e 3037 2f31 352f 3131  .......N07/15/11
---
> 0000030: 0100 0000 c000 8d4e 3035 2f30 342f 3131  .......N05/04/11
6c6
< 0000050: e986 2a00 6214 6025 ffff ffff 0000 0000  ..*.b.`%........
---
> 0000050: e986 2a00 6214 6225 ffff ffff 0000 0000  ..*.b.b%........
12c12
< 00000b0: 3136 3100 0000 0000 0000 0000 0000 0000  161.............
---
> 00000b0: 3133 3000 0000 0000 0000 0000 0000 0000  130.............
1189c1189
< 0004a40: 0000 1f01 0000 0023 6220 0300 3313 2003  .......#b ..3. .
---
> 0004a40: 0000 1f01 0000 0023 6230 0300 3313 2003  .......#b0..3. .
1191,1192c1191,1192
< 0004a60: 7f07 0000 008f 0000 0000 9f01 0000 00af  ................
< 0004a70: 0200 0000 bf03 0000 00cf 0400 0000 df05  ................
---
> 0004a60: 7f07 0000 008f 0000 0000 9f01 0000 00a3  ................
> 0004a70: 6230 0300 bf03 0000 00cf 0400 0000 df05  b0..............
1622,1623c1622,1623
< 0006550: 0c19 0000 0c06 0e26 003e 001b 000c 0c0a  .......&.>......
< 0006560: 0a0a 0100 0000 0200 160a 0500 0405 0407  ................
---
> 0006550: 0c19 0000 1006 0e30 0078 0020 0010 100e  .......0.x. ....
> 0006560: 070b 0100 0000 0200 170b 0500 0405 0407  ................
1646,1647c1646,1647
< 00066d0: 0000 0000 0014 730f 0064 3610 0020 8169  ......s..d6.. .i
< 00066e0: 0050 2200 00ac 53ff ff14 730f 0064 3610  .P"...S...s..d6.
---
> 00066d0: 0000 0000 0014 730f 0038 6710 0020 8169  ......s..8g.. .i
> 00066e0: 0050 2200 00ac 53ff ff14 730f 0038 6710  .P"...S...s..8g.
1660c1660
< 00067b0: 1001 0111 750d 714c 0000 c409 0010 0000  ....u.qL........
---
> 00067b0: 1001 0111 840d 824c 0000 c409 0010 0000  .......L........
2008c2008
< 0007d70: 0090 4402 0090 4401 0090 4402 0090 4402  ..D...D...D...D.
---
> 0007d70: 0090 4402 0090 4401 0090 5502 0090 4402  ..D...D...U...D.
3648c3648
< 000e3f0: ffff ffff ffff ffff ffff ffff ffff ff35  ...............5
---
> 000e3f0: ffff ffff ffff ffff ffff ffff ffff ff10  ................

I'm reasonably confident that the 1st, 3rd and last blocks of the diff are not relevant (the last is the checksum, and the 1st and 3rd are board IDs).

Since differences between the GTX480/Q6000 and GTX470/Q5000 are negligible, I rather doubt that a Q6000 BIOS will work on a GTX480. I will double-check that when my faux-Q6000 is available for experimentation again.

Regarding the Quadrified/Gridified GTX680's bizarre problem of refusing to work with DL-DVI outputs (at least in XP64, not tried on Windows 7 yet), I ordered an active DP->DL-DVI adapter. DP works quite differently to DVI, so there is a reasonable chance that this will handle the DL output signal. If it turns out to work, although not a proper solution, at least it will be a very good workaround.

Update: As a random factoid, Fermi Bios Editor doesn't actually understand how to parse a genuine Quadro 5000 BIOS (but parses a standard or modified GTX470/480 BIOS just fine).
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: mrkrad on September 17, 2013, 09:32:09 pm
The Quadro 6000 has 448 cores and 6GB of memory with a 384-bit memory layout; the GTX 470 matches that. The GTX 480 does not match any Quadro.

Also remember they replaced some boards with GF110 equivalents, so there are variants (Quadro 4000/5000/6000) out there as well which are definitely not going to work.

The Quadro 5000 is 320-bit memory but only 352 cores; the 6000 is 384-bit and 448 cores. GeForce cards need to match both the core count and the memory bus width. I think that is the problem: you match on cores but not RAM, and that's definitely not going to fly.

Too many variants, it seems.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on September 17, 2013, 10:15:46 pm
Q5000 has 320-bit memory bus like the GTX470.
Q6000 has 384-bit memory bus like the GTX480.

The key problem is matching the memory bus width and the amount of RAM. For example, this morning I successfully flashed the complete K5000 BIOS onto my 4GB GTX680. It works absolutely fine. When flashing the Q5000 BIOS onto a GTX470, it works fine in text mode, but there is massive corruption in graphical modes - I suspect because the BIOS expects there to be double the RAM, and things go wrong when it's not there.

On the subject of a 4GB 680 with a K5000 BIOS - there is no obvious performance boost in SPEC (I didn't wait for it to finish, I just watched through the first few seconds of the catia benchmark and it was as dire as usual, but maybe it's worth comparing the whole lot to see if anything changes). The ECC memory feature, however, appears in the Windows driver, and it seems to stick after a reboot. So it _appears_ to be working, but I am not convinced that it is actually doing anything - the amount of VRAM is still 4GB; I would expect it to reduce. In Linux, nvidia-smi just reports that ECC is not available when you try to set it.

So, in summary - in cases where you can flash a whole Quadro BIOS onto a GeForce card, it does very little other than slow it down, because Quadros have slower clock speeds pre-programmed.
ECC sort of almost looks enablable, but doesn't appear to do anything. I have no cosmic ray generator handy so I cannot easily test whether it actually works; I am purely going on the fact that the RAM amount doesn't reduce, but maybe the shown value of RAM is the raw RAM, not post-ECC RAM. nvidia-smi in Linux claiming that ECC is not available on the card is more of an indication that it doesn't actually work. The toggle setting survives reboots and moving the card to another machine.

I have not tried Mosaic since it doesn't work on XP, nor have I tried any 3D stuff since I have no way of testing it at the moment.

So in conclusion - yes, you can get a Quadro BIOS onto a GeForce and make it work, provided you have the same amount of RAM on the GeForce (e.g. 4GB GTX680 with a K5000 BIOS). But it doesn't actually gain you anything. You can even go and edit the BIOS using the Kepler BIOS Tweaker and put the clocks back where they were on the GeForce BIOS - that gets you the performance back. But there is nothing there that you couldn't have achieved with just modifying the original BIOS.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: Jager on September 18, 2013, 06:18:24 am
Gordan, can you check if Manage 3D settings is different with K5000 bios? It should look like this with quadro --> http://www.planar.com/media/245367/3d-ready-config-3.jpg (http://www.planar.com/media/245367/3d-ready-config-3.jpg)
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: mrkrad on September 18, 2013, 01:37:45 pm
My half of a Quadro Plex 7000 (GF100 Quadro 6000), when flashed to 06d8, does ECC. This is evident from the reduction of RAM from 6GB to 5.3GB, and the benchmarks slow down likewise since you have the overhead of the extra ECC bits going from GPU to RAM to bus (end-to-end ECC). You should see a reduction in overall performance.

So I'm guessing that the ECC is not working if you do not see a corresponding reduction in performance along with the reduction in RAM.

Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: baconsteak on September 18, 2013, 04:44:48 pm
Are you sure this works on a genuine FX3700? I have a laptop with a genuine FX3700M in it, but I don't recall seeing any extra options in the settings compared to a GeForce 260M it replaced. If you tell me what exact options you expect to see, I can look for them and report back.

Under manage 3d setting there should be a Stereo-enable option.

I'm fairly certain it should be there. Your FX3700M might not have it unless it has a stereo connector or you have a 3D Vision unit with appropriate drivers.

There is definitely something else being checked by the driver. I found an old driver patch that enables the options, and page-flip stereo works, but at the same time the mod destroys OpenGL performance and playback is jerky.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: jmecherul on September 19, 2013, 12:00:47 am
Hi all

It has been fun reading this whole post and some other information.

I have a K20 card that my company bought along with a Quadro K600 in my main machine.  I use the K20 to do electrical simulations with a 3D full wave simulator, so we are all good there.

I have an older version of the 3D simulator that will not recognize my K20 card, and I would still like to use it to do some simulations as I think the latest version is somewhat flawed, but it will take forever using just the processors in my workstation.  That software works with the C20xx/M20xx versions of the Fermi silicon.

I am thinking about getting a GTX580 or GTX590 card on the cheap and modding it into a C20xx/M20xx card.  I know it will not be as fast, since the professional cards have more memory, but I would like to try it and see what kind of acceleration I get.

Now, I have searched here and elsewhere on the internet, and I could not find any instructions on how to hard-mod or soft-mod the straps from the GTX5xx cards to their C20xx counterparts.  I see some people here have done it, so any help that you can shoot my way would be greatly appreciated.  I am good with soldering (have IPC training) and would be more comfortable with replacing resistors, so pictures would be helpful.  I can do the soft-mod too, but I am a little more handicapped at working with a hex editor and flashing firmware and the like.

Thank you very much in advance.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on September 19, 2013, 06:40:02 am
@Jager, baconsteak:
I'm going to have to disappoint you - the stereo options don't appear in the control panel. They do appear on my genuine Q2000. It is possible that one of the unknown strap bits has an effect, but it will be a while before I can test this.

The ECC is togglable (to no effect) on bare metal, but not from a VM. There must be some NVRAM somewhere on the card that saves the setting, but the virtualization layer doesn't let it through (in the same way that BIOS access to the card crashes the host).

@jmecherul:
I am the one that soft-modded a GTX580 into a Q7000. This should be just as easily modifiable into a Tesla M2090 if that is what you prefer. I never bothered looking for the hard-strap resistors since a BIOS mod was sufficient. A GTX590 looks like it should be modifiable into a pair of Tesla M2090s.

I don't have my GTX580 any more (sold it, didn't have a use for it since it didn't work in a VM), but I do have the modified BIOS. If you get a 3GB Gainward Phantom I can send you a hex diff between the original and my modded BIOS (can also change it for you to ID the card as a M2090 if you like).

I will write a crash course on modding, but finding time to put all the info together and then test it by following my own instructions has been somewhat difficult recently.

If your kernels work well on Fermi chips, and you are not particularly limited by the amount of RAM on the card, you may find that a modified GTX480 (C2050) works better for you - unlike later cards, that has a bidirectional async DMA engine. I guess it depends on whether you are more limited by the amount of RAM on the card (1.5GB on a GTX480 vs. potentially 3GB on some GTX580s) or by cudamemcpy() I/O. In terms of raw number crunching power, the GTX480 and 580 are similar enough for it not to matter too much (480 vs 512 shaders isn't that big a deal).

On an earlier note, regarding the weird issue that only makes SL-DVI modes work when in a VM in XP64 - I tried using the DP output with an active adapter. The monitor gets recognized, but no modes work via DP - always just a blank screen. It is really most bizarre that this happens only in a VM and not on bare metal. I haven't done any modifications to my GTX690 yet so I don't know whether that will suffer from the same issue. On one hand I'm surprised it hasn't come up in anyone else's use-cases, but this is the sort of thing that likely only manifests in cases where:
1) A Xen VM is used with VGA passthrough
and
2) XP is used
and
3) DL-DVI monitor is used (1920x1200@60Hz does not require DL and relatively few people use higher res screens)
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: jmecherul on September 20, 2013, 12:26:12 am
Hi Gordan

Thanks for the reply

I tried searching for Gainward card to buy, but I cannot seem to find any.  Thanks a bunch for offering to even modify the BIOS for me.  That would be great.

The software I am using does work OK with Fermi chips, but I am indeed limited by the amount of memory.  My K20 has 5GB right now, and I find myself maxing it out quite often (I run a lot of mesh cells in my simulation software), so I would rather get a card with as much memory as possible.  I would go for a GTX590 (to get two M2090s), but they are a little expensive for my experiment.  I can find GTX 580s with 3GB of RAM for around $200 on eBay right now, which seems decent to get one M2090 out of it.

The hard part is how to get this done.  Do all GTX580s follow the reference design, or are some brands better than others for modding?  Your help in getting me started is much appreciated.  By the way, I do not need the display to work on this card as I have a separate Quadro card for that.

Thanks once again.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on September 20, 2013, 06:14:14 am
It sounds like a 3GB GTX580 would be your best bet.

Following the reference design is irrelevant for soft-mods. All you need to do is modify a few bits in the straps (16 bytes + strap checksum, starting at offset 0x58 in the BIOS dump), the device ID in the BIOS at an offset around 0x18E, and the BIOS checksum (last byte in the BIOS). Modifying the boot string and the board ID string is optional, but I'm pretty sure these are just human-readable fields that don't actually do anything of consequence.
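As a rough Python sketch of the device ID and checksum part only - the 0x18E offset and the little-endian byte order are assumptions taken from the description above and are not guaranteed for every card, and the strap bytes at 0x58 plus their separate strap checksum still need editing on top of this:

Code:
# Sketch of the Fermi-style soft-mod: patch the two device ID bytes and
# re-balance the final checksum byte so the whole image sums to 0 mod 256
# (the usual option-ROM rule - verify with your flashing tool before use).

def patch_device_id(rom, offset, new_id):
    rom = bytearray(rom)
    rom[offset] = new_id & 0xFF            # low byte first (assumed little-endian)
    rom[offset + 1] = (new_id >> 8) & 0xFF
    return bytes(rom)

def fix_checksum(rom):
    rom = bytearray(rom)
    rom[-1] = 0
    rom[-1] = (-sum(rom)) & 0xFF           # make the byte-sum come out to 0 mod 256
    return bytes(rom)

with open("original.rom", "rb") as f:
    rom = f.read()
# 0x06D8 = the Quadro 6000 ID mentioned earlier in the thread; substitute your target.
rom = fix_checksum(patch_device_id(rom, 0x18E, 0x06D8))
with open("modded.rom", "wb") as f:
    f.write(rom)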

Checking my various notes, there are at least 32 bits in the strap (a whopping half! I never counted them before! 0,7-9,16-21,29-30,32-35,37-47,56-62) for which I didn't establish what they do, but toggling some of them caused my sacrificial GTS450 to become unbootable (not detected at all). If I'm reading my notes right (and I really hope I am), the bits that bricked the card include 16,17 and 31.


Perhaps I should re-test the bits that didn't brick the card for other potential functionality such as enabling stereo 3D functionality or ECC. The only snag is, Q2000 (what my sacrificial GTS450 is equivalent to) doesn't support ECC, so this wouldn't show up, and I'm not sure I can be bothered right now to solder a switch across the EEPROM pins (for de-bricking purposes) on my GTX470/Q5000 card.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: jmecherul on September 20, 2013, 06:44:10 am
Yeah, a 3GB GTX580 would be my best bet.

That sounds utterly complicated for my hardware brain.  I would rather solder resistors.

What tools would I need to get this accomplished?  Which bits do I need to change, and to what values?  How is the checksum exactly calculated in there?  A lot of questions.

thanks.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: jmecherul on September 20, 2013, 06:49:08 am
Consumer video cards probably do not have ECC-type memory in there.  I would not really need 3D functionality or anything fancy.  This is strictly number crunching.

If you have time to create a detailed how-to guide for turning that GTX580 into an M2090, it would be greatly appreciated.  Or a guide for multiple cards.

Or, as you suggested, I could dump the BIOS (if you tell me which tool to use), PM it to you, and you could change whatever is needed to make the card an M2090.

Thanks once again.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: mrkrad on September 20, 2013, 11:15:07 am
ECC works differently with the GPU. Yes, the GPU has ECC end to end, but the RAM is just soft-ECC; it takes a bit, so the Quadro 6000 goes down to 5.3GB and the bandwidth and performance likewise go down.

Remember the old Core 2 Xeons did not have FSB parity - the memory controller did the parity. However, the odd bird is my Mac Pro, which uses the X58 chipset and can run with or without parity even though the CPU (Xeon) is parity only. Not sure how that works.

So you might have a GPU with parity (cache/address/data lines), but the Nvidia cards do not have ECC RAM at all. It is just taking a bit out of regular RAM. Which makes sense, since parity RAM on regular motherboards is just another chip.

I suspect GeForce cards just have it disabled.

The ROM (BIOS), if you think about it, tells the system about the card and does basic functions (during boot), but also serves to configure the GPU. It is likely that very little of the BIOS is for the computer, and the rest is used to program/patch the GPU.

Now if the functions are disabled on the gpu itself (VGX on GK104/GK107/GK110) then you are going to have little luck trying to enable them, but I'd guess if you could find "beta" cards they may have the functionality still left in.

I was trying to think of where you might find such beta cards, or if they are all destroyed? That would be the key.

Some idiot sold me a raid controller with pre-release firmware (didn't bother to flash it) and it did not behave like the production board, it was running too hot and was even faster than the production model. odd.

Anyone have any ideas where to score some pre-release quadro/geforce cards?
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: baconsteak on September 20, 2013, 09:30:42 pm
How is the checksum exactly calculated in there?

You can use the nvflash tool to modify the straps and it calculates the checksums for you. I'm a complete noob and I managed to do it, but it took me hours to wrap my head around it. I followed this (https://devtalk.nvidia.com/default/topic/489965/cuda-programming-and-performance/gtx480-to-c2050-hack-or-unlocking-tcc-mode-on-geforce/post/3511598/#3511598) guide. I'm sure one of us can help you out if you're stuck.

Thanks for the help gordan.
It sucks that nothing in the BIOS seems to control whether the 3D stereo function exists. I guess I will have to look for a second-hand Quadro with a power connector so I can overclock and still game.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on September 21, 2013, 06:29:03 am
I think the odds are about even at the moment on whether the extra Quadro functionality is disabled in silicon (e.g. by laser cutting) or whether it is disabled by some of the unidentified strap bits. Or it could be done by an FPGA that is programmed differently on the GeForce without exposed pins for programming it in the field. There are many possibilities. There are also a LOT of unidentified strap bits (32) and it could be any one of those or a combination thereof that affects any of the functionality in question. That's potentially 2^32 combinations to try. That's several thousand times the life expectancy of the EEPROM chip, so you'd better stock up on them if you want to undertake this. In fairness, though, it is likely that if a bit does control some functionality (as opposed to being merely unused), single bit flipping is likely to reveal some indication of what it does.
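To put that estimate into numbers, assuming an EEPROM rated for around a million erase/write cycles (check the datasheet of the actual part on your card):

Code:
# Back-of-the-envelope numbers for the brute-force idea above.
combinations = 2 ** 32          # all possible patterns of the 32 unknown strap bits
endurance = 1_000_000           # assumed erase/write cycle rating of one chip
print(combinations / endurance) # ~4295 chips' worth of reflashes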

Also bear in mind that strap bits might do a different thing on Kepler than they do on Fermi. The ones I've been playing with that I identified on Fermi seem to do the same thing on Kepler, but some that might not have done anything on Fermi might do something on Kepler (wouldn't it be nice if one of those bits handles the next bit of the device ID strap so we can avoid soldering resistors to change the ID...).

Either way, it's an adventure for someone with waaaaaaay more time on their hands than I am ever likely to have.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: mrkrad on September 21, 2013, 07:05:08 am
It seems the new GK208 2GB GDDR5 64-bit GT 640 would be a good candidate to cross-flash! But the straps are hard!

The better one would be the Quadro K2000, since it sits right over the Grid, so a small 0/1 AND bit mask could push it down, plus it has the GK107 that seems to be so popular.

If only there were some sort of program that could run diagnostics on cards, we could collect information and locate the chips that have features enabled (by mistake, or because nobody cared!). Something folks could run that would use advanced Nvidia technology to get information from the cards. :( Someone like the GPU-Z folks could distribute the technology and collect information.

Given that there are so many variants these days, I am going to guess that going Quadro to Quadro might be a better option! Worst case, you still have a Quadro - cheap.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: mrkrad on September 25, 2013, 08:57:03 pm
I was thinking about getting a new card, anyone have suggestions? K5000 could be doable but K6000 or Grid K1 not (unless someone can find a workstation that sells with one!)

Could also get a K4000/K2000 but the idea would be to find a similar match that has the feature set we are looking for.

It seems that some Kepler chips may have VGX (or the ability), but nobody really knows.

Also the K340/K520 Grid (not much known about this) is a variant.

Anyone know which chipsets the Grid cards use? Could get a Kepler based on the same chipset to see if it's got feature parity?
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: aktivb on October 03, 2013, 03:14:32 am
I have an Asus GTX 650Ti on the shelf, GK106-220-A1 as per
http://en.wikipedia.org/wiki/Comparison_of_Nvidia_graphics_processing_units#GeForce_600_Series (http://en.wikipedia.org/wiki/Comparison_of_Nvidia_graphics_processing_units#GeForce_600_Series)

Would this be modifiable to a K4000, GK106 as by
http://en.wikipedia.org/wiki/Comparison_of_Nvidia_graphics_processing_units#Quadro_Kxxx_Series (http://en.wikipedia.org/wiki/Comparison_of_Nvidia_graphics_processing_units#Quadro_Kxxx_Series)

I am guessing the answer is 'Maybe', since I have not seen anyone
attempt this in the thread so far. I saw verybigbadbody modified a 650 (GK107-450-A2) into K1 here:
https://www.eevblog.com/forum/chat/hacking-nvidia-cards-into-their-professional-counterparts/msg235610/#msg235610 (https://www.eevblog.com/forum/chat/hacking-nvidia-cards-into-their-professional-counterparts/msg235610/#msg235610)

The thing is, I have no idea how to start in locating straps, but if someone like
gnif, verybigbadboy or gordan would like to have a go at it, I would gladly ship
it over. (provided I get it back if the mod succeeds).

Would anyone like to have a go at mapping out a new card?

I can also provide high-res pictures of the card upon request.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: dongsterlicious on October 04, 2013, 02:18:23 pm
Hi, I've got an EVGA 670 2GB card that I'm interested in modding to a K5000.

It seems that everyone is modding their cards to be used in virtual machines, but i'm wondering if modding it opens up the hardware opengl or if it will give me any performance gain for that matter.

I'm mainly using Maya, Mari, Adobe apps, etc., so I'm wondering if this mod will uncripple the GeForce cards for these applications.

Thanks.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on October 07, 2013, 04:18:17 am
Hi, I've got an EVGA 670 2GB card that I'm interested in modding to a K5000.

It seems that everyone is modding their cards to be used in virtual machines, but i'm wondering if modding it opens up the hardware opengl or if it will give me any performance gain for that matter.

Read the whole thread. Yes, I know there's 39 pages. No, that doesn't excuse you being lazy. :p
Uncrippling the missing GL functionality has not been achieved (at least not yet). Neither has ECC support or stereo 3D support.

On a separate note, I've been working on my GTX690. Inexplicably, it behaves very differently compared to the GTX680 under soft-modding. I removed the resistor controlling the 3rd nibble, and tried to soft-mod from there, and it looks like the UEFI part of the BIOS is the only way to control this. Modifying the straps in the old places doesn't seem to do it. Removing the UEFI headers renders the card unbootable (not bricked, just no BIOS init). Removing the UEFI footer seems to not quite be doable - nvflash complains about a size mismatch. nvflash versions old enough to have the --eraseeprom option don't recognize the card. Padding the BIOS out to 64KB, like on the Gigabyte GTX680 that has a non-UEFI BIOS, allows flashing onto the card, but the VBIOS fails to initialize. This is distinctly different from the GTX680, on which the soft-modding worked just fine.

Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: cubansite on October 07, 2013, 01:31:24 pm
How is this accomplished with an EVGA 670 FTW?

Its PCB is the same as the 680, so would this mean the same resistor change?

Anyone?
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on October 08, 2013, 07:21:11 am
Read the rest of the thread. I'm sure I remember people posting about having successfully modded the GTX670.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: wukoje on October 14, 2013, 05:14:23 am
Hi guys, this is my first post, but I have been reading this thread for quite some time. Would you mind telling me what the benefits would be in modding the GTX 580 to a Tesla?
-Did anyone actually test it? Does it make any difference in double precision (since this is the supposed use case)?
-Until now, did anyone manage to run multiple VM instances on the same quadrified/gridified GPU? If not, what's the best option: going for a real K5000/K6000, or just doing it the "as it is supposed to be" way with a Grid card?
-Gordan, would you mind sharing which 4GB GTX 680 you managed to quadrify to a K5000?
-Did anyone manage to get any card (including ATI's consumer cards) to work with multiple instances?
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on October 14, 2013, 10:35:22 pm
Hi guys, this is my first post, but I have been reading this thread for quite some time. Would you mind telling me what the benefits would be in modding the GTX 580 to a Tesla?
-Did anyone actually test it? Does it make any difference in double precision (since this is the supposed use case)?
-Until now, did anyone manage to run multiple VM instances on the same quadrified/gridified GPU? If not, what's the best option: going for a real K5000/K6000, or just doing it the "as it is supposed to be" way with a Grid card?
-Gordan, would you mind sharing which 4GB GTX 680 you managed to quadrify to a K5000?
-Did anyone manage to get any card (including ATI's consumer cards) to work with multiple instances?

1) The only benefit of modifying the GTX580 into a Tesla is that you can put the card into TCC mode and avoid the Windows WDDM driver overhead. If you do your number-crunching on Linux it won't gain you anything.

2) None of the modifications discussed in this thread make any difference to DP performance. The only performance improvement from modding is on the GTX470/GTX480, where you get the 2nd DMA channel enabled when you modify to a Quadro 5000/6000. The GTX580 only has one DMA channel.

3) The only system that supports GPU sharing in the way you describe is recent VMware ESX, and that only supports Grid GPUs. You should be able to use a GTX680/690 modified to a Grid K2, but most of us here use Xen, and that only supports GPU virtualization by passing the whole GPU to a VM. If you want to try VMware with GPU sharing, go for it, but this is sufficiently obscure that you're bound to run into issues if you run on anything but their supported reference hardware. Also note that you will need a VMware-specific client on the workstation where you are outputting the video stream of the 3D rendering. You cannot just plug monitors into different outputs and have each be a separate VM sharing a GPU. Grid GPUs have no video outputs at all. I suggest you go and read through all the VMware and Xen documentation on the subject before you ask questions like this here.

4) The GTX680 I used is a Gainward Phantom 4GB model, but I would expect any 4GB GTX680 to work fine when flashed with a K5000 BIOS (you will of course also need to hard-mod the resistors).
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: wukoje on October 14, 2013, 11:53:52 pm
Quote from: gordan
You cannot just plug monitors into different outputs and have each be a separate VM sharing a GPU. Grid GPUs have no video outputs at all. I suggest you go and read through all the VMware and Xen documentation on the subject before you ask questions like this here.
I don't see where you got that from, since all along I was talking about instances (maybe that has a different meaning for us). In any case, shared GPU (particularly Nvidia Grid) is supported by all 3 major players in the virtualization field: MS with RemoteFX, Citrix and VMware.
The nice thing about using Nvidia GPUs is the hardware support for H.264 encoding in Kepler GPUs, which allows the rendered streams to be encoded quickly and CPU-free. Did anyone manage to use this hardware acceleration feature for virtualization? Anyway, I wonder why you insist so much on Nvidia cards for dedicated GPU (GPU passthrough) virtualization, even by quadrifying them, while AMD consumer cards support it by default?
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: gamezr2ez on October 17, 2013, 02:08:59 am
Anyway, I wonder why you insist so much on Nvidia cards for dedicated GPU (GPU passthrough) virtualization, even by quadrifying them, while AMD consumer cards support it by default?

AMD cards passthrough, sure. But they are buggy and you cannot reboot the VM. Lots of stability issues.

Modding my GTX680 allows me to reboot the VM as many times as I would like without any stability issues. I have shut down the VM, passed the GPU back to my host, then passed it to a different VM, all without any reboots or performance issues.

That being said, I am not sure I would buy Nvidia again since I have completely converted to using virtualization for everything. The hardware mod is nice, but future generations will probably make this method obsolete. AMD contributed some of the initial GPU passthrough patches; I doubt they are going to block it like Nvidia does.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: verybigbadboy on October 17, 2013, 05:36:51 am
The nice thing about using Nvidia GPUs is the hardware support for H.264 encoding in Kepler GPUs, which allows the rendered streams to be encoded quickly and CPU-free. Did anyone manage to use this hardware acceleration feature for virtualization?

Hi, I tried to use the NVENC SDK, but it seems it is not possible for me to use the hardware encoder on a GRIDified 680 card. I get an error about an invalid licence key. But it should work fine on any non-GeForce card. Maybe I did something wrong.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: rboxeur on October 17, 2013, 08:55:50 am
To verybigbadboy

wowza.com is a project which supports NVIDIA NVENC, and it is supported on Linux.

If you have a look at http://www.wowza.com/forums/content.php?512 (http://www.wowza.com/forums/content.php?512), which explains how to install NVIDIA NVENC on Ubuntu 12.04, it is clearly written:

Note: Accelerated encoding isn't available when running on a virtual hardware environment such as VMware or Xen.

If I am not mistaken, it should work on bare metal with a quadrified card. I didn't test it.

Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on October 18, 2013, 03:31:47 am
Quote from: gordan
You cannot just plug monitors into different outputs and have each be a separate VM sharing a GPU. Grid GPUs have no video outputs at all. I suggest you go and read through all the VMware and Xen documentation on the subject before you ask questions like this here.
I don't see where you got that from, since all along I was talking about instances (maybe that has a different meaning for us). In any case, shared GPU (particularly Nvidia Grid) is supported by all 3 major players in the virtualization field: MS with RemoteFX, Citrix and VMware.
The nice thing about using Nvidia GPUs is the hardware support for H.264 encoding in Kepler GPUs, which allows the rendered streams to be encoded quickly and CPU-free. Did anyone manage to use this hardware acceleration feature for virtualization?

The real question here is: what is the use-case? You still have to have enough GPU on the hardware to do high-res H.264 decoding in realtime to make this worthwhile. It's only useful for dedicated terminal hardware that comes with hardware video decoding yet is cheap. The only problem is that such hardware doesn't really exist, because the client software is Windows-only AFAIK (and certainly x86 only). So the cost-benefit is somewhat questionable, although it makes it a little easier to throw away workstations without the hassle of reinstalling and data copying.

Anyway, I wonder why you insist so much on Nvidia cards for dedicated GPU (GPU passthrough) virtualization, even by quadrifying them, while AMD consumer cards support it by default?

Because ATI cards utterly suck. Have you actually tried using an ATI card with VGA passthrough? The drivers are unusably crap, especially if you try to use multiple monitors, and doubly so on XP. They randomly blue-screen, and on a reboot they tend to either slow down to a crawl, blue-screen the guest, or even crash the host. Having battled an ATI card for days, I invested in a Quadro 2000 (lowest spec Quadro supported for GPU passthrough), and the experience was quite eyeopening.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: CDeLorme on October 18, 2013, 01:40:24 pm
I just finished reading all the contents of this thread - totally awesome.  gnif, I donated to your cause; I hope you reach your goal soon and can show off an amazing video of the GTX 690 modification.  I just ordered a GTX 670 to try this mod with, and will probably get around to it this weekend.

I am also a Xen user and have spent months tinkering with AMD, and can affirm that gordan speaks the truth.

At most I would say AMD cards make it "possible"; to say they "support" it feels like a misnomer.  It is far from ideal, and not streamlined in the least.  Success still varies by model/make and surrounding hardware.  For my card it took days of nonstop experimenting just to figure out what exactly caused BSoDs, and then driver corruption which would lead to live degradation.

Even after all that work, manual ejection is required unless you want to reboot the physical machine, and my Windows 7/8 installs tend to have a maximum lifespan of about 6 months, after which point they become corrupt and the degradation happens within a day of runtime (not a reboot), and it isn't "fixable" except by a reinstall (or restoring from a pre-driver image).

Honestly, if I had the money and time I would have been experimenting with alternatives sooner.  I look forward to posting back whether I have success with the GTX 670 grid k2 mod.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: mrkrad on October 20, 2013, 08:10:11 am
Someone just sold a Grid K2 on eBay for $1525.

With prices like that, maybe it's not a bad idea to just deal-hunt for these :)
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on October 21, 2013, 12:51:16 am
Indeed, that sounds like a deal good enough to not bother with modding - only there is a distinct discrepancy between what a Grid K2 is for and what most of us here want to use it for. A real Grid K2 has no video outputs. From that point of view, the use-case here is more along the lines of a Quadro than a Grid (passing a whole dedicated card to the VM).

On a separate note, I've been doing a bit more digging on soft-modding the 6xx series cards. From what I can tell, the reason why things weren't working as expected on my first GTX690 was because it was not working properly (it finally gave up the ghost the other day, most annoyingly). I got another one, and I can soft-mod it to a Tesla K10. Except get this - modding the ID the old way in the main BIOS body has no effect. The old straps that used to start at 0x58 (0x458 on the UEFI BIOS) no longer have any effect when the UEFI wrappers are in place - even when running on a non-UEFI motherboard. Instead, the strap at 0x0C is what controls the device ID. The bit layout, at least for the device ID control bits, is the same.

So for those of you that only want to mod to a Tesla K10, you can soft-mod; no need to take a soldering iron to the card. It also means that you can soft-mod the 4th nibble, and only hard-mod the resistor pair for the 3rd nibble.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: oguz286 on October 22, 2013, 09:11:16 pm
First of all, a big thanks to anyone who contributed: gnif, verybigbadboy, and all the others I forgot to mention ^-^

I'm trying to convert a GTX780 to a Tesla K20 which have the following device IDs:
GTX780: 0x1004
K20: 0x1022

According to the resistor values that were discovered so far, this would suggest that I would have to find a 5K and a 25K resistor and change them both to 15K, since both digits are in the 0-7 range. I found the EEPROM and measured the values of the resistors around it. You can find the results below:
(http://vps1931.directvps.nl/GTX780-1.jpg)

As you can see, I found a 5K resistor which I removed and replaced it with a multi-turn 50K pot which I set to 15K. Unfortunately, this did nothing as the device ID still remains 0x1004 whereas I expected it to be 0x1024. There are two 4.7K resistors at the back of the board and other than that there are no 5K resistors on the board. Either NVIDIA changed the way the device ID is determined, or they changed the values, or there is a simple resistor divider action going on.

Before I go and change the 25K resistor, I want to make sure that I can change the 3rd digit from 0 to 2. I did try to flash a K20 ROM onto the EEPROM, but it is still recognized as a GTX780. Strangely enough, with the K20 ROM, the nvidia-smi tool reports that the board supposedly has 6GB of RAM instead of the actual 3GB. Any ideas or suggestions?

EDIT: I will be updating the image when I discover any new values/paths.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: verybigbadboy on October 22, 2013, 11:23:25 pm
First of all, a big thanks to anyone who contributed: gnif, verybigbadboy, and all the others I forgot to mention ^-^

I'm trying to convert a GTX780 to a Tesla K20 which have the following device IDs:
GTX780: 0x1004
K20: 0x1022

According to the resistor values that were discovered so far, this would suggest that I would have to find a 5K and a 25K resistor and change them both to 15K, since both digits are in the 0-7 range. I found the EEPROM and measured the values of the resistors around it. You can find the results below and in the attached photo:

As you can see, I found a 5K resistor which I removed and replaced it with a multi-turn 50K pot which I set to 15K. Unfortunately, this did nothing as the device ID still remains 0x1004 whereas I expected it to be 0x1024. There are two 4.7K resistors at the back of the board and other than that there are no 5K resistors on the board. Either NVIDIA changed the way the device ID is determined, or they changed the values, or there is a simple resistor divider action going on.

Before I go and change the 25K resistor, I want to make sure that I can change the 3rd digit from 0 to 2. I did try to flash a K20 ROM onto the EEPROM, but it is still recognized as a GTX780. Strangely enough, with the K20 ROM, the nvidia-smi tool reports that the board supposedly has 6GB of RAM instead of the actual 3GB. Any ideas or suggestions?

Hello oguz286,

The BIOS contains information about the memory size and memory type. It is absolutely normal that nvidia-smi shows 6GB of RAM with the K20 BIOS.

Also, I looked at a GTX 780 BIOS and have a question: which GTX780 model do you use? I think you need to do a BIOS modification like the one for the GT 640 (https://www.eevblog.com/forum/projects/hacking-nvidia-cards-into-their-professional-counterparts/msg213332/#msg213332)

Thank you.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: oguz286 on October 22, 2013, 11:35:37 pm
Hello verybigbadboy (thanks for the many mods :D).

I have a Palit GTX780, which I believe is the standard NVIDIA reference design (http://www.palit.biz/palit/vgapro.php?id=2132 (http://www.palit.biz/palit/vgapro.php?id=2132)).

Ok so what I understand from your post that you linked is that I need to modify the GTX780 BIOS on the card to unlock it.
My BIOS has the following values:
00000010: 08 E2 00 00 00 06 00 00 02 10 10 82 FF 3F FC 7F
00000020: 00 50 00 80 0E 10 10 82 FF FF FF 73 00 00 00 8C

So that means my BIOS is locked and I need to change the FF 3F FC 7F to FF FF FF 7F. Is that correct?

I'm going to try it right now :D

EDIT: NiBiTor cannot read the ROM file correctly. I saved the modified BIOS using NiBiTor 6.06 but I do not know if the checksum is now correct. I guess I'll find out soon enough.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: verybigbadboy on October 22, 2013, 11:41:53 pm
Hello verybigbadboy (thanks for the many mods :D).

I have a Palit GTX780, which I believe is the standard NVIDIA reference design (http://www.palit.biz/palit/vgapro.php?id=2132 (http://www.palit.biz/palit/vgapro.php?id=2132)).

Ok so what I understand from your post that you linked is that I need to modify the GTX780 BIOS on the card to unlock it.
My BIOS has the following values:
00000010: 08 E2 00 00 00 06 00 00 02 10 10 82 FF 3F FC 7F
00000020: 00 50 00 80 0E 10 10 82 FF FF FF 73 00 00 00 8C

So that means my BIOS is locked and I need to change the FF 3F FC 7F to FF FF FF 7F. Is that correct?

I'm going to try it right now :D

yes, and next line too
00000020: 00 50 00 80 to 00 00 00 80

also please update  checksum. without it card won't start at all :)
Quote
4. Change values to be equal values from 4.
5. Update checksum. I do it by nibitor tool. just open bios rom and save it. It produces lot of warnings, but it is ok.
6. Upload bios back to card.
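If anyone would rather script those two edits than hex-edit by hand, here is a minimal Python sketch; it assumes the two dwords sit at offsets 0x1C and 0x20 exactly as in the dump above, and the checksum still has to be fixed afterwards (e.g. with NiBiTor or Kepler BIOS Tweaker) before flashing:

Code:
# Patch the two soft-strap mask dwords shown above, then fix the checksum with
# your usual tool before flashing. The offsets and the "expected" byte values
# match this particular GTX780 dump - verify them against your own ROM first.
with open("gtx780_orig.rom", "rb") as f:
    rom = bytearray(f.read())

assert rom[0x1C:0x20] == bytes.fromhex("ff3ffc7f"), "unexpected bytes at 0x1C"
assert rom[0x20:0x24] == bytes.fromhex("00500080"), "unexpected bytes at 0x20"

rom[0x1C:0x20] = bytes.fromhex("ffffff7f")  # FF 3F FC 7F -> FF FF FF 7F
rom[0x20:0x24] = bytes.fromhex("00000080")  # 00 50 00 80 -> 00 00 00 80

with open("gtx780_unlocked.rom", "wb") as f:
    f.write(rom)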
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: oguz286 on October 23, 2013, 12:02:51 am
Hello verybigbadboy (thanks for the many mods :D).

I have a Palit GTX780, which I believe is the standard NVIDIA reference design (http://www.palit.biz/palit/vgapro.php?id=2132 (http://www.palit.biz/palit/vgapro.php?id=2132)).

Ok so what I understand from your post that you linked is that I need to modify the GTX780 BIOS on the card to unlock it.
My BIOS has the following values:
00000010: 08 E2 00 00 00 06 00 00 02 10 10 82 FF 3F FC 7F
00000020: 00 50 00 80 0E 10 10 82 FF FF FF 73 00 00 00 8C

So that means my BIOS is locked and I need to change the FF 3F FC 7F to FF FF FF 7F. Is that correct?

I'm going to try it right now :D

yes, and next line too
00000020: 00 50 00 80 to 00 00 00 80

also please update  checksum. without it card won't start at all :)
Quote
4. Change values to be equal values from 4.
5. Update checksum. I do it by nibitor tool. just open bios rom and save it. It produces lot of warnings, but it is ok.
6. Upload bios back to card.

Aha, I forgot the second line |O Anyway, I changed both lines and saved it to a file (FILE_A). I opened that file in NiBiTor (which complained), and saved it to a different file (FILE_B). Then I flashed the card with FILE_B and rebooted.

With the pot set at 5K (which is the original value), nvidia-smi now reports that it cannot determine the device handle and gives an unknown error :( Did I miss something?

EDIT: I do see that FILE_A and FILE_B are different:
one byte is different at 0x8DFF
one byte is different at 0x391FF

So I guess the checksum has been updated correctly.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: verybigbadboy on October 23, 2013, 12:09:32 am
With the pot set at 5K (which is the original value), nvidia-smi now reports that it cannot determine the device handle and gives an unknown error :( Did I miss something?

EDIT: I do see that FILE_A and FILE_B are different: one byte is different at 0x8DFF so I guess the checksum has been updated correctly.

Yes, the checksum looks like it was corrected correctly.
can you check lspci for videocard id?

or
boot via dos flash drive and
nvflash --list ?

are you trying to flash it with gtx780 or k20 bios? please make changes with original bios first.

EDIT:
Also, I suggest changing the resistor near the flash chip first. I am not sure that the one you are trying to change now is the correct one.

Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: oguz286 on October 23, 2013, 12:23:20 am
With the pot set at 5K (which is the original value), nvidia-smi now reports that it cannot determine the device handle and gives an unknown error :( Did I miss something?

EDIT: I do see that FILE_A and FILE_B are different: one byte is different at 0x8DFF so I guess the checksum has been updated correctly.

Yes, the checksum looks like it was corrected correctly.
can you check lspci for videocard id?

or
boot via dos flash drive and
nvflash --list ?

are you trying to flash it with gtx780 or k20 bios? please make changes with original bios first.

I used the original BIOS which I first saved to a file and then modified and saved it with the correct checksum. With the modified BIOS:

5K pot:

nvflash --list gives GK11x (10DE,1004,10DE,104B)
lspci gives device ID 0x1004

15K pot:

nvflash --list gives GK11x (10DE,1004,10DE,104B)
lspci gives device ID 0x1004

So there are no changes :(
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on October 23, 2013, 01:09:56 am
A few points:

1) The values for the resistors for the 4th nibble are the ones that were documented. Resistor values for the 3rd nibble are NOT the same. For example, on my GTX690, the 3rd nibble resistor is 25K (24.8K) and the value is 8.

2) You cannot measure the value of the resistor while it is attached to the board. What you will end up measuring is the resistance of the resistor in parallel with the resistance of the rest of the circuit (if it is connected - which in most cases it will be).

3) The 3rd nibble isn't fully adjustable. On 6xx series cards it tops out at 0xB. It doesn't matter what you set it to past 40K, I suspect you'll find it will not go past that value. This may be different on 7xx series cards.

4) Be careful when blanking out the strap values at 0x0C - the card could plausibly be partially soft-strapped, which means that editing the strap value can brick the card - hard. Normally, unbricking is reliant on the card being fully hard-strapped. You can then ground the EEPROM power pin, and the card will boot EEPROM-less and show up again for nvflash (I have a GTS450 modified this way for easy unbricking when BIOS-modding). If the card relies on partial soft-strapping and you break the soft-strap, the only way of unbricking it may well be to find how the important other bits are hard-strapped and modify them for the correct hard-strap - much harder considering that nobody has yet reverse engineered anything other than the device ID resistor locations.

5) Cross-flashing a ROM from a similar card with a different amount of RAM will not work. At best you will end up with garbled/corrupted screen output, even if text mode boot-up works (and/or the card shows up as a secondary card). The only way you will get a Quadro/Tesla/Grid ROM working on a GeForce card is if you use a card with the same GPU with the same amount of VRAM. The only cross-flashes I have managed to get working are Q2000 1GB -> GTS450 1GB works, and QK5000 4GB -> GTX680 4GB works. And if you are doing that, you will also want to edit the BIOS to adjust the clocks and fan speeds back to where they were on the GeForce card.
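To illustrate point 2 above with some made-up numbers (the 20K figure for the rest of the circuit is purely illustrative):

Code:
# In-circuit you measure the strap resistor in parallel with whatever else
# hangs off the same node, so the reading comes out low.
r_strap = 25_000   # the actual strap resistor
r_rest = 20_000    # illustrative parallel path through the rest of the board
measured = (r_strap * r_rest) / (r_strap + r_rest)
print(round(measured))  # ~11111 ohms - nowhere near the real 25K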
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: oguz286 on October 23, 2013, 02:30:31 am
A few points:

1) The values for the resistors for the 4th nibble are the ones that were documented. Resistor values for the 3rd nibble are NOT the same. For example, on my GTX690, the 3rd nibble resistor is 25K (24.8K) and the value is 8.
Ok, that was my misunderstanding. I need to go through the topic and see if there is any logic to be found concerning the 3rd nibble resistor.

Quote
2) You cannot measure the value of the resistor while it is attached to the board. What you will end up measuring is the resistance of the resistor in parallel with the resistance of the rest of the circuit (if it is connected - which in most cases it will be).
You are right. I just got lucky in that the 5K resistor I desoldered is in fact a 5K resistor. Unfortunately I cannot 'randomly' desolder parts as they are very fragile, so I want to be as sure as possible that I got the right resistor.

Quote
3) The 3rd nibble isn't fully adjustable. On 6xx series cards it tops out at 0xB. It doesn't matter what you set it to past 40K, I suspect you'll find it will not go past that value. This may be different on 7xx series cards.
Good to know. I soldered 50K multi-turn pots, so I can test with different values and see if it makes any difference. For the third nibble, I need to go from 0 to 2 so I hope that it is possible.

Quote
4) Be careful when blanking out the strap values at 0x0C - the card could plausibly be partially soft-strapped, which means that editing the strap value can brick the card - hard. Normally, unbricking is reliant on the card being fully hard-strapped. You can then ground the EEPROM power pin, and the card will boot EEPROM-less and show up again for nvflash (I have a GTS450 modified this way for easy unbricking when BIOS-modding). If the card relies on partial soft-strapping and you break the soft-strap, the only way of unbricking it may well be to find how the important other bits are hard-strapped and modify them for the correct hard-strap - much harder considering that nobody has yet reverse engineered anything other than the device ID resistor locations.
I see. I thought that if I adjust the value of the 5K resistor to its original value and potentially mess with the power pin of the EEPROM, that I can simply reflash it with the original BIOS. I'll take that into consideration next time I try to adjust values in the BIOS.

Quote
5) Cross-flashing a ROM from a similar card with a different amount of RAM will not work. At best you will end up with garbled/corrupted screen output, even if text mode boot-up works (and/or the card shows up as a secondary card). The only way you will get a Quadro/Tesla/Grid ROM working on a GeForce card is if you use a card with the same GPU with the same amount of VRAM. The only cross-flashes I have managed to get working are Q2000 1GB -> GTS450 1GB works, and QK5000 4GB -> GTX680 4GB works. And if you are doing that, you will also want to edit the BIOS to adjust the clocks and fan speeds back to where they were on the GeForce card.
Yes, apparently the size of the RAM is also stored in the BIOS, so the chances of cross flashing to work are very slim if the hardware differs.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: oguz286 on October 23, 2013, 09:19:35 am
I found something interesting. After I changed the original BIOS like verybigbadboy mentioned, I opened it in NiBiTor and saved it to a different file as NiBiTor recalculates the checksum. I opened this supposedly correct file in the Kepler Bios Tweaker 1.25 tool and found that the checksum of this new file is incorrect and that the original BIOS and the BIOS file which I edited (but not opened/saved with NiBiTor) both have correct checksums.

So I flashed the modified BIOS onto my card and now the nvidia-smi tool does detect the card properly as a GTX780. It seems that NiBiTor breaks the checksum of the BIOS file. Unfortunately, with the pot at 15K, there is no change in the device ID.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: amyk on October 23, 2013, 05:26:22 pm
4) Be careful when blanking out the strap values at 0x0C - the card could plausibly be partially soft-strapped, which means that editing the strap value can brick the card - hard. Normally, unbricking is reliant on the card being fully hard-strapped. You can then ground the EEPROM power pin, and the card will boot EEPROM-less and show up again for nvflash (I have a GTS450 modified this way for easy unbricking when BIOS-modding). If the card relies on partial soft-strapping and you break the soft-strap, the only way of unbricking it may well be to find how the important other bits are hard-strapped and modify them for the correct hard-strap - much harder considering that nobody has yet reverse engineered anything other than the device ID resistor locations.
Is everything stored in the EEPROM or is there some configuration stored in nonvolatile memory in the GPU itself? Otherwise it seems that a failsafe way to unbrick would be to rewrite the EEPROM out-of-system using something like a buspirate.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on October 23, 2013, 10:37:06 pm
I've always used Kepler BIOS Tweaker to sort out the checksums for me; NiBiTor doesn't really understand recent BIOSes properly, especially the UEFI-wrapped ones.

The memory configuration is stored in the EEPROM. I guess you could make some kind of a piggy-back adapter to re-write the EEPROM out of band. I also did some basic comparisons between BIOS images with the same BIOS version but for cards with different amounts of RAM, if you look a few pages back in the thread. I'm sure it should be possible to work out where the memory size is stored.

Please do report back when you find the 3rd nibble resistor pair on the GTX780. I'm most interested in the prospect of turning a Titan into a K6000.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: oguz286 on October 23, 2013, 10:52:09 pm
User athanor posted photos of his TITAN a while back (https://www.eevblog.com/forum/chat/hacking-nvidia-cards-into-their-professional-counterparts/msg212115/?topicseen#msg212115 (https://www.eevblog.com/forum/chat/hacking-nvidia-cards-into-their-professional-counterparts/msg212115/?topicseen#msg212115)).
I sent him a personal message to ask if he could measure some resistor values so that we can cross-reference some values. He hasn't responded though :(

If anyone has a TITAN or a K20 or K20X and could measure some resistors, then it would help tremendously.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on October 25, 2013, 08:43:11 am
Just completed modifying my new GTX690 and I can confirm what I suggested previously - the only hard-mod required is removing the resistor controlling the 3rd nibble.

This makes the 3rd nibble unstable, but it only flaps between A and B. That's not a problem since that is only an instability in the least significant bit of the 3rd nibble, or bit 5 of the device ID.
Since that means only the bottom 5 bits need to change/stabilize, this can be achieved with a soft mod.

On my card, the default soft-strap in the UEFI header starting at 0xC is effectively null (fully hard-strapped card), i.e.
FF FF FF 7F 00 00 00 80
We want all 5 accessible bits of the device ID to go high, so we need to change the OR section to:
00 3C 00 90.

And voila. GTX690 is now stably a Grid K2 with only two resistors removed, none replaced - which makes the mod at least 4x less complex. The improvement in difficulty is probably greater than 4x since manually re-soldering a 0402 component is considerably more difficult than removing it unless you truly are a ninja with a soldering iron.

Also note that the UEFI headers don't appear to be a part of the checksum - Kepler BIOS Tweaker doesn't detect a checksum mismatch after the above change to the strap. Nvflash also doesn't complain about anything other than the fact that the BIOS ID you end up flashing doesn't match the device ID of the board you are flashing to, so you have to do it with --overridetype - no big deal. It also appears there is no strap checksum in the UEFI headers, unlike in the main BIOS payload. All of this makes the process even simpler.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: oguz286 on October 25, 2013, 09:00:26 am
Can I ask how you figured out where the UEFI header is? The guy who did the GTX480 to Tesla mod is a colleague of mine, but I did not know that the soft-strap locations for Kepler cards were known. Any link to where I can find such information?
Maybe this is also possible on the 700-series.

BTW, I was very busy so I couldn't make any progress with the GTX780, but I will have some more time tomorrow :)
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on October 25, 2013, 10:25:13 am
UEFI header is the first 1024 (0x400) bytes. If you look at the BIOS closely you'll find that once you strip off the first 0x400 bytes and all the padding and crypto certs off the end (more or less everything past 64KB), you are left with what is a pretty familiar Nvidia BIOS that hasn't changed much since the Fermi (4xx series) days.

The soft-straps for Kepler are similar to Fermi. At least the ID bits are the same, but I suspect most if not all bits will be the same. It is possible (even likely) some of the previously unknown/unused bits now do something new/different, of course.

I'd be surprised if this worked fundamentally differently on the 7xx series cards.

Update on my GTX690/Grid K2 mod - the card works fine on bare metal, but I cannot for the life of me get it to work with VGA passthrough. Tried both GPUs, and all I get is error 10 (device cannot start). My GTX680 works fine, both as a K5000 and a K2. I suspect the doubled-up PLX PCIe bridge on the 690 is causing a problem - NF200 bridges on my SR-2 are already problematic, and further bridging on top of them isn't going to be helping. The PCIe arrangement is thus Intel PCIe bridge -> NF200 (known to have VT-d affecting bugs) -> PLX bridge -> PLX bridge -> GPU. :(

Which means I need to start seriously considering either taking a chance on a Titan, in the hope that we can figure out how to mod it into a K6000 and that the driver "just works" in that arrangement, or getting an ATI card for the VM this was intended for. Tough choice... I guess it depends on whether I can get a Titan for a similar price to an R9 290X...
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: oguz286 on October 25, 2013, 09:54:46 pm
I tried to softmod the 4th nibble (0x1004 to 0x1002, so from 4 to 2). My BIOS contains the following:

00000000: 4E 56 47 49 0C 01 10 80 B8 04 00 00 DE 10 4B 10
00000010: 08 E2 00 00 00 06 00 00 02 10 10 82 FF 3F FC 7F
00000020: 00 50 00 80 0E 10 10 82 FF FF FF 73 00 00 00 8C
00000030: 00 50 00 80 0E 10 10 82 FF FF FF 73 00 00 00 8C
00000040: 36 10 10 02 FF FF FF FF 00 00 00 00 02 87 08 02

So in my case I guess that means that:
        HEX        Binary
AND0    7FFC3FFF   0111 1111 1111 1100 0011 1111 1111 1111
OR0     80005000   1000 0000 0000 0000 0101 0000 0000 0000
AND1    73FFFFFF   0111 0011 1111 1111 1111 1111 1111 1111
OR1     8C000000   1000 1100 0000 0000 0000 0000 0000 0000

If the bit positions for the device ID are still the same, then to go from 0x1004 to 0x1002 I would change AND0 to 7FFC3BFF and OR0 to 80005200 so that I don't take the resistor value for the 4th nibble, and force it to 2 via OR0.
I tried this and also changed the device ID strings from 0x1004 to 0x1002 in the BIOS, updated the checksum, flashed it to the card... it still displays 0x1004 :(

I hope I did something wrong, but if not, then NVIDIA probably changed some things. The reason I started with the 4th nibble is that I hoped I could change it without hard-modding.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on October 25, 2013, 10:32:50 pm
I just spotted an error in what I said - I'm not sure where 0xC came from. Looking at the BIOS dumps I think I meant 0x1C. Annoyingly, I don't have my GTX690 dumps handy, and they are the most minimally modified ones I have.

Ignore the second strap mask - there is nothing of interest there. The important bit is AND0/OR0.

The 32-bit mask can be represented this way:

-xx4 xxxx       xxxx xxxx       xx32 10xx       xxxx xxxx

You want to flip bit 2 low (AND0 to 0, OR0 to 0) and bit 1 high (OR0 to 1).

OLD AND0   7FFC3FFF   0111 1111 1111 1100 0011 1111 1111 1111
NEW AND0   7FFC2FFF   0111 1111 1111 1100 0010 1111 1111 1111
OLD OR0   80005000   1000 0000 0000 0000 0101 0000 0000 0000
NEW OR0   80004800   1000 0000 0000 0000 0100 1000 0000 0000

Which makes the new mask:
AND0 7FFC2FFF
OR0 80004800

And of course remember the byte order is little-endian when editing in the BIOS. :)

Disclaimer - I may be completely wrong in the above calculation - I haven't had enough coffee yet today.

On the off-chance I'm right, however, the relevant hex pseudo-patch would be:

< 00000010: 08 E2 00 00 00 06 00 00 02 10 10 82 FF 3F FC 7F
> 00000010: 08 E2 00 00 00 06 00 00 02 10 10 82 FF 2F FC 7F
< 00000020: 00 50 00 80 0E 10 10 82 FF FF FF 73 00 00 00 8C
> 00000020: 00 48 00 80 0E 10 10 82 FF FF FF 73 00 00 00 8C
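In script form the same patch looks roughly like this - a sketch only, assuming your dump matches the layout in the hexdump above (AND0 at offset 0x1C, OR0 at 0x20, both stored little-endian); check the old values it prints before flashing anything:

import struct, sys

AND0_OFF, OR0_OFF = 0x1C, 0x20              # offsets per the hexdump above
NEW_AND0, NEW_OR0 = 0x7FFC2FFF, 0x80004800  # the new masks worked out above

def patch_strap(path_in, path_out):
    data = bytearray(open(path_in, "rb").read())
    old_and0 = struct.unpack_from("<I", data, AND0_OFF)[0]
    old_or0 = struct.unpack_from("<I", data, OR0_OFF)[0]
    print(f"AND0 {old_and0:08X} -> {NEW_AND0:08X}")
    print(f"OR0  {old_or0:08X} -> {NEW_OR0:08X}")
    struct.pack_into("<I", data, AND0_OFF, NEW_AND0)   # written back little-endian
    struct.pack_into("<I", data, OR0_OFF, NEW_OR0)
    open(path_out, "wb").write(data)

patch_strap(sys.argv[1], sys.argv[2])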

If you could let me know in the next few hours if that works for you, I'd very much appreciate it - I need to make a decision on whether to get a Titan by tonight, and a confirmation that soft-modding works on the GTX780 would go a long way toward persuading me that is the way forward.

Unfortunately, the GTX780 and Titan only differ in the 4th nibble, so while we should have no trouble figuring out which resistor controls the 4th nibble based on the difference, finding the 3rd nibble will be more difficult.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: oguz286 on October 25, 2013, 11:42:13 pm
Yeah, I haven't had coffee either. I cannot recall the reasoning behind my modifications, but yes, they are wrong.
Your modifications look right, and I will try them when I get back home. Hopefully it will work :)

EDIT: Well gordan, you were right! nvflash now displays device ID 0x1002! Now we just have to figure out how to modify the 3rd nibble (which is going to be harder).
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: Cubexed on October 26, 2013, 04:54:17 am
Hello, I have a GTX 770 with PCI ID 0x1184 and I am wondering whether, by changing the soft straps, I can turn it into 0x118F (Tesla K10), as both have the same chip. I want it mainly for VGA passthrough, as http://wiki.xen.org/wiki/XenVGAPassthroughTestedAdapters indicates that it might work, but I would like an expert opinion before attempting anything.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on October 26, 2013, 07:13:42 am
GTX770 is a relabeled GTX680.
Yes, you can change the soft-straps as described above to turn it into a Tesla K10.
Tesla cards do not work for VGA passthrough.

The only cards that will work for VGA passthrough are:

Fermi: (soft-moddable)
Quadro 2000 (or modified GTS450)
Quadro 4000 (no directly equivalent GeForce)
Quadro 5000 (or modified GTX470)
Quadro 6000 (or modified GTX480)

Kepler: (some hard-modding required)
Quadro K5000/Grid K2 (or modified GTX680/GTX770, some got a modified GTX690 to work, but mine refuses to, almost certainly due to the extra PCIe bridging on the card on top of NF200 PCIe bridges on the motherboard - happy to sell you my modified one if you're interested).

IIRC GTX650 -> Grid K1 has also been done.

Also, as you can see above, some effort is going into figuring out how to modify a GTX780/Titan into a K6000, but we're not there yet.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: axero on October 28, 2013, 02:55:21 am
Earlier in this thread there was a discussion about the FLR feature which, surprisingly, in spite of its sheer simplicity, is rarely implemented in PCI hardware for some reason. So another way to trigger a reset in selected hardware is to use the ACPI API and alter the power states of that hardware. In ACPI the power state D0 means fully on and the power state D3 means off. In power state D3 the Vcc of the PCIe slot is turned off, and in state D0 it is turned back on. Therefore this reset method is called D3-D0.

Now to my question: does this really work on PCIe cards that also take power from an auxiliary power source, i.e. directly from the PSU, like most graphics adapters do these days? Cycling the slot through D3-D0 will not cut power to the whole card, only to the slot the card is sitting in. Has anyone managed to reset a GPU with the D3-D0 method above in spite of the auxiliary power connection? I would really like to know more about that.
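For reference, on a Linux host the kernel's own reset hook can be poked from userspace - a rough sketch, assuming the device exposes the sysfs reset attribute (the PCI address below is only an example). Whatever method the kernel settled on (FLR, a D3hot->D0 power-management reset, or a secondary bus reset) is what gets exercised, and none of them touch the auxiliary PSU connectors, which is exactly the open question:

import os

BDF = "0000:01:00.0"                       # example address - substitute your GPU's
DEV = f"/sys/bus/pci/devices/{BDF}"

def kernel_reset():
    # ask the kernel to reset this function using whatever method it supports
    with open(os.path.join(DEV, "reset"), "w") as f:
        f.write("1")

def remove_and_rescan():
    # heavier hammer: drop the device and re-enumerate the bus
    with open(os.path.join(DEV, "remove"), "w") as f:
        f.write("1")
    with open("/sys/bus/pci/rescan", "w") as f:
        f.write("1")

kernel_reset()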


Also, I read a lot of questions regarding the vGPU feature offered by the Grid K2 GPU. A few years ago I wrote a post regarding a similar feature in the Intel Community (and some other forums which I cannot remember):

https://communities.intel.com/thread/25945

In my posts there I was suggesting the development of Intel VT-x/AMD-V-like extensions that allow the GPU(s) to be shared seamlessly among VMs (and the host), just like the CPU cores can be shared among the (physical and virtual) machines with hypervisors such as VirtualBox, Parallels Desktop, Xen, KVM, Hyper-V, VMware and so on. A lot has happened since then, so I suspect that the vGPU feature is exactly that: a set of hardware-assisted GPU extensions. The rendered image (of the computer desktop, a DirectX game or video playback) is then shared over a network connection, most likely by using a remote desktop protocol such as RDP, VNC, Spice, etc. (Also, the AMD FirePro R5000 et al. have the capability of outputting the video signal over an ethernet connection; it even has its own NIC sitting on the card.)

If this is the case then perhaps those extensions get disabled on non-K2 GPUs by zener-zapping a certain set of fuse bits. That would probably lead to the GPU ignoring such instructions as it receives them. Perhaps some instruction sets' execution bits have a connection to one of those fuse bits through an AND gate (or NAND gate, depending on how you look at it). These instructions may then get blocked even before they reach the cores.

Edit: Regarding Nvidia's penchant for artificially disabling things, they appear to have done that in their Linux binary blob. This article mentions this about their Mosaic feature:

http://www.phoronix.com/scan.php?page=news_item&px=MTQ3NDE
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on October 28, 2013, 10:47:35 am
IIRC the power management trick is something Xen already implements, provided the GPU actually reports as supporting D3 and D0 power management states.

It doesn't always work, though. Due to various issues, one of my gaming VMs is running an HD7970 card, and for some bizarre, inexplicable reason GPU-Z locks up the VM and the GPU hard. Issuing the reset (presumably implemented using the above trick, since the card doesn't support FLR) does nothing to shake it loose. The particularly weird thing is that if I use an HD4850 card for dom0, the whole machine locks up solid as soon as the ATI driver is loaded in domU. This doesn't happen, however, if the domU card is an HD7450. Really strange. Because of this I had to switch back to my 8800GT card for dom0.

The trick might, however, just work well enough to resolve the issue of performance degradation after domU reboots (instead of rebooting, power it off, issue the reset, then start up the domU).

While this is still nowhere near the "just works", 100% reliable and robust experience achieved with Fermi cards (GTS450/GTX470/GTX480) soft-modded into the corresponding Quadros (Quadro 2000/5000/6000 respectively), it is a vast improvement on ATI usability compared to a few months ago. So much so, in fact, that given my experience of assorted oddness with my Kepler cards (the GTX680 won't do dual-link DVI modes, but only when virtualized (it works fine on bare metal); the GTX690 modified to Grid K2 flat out refuses to initialize in domU), I'm actually switching back to using ATI cards for some VMs where I need more performance than a GTX480/Q6000.

Anyway, back to the power management reset trick: (most?) Nvidia cards don't seem to support the required levels of power management to do that trick (or at least they don't seem to advertise it), yet they work just fine across domU reboots.

Finally, I don't think any of the extensions for virtualization are disabled - I seem to recall that somebody posted somewhere that they got vSGA/ESXi working with a GeForce GK104 based card after modifying it. If you think about it, no special hardware support is required. All vSGA does is implement a virtual GPU driver that offloads DirectX and suchlike via a paravirtualized interface to the host's GPU. I may be wrong, but IIRC older models like Quadro 6000 are also supported.

I guess the next logical step from there might be to use remote GPU rendering using a trick along the lines of DMA mapping the BARs using RDMA over infiniband.

Also, you don't actually need Mosaic on Linux - you can achieve the exact same thing using standard Xorg configuration and extensions like Xinerama. I've been using it for years - it's the only way to get stretched full screen desktop working on monitors like the IBM T221 that require two dual-link DVI channels to achieve enough bandwidth for their full resolution and frame rate, and thus appear as either 2x1920x2400 screens or 4x960x2400 screens that need to be "glued" together in Xorg.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: axero on October 28, 2013, 12:10:31 pm
Perhaps a failure to reset, e.g. on the earlier generations of ATi/AMD cards, might be due to the auxiliary power keeping the card "alive" even when the power to the PCIe slot is cut. If that is the case then maybe a switch or relay that turns off the auxiliary input (upon detection of a Vcc cut) might help (such relays ought to be able to take quite a few amps though: 240W @ 12V => 20A).

After reading a whitepaper on vSGA/vDGA deployment in ESXi I found, at the end of that document, that the Quadro 6000 uses "Bridge reset" as its reset method (a Google search for '"bridge reset" virtualization' revealed that the Quadro 4000 and Grid K1 also use this method; this can be found in ESXi through the command "esxcli hardware pci list -c 0x300 -m 0xffc"). Bridge reset is what I believe is also known as "bus reset", which presumably means that the entire PCI bus is reset. The different reset methods are discussed very briefly in the VM DirectPath documentation, although I don't know any more about it. How a bridge or bus reset works is, to this date, a mystery to me...

The special thing about the vGPU feature is that one GPU can be shared among up to 8 virtual guests, i.e. it is not dedicated to one VM as with VGA passthrough (or vDGA in ESXi). That requires a more sophisticated solution than when it is dedicated, which made me suspect that the drivers are not only paravirtualized but also hardware-assisted through certain extensions (mind you, the AMD-V/Intel VT-x extensions do not require special paravirtualized drivers on the guest side). The downside of this technology is that it currently only gives up to 512MB of video RAM to each VM and that only DirectX up to version 9.0c is supported, at least in ESXi. Other conditions may apply in Hyper-V and other hypervisors that support the vGPU technology. So, maybe there are no hardware extensions involved with the vGPU technology after all.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: oguz286 on October 28, 2013, 12:32:32 pm
Well, here's an update. In trying to find the resistor(s) that control the third nibble, ijsf (the guy who did the original GTX480 to Tesla hack) and I screwed around with the BIOS, and sure enough, the card was not recognized anymore.

I disconnected the power to the EEPROM but that didn't help either. In the end I hooked the EEPROM up to my Raspberry Pi, wrote a Python script that can read from and write to the EEPROM, and finally managed to write the original BIOS back. Luckily the card works again, and now I can always reflash the card because I have a breakout board that I can hook up to my RPi :D
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on October 28, 2013, 06:57:45 pm
Perhaps a failure to reset, e.g. on the earlier generations of ATi/AMD cards, might be due to the auxiliary power keeping the card "alive" even when the power to the PCIe slot is cut. If that is the case then maybe a switch or relay that turns off the auxiliary input (upon detection of a Vcc cut) might help (such relays ought to be able to take quite a few amps though: 240W @ 12V => 20A).

You know, you might be on to something here - I thought I found that resetting the HD7450 card works, but that card only draws power from the slot; no auxiliary ATX power goes into it. So perhaps you are right. Putting the card into the D3 state and cycling the ATX aux power might just do the trick. You'd need a multi-throw (3 lines per power connector) switch to make this work, but it does sound like something worth testing. Do post some designs if you come up with an internal USB+hub widget that can do this. :)

The special thing about the vGPU feature is that one GPU can be shared among up to 8 virtual guests, i.e. it is not dedicated to one VM as with VGA passthrough (or vDGA in ESXi). That requires a more sophisticated solution than when it is dedicated, which made me suspect that the drivers are not only paravirtualized but also hardware-assisted through certain extensions (mind you, the AMD-V/Intel VT-x extensions do not require special paravirtualized drivers on the guest side). The downside of this technology is that it currently only gives up to 512MB of video RAM to each VM and that only DirectX up to version 9.0c is supported, at least in ESXi. Other conditions may apply in Hyper-V and other hypervisors that support the vGPU technology. So, maybe there are no hardware extensions involved with the vGPU technology after all.

It's not that magical/complicated. VMware has had accelerated emulated drivers in desktop hypervisors for years. They offloaded guest's 3D rendering onto the host's OpenGL subsystem. vSGA is conceptually similar - it is a hardware accelerated emulated graphics device that offloads the rendering work to the real GPU.

Well, here's an update. In trying to find the resistor(s) that control the third nibble, ijsf (the guy who did the original GTX480 to Tesla hack) and I screwed around with the BIOS, and sure enough, the card was not recognized anymore.

I disconnected the power to the EEPROM but that didn't help either. In the end I hooked the EEPROM up to my Raspberry Pi, wrote a Python script that can read from and write to the EEPROM, and finally managed to write the original BIOS back. Luckily the card works again, and now I can always reflash the card because I have a breakout board that I can hook up to my RPi :D

Awesome stuff. :)
Any chance you could use this opportunity of having an unbrickable card to investigate whether the meaning of any of the bits in the first 32-bit strap has changed, and whether one of the previously unknown bits might now be used to set the 6th device ID bit? It'd be really handy if the soldering solution could be fully deprecated.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: axero on October 29, 2013, 04:03:32 am
Maybe there's not that much "magic" in sharing a GPU between VMs, but it is quite tricky to do that without overhead and yet remain as feature-rich as on bare metal. Before AMD-v and Intel VT-x the CPU sharing took a rather substantial penalty from the virtualization. Now this penalty is rather small thanks to the hardware assisted virtualization technology offered through VT-x and AMD-v. From the papers on vGPU there seems to be a rather small penalty to sharing the GPU; either they have really managed to come up with smart drivers or there is something hardware-assisted to back it up. Maybe there is a rather substantial overhead that is "offset" by the capabilities of the GPU.

I have started a thread for resetting PCI devices with auxiliary power input here:

https://www.eevblog.com/forum/projects/acpi-power-saving-circuitry-for-150-w-pci-devices-%28ie-gpus%29/

I guess a further discussion about it should be taken there.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on October 29, 2013, 04:23:45 am
Maybe there's not that much "magic" in sharing a GPU between VMs, but it is quite tricky to do that without overhead and yet remain as feature-rich as on bare metal.

It's not without overhead - the overhead is likely quite substantial, not including the inevitable overhead of actually encoding it into an MPEG stream in realtime and sending it over the network. While it is a cool feature, its use-case is rather narrow.

Before AMD-v and Intel VT-x the CPU sharing took a rather substantial penalty from the virtualization.

Even with those, the virtualization performance penalty is substantial:
http://www.altechnative.net/2012/08/04/virtual-performance-part-1-vmware/

There were also other solutions before VT-x that provided only marginally worse performance (e.g. kqemu)

Now this penalty is rather small thanks to the hardware assisted virtualization technology offered through VT-x and AMD-v. From the papers on vGPU there seems to be a rather small penalty to sharing the GPU; either they have really managed to come up with smart drivers or there is something hardware-assisted to back it up. Maybe there is a rather substantial overhead that is "offset" by the capabilities of the GPU.

That's pretty much it - the GPU has enough processing power to produce reasonable results within the given constraints. That doesn't mean it's particularly efficient. I would be surprised if the performance is much more than 50% of what you might expect on bare metal, especially after you account for the MPEG encoding. I have a virtualized gaming rig that is pretty finely tuned, and the frame rates on bare metal are at least 10-20% higher - and that's just with VGA passthrough, which is a lot less overheady than something like vSGA.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on October 29, 2013, 06:32:59 pm
Well, here's an update. In trying to find the resistor(s) that control the third nibble, ijsf (the guy who did the original GTX480 to Tesla hack) and I screwed around with the BIOS, and sure enough, the card was not recognized anymore.

I disconnected the power to the EEPROM but that didn't help either. In the end I hooked the EEPROM up to my Raspberry Pi, wrote a Python script that can read from and write to the EEPROM, and finally managed to write the original BIOS back. Luckily the card works again, and now I can always reflash the card because I have a breakout board that I can hook up to my RPi :D

Any chance you could post a detailed explanation of what you did to make an unbricking rig? I have a suspicion that the root cause of the death of my first GTX690 might have been a misflash that corrupted the PLX chip (PCIe bridge) EEPROM. It'd be nice to have a go at resurrecting it.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: axero on October 29, 2013, 10:32:26 pm
Before AMD-v and Intel VT-x the CPU sharing took a rather substantial penalty from the virtualization.

Even with those, the virtualization performance penalty is substantial:
http://www.altechnative.net/2012/08/04/virtual-performance-part-1-vmware/

There were also other solutions before VT-x that provided only marginally worse performance (e.g. kqemu)
I guess it depends on what type of load you expose the virtual CPU to. I have seen tests from the Phoronix.com website where the difference between VM and bare-metal performance is considerably smaller. Look for example at this article:

http://www.phoronix.com/scan.php?page=article&item=ubuntu_1110_xenkvm&num=2

Regarding VirtualBox 4, which showed a notoriously bad result in the test you refer to (perhaps an old version? VB 4.3 is out now): if you intend to run VB and find VB4 to be sluggish, you could also try the earlier VirtualBox 3.2.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on October 30, 2013, 08:26:29 am
My advice on the subject of benchmarks is to only believe your own. The reason I carried out mine was that I didn't believe all the ones that showed a negligible performance penalty. The performance hit from core-to-core migration alone is very significant (bigger than the industry-claimed performance hit of virtualization).

Some years ago when I was folding I found that just pinning process threads to cores boosted performance by nearly 25% on a Core2 Quad. Now, granted, on a C2Q you get doubly hit when the process migrates between cores that aren't on the same die (C2Q is a 2x2 design) since the other die won't have its caches primed for that process, but even staying on the same die there is a slow-down.
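(Pinning is trivial to reproduce on Linux these days - a minimal sketch using only the standard library; the PID and core numbers are just examples.)

import os

# restrict the current process (threads started later inherit this) to core 2
os.sched_setaffinity(0, {2})
print("now restricted to CPUs:", os.sched_getaffinity(0))

# or pin an already-running process, e.g. a folding client, by PID:
# os.sched_setaffinity(12345, {2, 3})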

Now take this a further level of abstraction up, where the guest generally has no idea what the physical CPU estate layout might be, and you are making the problem massively worse because the guest process scheduler is running completely blind, while the hypervisor is additionally context-switching different VMs' vCPUs all over the place to run on overbooked physical hardware. It doesn't take much imagination to see how the performance cannot be anything but poor.

Anyway, this is getting off-topic.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on October 30, 2013, 11:43:38 pm
I'm disposing of some of my modified GeForce cards. If anyone who reads this wants a VGA passthrough capable Nvidia card on the cheap but lacks the confidence to attack it with a soldering iron and/or hex editor, you may be interested in this.

I have:
2x GTS450 -> Q2000
1x GTX470 -> Q5000
1x GTX690 -> Grid K2

PM me if you are interested in any of these and are in the EU (outside the EU the import duty and shipping would make this uneconomical).
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: oguz286 on October 31, 2013, 05:05:15 am
Well, here's an update. In trying to find the resistor(s) that control the third nibble, ijsf (the guy who did the original GTX480 to Tesla hack) and I screwed around with the BIOS, and sure enough, the card was not recognized anymore.

I disconnected the power to the EEPROM but that didn't help either. In the end I hooked the EEPROM up to my Raspberry Pi, wrote a Python script that can read from and write to the EEPROM, and finally managed to write the original BIOS back. Luckily the card works again, and now I can always reflash the card because I have a breakout board that I can hook up to my RPi :D

Any chance you could post a detailed explanation of what you did to make an unbricking rig? I have a suspicion that the root cause of the death of my first GTX690 might have been a misflash that corrupted the PLX chip (PCIe bridge) EEPROM. It'd be nice to have a go at resurrecting it.

It's really not that complicated. I have a Gigadevice GD25Q10B SPI EEPROM on my card and I used the datasheet to figure out what the layout of the pins was. The SPI protocol is really simple as you have six pins that you have to connect: Vcc, Vss, CE# (= chip select), SCLK (= serial clock), MISO (= Master In, Slave Out) and MOSI (= Master Out, Slave In). The WE# and HOLD# pins are connected to Vcc, so you just solder wires onto the EEPROM and connect them to the Raspberry Pi. Then you use the spidev user-space module to send commands to the EEPROM, which are listed in the datasheet. To get it running quickly I used py-spidev so that I could whip up a simple Python script that dumps the EEPROM (to check if I can communicate with the chip correctly), and one script that writes the correct ROM to the chip.

One potential problem though: I have a different card with a Pm25LV512 chip which I couldn't program. That is because it reads/writes data on the falling edge of the clock, whereas the spidev module reads/writes on the rising edge of the clock. This is stated in the datasheet of that chip, and the datasheet of the GD25Q10B that is on the GTX780 states that it reads/writes on the rising edge of the clock. If you have a chip that acts on the falling edge of the clock you need to use a different Linux kernel module that supports this mode (spidev does not), which involves recompiling the Raspbian kernel.
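For anyone building the same rig, the read side really is only a handful of lines with py-spidev - a sketch for a GD25Q10B-style chip using the standard 0x03 READ command; the wiring, the bus/CS numbers and the ROM size are assumptions you will need to adapt to your own card:

import spidev

ROM_SIZE = 128 * 1024       # GD25Q10B is 1 Mbit - adjust to your part
CHUNK = 256

spi = spidev.SpiDev()
spi.open(0, 0)              # /dev/spidev0.0 - match your wiring
spi.max_speed_hz = 1000000
spi.mode = 0                # data clocked on the rising edge

def read_flash(addr, length):
    # 0x03 = standard SPI NOR READ, followed by a 24-bit address
    cmd = [0x03, (addr >> 16) & 0xFF, (addr >> 8) & 0xFF, addr & 0xFF]
    return bytes(spi.xfer2(cmd + [0x00] * length)[4:])

with open("dump.rom", "wb") as f:
    for off in range(0, ROM_SIZE, CHUNK):
        f.write(read_flash(off, CHUNK))

# writing back needs a Write Enable (0x06) before each 256-byte Page Program (0x02),
# an erase first, and polling the status register (0x05) until the chip is ready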
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: foxdie on November 01, 2013, 08:12:49 am
Hi all, just want to chime in here.

Firstly, it's great to see such passionate hacking of Nvidia's offerings. There have been a few geekgasms perusing this thread  :-+

Secondly, this thread has grown to become quite a whopper and extracting information is becoming quite a challenge. What would be nice is a summary post once in a while like gordan's post here (https://www.eevblog.com/forum/chat/hacking-nvidia-cards-into-their-professional-counterparts/msg316739/#msg316739). That helps people of all experience levels get a broad overview of progress, and those with deeper understanding can delve deeper.

What's not clear to me at this point is what will and won't work for those of us who want to make daily use of the end result. I personally would like to give guests in ESXi a decent 3D performance bump but I'm not sure how to approach that (what card is seen as the best starting point, what work needs doing to it, etc.). I realise this thread isn't about making card X work with technology Y but most of us are here for the virtualisation benefits.

I'm not scared to crack out a soldering iron and a multimeter (although I doubt my hands are steady enough to resolder SMD resistors hehe), I'd just like some recommendations on what direction to take :)
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on November 01, 2013, 11:48:36 am
What's not clear to me at this point is what will and won't work for those of us who want to make daily use of the end result. I personally would like to give guests in ESXi a decent 3D performance bump but I'm not sure how to approach that (what card is seen as the best starting point, what work needs doing to it, etc.). I realise this thread isn't about making card X work with technology Y but most of us are here for the virtualisation benefits.

It comes down to 2 things:
1) Budget
2) Hand steadiness / availability of suitable soldering equipment (tiny soldering iron, at least a good magnifying lamp, preferably an electronics microscope)

If you are on a tiny budget (<= £50) and only need reasonable performance with direct VGA passthrough up to 1080 resolution, get a GTS450 GDDR5 card and turn it into a Quadro 2000.

As your budget and performance requirements go up, a GTX470 or GTX480 may be more suitable (they make good Quadro 5000 and Quadro 6000 cards respectively). These work not only with direct VGA passthrough, but should also work with ESXi vSGA. There have been reports of people successfully running ESXi vSGA with 6 clients using a single Quadro 6000 card to achieve mostly playable Borderlands at 800x600. A modified GTX480 should be somewhat faster at this than a real Quadro 6000. A modified GTX470 should be about 50% faster than a real Quadro 5000. I use Xen rather than ESXi, but I wouldn't expect anything to notice any difference between one of these modified cards and the real thing as far as GPU acceleration offload is concerned.

If your budget and performance requirements are even higher, you have little choice but to modify a GTX680/GTX690/GTX770 into a Quadro K5000 or Grid K2. Either mod works fine - I have not observed any obvious difference in functionality or stability between modifying to K5000 or K2. For this, however, you will need to at least remove the resistor controlling the 3rd device ID nibble. Replacing an 0402 resistor is harder than removing it, but leaving the resistor off can cause device ID instability. You can compensate for this by soft-modding the available strap bits.

It is perhaps worth noting that my experience with modified Fermi cards has been 100% problem-free (Q2000, Q5000 and Q6000 mods have all always worked flawlessly).

Kepler modding has been somewhat unpredictable for me. I had a GTX690 card that wouldn't cooperate for some reason, and another that works fine, only I cannot use it due to a motherboard bug (avoid anything with NF200 PCIe bridges if you want to virtualize). I have a GTX680 card that is quite thoroughly modified into a K5000, but it is also not trouble-free (read back on the thread about the bizarre dual-link DVI issue - single-link DVI modes work fine, dual-link modes don't, and neither does DP - but all this is only a problem when running virtualized; on bare metal the card works absolutely fine on all ports and in all modes). Nobody else reported similar issues (and in fact, many people reported a resounding success with modified Keplers), so this is probably just my talent for finding bugs in everything showing up.

So if you can live with the performance, and you want to put in the least possible amount of effort on a relatively tight budget, GTX470 or GTX480 is probably the best price/performance/ease compromise.

Does that answer your question?
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: foxdie on November 01, 2013, 11:30:57 pm
Hi gordan, you're a legend for getting back to me so quick.

Budget isn't an issue (within reason), I'm looking to set up a virtualised gaming rig much like yourself. There'll be a Windows 7 (64-bit ent) VM that'll need as much 3D gaming grunt as possible and a couple of other VMs that need some acceleration to be responsive and usable (possibly 3D too). Naturally I want to go for as much power as possible, I plan to use the machine for current and next gen gaming.

My system will be a SuperMicro X9SRA (http://www.supermicro.co.uk/products/motherboard/Xeon/C600/X9SRA.cfm) motherboard with a Xeon E5-2620 v2 (http://ark.intel.com/products/75789/) and 32GB Reg ECC RAM.

I thought about converting a GTX 480 into a Q6000 as they're fairly cheap to pick up used on ebay, however I'm not sure which brand would follow reference design to make the modification less of a headache.

What wasn't made clear earlier was that a Q6000 clone (GTX 470 / 480) can be used as vSGA with 6 guests, that's a great piece of information for those looking to accelerate 3D on multiple VMs on a budget  :-+

What would also be handy to know, and again I assume this is probably beyond scope of this thread, would be if multiple Q6000 clones can be added to a system, one passthrough'd to one VM directly for as much acceleration as possible, and a second Q6000 clone distributing as vSGA between remaining VMs? GTX 480s can be picked up second hand for around £100 each on eBay so they'd make a great price vs benefit starting point.

As for the GTX 680, I was secretly hoping you would have found a solution by now, but as with all things of this nature, if it were too easy everyone would be doing this to their cards :)
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on November 02, 2013, 12:35:56 am
Budget isn't an issue (within reason), I'm looking to set up a virtualised gaming rig much like yourself. There'll be a Windows 7 (64-bit ent) VM that'll need as much 3D gaming grunt as possible and a couple of other VMs that need some acceleration to be responsive and usable (possibly 3D too). Naturally I want to go for as much power as possible, I plan to use the machine for current and next gen gaming.

Depends on your intended resolution. If all you want is 1080 resolution capability, GTX480/Q6000 will deliver. Granted, my eyes don't seem to see things the way most people's do (more pixels, not as many frames per second, or so it seems), but I happily completed Crysis+Warhead on a T221 3840x2400 in a VM on my Quadrified GTX480. On the same physical host, my wife was finding Borderlands 2 unplayably bad at 2560x1600 (so I temporarily put a HD7970 in her VM (and yes, it crashed the host when you try to reboot the VM - I'm hoping the ATI pollution is going to be temporary) and kept the 480 for my VM).

If you need more than 2 VMs with 3D acceleration on that motherboard, you are going to have to use something like a GTX690, given it only has 2 PCIe x16 slots.

I thought about converting a GTX 480 into a Q6000 as they're fairly cheap to pick up used on ebay, however I'm not sure which brand would follow reference design to make the modification less of a headache.

They all do / it doesn't matter.

What wasn't made clear earlier was that a Q6000 clone (GTX 470 / 480) can be used as vSGA with 6 guests, that's a great piece of information for those looking to accelerate 3D on multiple VMs on a budget  :-+

Quick googling of some of the terms in the article from memory yields this:
http://www.simonlong.co.uk/blog/2012/10/25/vmware-view-3d-gaming-experience/

Note that this is not exactly high-performance gaming - it may be 6 clients, but they are 800x600@25fps.

I should also point out that I don't use ESXi for this stuff, so you might want to start on the cheap with a modified 470 or 480 (performance difference between them is pretty negligible, and a 470 will likely outperform a Q6000 as long as you don't run out of VRAM) as a proof of concept before you commit to a more expensive piece of kit like a 680/770 or a 690. I should perhaps also point out (hint:nudge) that there is currently a quadrified GTX470 on eBay at the moment. ;)

What would also be handy to know, and again I assume this is probably beyond scope of this thread, would be if multiple Q6000 clones can be added to a system, one passthrough'd to one VM directly for as much acceleration as possible, and a second Q6000 clone distributing as vSGA between remaining VMs? GTX 480s can be picked up second hand for around £100 each on eBay so they'd make a great price vs benefit starting point.

I see no reason why you couldn't use one card for vSGA and one for vDGA. Just bear in mind that you won't be getting video directly out of your vSGA cards - those VMs will feed you a compressed video stream of the desktop that you will have to decode on another machine. The problem with this being that you need another machine as a terminal (unless you use your vDGA machine as a terminal for it, which would work I suppose, but it gets a bit recursive).

But as I said before - I am not an ESXi user, and while I would expect their solution to be a little more polished than Xen, I suspect you will also have a lot less community support to fall back on if it doesn't work out of the box. Also, last time I checked vDGA was treated as an experimental feature.

As for the GTX 680, I was secretly hoping you would have found a solution by now, but as with all things of this nature, if it were too easy everyone would be doing this to their cards :)

It could be that I just have a weird GTX680, or there is some OS/environmental issue that is manifesting as the problem I mentioned. One of the guys on the Xen list modified a completely standard GTX680, and his works fine in all modes, so my DVI issues are most likely just a bizarre quirk of my system configuration.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: foxdie on November 02, 2013, 01:02:59 am
Depends on your intended resolution. If all you want is 1080 resolution capability, GTX480/Q6000 will deliver. Granted, my eyes don't seem to see things the way most people's do (more pixels, not as many frames per second, or so it seems), but I happily completed Crysis+Warhead on a T221 3840x2400 in a VM on my Quadrified GTX480. On the same physical host, my wife was finding Borderlands 2 unplayably bad at 2560x1600 (so I temporarily put a HD7970 in her VM (and yes, it crashed the host when you try to reboot the VM - I'm hoping the ATI pollution is going to be temporary) and kept the 480 for my VM).

If you need more than 2 VMs with 3D acceleration on that motherboard, you are going to have to use something like a GTX690, given it only has 2 PCIe x16 slots.

I'm admittedly not a UHD junkie; 1080p would meet my needs (and avoid having to buy a new monitor). Framerate is, however, a potential issue - I'd like to be able to play Crysis 3 at 1080p60 with no slowdown.

I should perhaps also point out (hint:nudge) that there is currently a quadrified GTX470 on eBay at the moment. ;)

That's funny, I can't find it, think someone might have snapped it up O0

I see no reason why you couldn't use one card for vSGA and one for vDGA. Just bear in mind that you won't be getting video directly out of your vSGA cards - those VMs will feed you a compressed video stream of the desktop that you will have to decode on another machine. The problem with this being that you need another machine as a terminal (unless you use your vDGA machine as a terminal for it, which would work I suppose, but it gets a bit recursive).

Recursion can be fun! That's probably what I'd do anyway ;)

I have an Asus Radeon HD 6450 bouncing around on my desk anyway so I'll first try passing that through as vDGA and use the GTX 470 as vSGA.

But as I said before - I am not an ESXi user, and while I would expect their solution to be a little more polished than Xen, I suspect you will also have a lot less community support to fall back on if it doesn't work out of the box. Also, last time I checked vDGA was treated as an experimental feature.

True but someone needs to take the plunge to see if this'll work right? I'm in the same boat with the X9SRA, I haven't found any solid documentation this will work but progress isn't made on repetition ;)

It could be that I just have a weird GTX680, or there is some OS/environmental issue that is manifesting as the problem I mentioned. One of the guys on the Xen list modified a completely standard GTX680, and his works fine in all modes, so my DVI issues are most likely just a bizarre quirk of my system configuration.

If you want, I'm happy to test the card in my new system for you, confirm if it's working, then post it back in the same condition it was received in. I'm only up the road (in relative terms given a global community, still 120 miles away :-DD) so RM Special Delivery would be fast.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on November 02, 2013, 02:08:36 am
But as I said before - I am not an ESXi user, and while I would expect their solution to be a little more polished than Xen, I suspect you will also have a lot less community support to fall back on if it doesn't work out of the box. Also, last time I checked vDGA was treated as an experimental feature.

True but someone needs to take the plunge to see if this'll work right? I'm in the same boat with the X9SRA, I haven't found any solid documentation this will work but progress isn't made on repetition ;)

That is, indeed, my view as well. If we'd stuck with repetition the fastest way to travel would still involve staring at an ox's backside for excruciatingly prolonged periods of time.

It could be that I just have a weird GTX680, or there is some OS/environmental issue that is manifesting as the problem I mentioned. One of the guys on the Xen list modified a completely standard GTX680, and his works fine in all modes, so my DVI issues are most likely just a bizarre quirk of my system configuration.

If you want, I'm happy to test the card in my new system for you, confirm if it's working, then post it back in the same condition it was received in. I'm only up the road (in relative terms given a global community, still 120 miles away :-DD) so RM Special Delivery would be fast.

Interesting thought. Unfortunately, I'm using it at the moment, and, ironically, I'm fresh out of spares I could drop in.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: johnjoe on November 02, 2013, 11:29:59 pm
Servus,

I changed the marked resistor in the pic from 25k to 40k. Now the PCI ID is 1025 instead of 1005 for a Titan. The aim is to get 1020 (K20X). I have already changed the resistors nearby, but it didn't change anything. Does anybody have any ideas?
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on November 03, 2013, 12:25:47 am
I changed the marked resistor in the pic from 25k to 40k. Now the PCI ID is 1025 instead of 1005 for a Titan. The aim is to get 1020 (K20X). I have already changed the resistors nearby, but it didn't change anything. Does anybody have any ideas?

Holy crap, you did it!
There is no need to change any further resistors - you can soft-mod the last nibble. See the info on page 41.

/me goes to acquire a Titan.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: foxdie on November 03, 2013, 02:04:45 am
Yeah I was just thinking he could soft-mod it, great progress :) Wonder what the potential of the titan is? Extreme virtualised gaming? :D

(Might treat myself for xmas lol)
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: oguz286 on November 03, 2013, 03:34:13 am
Haha, I was JUST going to post that I managed to convert my GTX780 to a Tesla K20 and I saw that johnjoe beat me to it :-DD I know I should have posted here before I wrote an article on my site about how I figured it out.

The first 5 bits of the device ID are softstrapped so just change the appropriate bits and reflash your card.

But I can also tell you that modding the GTX TITAN BIOS with the right straps will not turn it into a working Tesla K20. For example, you cannot disable the TCC mode, and you cannot run any CUDA code. :( What you CAN do however is go to TechPowerup and download the only K20 BIOS they have, and change the soft straps on that BIOS ;)

Btw, it would be awesome if you run some CUDA samples to see if everything runs fine (and report back of course ;) ).
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: johnjoe on November 03, 2013, 04:08:38 am
Ok, can someone explain to me in a little bit more detail how to work with the softstraps? I have some sim software to test CUDA.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: oguz286 on November 03, 2013, 04:53:35 am
Ok, this is untested and I'm doing this off the top of my head, so please check how this works!

First download the K20c BIOS from techpowerup.com (or if you have a Tesla K20X BIOS somewhere, use that). Open up this file with a hex editor (for example HxD). You will see something like this:

00000000: 4E 56 47 49 92 01 10 00 60 04 00 00 DE 10 82 09
00000010: 08 E2 00 00 00 06 00 00 02 10 10 82 FF FF FF 7F
00000020: 00 00 00 80 0E 10 10 82 EF FF FE 73 00 00 00 8C
00000030: 36 10 10 02 FF FF FF FF 00 00 00 00 02 87 08 02

Now you have to change the values at 0x1C (mine contains FFFFFF7F as you can see above) to FFC3FF7F, and make sure the values at 0x20 are 00000080.
Make sure you do this correctly! If you make a mistake then you could brick your card. It should be something like this but beware, I DO NOT TAKE ANY RESPONSIBILITY IF YOU BRICK YOUR CARD.

00000000: 4E 56 47 49 92 01 10 00 60 04 00 00 DE 10 82 09
00000010: 08 E2 00 00 00 06 00 00 02 10 10 82 FF C3 FF 7F
00000020: 00 00 00 80 0E 10 10 82 EF FF FE 73 00 00 00 8C
00000030: 36 10 10 02 FF FF FF FF 00 00 00 00 02 87 08 02
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on November 03, 2013, 07:59:06 am
Thanks for the diagram for figuring out the hard-straps - it more or less provides a template for doing the same on future cards.

It is both interesting and concerning that you had to flash a Tesla BIOS onto the card to make it work.

Please keep us posted on the progress of figuring out the VRAM size.

My Titan should be arriving on Tuesday, so with a bit of luck I'll have a pseudo-K6000 by the end of next week. The use case for a Titan is 2-fold:
1) Extreme virtualized gaming, as foxdie says
2) Number crunching that requires double precision floating point - IIRC the Titan, unlike the GTX780, doesn't have crippled DP performance.

Now that I mention that - can you check the before/after DP performance on the Teslified 780? Is there an improvement?
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: oguz286 on November 03, 2013, 08:20:44 am
No problem, sharing is caring after all :)

I cannot test the DP performance right now as I have taken the card apart again. When it's put together I'll test it although I'm pretty sure that it will stay the same.

Could you please convert your Titan into a Tesla and check if CUDA samples run correctly? Because I found out that my card does not run all CUDA programs correctly, and I assume that the incorrect VRAM size is the culprit. It is weird as I have a Tesla K20c BIOS, and the official Tesla K20 has 5GB of VRAM, not 6GB. I can also allocate almost 4GB of memory in CUDA even though my card has 3GB.

Since the Titan is the same as a Tesla K20X, I would like to buy a Titan if I know that the 'Teslalized' Titan runs CUDA apps correctly.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: johnjoe on November 03, 2013, 11:36:45 am
Haha, I was JUST going to post that I managed to convert my GTX780 to a Tesla K20 and I saw that johnjoe beat me to it :-DD I know I should have posted here before I wrote an article on my site about how I figured it out.

The first 5 bits of the device ID are softstrapped so just change the appropriate bits and reflash your card.

But I can also tell you that modding the GTX TITAN BIOS with the right straps will not turn it into a working Tesla K20. For example, you cannot disable the TCC mode, and you cannot run any CUDA code. :( What you CAN do however is go to TechPowerup and download the only K20 BIOS they have, and change the soft straps on that BIOS ;)

Btw, it would be awesome if you run some CUDA samples to see if everything runs fine (and report back of course ;) ).

Ok, first I changed everything in the TITAN BIOS according to your hints. You are right: nvflash now recognizes it as a K20X, but it is not possible to use the card as a Tesla card. I just wanted to try  ;). Honestly, I don't want to load the K20c BIOS. Does anybody have an idea where to get a K20X BIOS?
------------
Maybe I found something...
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: oguz286 on November 03, 2013, 12:08:40 pm
As far as I know, you can simply use the K20c BIOS with the soft straps set correctly, and it should work fine. I've done it as well and it works (kind of, I just need to modify the BIOS to report 3GB of VRAM instead of 6GB like you have).
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: johnjoe on November 03, 2013, 12:09:20 pm
I found a Linux update file for the K20Xm (1021). The BIOS is included but quite small, 128k, but I think it's a matter of trial and error. I saved the extracted part and read it in with BIOS Tweaker. Checksum OK, parameters OK, and the BIOS is read as a K20Xm, as it should be.

How do I have to edit the softstraps to get 1021 instead of 1020?
-----------
You mean I use the K20c BIOS but it looks like a K20X, right? But the K20c BIOS is smaller than the TITAN BIOS - is that a problem?
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: oguz286 on November 03, 2013, 12:31:06 pm
Yes, if you use the K20c BIOS but change the soft straps like you did for your Titan BIOS, it should get recognized as a K20X.
And it's not a problem if the BIOS is smaller. However, I don't know about the other BIOS you found.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: johnjoe on November 03, 2013, 12:44:02 pm
Ok, I updated the BIOS, but it is still not possible to install the Tesla driver, even manually (the driver is not for this Windows version). Strange...
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: mrkrad on November 03, 2013, 12:48:23 pm
you guys notice the GTX780 is a GK110 chip? !!?!

Model                     Launch              Chip           Memory (GDDR5)  Core config   Clocks (core/boost/mem)  Bus      TDP      Price
GeForce GTX 760 192-bit   Unknown             GK104          1536/3072 MB    1152:96:24    823/888/5808 MHz         192-bit  130 W    OEM
GeForce GTX 760           June 25, 2013       GK104-225-A2   2048/4096 MB    1152:96:32    980/1033/6008 MHz        256-bit  170 W    $249
GeForce GTX 760 Ti        Unknown             GK104          2048 MB         1344:112:32   915/980/6008 MHz         256-bit  170 W    OEM
GeForce GTX 770           May 30, 2013        GK104          2048/4096 MB    1536:128:32   1046/1085/7008 MHz       256-bit  230 W    $399
GeForce GTX 780           May 23, 2013        GK110          3072 MB         2304:192:48   863/900/6008 MHz         384-bit  250 W    $649
GeForce GTX 780 Ti        Nov 7, 2013         GK110          3072 MB         2880:240:48   876/928/7000 MHz         384-bit  Unknown  $699 (TBC)
GeForce GTX Titan         February 19, 2013   GK110          6144 MB         2688:224:48   837/876/6008 MHz         384-bit  250 W    $999
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: johnjoe on November 03, 2013, 12:49:58 pm
And?
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: oguz286 on November 03, 2013, 12:51:03 pm
Ok, I updated the BIOS, but it is still not possible to install the Tesla driver, even manually (the driver is not for this Windows version). Strange...

Is it the only card in your computer? It was in mine and then the Tesla did not work. When I put in another NVIDIA graphics card as the primary card, then I could install the driver and use the card.

you guys notice the GTX780 is a GK110 chip? !!?!

Did you notice that I modified my GTX780 into a Tesla? :P That wouldn't be possible if it was not a GK110 chip ;)
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: johnjoe on November 03, 2013, 12:52:36 pm
I'm working with a second card; the Titan is not connected to a screen. Very strange...
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: oguz286 on November 03, 2013, 12:56:02 pm
But is your primary card an NVIDIA card? It does not work if it is an AMD card.

Very strange indeed. You used the K20c BIOS right? For me it works just fine. What does your OS say about the card?
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: johnjoe on November 03, 2013, 12:57:24 pm
The primary card is also an NVIDIA card, a GT9500. Windows reports this card as a 3D game controller.
---------
Did you use the automatic driver installer, or install it manually?
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: oguz286 on November 03, 2013, 01:02:39 pm
If you look at the photos on my site (http://www.guztech.nl/wordpress/index.php/2013/11/researching-nvidia-gpus-geforce-gtx780-and-gtx-titan-to-tesla-k20-and-tesla-k20x/) you will see that nvflash recognizes my card as a Tesla K20(X), and also that the last two numbers are non-zero. nvflash reports four numbers (0x10DE, 0x1020, 0x10DE, 0x104B), and when I was researching the values of the resistors, I once got (0x10DE, 0x1020, 0x0000, 0x0000).

You could check if the last two numbers are correct. I don't know what else could be wrong.

I just downloaded the newest GeForce driver and it installed automatically for me. It took a couple of minutes after rebooting for the card to be recognized (because Windows was installing the driver).
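For reference, those four numbers appear to follow the standard PCI identifier order - vendor ID, device ID, subsystem vendor ID, subsystem device ID. A tiny sketch labelling them (the ordering is an assumption based on standard PCI config space; the values are the ones quoted above):

# The four IDs nvflash prints, labelled per standard PCI config space
# (vendor, device, subsystem vendor, subsystem device). The ordering is
# an assumption; the values are the ones quoted in the post above.
from collections import namedtuple

PciId = namedtuple("PciId", "vendor device subsys_vendor subsys_device")

k20x = PciId(0x10DE, 0x1020, 0x10DE, 0x104B)
print(f"GPU {k20x.vendor:04X}:{k20x.device:04X}, "
      f"subsystem {k20x.subsys_vendor:04X}:{k20x.subsys_device:04X}")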

Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: johnjoe on November 03, 2013, 01:06:18 pm
Strange, I have a different sub-ID: <0> Tesla K20X           (10DE,1020,10DE,0982)
Did you change something more?
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: johnjoe on November 03, 2013, 01:57:33 pm
I have now also changed the sub-ID and the BIOS to K20Xm, but it didn't change anything. nvflash recognizes the card as a K20X, but Windows sees it as a 3D video controller (marked yellow), and it is impossible to install any Tesla driver, either manually or automatically. If somebody has an idea, you are welcome!
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: mrkrad on November 03, 2013, 06:03:45 pm
Does anyone have a BIOS link for the K10 or Grid K2 (or K1), please?

Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on November 03, 2013, 08:01:05 pm
Interesting. So GTX780Ti is overdue to hit the shelves. Full shader count (like the K6000), but half the VRAM of the Titan at 3GB, and 1/3 cheaper than the Titan. I wonder if DP will be crippled.

Seems we need to figure out where in the BIOS the VRAM size is stored and in what format. I'm going to be quite displeased if the Quadrified Titan doesn't work for VGA passthrough without a K6000 BIOS flashed onto it, and that will only work with the VRAM size adjustment.

Has anybody got a copy of a K20X BIOS handy? That should "just work" on a Titan with the strap mod.

Does anyone have a BIOS link for the K10 or Grid K2 (or K1), please?

Why do you need it? The GTX680 works just fine as a Grid K2 with its original BIOS. I'd be surprised if the GT635 didn't also work as a Grid K1 with its original BIOS.

The BIOS requirements we are talking about here seem to be a new thing on GK110-based cards.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: johnjoe on November 03, 2013, 08:29:20 pm
Interesting. So GTX780Ti is overdue to hit the shelves. Full shader count (like the K6000), but half the VRAM of the Titan at 3GB, and 1/3 cheaper than the Titan. I wonder if DP will be crippled.

Seems we need to figure out where in the BIOS the VRAM size is stored and in what format. I'm going to be quite displeased if the Quadrified Titan doesn't work for VGA passthrough without a K6000 BIOS flashed onto it, and that will only work with the VRAM size adjustment.

Has anybody got a copy of a K20X BIOS handy? That should "just work" on a Titan with the strap mod.

Does anyone have a BIOS link for the K10 or Grid K2 (or K1), please?

Why do you need it? The GTX680 works just fine as a Grid K2 with its original BIOS. I'd be surprised if the GT635 didn't also work as a Grid K1 with its original BIOS.

The BIOS requirements we are talking about here seem to be a new thing on GK110-based cards.

I have a BIOS for the K20Xm and I modified the Tesla driver. I was then able to install the driver. I will test and post the results.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on November 03, 2013, 08:55:30 pm
What is a K20XM? I cannot find any official reference to it. Do you have a download link for that BIOS?

And what exactly do you mean by "modified the tesla driver"? Modified how and to do what?
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: johnjoe on November 03, 2013, 09:14:10 pm
The K20Xm should be the same as the K20X in terms of performance; I didn't find any differences. I looked at the .inf file of the NVIDIA Tesla driver and found that no 1020 ID is listed, which means no K20X. I changed the predefined 1021 (K20Xm) to 1020 and was able to install the driver. But GPU-Z, for example, reports only 512 MB of VRAM, and the CUDA sim programs still do not recognize the card.
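For illustration, that kind of .inf edit can be scripted as below. This is a sketch only: the file name and encoding are assumptions, only the PCI\VEN_10DE&DEV_xxxx syntax is the standard Windows hardware-ID format, and editing a signed driver's .inf will normally trip Windows' driver-signing checks.

# Sketch of the .inf edit described above: duplicate every hardware-ID line
# for the K20Xm (DEV_1021) as a DEV_1020 (K20X) entry. The file name and
# encoding are assumptions; only the PCI\VEN_10DE&DEV_xxxx syntax is the
# standard Windows hardware-ID format.
SRC_ID, NEW_ID = "DEV_1021", "DEV_1020"

with open("nv_dispi.inf", "r", encoding="utf-16") as f:  # encoding is a guess
    lines = f.readlines()

patched = []
for line in lines:
    patched.append(line)
    if "PCI\\VEN_10DE" in line and SRC_ID in line:
        patched.append(line.replace(SRC_ID, NEW_ID))     # add a 1020 copy

with open("nv_dispi_mod.inf", "w", encoding="utf-16") as f:
    f.writelines(patched)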
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on November 03, 2013, 09:25:57 pm
That's probably because of an issue with the modified driver. Change the strap in the Tesla BIOS to make it 0x1021 for the K20Xm, then do a clean install of the unmodified driver. That might help.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: johnjoe on November 03, 2013, 09:29:45 pm
I will test. How do I change the straps to 0x1021?
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: oguz286 on November 03, 2013, 10:32:49 pm
Interesting. So GTX780Ti is overdue to hit the shelves. Full shader count (like the K6000), but half the VRAM of the Titan at 3GB, and 1/3 cheaper than the Titan. I wonder if DP will be crippled.

I'm sure it will be crippled the same as the GTX780.

The BIOS requirements we are talking about here seem to be a new thing on GK110-based cards.

I'm pretty sure that's only because of the modification to Tesla cards (the TCC mode code is probably not available in GeForce BIOSes).

What is a K20XM? I cannot find any official reference to it. Do you have a download link for that BIOS?

The "M" versions are passively cooled cards. That's the only difference (maybe also lower clock speeds).

The K20Xm should be the same as the K20X in terms of performance; I didn't find any differences. I looked at the .inf file of the NVIDIA Tesla driver and found that no 1020 ID is listed, which means no K20X. I changed the predefined 1021 (K20Xm) to 1020 and was able to install the driver. But GPU-Z, for example, reports only 512 MB of VRAM, and the CUDA sim programs still do not recognize the card.

512MB VRAM? Wow, that is strange. I still don't think there is much difference between Tesla BIOSes. If you just get your device ID correct, then it should behave like that card, so you probably do not have to use the K20Xm BIOS.

I will test. How do I change the straps to 0x1021?

Changing the 0x00000080 value at 0x20 to 0x00040080 (with 0x1C still at 0xFFC3FF7F) should do it.
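A minimal sketch of applying that change to a saved BIOS dump, assuming the two dwords sit at absolute file offsets 0x1C and 0x20 and are stored little-endian (both assumptions; the offsets and values are simply the ones quoted above), with checksum fix-up left to the usual BIOS tweaker/nvflash tools:

# Minimal sketch: write the soft-strap dwords quoted above into a saved
# BIOS dump. Assumes the dwords live at absolute file offsets 0x1C/0x20
# and are little-endian; checksum fix-up is left to the flashing tools.
import struct

ROM_IN, ROM_OUT = "titan_dump.rom", "titan_1021.rom"   # placeholder names

PATCHES = {
    0x1C: 0xFFC3FF7F,  # left at the value quoted in the post
    0x20: 0x00040080,  # changed from 0x00000080
}

with open(ROM_IN, "rb") as f:
    rom = bytearray(f.read())

for offset, value in PATCHES.items():
    old = struct.unpack_from("<I", rom, offset)[0]
    struct.pack_into("<I", rom, offset, value)
    print(f"0x{offset:02X}: 0x{old:08X} -> 0x{value:08X}")

with open(ROM_OUT, "wb") as f:
    f.write(rom)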
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: mrkrad on November 03, 2013, 11:34:07 pm
OT: firepro S7000 S8000 S9000 S10000 V7800P V9000P [v7800p? cheap!] for esxi 5.5!!

Any potential with these models to have shared vgpu *more than 1 user* and cross-flash/hack AMD?

also special edition 780 will have 6gb and 12GB. which maybe gets rid of whole memory issue problems stored on rom
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on November 04, 2013, 12:45:25 am
I wouldn't take rumours for granted - don't expect a GTX780Ti with more than 3GB of RAM until you actually see it. There are also still a number of questions about the 780Ti that remain to be answered, not least of which is whether the DP performance is crippled unlike on the Titan.

BIOS cross-flashing on ATI cards was doable around the HD4xxx era, don't know what the deal with it is now. But if the VGA passthrough experience with ATI is anything to go by, I wouldn't begin the experiment with much optimism.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: oguz286 on November 04, 2013, 06:09:22 am
@johndoe: I just read that it was GPU-Z that reported 512MB  |O GPU-Z cannot be trusted. It says that my GTX780 has 48 cores (my primary card is a GT610 for testing purposes, and THAT card has 48 cores).

Use the nvidia-smi tool to read out the real values. And of course you should also run the deviceQuery CUDA sample and report its output.
After some more research, I figured out some things. The amount of VRAM is determined by a couple of things: the bus width, the number of RAM chips, and the size of each RAM chip. The GTX780 has 12 chips under the cooler and zero on the back. A Titan has 12 chips under the cooler and 12 on the back (please confirm this), and the Tesla BIOS is configured to select 24 chips, whereas I have 12. That's why it doesn't work for me, and that's why it should probably work for you.
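A quick sanity check of that chip-count reasoning, assuming each GDDR5 package is 2 Gbit (256 MB) and contributes 32 bits to the bus - both assumptions, since the post only gives chip counts:

# Sanity check of the chip-count reasoning above. Assumes 2 Gbit (256 MB)
# GDDR5 packages, each 32 bits wide - assumptions, not stated in the post.
CHIP_MB, CHIP_BITS = 256, 32

def vram_mb(chips):
    return chips * CHIP_MB

def bus_bits(chips_per_rank):
    return chips_per_rank * CHIP_BITS

print(vram_mb(12), "MB on a GTX780 (12 chips, front only)")   # 3072 MB
print(vram_mb(24), "MB on a Titan (12 front + 12 back)")      # 6144 MB
print(bus_bits(12), "bit bus with 12 chips per rank")         # 384-bit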
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on November 04, 2013, 06:33:49 am
@johndoe: I just read that it was GPU-Z that reported 512MB  |O GPU-Z cannot be trusted. It says that my GTX780 has 48 cores (my primary card is a GT610 for testing purposes, and THAT card has 48 cores).

I have to say I have never seen GPU-Z get it wrong before. There is a drop-down box where you select which card you want GPU-Z to report on.
Typically GPU-Z gets it badly wrong only when the driver is force-installed and doesn't initialize properly.

Use the nvidia-smi tool to read out the real values. And of course you should also run the deviceQuery CUDA sample and report its output.
After some more research, I figured out some things. The amount of VRAM is determined by a couple of things: the bus width, the number of RAM chips, and the size of each RAM chip. The GTX780 has 12 chips under the cooler and zero on the back. A Titan has 12 chips under the cooler and 12 on the back (please confirm this), and the Tesla BIOS is configured to select 24 chips, whereas I have 12. That's why it doesn't work for me, and that's why it should probably work for you.

Interesting observation. Does that help gain some insight into how the memory size is encoded in the BIOS?
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: johnjoe on November 04, 2013, 07:04:59 am
So, I have now used the BIOS of the K20c and of the Titan, both with the ID of the K20X, and modified the driver because all K20 cards are assigned to the same driver section. Maybe GPU-Z is just wrong. I added the ID of the K20X. The results of deviceQuery.exe with the Titan BIOS are in the picture; the results of CUDA-Z are also added. Nevertheless, the sim software does not recognize the card. The last thing I will try is to use the K20Xm ID with my K20Xm BIOS.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: oguz286 on November 04, 2013, 08:55:20 am
It will not work if you use a modified Titan BIOS, so you will have to use a Tesla BIOS. If your primary card is the GT9500 and the Titan is the secondary card, then it should work. I also don't see why you need to add the K20X device ID to the driver installer. Are you using an older driver? Just use the newest GeForce driver.

The only difference I can see is that you replaced the 25K resistor with a 40K resistor. I did not remove the 25k resistor, but I added a 33K resistor between SCLK and Vcc on the EEPROM. My card does work (except for memory issues), so that is the only thing that I can think of. Maybe your solution changed some other straps without you knowing it.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: johnjoe on November 04, 2013, 09:28:59 am
OK, I understand now. Maybe you are right and I changed additional straps. In your case, you only added a 33K resistor between Vcc and SCLK without changing anything else? And you used the normal GeForce driver instead of the Tesla driver from NVIDIA? I can only try the resistors tomorrow, but I will report back. What do you mean by secondary card - not connected to a screen? It is not connected.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: oguz286 on November 04, 2013, 10:21:23 am
Yes, the only change is the 33K resistor between SCLK and Vcc.

With secondary card I mean that your GT9500 is in the first PCI Express slot (primary) and your Titan is in the second PCI Express slot (secondary). That is crucial, because otherwise it will not work (at least it didn't work for me).
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on November 04, 2013, 10:53:24 am
GeForce driver running a Tesla? Really? That's not supposed to work because the GeForce driver .inf doesn't contain any Quadro/Tesla/Grid device IDs. Unless this is some kind of a unified pre-release driver.

I thought we already established there is no need to replace the nibble 3 resistor - you can just leave it off, and fix the potential slight flapping issue by setting the 5th bit in the soft-strap appropriately.

It is normal that with a Tesla BIOS the machine won't POST on that card - Teslas don't work as standard VGA adapters; they normally show up as 3D adapters. Having said that, on the 690 the primary GPU shows up as VGA and the secondary as 3D, and that POSTs on ports attached to either, so the chances are that the Tesla BIOS just disables all the video outputs (since Tesla cards don't have any). I imagine flashing a modified Grid BIOS to a GTX680/770/690 would also produce similar results.

Having said that, I seem to recall I found that the device type (i.e. VGA or 3D) is set by a bit in the secondary strap - but I don't know for sure where the secondary strap is in the UEFI setup (several possible candidates IIRC) or whether the one on the main BIOS payload is the effective one. But if you can find it, you could potentially make the Tesla BIOS work with normal VGA enabled, though that would be a bit of a bizarre use case (going headless after booting).
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: oguz286 on November 04, 2013, 11:10:12 am
GeForce driver running a Tesla? Really? That's not supposed to work because the GeForce driver .inf doesn't contain any Quadro/Tesla/Grid device IDs. Unless this is some kind of a unified pre-release driver.

I thought we already established there is no need to replace the nibble 3 resistor - you can just leave it off, and fix the potential slight flapping issue by setting the 5th bit in the soft-strap appropriately.

It is normal that with a Tesla BIOS the machine won't POST on that card - Teslas don't work as standard VGA adapters; they normally show up as 3D adapters. Having said that, on the 690 the primary GPU shows up as VGA and the secondary as 3D, and that POSTs on ports attached to either, so the chances are that the Tesla BIOS just disables all the video outputs (since Tesla cards don't have any). I imagine flashing a modified Grid BIOS to a GTX680/770/690 would also produce similar results.

Having said that, I seem to recall I found that the device type (i.e. VGA or 3D) is set by a bit in the secondary strap - but I don't know for sure where the secondary strap is in the UEFI setup (several possible candidates IIRC) or whether the one on the main BIOS payload is the effective one. But if you can find it, you could potentially make the Tesla BIOS work with normal VGA enabled, though that would be a bit of a bizarre use case (going headless after booting).

How do you want to change the device ID from 0x1004/0x1005 to 0x1020 when you can only go as high as 0x101F with just the soft straps?

Yes you can use the GeForce driver with Tesla cards. At work we have Teslas and we never installed Tesla drivers, so I don't know why it wouldn't work. We use the Tesla C2050 which has two DVI outputs, so when you use the Tesla with the standard WDDM driver, it is basically a slower GeForce card.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on November 04, 2013, 07:33:05 pm
How do you want to change the device ID from 0x1004/0x1005 to 0x1020 when you can only go as high as 0x101F with just the soft straps?

Perhaps I wasn't clear enough, apologies. I wasn't suggesting a purely soft mod. I was saying that if you remove the resistor and don't replace it, the value of the nibble goes to the top of its settable range, but there is often instability in the 5th bit. For example, on a GTX680, if you remove it, it will go to A or B, and often flip between the two when you reboot. You can compensate for this by soft-strapping the 5th bit high (to make it B) or low (to make it A). Since removing an 0402 is slightly easier than replacing it, it makes for a smaller, easier-to-apply hard mod, and the rest can be soft-modded.
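For what it's worth, forcing one bit of a 32-bit soft-strap value high or low is just a mask operation; which strap dword and which bit position correspond to "the 5th bit" on a given card is not something this sketch decides - bit index 5 below is only an example:

# Example only: force one bit of a 32-bit soft-strap value high or low.
# Which strap dword and which bit index apply to a given card is a
# separate question; bit 5 here is purely illustrative.
def set_bit(value, bit):
    return (value | (1 << bit)) & 0xFFFFFFFF

def clear_bit(value, bit):
    return value & ~(1 << bit) & 0xFFFFFFFF

strap = 0x00000080
print(hex(set_bit(strap, 5)))    # 0xa0 - bit 5 forced high
print(hex(clear_bit(strap, 5)))  # 0x80 - bit 5 forced low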

Yes you can use the GeForce driver with Tesla cards. At work we have Teslas and we never installed Tesla drivers, so I don't know why it wouldn't work. We use the Tesla C2050 which has two DVI outputs, so when you use the Tesla with the standard WDDM driver, it is basically a slower GeForce card.

That surprises me - I didn't think there was that overlap in the Windows drivers. I hadn't expected the device IDs for any Tesla/Quadro/Grid cards to be listed in the .inf. Certainly when you go to the Nvidia site and select that you want a driver for the Tesla, it will point you at the Tesla/Quadro/Grid download rather than the GeForce one. But hey, if it works for you... :)
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: oguz286 on November 04, 2013, 09:07:35 pm
How do you want to change the device ID from 0x1004/0x1005 to 0x1020 when you can only go as high as 0x101F with just the soft straps?

Perhaps I wasn't clear enough, apologies. I wasn't suggesting a purely soft mod. I was saying that if you remove the resistor and don't replace it, the value of the nibble goes to the top of its settable range, but there is often instability in the 5th bit. For example, on a GTX680, if you remove it, it will go to A or B, and often flip between the two when you reboot. You can compensate for this by soft-strapping the 5th bit high (to make it B) or low (to make it A). Since removing an 0402 is slightly easier than replacing it, it makes for a smaller, easier-to-apply hard mod, and the rest can be soft-modded.

Ah, that's what you meant. Now I understand. Still, I don't know if that would work in this case. It's true that removing a 0402 resistor is easier than soldering it (good luck with that :P), but my solution is non-destructive. I can always remove the 33K resistor (1206 btw) and the card will be a GTX780 again, whereas adding the 25K resistor back on the board would be harder :)
There is also another difference. The 25K resistor is a pull-down resistor, and I added a pull-up resistor. The lines that go to the GPU most likely have an ADC that measures the voltage and based on that it sets certain straps. I had also removed the 28.5K resistor and let me tell you, that one had to be precise! It will not boot with a value of 28K or 29K :(

Yes you can use the GeForce driver with Tesla cards. At work we have Teslas and we never installed Tesla drivers, so I don't know why it wouldn't work. We use the Tesla C2050 which has two DVI outputs, so when you use the Tesla with the standard WDDM driver, it is basically a slower GeForce card.

That surprises me - I didn't think there was that overlap in the Windows drivers. I hadn't expected the device IDs for any Tesla/Quadro/Grid cards to be listed in the .inf. Certainly when you go to the Nvidia site and select that you want a driver for the Tesla, it will point you at the Tesla/Quadro/Grid download rather than the GeForce one. But hey, if it works for you... :)

I guess the only difference between the drivers is that the Tesla drivers do not have the device IDs of the GeForce cards in the inf file. Haven't checked it though.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: mrkrad on November 04, 2013, 11:21:40 pm
That's what I was saying: the Tesla K10 is the Quadro Grid K2 - features like software ECC and VGX support on both. Just strap them and go.

Finding the Kepler card that is closest to the K10/Grid K2 should not be that hard, since they all use the same core chip with a tweak to the soft straps.

That crazy Quadroplex Quadro 6000 was renamed to be the #2 card. Strapped it back to the #1 position and all was good.

Now that OptiX 3 supports regular NVIDIA cards (Adobe Premiere on OS X), the need for tweaked GeForce cards may change a lot soon. The bet I'm going for is finding the cheaper Quadro/Tesla to turn into the better Quadro/Tesla. The GK104 has many, many choices to work with; GK107 too.

I'm hunting VGX since it is better than API intercept. I think I agree with Gordan: VGX is more like "VT-d" for video cards - alone it won't do squat, but it will work with NVIDIA's vGPU API to accelerate much faster than pure binary translation mode (API intercept).
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on November 04, 2013, 11:48:57 pm
Ah, that's what you meant. Now I understand. Still, I don't know if that would work in this case. It's true that removing a 0402 resistor is easier than soldering it (good luck with that :P), but my solution is non-destructive. I can always remove the 33K resistor (1206 btw) and the card will be a GTX780 again, whereas adding the 25K resistor back on the board would be harder :)

Can you post a photo of your mod showing exactly where you put the 33K resistor?

And why do you think removing the 3rd nibble resistor wouldn't work in this case? It works on the GTX680 and GTX690 I have. You mean it might upset the voltage somewhere else and cause unrelated straps to end up with wrong values? Surely adding a pull-up resistor in addition to the pull-down would run the same risk of doing that.

Note: I'm not disputing that dealing with a 1206 is far preferable to dealing with an 0402, especially without specialist tools. :)

There is also another difference. The 25K resistor is a pull-down resistor, and I added a pull-up resistor. The lines that go to the GPU most likely have an ADC that measures the voltage and based on that it sets certain straps. I had also removed the 28.5K resistor and let me tell you, that one had to be precise! It will not boot with a value of 28K or 29K :(

Which 28.5K resistor? What was it for?

That's what I was saying: the Tesla K10 is the Quadro Grid K2 - features like software ECC and VGX support on both. Just strap them and go.

...

I'm hunting VGX since it is better than API intercept. I think I agree with Gordan: VGX is more like "VT-d" for video cards - alone it won't do squat, but it will work with NVIDIA's vGPU API to accelerate much faster than pure binary translation mode (API intercept).

I'm not sure I follow what this would be for.
1) Why do you need ECC for graphics rendering and video stream encoding?
2) I'm pretty sure VGX requires no special features at all. vDGA is just straight PCI passthrough a-la Xen. vSGA just makes the GPU act as a co-processor. I'm going to try putting together an ESXi test bed machine with a spare motherboard I have which I _think_ has a non-broken IOMMU with the required features for ESXi PCI passthrough, and try and get my Gridified 690 working on it. If that works, it would prove you need no special features on the GPU itself to make it work.

It will also be good to hear back from foxdie when he has had a chance to test the Quadrified GTX470 with vSGA. If that works, the chances are that other modified cards will, too.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: oguz286 on November 05, 2013, 12:10:16 am
Ah, that's what you meant. Now I understand. Still, I don't know if that would work in this case. It's true that removing a 0402 resistor is easier than soldering it (good luck with that :P), but my solution is non-destructive. I can always remove the 33K resistor (1206 btw) and the card will be a GTX780 again, whereas adding the 25K resistor back on the board would be harder :)

Can you post a photo of your mod showing exactly where you put the 33K resistor?

And why do you think removing the 3rd nibble resistor wouldn't work in this case? It works on the GTX680 and GTX690 I have. You mean it might upset the voltage somewhere else and cause unrelated straps to end up with wrong values? Surely adding a pull-up resistor in addition to the pull-down would run the same risk of doing that.

Note: I'm not disputing that dealing with a 1206 is far preferable to dealing with an 0402, especially without specialist tools. :)

I have the card in my computer now with the cooler on it. When and if I take it apart again, I'll take a photo. But it's really just two very thin wires soldered on the Vcc and the SCLK pins of the EEPROM going to a 1206 33K resistor.

The card has a pull-down resistor by default, and I added a pull-up resistor. If the other end of the SCLK line has an ADC, then having no resistor on it at all will leave a floating pin, which is not something you want with analog electronics. A pull-down, a pull-up, or both will make everything more stable electrically (even though you override that value with the soft straps).
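As a rough estimate of what the added pull-up does, treating the strap pin as a simple resistive divider into a high-impedance ADC input and assuming a 3.3 V rail (both the rail voltage and the divider model are assumptions):

# Rough estimate of the strap-pin voltage with the stock 25K pull-down
# plus the added 33K pull-up. The 3.3 V rail and the simple-divider model
# are assumptions; the GPU's actual sensing circuit is unknown.
VCC = 3.3
R_UP, R_DOWN = 33e3, 25e3

v_pin = VCC * R_DOWN / (R_DOWN + R_UP)
print(f"strap pin sits at ~{v_pin:.2f} V instead of ~0 V with the pull-down alone")
# ~1.42 V, which the GPU presumably decodes as a different nibble value.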

There is also another difference. The 25K resistor is a pull-down resistor, and I added a pull-up resistor. The lines that go to the GPU most likely have an ADC that measures the voltage and based on that it sets certain straps. I had also removed the 28.5K resistor and let me tell you, that one had to be precise! It will not boot with a value of 28K or 29K :(

Which 28.5K resistor? What was it for?
It is a pull-down resistor on the SO pin of the EEPROM. I measured around 2.3M of resistance when I removed the 28.5K resistor, which is actually a 30K resistor when measured outside the circuit. Because it needs to be exactly that value and I couldn't put the 0402 resistor back, I used a high-precision pot. It is on the schematic on my site and in the photo I posted a couple of pages back.

what I was saying. the K10 tesla is the quadro grid K2 - features like software-ecc and vgx support on both. Just strap them and go.

...

I'm hunting VGX since it is better than API-intercept. I think I agree with Gordan. VGX is more like "VT-d" for video cards, alone it won't do squat but it will work with nvidia's VGPU API to accelerate much faster than pure binary translation mode (API intercept)

I'm not sure I follow what this would be for.
1) Why do you need ECC for graphics rendering and video stream encoding?
2) I'm pretty sure VGX requires no special features at all. vDGA is just straight PCI passthrough a-la Xen. vSGA just makes the GPU act as a co-processor. I'm going to try putting together an ESXi test bed machine with a spare motherboard I have which I _think_ has a non-broken IOMMU with the required features for ESXi PCI passthrough, and try and get my Gridified 690 working on it. If that works, it would prove you need no special features on the GPU itself to make it work.

It will also be good to hear back from foxdie when he has had a chance to test the Quadrified GTX470 with vSGA. If that works, the chances are that other modified cards will, too.

Not to be a jerk, but wouldn't it be wiser if the conversation about virtualized GPUs took place in a separate topic? I'm interested in it as well, but the last few pages of this topic have been nothing but questions about whether card X can be modified, or about virtualization. You know, just to make it less cluttered. :)
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on November 05, 2013, 03:30:49 am
I have the card in my computer now with the cooler on it. When and if I take it apart again, I'll take a photo. But it's really just two very thin wires soldered on the Vcc and the SCLK pins of the EEPROM going to a 1206 33K resistor.

Oh, I see. So just short the EEPROM Vcc to SCLK with a 33KΩ resistor?
Won't this potentially upset other things?

The card has a pull-down resistor by default, and I added a pull-up resistor. If the other end of the SCLK line has an ADC, then having no resistor on it at all will leave a floating pin, which is not something you want with analog electronics. A pull-down, a pull-up, or both will make everything more stable electrically (even though you override that value with the soft straps).

Indeed, that sounds like a good idea. I might check whether the same approach works on the GTX690, if I manage to steady my hands enough to solder some 24.8K resistors back on.

Not to be a jerk, but wouldn't it be wiser if the conversation about virtualized GPUs took place in a separate topic? I'm interested in it as well, but the last few pages of this topic have been nothing but questions about whether card X can be modified, or about virtualization. You know, just to make it less cluttered. :)

Agreed - let's keep this purely down to the modding part. There are at least two other threads here for what to use the modded cards for.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: oguz286 on November 06, 2013, 11:21:56 pm
I have the card in my computer now with the cooler on it. When and if I take it apart again, I'll take a photo. But it's really just two very thin wires soldered on the Vcc and the SCLK pins of the EEPROM going to a 1206 33K resistor.

Oh, I see. So just short the EEPROM Vcc to SCLK with a 33KΩ resistor?
Won't this potentially upset other things?

Yes, just a resistor between Vcc and SCLK. Seeing as I'm the only one who has a working Tesla (except for the memory size issue), I'm guessing that my mod does not change anything else. I would like to try it with a Titan, but that thing is expensive. :( That's why I'm waiting for johndoe to apply my mod and see if it works then. Btw, did you get a Titan?
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on November 07, 2013, 01:21:26 am
I have the card in my computer now with the cooler on it. When and if I take it apart again, I'll take a photo. But it's really just two very thin wires soldered on the Vcc and the SCLK pins of the EEPROM going to a 1206 33K resistor.

Oh, I see. So just short the EEPROM Vcc to SCLK with a 33KΩ resistor?
Won't this potentially upset other things?

Yes, just a resistor between Vcc and SCLK. Seeing as I'm the only one who has a working Tesla (except for the memory size issue), I'm guessing that my mod does not change anything else. I would like to try it with a Titan, but that thing is expensive. :( That's why I'm waiting for johndoe to apply my mod and see if it works then. Btw, did you get a Titan?

Yes, the Titan arrived last night. I was going to just remove the 3rd nibble resistor, but I like your approach of adding a resistor between Vcc and SCLK better. Unfortunately, I don't have any 1206 resistors lying around, so I had to order some 33K ones as per your findings. I had some non-SMD ones, but I couldn't use them as they wouldn't fit under the heatsink. The 1206s should arrive tomorrow. I will report back when I have some news.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: oguz286 on November 07, 2013, 05:01:17 am
Nice! If it works out I'm getting a Titan too :) Then I can concentrate on 'Teslafying' the GTX780 completely and running CUDA on it without getting unknown errors when calling cudaMemcpy.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on November 07, 2013, 11:13:33 am
Nice! If it works out I'm getting a Titan too :) Then I can concentrate on 'Teslafying' the GTX780 completely and running CUDA on it without getting unknown errors when calling cudaMemcpy.

I must say I am really curious whether you will be able to figure out where and how the memory configuration is stored. If you look back through this thread at page 38, you will find this post:
https://www.eevblog.com/forum/chat/hacking-nvidia-cards-into-their-professional-counterparts/msg292021/#msg292021
containing the hex diff between the 1.5GB and 3GB variants of a GTX580 BIOS of the same version number. Unless I made a huge mistake somewhere (or the BIOSes are mislabeled on TPU; I no longer have a GTX580 I could flash with those BIOSes to test), the memory difference should be encoded somewhere in those 10 lines.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: oguz286 on November 07, 2013, 07:46:30 pm
Nice! If it works out I'm getting a Titan too :) Then I can concentrate on 'Teslafying' the GTX780 completely and running CUDA on it without getting unknown errors when calling cudaMemcpy.

I must say I am really curious whether you will be able to figure out where and how the memory configuration is stored. If you look back through this thread at page 38, you will find this post:
https://www.eevblog.com/forum/chat/hacking-nvidia-cards-into-their-professional-counterparts/msg292021/#msg292021
containing the hex diff between the 1.5GB and 3GB variants of a GTX580 BIOS of the same version number. Unless I made a huge mistake somewhere (or the BIOSes are mislabeled on TPU; I no longer have a GTX580 I could flash with those BIOSes to test), the memory difference should be encoded somewhere in those 10 lines.

Yeah, I have already checked the diffs of many BIOSes, but the actual size of the memory is not stored literally in the BIOS. The memory type, the configuration, the clocks, etc. are stored as a table in the BIOS, and from these variables you can calculate what the memory size is. Another problem is that Tesla BIOSes are not exactly the same as GeForce BIOSes, so it's pretty difficult to find the bits that are responsible.
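For anyone wanting to repeat that comparison, a minimal byte-level diff of two same-sized dumps is enough to list the candidate offsets (the file names below are placeholders):

# Minimal byte-level diff of two BIOS dumps, the kind of comparison
# described above. File names are placeholders.
def diff_roms(path_a, path_b):
    with open(path_a, "rb") as fa, open(path_b, "rb") as fb:
        a, b = fa.read(), fb.read()
    if len(a) != len(b):
        print(f"length differs: {len(a)} vs {len(b)} bytes")
    for off in range(min(len(a), len(b))):
        if a[off] != b[off]:
            print(f"0x{off:05X}: {a[off]:02X} -> {b[off]:02X}")

diff_roms("gtx580_1536mb.rom", "gtx580_3072mb.rom")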
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: cloudscapes on November 08, 2013, 03:12:54 am
Back in the GeForce 2 days, you could turn certain models into a Quadro 2, though in those cases it wasn't just a straight performance unlock. It was a tradeoff. Something like far better CAD and wireframe performance, but games weren't so well optimized anymore. It wasn't something a gamer would do to get a few extra FPS.

I have no idea if this tradeoff is still the case with new cards. It's been quite a while.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: KamranB on November 08, 2013, 03:14:28 am
Hello,

We faced the exact problem that you have mentioned in the forum and changed the resistors in order to get a Quadro graphics card, but it did not work for us. By the way, I see that there are small differences between our board and the image that you have shared in the forum.
1) In the upper column that you showed, there is a 25K resistor that should be removed and a 20K resistor that should be mounted below it. OK, we did that. But on our board the second column from the right is different: there is a resistor at the top of this row which is not on your board, and conversely there is a resistor below it on your board which is not present on our board.

2) We plugged the board in and there was one long beep and three short beeps on Windows startup, and it did not work.

I would be really thankful if you could help us through this problem.
Title: Re: [MOVED] Hacking NVidia Cards into their Professional Counterparts
Post by: gordan on November 08, 2013, 04:04:52 am
Nice! If it works out I'm getting a Titan too :) Then I can concentrate on 'Teslafying' the GTX780 completely and running CUDA on it without getting unknown errors when calling cudaMemcpy.

I must say I am really curious whether you will be able to figure out where and how the memory configuration is stored. If you look back through this thread at page 38, you will find this post:
https://www.eevblog.com/forum/chat/hacking-nvidia-cards-into-their-professional-counterparts/msg292021/#msg292021 (https://www.eevblog.com/forum/chat/hacking-nvidia-cards-into-their-professional-count