UPDATE: The reference voltage given in the manual appears to be wrong (?). It says that 3.3 V = 255, but a little test I did suggests otherwise. I measured PS1_PRT, PS2_PRT and PS3_PRT. The values I measured with my scope, and the values shown on the display, are as follows:
PSx_PRT   DISPLAY [A/D]   DISPLAY (using 3.3V = 255) [V]   MEASURED [V]
PS1       030             0.388 V                          0.639 V
PS2       041             0.530 V                          0.876 V
PS3       094             1.216 V                          2.00 V
So, solving for the volts-per-count value x in each case:
30 * x = 0.639 => x = 0.0213000
41 * x = 0.876 => x = 0.0213659
94 * x = 2 => x = 0.0212766
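Here is the same arithmetic as a quick Python sanity check, purely a sketch using the counts and scope readings from the table above:

```python
# Back-calculate volts-per-count and the implied full-scale (reference) voltage
# from the scope measurements vs. the displayed A/D counts.
readings = {
    "PS1": (30, 0.639),  # (displayed A/D count, scope measurement in volts)
    "PS2": (41, 0.876),
    "PS3": (94, 2.0),
}

for name, (count, volts) in readings.items():
    v_per_count = volts / count
    implied_ref = v_per_count * 255  # what a count of 255 would correspond to
    print(f"{name}: {v_per_count:.7f} V/count, implied full-scale ~ {implied_ref:.2f} V")
```

All three channels land around 5.43 to 5.45 V for the implied full-scale voltage.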
It all lines up perfectly. Is it possible that the A/D conversion is wrong?
Guys, am I going insane or does this make sense? Until now I used the conversion 255 = 3.3 V, i.e. 1 count ≈ 0.013 V. Is this wrong?
Does this mean that the CPU is getting (255 * 0.0213 =) 5.431 V instead of 3.3 V as the reference voltage? If that were the case, the PS1_PRT and PS2_PRT voltages would still be within bounds. Going from our 255 = 5.43 V system back to the 255 = 3.3 V system, we would get (see the sketch after the table):
PSx_PRT   3.3V = 255 [A/D]   BOUNDS
PS1       049                12 to 100
PS2       068                34 to 106
PS3       155                132 to 168
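For anyone who wants to check the rescaling, here is a quick Python sketch. It uses only my measured voltages and the bounds from the manual, nothing else is assumed:

```python
# Convert the scope-measured voltages into the counts a correct 3.3 V = 255
# system should display, then check them against the bounds from the manual.
NOMINAL_VREF = 3.3
rails = {
    "PS1": (0.639, (12, 100)),  # (measured volts, (lower, upper) A/D bounds)
    "PS2": (0.876, (34, 106)),
    "PS3": (2.0,   (132, 168)),
}

for name, (volts, (lo, hi)) in rails.items():
    count = round(volts / NOMINAL_VREF * 255)
    status = "within bounds" if lo <= count <= hi else "OUT OF BOUNDS"
    print(f"{name}: {count:03d} (bounds {lo} to {hi}) -> {status}")
```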
I really hope all this is making sense. Please reply with your thoughts.
UPDATE: The A/D conversion reference voltage is input at pin 2 of IC21. It should be 3.3 V, supplied by the +3.3M rail. Surely it's just a coincidence that +3.3M is the very rail I measured 5.44 V on instead of 3.3 V (almost exactly the reference voltage that lines up with the A/D values shown on screen).

UPDATE: Guys, I think this is it. Everything lines up. The voltage at pin 2 is 5.2 V. Now I need to find out why the +3.3M rail is at 5.43 V and get it back to 3.3 V.