To measure voltage, it's enough to put one probe on the positive terminal and the other probe on the negative terminal.
To measure the current going into a device, the multimeter must be placed in-circuit: the current has to flow through the multimeter and then into the device.
So, for example, one probe goes to the positive of your power supply and the other probe goes to your device; that second probe effectively becomes the new "positive" of your power supply, because the current now passes through the meter on its way to the device.
Depending on what range the multimeter is set to, there will be some amount of voltage drop inside the multimeter, because the actual measurement is done with a resistor (a shunt), and the basic formula (Ohm's law) applies: Voltage = Current x Resistance.
So, for example, on the 10A range of your multimeter a 0.1 ohm resistor may be used, but on the 1A range the resistor may be 1 ohm.
With this example, if the multimeter is on the 1A range with its 1 ohm resistor and you measure the current from a 5v power supply, then at 100 mA the device only sees 5v - 0.1A x 1 ohm = 4.9v, and at 500 mA it only sees 5v - 0.5A x 1 ohm = 4.5v.
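If you want to play with the numbers, here's a small sketch in Python. The shunt values (0.1 ohm and 1 ohm) and the currents are just the example figures from above, not specs of any particular meter:

```python
# Rough sketch of the voltage "lost" inside the multimeter on a current range.
# The shunt values (0.1 ohm for the 10A range, 1 ohm for the 1A range) are
# example numbers only; check your meter's manual for the real values.

SUPPLY_V = 5.0                                  # power supply voltage (volts)
SHUNT_OHMS = {"10A range": 0.1, "1A range": 1.0}

for range_name, r_shunt in SHUNT_OHMS.items():
    for current_a in (0.1, 0.5):                # 100 mA and 500 mA
        drop = current_a * r_shunt              # Ohm's law: V = I x R
        at_device = SUPPLY_V - drop
        print(f"{range_name}: {current_a*1000:.0f} mA -> "
              f"drop {drop:.2f} V, device sees {at_device:.2f} V")
```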
Power supplies can provide a limited amount of power, where power means the product of voltage and current.
In your particular case, your power supply may only guarantee 3v at 500 mA, which means it can give you 3v x 0.5A = 1.5 watts of power. There may be some "reserve", some tolerance, so in reality it may give 550-600 mA.
If a device needs more, the power supply may push more current but at a lower voltage. So, for example, you may measure 2A of current, but the voltage may only be 0.75v, because the power supply is still only capable of 1.5 watts of power (0.75v x 2A = 1.5 watts).
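As a rough illustration (assuming an idealized supply that simply holds a fixed power limit, which real supplies only approximate), you can see how the voltage collapses as the current goes up:

```python
# Idealized "constant power" behaviour: voltage = power limit / current.
# Real supplies don't behave this cleanly; this only shows the trade-off.

MAX_POWER_W = 1.5        # 3 V x 0.5 A from the example above

for current_a in (0.5, 1.0, 2.0):
    voltage = MAX_POWER_W / current_a
    print(f"{current_a:.1f} A -> at most {voltage:.2f} V "
          f"({voltage:.2f} V x {current_a:.1f} A = {voltage*current_a:.2f} W)")
```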
A fancier power supply may turn itself off if a device takes too much current or if the voltage goes below some threshold, to protect devices connected to it.
So when you placed the multimeter directly across the positive and negative wires of your power supply, you more or less created a short circuit between the wires... in reality, the very tiny resistor the multimeter uses to measure current, plus the resistance of the probes, were the only load connected.
So the power supply pushed as much current as it could through that "load" between the positive and negative wires, which is probably where your 1.92A reading came from... but you don't know at what voltage that happened; it could have been 1v.
You also don't know whether the power supply is actually capable of sustaining that "effort" for long periods of time; components inside it could be damaged after a while.
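To get a feel for why that reading isn't very meaningful, here's a back-of-the-envelope sketch. All the resistance and voltage numbers below are guesses for illustration, not measured values from your setup:

```python
# Guessing at the "short circuit" you created: the only resistance is the
# meter's shunt plus the probe leads.  These numbers are illustrative only.

SHUNT_OHMS = 0.1         # assumed 10A-range shunt
PROBE_OHMS = 0.4         # assumed lead resistance, a few tenths of an ohm
TOTAL_OHMS = SHUNT_OHMS + PROBE_OHMS

# If the supply sagged to roughly 1 V while being "shorted" like this,
# the current through the meter would be about:
sagged_voltage = 1.0
print(f"~{sagged_voltage / TOTAL_OHMS:.1f} A")   # ~2 A, in the ballpark of your 1.92A reading
```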
Another thing you have to understand: a lot of power supplies regulate their voltage properly only when something is connected to them and that something consumes at least a tiny bit of power.
Some power supplies will output a higher voltage with nothing connected to them, but as soon as something is connected, the voltage gets closer to the actual specified voltage.
For example, cheap wall-wart adapters that use old-style transformers may be rated to output 9v, but they'll output 10-11v with nothing connected to them. If you connect something as simple as an LED (with its current-limiting resistor) that consumes 10-15mA, you may notice the voltage get closer to 9v.
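If you want to try that LED test yourself, the resistor value comes straight from Ohm's law. The 2v LED forward voltage and 15mA target below are typical assumed values for a plain red LED, not numbers from your setup:

```python
# Picking a current-limiting resistor for a small LED used as a
# "minimum load" on a 9 V adapter.

SUPPLY_V = 9.0
LED_FORWARD_V = 2.0       # assumption: ~2 V forward drop for a red LED
TARGET_CURRENT_A = 0.015  # 15 mA

r = (SUPPLY_V - LED_FORWARD_V) / TARGET_CURRENT_A   # Ohm's law: R = V / I
print(f"Use roughly a {r:.0f} ohm resistor (470 ohm is the nearest standard value)")
```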
Other power supplies use cheap techniques to regulate the output voltage, which can cause the power supply to output a higher voltage at very low currents. For example, you may see a phone charger output 5.5v with nothing connected to it, but as soon as something consumes power, the voltage drops closer to 5v.
There are various levels of quality when it comes to power supplies.