When looking at kits, I see some say something like 12VDC @ 15mA. So if I have a power supply that outputs 12VDC but at 1A, is it no good? And how would I reduce the current? If I did, wouldn't I also be reducing the voltage?
A kit-built power supply I was looking at, for example, can output from +/- 1.35 to +/- 15 VDC @ 1A. So for a circuit that I see listed as taking 12VDC @ 15mA, this supply would give too much current, right? Or am I looking at this the wrong way?
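For context, here is the back-of-envelope arithmetic I'm working from (a rough sketch, assuming the 15 mA figure is simply the current the kit draws when fed 12 V):

```python
# My own rough numbers -- assuming 15 mA is the kit's draw at 12 V.
voltage = 12.0         # V, rated supply voltage
load_current = 0.015   # A, current the kit is specified to draw
supply_max = 1.0       # A, maximum current the supply is rated for

load_resistance = voltage / load_current  # effective load, ~800 ohms
load_power = voltage * load_current       # ~0.18 W dissipated in the kit

print(f"effective load: {load_resistance:.0f} ohms")
print(f"power drawn:    {load_power:.2f} W")
print(f"draw within supply rating: {load_current <= supply_max}")
```

So the part I'm unsure about is whether the 1 A on the supply is a maximum it *can* deliver, or an amount it *forces* into the circuit.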