current: 9.5-10mA average (multimeter was on the 20mA setting)
For both tests, I had the panel's terminals connected straight to the COM and V-ohm-mA terminals of my meter (the V-ohm-mA terminal is marked 200mA, which I assume is the maximum for that input before you have to switch over to the A terminal instead).
The mA reading is really a bummer here. From the specs I would have guessed 150mA, not a tenth of that! Something must be wrong. Am I doing the math wrong? My panel sits behind the glass of my car's sunroof, so perhaps the tint is to blame?
Don't connect the leads of your multimeter across a voltage source while in current mode! In current mode the meter acts like a low-value resistor and will draw as much current from the source as it can, potentially blowing the fuse in your meter (hopefully it has one) or destroying the source. Even if nothing is damaged, you're not getting a meaningful reading of how much current the device can put out, because the source voltage will plummet and the output power will fall with it.
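To put rough numbers on it (the shunt value here is illustrative, not your particular meter's spec): on a mA range the meter's shunt might be on the order of an ohm or two, so across a stiff 18V source it would try to draw something like 18V / 2 ohms = 9A, far beyond a 200mA fuse. A small solar panel can't actually source that, so its voltage collapses instead, which is another reason the number on the display doesn't tell you what you want to know.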
The current mode on your meter is for connecting in series with a load, to measure how much current the load is drawing, not for determining (at least not directly) how much current the source can safely put out. Except in the case of constant-current sources like a bench power supply in current limit, the load determines how much current flows, not the source. When you connect your meter straight across the supply, you're measuring how much current the shunt resistor in your multimeter draws, which is meaningless and likely to be high enough to blow the fuse.
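A quick example of "the load determines the current" (values picked just for illustration): a 5V source driving a 100 ohm resistor delivers I = V/R = 5V / 100 ohms = 50mA, whether the source is rated for 100mA or 10A. So to find your panel's usable output, put the meter in series with a realistic load and watch how much current flows before the panel's voltage starts to sag.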
My understanding is that a voltage regulator controls the output voltage even though the input voltage can vary over a wide range. But does it act like a transformer? For example, say my input is 15-20V and I use a voltage regulator to bring it down to 10V, so I have a guaranteed 10V source at varying current. Now, a transformer that takes 10V down to 5V doubles the current. Does the voltage regulator automatically boost the current depending on the "drop" it is performing?
First off, I'm going to assume you're talking about switching voltage regulators, not linear ones, since you linked to two documents that discuss the switching type. Switching regulators are ideal here because they usually achieve 80% or better efficiency across a wide input voltage range. The concept is roughly the same as a transformer in that the power out equals the power in (minus losses as heat). This gives the "fixed current at higher voltage -> higher current at fixed voltage" characteristic you're describing in transformers, and would indeed mean that you could draw more current in full sun than in the shade. But keep in mind that switching converters, solar panels, and USB are all DC, while transformers work on AC -- so you can't use a transformer to step down power from a solar panel. That's why DC-DC converters exist.
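To make that concrete (the panel figures and the 80% efficiency are assumed for illustration): if the panel delivers 18V at 150mA, that's 2.7W in. A buck converter at 80% efficiency gives about 2.16W out, which at a regulated 5V is roughly 430mA. The same panel making only 12V at 150mA (1.8W) would yield about 290mA at 5V -- more input power really does become more available output current.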
There are actually switching regulators built around transformers, but they are more complex, use a special type of transformer running at a much higher frequency than a wall wart (around 100kHz and up), and are only needed when you want to isolate the output from the input. These are the kind you would find in a cell phone charger or a computer power supply.
If you want something easy, start with a linear regulator (like the 7805 mentioned earlier). It will be inefficient and won't yield more current at higher input voltage, but it's cheap and really hard to screw up. A simple way to think about how much current you'll get: since linear regulators work by burning off excess voltage, the current in and the current out are almost exactly the same. So if you have 200mA available from the panel, you'll have 200mA available at the 5V output terminal, regardless of the input voltage.

The problem is that all that burned-off voltage produces a proportionate amount of heat, so the regulator might shut down if you draw enough current to make it overheat. They're all thermally protected, though, so it won't explode. It's easy to calculate how much power it's burning: subtract volts out from volts in and multiply by the current. The datasheet for the regulator will likely list a figure for maximum power dissipation.

Also, what I mean by "available current" here is how much current you can draw from the panel before its voltage falls low enough to become unacceptable. The load determines the current, but obviously you can't draw 1000 amps from a $10 solar panel, and with this kind of source that means the voltage drops when you draw too much. The regulator will keep a rock-solid 5V output until the input falls to about 6-7V (the difference is called the "dropout voltage" -- check the datasheet).
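Worked example (input voltage and load current assumed just to show the math): dropping a 12V panel input to 5V at 150mA makes the regulator burn (12 - 5) x 0.15 = 1.05W as heat, more than the 0.75W actually delivered to the load. A TO-220 part like the 7805 with no heatsink can typically only dissipate on the order of a watt or two before thermal shutdown becomes a concern, so check the thermal numbers in the datasheet.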