You will see people using 'constant current' when they really mean 'current limiting'.
I'm not sure I understand the distinction you're trying to point out. What is the difference between constant current and current limiting, in your view? After all, most bench supplies do switch to constant-current mode when the load impedance drops below the set voltage divided by the set current.
Current limit = giving you what you need, up to the specified current.
Constant current = giving you the specified current all the time.
Constant voltage = giving you the specified voltage; the current is whatever the device takes.
Of course, the qualifier "within the power supply's capability" applies to all of the above. At a constant voltage of, say, 10V, the supply will in theory hold 10V regardless of what device is connected. But if your device draws more current than the supply is capable of delivering, the system breaks down and it can no longer give you the voltage you specified.
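To make the definitions concrete, here is a minimal Python sketch of how a bench supply "chooses" between CV and CC for a purely resistive load. The setpoints and load values are made-up examples, not any particular supply's behavior.

```python
# A minimal sketch (not any particular supply's firmware) of how a bench
# supply picks CV vs CC for a purely resistive load.
# V_SET, I_SET and the load resistances are made-up example values.

V_SET = 10.0   # volts, the voltage you dialed in
I_SET = 1.0    # amps, the current limit you dialed in

def supply_output(load_ohms):
    """Return (mode, volts, amps) for a purely resistive load."""
    # If the load at the full set voltage would draw no more than the limit,
    # the supply regulates the voltage (CV mode).
    if V_SET / load_ohms <= I_SET:
        return "CV", V_SET, V_SET / load_ohms
    # Otherwise it lowers its voltage so the current sits at the limit (CC mode).
    return "CC", I_SET * load_ohms, I_SET

for r in (100.0, 10.0, 5.0):   # ohms
    mode, v, i = supply_output(r)
    print(f"{r:6.1f} ohm load -> {mode}: {v:.2f} V at {i:.2f} A")
```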
LiIon charging is a good example since it uses both CC (constant current) and CV (constant voltage). Say the charger's current limit is 500mA and the battery is the typical "charge to 4.2V" type, so the charger is voltage-limited to 4.2V.
When the battery is low, the charger is in CC mode: it limits the current to 500mA, and it does that by adjusting the voltage. If the current is more than 500mA, it lowers the voltage; if it is less than 500mA, it raises the voltage (within limits). So it does whatever it needs to keep the supply delivering 500mA. The battery could take more, but the supply will not give it more. This is "constant current" (CC) mode at 500mA: the supply keeps changing the voltage to hold the current constant at 500mA.
When the battery is near full, full enough that it can no longer take 500mA at 4.2V, the charger is in CV mode. The voltage is held constant at 4.2V (CV), and the battery draws whatever it draws (which must be below 500mA, or it could not have reached this mode). When the current gets very low ("low" as defined by the battery type and size, say below 100mA or below 50mA), the battery is full.
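Putting the two phases together, here is a toy Python simulation of the CC-to-CV hand-off and the taper-current cutoff. The cell model (an open-circuit voltage plus internal resistance) and every number below are invented purely for illustration, not taken from any datasheet.

```python
# A toy CC/CV charge simulation, only to illustrate the mode hand-off.
# The cell model and all numbers are assumptions for illustration.

I_CC = 0.5        # A, constant-current setpoint
V_CV = 4.2        # V, constant-voltage setpoint
I_TERM = 0.05     # A, taper current at which we call the cell "full"
R_INT = 0.15      # ohm, assumed internal resistance
CAPACITY_AH = 1.0 # Ah, assumed cell capacity

def ocv(soc):
    """Made-up open-circuit voltage as a function of state of charge (0..1)."""
    return 3.5 + 0.7 * min(soc, 1.0)

soc = 0.2                      # start partly discharged
for minute in range(10 * 60):  # simulate up to 10 hours in 1-minute steps
    # The charger raises its output until either the current reaches I_CC
    # (CC mode) or the terminal voltage reaches V_CV (CV mode).
    i = min(I_CC, (V_CV - ocv(soc)) / R_INT)
    v = ocv(soc) + i * R_INT   # terminal voltage the charger actually applies
    mode = "CC" if i >= I_CC else "CV"
    if i <= I_TERM:
        print(f"{minute:4d} min  charge terminated at {i*1000:.0f} mA")
        break
    if minute % 30 == 0:
        print(f"{minute:4d} min  {mode}  {v:.3f} V  {i*1000:.0f} mA  soc={soc:.2f}")
    soc += i * (1 / 60) / CAPACITY_AH   # crude coulomb counting, 1-minute step
```

Running it, you can watch the output sit at 500mA with a slowly rising voltage (CC), then pin at 4.2V while the current tapers (CV) until the cutoff.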
Now suppose the charger (i.e. the supply) were designed to be constant current all the time. No real design actually does this; it is just a thought experiment to aid understanding. When the battery is near full, the charger would keep increasing the voltage as far as it can to force the current to stay at 500mA. Raising the voltage like that on a LiIon cell will kill the battery or even cause a fire, but if you (in theory) want truly constant current, the only way to make the current go up is to push the voltage up.
Charger or plain supply, V = IR by Ohm's law: the higher the current (I), the higher the voltage (V) needed for a given resistance (R).
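As a rough numeric illustration of where that leads, here is a tiny sketch of the voltage a hypothetical "always constant current" supply would need as the load's effective resistance rises. The resistance values are arbitrary examples.

```python
# Tiny numeric illustration of V = I * R: the voltage a supply would need
# to force a fixed 500 mA as the effective load resistance rises.
# The resistance values are arbitrary examples, not from any real load.

I = 0.5  # amps, the current we insist on

for r in (2, 8, 20, 100, 1000):   # ohms
    print(f"R = {r:5d} ohm -> need V = {I * r:7.1f} V to keep 500 mA flowing")
```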
Charger or just plain PSU, as you can see, a power supply cannot increase its voltage indefinitely. Thus, in real life, it is usually a current limit rather than a true constant current. Conversely, you cannot have a truly constant voltage either: there is a limit to the amount of current the supply can produce, so beyond a certain point the voltage will drop because it can give you no more.
Hope this helps.