... and you thought a simple power supply would be something simple to talk about.

As you can tell from the above responses, there are quite a few things that can come into play - but let's step back to the fundamental question you asked.
This is a good summary:
Voltage is something you have. Current is what you take. Quick answer.
... but to spell things out more clearly, I offer the following:
* From the specification given, your transformer is capable of providing an output current of up to 1.6A at an output voltage of 40V.
* The output voltage will be 40V *1 - a figure defined by the physical construction of the transformer and the input voltage on the primary.
* The current drawn from this transformer will depend on the circuit connected to it. If you have a circuit that only needs 10mA, then that is all the current that will flow from the transformer. Likewise, if you have a circuit that needs 1.5A, then that is the current which will flow from the transformer.
* If you have a circuit that sometimes needs 10mA and sometimes needs 1.5A, then sometimes the current flowing will be 10mA and sometimes it will be 1.5A. This happens purely as a function of the physics - you do not have to do anything to make it so, and you cannot make the current flow at any rate other than the one the physics determines. It is an extremely common situation: a circuit may be monitoring some condition (and only need 10mA to do that) and then switch on a heating circuit, a motor or some other load that needs 1.5A. (There is a small numeric sketch of this after the note below.)
*1 Now, in the real world, the transformer has some physical limitations, and while we like to treat the output voltage as a constant 40V, it does vary slightly. Those limitations include the strength of the magnetic field within the transformer and the resistance of the wire. The general rule is: the voltage at no load will be higher than the voltage given for the transformer (sometimes notably higher, so watch out for that), and the output voltage will be at (or close to) the nominated value when supplying the full rated current.
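
If it helps to put numbers on the points above, here is a minimal Python sketch of that behaviour. It models the secondary as its no-load voltage in series with a winding resistance; the 42V and 1.25 ohm figures are invented for illustration, not taken from your transformer's datasheet.

```python
# Illustrative only: a transformer secondary behaves roughly like an ideal
# source (its no-load voltage) in series with the winding resistance.
# The numbers below are assumptions for illustration, not datasheet values.
V_NO_LOAD = 42.0   # volts, a bit above the 40 V nominal (typical of real transformers)
R_WINDING = 1.25   # ohms, assumed effective series resistance of the secondary

def output_for_load(load_current_a):
    """Approximate output voltage when the load draws the given current."""
    return V_NO_LOAD - load_current_a * R_WINDING

for i_load in (0.0, 0.010, 1.5, 1.6):
    print(f"load = {i_load*1000:7.1f} mA -> output = {output_for_load(i_load):5.2f} V")
```

Run it and you will see the current is simply whatever the load asks for, the output sits a little above 40V at light load, and it lands on 40V at the full 1.6A - exactly the no-load versus full-load behaviour described in the note.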
NOW - there is a question you may not have considered, but it is this: can a transformer supply more than its "rated" current?
The simple answer is "Yes" - however, there is a huge BUT associated with that. Things included in that "BUT" are:
* The voltage WILL sag (fall) to a value below the nominal figure
* The transformer WILL start to overheat. If this overcurrent lasts only a short time - for example a turn-on surge during the initial charging of capacitors, where it is all over in less than a second - then the overheating will usually be insignificant. If you run overcurrent for an extended period, you WILL stress the transformer physically. It can cook and fail. Lovely smell.
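
To put rough numbers on that "BUT", here is the same simplified series-resistance model as the earlier sketch, extended past the rating. The 42V and 1.25 ohm figures are still assumptions, and real heating also depends on core losses, ambient temperature and how long the overload lasts.

```python
# Same simplified model as the earlier sketch: no-load voltage minus an
# assumed series resistance. Heat dissipated in the winding grows with the
# square of the current, which is why sustained overcurrent cooks a transformer.
V_NO_LOAD = 42.0   # volts (assumed, not a datasheet figure)
R_WINDING = 1.25   # ohms (assumed effective secondary resistance)
I_RATED = 1.6      # amps, from the specification

for i_load in (1.6, 2.0, 3.0, 5.0):
    v_out = V_NO_LOAD - i_load * R_WINDING   # output sags as current rises
    p_copper = i_load ** 2 * R_WINDING       # watts lost as heat in the winding
    status = "within rating" if i_load <= I_RATED else "OVER rating"
    print(f"{i_load:3.1f} A: output = {v_out:5.2f} V, "
          f"winding heat = {p_copper:5.2f} W ({status})")
```

In this toy model, going from 1.6A to 3A more than triples the heat in the copper while the output drops toward 38V - tolerable for a sub-second inrush, not for minutes on end.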