Yes, basically: compare it to a known good-enough meter (a reference) and write down the differences between the readings, along with the date and, if possible, the temperature. That is calibration.
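Recording the comparison can be as simple as a table of (reference reading, your reading, date, temperature). A minimal sketch of such a log; the field names and example values are my own invention, not any standard format:

```python
from datetime import date

def cal_record(reference, reading, unit, temp_c=None):
    """One calibration point: reference meter value vs. meter under test."""
    return {
        "date": date.today().isoformat(),
        "reference": reference,      # value from the known-good meter
        "reading": reading,          # value from your meter
        "error": reading - reference,
        "unit": unit,
        "temp_c": temp_c,            # ambient temperature, if known
    }

# Example entries (made-up numbers):
log = [
    cal_record(5.000, 5.012, "V", temp_c=22.5),
    cal_record(1000.0, 998.0, "ohm", temp_c=22.5),
]
for r in log:
    print(f'{r["date"]}: {r["reading"]} {r["unit"]} '
          f'(ref {r["reference"]}, error {r["error"]:+.3f})')
```

Later you can look up the error for the range you are using and correct your readings by hand.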
Sources for good-enough meters:
* local electronics shops
* hobbyists / ham radio operators
* a local school that teaches electronics
* a very friendly, but possibly expensive, cal lab
Ask if they mind you comparing your own meter to theirs.
On the other hand, you might decide to buy a new meter that you trust to be good enough, which will then act as your personal reference meter. (Treat it well.)
It gets easier once you think about which ranges you actually want to calibrate: comparing resistance measurements is a piece of cake, and low-voltage, low-current DC is easy too. AC values and high voltages or currents are a bit more complicated, but do you really need them? If there is something specific you want to measure, just take it with you and measure it with both your meter and the reference meter.
I rarely use any AC, and never more than 400 VDC.
If possible, make or buy sources that produce the values you want to calibrate: a handful of resistors (one or two for every range your meter has), a stable voltage source (buy a Fluke if you are rich, a Geller Labs reference if so inclined, or make one yourself), and combine the U source with an R to get an I source.
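Turning the voltage source plus a resistor into a current source is just Ohm's law, I = U/R. A quick sketch for picking values (the numbers are examples, not recommendations):

```python
def current_from(u_volts, r_ohms):
    """Ohm's law: a reference U across a resistor R sources I = U / R."""
    return u_volts / r_ohms

def resistor_for(u_volts, i_amps):
    """Resistor needed to draw a target current from the reference."""
    return u_volts / i_amps

# A 5.000 V reference across 5 kOhm gives 1 mA:
print(current_from(5.0, 5000))    # 0.001 A
# For a 10 mA test point from the same reference, use 500 Ohm:
print(resistor_for(5.0, 0.010))
```

Keep in mind the accuracy of this "current source" is only as good as the voltage reference and the resistor tolerance combined.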
And finally, know your meter! It has an input impedance in voltage measurement, and a burden voltage in current measurement!
You can even go one step further and check whether the service manual of your meter tells you how to adjust it.
Try to bring your "slave" meter's reading as close as possible to the value the reference meter reads.
After that, do the calibration as described above.
Did this help ?
BR
Babysitter Hendi