As Kleinstein already mentioned, you should first only verify the accuracy, and adjust only if needed.
The official definition of calibration only covers comparing the readings of the reference and the instrument under test.
(In "common abuse" or historic usage, the word calibration often implies adjustment as well.)
Calibration is actually a two-step procedure (shamelessly stolen from the BIPM VIM):
1) establishing a relation between the quantity values with measurement uncertainties provided by measurement standards and corresponding indications with associated measurement uncertainties
2) using this information to establish a measurement result
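The two steps above can be sketched as a toy example (all numbers are hypothetical; assumes a simple linear relation between reference values and indications, fitted by least squares):

```python
# Toy sketch of the two-step calibration procedure (hypothetical readings).
# Step 1: establish a relation between reference values and DUT indications.
# Step 2: use that relation to turn a new indication into a measurement result.

ref = [0.0, 1.0, 5.0, 10.0]          # reference standard values (V)
ind = [0.002, 1.003, 5.006, 10.011]  # corresponding DUT indications (V)

# Step 1: least-squares straight line, ind = gain * ref + offset
n = len(ref)
mx = sum(ref) / n
my = sum(ind) / n
gain = (sum((x - mx) * (y - my) for x, y in zip(ref, ind))
        / sum((x - mx) ** 2 for x in ref))
offset = my - gain * mx

# Step 2: invert the relation to correct a new indication
def measurement_result(indication):
    return (indication - offset) / gain

print(f"gain = {gain:.6f}, offset = {offset*1e3:.3f} mV")
print(f"corrected reading: {measurement_result(7.509):.4f} V")
```

A full treatment would also propagate the measurement uncertainties mentioned in step 1; this sketch only shows the structure of the procedure.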
What Kleinstein mentioned ("checking of the readings") is not a calibration; it's a verification.
For hobby purposes: verification of test equipment is easily possible with accessible gadgets. You just have to "steal" (borrow) the measurement result from a higher-quality standard (e.g. a 6.5-digit DMM for a 5.5-digit DMM) and somehow transfer it to your test equipment. Your SMU/source only has to be short-term stable, just long enough to steal those precious ppms.
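A minimal sketch of such a transfer check (hypothetical numbers; assumes the two uncertainty contributions are uncorrelated and can be combined in quadrature):

```python
import math

# Hypothetical example: verifying a 5.5-digit DMM against a value
# "borrowed" from a 6.5-digit DMM reading the same stable source.
ref_value = 9.99987   # reading of the higher-quality standard (V)
ref_unc = 0.00012     # its expanded uncertainty (V)
dut_value = 10.00021  # reading of the device under test (V)
dut_spec = 0.00060    # DUT accuracy spec at this point (V)

# Combine the contributions in quadrature (assumed uncorrelated)
limit = math.sqrt(ref_unc ** 2 + dut_spec ** 2)
error = dut_value - ref_value

print(f"error = {error * 1e6:+.0f} uV, limit = +/-{limit * 1e6:.0f} uV")
print("PASS" if abs(error) <= limit else "FAIL")
```

If the error exceeds the limit, that is the point at which an adjustment (and a before/after record, as noted below) becomes worthwhile.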
My tips:
- take experimental and environmental conditions into account (temperature, humidity, contacts and cables, shielding, warm-up time) so that results are comparable
- try not to use the cheapest low-quality chinesium gear (especially cables), although there are very cheap but effective ways to get good-enough results
- if you have to adjust your device under test, take measurements before and after the adjustment
- always document every number, the experimental conditions, and the settings
- Don't go down the metrology rabbit hole