Here's my $0.10: 2 cents per paragraph

Both meters have a similar accuracy of about 30 ppm/90 days in the middle resistance ranges, so you won't be able to ensure the 8505A is in spec, but you should be able to get it pretty close. I would definitely use four-wire connections for everything under 1 MOhm.
Thermal EMF could be a factor. The voltage while measuring resistance is about 1 V, and a copper-nickel junction produces about 10 uV/°C, so thermal EMF errors could easily be on the order of 10 ppm. See also
Fluke metrology 101: Watch Out for Those Thermoelectric Voltages!.
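To put a rough number on that claim (all values here are the illustrative ones from above, not measured), the fractional error is just the stray EMF divided by the sense voltage:

```python
# Back-of-envelope thermal EMF error estimate (illustrative numbers only).
seebeck_uv_per_c = 10.0   # approx. Seebeck coefficient, copper-nickel junction, uV/degC
delta_t_c = 1.0           # assumed temperature difference across the junction, degC
v_sense = 1.0             # approx. sense voltage during a resistance measurement, V

emf_v = seebeck_uv_per_c * delta_t_c * 1e-6   # stray voltage in volts
error_ppm = emf_v / v_sense * 1e6             # fractional error in ppm
print(f"thermal EMF error ~ {error_ppm:.1f} ppm")
```

So even a single degree of gradient across one dissimilar-metal junction eats a third of a 30 ppm budget.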
To combat this, you can either (or both) use offset compensation if the meters support it, or use materials (copper) that minimize thermal EMF. With offset compensation, the meter measures the voltage with the current source enabled (a normal resistance measurement), then takes another measurement with the current source disabled (which should read 0 V in a perfect world), and subtracts the latter voltage from the first.

As you see in the tables in the documents I linked, thermal EMF depends on the type of metal, so you can minimize it by using all-copper connections, but make sure the copper is not oxidized. A simple solution is UTP (cat5/6/7) network cable; I like solid strands best. I connect the stripped strands directly to binding posts on both the DMM and the device under test (e.g. resistor). If the DMM only has banana jacks (boo Keithley!), you can insert the copper wires and then insert Q-tips to press them against the jacks. I strip the outer sheath and extract individual pairs, but you could also leave the cable intact, especially if it's shielded. In that case, use the shield for the guard connection (more about that later). Leaving connections alone for a while after touching them, to wait for thermal equilibrium, also helps with thermal EMF.
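The offset-compensation arithmetic is trivial but worth seeing once. This is a sketch with made-up readings (1 mA source, 35 uV of stray EMF riding on both measurements), not any particular meter's firmware:

```python
def offset_compensated_resistance(v_on, v_off, i_source):
    """Two-step offset compensation: one voltage reading with the current
    source enabled, one with it disabled; any constant thermal EMF appears
    in both readings and cancels in the subtraction."""
    return (v_on - v_off) / i_source

# Hypothetical readings on a nominal 1 kOhm resistor:
r = offset_compensated_resistance(v_on=1.000035,   # 1 V drop + 35 uV thermal EMF
                                  v_off=35e-6,     # thermal EMF alone, source off
                                  i_source=1e-3)   # 1 mA test current
print(r)  # ~1000 ohms; the common 35 uV offset cancels
```

Without the subtraction, those 35 uV would show up as a 35 ppm error, i.e. more than the whole 30 ppm spec.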
I believe the SR1 resistors are manganin, so they should have a noticeable temperature coefficient on the order of 10 ppm/°C. And of course the meters will also have a temperature coefficient. So ideally try to match the ambient temperature to the one it was calibrated at as closely as possible, and definitely record the ambient temperature at which you do the "calibration". Some resistors, like the NBS/Rosa type (the L&N resistors you often see on eBay for ~$100), have a special hole for inserting a temperature probe. Here it also helps to let the resistor sit for a bit after handling, since your hands will heat it up.
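To see why the ambient temperature matters at this level, here is the drift estimate with assumed numbers (a 10 ppm/°C tempco and a 2 °C offset from the calibration temperature; both are hypothetical, check your own data sheet and cal report):

```python
# Worst-case drift from an ambient temperature offset (assumed values).
tempco_ppm_per_c = 10.0   # assumed manganin tempco, ppm/degC
cal_temp_c = 23.0         # temperature at last calibration (assumed)
ambient_c = 25.0          # your lab's ambient (assumed)

drift_ppm = tempco_ppm_per_c * abs(ambient_c - cal_temp_c)
print(drift_ppm)  # 20.0 ppm, already most of a 30 ppm budget
```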
If you are using a shielded cable, connect the shield to the guard terminal at the meter end, make sure to enable external guarding (aka disconnect guard from LO) on the meter, and connect the shield to the LO force/current terminal at the resistor end. If the resistor has a shield, connect it to the same point. If you are not connecting guard at the resistor end, make sure the meter is set to internal guarding / connect guard to LO. The meters' manuals might give more information about this topic, but I think
this application note gives a clear and quite deep treatment of the subject of guarding.
Then take a number of resistance readings, say 16, and check whether there is an upward or downward trend (which suggests something isn't fully stabilized yet) and whether the sample standard deviation is well below the target uncertainty (say << 30 ppm). For very high and very low resistance values, expect a larger uncertainty. You will also need this standard deviation (well, the standard error of the mean, which is the standard deviation divided by the square root of the number of samples) to calculate the uncertainty of your calibration according to the standard
Guide to the expression of uncertainty in measurement if you want.
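The analysis of those 16 readings is a few lines of Python. The readings below are invented for illustration, and the half-vs-half comparison is just one crude way to spot a trend (a least-squares slope would be more rigorous):

```python
import statistics

# Hypothetical set of 16 readings of a nominal 10 kOhm resistor, in ohms
readings = [10000.012, 10000.015, 10000.011, 10000.014,
            10000.013, 10000.012, 10000.016, 10000.013,
            10000.014, 10000.012, 10000.015, 10000.013,
            10000.011, 10000.014, 10000.013, 10000.012]

mean = statistics.fmean(readings)
stdev = statistics.stdev(readings)        # sample standard deviation (n-1)
sem = stdev / len(readings) ** 0.5        # standard error of the mean

# Crude drift check: compare the mean of the second half to the first half
half = len(readings) // 2
trend = statistics.fmean(readings[half:]) - statistics.fmean(readings[:half])

print(f"mean  = {mean:.4f} ohm")
print(f"stdev = {stdev / mean * 1e6:.2f} ppm, SEM = {sem / mean * 1e6:.2f} ppm")
print(f"half-to-half trend = {trend / mean * 1e6:.2f} ppm")
```

If the stdev in ppm is comfortably below your target (here well under 30 ppm) and the trend is small compared to the stdev, the mean and the SEM are the numbers that go into your GUM uncertainty budget.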