This is one of the things that come up a lot for no good reason.
First, think about what you would gain by measuring mains current in the described manner. Assuming you actually get a measurement (and that's a big assumption), all you'd learn is the short-circuit current at that single outlet, which isn't going to be very helpful.
Now time to scare you straight!
First, let's think a bit about a realistic system: your multimeter is rated for 20 A, but under what conditions? Did you read the manual completely? Did you fully understand the specifications in it? If the answer is yes, are you 100% sure that you did? Would you bet your life, and the lives of the people around you, on that?
Let's take a look at what happens when you measure current: your multimeter contains some sort of resistive shunt element, and when measuring current it is actually measuring the voltage across that element. That may be a precision resistor, or it could be just a metal bar of (more or less) known resistance.
When I stick one of my meter's probes into the ampere socket of my other meter (also rated for 20 A), I read around 0.5 ohms of resistance. Do that experiment yourself! Now let's assume for a second that the reading is correct and that the resistance of my probes and my meter's socket is zero. If I were in the US and connected that meter across a mains socket to "measure the current", then in a world of ideal voltage sources I'd have about 220 A (110 V across 0.5 Ω) flowing through that meter. That's 11 times the current it's rated for. If a probe connection had a resistance of, say, 0.01 ohm, it would be dissipating around 484 W of heat just at that contact. The meter itself would be dissipating around 24 kW! You can see that under such conditions it would melt almost instantly.
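The arithmetic above is just Ohm's law plus I²R. Here's the same back-of-envelope calculation as a short sketch; the 110 V, 0.5 Ω, and 0.01 Ω figures are the assumed values from the example above, not measurements of any particular meter:

```python
# Back-of-envelope fault numbers for connecting a current-mode meter
# across mains. All inputs are illustrative assumptions.

V_MAINS = 110.0   # assumed US mains voltage, volts (treated as an ideal source)
R_SHUNT = 0.5     # resistance read across the meter's ampere input, ohms
R_PROBE = 0.01    # assumed resistance of one probe contact, ohms

# Ignoring probe/socket resistance, as in the text:
i_fault = V_MAINS / R_SHUNT        # Ohm's law: 220 A through the meter

p_probe = i_fault**2 * R_PROBE     # heat in one 0.01-ohm contact: ~484 W
p_meter = i_fault**2 * R_SHUNT     # heat in the shunt itself: ~24.2 kW

print(f"fault current     ~ {i_fault:.0f} A")
print(f"probe dissipation ~ {p_probe:.0f} W")
print(f"meter dissipation ~ {p_meter / 1000:.1f} kW")
```

Note that even the "small" 0.01 Ω contact resistance turns into hundreds of watts at fault current, which is why the sparking and melting described below are not an exaggeration.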
On the other hand, you have circuit breakers and hopefully fuses in the meter. That's all fine and nice, but! Circuit breakers have a minimum response time, and that time might be too long for your hands or whatever you're using to hold the probes. Next, there's the fuse in the meter. If that fuse is a safe one (and that's a big, huge, extremely dangerous assumption), it should blow very quickly and open the circuit, hopefully before the rest of the meter vaporizes. Fuses have a maximum breaking current and a maximum breaking voltage, and you need to be sure those ratings are realistic and sufficient for your case. There's a reason you can buy a whole multimeter (or maybe even more than one) for the price of a single Fluke fuse.
Next, you have all the imperfections along the way. I won't go into them in depth; other members of this forum are much more capable of providing the relevant details. I'll just mention that in this case you can expect lots of sparking when making the connection, and you can expect the multimeter's fuse to explode, maybe even taking bits of the PCB with it. Good-quality multimeters often have some sort of blast barriers inside, especially around the fuses, meant to keep an explosion from escaping the meter's enclosure. They also have specially designed seams to reduce the chance of an explosion getting outside of the meter.
And keep in mind: the time it takes a circuit breaker to react is short, but the time it takes the meter to cripple you is even shorter. And yes, I've heard of a guy who was scratching his head trying to understand why appliances still worked from a plug whose current read as zero. You really don't want to find out whether you're as lucky as he was.