How in the world does a metal wire-wound resistor resist, if the metal inside is a good conductor?
Why doesn't it get hot and fail like a diode?
The wire may or may not melt, depending on its resistance to the flow of electricity, which, if you're thinking about a plain chunk of copper wire, is largely determined by its gauge. Small wire... high-ish resistance... lots of heat generated pushing the current through it... burns up. Large wire... low-ish resistance... not much heat generated pushing the current through it... doesn't burn up.
How in the world am I supposed to know from Ohm's Law (V = IR) whether a lone resistor across a mains line will or will not blow up? Let's say it's 120 V AC at 60 Hz, and a 1 ohm resistor. That implies 120 amps are flowing through the resistor, right? But how is that sufficient to know whether or not the part will burn up?
Knowing the voltage and current going into the part isn't sufficient by itself to tell whether it will burn up.
You also have to know the voltage, current, and power ratings of the part you're dealing with.
In this instance, if you've got a 1 ohm resistor rated to handle 120 volts (likely more), AND 120 amps (likely more), AND, most importantly, 14,400 watts (volts times amps = watts; that's the power formula that goes hand in hand with Ohm's law)... only if all of those conditions are met will the part survive. (OK, it may survive with lower ratings, but I wouldn't bet on it.) So, in your example you need a resistor capable of handling almost 15,000 watts.
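A quick sketch of that arithmetic, assuming an ideal resistor and using the hypothetical helper name `dissipated_power` (not from the original post):

```python
def dissipated_power(volts, ohms):
    """Watts dissipated by a resistor of `ohms` placed straight across `volts`."""
    amps = volts / ohms   # Ohm's law: I = V / R
    return volts * amps   # power formula: P = V * I

# The 1 ohm resistor across 120 V mains from the example:
p = dissipated_power(120, 1)
print(p)  # 14400.0 watts -- far beyond any ordinary resistor's power rating
```

Unless the resistor is rated for that 14.4 kW (plus the voltage and current), it burns up.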
Similarly, if you put a 1 megaohm (1,000,000 ohm) resistor across that same 120 volts, current flow will be limited to about 0.00012 amps, i.e. 0.12 milliamps.
Power (watts) = Volts times amps.
120 * 0.00012 = 0.0144 watts is dissipated in that resistor. So, IF the resistor is rated to handle 120 volts, you could put a small 1/8 watt resistor across the 120 volt source and NOT burn it up.
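One way to sanity-check that, assuming an ideal resistor: substituting I = V/R into P = V*I gives P = V²/R, a handy one-liner for a resistor straight across a source:

```python
# P = V**2 / R for a resistor directly across a voltage source
volts, ohms = 120, 1_000_000
watts = volts**2 / ohms
print(watts)          # 0.0144
print(watts < 0.125)  # True: fits within a 1/8 watt (0.125 W) rating
```

Same 0.0144 watts as above, comfortably under the 1/8 watt rating.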
http://lmgtfy.com/?q=ohms+law+wiki
http://lmgtfy.com/?q=Watt+wiki