Having said the above, if the water droplets evaporate during the compression stroke, they reduce the air temperature and therefore the work needed to compress that air. I'm not sure how much that is worth, though: the work spent compressing the charge is normally recouped on the expansion stroke anyway, so the net gain may well be negligible.
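For a sense of scale, here is a minimal sketch, assuming ideal-gas adiabatic compression at a fixed compression ratio; the charge mass, temperatures, and compression ratio are illustrative assumptions rather than figures from above.

```python
# Minimal sketch, assuming ideal-gas adiabatic compression at a fixed
# compression ratio: the work to compress the charge scales with its
# temperature at the start of the stroke. All numbers are illustrative.

CV_AIR = 718.0   # J/(kg*K), specific heat of air at constant volume
GAMMA = 1.4      # ratio of specific heats for air

def compression_work(mass_kg: float, t_start_k: float, comp_ratio: float) -> float:
    """Work (J) to adiabatically compress an ideal-gas charge by comp_ratio."""
    t_end = t_start_k * comp_ratio ** (GAMMA - 1.0)
    return mass_kg * CV_AIR * (t_end - t_start_k)

charge_mass = 6e-4  # kg, roughly the air in a 0.5 L cylinder (assumed)
for t in (320.0, 300.0):  # warmer vs. cooler charge at the start of compression
    w = compression_work(charge_mass, t, comp_ratio=10.0)
    print(f"start at {t:.0f} K: compression work = {w:.0f} J")
```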
You can get more power out of an engine at lower intake air temperatures because the air is denser, so a larger mass of air fits into the cylinder on each intake stroke, so more fuel can be burned on the power stroke (more oxygen present), and therefore more power is produced.
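To illustrate the density effect, here is a minimal sketch using the ideal gas law; the 20 °C vs 40 °C comparison and the standard-pressure assumption are just illustrative.

```python
# Rough sketch: intake air density vs. temperature via the ideal gas law.
# The specific temperatures and pressure are illustrative assumptions.

R_SPECIFIC_AIR = 287.05  # J/(kg*K), specific gas constant for dry air
P_ATM = 101_325.0        # Pa, standard atmospheric pressure

def air_density(temp_celsius: float, pressure_pa: float = P_ATM) -> float:
    """Dry-air density from the ideal gas law, rho = p / (R * T)."""
    return pressure_pa / (R_SPECIFIC_AIR * (temp_celsius + 273.15))

cold, hot = 20.0, 40.0
rho_cold, rho_hot = air_density(cold), air_density(hot)
print(f"{cold:.0f} C: {rho_cold:.3f} kg/m^3")
print(f"{hot:.0f} C: {rho_hot:.3f} kg/m^3")
print(f"Extra air mass per intake stroke at {cold:.0f} C: "
      f"{100 * (rho_cold / rho_hot - 1):.1f}%")
```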
Now, suppose you cool the intake air by injecting water and letting it evaporate before it reaches the cylinder. Water vapour has a molecular weight of 18, noticeably lower than that of nitrogen (28) or oxygen (32), so at the same temperature and pressure the water-laden air is less dense than dry air, and on top of that the water vapour displaces some of the oxygen in the charge (the mixture no longer contains the usual 21% oxygen, but rather less). So for two reasons there is now less oxygen in the intake charge (lower charge density, displaced oxygen), countering the desired effect of the lower intake temperature (higher density). I haven't done the calculations, but my intuition says power will go down if you try this.
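For anyone who wants to check that intuition, here is a back-of-the-envelope sketch under idealised assumptions (complete evaporation upstream of the valve, ideal gases, constant pressure, latent heat drawn entirely from the charge); the intake temperature and water/air ratios are illustrative assumptions.

```python
# Back-of-the-envelope check: oxygen drawn into a fixed cylinder volume,
# dry intake vs. intake cooled by fully evaporating a small mass of water.
# All figures (intake temperature, water/air ratio, property values) are
# illustrative assumptions, not measurements.

M_AIR, M_H2O = 0.02897, 0.018015  # kg/mol, molar masses
CP_AIR, CP_VAP = 1005.0, 1900.0   # J/(kg*K), specific heats at constant pressure
L_VAP = 2.40e6                    # J/kg, latent heat of vaporisation (~30 C)
X_O2_DRY = 0.2095                 # mole fraction of O2 in dry air

def o2_charge_ratio(t_intake_c: float, water_per_kg_air: float) -> float:
    """Ratio of O2 moles in the cylinder: (wet, cooled charge) / (dry charge).

    Assumes complete evaporation upstream of the valve, ideal-gas behaviour,
    constant pressure, and that the latent heat comes from the charge itself.
    """
    t0 = t_intake_c + 273.15
    w = water_per_kg_air
    # Temperature after evaporative cooling (simple energy balance).
    t1 = t0 - w * L_VAP / (CP_AIR + w * CP_VAP)
    # Mole fractions in the wet charge, per kg of dry air.
    n_air = 1.0 / M_AIR
    n_h2o = w / M_H2O
    x_air = n_air / (n_air + n_h2o)
    # Cylinder volume and pressure are fixed, so total moles scale as 1/T.
    # Dry case O2 ~ X_O2_DRY / t0; wet case ~ X_O2_DRY * x_air / t1.
    return (t0 / t1) * x_air

# Only roughly 10 g of water per kg of air can evaporate completely before the
# charge saturates at these conditions, so keep the water fraction small.
for grams in (5, 10):
    ratio = o2_charge_ratio(t_intake_c=40.0, water_per_kg_air=grams / 1000.0)
    print(f"{grams} g water per kg air: O2 charge ratio = {ratio:.3f}")
```

A ratio above 1 means the cooling effect wins, below 1 that the dilution wins; which way it comes out depends on how much water actually evaporates and on the starting conditions.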
On the other hand, suppose you inject water droplets and don't allow them to evaporate before they reach the cylinder. This time no significant amount of air is displaced, so no oxygen starvation occurs. Instead, the liquid water evaporates in the cylinder using the combustion heat, adding extra working gas and increasing the expansion power of the hot gases (making the internal combustion engine a kind of hybrid steam engine). In this case it may be that the power could be increased, though again I have not worked through the thermodynamic analysis to put numbers on it. Presumably if it were a viable proposition there would be water injectors in every car engine, and there are not...
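To get a feel for the size of the two in-cylinder effects (extra working gas versus combustion heat absorbed as latent heat), here is a rough sketch; the cylinder volume, intake conditions, fuel properties, and water mass per cycle are all illustrative assumptions.

```python
# Rough sketch of the in-cylinder evaporation idea: for an assumed water mass
# injected per cycle, compare (a) the extra working gas it contributes and
# (b) the combustion heat it soaks up as latent heat. Every number here is an
# illustrative assumption.

R = 8.314          # J/(mol*K), universal gas constant
M_AIR = 0.02897    # kg/mol, molar mass of air
M_H2O = 0.018015   # kg/mol, molar mass of water
L_VAP = 2.26e6     # J/kg, latent heat of vaporisation of water
AFR_STOICH = 14.7  # stoichiometric air/fuel ratio for petrol (by mass)
LHV_FUEL = 44.0e6  # J/kg, lower heating value of petrol

def steam_contribution(cyl_volume_l: float, water_mass_g: float,
                       intake_temp_k: float = 300.0,
                       intake_pressure_pa: float = 101_325.0) -> None:
    v = cyl_volume_l / 1000.0
    n_charge = intake_pressure_pa * v / (R * intake_temp_k)  # moles of intake charge
    air_mass = n_charge * M_AIR                              # charge is almost all air
    fuel_mass = air_mass / AFR_STOICH
    heat_release = fuel_mass * LHV_FUEL

    n_steam = (water_mass_g / 1000.0) / M_H2O
    latent = (water_mass_g / 1000.0) * L_VAP

    print(f"extra working gas: {100 * n_steam / n_charge:.1f}% more moles in the cylinder")
    print(f"heat absorbed by evaporation: {100 * latent / heat_release:.1f}% of the fuel's heat release")

steam_contribution(cyl_volume_l=0.5, water_mass_g=0.02)
```

The two numbers pull in opposite directions (more gas tends to raise cylinder pressure, while the absorbed latent heat lowers the peak temperature), which is why a proper cycle analysis would be needed to say which effect wins.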