If it's stored on a hard drive, in Flash, or in DRAM, it's probably on the order of hundreds or thousands of perturbed electrons (stuck, in one way or another: magnetically in the HD, electrically in the memory).
The amount of energy associated with that change, multiplied by some number of kB for the size of a post*, is down in the nano- or picojoules. So the relativistic mass change** will be very, very small indeed — in fact, many orders of magnitude below the mass error inherent in a macroscopic clump of atoms: you physically cannot measure mass accurately enough (due to binding energy, temperature, internal disorder, and so on) to count the atoms!
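To put a number on it, here's a quick back-of-envelope sketch. The ~1 nJ total energy figure is just an assumption drawn from the "nano- or picojoules" range above, and the silicon-atom comparison is an illustrative choice:

```python
# Back-of-envelope: relativistic mass change from storing one post.
# The ~1 nJ total stored energy is an assumed figure (see text above).
C = 2.998e8          # speed of light, m/s
energy_j = 1e-9      # ~1 nJ across all the perturbed electrons (assumed)

delta_m = energy_j / C**2        # E = m c^2  =>  m = E / c^2
print(f"mass change: {delta_m:.2e} kg")   # ~1.11e-26 kg

# For scale: a single silicon atom has a mass of ~4.66e-26 kg, so the
# change is smaller than even one atom -- let alone the measurement
# uncertainty on a macroscopic chip.
SI_ATOM_KG = 4.664e-26
print(f"fraction of one Si atom: {delta_m / SI_ATOM_KG:.2f}")
```

So even with a generously high energy estimate, the mass change doesn't amount to a single atom's worth.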
*But should we also count the knock-on effects of updating the database so the post is linked into the system? It may be in the 10s of kB, on a per-post basis. Maybe a whole lot more, I have no idea... Oh, and is this just in final storage, or are we counting the changes in all the intermediate caches too?
Alright, never mind...
**The mass doesn't really change, because on average, the system hasn't gained or lost much energy. That is, given a compressed (looks like random noise!) data stream, the entropy is pretty much the same, for any input you might have. It's about half ones and half zeroes! Moreover, symmetrical systems (like the HD's magnetic bits) won't really store energy (the particles are magnetized regardless -- just in + or - directions). Asymmetrical systems will, like memory (which holds a static electric field, or nothing).
Perhaps a more interesting theoretical question is: how much energy is needed? We're something like 6 orders of magnitude away from the information-theoretic limit, which comes from the amount of energy needed to transition a (quantum) system from one state to another in a finite time (comparable to modern clock rates). And that limit is something like 10 orders of magnitude away from relativistically measurable information change [citation needed], so the concept of "relativistic information theory" is a very ponderous one, indeed.
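For the curious, the two usual candidates for that limit can be computed directly. This sketch shows both, since the text doesn't name which bound it means; the 300 K temperature, the ~3 GHz clock, and the ~1 fJ-per-bit figure for current logic are all illustrative assumptions:

```python
import math

K_B = 1.380649e-23      # Boltzmann constant, J/K
H = 6.62607015e-34      # Planck constant, J*s

# Landauer bound: erasing one bit at temperature T costs at least
# kT ln 2. Room temperature (300 K) is an assumed value.
landauer_j = K_B * 300.0 * math.log(2)

# Margolus-Levitin bound: a state transition in time t needs
# E >= h / (4 t); take t from an assumed ~3 GHz clock.
t = 1.0 / 3e9
ml_j = H / (4.0 * t)

print(f"Landauer:         {landauer_j:.2e} J/bit")
print(f"Margolus-Levitin: {ml_j:.2e} J/transition")

# Compare against an assumed ~1 fJ switched per bit in current logic;
# the ratio lands in the ballpark of the "6 orders of magnitude" gap.
cmos_j = 1e-15
print(f"gap vs Landauer: ~{cmos_j / landauer_j:.0e}x")
```

The Landauer number (~3e-21 J/bit at room temperature) is the one that sits roughly 6 orders of magnitude below a femtojoule-per-bit switch, consistent with the estimate above.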
Tim