So a definition of entropy is the amount of information needed to describe a microstate, that is, one particular configuration of the memory, given only a description of the macrostate. A memory described as, for example, 64 kilobytes has 524,288 bits, so there are 2^524288 possible states of the memory consistent with that description. It is therefore proper to describe the entropy as 524,288 bits, which is the base-2 logarithm of the number of possible configurations.
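A quick check of that counting, as a minimal Python sketch (the only input is the 64 kilobytes from above):

```python
import math

# 64 KiB of memory, 8 bits per byte.
bits = 64 * 1024 * 8            # 524,288 bits
num_states = 2 ** bits          # number of distinct memory configurations

# Entropy in bits = log2(number of states) = the number of bits itself.
entropy_bits = math.log2(num_states)
print(bits, entropy_bits)       # 524288 524288.0
```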
The same idea applies to physical systems such as gases. We describe a gas with extensive quantities such as volume and mass, and intensive quantities such as pressure and temperature. Unlike a flash memory, whose possible states form a discrete, countable set of combinations of 0 and 1 bits, a physical system can take on a continuum of states, because each gas particle can be at any position in the box. So for physical systems with uncountably many states we work with entropy differences rather than absolute counts.
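To see why only differences are well defined, here is a minimal sketch. The grid size dx and the particle count N are made up for illustration; the point is that the arbitrary discretization cancels out of any entropy difference:

```python
import math

# Pretend we pin each particle's position down to cells of side dx.
# The absolute entropy depends on the arbitrary choice of dx, but the
# difference between two volumes does not, because dx cancels in the ratio.
def position_entropy_bits(volume, dx, n_particles):
    """Bits needed to name each particle's cell, for a given grid size dx."""
    cells = volume / dx**3
    return n_particles * math.log2(cells)

N = 1000
V1, V2 = 1.0, 0.5               # halve the volume (arbitrary units)
for dx in (1e-3, 1e-6, 1e-9):   # very different discretizations
    dS = position_entropy_bits(V2, dx, N) - position_entropy_bits(V1, dx, N)
    print(dx, dS)               # ~ -1000 bits every time: one bit per particle
```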
Let's say we have an ideal gas, that is, a gas whose particles do not interact: no collisions, no attraction, no repulsion. The gas is in a cylinder with a movable piston, and the piston can be pushed in or pulled out very slowly to decrease or increase the volume. The position of every particle can be described as an x, y, and z coordinate somewhere in that volume. Now imagine that the piston is pushed in slowly to halve the volume, compressing the gas. Because each particle can now only be in half the volume it used to occupy, one bit less of information is required to describe its position: at the same positional accuracy, only half the range needs to be covered after compression. To force the gas particles into the smaller space, we had to do work. Boltzmann's constant, k = 1.38E-23 J/K, converts this information (counted in natural-log units) into thermodynamic entropy: halving each particle's accessible volume reduces the entropy by k ln 2, about 1E-23 J/K per gas particle. At room temperature, T = 300 K, the minimum work needed to remove that much entropy isothermally is kT ln 2, about 3E-21 joules per gas particle.
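Plugging in the constants to check those two numbers (nothing here beyond k, T, and ln 2):

```python
import math

k_B = 1.380649e-23              # Boltzmann's constant, J/K
T = 300.0                       # room temperature, K

# Entropy removed per particle when its accessible volume is halved:
dS_per_particle = k_B * math.log(2)     # ~9.6e-24 J/K, i.e. roughly 1e-23 J/K

# Minimum work per particle to remove that entropy isothermally at T:
w_per_particle = k_B * T * math.log(2)  # ~2.9e-21 J, i.e. roughly 3e-21 J

print(dS_per_particle, w_per_particle)
```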
Basically, we pushed on the piston and did work on the system, reducing its entropy (and the number of states the particles can be in); that work leaves the gas as heat, so the entropy of the universe as a whole still increases. So you can see, with a microchip or a gas, how entropy is really just a measure of what is not known about the system given its macroscopic description.
In fact, for memory, Landauer's principle ( https://en.wikipedia.org/wiki/Landauer%27s_principle ) gives exactly the minimum energy needed to erase a bit: kT ln 2. This is exactly what we did by compressing the gas in the cylinder. We erased one bit of the range needed to describe each gas particle's position, and so the minimum energy needed to erase a bit of memory is also about 3E-21 J at room temperature.
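As a rough illustration of how that bound depends on temperature (the formula is just kT ln 2; the lower temperatures are purely illustrative):

```python
import math

k_B = 1.380649e-23   # Boltzmann's constant, J/K

def landauer_limit(T):
    """Minimum energy (J) to erase one bit at temperature T: k_B * T * ln 2."""
    return k_B * T * math.log(2)

# The bound scales linearly with temperature, so colder hardware could,
# in principle, erase bits more cheaply.
for T in (300.0, 77.0, 4.2):     # room temperature, liquid nitrogen, liquid helium
    print(T, landauer_limit(T))  # ~2.9e-21 J, ~7.4e-22 J, ~4.0e-23 J
```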