One-sided limits are written like this:
\$ \lim\limits_{x\to 0^{+}}{1\over x} = +\infty \$
\$ \lim\limits_{x\to 0^{-}}{1\over x} = -\infty \$
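These two one-sided limits can be checked symbolically. A quick sketch using SymPy's `limit` function and its `dir` argument (my choice of tool, not part of the original post):

```python
# Sketch: checking the one-sided limits of 1/x with SymPy.
import sympy as sp

x = sp.symbols('x')

right = sp.limit(1/x, x, 0, dir='+')   # limit from the right
left = sp.limit(1/x, x, 0, dir='-')    # limit from the left

print(right)  # oo
print(left)   # -oo
```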
A two-sided limit exists only when the left-hand and right-hand limits are equal. (Note that equal one-sided limits alone don't make the function continuous or differentiable there; for continuity the limit must also equal the function's value.) So the example above has no limit at x=0. But for this:
\$ \lim\limits_{x\to 0^{+}}{1\over x^2} = \lim\limits_{x\to 0^{-}}{1\over x^2} = \lim\limits_{x\to 0}{1\over x^2} = \infty \$
The limit does exist (at least as an infinite limit, since both sides agree).
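SymPy can verify this agreement too. A small sketch, assuming a reasonably recent SymPy where `dir='+-'` requests the two-sided limit:

```python
import sympy as sp

x = sp.symbols('x')

# Both one-sided limits of 1/x**2 agree...
r = sp.limit(1/x**2, x, 0, dir='+')
l = sp.limit(1/x**2, x, 0, dir='-')

# ...so the two-sided limit (dir='+-') exists as well; SymPy raises
# a ValueError here when the two sides disagree.
both = sp.limit(1/x**2, x, 0, dir='+-')

print(r, l, both)  # oo oo oo
```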
Things get even more fun in multivariable calculus, because you can approach a point from any direction (or indeed along any path).
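A classic illustration (a standard textbook function, my addition rather than the original post's) is f(x, y) = xy/(x² + y²), whose value near the origin depends on the direction of approach:

```python
# Sketch: a two-variable function with no limit at the origin, because
# different approach paths give different values.
def f(x, y):
    return x * y / (x**2 + y**2)

# Approach (0, 0) along the x-axis (y = 0): every value is 0.
along_axis = [f(t, 0.0) for t in (0.1, 0.01, 0.001)]

# Approach (0, 0) along the diagonal y = x: every value is 1/2.
along_diagonal = [f(t, t) for t in (0.1, 0.01, 0.001)]

print(along_axis)      # [0.0, 0.0, 0.0]
print(along_diagonal)  # [0.5, 0.5, 0.5]
```

Since two paths give different values, the (two-sided, or rather all-sided) limit at the origin does not exist.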
If a function does not have a two-sided limit at a certain point, a lot of ordinary calculus falls apart, and you need to be extra careful applying it near that point, because the function has a jump or a gap there.
For example, differentiate the Heaviside (unit step) function: you get the impulse, better known as the Dirac delta. Although the latter is usually called a function, it is actually a distribution. Unfortunately, distribution theory was skipped in my calculus and signals & systems courses.
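SymPy happens to handle this symbolically, since it ships with `Heaviside` and `DiracDelta` objects (a quick sketch, my addition):

```python
import sympy as sp

t = sp.symbols('t')

# Differentiating the unit step yields the Dirac delta
# (valid in the sense of distributions, not pointwise).
derivative = sp.diff(sp.Heaviside(t), t)

print(derivative)  # DiracDelta(t)
```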
Nevertheless, discontinuities and one-sided limits show up a lot in signal theory, e.g. in the Laplace or Fourier transform, because most of these transforms are unilateral; otherwise the integrals behind the standard transform tables would not converge towards minus infinity. This poses problems if you have a causal system with an initial value at t=0\$^{+}\$.
Another typical case is the fundamental convergence theorem of Fourier series, which describes what value a signal's Fourier series takes at a discontinuity: namely the average of the left- and right-hand limits.
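You can see that averaging behaviour numerically. A sketch using a square wave of my own choosing (0 on (-π, 0), 1 on (0, π)) whose Fourier series is the well-known 1/2 + (2/π) Σ sin((2k+1)t)/(2k+1):

```python
import math

def partial_sum(t, N):
    """Partial Fourier sum of a 2*pi-periodic square wave that is 0 on
    (-pi, 0) and 1 on (0, pi)."""
    return 0.5 + (2 / math.pi) * sum(
        math.sin((2*k + 1) * t) / (2*k + 1) for k in range(N + 1))

# At the jump t = 0 every sine term vanishes, so the series converges to
# 0.5 = (0 + 1)/2: the average of the left and right one-sided limits.
print(partial_sum(0.0, 1000))  # 0.5

# Away from the jump it converges to the actual signal value (here 1).
print(partial_sum(1.0, 2000))
```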