Right, to get a smaller offset voltage you need some kind of gain; the best we can do with a single device is the antiquated germanium bipolar transistor, with a Vbe around 0.3 V. To do any better (with silicon or otherwise), we need more transistors/diodes to trim the effective voltage drop smaller. Typically a differential amplifier is used, which also removes the temperature and bias dependence that a Vbe drop has.
Physics presents us with other opportunities, which can be more generally useful. If we apply the Hall effect, we can generate a sense voltage proportional to a magnetic field, in turn proportional to a current flow. The current flow can be isolated, AC or DC, and with a little circuitry we can see what's going on.
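As a rough illustration of the Hall approach: a ratiometric Hall sensor puts out a voltage offset from some zero-current level, scaled by its sensitivity. The numbers below (100 mV/A, 2.5 V at zero current) are assumptions typical of off-the-shelf parts, not any specific device; check the datasheet of whatever you actually buy.

```python
# Convert a ratiometric Hall sensor's output voltage to sensed current.
# Sensitivity and zero-current offset are ASSUMED values for illustration.
V_ZERO = 2.5          # output at zero current (V), often mid-supply
SENSITIVITY = 0.100   # volts per amp (V/A)

def hall_to_current(v_out):
    """Return sensed current (A) from sensor output voltage (V)."""
    return (v_out - V_ZERO) / SENSITIVITY

# 2.8 V at the output corresponds to roughly 3 A in the positive direction
print(hall_to_current(2.8))
```

Note the sign comes for free: current in the other direction drives the output below V_ZERO, which is part of why these sensors handle DC and reversed hookups gracefully.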
Hall effect sensors are cheap and abundant, though "cheap" is relative (a few bucks). You're likely better off with a shunt resistor, reference and comparator, if those are suitable.
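The shunt-plus-comparator math is just Ohm's law: pick a shunt so the drop at your trip current matches the comparator's reference. A quick sketch, with illustrative numbers (100 mV reference, 0.5 A trip point), not a recommendation for any particular circuit:

```python
# Size a shunt resistor so the comparator trips at the desired current.
# V_REF and I_TRIP are ASSUMED example values.
V_REF = 0.1    # comparator reference voltage (V)
I_TRIP = 0.5   # current at which the output should trip (A)

R_shunt = V_REF / I_TRIP            # shunt resistance (ohms)
P_shunt = I_TRIP ** 2 * R_shunt     # dissipation at the trip point (W)

print(R_shunt, P_shunt)  # 0.2 ohm, dissipating 50 mW at trip
```

The same arithmetic tells you the cost: that 100 mV burden is stolen from the load, which is exactly the offset-voltage tradeoff the first paragraph is about.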
For AC only, we can use Faraday's law to induce a sense voltage, adding an arbitrarily small voltage drop to the sensed line. In other words, a current transformer. (Obviously, this doesn't work so well at DC, so doesn't apply to the present case.)
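For the current-transformer case, the secondary current is the primary current divided by the turns ratio, and a burden resistor turns that into a sense voltage. Turns ratio and burden value below are assumptions for illustration:

```python
# Current transformer sense voltage.
# The secondary carries i_primary / N, and the burden resistor
# converts it to a voltage. N_TURNS and R_BURDEN are ASSUMED values.
N_TURNS = 1000     # e.g. a 1000:1 CT
R_BURDEN = 100.0   # burden resistor (ohms)

def ct_sense_voltage(i_primary):
    """Sense voltage (V, RMS) for a given primary current (A, AC RMS)."""
    return (i_primary / N_TURNS) * R_BURDEN

# 1 A RMS primary -> 1 mA secondary -> roughly 100 mV across the burden
print(ct_sense_voltage(1.0))
```

The burden reflects back to the primary divided by N squared (here 100 ohms / 10^6 = 0.1 milliohm), which is why the inserted drop can be made arbitrarily small.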
So, we must consider suitability. Is power always available? (Kind of a silly question maybe, but important when we're looking at whether the lamp circuit itself is live, versus just the lamp.) Is the current always in the same direction? Do we have to worry about it being plugged in backwards or anything? (If the sense were on the lamp itself, we might need protection to deal with the lamp being plugged in wrong. Depending on type, of course.) Given answers to these and other questions, we can solve for a perfect solution -- and if the requirements are modest, the examples given above will already be close.

Likely any better-suited solution will still follow the same plan, just with different sorts of comparators, output signaling, signal filtering, etc. Simple stuff.
Tim