To a large extent - yes. Modern LEDs have extremely thin metallization on top of the bandgap area (or none at all, just the bulk material - see illustration below), with the total active region only a micron or so thick. At that scale the material between the active region and the surface is essentially transparent. Other clever tricks are used too, like patterning the sapphire substrate to more efficiently reflect stray photons back out, or adding microlenses or micro-waveguides...
All this has led to ridiculous external quantum efficiencies nowadays. I made
a calculator a while ago that predicted a 75% LER (lighting efficiency, i.e. light power out versus electrical power in, meaning a 10 W LED at 75% LER only outputs 2.5 W of heat). Including the phosphor losses, that would mean the underlying blue LED needs essentially near-100% quantum efficiency (about 91-93% by my calculation). Sure enough, measurements in an integrating sphere confirm this. If all we needed in our world were lots of monochromatic blue light, we would pretty much have the perfect light source already.
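The power-balance arithmetic above (light power out vs. electrical power in, with the remainder as heat) can be sketched as a tiny helper. This is only an illustration of the bookkeeping, not the author's actual calculator; the function name and signature are made up:

```python
# Hypothetical helper: split an LED's electrical input power into
# radiated light and waste heat, given its efficiency (light out / power in).
def led_power_split(input_power_w: float, efficiency: float) -> tuple[float, float]:
    """Return (light_power_w, heat_power_w) for efficiency in 0..1."""
    light = input_power_w * efficiency
    heat = input_power_w - light
    return light, heat

# The example from the text: a 10 W LED at 75% efficiency
light, heat = led_power_split(10.0, 0.75)
print(light, heat)  # → 7.5 2.5
```

At 75% efficiency, only a quarter of the input power ends up as heat, which is why such LEDs run far cooler per lumen than older parts.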