The LED itself is almost certainly rated only for 3V. Normally, the current limiting resistor for 5V or 3.3V is on the display PCB itself. I wouldn't run it at 5V if you expect it to live any meaningful amount of time.
I just checked one of my cheapie eBay LCDs, and it uses a 100 ohm resistor for 5V and draws exactly 20mA, which means it's a 3V LED.
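That inference is just Ohm's law run backwards; a quick sketch of the arithmetic (assumed values: 5V supply, the measured 100 ohm resistor, 20mA):

```python
# Back-calculating the LED forward voltage from the measured numbers above.
V_SUPPLY = 5.0      # volts
R_SERIES = 100.0    # ohms, measured on the display PCB
I_MEASURED = 0.020  # amps (20 mA)

v_resistor = I_MEASURED * R_SERIES  # Ohm's law: V = I * R -> 2.0 V
v_led = V_SUPPLY - v_resistor       # whatever's left drops across the LED

print(f"Drop across resistor: {v_resistor:.1f} V")
print(f"LED forward voltage:  {v_led:.1f} V")
```

That leaves 3V across the LED, which is typical for a white backlight LED.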
Originally I was going to say I'd run it off the -6.5V rail with an appropriate resistor (180 ohms, assuming you want the usual 20mA). But after looking at the schematic, and admitting I don't totally understand the circuit, I'd leave that rail alone, because it looks like it's connected to the voltage reference.
So I'd use the +15V rail through a 5V regulator (properly configured with input/output caps) and a 100 ohm current limiting resistor, same as the stock 5V arrangement above. Probably overkill, but better than a simple resistor, since the ±15V rails are unregulated. (Otherwise, you could run straight off the 15V rail with around 600 ohms, but that's a full quarter watt of dissipated heat, so either use a 1/2W resistor, or divide the resistance among a few resistors, e.g. two 1.2K 1/4W in parallel, or four 150 ohm 1/8W in series.)
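The straight-off-the-rail option is the same Ohm's law math plus a power check; a sketch assuming a 3V LED at 20mA:

```python
# Sizing the series resistor and checking dissipation for the bare +15 V option.
V_RAIL = 15.0   # volts, unregulated rail
V_LED = 3.0     # volts, LED forward voltage (from the measurement above)
I_TARGET = 0.020  # amps (20 mA)

r_needed = (V_RAIL - V_LED) / I_TARGET    # R = V / I -> 600 ohms
p_resistor = (V_RAIL - V_LED) * I_TARGET  # P = V * I, heat burned in the resistor

print(f"Series resistance: {r_needed:.0f} ohms")
print(f"Resistor power:    {p_resistor:.2f} W")
```

That 0.24W is why a single 1/4W part is marginal: resistors shouldn't run at their rated limit, so splitting the dissipation across two or four parts (or using a 1/2W part) keeps everything comfortably cool.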
Using a voltage divider makes no sense here; all it does is waste even more power as heat.
Also, you can put some tape (aluminum foil tape or opaque white tape is ideal) on the cut edge of the backlight panel to prevent light leakage, both to improve efficiency and to keep it from lighting up the buttons.