One thing I seldom see in technical discussions of cell phone risk is the distinction between "near field" energy and "far field" radiation power. Of course, as we go higher in frequency, the wavelength shrinks, and the extent of the near-field region shrinks with it. The near-field region stores energy in an oscillating (not traveling) EM field close to the transmit antenna, while the far field carries the useful traveling EM wave propagating away from the antenna, roughly beyond a wavelength. The relation between the two depends on the antenna geometry, and the mathematical description of the transition between the regions is messy. In communications, objects near the antenna can absorb substantial power from the near-field energy, reducing the efficiency of the transmitter system; presumably, holding a phone against your skull has a similar effect.

Most of the discussion I have seen is relevant to the question of siting cell antennas near schools, etc., where the victims are in the far field and the power density hitting them is the relevant parameter. In the near-field region, the relevant parameter is SAR (specific absorption rate), in W/kg; there are regulatory limits for both MRI and cell phones, based on models of human anatomy. Last year, a muckraking article in the Chicago Tribune reported independent tests of existing cell phones and found some that exceeded the FCC regulatory limit; the manufacturers blamed software.
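To put rough numbers on the far-field side, here is a back-of-the-envelope sketch. The 1.9 GHz frequency, 20 W transmit power, 10x antenna gain, and 100 m distance are made-up illustrative values, and the lambda/(2*pi) boundary is the conventional reactive-near-field rule of thumb for electrically small antennas:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def wavelength_m(freq_hz: float) -> float:
    """Free-space wavelength for a given frequency."""
    return C / freq_hz

def reactive_near_field_boundary_m(freq_hz: float) -> float:
    """Conventional reactive near-field boundary for an electrically
    small antenna: lambda / (2 * pi)."""
    return wavelength_m(freq_hz) / (2 * math.pi)

def far_field_power_density_w_m2(p_tx_w: float, gain_linear: float, r_m: float) -> float:
    """Far-field power density at distance r_m from an antenna radiating
    p_tx_w watts with the given linear gain, assuming free-space
    spreading: S = P * G / (4 * pi * r^2)."""
    return p_tx_w * gain_linear / (4 * math.pi * r_m ** 2)

# Hypothetical 1.9 GHz base station, 20 W per channel, gain ~10x
f = 1.9e9
print(f"wavelength: {wavelength_m(f) * 100:.1f} cm")                      # 15.8 cm
print(f"near-field boundary: {reactive_near_field_boundary_m(f) * 100:.1f} cm")  # 2.5 cm
print(f"power density at 100 m: {far_field_power_density_w_m2(20, 10, 100) * 1e3:.2f} mW/m^2")
```

Note how small the near-field boundary gets at cell frequencies: a phone held against the head is well inside it, while anyone near a tower is far outside it, which is why the two exposure situations need different metrics.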
https://www.chicagotribune.com/investigations/ct-cell-phone-radiation-testing-20190821-72qgu4nzlfda5kyuhteiieh4da-story.html (Paywall).
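For the near-field side, SAR is computed from the electric field induced inside tissue. A minimal sketch of the point-SAR formula follows; the tissue conductivity, density, and internal field strength below are illustrative guesses, not values from any standard, and real compliance testing averages SAR over 1 g or 10 g of a phantom model rather than evaluating it at a point:

```python
def point_sar_w_kg(e_rms_v_m: float, conductivity_s_m: float, density_kg_m3: float) -> float:
    """Point SAR from the RMS internal E-field:
    SAR = sigma * |E|^2 / rho  (W/kg)."""
    return conductivity_s_m * e_rms_v_m ** 2 / density_kg_m3

# Illustrative tissue-like inputs (not from a standard):
# sigma ~1.4 S/m, rho ~1000 kg/m^3, internal field 30 V/m RMS
sar = point_sar_w_kg(30.0, 1.4, 1000.0)
print(f"point SAR: {sar:.2f} W/kg")  # 1.26 W/kg with these made-up inputs
```

For scale, the FCC limit at issue in the Tribune tests is 1.6 W/kg averaged over 1 g of tissue.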