I'd suggest looking at an established principle of measurement: the INVERSE SQUARE law. It says that field-driven effects diminish much faster than linearly as you move away from the source. If you're located 200 yards from a typical cell phone tower and double that distance to 400 yards, the power density reaching you drops to one quarter of what it was, not just one half.
That means any such 'harmful' effects attributed to the tower fall to far less meaningful levels with even modest distance, dropping much faster than linearly as you move away.
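The inverse-square falloff above can be sketched with a few lines of Python. The 100 W figure and the isotropic-radiator assumption are mine, purely for illustration (real towers use directional antennas and various power levels); the point is the ratio, which doesn't depend on those choices:

```python
import math

def power_density(radiated_watts, distance_m):
    """Power density (W/m^2) at distance_m from an assumed isotropic radiator:
    the radiated power spread evenly over a sphere of radius distance_m."""
    return radiated_watts / (4 * math.pi * distance_m ** 2)

METERS_PER_YARD = 0.9144
near = power_density(100, 200 * METERS_PER_YARD)  # at 200 yards
far = power_density(100, 400 * METERS_PER_YARD)   # at 400 yards

# Doubling the distance cuts power density to one quarter.
print(near / far)  # -> 4.0
```

Whatever the transmitter power, the ratio between the two distances is always (400/200)² = 4, which is the whole point of the inverse-square argument.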
That's not to say there's no danger at all, but the real dangers are less subtle. IONIZING radiation, for example, is one, but that takes extreme exposure. A cell tower 1/4 mile away would need something on the order of a nuclear detonation behind it to produce IONIZING effects on your skin.
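You can also check the ionizing claim directly from photon energy, E = h·f. The ~12 eV ionization threshold I use here is an illustrative round figure (it's roughly the ionization energy of water); the exact value doesn't matter, since cell-band photons miss it by about six orders of magnitude:

```python
PLANCK_H = 6.626e-34       # Planck constant, J*s
JOULES_PER_EV = 1.602e-19  # joules per electron-volt

f_cell = 2.0e9  # 2 GHz, a typical cellular band (my example frequency)
photon_energy_ev = PLANCK_H * f_cell / JOULES_PER_EV

# A 2 GHz photon carries roughly 8e-6 eV, vs. ~12 eV needed to ionize,
# so cell-tower RF is non-ionizing no matter how many photons arrive.
print(photon_energy_ev)  # -> about 8.3e-06
```

Intensity can still cause heating up close, but no amount of 2 GHz power makes the individual photons ionizing.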
And, no, I don't think such EM exposure would somehow bypass the skin and start altering your brain matter. If the fields were that strong, you'd have massive skin burns, along with internal damage, first.
Certain animals, whales for example, might sense a nearby cell phone tower, but only at very close range.
Sure, a sensitive antenna might measure some 'frequencies', which, by the way, is not a technical term for EM fields but a 'folk' or common term, and a rather inappropriate one.
(Maybe like calling rain 'drowning fluid'.)
You might enjoy picking up a good engineering book on RF and HIGH VOLTAGE safety, written from an industrial hygiene perspective, because there are real dangers up close, in a testing laboratory for example.