My question is: how are they overcoming the technical obstacles of putting, say, 1 Gb/s of data on a 60GHz signal without line-of-sight? I actually didn't realize anyone made hardware that operated at those frequencies.
I first saw a 60GHz link in 1996. It wasn't phased array, but it had a 20dBi antenna that was ~4cm long. It also had a thin piece of plastic over the antenna that acted as an anti-reflection coating, in the same way that there is an anti-reflection coating on your camera's lens. And that clearly indicates you aren't in Kansas anymore :)
A limiting problem back then was simple thermodynamics: the power transistors are necessarily small, and output power is limited by the ability to remove heat from a small area. There will have been improvements since then, especially from spreading the power across many transistors in a phased-array transmitter, but I'd still look carefully at that topic.
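To put rough numbers on that tradeoff, here's a back-of-the-envelope sketch (my figures, not anything from the vendor) of how spreading transmit power across N array elements trades per-device heat against total EIRP:

    import math

    def array_eirp_dbm(p_element_dbm, n_elements, element_gain_dbi):
        """EIRP of an N-element phased array.

        Total transmit power grows as 10*log10(N); coherent combining
        adds another 10*log10(N) of array gain, hence 20*log10(N).
        """
        return p_element_dbm + element_gain_dbi + 20 * math.log10(n_elements)

    # One 250 mW (24 dBm) device vs. 16 devices at 4 mW (6 dBm) each:
    # the array wins on EIRP while each transistor dissipates far less heat.
    print(array_eirp_dbm(24, 1, 5))   # single PA:   29 dBm EIRP
    print(array_eirp_dbm(6, 16, 5))   # 16 elements: ~35 dBm EIRP

That 20*log10(N) scaling is exactly why the many-small-transistors approach is attractive against the heat-removal wall.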
As for line-of-sight only: 60GHz signals do have multipath, as you'd expect, but narrow beams mean a non-line-of-sight link depends on a handful of reflected paths that are weaker and less stable. Cue phased-array transmitters, but I'd like to see how they determine where to point the beam this millisecond.
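The pointing math itself is cheap, for what it's worth; the hard part is knowing the right angle this millisecond. A minimal sketch, with made-up parameters, of the per-element phase shifts that steer a uniform linear array:

    import math

    def steering_phases_deg(n_elements, spacing_m, freq_hz, steer_angle_deg):
        """Per-element phase shifts (degrees) to point a uniform
        linear array steer_angle_deg off boresight."""
        wavelength = 3e8 / freq_hz
        # Progressive phase: each element lags by d*sin(theta)/lambda cycles.
        dphi = 360.0 * spacing_m * math.sin(math.radians(steer_angle_deg)) / wavelength
        return [(i * dphi) % 360.0 for i in range(n_elements)]

    # 8 elements at half-wavelength spacing (2.5 mm at 60 GHz), steered 20 deg.
    print(steering_phases_deg(8, 2.5e-3, 60e9, 20.0))

So the interesting question isn't computing the phases, it's the tracking loop that decides which reflection to chase when the direct path gets blocked.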
I note that the consumer's antenna has to be mounted in a window. This is realistic given that 60GHz does not penetrate many materials well, including foliage. 60GHz is, of course, in the "oxygen hole" (the O2 absorption peak), so range will be limited anyway.
If there is a link budget published anywhere, I'd be curious to read it.
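In the absence of a published one, here's the rough shape such a budget would take, with the ~15dB/km oxygen absorption at 60GHz folded in. Every number below is a guess on my part, not a measurement:

    import math

    def received_power_dbm(eirp_dbm, rx_gain_dbi, dist_m, freq_hz,
                           o2_loss_db_per_km=15.0):
        """Friis free-space path loss plus the 60 GHz O2 absorption term."""
        fspl_db = 20 * math.log10(4 * math.pi * dist_m * freq_hz / 3e8)
        o2_db = o2_loss_db_per_km * dist_m / 1000.0
        return eirp_dbm - fspl_db - o2_db + rx_gain_dbi

    # Guessed numbers: 35 dBm EIRP, 20 dBi window-mounted antenna, 300 m hop.
    print(received_power_dbm(35, 20, 300, 60e9))   # about -67 dBm

Whether -67 dBm leaves enough SNR for 1 Gb/s depends on the modulation and channel bandwidth, which is exactly why I'd like to see the real budget.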