Hi all, let's start here.
We are working on solutions to run computers significantly hotter and harder, specifically to create heat with computation; we're hitting 200+ °F in the cooling loop today. That heat can be used for more than just running a boiler: with roughly 225-235 °F we can also power air conditioning and refrigeration. I know you are electronics guys, so you will have to invest some Google time looking into desiccant-enhanced air conditioning and absorption chillers/refrigeration - the building science guys get the implications pretty quickly. With the addition of refrigeration and air conditioning loads, thermal storage, and proper sizing, there is no room in our model for 'exhausting heat' - there is also a reason it's called Project Exergy! If you want to talk about exhausting useful heat, track down Nerdalize or Cloud & Heat.
Since the biggest loads in our buildings currently run on heat (space and domestic hot water) or could be converted to run on heat (air conditioning and refrigeration) with well-developed, existing technology, we could be getting two (or more) benefits out of the same energy we are using today just to run our heaters or our computers. (Please don't tell me about the cost of natural gas; we're obviously targeting electric heating markets, which are large and growing.) One or the other could happen for free, from an energy perspective, if we used computers to make the heat that runs our homes/businesses/economy.
Interestingly, the majority of the US economy actually runs on heat, not electricity... but that is a far more involved topic that I'd rather discuss with economists.
A gaming rig or laptop isn't going to make enough heat to have an impact, so we aren't talking about converting current computers into a supercomputer/heater. We are building high-temperature, modular 2-4 kW computing appliances, coupled with thermal storage, that sit in the space a typical hot water heater occupies today and perform locally the computation that datacenters are currently doing remotely near the Arctic Circle (JacquesBBB). Yes, there is an existing, small but growing market for this compute. The model is to keep the demand and the compute local so we aren't shipping huge amounts of data to the Arctic Circle - which is pretty expensive, BTW.
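For a rough sense of scale, here is a back-of-the-envelope sketch of what one 2-4 kW appliance yields thermally against household heat loads. The utilization, heat-recovery, and load figures are illustrative assumptions on my part, not Project Exergy numbers:

```python
# Back-of-the-envelope heat output of one appliance vs. household thermal loads.
# Every number here is an illustrative assumption, not a Project Exergy spec.

APPLIANCE_KW = 3.0             # electrical draw, mid-range of the 2-4 kW appliances
UTILIZATION = 0.8              # fraction of the day spent computing at full power (assumed)
HEAT_RECOVERY = 0.95           # share of electrical input captured as useful heat (assumed)

DHW_KWH_PER_DAY = 12.0         # typical domestic hot water load (assumed)
SPACE_HEAT_KWH_PER_DAY = 40.0  # space heat on a cold day, electric-heat market (assumed)

heat_kwh = APPLIANCE_KW * 24 * UTILIZATION * HEAT_RECOVERY
print(f"Recoverable heat per day:      {heat_kwh:.0f} kWh")
print(f"Covers domestic hot water:     {heat_kwh >= DHW_KWH_PER_DAY}")
print(f"Share of cold-day space heat:  {heat_kwh / SPACE_HEAT_KWH_PER_DAY:.0%}")
```

The point is simply that a water-heater-sized appliance run hard is in the same league as a household's heat demand; the real numbers depend on climate, workload, and storage sizing.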
http://aceee.org/files/proceedings/2012/data/papers/0193-000409.pdf The data is also split across multiple machines, which takes care of security, redundancy, and resiliency issues as well as cutting down dramatically on the need for high-speed internet, at least for now. Again, this already exists in some of the distributed computing projects
https://en.wikipedia.org/wiki/List_of_distributed_computing_projects as well as distributed rendering.
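To make the redundancy point concrete, here is a minimal sketch of chunking a dataset and replicating each chunk across a handful of local appliances. The node names, chunk size, and replication factor are placeholders made up for illustration; the distributed projects linked above each have their own schemes.

```python
# Minimal sketch: split a dataset into chunks and place each chunk on several
# local appliances so no single machine holds (or can lose) the whole thing.
# Node names, chunk size, and replication factor are illustrative placeholders.
import hashlib
import os

NODES = ["appliance-1", "appliance-2", "appliance-3", "appliance-4"]
REPLICAS = 2            # copies of each chunk (assumed)
CHUNK_SIZE = 64 * 1024  # bytes per chunk (assumed)

def place_chunks(data: bytes) -> dict[str, list[str]]:
    """Map each chunk id to the nodes that store a copy of it."""
    placement = {}
    for offset in range(0, len(data), CHUNK_SIZE):
        chunk = data[offset:offset + CHUNK_SIZE]
        chunk_id = hashlib.sha256(chunk).hexdigest()[:12]
        start = int(chunk_id, 16) % len(NODES)   # deterministic home node for the chunk
        placement[chunk_id] = [NODES[(start + r) % len(NODES)] for r in range(REPLICAS)]
    return placement

if __name__ == "__main__":
    for chunk_id, nodes in place_chunks(os.urandom(200_000)).items():
        print(chunk_id, "->", nodes)
```

Losing any one appliance still leaves a copy of every chunk reachable on a neighbor one hop away, which is the resiliency and short-hop point.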
(JacquesBBB) - We don't want to build datacenters; we want to eliminate them, so I am not going to spend time jousting about datacenter performance. The equation is simple: datacenters buy energy for compute and cooling. We are using existing energy consumption (space/water heat, refrigeration/air conditioning), which is already paid for, to compute. Datacenters pay significant sums to transfer information to the Arctic Circle - we want to keep it local and distributed over short hops, which are inherently faster.
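Putting that equation in rough numbers (the PUE, electricity price, and heating-offset figures are generic assumptions for illustration, not measurements from our system or any particular datacenter):

```python
# Rough comparison of energy purchased per kWh of compute, under the simple
# equation above. PUE, price, and the heating offset are illustrative assumptions.

DATACENTER_PUE = 1.5        # total facility energy / IT energy (assumed typical value)
ELECTRICITY_PRICE = 0.12    # $/kWh (assumed)

compute_kwh = 1.0

# Conventional datacenter: buys the compute energy plus cooling/overhead.
dc_energy = compute_kwh * DATACENTER_PUE
dc_cost = dc_energy * ELECTRICITY_PRICE

# Heat-reuse appliance: the same kWh was going to be bought anyway for electric
# space/water heating, so the marginal purchase for compute shrinks toward zero
# when the heat is fully used.
HEATING_OFFSET = 1.0        # fraction of appliance heat displacing heat already paid for (assumed)
appliance_energy = compute_kwh * (1 - HEATING_OFFSET)
appliance_cost = appliance_energy * ELECTRICITY_PRICE

print(f"Datacenter energy bought per compute kWh: {dc_energy:.2f} kWh (${dc_cost:.3f})")
print(f"Appliance marginal energy per compute kWh: {appliance_energy:.2f} kWh (${appliance_cost:.3f})")
```

The punchline: when the appliance's heat displaces heat someone was already buying, the marginal energy purchased for the compute itself approaches zero, while a datacenter always buys at least one kWh plus cooling overhead for every kWh of compute.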