I'm looking at various options for adding some renewable energy into our house's energy mix, and as the price of solar panels falls they start to look like a sensible option. However, there is currently no sensible feed-in tariff in the UK, and electricity still isn't actually very expensive, so reducing fossil fuel consumption without spending more money is really difficult.
However, it struck me that if you start with solar panels for just water heating, then a DC load operating in input-voltage-control mode is basically an MPP (maximum power point) controller for the panels: by varying the current it draws, it can hold the panel voltage at the point where the array delivers the most power. And since what you want is heat anyway, if you make that load water cooled, your solar electric energy is driven pretty much straight into the water, with no need for an expensive DC -> AC inverter.
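To make that concrete, here's a minimal, self-contained simulation of the idea. All the figures (Isc, Voc, diode exponent, gains) are illustrative assumptions, not measurements of any real panel or controller: the load's conductance is servoed to hold the panel at a fixed fraction of its open-circuit voltage, which for crystalline silicon sits close to the MPP.

```python
# Toy model: a controllable DC load regulating panel voltage near the MPP.
# All numbers below are assumptions for illustration only.

ISC, VOC, N = 35.0, 38.0, 12  # assumed 1kW-class array: ~35A short circuit, ~38V open circuit

def panel_current(v):
    """Toy panel I-V curve: stiff current source that collapses near Voc."""
    return max(ISC * (1.0 - (v / VOC) ** N), 0.0)

V_SET = 0.78 * VOC   # MPP of crystalline silicon is typically ~76-80% of Voc
K_P   = 0.005        # proportional gain on the load conductance (assumed)

g_load = 0.1         # conductance (A/V) presented by the pass transistors
v = VOC              # start unloaded, at open-circuit voltage
for _ in range(300):
    # Settle the operating point: the input capacitor charges when the panel
    # supplies more current than the load draws, so relax v accordingly.
    for _ in range(50):
        v += 0.02 * (panel_current(v) - g_load * v)
    # Control law: panel voltage above the setpoint -> draw more current.
    g_load = max(g_load + K_P * (v - V_SET), 0.0)

p = v * panel_current(v)
print(f"settled at {v:.1f}V, {panel_current(v):.1f}A, {p:.0f}W into the water")
```

On these assumed figures it settles at roughly 30V and 33A, i.e. nearly the full kilowatt dumped straight into the water as heat.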
A 1kW nominal system would be reasonably easy and relatively cheap, perhaps using 20 parallel pass elements. Bipolar transistors are fine here because you don't care how the current gets to ground, so the high base current is no problem: the base drive ends up as heat too. That works out at 50W per device, just about inside the SOA for a typical TO-220 package. You could also wire your solar panels in parallel and run at a low voltage to avoid any HV issues; obvs your resistive losses will be a bit higher than running at HV, but a 1kW system at say 30Vdc is only about 33 amps, so nothing too wild.
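Quick sizing sums for that 1kW / 30Vdc / 20-device arrangement (the worst-case hFE is an assumed typical figure for a power BJT at a couple of amps, not a datasheet value for any particular part):

```python
P_TOTAL = 1000.0   # W, nominal array output
V_BUS   = 30.0     # V, low-voltage parallel-wired array
N_DEV   = 20       # parallel pass transistors
HFE_MIN = 20       # assumed worst-case current gain

i_total   = P_TOTAL / V_BUS      # ~33A total
p_per_dev = P_TOTAL / N_DEV      # 50W dissipated per device
i_per_dev = i_total / N_DEV      # ~1.7A collector current per device
i_base    = i_per_dev / HFE_MIN  # ~83mA base drive per device
# The base current isn't wasted: it flows through the same device
# and ends up as heat in the water along with the collector current.

print(f"{i_total:.1f}A total; per device: {p_per_dev:.0f}W, "
      f"{i_per_dev:.2f}A collector, {i_base*1000:.0f}mA base")
```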
One question I did wonder about is how the life of a BJT is affected by the temperature at which it runs. Given the water outlet temperature will need to be, let's say, 80 degC (aiming for 60 to 65 degC max in the hot water tank), running at the maximum junction temperature (150 degC) gives an overall deltaT for the heater of around 70 degC. So to pull 50 watts out of each device we need a junction-to-water thermal impedance below 70/50 = 1.4 degC/W, and that is getting tricky: typical TO-220 junction-to-case impedances alone seem to sit around 1.2 to 1.5 degC/W, leaving almost nothing for the insulating washer and the interface to the water.
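The per-device thermal budget, using the figures above (the junction-to-case value is a typical datasheet range midpoint, not a number for any specific transistor):

```python
T_J_MAX = 150.0   # degC, absolute maximum junction temperature
T_WATER = 80.0    # degC, water-side temperature at the device
P_DEV   = 50.0    # W dissipated per device

r_budget = (T_J_MAX - T_WATER) / P_DEV   # 1.4 degC/W junction-to-water
R_JC     = 1.3                           # degC/W, typical TO-220 junction-to-case
r_left   = r_budget - R_JC               # left for washer + interface + waterblock

print(f"budget {r_budget:.2f} degC/W; only {r_left:.2f} degC/W left after Rth(j-c)")
```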
So if we run our transistors at the maximum junction temperature, is that going to cause long-term reliability issues?
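My suspicion is yes. As a rough back-of-envelope, the usual Arrhenius wear-out model gives a feel for how much temperature matters (the 0.7 eV activation energy is a common rule-of-thumb for silicon, not a figure for any specific part or failure mechanism):

```python
import math

K_B = 8.617e-5   # eV/K, Boltzmann constant
E_A = 0.7        # eV, assumed activation energy (rule-of-thumb for silicon)

def accel_factor(t_hot_c, t_ref_c):
    """How many times faster wear-out proceeds at t_hot_c than at t_ref_c."""
    t_hot = t_hot_c + 273.15
    t_ref = t_ref_c + 273.15
    return math.exp((E_A / K_B) * (1.0 / t_ref - 1.0 / t_hot))

# Running flat out at Tj = 150 degC versus derating to 110 degC:
print(f"~{accel_factor(150.0, 110.0):.0f}x faster wear-out at 150 degC")
```

On that model, backing off from 150 degC to 110 degC buys roughly a 7x longer wear-out life, which suggests derating (more devices, or a better thermal path) is worth the extra cost rather than running right at the limit.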