Cost per degree on thermostat
Dork Scratchings has finally decarbonised! The first to go was the pension invested in fossil fuels (replaced by PensionBee's fossil-fuel-free plan), then the petrol car (replaced by an electric 208), and this week we got rid of the boiler.
The new heating system is a Mitsubishi heat pump, which heats water by extracting heat from the cold air outside. The laws of thermodynamics set an upper limit on the amount of heat energy that can be delivered per unit of electrical energy, given by the formula
$$
\frac{T_H}{T_H-T_C}
$$
where $T_H$ is the temperature of the hot water ($50^\circ C$ in our case), and $T_C$ is the temperature of the cold air outside the house. Note that the temperatures have to be in Kelvin for this to work, and that the value drops as the outside temperature drops. For example, with $50^\circ C$ water the limit is 12.9 when the outside temperature is $25^\circ C$ but only 6.5 when it is $0^\circ C$. The heat produced per unit of electrical energy by a real heat pump is known as its Coefficient of Performance (COP) and is typically around 40% of the ideal value.
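This ideal limit is easy to check with a few lines of Python; a minimal sketch (the function name is mine), remembering to convert to Kelvin first:

```python
def ideal_cop(t_hot_c, t_cold_c):
    """Ideal heat-pump COP: hot water at t_hot_c, outside air at
    t_cold_c (both in Celsius, converted to Kelvin for the ratio)."""
    t_hot = t_hot_c + 273.15
    t_cold = t_cold_c + 273.15
    return t_hot / (t_hot - t_cold)

print(ideal_cop(50, 25))        # ~12.9
print(ideal_cop(50, 0))         # ~6.5
print(0.4 * ideal_cop(50, 0))   # a real unit managing ~40% of the ideal
```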
We were told it's best not to fiddle with the thermostat setting too much as heat pumps work best when they're aiming for a fixed temperature. This led me to wonder exactly how much difference the target temperature makes to the bills. I spent a while googling but nothing came up, so I thought I'd have a go at modelling it myself.
According to Newton's law of cooling, the heat loss $dQ/dt$ of an object, such as a house, is proportional to the difference between its internal temperature $T_{in}$ and the temperature of its surroundings $T$. This should mean that the average rate of heat injection needed to hold a house at temperature $T_{in}$ is proportional to
$$
\int_{-\infty}^{T_{in}}\frac{dP}{dT}(T_{in}-T)dT
$$
where $P(T)$ is the probability that the outside temperature is less than $T$. Note that the integral's upper limit is $T_{in}$ because when the outside temperature rises above the set point the heating switches off, rather than into reverse. For a house heated by gas the value above is directly proportional to the cost of heating the house to $T_{in}$, as each unit of heat requires the same quantity of gas to produce. For a heat pump, however, the formula is slightly different:
$$
\int_{-\infty}^{T_{in}}\frac{dP}{dT}\frac{T_{in}-T}{C}dT
$$
where $C(T)$ is the coefficient of performance, a temperature dependent quantity. In my model I have used the ideal COP of $T_H/(T_H -T)$. I think this is fine because the real COP is more or less a fixed fraction of the ideal, and that fraction will get absorbed into the coefficient of proportionality.
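The two integrals translate into code almost directly; a sketch using scipy's `quad` (function names are mine, and I've plugged in a normal outside-temperature distribution purely for illustration):

```python
from scipy.integrate import quad
from scipy.stats import norm

T_HOT_K = 50 + 273.15  # hot water temperature in Kelvin

def ideal_cop(t_out_c):
    # Carnot COP for 50 C water, outside air at t_out_c (Celsius)
    return T_HOT_K / (T_HOT_K - (t_out_c + 273.15))

def boiler_cost(t_in, pdf):
    # average heat demand: integral of pdf(T) * (t_in - T) up to t_in
    return quad(lambda t: pdf(t) * (t_in - t), -40, t_in)[0]

def heatpump_cost(t_in, pdf):
    # as above, but each unit of heat costs 1/COP units of electricity
    return quad(lambda t: pdf(t) * (t_in - t) / ideal_cop(t), -40, t_in)[0]

# illustrative outside-temperature distribution (an assumption here)
pdf = norm(9.9, 6.5).pdf
print(boiler_cost(20, pdf), heatpump_cost(20, pdf))
```

The lower limit of $-40$ stands in for $-\infty$; the density is negligible that far out.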
The first step in modelling is to work out what the distribution $P$ is. I grabbed five years' worth of temperature data from a Cambridge weather station and plotted it:
It is clear from the plot that a normal distribution with mean $9.9^\circ C$ and standard deviation $6.5^\circ C$ is a good enough fit for our purposes.
The next step is to plot the cost of heating a house with a boiler and a heat pump.
I've used arbitrary units for the costs and normalized them so that a setting of $20^\circ C$ corresponds to 1.0 for both the boiler (Av. heating power) and the heat pump (Av. electrical power). The two lines diverge at higher thermostat settings. The reason is that a heat pump's COP improves as the outside temperature rises, but you only take advantage of this when the heating is on!
The big takeaway, however, is that the two lines are pretty similar: for practical purposes, the fact that the COP varies with outside temperature makes little difference.
What about the original question: how much difference does turning up the thermostat make? It's clear from the graph that over the range of interest it is more or less linear, and that if we treat the cost at a setting of $20^\circ C$ as 100 points then each extra degree costs 8 points.
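That per-degree figure can be checked numerically; a sketch under the same assumptions (ideal COP shape, outside temperatures distributed as $N(9.9, 6.5)$), comparing the cost at a $21^\circ C$ setting with the cost at $20^\circ C$:

```python
from scipy.integrate import quad
from scipy.stats import norm

T_HOT_K = 50 + 273.15
pdf = norm(9.9, 6.5).pdf  # fitted Cambridge outside-temperature distribution

def cost(t_in, heat_pump=False):
    def integrand(t):
        heat = pdf(t) * (t_in - t)
        if heat_pump:
            # each unit of heat costs 1/COP units of electricity
            return heat * (T_HOT_K - (t + 273.15)) / T_HOT_K
        return heat
    return quad(integrand, -40, t_in)[0]

for hp in (False, True):
    base = cost(20, hp)
    extra = 100 * (cost(21, hp) - base) / base
    print(f"{'heat pump' if hp else 'boiler':9}: {extra:.1f} points per extra degree")
```

Under these assumptions the marginal degree costs slightly less for the heat pump than for the boiler, since the extra heating falls disproportionately in mild weather when the COP is good.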