I think some folks are confused about what relative humidity means, so let's start there. Relative humidity is the actual moisture content of the air, expressed as a percentage of the maximum it could hold at that temperature. At 100% the air is saturated and water starts condensing out as dew, fog, or precipitation.
The capacity of air to hold water depends strongly on the air temperature. Warm summer air can hold significantly more moisture than cold winter air. Say a given volume of air at 90 F could hold 100 ounces of water but is actually holding 50 ounces: that gets reported as 50% relative humidity on the weather map. Now that same volume of air at a winter temperature of 20 F could hold only 10 ounces of water but is actually holding 5 ounces: the same 50% relative humidity, but note the 45-ounce difference in actual water content between summer and winter.
Let's take this further: move that 20 F, 50% relative humidity air volume inside the house through cracks around windows or wherever, and heat it to, say, 70 F. At 70 F that air volume could hold, say, 80 ounces of water at 100% relative humidity. But it still contains only 5 ounces (see the previous paragraph), so the homeowner experiences 5/80, or 6.25%, relative humidity. That's why people humidify indoor air in the winter, and it's why running a DEhumidifier in the winter is a waste of energy. Want the waste heat it generates? Just plug in a heater and be done with it.
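The heated-air example above can be sketched with the standard Magnus approximation for saturation vapor pressure. The ounce figures in the example were round illustrative numbers, so the computed indoor RH lands near, not exactly at, 6.25%:

```python
import math

def sat_vapor_pressure_hpa(t_c: float) -> float:
    """Saturation vapor pressure over water (hPa), Magnus approximation."""
    return 6.112 * math.exp(17.62 * t_c / (243.12 + t_c))

def f_to_c(t_f: float) -> float:
    return (t_f - 32.0) * 5.0 / 9.0

# Outdoor air: 20 F at 50% relative humidity
outdoor_c = f_to_c(20.0)
vapor_pressure = 0.50 * sat_vapor_pressure_hpa(outdoor_c)

# Heat that same air (same moisture content, so same vapor pressure) to 70 F
indoor_c = f_to_c(70.0)
indoor_rh = vapor_pressure / sat_vapor_pressure_hpa(indoor_c)
print(f"Indoor RH after heating: {indoor_rh:.1%}")
```

The key move is that the vapor pressure (the moisture actually present) stays fixed while the saturation vapor pressure (the capacity) jumps with temperature, so the ratio collapses.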
If you want a single number to follow, track the dew point and the temperature/dew-point spread. Topic for another time…
For now, here is an easy humidity check for the winter months: put a glass of ice water on the counter. If the outside of the glass stays dry with roughly 30 F water in it, you do not have a humidity problem.
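The physics behind the glass test: condensation forms when the glass surface falls below the room air's dew point. A small sketch, assuming the glass surface sits near the ice-water temperature and using the Magnus approximation; the room temperature and RH values are illustrative assumptions:

```python
import math

def dew_point_c(t_c: float, rh: float) -> float:
    """Dew point (deg C) from temperature and RH (0-1), Magnus inversion."""
    gamma = math.log(rh) + 17.62 * t_c / (243.12 + t_c)
    return 243.12 * gamma / (17.62 - gamma)

# Room at 70 F (~21.1 C); glass surface assumed near the ~30 F ice water,
# i.e. about -1.1 C. The glass sweats when dew point > glass temperature.
room_c = (70.0 - 32.0) * 5.0 / 9.0
glass_c = (30.0 - 32.0) * 5.0 / 9.0

for rh in (0.15, 0.30, 0.50):
    td = dew_point_c(room_c, rh)
    state = "wet" if td > glass_c else "dry"
    print(f"RH {rh:.0%}: dew point {td:.1f} C -> glass stays {state}")
```

Worth noting: with ~30 F water the glass starts sweating at a fairly low indoor RH (roughly the low 20s percent at 70 F), so a dry glass indicates genuinely dry air.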
That's a pretty good explanation, but I'm not sure it's so cut and dried. It could be that running a heater to raise the space's moisture capacity by raising its bulk temperature would save money compared to running a dehumidifier. You may well be right; I need to do more research, but I don't agree just yet. Besides, at the moment I'd have to run multiple space heaters to keep the large basement warm enough to hold RH consistently below 50%.
I know you were simplifying the relationship between temperature and humidity, i.e. relative humidity (RH), to make a point. What I don't know is how much water is actually entering the space and how it affects the vapor pressure relative to the saturation vapor pressure, or how much energy is required once you account for the higher heat flux to outside at the larger temperature difference (Q-dot at 60 F is higher than at 50 F if ambient is below 50 F). The space is not perfectly sealed, so figuring out whether heating the space to lower RH or running a dehumidifier is more efficient isn't straightforward.
For your argument to hold, we're assuming the vapor pressure changes very little compared to the saturation vapor pressure: the rise in temperature raises the saturation vapor pressure while moisture entering the space raises the actual vapor pressure only slightly, leaving a substantial gap between the two. What I don't know is how to determine what temperature needs to be maintained, accounting for moisture in-leakage, to keep RH at the target.
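Ignoring in-leakage for a moment (the static snapshot), the temperature needed to hit a target RH follows from holding the vapor pressure fixed and inverting the Magnus formula for saturation vapor pressure. The 50 F / 70% RH starting point below is an assumed example, not a measured value:

```python
import math

def sat_vp_hpa(t_c: float) -> float:
    """Saturation vapor pressure over water (hPa), Magnus approximation."""
    return 6.112 * math.exp(17.62 * t_c / (243.12 + t_c))

def temp_for_target_rh(t_c: float, rh: float, rh_target: float) -> float:
    """Temperature (deg C) at which the same moisture gives rh_target.

    Static case only: assumes the vapor pressure stays fixed while the
    space warms, i.e. no moisture in-leakage during the transition.
    """
    e = rh * sat_vp_hpa(t_c)            # actual vapor pressure, held constant
    es_needed = e / rh_target           # saturation pressure we must reach
    g = math.log(es_needed / 6.112)     # invert the Magnus formula
    return 243.12 * g / (17.62 - g)

# Assumed example: basement at 50 F (10 C) and 70% RH, target 50% RH
t_needed_c = temp_for_target_rh(10.0, 0.70, 0.50)
print(f"Heat to about {t_needed_c * 9 / 5 + 32:.0f} F to reach 50% RH")
```

With ongoing in-leakage the real setpoint would need to sit somewhat above this static answer, which is exactly the unknown flagged above.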
For what I'm doing, the heat from the dehumidifier can be treated as insignificant, so the temperature stays roughly constant while the vapor pressure is reduced relative to an essentially constant saturation vapor pressure at the given temperature and pressure. As before, water entering the space raises the vapor pressure, so the dehumidifier needs to run periodically to remove that moisture, whereas at a higher temperature, with its correspondingly higher saturation vapor pressure, it would run substantially less or not at all.
The question is how to solve this. Variables:
1. Moisture in-leakage and evaporation rate at two relative temperatures, and the effect on vapor pressure
- This will change wildly based on outside RH (water entering the space via air in-leakage) and liquid water in-leakage (through the concrete due to rain, etc.)
- As I mentioned and you noted, RH is sometimes low in colder weather, so on days when 50F would still give < 50% RH with the dehumidifier off, raising the space temperature above 50F would be a waste of energy.
2. Energy required over time by space heater(s) to maintain a temperature high enough to lower the RH sufficiently, with the higher heat loss to outside
3. Energy required by the dehumidifier to remove moisture from the air for the same result, with the lower heat loss to outside
- For 2 and 3, the difference is affected by how well the house is insulated
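One rough way to frame items 2 and 3 is to compare the extra envelope heat loss from running the space warmer against the dehumidifier's energy per liter of water removed. Every parameter below (envelope UA value, outdoor temperature, daily moisture in-leakage, and L/kWh efficiency) is an assumed placeholder that would have to be measured for a real basement:

```python
# Back-of-envelope comparison of heating vs dehumidifying, per day.
# All inputs are assumed placeholders, not measured values.

HOURS = 24.0

def heater_kwh(ua_w_per_k: float, t_in_c: float, t_out_c: float) -> float:
    """Electrical energy to hold t_in against steady heat loss to outside."""
    return ua_w_per_k * (t_in_c - t_out_c) * HOURS / 1000.0

def dehumidifier_kwh(liters_removed: float, liters_per_kwh: float) -> float:
    """Energy to pull the day's moisture in-leakage back out of the air."""
    return liters_removed / liters_per_kwh

# Assumed basement envelope UA = 150 W/K, outside at 5 C.
# Strategy 2: heat from 10 C to 15 C to lower RH -> only the *extra* heat
# above the 10 C baseline is charged to the RH-control strategy.
extra_heat = heater_kwh(150.0, 15.0, 5.0) - heater_kwh(150.0, 10.0, 5.0)

# Strategy 3: hold 10 C and remove an assumed 4 L/day of in-leakage at an
# assumed portable-unit efficiency of ~2 L/kWh.
dehu = dehumidifier_kwh(4.0, 2.0)

print(f"Extra heater energy: {extra_heat:.1f} kWh/day")
print(f"Dehumidifier energy: {dehu:.1f} kWh/day")
```

Under these particular placeholder numbers the dehumidifier comes out well ahead, but the conclusion flips around depending on insulation (UA), the in-leakage rate, and how many degrees of heating the target RH actually requires, which is the crux of the question.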
Based on what some of you have posted, I need to determine at what temperature and RH mold really becomes a concern. For now, anecdotally, the basement smells musty, even at these temperatures, if the RH stays above 50% for an extended period. Lots of variables there too, of course, so it's hard to pin down an exact RH-vs-temperature correlation, at least for me.