You got me curious, so I reviewed my thermo texts on closed-loop dual heat exchangers. For an ideal system, raising the flow rate always increases the overall heat transfer. It's not a linear relationship, though; the curve flattens out with diminishing returns at higher flow rates. Changing the flow rate obviously alters the temps of the fluid at different points in the system, which may have its own consequences, but the overall heat transfer does go up. 1 gpm of fluid passing through an exchanger with an inlet-outlet temp difference of 20 degrees is transferring the same amount of heat as 2 gpm with a 10 degree temp drop.
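That equivalence falls straight out of Q = m_dot * cp * delta_T. Here's a quick sketch; the coolant density and specific heat are rough textbook values for 50/50 water/glycol, not measurements from any particular system:

```python
# Rough check of the flow-rate vs. delta-T tradeoff.
# Assumed (illustrative) 50/50 coolant properties:
RHO_LB_PER_GAL = 8.6    # approximate density, lb/gal
CP_BTU_PER_LB_F = 0.85  # approximate specific heat, BTU/lb-F

def heat_rate_btu_per_hr(flow_gpm: float, delta_t_f: float) -> float:
    """Q = m_dot * cp * delta_T, expressed in BTU/hr."""
    mass_flow_lb_per_hr = flow_gpm * 60 * RHO_LB_PER_GAL
    return mass_flow_lb_per_hr * CP_BTU_PER_LB_F * delta_t_f

q1 = heat_rate_btu_per_hr(1.0, 20.0)  # 1 gpm with a 20 F drop
q2 = heat_rate_btu_per_hr(2.0, 10.0)  # 2 gpm with a 10 F drop
print(q1, q2)  # same heat rate either way
```

Double the flow at half the delta-T and the product is unchanged; what actually changes with flow rate is where the equilibrium temperatures settle.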
Certainly too low a flow rate causes problems. If the coolant is back down to ambient temp by the time it's halfway through the radiator, then the second half of the radiator isn't doing anything. Increasing the flow rate will make better use of the radiator (i.e., you can increase the flow rate and still have the same temp drop). Coolant spending too much time in the engine may heat up above boiling.
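The "second half isn't doing anything" effect can be sketched with the effectiveness-NTU method. This is a simplification that treats the air stream as big enough to stay at ambient, and the UA value is made up purely for illustration:

```python
import math

# Epsilon-NTU sketch: why a trickle of flow "wastes" radiator area.
# For one stream against an effectively constant-temperature air side:
#   eps = 1 - exp(-NTU),  NTU = UA / (m_dot * cp)
UA_BTU_PER_HR_F = 3000.0  # assumed overall conductance (illustrative)
RHO_LB_PER_GAL = 8.6      # approximate 50/50 coolant density
CP_BTU_PER_LB_F = 0.85    # approximate 50/50 coolant specific heat

def outlet_temp_f(flow_gpm: float, t_in_f: float, t_air_f: float) -> float:
    m_dot = flow_gpm * 60 * RHO_LB_PER_GAL            # lb/hr
    ntu = UA_BTU_PER_HR_F / (m_dot * CP_BTU_PER_LB_F)
    eps = 1 - math.exp(-ntu)                          # effectiveness
    return t_in_f - eps * (t_in_f - t_air_f)

# At a trickle, coolant leaves essentially at air temp (rear of the
# core is doing nothing); at a healthy flow, it exits hotter but the
# system is moving far more total heat:
print(outlet_temp_f(0.2, 200, 80))  # very close to 80 F
print(outlet_temp_f(10.0, 200, 80))
```

At 0.2 gpm the effectiveness is essentially 1 and the exit temp pins to ambient; at 10 gpm the exit temp is higher, yet m_dot * cp * delta_T works out to far more heat rejected.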
The thermo text also discusses proper flow rate selection. Too low a flow rate underutilizes the heat exchanger surfaces; too high a flow rate causes cavitation in either the pump or the radiator. Pump cavitation damages the pump, and if the coolant is near its boiling point it can cause localized hot spots or steam pockets. If you increase the flow rate to the point of causing turbulence or cavitation in the radiator itself, its efficiency starts dropping very quickly. I think that is the more likely cause of the problem. I also noted a number of engine builders saying they use restrictors to avoid cavitation at high engine speeds.
Of course, an engine is hardly an ideal heat exchanger, and local coolant temps matter quite a bit, especially when the coolant is operating fairly close to its boiling point.
But that's just me being anal and academic about things. I don't doubt that too high a flow rate causes problems, but I'm thinking it's cavitation that's the underlying issue, not just the flow rate all by itself.
Great analysis except for one thing. The 1 gpm at 20 F delta T versus 2 gpm at 10 F delta T comparison ignores heat transfer rates. The higher the delta T, the higher the heat transfer rate, and thus the higher the heat rejection (or absorption) rate.
That may not be clearly stated, so let me explain further. If the water entering the engine is colder, say 150 F versus 200 F, the 150 F water at a fixed flow rate will absorb more heat than the 200 F water. The reverse happens in the radiator. If I double the flow rate, the heat transfer increase is not linear; you need to look at the new equilibrium temperatures and the heat transfer rate at those temperatures. IIRC the logarithmic rule I'm thinking of is the log-mean temperature difference, which involves ln of the ratio of the temperature differences at the two ends of the exchanger. The temperature difference is the driving force that moves heat, and the rate of that movement depends on that log-mean difference, IIRC.
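For what it's worth, the standard textbook form is Q = U * A * LMTD, with LMTD = (dT_a - dT_b) / ln(dT_a / dT_b), where dT_a and dT_b are the hot-to-cold differences at the two ends of the exchanger. A small sketch with made-up radiator numbers (and ignoring the crossflow correction factor a real radiator would need):

```python
import math

def lmtd(dt_a: float, dt_b: float) -> float:
    """Log-mean temperature difference between two end delta-Ts."""
    if abs(dt_a - dt_b) < 1e-9:
        return dt_a  # limit as the two end differences converge
    return (dt_a - dt_b) / math.log(dt_a / dt_b)

# Illustrative counterflow numbers: coolant 200 -> 185 F against air
# entering at 80 F and leaving at 120 F.
# Hot end:  200 - 120 = 80 F;  cold end: 185 - 80 = 105 F.
print(lmtd(80.0, 105.0))  # effective driving delta-T, roughly 92 F
```

The point being that the driving force is neither the inlet delta-T nor the outlet delta-T alone, but this log-mean of the two, which is why doubling flow doesn't simply double the heat moved.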
On another interesting point, the radiator may at times be a sufficient restriction to flow on its own, especially if it is clogging up with age, such that the system cools better without the thermostat (the clogging radiator takes its place?). Something I may have experienced in the past without understanding why at the time. In other words, in an older system the thermostat restricts flow too much on top of an already restrictive old radiator.
Today, I discovered that my system, which now has several cooling improvements, has hotter coolant exiting the radiator than before the changes I made, but the system, measured at the thermostat, is running cooler than before. The only reasonable explanation I can find is that my 4-year-old 2-row aluminum radiator was restricting flow (I was getting a 30-40 F delta across the radiator, which had me thinking it was working properly, but it was 4 years old), and my new CSF 3-row copper/brass radiator has only a 15 F delta T across the radiator at idle. :dunno:
All I can figure is that I am getting a lot more flow through the entire system now, as my peak thermostat temp is down a good 20 to 30 F, while the new radiator exit temp is up 10 to 20 F! :dunno:
I now have a 15 F delta T between the engine inlet (radiator exit) and engine outlet (radiator inlet), whereas it was 30-40 F with the old radiator. The only other change I made was a new AC condenser.
Another important factor is that heat transfer from metal to water (coolant) is much faster than from metal to air. There is a huge difference in the surface areas needed to move the same amount of heat.
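A back-of-envelope comparison makes the surface-area point concrete. The convection coefficients below are typical textbook ballpark figures (assumed, not measured), and the heat load and delta-T are purely illustrative:

```python
# Newton's law of cooling: Q = h * A * delta_T, so A = Q / (h * delta_T).
# Typical ballpark convection coefficients (assumed):
H_WATER = 3000.0  # W/m^2-K, forced liquid flow in a tube
H_AIR = 60.0      # W/m^2-K, forced air over fins

def area_needed_m2(q_watts: float, h: float, delta_t_k: float) -> float:
    """Surface area required to move q_watts at a given h and delta-T."""
    return q_watts / (h * delta_t_k)

q = 30_000.0  # ~30 kW heat rejection (illustrative)
dt = 50.0     # metal-to-fluid temperature difference, K (illustrative)
print(area_needed_m2(q, H_WATER, dt))  # water side: ~0.2 m^2
print(area_needed_m2(q, H_AIR, dt))    # air side: ~10 m^2
```

With those assumed coefficients the air side needs about 50 times the surface area for the same heat load, which is exactly why the air side of a radiator is covered in fins while the water side is just tubes.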