Shortwave and longwave radiative contributions to global warming under increasing CO2
In response to increasing concentrations of atmospheric CO2, state-of-the-art general circulation models (GCMs) simulate an accumulation of energy at the top of the atmosphere not through a reduction in outgoing longwave radiation (OLR), as one might expect from greenhouse gas forcing, but through an enhancement of net absorbed solar radiation (ASR). A simple linear radiative feedback framework is used to explain this counterintuitive behavior. It is found that the timescale over which OLR returns to its initial value following a CO2 perturbation depends sensitively on the magnitude of shortwave (SW) feedbacks. If SW feedbacks are sufficiently positive, OLR recovers within merely several decades, and any subsequent global energy accumulation is due to enhanced ASR only. In the GCM mean, this OLR recovery timescale is only 20 years due to robust SW water vapor and surface albedo feedbacks. However, a large spread in the net SW feedback across models (due to clouds) produces a range of OLR responses; in those few models with a weak SW feedback, OLR takes centuries to recover, and energy accumulation is dominated by reduced OLR. Observational constraints on radiative feedbacks from satellite radiation data and surface temperature data suggest an OLR recovery timescale of decades or less, consistent with the majority of GCMs. Altogether, these results suggest that although greenhouse gas forcing predominantly acts to reduce OLR, the resulting global warming is likely due to enhanced ASR.
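The linear feedback framework described above can be sketched as a one-box energy balance model: warming T evolves under forcing F and a net feedback that is the difference of longwave and shortwave components, and the OLR anomaly (-F + lam_lw * T) recovers to zero only if the SW feedback is positive enough to let warming overshoot F / lam_lw. The sketch below is illustrative only; all parameter values (feedback strengths, forcing, heat capacity) are assumptions for the illustration, not values from the study.

```python
def olr_recovery_time(lam_lw=1.9, lam_sw=0.6, forcing=3.7,
                      heat_cap=8.0, dt=0.1, t_max=1000.0):
    """One-box linear feedback model:
        C dT/dt = F - (lam_lw - lam_sw) * T
    where lam_lw, lam_sw are LW and SW feedback magnitudes (W m^-2 K^-1),
    F is the CO2 forcing (W m^-2), and C a heat capacity (W yr m^-2 K^-1).
    The OLR anomaly is -F + lam_lw * T; this returns the first time (years)
    at which it recovers to zero, or None if it never does within t_max.
    All default parameter values are illustrative, not from the paper.
    """
    T, t = 0.0, 0.0
    while t < t_max:
        if -forcing + lam_lw * T >= 0.0:  # OLR back at its initial value
            return t
        # Forward-Euler step of the energy balance equation
        T += dt * (forcing - (lam_lw - lam_sw) * T) / heat_cap
        t += dt
    return None

# Positive SW feedback: equilibrium warming F/(lam_lw - lam_sw) exceeds
# the recovery threshold F/lam_lw, so OLR recovers within decades.
fast = olr_recovery_time(lam_sw=0.6)

# Weak (here slightly negative) SW feedback: equilibrium warming stays
# below the threshold, so OLR never recovers and energy accumulation
# remains dominated by reduced OLR.
never = olr_recovery_time(lam_sw=-0.2)
```

With these assumed numbers the positive-SW-feedback case recovers within a few decades while the weak-feedback case never does, mirroring the qualitative contrast across GCMs described in the abstract.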