The New American
by Ed Hiserodt
Carbon dioxide, that little molecule composed of one carbon atom and two oxygen atoms, is a modern-day environmental criminal. Its crime: trapping the Sun’s heat in the Earth’s atmosphere, potentially causing catastrophic global warming that will destroy all life on Earth. It’s so bad that the U.S. Environmental Protection Agency officially declared it a danger to public health in 2009, clearing the way for it to be regulated as a pollutant. Humans, especially those living in wealthy Western countries, are said to be co-conspirators in the climate-change crime, since they release those criminal molecules into the atmosphere every day via industrial activity.
At least, that’s the story we’ve all been told for the past few decades. But before we charge anyone (or anything) with destroying the Earth’s environment, it would be wise to do a little more detective work to gather all the available evidence. In the following article, we’ll do just that. Spoiler alert: We’ll find that CO2 and human industrial activity have been falsely accused. The evidence points to the Sun, not CO2, as likely being the main driver of climate change, and that climate change is a perfectly natural phenomenon.
Before we get into the details of how solar activity helps drive the Earth’s climate, let’s take some time to briefly look at the “CO2 drives climate” argument and examine why it’s wrong.
Falsely Accused
The increase in atmospheric CO2 concentration over the last century and a half, from slightly under 300 parts per million (ppm), or 0.03 percent of the Earth’s atmosphere, to around 400 ppm, or 0.04 percent, is blamed for the observed warming of the Earth’s average temperature over that time period. Further, the fact that human industrial activity also increased dramatically over that same period seems to justify claims that the increase in CO2 levels is caused by humans. Ergo, humans are causing climate change.
“Climate scientists” often throw around terms such as CO2 forcing, sensitivity, positive feedback loop, tipping point, and runaway greenhouse when attempting to explain the supposedly disastrous effects of CO2 on our planet. Their argument goes something like this: Atmospheric CO2 concentration increases because of human industrial emissions; this is known as “CO2 forcing.” Because of the climate’s high “sensitivity” to CO2, this “forcing” will cause the Earth to warm, owing to the greenhouse effect. This will cause changes to the Earth, such as melting ice sheets, etc., which will lower the Earth’s albedo (i.e., cause less sunlight to be reflected back into space), which will further warm the Earth in sort of a vicious cycle of global warming. This cycle is known as a “positive feedback loop.” Once the atmospheric CO2 concentration reaches a certain level, the Earth’s climate will have reached a “tipping point” where, owing to positive feedback loops, a “runaway greenhouse” is inevitable, meaning the planet will essentially be destroyed or at least made uninhabitable, and nothing could be done to stop this. “Climate scientists” such as Bill McKibben, founder of 350.org, say that we should keep atmospheric CO2 levels at 350 ppm or lower in order to avoid any untoward effects. Since we’re already at 400 ppm, it sounds like we’re in big trouble — meaning humans must make immediate and drastic cuts to CO2 emissions in order to stave off certain doom, and we might already be too late.
Well, not so fast. First of all, the warming effect of increased CO2 concentrations is logarithmic, not linear. What does this mean? The best analogy is that of blankets on a cold winter night: The first one is a godsend, the second stops the shivering, and the third brings peaceful comfort. But by the time you pile on the ninth or 10th blanket, the effect is imperceptible. Likewise with CO2: While the Earth’s temperature might be expected to rise by one degree Celsius if pre-industrial CO2 levels were doubled (i.e., to roughly 600 ppm), adding another 600 ppm would not add another full degree; only a second complete doubling (to 1,200 ppm) would. Evidence for this can be seen in temperature reconstructions done by paleoclimatologists. Using a commonly accepted age for the Earth of 4.6 billion years, paleoclimatologists’ reconstructions of past climates place CO2 levels between 1,000 and 2,000 ppm for much of the Earth’s history, with levels sometimes reaching as high as 4,000 ppm. So was the planet fried to a crisp? No! Reconstructions place mean surface temperatures, at most, at 7° Celsius higher than today’s levels — so much for runaway global warming caused by CO2 crossing some soon-to-be-reached “tipping point.”
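The logarithmic relationship described above can be sketched in a few lines. This is only an illustration, assuming the one-degree-per-doubling sensitivity the article cites (actual sensitivity is debated); the function name and baseline are our own:

```python
import math

def warming_from_co2(c_ppm, c0_ppm=300.0, deg_per_doubling=1.0):
    """Illustrative logarithmic warming: each doubling of CO2
    concentration adds the same fixed temperature increment
    (here 1 degree C per doubling, an assumed figure)."""
    return deg_per_doubling * math.log2(c_ppm / c0_ppm)

# Each doubling adds the same increment, so 300 -> 600 ppm warms
# as much as 600 -> 1,200 ppm:
print(warming_from_co2(600))   # 1.0
print(warming_from_co2(1200))  # 2.0
# Even the ancient 4,000 ppm atmosphere stays under 4 degrees here:
print(warming_from_co2(4000))
```

Note how the curve flattens: the first 300 ppm of increase does as much warming as the next 600 ppm combined, which is the "extra blanket" effect in numbers.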
What’s more, there’s not even a strong correlation in more recent times between temperatures and CO2 levels. Most data reveal that temperature drives CO2 levels, not the other way around. Case in point: In the late 1800s, the Earth started coming out of a cold period known as the Little Ice Age, which began around 1500. Atmospheric CO2 concentrations did not start increasing until after this warming trend began, and didn’t really take off until the mid-20th century. How can warming cause CO2 levels to increase, rather than the other way around? Put simply, as the Earth warms, the oceans warm too, though more slowly, since water holds its heat better than air. Warm water holds less CO2, so as the oceans finally warm, they off-gas CO2 into the atmosphere.
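The "warm water holds less CO2" step rests on Henry's law with a van 't Hoff temperature correction. The sketch below is illustrative only: the reference constant (about 0.034 mol per liter per atmosphere at 25 °C) and the temperature coefficient (about 2,400 K) are typical literature values for CO2 in water, and the function name is our own:

```python
import math

def co2_solubility(temp_c, kh_ref=0.034, vant_hoff_k=2400.0, t_ref=298.15):
    """Approximate Henry's-law solubility of CO2 in water,
    in mol/(L*atm), using a van 't Hoff temperature correction.
    Constants are assumed typical literature values."""
    t_kelvin = temp_c + 273.15
    return kh_ref * math.exp(vant_hoff_k * (1.0 / t_kelvin - 1.0 / t_ref))

# Colder water dissolves noticeably more CO2 than warmer water,
# so a warming ocean releases CO2 back to the atmosphere:
print(co2_solubility(5))   # cold polar-type water
print(co2_solubility(25))  # warm tropical-type water (lower value)
```

Running this shows solubility at 5 °C is well over half again that at 25 °C, which is the off-gassing mechanism the paragraph describes.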
Understanding this solves a major problem: Computer models used by “climate scientists” to predict climate assume CO2 is the driver of temperature increases, but the observed temperatures are always lower than what the models had predicted. Increases in CO2 levels have not caused an increase in temperature anywhere near predictions. The September 2017 edition of the prestigious Nature Geoscience journal contained an admission that, as the London Times put it, “We Were Wrong, Climate Scientists Concede.” According to the Times, “The world has warmed more slowly than had been predicted by computer models, which were ‘on the hot side’ and overstated the impact of emissions on average temperature, research has found.”
It doesn’t seem very logical to assume that a weak greenhouse gas (CO2) is the main driver of climate change. What, then, are some other ingredients in this complex stew we call “climate”?
Here Comes the Sun
Suggest to climate alarmists that the Sun may in fact be a primary driver of climate change, and you will likely get the dismissive retort: “We’ve already considered that and found it insignificant.” Actually, they are right, in a way. In a universe where there are variable stars that pulse at thousands of times per second, our Sun is a model of constancy. Solar output varies by only about 0.15 percent. This small change in intensity is indeed too small to account for climatic changes — at least directly. But our Sun has other ways to shed its light on climate.
Every 11 years, the polarity of the Sun reverses, and the Sun goes through a cycle of increasing and decreasing sunspot activity, solar flares, and solar radiation. These solar cycles are numbered, with the 1755-1766 cycle being labeled Cycle 1. We are currently in Cycle 24. Throughout history there have also been longer cycles in which a series of the 11-year cycles see a relatively “inactive” sun, with few sunspots even during the peaks of the 11-year cycles. It is during these times that the Earth experiences a cooler climate. During periods of high solar activity, with many sunspots, average global temperatures will be warmer.
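The numbering scheme above can be approximated with simple arithmetic. This is a rough sketch only: real cycles vary from roughly 9 to 14 years, so a fixed 11-year period is an assumption for illustration, and the function name is our own:

```python
def approximate_solar_cycle(year):
    """Rough solar-cycle number for a given year, assuming an
    idealized fixed 11-year period starting with Cycle 1 in 1755.
    Real cycle lengths vary, so this is illustrative only."""
    if year < 1755:
        raise ValueError("numbered solar cycles begin in 1755")
    return (year - 1755) // 11 + 1

print(approximate_solar_cycle(1755))  # 1  (the first numbered cycle)
print(approximate_solar_cycle(2017))  # 24 (the current cycle)
```

Even with its crude fixed period, the approximation lands on Cycle 24 for the present day, matching the official count.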
One such inactive period came between the early 1600s and approximately 1715, when astronomers recorded a phenomenal dearth of sunspots. Research in the 1800s by German astronomer Gustav Spörer revealed that during a 28-year period (1672-1699) fewer than 50 sunspots were observed. By comparison, we would typically see 40,000 to 50,000 sunspots over a similar period today. Spörer was a contemporary of British astronomer Walter Maunder, who identified the period from 1645 to 1715 now known as the Maunder Minimum. We know little about the solar cycles preceding that period, since such cycles had not yet been recognized by observers. But some reconstructions indicate that the Maunder Minimum followed several periods of decreasing activity. Incidentally, this trend of decreasing solar activity, “bottoming out” with the Maunder Minimum, fits very neatly with the aforementioned Little Ice Age. The Dalton Minimum (circa 1790-1830), named after English meteorologist John Dalton, occurred as temperatures recovered from the Little Ice Age, and appears to have slowed the return to earlier norms.
There is no lack of researchers who present studies on the similarity between today’s waning of the solar magnetic flux and periods of low sunspot activity such as the Little Ice Age. Valentina Zharkova, a mathematics professor at Great Britain’s Northumbria University, has long studied solar cycles. In 2015 she presented to the National Astronomy Meeting in Wales a new model of solar cycles, which she and her colleagues describe as a highly accurate prediction tool. According to the Royal Astronomical Society, “Predictions from the model suggest that solar activity will fall by 60 percent during the 2030s to conditions last seen during the ‘mini ice age’ that began in 1645.” In fact, the decrease in solar activity from Cycle 22 to the current Cycle 24 is typical of conditions leading to a “global cooling” phase.
On the other side of the planet, the Chinese Science Bulletin published a study entitled “Periodicities of Solar Activity and the Surface Temperature Variation on the Earth and Their Correlations” in which the authors opined, “Research shows that current warming does not exceed natural fluctuations of climate. The climate models of IPCC seem to underestimate the impact of natural factors on climate change, while overstat[ing] that of human activities.”
So we can see a direct correlation between solar activity (most easily marked by the number of sunspots) and temperatures here on Earth, but if solar radiation does not vary enough to affect the Earth’s climate, what’s going on here?
Among the dozens of scientists around the world studying the effects of the Sun on Earth’s climate is Henrik Svensmark, a physicist and professor in the Division of Solar System Physics at the Danish National Space Institute in Copenhagen. According to “Climate Change Reconsidered,” the 2009 report of the Non-Governmental International Panel on Climate Change,
[Svensmark] and his colleagues experimentally determined that electrons released to the atmosphere by galactic cosmic rays act as catalysts that significantly accelerate the formation of ultra-small clusters of sulfuric acid and water molecules that constitute the building blocks of cloud condensation nuclei.
He then discusses how, during periods of greater solar magnetic activity, greater shielding of the earth occurs, resulting in less cosmic rays penetrating to the lower atmosphere, resulting in fewer cloud condensation nuclei being produced, resulting in fewer and less reflective low-level clouds occurring, which leads to more solar radiation being absorbed by the surface of the earth, resulting in increasing near-surface air temperatures and global warming.
To put it in plain English, cosmic radiation coming from space (i.e., from outside the solar system) enters the Earth’s lower atmosphere, where it reacts with water vapor and other molecules in the air to cause cloud formation. The more radiation, the more clouds, and the more clouds, the less solar energy is able to hit the Earth, resulting in a cooler Earth. When the Sun is “active,” its magnetic field blocks this cosmic radiation, which means fewer clouds will be formed, and more of the Sun’s energy will hit the Earth, resulting in a warmer Earth.
That all sounds very interesting, but where is the evidence that all of this is actually happening? The relationship between solar activity, cosmic radiation, and cloud cover is well attested. For instance, when comparing cloud cover and cosmic ray flux over the past two solar cycles, it’s clear that when more cosmic rays enter the atmosphere, more low-level cloud formation occurs, and vice versa. The fact that solar cycles affect this has been noted by other scientists as well. From the Encyclopedia Britannica we learn:
The effect was discovered in 1937 by the American physicist Scott E. Forbush. Forbush observed that the intensity of cosmic rays reaching Earth was inversely correlated with the 11-year solar cycle of sunspot activity in that there are more cosmic rays at the minimum of the cycle and fewer cosmic rays at the maximum. At maximum solar activity, stronger magnetic fields are carried out into interplanetary space by the solar wind, and these fields block the cosmic rays.
The near-Earth interplanetary magnetic field can also be measured independently, and historical geomagnetic records allow it to be reconstructed back as far as 1868. This was accomplished at the U.K. Solar System Data Centre by a team of scientists led by Dr. Michael Lockwood, who published the results of their research in a Nature article entitled “A Doubling of the Sun’s Coronal Magnetic Field During the Last 100 Years.” They found that the Sun’s coronal magnetic field has risen by a factor of 1.4 since 1964 and by a factor of 2.3 since 1901. The authors report: “Moreover, changes in the heliospheric magnetic field have been linked with changes in total cloud cover over the Earth, which may influence global climate change.”
So solar activity affects the level of cosmic rays reaching Earth, which in turn affects cloud cover in the lower atmosphere. Does this really impact temperature? In “Cosmic Rays and Climate,” published in Surveys in Geophysics, author Jasper Kirkby notes, “The question of whether, and to what extent, the climate is influenced by solar wind and cosmic ray variability remains central to our understanding of the anthropogenic contribution to present climate change.” The editors of Climate Change Reconsidered agreed with Kirkby and added:
Clearly, carbon dioxide is not the all-important dominating factor in Earth’s climatic history. Within the context of the Holocene, the only time CO2 moved in concert with air temperature was over the period of Earth’s recovery from the global chill of the [Little Ice Age], and it does so then quite imperfectly. The flux of galactic cosmic rays, on the other hand, appears to have influenced ups and downs in both temperature and precipitation over the entire 10-12 thousand years of the Holocene, making it the prime candidate for “prime determinant” of the Earth’s climatic state.
Nir Shaviv, a senior lecturer at the Hebrew University of Jerusalem, hypothesized in 2002 that as our solar system passes through the spiral arms of the Milky Way galaxy, variations in cosmic radiation appear to be a causative factor of ice ages on Earth. The stars in the relatively more crowded neighborhood of the spiral arms produce a higher cosmic ray flux. Just as Svensmark et al. had done, Shaviv postulated that the galactic cosmic ray flux was associated with global low-altitude cloud cover. According to Shaviv, this cycle takes place about every 145,000 years and can be confirmed by looking at the ratios of different oxygen atoms in marine fossils. Shaviv’s work may be seen as further evidence of climate changes caused by cosmic activity from outside our solar system.
But increasing or decreasing cloud cover, as a result of solar activity affecting cosmic rays, is not the only factor influencing Earth’s long-term climate.
More Cycles, Anyone?
There are other cycles, also involving the Sun, that are a major factor influencing global climate change. These are known as Milankovitch Cycles, named after Serbian astronomer, geophysicist, and mathematician Milutin Milanković (1879-1958). Milankovitch Cycles have to do with how the Earth’s orbit and orientation change over long periods of time, how this affects the Earth’s relationship with the Sun (and the Moon), and the overall effect of these variables on the Earth’s climate. (These cycles, it should be noted, are not associated with the Sun’s effect on cosmic ray flux.) For nearly a century, scientists have used Milankovitch Cycles to understand large-scale climate events such as periods of advancing glaciation, commonly called ice ages, and the corresponding warm periods, known as interglacials.
Milankovitch Cycles deal with changes in several elements of Earth’s movement through space. One is eccentricity, or the deviation from a circular orbit. The Earth’s orbit around the Sun is not a perfect circle (though it’s pretty close); rather, it’s an ellipse, or a slight oval shape. Further, the Sun is not at the center of that ellipse but at one of its two foci: Earth is currently slightly closer to the Sun during the Southern Hemisphere’s “summer” and slightly farther from the Sun during the Northern Hemisphere’s “summer.” Milanković calculated that the eccentricity of Earth’s orbit actually changes on a 100,000-year cycle (mainly due to the gravitational pull of Jupiter and Saturn on the Earth), with the “sides” of the elliptical orbit getting closer to the Sun (i.e., the ellipse gets “thinner”), and then moving back. This ends up pulling the “close” end of the ellipse closer to the Sun as well. When the orbit is at its most elliptical, the Earth receives about 23 percent more radiation from the Sun at the “close” end of the orbit than at the “far” end. Jupiter and Saturn also cause Earth’s orbital ellipse itself to rotate around the Sun, much the same way a hula hoop moves around a person’s body. This is known as apsidal precession, and it occurs on a 112,000-year cycle. Apsidal precession causes the seasons to occur at different places along Earth’s orbit, so “winter” and “summer” will sometimes fall on the “sides” of the elliptical orbit rather than at the close and far “ends.”
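The 23 percent figure follows directly from the inverse-square law: sunlight intensity falls with the square of distance, and an orbit of eccentricity e swings between distances of a(1-e) and a(1+e). A minimal sketch (the eccentricity values are approximate, and the function name is our own):

```python
def flux_ratio(eccentricity):
    """Ratio of solar flux at perihelion (closest approach) to
    aphelion (farthest point) for an orbit of given eccentricity,
    from the inverse-square law: distances are a*(1-e) and a*(1+e),
    so the flux ratio is ((1+e)/(1-e))**2."""
    e = eccentricity
    return ((1 + e) / (1 - e)) ** 2

# Today's nearly circular orbit (e ~ 0.0167): only ~7% more flux
# at the close end than the far end.
print(flux_ratio(0.0167))
# Near-maximum eccentricity (e ~ 0.053): roughly 23% more flux,
# the figure cited above.
print(flux_ratio(0.053))
```

The point of the sketch is how sensitive the ratio is: a seemingly tiny change in orbital shape roughly triples the close-end/far-end contrast in incoming sunlight.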
Milankovitch Cycles also take into consideration axial tilt, the angle of the Earth’s north-south axis relative to its orbital plane. Currently at 23.44°, axial tilt varies between 22.1° and 24.5° on a 41,000-year cycle. The tilt is currently decreasing, or getting “straighter,” and will reach its minimum around 11,800 A.D. Axial tilt largely accounts for the Earth’s seasons: the hemisphere tilted toward the Sun is in “summer,” while the hemisphere tilted away from the Sun is in “winter.” But the Earth is not just tilted on its axis; it also “wobbles” on that axis as it revolves around the Sun, much the same way a top wobbles as it spins down. This “wobbling” is known as axial precession, another component of the Milankovitch Cycles, and is caused by tidal forces exerted on the Earth by the Moon and the Sun. Essentially, the poles appear to change position relative to the fixed stars; owing to precession, Polaris will someday no longer be the North Pole star, and in about 13,000 years the Northern Hemisphere’s summer will occur at the “close” end of Earth’s orbit, with Northern winter occurring while the Earth is farther from the Sun. Axial precession works in conjunction with apsidal precession and occurs on a cycle averaging 23,000 years.
That’s some pretty complicated physics, but the effect of all of these cycles working in conjunction is to change the way the Earth relates to the Sun, i.e., how close/far it is from the Sun, which hemisphere is tilted toward/away from the Sun, etc. One can see how this would affect long-term climate on the Earth. For example, if axial tilt is at maximum, the orbit is at its least elliptical, and axial/apsidal precession places the Northern Hemisphere winter at the “far” end of the orbit, which would make for some very cold winters. Geologists and paleoclimatologists generally accept the Milankovitch Cycles as the main contributor to cyclic, long-term climate events such as ice ages. For decadal or century time-scale changes, the effect of cosmic rays and cloud cover on Earth’s climate has strong evidence backing it and is gaining in standing — unlike claims about carbon dioxide’s effect on Earth’s temperatures.
Where Do We Go From Here?
So what are we to make of all this? Is CO2 a greenhouse gas? Yes. Does it contribute to “global warming”? Yes, but only very slightly. Is it the main driver of climate? Not a chance.
Climate alarmists run into big problems when attempting to predict global temperatures because their basic premise is that atmospheric carbon dioxide is the main driver of Earth’s temperature and climate. On that premise, an increase in CO2 levels from, say, human industrial activity should result in out-of-control global warming. So far, Mother Nature has proven them wrong.
Scientists interested more in the truth than in government grants see things in a different light. They see the Sun, and natural solar and orbital cycles, as an important driver of the Earth’s climate. The 20th century was dominated by strong solar activity, or a “hot Sun.” However, the diminishing activity over the most recent two solar cycles is expected to produce a “cool Sun” for the next quarter century. This suggests that we can expect lower average global temperatures.
The Earth will continue to revolve around the Sun, with the Milankovitch Cycles influencing long-term climate. We will eventually be thrust into another glacial event, and someday the Earth could become a warm “greenhouse” again with no permanent ice sheets at the poles.
The Earth’s climate is not overly sensitive to CO2 levels, we are not anywhere near a tipping point (which is likely not even possible), and humans are not going to destroy the planet because of industrial CO2 emissions. However, as The New American has been pointing out for quite some time, this alarmist viewpoint is being used to foist a global “climate regime” on all of humanity, with the goal of regulating human behavior in order to “save the planet.”
The facts are on the side of climate realists, who see the Sun as a main driver of Earth’s climate. Realists need to get the word out to as many people as possible that the CO2 alarmists are wrong, despite their religious-like zeal in spreading their false ideas. The truth will win out when enough people are exposed to it.