Hubble’s Successor and the Man it’s Named After

In 2018, NASA is scheduled to launch the James Webb Space Telescope (JWST), the successor to the Hubble Space Telescope. The Hubble, last upgraded in 2009, is still producing high quality science. However, it has been in operation for 25 years and, like an old car, will begin to break down sooner or (hopefully) later. Hubble is projected to fall back to Earth sometime around 2024.

The JWST will be fundamentally different from the Hubble in three ways: its mirror design, its location, and the part of the electromagnetic spectrum it observes.

The Hubble's primary mirror is a single piece 2.4 meters (8 feet) in diameter. The mirror is made of ultra-low-expansion glass and weighs 2,400 pounds. That is pretty lightweight; a regular glass mirror of the same size would weigh five times as much. The JWST primary mirror will consist of 18 segments with a total weight of 1,375 pounds, and the full mirror will be 6.5 meters (21 feet) in diameter.

Why are the JWST mirrors so light?

The JWST mirrors are made of beryllium rather than glass. This substance (mined in Utah) has a long history of use in the space program, as it is very durable and heat resistant; in fact, the original Mercury program heat shields were made of beryllium. In space, weight is money. Currently, it costs roughly $10,000 to put a pound of payload into orbit. Since the JWST mirror has about 7 times the area of the Hubble mirror, a lighter material had to be found.
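As a rough sanity check on that factor of seven, treating both primaries as simple filled circles (and ignoring JWST's hexagonal segmentation and central obstruction, which trim its real collecting area somewhat) gives a ratio of about 7. A minimal sketch:

```python
import math

# Approximate primary mirror diameters in meters
hubble_d = 2.4
jwst_d = 6.5

# Treat both apertures as simple filled circles
hubble_area = math.pi * (hubble_d / 2) ** 2   # ~4.5 m^2
jwst_area = math.pi * (jwst_d / 2) ** 2       # ~33.2 m^2

print(f"Area ratio: {jwst_area / hubble_area:.1f}")  # ~7.3
```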

Beryllium itself is a dull gray color. The mirrors will be coated with gold to reflect the incoming light back to the secondary mirror, to be focused into the JWST instrument package. Gold was chosen not for aesthetic purposes but because it is an excellent reflector of infrared light, and that is key to the JWST mission. The total amount of gold used is a little over 1 1/2 ounces, worth roughly $2,000, a minute fraction of the JWST's $8.5 billion budget (about the same price tag as an aircraft carrier).

The final assembly of the primary mirror will take place at the Goddard Space Flight Center in Maryland. The contractor for the assembly is ITT Exelis, which was formerly a part of Kodak and is still based in Rochester, NY.

The JWST will launch in 2018 on an Ariane 5 rocket from the ESA launch facility in French Guiana, near Devil's Island, the site of the former penal colony featured in the film Papillon. The location near the equator provides an advantage over the American launch site at Cape Canaveral: the closer to the equator, the greater the eastward push a rocket receives from the Earth's rotation. In Florida, the Earth's rotational speed is about 915 mph; at French Guiana, it is about 1,030 mph. That roughly 1,000 mph boost, about 115 mph more than Florida provides, allows a launch vehicle to lift more payload into orbit.
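Those speeds follow from simple geometry: a point at latitude φ rides a circle of circumference 2πR·cos φ once per sidereal day. A quick sketch, using approximate latitudes for the two launch sites:

```python
import math

EARTH_RADIUS_MI = 3963.0   # equatorial radius, miles (approximate)
SIDEREAL_DAY_HR = 23.934   # one full rotation, hours

def rotation_speed_mph(latitude_deg):
    """Eastward surface speed due to Earth's rotation at a given latitude."""
    circumference = 2 * math.pi * EARTH_RADIUS_MI * math.cos(math.radians(latitude_deg))
    return circumference / SIDEREAL_DAY_HR

print(f"Cape Canaveral (~28.5 N): {rotation_speed_mph(28.5):.0f} mph")      # ~915 mph
print(f"Kourou, French Guiana (~5.2 N): {rotation_speed_mph(5.2):.0f} mph") # ~1,035 mph
```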

Even if the shuttle program were still active, it would not have been used to lift the JWST into space as it was for the Hubble. The Hubble sits in orbit about 350 miles above the Earth, which was the upper end of the shuttle's range. The JWST will be placed about 1,000,000 miles from Earth at a spot known as the L2 Lagrange point.  What is the L2 point?  Think of the launch of the JWST as a golfer's drive.  The interplay between the Earth and Sun produces gravitational contours, as seen below:

Credit: NASA / WMAP Science Team

The gravitational contours are like the greens on a golf course.  The arrows show the direction gravity will pull an object.  The blue areas will cause the satellite to "roll away" from a Lagrange point, while the red arrows will cause it to "roll towards" the desired destination.  Kind of like this shot from the 2012 Masters:

The L2 spot is not entirely stable.  If the JWST drifts towards or away from the Earth, its operators will need to make slight adjustments to nudge it back towards L2.  Because of this placement, the JWST will not have the servicing missions the Hubble enjoyed; everything about the JWST must be made right here on Earth before launch.
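For the curious, the distance to L2 can be estimated with the standard approximation r ≈ R(M_Earth / 3M_Sun)^(1/3), where R is the Earth-Sun distance; the values below are approximate:

```python
# Rough estimate of the Sun-Earth L2 distance using the standard
# approximation r ~ R * (m / (3 M))**(1/3) for a small mass m orbiting M.
SUN_EARTH_DIST_MI = 92.96e6   # ~1 AU in miles
MASS_RATIO = 3.003e-6         # Earth mass / Sun mass (approximate)

l2_dist = SUN_EARTH_DIST_MI * (MASS_RATIO / 3) ** (1 / 3)
print(f"L2 is roughly {l2_dist:,.0f} miles from Earth")  # ~930,000 miles
```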

Why does the JWST need to be so far away from Earth?

The answer lies in the part of the electromagnetic (EM) spectrum the telescope will observe in. Don't be put off by the term electromagnetic; as we'll see below, you are already familiar with most parts of the EM spectrum.

Credit: NASA

The word radiation tends to be associated with something harmful, and in some cases it is.  However, radio and light waves are also forms of EM radiation.  What differentiates one form of radiation from another is its wavelength.  Cool objects emit mostly long wavelength, low energy radiation; hot objects emit short wavelength, high energy radiation.  The JWST will observe in the infrared, and that choice follows from the objects the JWST is designed to detect.
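The temperature-wavelength connection is Wien's displacement law, λ_peak = b/T with b ≈ 2.898×10⁻³ m·K. A quick comparison of the Sun with a room-temperature planet shows why "cool" means infrared:

```python
WIEN_B = 2.898e-3  # Wien's displacement constant, meter-kelvin

def peak_wavelength_microns(temp_kelvin):
    """Wavelength at which a blackbody of the given temperature emits most strongly."""
    return WIEN_B / temp_kelvin * 1e6  # convert meters to microns

print(f"Sun (~5800 K): {peak_wavelength_microns(5800):.2f} microns (visible)")
print(f"Planet (~300 K): {peak_wavelength_microns(300):.1f} microns (infrared)")
```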

The JWST will search the most distant regions of the universe.  Due to the expansion of the universe, these objects are receding from us at such a rapid rate that their light is red-shifted all the way into the infrared.  Planets also emit mostly in the infrared as a consequence of their cool (relative to stars) temperatures.  The infrared detectors on the JWST will enable it to study objects in a manner the Hubble could not.
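Redshift stretches a wavelength by a factor of (1 + z). As an illustration (the redshift values here are chosen only as examples of very distant galaxies), hydrogen's ultraviolet Lyman-alpha line ends up squarely in the infrared:

```python
def observed_wavelength_nm(rest_wavelength_nm, z):
    """Wavelength we receive after cosmological redshift z."""
    return rest_wavelength_nm * (1 + z)

LYMAN_ALPHA_NM = 121.6  # hydrogen Lyman-alpha line, ultraviolet

for z in (6, 10):  # illustrative redshifts for very early galaxies
    obs = observed_wavelength_nm(LYMAN_ALPHA_NM, z)
    print(f"z = {z}: {obs:.0f} nm = {obs / 1000:.2f} microns (infrared)")
```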

The L2 location allows the JWST to be shielded from the Earth, Moon, and Sun all at the same time.  This prevents those bright sources of EM radiation from blotting out the faint infrared radiation the telescope is attempting to collect.

The video below from National Geographic provides a good synopsis of the JWST.

So, who was James Webb? And why did NASA name Hubble’s successor after him?

The short answer is that James Webb was NASA Administrator during the Apollo era. Given that Apollo may very well be NASA's greatest accomplishment, that alone might be enough to warrant the honor. However, Webb's guidance during NASA's formative years was also instrumental in starting the space agency's planetary exploration program. To understand this, let's take a look at John Kennedy's famous "we choose to go to the Moon" speech at Rice University on September 12, 1962.

During that speech, President Kennedy not only provided the rationale for the Apollo program, but also stated the following:

“Within these last 19 months at least 45 satellites have circled the earth. Some 40 of them were made in the United States of America and they were far more sophisticated and supplied far more knowledge to the people of the world than those of the Soviet Union.

The Mariner spacecraft now on its way to Venus is the most intricate instrument in the history of space science. The accuracy of that shot is comparable to firing a missile from Cape Canaveral and dropping it in this stadium between the 40-yard lines.

Transit satellites are helping our ships at sea to steer a safer course. Tiros satellites have given us unprecedented warnings of hurricanes and storms, and will do the same for forest fires and icebergs.”

It has to be noted here that soaring rhetoric notwithstanding, Kennedy was not exactly a fan of spending money on space exploration. At least not to the extent the Apollo program demanded. Kennedy felt the political goal of beating the Soviet Union to the Moon trumped space sciences.  Nonetheless, you can see the origins of NASA’s planetary & Earth sciences programs along with applications such as GPS in Kennedy’s speech. So how does James Webb fit into all this?

When tapped for the job of NASA Administrator, Webb was reluctant to take the position. Part of it was his background: Webb was a lawyer. He had also served as Director of the Bureau of the Budget and Under Secretary of State during the Truman Administration. Webb initially felt the job of NASA Administrator should go to someone with a science background. However, Vice President Lyndon Johnson, who was also head of the National Space Council, impressed upon Webb during his interview that policy and budgetary expertise was a greater requirement for the job.

That background paid off well when dealing with both Presidents Kennedy and Johnson. As NASA funding increased rapidly during the early 1960's, there was great pressure to cut space sciences in favor of the Apollo program. Webb's philosophy on the topic was this: "It's too important. And so far as I'm concerned, I'm not going to run a program that's just a one-shot program. If you want me to be the administrator, it's going to be a balanced program that does the job for the country that I think has got to be done under the policies of the 1958 Act."

The 1958 Act refers to the law that founded NASA and stipulated a broad range of space activities for the agency to pursue.  The law can be found here.

NASA's percentage of total federal spending during the 1960's is shown below:

Credit: Center for Lunar Science and Exploration

NASA has never obtained that level of funding since. Most of it was earmarked to develop and test the expensive Saturn V launch vehicle. The President often pressured Webb to scale back or delay NASA's science program to meet Apollo's goal of landing on the Moon before 1970. The video below is a recording of one such meeting between Kennedy and Webb.

Webb's law background served him well in making the case for a balanced NASA agenda.  Despite pressure of the highest order, Webb was able both to guide Apollo to a successful conclusion and to build NASA's science programs.  The latter included the Mariner program that conducted flybys of Mercury, Venus, and Mars.  Mariner 9 mapped 70% of Mars' surface, and Mariners 11 & 12 eventually became Voyagers 1 & 2, humanity's first venture beyond the Solar System.

Quite a legacy for a non-science guy.

This also demonstrates you do not necessarily have to have a science/engineering background to work in the space program.  Take a gander at NASA’s or SpaceX’s career pages and you will find many jobs posted for backgrounds other than science.  As James Webb proved, it takes more than science to study the universe.

*Image at top of post is JWST mirror segment undergoing cryo testing.  Credit:  NASA.

Pluto & New Horizons

When I was in grade school, I designed a crewed mission to Pluto and dubbed it Hercules, obviously taking a cue from the then recent Apollo program. The ship itself was armed with laser banks. Not sure what exactly I was expecting to run into out there, perhaps just Cold War paranoia. The crew was also top heavy in security personnel. As anyone who watches Star Trek can tell you, just like pitchers in baseball, you can’t have enough redshirts on a space mission. The mission was planned to go in the year 2002.

It's really funny to see the ideas one can conjure when you do not have to worry about budgets, research, and politics. The people at NASA who do have to worry about that stuff are unable to send humans that far, but they have pulled off a most excellent mission in New Horizons, which will fly by Pluto with its closest approach on July 14th.  As an added bonus, with the advent of social media, we will get to see the images from this mission almost in real time.

The mission was put together at a cost of $700 million.  That is the same amount spent in Colorado on marijuana during its first year of legalization.

This particular mission has a personal tie for me in that it was launched on January 19, 2006, the very same week I started teaching my first astronomy class. Next fall, in my 10th year of teaching, I will finally be able to discuss New Horizons' images of Pluto as results rather than as something to look forward to. An animation of New Horizons' voyage to Pluto is below. You'll note that New Horizons performed a flyby of Jupiter to receive a gravity boost towards Pluto.

The gravity boost from Jupiter in 2007 shortened the journey to Pluto by three years. The flyby of Jupiter also provided a test run for New Horizons' imaging equipment, and the results were impressive. The video below shows New Horizons' look at the rotation of Jupiter.

New Horizons also took this shot of a volcanic plume on Io, which is the most volcanically active body in the Solar System. This activity is generated by gravitational flexing of Io as it is stretched back and forth by Jupiter, Europa, and Ganymede. This is similar to the heat caused by stretching a putty ball back and forth.

Credit:  NASA/Johns Hopkins University Applied Physics Laboratory/Southwest Research Institute

The image above of Io was taken from 1.5 million miles away.  While Pluto is only 60% the size of Io, New Horizons will approach much closer at 6,200 miles and should provide exceptional image quality.
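A back-of-the-envelope way to see why: a body's apparent size scales with its diameter divided by its distance. Using approximate diameters for Io and Pluto:

```python
import math

def angular_size_deg(diameter_mi, distance_mi):
    """Apparent angular diameter of a body, in degrees."""
    return math.degrees(2 * math.atan(diameter_mi / (2 * distance_mi)))

io = angular_size_deg(2264, 1.5e6)    # Io: ~2,264 mi across, seen from 1.5 million miles
pluto = angular_size_deg(1475, 6200)  # Pluto: ~1,475 mi across, seen from 6,200 miles

print(f"Io from 1.5 million miles: {io:.2f} degrees")
print(f"Pluto from 6,200 miles: {pluto:.1f} degrees")
print(f"Pluto will appear roughly {pluto / io:.0f} times larger")
```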

A lot has happened to Pluto over the past decade. Not so much to Pluto itself, but rather to our perception of it. When New Horizons was first proposed in 2001, Pluto was still classified as a planet. It is now referred to as a dwarf planet. Over time, as memory of Pluto as a planet fades, I suspect this will eventually be changed to simply a Kuiper Belt object (KBO).

The reclassification was portrayed in the popular media as a demotion for Pluto. It really was not so much a demotion as an expansion of our understanding of the nature of Pluto and the Solar System.  Those of us who went to grade school before the reclassification were introduced to the planets with a diagram such as this:

Credit: Rice University

An updated version of this diagram looks like this:

Credit: NASA/Johns Hopkins University Applied Physics Laboratory/Southwest Research Institute/Alex Parker

The yellow line is the path of the New Horizons probe.  The yellow dots?  Those are Kuiper Belt objects, of which Pluto is one.

Pluto's classification as a planet was shaky from the start.  Pluto's orbit is inclined much more than those of the other eight planets, and its composition is unlike that of the four gas giants which occupy the outer Solar System.  Questions about Pluto's planet classification were raised only a few months after its discovery by Clyde Tombaugh of the Lowell Observatory, as this New York Times article from April of 1930 indicates.

However, Pluto was thought at the time to be much larger than it actually is, and that caused the planetary classification to stick.  Gerard Kuiper himself, as late as 1950, calculated Pluto to be about the same size as Earth.  In fact, it was this overestimate of Pluto's size that caused Kuiper to predict the following year that there would not be what we now call the Kuiper Belt, as he felt Pluto would have cleared out that region of the Solar System during its formation.  It's a bit ironic that the Kuiper Belt is named after the astronomer who predicted its non-existence.

Nonetheless, Kuiper had a distinguished career that included the discovery of Titan’s atmosphere, carbon dioxide in Mars’ atmosphere, and the Uranus satellite Miranda.  Kuiper played a key role as mentor to Carl Sagan during the 1950’s as well.  Unlike most astronomers at the time, Kuiper felt there was an abundance of planets outside the Solar System.  In turn, this inspired Sagan to explore that along with the possibility of life beyond Earth.  This was mentioned prominently during Ann Druyan’s remarks at the recent inauguration of the Carl Sagan Institute at Cornell University.

The Kuiper Belt is a region of the Solar System past the orbit of Neptune that is thought to contain thousands of small celestial bodies composed of water ice, methane, and ammonia.  Short period comets originate from this region.  An excellent overview on the Kuiper Belt can be found here.

The first Kuiper Belt object discovered besides Pluto came in 1992.  Since then, some 1,300 Kuiper Belt objects have been observed.  This, along with more precise measurements of Pluto's mass, which have come in at roughly 0.2% of Earth's, has resulted in the reclassification of Pluto to its present dwarf-planet designation.  Pluto is simply too small to be considered a planet, and its placement in the Solar System puts it among the other Kuiper Belt objects.
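For scale, a quick check of that mass ratio with approximate values:

```python
PLUTO_MASS_KG = 1.31e22   # approximate
EARTH_MASS_KG = 5.97e24   # approximate

ratio = PLUTO_MASS_KG / EARTH_MASS_KG
print(f"Pluto is about {ratio:.4f} of Earth's mass ({ratio * 100:.2f}%)")  # ~0.0022, or 0.22%
```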

Does this reclassification affect Clyde Tombaugh's legacy as the discoverer of Pluto?  I think not.  Consider this: Tombaugh discovered a Kuiper Belt object 62 years before the next such object was observed.  Tombaugh also discovered Pluto six years before earning his bachelor's degree at the University of Kansas.  Keep in mind, the discovery of Pluto would have been a suitable topic for a Ph.D. thesis.  Tombaugh's legacy is quite safe.  In fact, a portion of Tombaugh's ashes is aboard the New Horizons probe and will fly by Pluto along with the spacecraft.

The flyby of Pluto next July may very well represent a once-in-a-lifetime opportunity to observe Pluto this close.  No other missions are in the proposal stage at this time, and given the travel time to Pluto, it will be at the very least 15-20 years before another mission arrives in that part of the Solar System.  NASA has just released the first color image of Pluto and its moon Charon (below).  I consider myself very fortunate to be able to witness the culmination of this 15-year effort.

Pluto and Charon orbit a shared center of gravity. Credit: NASA

*Image on top of post is the Pluto discovery plates.  Credit:  Lowell Observatory Archives.

 

Minimum Wage & Unemployment: Confusing Micro and Macro

The recent movement to raise the minimum wage to $15.00/hr has brought out the usual dire warnings that this will cause a significant increase in unemployment and lock low wage workers out of jobs. Intuitively, this seems to make sense. When one looks at the scenario from the viewpoint of a business owner, the common-sense outcome is that you would have to offset the increase in costs by reducing staff.

Classic demand and supply analysis of the labor market from Econ 101 would seem to confirm this as you can see below:

A wage set above equilibrium creates a surplus of labor, i.e. unemployment. Image: Wiki Commons.
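The Econ 101 story can be put into numbers. With linear demand and supply curves for labor (the coefficients below are made up purely for illustration), a wage floor set above the market-clearing wage produces an excess supply of labor, which the model reads as unemployment:

```python
# Illustrative linear labor market: quantities in millions of workers, wage in $/hr.
# The coefficients are hypothetical, chosen only to demonstrate the textbook mechanics.

def labor_demanded(wage):
    return 100 - 4 * wage   # firms hire less as the wage rises

def labor_supplied(wage):
    return 20 + 4 * wage    # more people seek work as the wage rises

equilibrium_wage = (100 - 20) / (4 + 4)   # where demand equals supply: $10/hr
wage_floor = 15.0                         # a minimum wage set above equilibrium

surplus = labor_supplied(wage_floor) - labor_demanded(wage_floor)
print(f"Equilibrium wage: ${equilibrium_wage:.2f}/hr")
print(f"Excess labor supply at ${wage_floor:.2f}/hr: {surplus:.0f} million workers")
```

The rest of this post looks at why this clean result does not carry over to the actual macroeconomy.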

Yet the evidence is clear that unemployment does not rise with an increase in the minimum wage. Why should this produce such a counter-intuitive result? The key to the answer lies in the fundamental differences between microeconomics (the study of individuals and firms) and macroeconomics (the study of the economy as a whole), as well as a more complex model of labor markets developed over the past few decades, referred to as efficiency wage theory.

First, let's take a look at that classic Econ 101 model of labor markets, since this is how the issue is most often debated in the popular media and among the general public.

Models of micro units in the economy are open systems. Take an employer, for example: income flows into the employer from outside entities (customers). Likewise, spending flows out to entities beyond the employer in the form of wages.  The argument against raising the minimum wage sees the cash flow out increasing without an increase in the cash flow in.  Hence, staff is reduced to offset this outflow.

An employer is an open system. The system is "permeable" as cash flows in and out across the system boundary.

The same does not hold for a national economy as a whole. Why? A macro unit, such as a nation, is a closed system. As Paul Krugman says, in this scenario everybody's spending is someone else's income. If spending drops overall in a macro unit, then income must necessarily drop as well.  This is the cause of business cycles.  An example of a closed system is below. The three major components of GDP are consumption, investment, and government spending. Unlike a household or business, there is no income flowing in from outside the system.  As we'll see, modulating these business cycles is more important than minimum wage laws in reducing unemployment for low wage workers.

The boundary around a closed system is "impermeable": cash flows remain inside the system and do not leak out.

As noted earlier, the classic micro model would indicate that an increase in the minimum wage forces employers to reduce staff and also increases the available pool of labor as higher wages induce more people to look for a job.  In this model, an increase in the minimum wage can represent a transfer of wealth from employers to employees, which is the real cause of the political friction on this issue.  Framed this way, it directly pits employees against employers.

Is this transfer of wealth fair?

In the late 1800's, Alfred Marshall made the great defense of capitalism against the growing socialist movement. Marshall postulated that increased worker productivity would result in increased wages, and that this was the key to reducing the great poverty of the time. How to increase productivity? Marshall proposed an expansion of expenditures on public education. He also recognized the productivity gains from knowledge spillovers; that is, less experienced workers increase their productivity by working with more experienced workers.

The classical economic model derived by Marshall (and others) suggested that workers' wages are commensurate with productivity in a free labor market. This model makes a few assumptions, among them:

Information is symmetric. That is, both employers and employees have the same knowledge of the existing labor market.

The labor market is competitive to the point where neither an individual employer nor employee can affect the wage rate.

The economy is always at full capacity.

To paraphrase Harry Callahan, a good economic model has got to know its limitations.

How close to reality are these assumptions? A good diagnostic is to compare productivity gains with labor costs (wages and benefits). If the model is correct, these two variables should track each other. What does the data show?

Below is a comparison between annual increases in worker productivity and real hourly compensation (which accounts for both wages and benefits) since 1979:

Data Source: St Louis Federal Reserve
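The comparison in the chart is one of cumulative indexed growth: set both series to 100 in a base year and compound each year's percentage change. A minimal sketch with hypothetical growth rates (the chart itself uses the actual St. Louis Fed data):

```python
# Hypothetical annual growth rates (%) for productivity and real hourly compensation.
# These numbers are illustrative only; the chart above uses actual data.
productivity_growth = [2.1, 1.8, 2.5, 1.2, 2.8]
compensation_growth = [1.0, 0.8, 1.9, 0.5, 1.1]

def cumulative_index(growth_rates, base=100.0):
    """Compound annual percentage changes into an index starting at `base`."""
    index = [base]
    for g in growth_rates:
        index.append(index[-1] * (1 + g / 100))
    return index

prod = cumulative_index(productivity_growth)
comp = cumulative_index(compensation_growth)
print(f"Productivity index after 5 years: {prod[-1]:.1f}")
print(f"Compensation index after 5 years: {comp[-1]:.1f}")
print(f"Gap: {prod[-1] - comp[-1]:.1f} index points")
```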

For the most part, compensation lags behind productivity.  Only periodically has compensation matched productivity, notably in the mid-1980's and late 1990's.  Hence, the economy usually operates at less than full capacity.  Since the late 1990's, compensation has seriously lagged behind productivity.  This represents a transfer of wealth from labor to employers not predicted by classical micro models of the economy.  Consequently, the defense of free labor markets as a means to reduce poverty breaks down.  How to change that?

It helps to use real world case studies rather than fictional work.

One thing to avoid is fear of a supply shock from employers "going Galt" and shrinking or quitting their businesses in a snit over rising wages.  During the period from 1947 to 1979, while unions were at their peak, wages kept up with productivity gains.  The result was an expanding middle class, and business did just fine.  Employers might resent an increase in the minimum wage, but on an aggregate scale they will not have an incentive to downscale their businesses unless wage increases surpass productivity increases for a sustained period of time.  If that did not happen in the post-World War II period, it's not likely to happen now.

One of the major critiques of the minimum wage law is that it locks entry level workers out of the job market by keeping wages above the market equilibrium level.  However, the main driver of unemployment for teenagers (by definition entry level employees), as with any segment of the population, is the business cycle, seen below:

Teenage unemployment rate. Data: St. Louis Federal Reserve (FRED).

The impact of minimum wage increases is dwarfed by the effect of the business cycle on unemployment.

Here is where macroeconomics comes into the picture.  Over the past 35 years, teenage unemployment topped 20% three times, all during recessions.  The Great Recession, created by the 2008 financial crisis, produced a teenage unemployment rate of over 25%.  The first step in creating job opportunities is to modulate the business cycle in a manner to avoid steep recessions.  A combination of New Deal banking regulations and appropriate monetary/fiscal policy was successful in this regard from 1947-1972.

It doesn't make sense to oppose minimum wage laws as a means to decrease teenage unemployment if one is also opposed to employing monetary and fiscal policy to moderate business cycles.  The first priority in this direction is to regulate the financial sector so the risk of banking crises is reduced.  It is financial crises that cause periods of severe unemployment lasting 3-5 years, sometimes longer.  Prior to World War II, financial panics induced multi-year depressions in 1857, 1873, 1893, and 1929.  The last recession was not an isolated event, but a natural consequence of an unregulated financial sector.  Younger workers are significantly at risk of long-term unemployment during these events.

An additional step is to index the minimum wage to the inflation rate.  The minimum wage topped out at $10.69/hr (in 2013 dollars) in 1968 and has steadily eroded since.  Overall unemployment was 3.6% that year, with a teenage unemployment rate of 11-12%, a further indication that the business cycle is a greater determinant of teenage unemployment than the minimum wage.  Indexing the minimum wage to productivity increases also needs to be considered; paying for production is a reasonable proposition.
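Indexing is straightforward arithmetic: multiply the old nominal wage by the ratio of today's price level to the old one. Using approximate CPI-U averages for 1968 and 2013, the $1.60 minimum wage of 1968 lands close to the $10.69 figure cited above:

```python
# Approximate CPI-U annual averages (1982-84 = 100); values are rounded.
CPI_1968 = 34.8
CPI_2013 = 233.0

nominal_min_wage_1968 = 1.60  # federal minimum wage in 1968, $/hr

real_value_2013 = nominal_min_wage_1968 * CPI_2013 / CPI_1968
print(f"1968 minimum wage in 2013 dollars: ${real_value_2013:.2f}/hr")  # ~$10.71
```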

The perfect labor market model as presented by the demand and supply graph at the top of this post is an abstract concept.  That model relies on assumptions that cannot be fully realized in the real world.  Think of it as the economic version of the Carnot engine, which represents the theoretical limit for engine efficiency.  You cannot build such an engine in the real world as it relies on a cycle that does not lose heat to friction.  Likewise, it is impossible to build a perfect labor market in the real world as efficiency is lost due to frictions such as incomplete information and an economy operating at less than full employment.

For an economist to claim free labor markets are efficient to the point where labor receives rising compensation with rising productivity is the same as an engineer who claims to have built a 100% efficient engine.

We need to realize labor is not the equivalent of widgets.  Classic demand and supply curves oversimplify human behavior in the labor market.

However, a new, more complex theory of labor markets has emerged over the past few decades that merits real consideration, as its predictions coincide with real-world observations: efficiency wage theory.  This theory predicts that employers have a menu of wages to pick from rather than a single market wage.  The tradeoff is between low wages, low productivity, and high employee turnover on one hand, versus high wages, higher productivity, and low turnover on the other.

Let's take a look at how employers have responded to an increase in the minimum wage. For starters, layoffs are not the primary, or even a significant, reaction. A variety of strategies are employed to offset the higher cost of labor. One is to train employees in a manner that boosts productivity. Another is to reduce costly employee turnover. Also, some employers elect to reduce profits to cover the cost.  The real responses to minimum wage increases are more reflective of the efficiency wage concept than of the classic single-wage demand and supply model.
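That menu of wages can be sketched with a toy example. The point is only the mechanics: a higher wage raises payroll per hour but buys higher productivity and lower turnover, so cost per unit of output can fall. All numbers below are hypothetical:

```python
# Illustrative efficiency-wage "menu": higher wages buy higher productivity and
# lower turnover. All figures are hypothetical, for demonstration only.

def cost_per_unit_output(wage, output_per_hour, annual_turnover_cost):
    """Hourly wage plus hiring/training cost, divided by output per hour."""
    return (wage + annual_turnover_cost / 2000) / output_per_hour  # ~2000 work hours/yr

# (wage $/hr, output units/hr, annual turnover cost $/worker) -- hypothetical menu
menu = [
    (9.00, 10.0, 8000),   # low wage: low productivity, frequent costly turnover
    (12.00, 13.0, 4000),  # mid wage
    (15.00, 15.5, 2000),  # high wage: higher productivity, rare turnover
]

for wage, output, turnover in menu:
    print(f"${wage:5.2f}/hr -> ${cost_per_unit_output(wage, output, turnover):.3f} per unit of output")
```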

New entrants in the labor market need to acquire social connections to move into higher wage brackets, even within the same skill level jobs.

A 1984 survey paper by current Fed chair Janet Yellen noted that efficiency wage theory predicts a two-tiered workforce.  One is a high wage workforce where jobs are obtained mostly via personal contacts.  Employers have a comfort level with employees they personally know and do not feel the need to go through expensive vetting and monitoring processes.  The other tier is a low wage, highly monitored workforce with a lot of turnover.  Sound familiar?  The latter seems to reflect low paid contract/temp workers who often perform the same functions as higher paid permanent employees at the same company.  Here again, the efficiency wage model seems to trump regular demand and supply.

It does appear to be time for policy makers to incorporate this more sophisticated model when addressing low wage workers and the chronically high unemployment rates in various demographic groups of the workforce.  In particular, there is a need to promote a way for low wage workers to make the leap into the high wage sector.  As noted in Yellen's paper, ability and education are not enough; social connections play a key role in obtaining a high wage job.  This, combined with proper fiscal/monetary policy, offers the best hope for lifting individuals out of poverty.

It is often said that anyone who has taken Economics 101 will understand that raising the minimum wage causes unemployment to increase.  The efficiency wage model is a bit beyond Econ 101, probably at the level of an intermediate course.  And that's perfectly fine.  If you are diagnosed with a serious illness, do you want to be treated by a doctor who went through med school or one relying on Bio 101?  The same holds true in economics.

*Image on top is the Chicago Memorial Day Massacre of 1937.  Ten striking workers were killed; the event gave momentum to the Fair Labor Standards Act of 1938, which mandated the first federal minimum wage.  Photo:  U.S. National Archives and Records Administration.

Climate Change: Global vs. Local

As we conclude another month of unusual weather in my hometown (Buffalo, NY), I thought it would be a good time to take a look at how climate change can be measured locally.  On the heels of the coldest month in Buffalo history this past February, May featured an average temperature 5.7° F warmer than normal, including nine days over 80 degrees.  The month was on pace to be the 5th driest May in Buffalo history, but 2 1/2 inches of rain on May 31st more than doubled the precipitation from the first thirty days of the month.

Although our lives revolve around daily fluctuations in weather, the best way to determine how global warming may influence local climate is to examine annual temperatures, as that measure smooths out noise and gives a good look at long-term trends.  The global temperature history is below:

One degree Celsius = 1.8 degrees Fahrenheit. Source: http://climate.nasa.gov/

A few key trends to notice: the post-World War II cooling was likely a result of the global economy recovering after the Great Depression.  An increase in atmospheric aerosols caused by industrial emissions blocked sunlight during that period, which helped to offset greenhouse gas heating.  Aerosols are particles suspended in the atmosphere, rather than a gas such as carbon dioxide.

Buffalo Bethlehem Steel plant 1970’s. Photo: George Burns, courtesy EPA

That era ended with the environmental regulations put into effect during the 1970's.  The drop in aerosols prompted global temperatures to rise after 1980.  A slight cooling trend took place in the early 1990's, precipitated by the Mt. Pinatubo eruption in 1991.  Large volcanic eruptions can eject sulfuric aerosols into the stratosphere and cause global cooling on a scale of 2-3 years; the most famous example is the Year Without a Summer in 1816, produced by the eruption of Mt. Tambora.  After that brief period of cooling, global temperatures began to march upwards again.  This period included the 1998 El Nino warming spike and nine of the ten warmest years on record, which have occurred after 2000.

So how does Buffalo match up to all this?  Let's take a look below:

Temperature in Fahrenheit. Data from http://www.weather.gov/buf/BUFtemp

Buffalo replicates the global progression of temperature change.  The post-World War II cooling can clearly be seen from the 1950's through 1979.  The 1970's also saw the most infamous weather event in Buffalo history, the Blizzard of '77.  Temperatures began to warm after 1980, just as they did globally.  The short-term cooling effect from Mt. Pinatubo is clearly visible in the early 90's (the summer of 1992 was very cool and wet), along with the (then) record-breaking warmth from the 1998 El Nino event.

1998 El Nino event.  Image:  NASA

Despite the noise from short-term climate forcings, the warming trend from 1980 to the present is plainly visible.  From 1940 to 1989, only two years clocked in at an average of over 50 degrees.  The period from 1990 to 2014 featured six such years, including 2012, which shattered the 1998 record by 1.2° F.  That might not seem like a lot, but the prior record was only 3 degrees above normal.  In other words, if climate models are correct and temperatures increase by more than 4 degrees, the average year will be hotter than the hottest year on record in Buffalo.

What I find really interesting is the greater variation in annual temperatures after 1990, as the yearly figures fluctuate widely around the moving average.  Also, even though the winter of 2014 was very cold in its own right, the temperature for the year was still higher than in many years during the 1970's.  The trend over the past three decades has clearly been one of increasing temperatures.  Is the widening fluctuation in temperatures due to greenhouse gas induced climate change?  The answer is uncertain, and as many a science article has concluded, more research needs to be done in this area.
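For anyone who wants to reproduce this kind of chart from the raw station data, the smoothing involved is nothing more than an annual mean plus a simple moving average across years. A minimal sketch with made-up annual temperatures:

```python
def moving_average(values, window=5):
    """Simple moving average over `window` consecutive years."""
    return [sum(values[i:i + window]) / window
            for i in range(len(values) - window + 1)]

# Hypothetical mean annual temperatures (deg F) for a run of years
annual_temps = [47.2, 48.1, 46.5, 48.9, 47.8, 49.3, 48.6, 50.1, 49.0, 50.4]

smoothed = moving_average(annual_temps, window=5)
print([round(t, 2) for t in smoothed])
```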

What we do know is that we have to prepare to protect our regional economy against the influence of climate change.  Decreasing lake water levels will make hydroelectricity production more expensive and lower the cargo capacity of lake freighters.  General Mills, a major local employer (whose plant spreads the aroma of Cheerios downtown), has recognized the dangers climate change poses to its supply chain and has a formal policy to mitigate its role in releasing greenhouse gases.  The next decade will be key in slowing climate change and will require crucial policy choices at both the regional and national level.

General Mills (right) and renewable windmill power takes over old Bethlehem Steel site (upper right). Photo: Gregory Pijanowski 2014.

*Image on top of post is Buffalo from Sturgeon Point, 15 miles south on Lake Erie.  Just wanted to demonstrate that it does not snow here 12 months out of the year.