Tuesday, April 28, 2015

Scheduled for publication in Energy and Environment

So Eli was looking at the tweets and the Bunny came across this from Roger Sr.

about his plans for publishing in Energy and Environment. As Victor Venema puts it:

By coincidence, this week two initiatives have been launched to review the methods used to remove non-climatic changes from temperature data. One initiative was launched by the Global Warming Policy Foundation (GWPF), a UK free-market think tank. The other by the Task Team on Homogenization (TT-HOM) of the Commission for Climatology (CCl) of the World Meteorological Organization (WMO). Disclosure: I chair the TT-HOM...
Some subtle differences. The Policy Foundation has six people from the UK, Canada and the USA, who do not work on homogenization. The WMO team has nine people who work on homogenization from Congo, Pakistan, Peru, Canada, the USA, Australia, Hungary, Germany, and Spain.

The TT-HOM team has simply started outlining their report. The Policy Foundation creates spin before they have results, with publications in their newspapers and blogs, and they showcase that they are biased to begin with when they write:
But only when the full picture is in will it be possible to see just how far the scare over global warming has been driven by manipulation of figures accepted as reliable by the politicians who shape our energy policy, and much else besides. If the panel’s findings eventually confirm what we have seen so far, this really will be the “smoking gun”, in a scandal the scale and significance of which for all of us can scarcely be exaggerated.

My emphasis. Talk about hyperbole by the click-whore journalists of the Policy Foundation. Why buy newspapers when their articles are worse than a random page on the internet? The Policy Foundation gave their team a very bad start.
Victor goes on to give a masterful explanation of what homogenization is about and why it is needed, even, as the saying goes, for the satellite (A)MSU records.  Highly recommended. The Weasel is having some fun with the latter.

Eli, OTOH can see a hanging curve when a Pielke chucks one and tweeted back

Count your fingers when you shake hands with a Pielke.

Monday, April 27, 2015

Shortened version of Obama with Anger Translator

I've been playing with video editors, so if anyone wants a shorter version of Eli's clip below, here it is. I kept the intro but cut out a minute of the video preceding the climate change section:

The anger translator, btw, is half of the comedy duo Key and Peele. They're very funny, with lots of excerpts on YouTube and Comedy Central.

UPDATE: Here's one of my favorites:

Sausage Grinding School

Bismarck is rumored to have pointed out that one should never look too closely at the making of sausages and laws.  In the March 6 issue of Science, a number of worthies from various conservation-oriented organizations, first-authored by S.L. Maxwell, add environmental treaties to the list, but with a twist worthy of consideration. (Free range version here)

They point out that conservation treaties have strived for targets that are specific, measurable, ambitious, realistic and time-bound (SMART), but that for the most difficult problems a better goal might be to leave a whole lot more wiggle room.

Because different parties have different objectives, each of which will be passionately defended, science, no matter how well established, is not necessarily going to help much.  In such a situation, it is, the authors say, and Eli agrees, much more important to build trust and work toward common goals than it would be to impose them at the beginning.

The Montreal Protocol provided the wiggle room by setting out different classes of nations, with different goals and schedules for each.  Because the Protocol built trust over two decades, in no small part through financial aid helping the developing countries meet their goals (and, yes, accepted a degree of chicanery by China and India in particular), it has been successful.

The problem is

A primary focus for international environmental accords should be to promote collaboration, trust and innovation between stakeholders to enable long-term measurable action toward environmental sustainability.  SMART targets provide a potential pathway for achieving this, but the process of building consensus and collaboration when working toward SMART targets is vital.
So what are the principal rules of wiggle for negotiating environmental treaties?
Without this, contentious environmental issues can force environmental policy makers to build flexibility into targets as a way to secure agreement.  We identify three common pathways for providing this "wiggle room":  targets that are ambiguous in definition, ambiguous  in quantification or clearly unachievable.
This is basically a half-a-loaf strategy, which may not be sufficient, but, as is pointed out, such treaties and actions in support of them shift the playing field in the direction of SMART treaties that can be established at a later stage.  Moreover, bilateral agreements with SMART goals can be much easier to negotiate within a global "wiggle room".
Game theory can provide insights into why stakeholders adopt certain positions, the conditions under which they are likely to cooperate, and the likelihood that agreement can be achieved.  Smead et al. used a game theoretic approach to examine failures of, and prospects for, international climate agreements.  They demonstrated that very high initial demands for greenhouse gas reductions made by numerous countries led to negotiations breaking down.  They suggested that future agreements are more likely to succeed if countries (particularly large emitters) reach bilateral reduction agreements before major international meetings, as happened in late 2014 between the United States and China.
Maxwell et al. hold that the targets for international environmental treaties should focus first on building trust and establishing collaborations amongst the parties.  They see local and regional lawmaking on environmental issues as a better model for negotiating international treaties and point out the role that scientists have played.  Of course, in the US, this is a double-edged sword in states that have banned the words "climate change".
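The breakdown mechanism Smead et al. describe can be sketched as a toy Nash demand game (purely illustrative; their actual model is richer, and the numbers here are made up):

```python
# Toy sketch of bargaining breakdown (not Smead et al.'s actual model):
# each party demands a share of a fixed pool of feasible emissions cuts.
# If the demands are compatible, each side gets what it asked for; if
# combined demands exceed what is feasible, talks break down and
# everyone walks away with nothing.
def outcome(demand_a, demand_b, feasible_total=10.0):
    """Payoffs (a, b) in a one-shot Nash demand game."""
    if demand_a + demand_b <= feasible_total:
        return demand_a, demand_b
    return 0.0, 0.0

print(outcome(4.0, 5.0))  # modest demands: agreement
print(outcome(8.0, 7.0))  # high initial demands: breakdown
```

Even this cartoon shows why settling demands bilaterally before the big meeting helps: it keeps the combined ask inside the feasible set.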

Sunday, April 26, 2015

Does He or Don't He

No Drama Obama is, by nature and training, cool, not hot.  He is careful in what he says, extremely careful, and an interpreter is needed to reach beyond the placid surface.  He needs his anger translator, Luther.  Sometimes

The framing as humor allows Obama to say something serious.  The look at the end?  Think Jack Benny

Friday, April 24, 2015

More from Andy Lacis

Andy Lacis comments on Judith Curry's visit to the hall of mirrors at And Then There's Physics, but in the spirit of the thing, allow Eli to repost.

Let me toss on here what I posted on ClimateEtc in regard to the recent (April 15, 2015) Science, Space, and Technology Committee Congressional Hearing:

As was to be expected, Congressional hearings are more about political posturing than a directed effort at objective information gathering. Naturally, there was the perfunctory public posturing of pretending to appear “fair and balanced”. But the unmistakable overall flavor was really one of there-we-go-again legalistic tribunals where selected legal briefs are presented on behalf of well-known staked-out positions by convenient plaintiffs who get to argue the virtues of their special points of view on their favorite issues regarding global warming and global climate change.

What went missing in this Congressional climate forum was any kind of real balancing testimony from experts in the field who have spent decades analyzing this important topic of global climate change. Regrettably, there was no real discussion as to what we actually do know about the global warming problem, and why we know it.

But, looking on the brighter side, perhaps there may have been a small modicum of progress having been made in that the likes of Senator James Inhofe (R, Oklahoma) and Congressman Dana Rohrabacher (R, California) were not out there lambasting global warming and climate change as being the greatest hoax ever perpetrated on humanity. It appears that perhaps at this point in time, making such blatant denials of reality could be perceived as being unnecessarily clueless and ignorant.

But then there is also the contrary example of courageous conviction, and understanding of the global warming reality, exhibited by former Congressman Bob Inglis (R, South Carolina), who paid the price for being politically incorrect. One can only hope that at some point, pragmatic sanity will eventually prevail.

Even some of the staunchest of the global warming doubters have now grudgingly come around to acknowledge that CO2 does indeed absorb thermal radiation (but they want to claim that the absorption is small, that CO2 is saturated, and that water vapor actually absorbs more strongly); that while there might have been some increase in global temperature (it all has been mostly due to natural variability, and as such, it has been beneficial); and that while humans might have contributed to the rise in atmospheric CO2 (it has not been significant, and besides, the plants have benefitted from more CO2).

While there was nothing that was specifically erroneous in these Congressional Hearing presentations, it was the usual problem of half-truths, misdirection, and non-sequiturs being used to paint a picture that is not an accurate description of where we stand in our understanding of the current climate situation.

Part of the problem may also be attributable to the flexible nature of some basic definitions. What exactly is meant by this common term “global warming”? Literally, the term “global warming” would signify that the global-mean temperature is rising, and if the global-mean temperature were to be decreasing, the situation would then become “global cooling”. But this frequently used term has also acquired a more technical meaning as it is being used in climate science. As the key cause and principal component of global warming, it is the rise in atmospheric CO2 and other greenhouse gases that act to increase the strength of the terrestrial greenhouse effect, and induce more water vapor in the atmosphere as a feedback effect. This inevitably leads to an increase in global surface temperature. This is really what the term “global warming” represents.

But there are other factors that also affect the global temperature. These can be caused by changes in solar irradiance, volcanic aerosols, and the natural variability of the ocean. Changes in solar irradiance and volcanic aerosols are typically known accurately enough. It is the variability of the ocean that is the principal source of uncertainty, such as a strong negative branch of the PDO cycle that can keep the global temperature from rising while atmospheric CO2 continues to increase unabated.

It is important to remember that the present-day changes affecting the global climate consist of two basic components: (1) the ongoing global warming component fueled by increasing atmospheric CO2, and (2) the natural variability of the climate system that consists of random-looking fluctuations about a slowly evolving zero reference point of the climate system.

It would be a misdirection to suggest that global warming has just somehow stalled simply because there has been only a little rise in global surface temperature since the prominent peak in 1998. There was no comparable “pause” in the rate of atmospheric CO2 increase during this time period. Instead, the global energy imbalance of the Earth increased as the heat energy that would have been warming the ground surface was being diverted toward heating the ocean. This puts more unrealized global warming into the “pipeline”, from which it will be emerging as the PDO cycle shifts toward its positive phase.

The natural variability of the climate system also makes it difficult to infer climate sensitivity to the radiative forcing by atmospheric CO2. Reliable estimates of the equilibrium climate sensitivity (equivalent to about 3 K for doubled CO2) are obtained from the geological record and from climate model calculations. The transient climate sensitivity is by definition a moving target, since it depends on the rate of change of heat transport into the ocean (which is itself a changing factor), and estimating the transient climate sensitivity from observational data is particularly difficult (and uncertain), because it is necessary to know all contributing forcings in order to disentangle the feedback contributions from the total climate system response. While the CO2 forcing may be known accurately, it is the “virtual” forcings due to the natural variability of the ocean that are the most difficult to determine. Thus, estimates of the transient climate sensitivity (whether high or low) will continue to remain highly uncertain.

In view of the above, the suggestion that climate models are running “too hot” compared to observations is disingenuous. Climate models may well run “cold” while simulating El Nino events, and run “hot” while simulating the global temperature during a strong negative PDO. Both climate models and the real world exhibit a form of unforced natural variability. And in both cases, this natural variability is quasi-chaotic, with no real way to coordinate the phasing of this variability. Any short-term comparisons between climate model results and observations need to keep this in mind. To sidestep this problem, the time period for comparisons must be long enough for the natural variability contributions to average out.

Granted, the definition of “dangerous” climate change is ambiguous. And there is probably no real way to quantify just what “dangerous” actually represents. Perhaps the example of the Titanic may help.
At what point did the situation on the Titanic become dangerous? There was no perceived danger when the Titanic left Southampton for New York. Most of the passengers were still dry and alive some two hours after hitting the iceberg. Did the danger begin when the iceberg was spotted, but there was not enough time to avoid the collision? Or was the danger already brewing when Captain Smith ignored reports of icebergs and continued full steam ahead? There might be some relevant parallels to draw.

Global-mean winds, global-mean temperatures, and global-mean precipitation, compared between a doubled CO2 climate and the current climate would not appear to be consequentially different. But it is the extreme weather events that cause the damage. Whether humans get blamed, or not blamed, neither adds nor detracts from the problem. Global warming puts more heat, water vapor, and latent energy into the atmosphere. And that is the fuel that makes the extreme weather events more extreme. So, there actually is a real relationship to be had between global warming (human induced) and a growing danger of more severe weather extremes. A better studied quantification of this relationship would certainly be very useful.

It would seem more appropriate to assign “wickedness” to problems that are more specifically related to witches. The climate problem, while clearly complex and complicated, is not incomprehensible. Current climate models do a very credible job in simulating current climate variability and seasonal changes. Present-day weather models make credible weather forecasts – and there is a close relationship. Most of the cutting edge current climate modeling research is aimed at understanding the physics of ocean circulation and the natural variability of the climate system that this generates. While this may be the principal source of uncertainty in predicting regional climate change and weather extreme events, this uncertainty in modeling the climate system’s natural variability is clearly separate and unrelated to the radiative energy balance physics that characterize the global warming problem. The appropriate uncertainty that exists in one area of climate modeling does not automatically translate to all other components of the climate system.

Besides, the persistent uncertainties regarding the natural variability of the climate system are not the real problem that we face. The real problem is the continued increase in atmospheric CO2 that is causing the ongoing global warming. And, the basic facts and physics for understanding this aspect of global warming are all well established and well understood.

There always seem to be temptations to minimize the consequences of the global warming problem, or the cost-effectiveness of proposed efforts taken or suggested to counteract the global warming problem. That is just what Steven Koonin attempted to do in a previous post, and it does not appear to be any different in this Congressional hearing.

Typically, the economic costs of taking action to address the global warming problem are always cited as being unnecessarily excessive. This was true of the proposed expenditure of hundreds of millions of dollars to upgrade the levees and shoreline in New Orleans prior to Katrina, and in New York prior to Sandy. Had this money actually been spent to make New York and New Orleans more hurricane-proof, we might never have known that hundreds of billions worth of hurricane damage might have been averted.

The economic cost of combating global warming is likely to be many hundreds of billions of dollars. But has anybody tried to calculate how many trillions of dollars it would cost to relocate Miami, New York, Washington DC, and New Orleans to higher ground? Surely, there are bound to be many other economic costs to tally up, brought on by the inaction to counteract the impending consequences that global warming is sure to bring.

Clearly, decisions will need to be made, and they will need to be made sooner rather than later. Is there anybody in Congress who is capable of making the hard decisions? It is actually important to first fully understand the problem before deciding to act, or in justifying the decision not to act.

Thursday, April 23, 2015

March gave us the warmest 12 months in a row. And the warmest 13 months, 14 months...20 months...40 months...59 months...

...but not the warmest 60 months in a row, so climate change is a hoax. The warmest 60 months in a row happened in ancient history, from March 2010 through February 2015. OTOH, the warmest 61 months in a row did just happen in March, as well 62 months, 63 months, 70 months, and on for quite a while.

The point is that as you look at longer periods, it becomes even more obvious that we're still warming. Denialists made a lot of hay out of the fact that 2014 was the warmest calendar year only on a probabilistic basis, with a lesser-but-still-real possibility that another year was actually warmer. They die by the probabilistic sword, though, if you look at longer periods: there's virtually no chance that the warmest stretch of any length greater than 18 months ended before 2014.
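For bunnies who want to check this sort of claim themselves, the running-window bookkeeping is easy to sketch (with made-up anomaly numbers, not the actual GISS/NOAA record):

```python
# Sketch (not any agency's code): given monthly anomalies in time order,
# check for which window lengths k the most recent k months are the
# warmest k-month stretch on record.  The data below are illustrative.
def warmest_window_is_latest(anoms, k):
    """True if the last k months have the highest k-month mean."""
    means = [sum(anoms[i:i + k]) / k for i in range(len(anoms) - k + 1)]
    return means[-1] == max(means)

# Toy series: a slow warming trend plus one hot outlier mid-series.
anoms = [0.01 * i for i in range(100)]
anoms[50] += 1.0  # a single hot spike, playing the role of 1998

# Short windows can be "won" by the outlier; long ones follow the trend.
for k in (1, 12, 60):
    print(k, warmest_window_is_latest(anoms, k))
```

With a single hot spike in the series, the shortest windows can belong to ancient history, but every sufficiently long window ends at the present.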

Anyway, I thought this was another way to communicate the idea (that temps are still warming).

In other news, my careful reading of Tamino's recent blogging frenzy pulled out these two gems from Ted Cruz. In January, Tamino quotes Cruz saying (presumably in 2014):

The last 15 years, there has been no recorded warming. Contrary to all the theories that — that they are expounding, there should have been warming over the last 15 years. It hasn’t happened.

In March, Tamino says Cruz says:

Many of the alarmists on global warming, they’ve got a problem because the science doesn’t back them up. In particular, satellite data demonstrate for the last 17 years, there’s been zero warming.

Goalpost much? Someone should ask Ted why he's changed his tune, other than fine-tuning his cherrypicking to the only dataset he can still use.

One disadvantage for him in being a presidential candidate is that it becomes a little harder to duck questions.

Tuesday, April 21, 2015

Eli and the Merry Elves

Some time ago, Eli and his merry elves put together a lengthy comment on an even lengthier paper (aka piece of trash) by Gerhard Gerlich and Ralf Tscheuschner, a paper so bad that it really was not worth the work, except that the work the merry elves did was a piece of play.

Now the Rabett is quite happy with the project. It was maybe the first published blog-generated reply to such nonsense (thus the grandfather of the 97% paper), and he is even happier about those who took part, some of whom blog, some of whom tweet and blog to this day: Chris Ho-Stuart, Chris Colose, Joel Shore, Arthur Smith and Joerg Zimmerman.

A major part of the comment was showing that absorbing layer models of the atmosphere lead to a warmer surface, in perfect agreement with the second law of thermodynamics.  What happens, of course, is that each absorbing layer re-emits IR radiation, a part of which is absorbed by the layer below.  This slows the rate at which the lower level cools by radiation.  If the lowest level is heated by an outside source (such as the sun) and an equilibrium is established so that the energy into the system matches that of radiation from the system, then the temperature of the lowest level at equilibrium is higher than it would be in the absence of absorbing layers.
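The effect can be seen in the standard textbook idealization of N perfectly absorbing layers (a sketch, not the comment's full derivation): at equilibrium the surface must radiate (N + 1) times the absorbed flux, so each added layer raises the surface temperature.

```python
# Toy N-layer blackbody atmosphere: every layer absorbs all IR from
# below and re-emits half upward, half downward.  Energy balance then
# gives sigma * T_surf**4 = (N + 1) * F, with F the absorbed solar flux.
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def surface_temp(absorbed_flux, n_layers):
    """Equilibrium surface temperature under n fully absorbing layers."""
    return ((n_layers + 1) * absorbed_flux / SIGMA) ** 0.25

F = 240.0  # W/m^2, roughly Earth's absorbed solar flux
for n in (0, 1, 2):
    print(n, round(surface_temp(F, n), 1))  # rises with each layer
```

Note that nothing here violates the second law: the layers only slow the rate at which the heated surface loses energy to space.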

Of course, this did not meet with understanding amongst the lard heads, and Eli ran into it again recently at Bishop Hill.  Curiously, Chris Colose has been thinking about the problem too and has a couple of recent posts on the subject.

Eli's introduction to thermal radiation shielding was building very high temperature ovens (>1200 K) with multiple levels of radiation shielding during his graduate research, so on an experimental level the answer was clear.  But today, while searching the net, he came across a book on radiative transfer by Robert Siegel which considers the problem in really complete detail, starting with parallel plates of heat-shielding layers which emit diffusely (i.e. the same in all directions).  The model includes different emissivities for the inside and outside walls of each shielding level.  Eli is not going to go full SoD on the bunnies, but those interested can find a detailed derivation of the heat flow per unit area between two parallel plates in just about any book on heat transfer, or you can corner John Abraham at the next AGU.  When a steady state is established, the amount of heat q flowing per unit area through each level must be the same

q (1/ε1 + 1/εs1,in - 1) = σ(T1^4 - Ts1^4)
q (1/εs1,out + 1/εs2,in - 1) = σ(Ts1^4 - Ts2^4)
. . .
q (1/εsN,out + 1/ε2 - 1) = σ(TsN^4 - T2^4)
Following Siegel, if we add these equations up, the right hand side is σ(T1^4 - T2^4).  Dividing by the co-factor of q on the left hand side yields

q = σ(T1^4 - T2^4) / [1/ε1 + 1/ε2 - 1 + Σn=1..N (1/εsn,in + 1/εsn,out - 1)]
Heat transfer books usually stop there, because the MEs (mechanical engineers) are interested in how to design shielding for thermal or cryogenic applications.

OTOH, Rabett and friends were looking at the case of a planet where the amount of incoming energy from the Sun, or the star of your choice, is q.  The emissivity of the surface is going to be something like 0.95; that of the atmosphere at different levels, well, that depends on the pressure, concentration and spectra of greenhouse gases, and, of course, the specific humidity and where the clouds are.  For CO2 the contribution is going to be between 0.12 and 0.19.  For water vapor, higher, as high as water vapor goes before condensing out.

However, we can gain insight by setting ε1 equal to 1 and letting all of the other levels have the same emissivity εs, both inside and outside each shielding level.  In that case

q = σ(T1^4 - T2^4) / [1/ε2 + N (2/εs - 1)]
At a steady state, the same amount of energy has to be radiated to space.  If there are no shielding levels, the amount of heat radiated per unit area and time is σT1o^4.  Consider the case where there is only the outermost heat shield (N = 0); then

σT1o^4 = ε2 σ(T1^4 - T2^4)
Canceling σ, multiplying both sides by 1/ε2 and bringing T2^4 to the left hand side we get

T1o^4/ε2 + T2^4 = T1^4
All terms on the left hand side are positive and ε2 is less than or equal to 1; therefore T1, the temperature where there is one heat shielding level, is greater than T1o, the temperature of the surface if there is no blocking.

If there are N equivalent heat shielding layers between the innermost and outermost layers, then similarly

T1o^4 [1/ε2 + N (2/εs - 1)] + T2^4 = T1^4
The added term on the left hand side is again positive (if εs = 1 then it is simply equal to N; if εs < 1 then 2/εs - 1 > 1).  In either case, and especially the latter, T1 > T1o.  The same can be done for spherical geometries, but one has to consider geometric factors, the ratios of the areas of the various shells to each other.

Siegel and other heat transfer books do the derivation.

How important are the geometric factors?  They scale as An/Ao where A = 4πR^2, so at the risk of offending the punctilious the ratio is (Rn/Ro)^2.  The radius of the Earth is 6371 km.  Using a 10 km high atmosphere, basically the troposphere, or at least the effective level which radiates to space in the CO2 bands, (Rn/Ro)^2 = (6381/6371)^2 = 1.003, so there will be a 0.3% difference between treating the system as a nest of spheres and as a series of parallel plates.  Close enough.
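Putting rough numbers in shows both results at once. This is a sketch only: it assumes ε1 = 1, takes the outer shield to radiate q = ε2σT2^4 to space as the boundary condition, and the emissivity values are illustrative, not measured atmospheric ones.

```python
# Numerical sketch of the parallel-plate shield result discussed above:
# q = sigma*(T1^4 - T2^4) / (1/e2 + N*(2/es - 1)) with e1 = 1, where the
# outermost shield is assumed to radiate q = e2*sigma*T2^4 to space and
# q is fixed by the absorbed stellar flux.
SIGMA = 5.670374419e-8  # W m^-2 K^-4

def shielded_temp(q, e2, es, n_shields):
    """Surface temperature T1 with n_shields layers inside the outer shield."""
    t2_4 = q / (e2 * SIGMA)  # outer-shield balance (assumed)
    denom = 1.0 / e2 + n_shields * (2.0 / es - 1.0)
    return (q * denom / SIGMA + t2_4) ** 0.25

q = 240.0                  # W/m^2, absorbed flux (illustrative)
t1o = (q / SIGMA) ** 0.25  # surface temperature with no shielding
for n in (0, 1, 3):
    t1 = shielded_temp(q, 0.8, 0.5, n)
    print(n, round(t1, 1), t1 > t1o)  # always warmer than the bare surface

# The spherical-shell correction from the post:
print(round((6381.0 / 6371.0) ** 2, 4))  # about 1.003
```

However the emissivities are chosen, each added shield raises the surface temperature, which is the whole point of the comment.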

Friday, April 17, 2015

Veep candidates bring a 0-5% increase in party vote in their home state

Got into a conversation about this yesterday:  how much help does a vice-presidential candidate provide in winning that candidate's home state? I vaguely recall that poli science says not much. I went and noodled around wiki and can now draw my own dramatic conclusion:  not much.

Wiki has all presidential results by state and year (e.g., here's Texas 1988), so it's simple to compare results before and after a state resident ran for vice president. In the last 30 years, not much happened, although 1992 and 1996 are hard to use because of a strong third-party showing.  I'd say everyone brought in much less than a 5% bump, except Bentsen and (sadly) Palin, who came in at or slightly above that level.

This small a bump suggests that veep candidates shouldn't be chosen based on the help they provide in their home state.

OTOH, there's Florida - that's a very big swing state, and a 2% bump could be useful. I've thought a joint ticket of Jeb Bush and Marco Rubio could make winning Florida very difficult for Democrats. I believe Jeb isn't particularly popular in Florida and Rubio is only moderately popular, but people do tend to root for the home team.