2010 and 2011 may not have been unusual in the number of energy-related disasters, but I suspect they have at least been unusual in the quantity of headline-grabbing material and TV news attention those disasters generated; the ongoing disaster stemming from the earthquake and tsunami in Japan is only the most dramatic of them. With the one-year anniversaries of the Upper Big Branch and Deepwater Horizon disasters just past, along with the 25th anniversary of Chernobyl, I felt a need for some sort of quantitative comparison of these various events...
An explosion or earthquake or other disaster of that sort involves the almost instantaneous release of a large quantity of energy. For earthquakes we have a convenient measure in the Richter scale, which measures the shaking amplitude. The Richter scale increases logarithmically, so that an increase in magnitude by 1 means a shaking amplitude 10 times as large. The quantity of energy involved scales as the 3/2 power of that amplitude, so 2 magnitudes on the Richter scale corresponds to an increase in energy release by a factor of 1000. Converting energy to standard metric notation in terms of joules (1 J = 1 kg m^2/s^2), the Richter scale magnitudes come to:
Magnitude 3: 2 GJ (2x10^9 J)
Magnitude 5: 2 TJ (2x10^12 J)
Magnitude 7: 2 PJ (2x10^15 J)
Magnitude 9: 2 EJ (2x10^18 J)
Nuclear explosions are typically measured in units of kilotons of TNT, where 1 kt TNT = 4.2 TJ, i.e. a 1 kiloton explosion should be about double the energy release of a magnitude-5 earthquake, and a 1 MT (megaton) explosion around double the energy release of a magnitude-7 earthquake.
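The magnitude-to-energy conversions above can be sketched in a few lines of Python; the normalization is chosen to reproduce the 2 GJ figure quoted for magnitude 3:

```python
def quake_energy_joules(magnitude):
    """Approximate seismic energy release for a Richter magnitude.

    Energy scales as the 3/2 power of shaking amplitude, i.e. a factor
    of 10**1.5 (about 32) per unit of magnitude; normalized here so that
    magnitude 3 corresponds to 2e9 J (2 GJ), as in the list above.
    """
    return 2.0 * 10.0 ** (1.5 * (magnitude - 3.0) + 9.0)

KILOTON_TNT_J = 4.2e12  # 1 kt TNT in joules

for m in (3, 5, 7, 9):
    print(m, quake_energy_joules(m))

# A 1 kt explosion vs. a magnitude-5 quake: about a factor of 2
print(KILOTON_TNT_J / quake_energy_joules(5))
```

Doubling the kiloton figure to check the megaton claim works the same way: 1 MT = 4.2 PJ, about twice the 2 PJ of a magnitude-7 quake.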
Also worth thinking about in comparison is the non-explosive use of energy, as it runs through the natural world and as we use it for our own purposes. Since a year consists of just over 3x10^7 seconds, a 1 GW power plant over the course of a year produces 3x10^16 J or 30 PJ of electrical energy. That's about 7 times the energy release of the 1 MT explosion, about 15 times the energy release of a magnitude-7 earthquake. That energy release is spread over tens of millions of seconds, not just the few seconds of an explosion, but it's good to remember it is a large quantity of energy.
Human society currently uses about 15 TW of primary energy, or 450 EJ per year. That's over 200 magnitude-9 earthquakes' worth, nearly one every day and a half. That's a lot of energy.
Earth receives energy from our Sun at a rate of about 174 PW. In a year that's about 5x10^24 J, 5 YJ (yottajoules) or 5 million EJ. That's a magnitude-9 earthquake worth of energy every 12 seconds! Luckily it's spread out over the whole (day-lit) surface of the Earth, so we don't normally experience the magnitude of that energy flow in any dramatic fashion. Still, it's worth remembering how natural energy scales like this tend to dwarf whatever humans do.
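Pulling these comparisons together in one quick back-of-the-envelope script (all figures are the ones quoted above; the year length here is 3.15x10^7 s, so the ratios come out slightly above the rounded numbers in the text):

```python
SECONDS_PER_YEAR = 3.15e7

gw_plant_j = 1e9 * SECONDS_PER_YEAR    # 1 GW plant for a year, ~3e16 J
humanity_j = 15e12 * SECONDS_PER_YEAR  # 15 TW primary energy, ~470 EJ
sunlight_w = 174e15                    # solar input to Earth, W

mag7_j = 2e15                          # magnitude-7 earthquake
mag9_j = 2e18                          # magnitude-9 earthquake
megaton_j = 4.2e15                     # 1 MT explosion

print(gw_plant_j / megaton_j)          # megatons per plant-year (~7.5)
print(gw_plant_j / mag7_j)             # magnitude-7 quakes per plant-year (~16)
print(humanity_j / mag9_j)             # magnitude-9 quakes per year of human use (~236)
print(mag9_j / sunlight_w)             # seconds of sunlight per magnitude-9 (~11.5)
```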
So, how does our recent collection of energy-related explosions and disasters compare?
A few months ago Tamino at Open Mind posted a fascinating analysis of warming obtained by fitting the various observational temperature series to a linear combination of El Nino, volcano, and solar cycle variations (using sun spots as a proxy for the latter), plus an underlying trend, allowing for some offsets in time between the causative series and the temperature. Year to year global average temperatures fluctuate significantly, by several tenths of a degree. Taking into account these "exogenous" factors, however, greatly reduced the level of variation. Not only does this more clearly show the underlying trend, once the "exogenous" components are removed, but it occurred to me this also allows prediction of future temperatures with considerably more confidence than the usual guessing (though I've done well with that in the past), at least for a short period into the future.
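The flavor of that kind of fit can be sketched with ordinary least squares on synthetic data. Everything below is made up for illustration; Tamino's actual analysis uses real ENSO, volcanic aerosol, and sunspot series, and is considerably more careful about lag selection:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 360                                  # 30 years of monthly data
t = np.arange(n) / 12.0
enso = rng.standard_normal(n)            # stand-in for an ENSO index
solar = np.sin(2 * np.pi * t / 11.0)     # crude 11-year solar-cycle proxy

# Synthetic "temperature": trend + lagged exogenous factors + noise.
# np.roll wraps around at the ends, acceptable for a toy example.
temp = (0.017 * t + 0.10 * np.roll(enso, 4)
        + 0.05 * np.roll(solar, 1)
        + 0.05 * rng.standard_normal(n))

def fit(lag_e, lag_s):
    """Least-squares fit with the given lags; returns (residual, coefficients)."""
    X = np.column_stack([np.ones(n), t,
                         np.roll(enso, lag_e), np.roll(solar, lag_s)])
    coef, res, *_ = np.linalg.lstsq(X, temp, rcond=None)
    return res[0], coef

# Scan candidate lags and keep the best-fitting combination
best = min(((le, ls) for le in range(8) for ls in range(4)),
           key=lambda p: fit(*p)[0])
_, coef = fit(*best)
print(best, coef[1])   # recovered lags and underlying trend (deg/yr)
```

Removing the fitted exogenous terms from the series then leaves the trend plus much-reduced residual noise, which is what makes short-range prediction feasible.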
See below for a detailed discussion of what I've done with Tamino's model, for the GISS global surface temperature series. In brief, however, I present the results of two slightly different models of the future, first with no El Nino contribution beyond mid-2011, and second with a pure 5-year-cycle El Nino (starting from zero in positive phase) from mid-2011 on.
| Year | Model 1 prediction for GISS Jan-Dec global average temperature anomaly | Model 2 prediction |
While there's some variation in future years, the final average temperature for 2011 should be close to 0.58 (similar to the temperatures in 2009). Temperatures in 2012 are likely to be much warmer - at least breaking the record of 0.63 set in 2005 and 2010, possibly (model 2) by as much as 0.15 degrees. With the continued waxing of the solar cycle and continued increases from CO2 effects, however warm 2012 is, 2013 should be even warmer (unless we get a big volcano or another strong La Nina then).
Back when I was a young graduate student our system administrator was a bit of a gamer. We used UNIX: a Digital Equipment VAX running some BSD version, and later Sun workstations - and I pause for a moment in memory of those worthy but now defunct corporations. UNIX at the time came with a bunch of standard command-line-oriented games (and graphical ones later on the Suns) - which of course the sysadmin was free to delete, but ours didn't. He even installed a new game - Empire (a multi-player "Civilization"-like game) - and started a few games hosted on our computers, soliciting players from around the internet.
For a few months Empire, rather than physics, became my passion. It ran on a schedule such that every 4 hours or 6 hours the clock ticked and you could make more moves of your military units, move commodities from one city to another, or make new plans for your cities. And of course all your opponents did the same. Being there right at the clock tick allowed you to attack first, if that was in the cards, or prepare necessary defenses for an expected attack. And missing a clock tick (for something as useless as sleep, for instance) meant losing tempo in the game; your military units might just sit there rather than move, one of your cities might start to starve, or food or other elements might be wasted because there was no room to store more.
Realizing this wasn't personally sustainable, I delved into the C programming language, which seemed to be the standard for UNIX but which I'd hardly used up to then (I'd done some Fortran and assembly programming before). After a few days' work I had an automated player program that I could schedule to run shortly after each clock tick to take care of the basics - moving commodities around and moving some of my units along pre-arranged paths that I could update once or twice a day.
This gave me a slight advantage over those players who weren't waking up every 4 or 6 hours at night to update their games, and my game started to do quite well. But not well enough for me; I started to notice some anomalies in the way certain things behaved in the game. If I used ground transport to move a fighter plane from one city to another, the mobility level in the city I moved it from dropped far more than I expected. And if I moved two aircraft from two different cities, both dropped to the same level. There was some bug in the game software, and I needed to track it down.
So I started reading through the source code of the game. This really got me up to speed on programming with the C language - the code had extensive use of pointers and there were arrays of pointers to functions and multiple layers of indirection that had to be traced to figure things out. When I finally got down to the code regarding moving aircraft, I discovered what was going on. The bug was that it was using the mobility of the central capital city as the starting value before subtracting the mobility cost of moving the plane, rather than the mobility of the actual source city. I quickly realized I could exploit the bug - if I kept my capital city mobility high, I could make use of the bug to quickly raise the mobility available in any city by bringing in a fighter plane and moving it around. This gave a huge advantage in the game - mobility was the key factor that limited how much you could do with each tick of the clock.
While perusing the source code I found some other things that looked like bugs too, and verified them in the game. One of the issues was handling of negative numbers. If you loaded a negative quantity of a commodity onto a ship in a harbor city, the code was set up to treat that the same as unloading a positive quantity from the ship to the city. However, while for positive loading the code checked that the city had sufficient quantity of the commodity, for negative loading (unloading) it didn't do that check for the ship. Loading large negative quantities of gold onto a ship gave you a way to create unlimited quantities of gold (or any other commodity the same way).
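The flavor of that second bug is easy to reproduce in a few lines. This is a hypothetical reconstruction, not the actual Empire source (which was C); the function and variable names are mine:

```python
def load(city, ship, commodity, qty):
    """Move qty of a commodity from city to ship.

    A negative qty means unloading from the ship into the city - and,
    as in the bug described above, only the positive direction gets a
    bounds check.
    """
    if qty > 0 and city[commodity] < qty:
        raise ValueError("city doesn't have that much")
    # Missing check: for qty < 0 we never verify ship[commodity] >= -qty,
    # so "unloading" goods the ship never had mints them from nothing.
    city[commodity] -= qty
    ship[commodity] += qty

city = {"gold": 10}
ship = {"gold": 0}
load(city, ship, "gold", -1000)    # "unload" 1000 gold from an empty ship
print(city["gold"], ship["gold"])  # city: 1010, ship: -1000
```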
Finding these bugs that could give such a huge advantage in the game gave me some moral qualms, and I consulted our sysadmin, who was running the game, about what I should do. He asserted that my only responsibility was to file bug reports and suggested fixes with the game developers, and then he'd update the game software when they fixed the problems. As long as the bugs were reported, it was perfectly legal (according to standard game rules) to exploit them... So I did...
My obvious and mysterious advantages in the game didn't sit well with the other players, a few of whom knew who I was. I soon found my nation under attack from a united alliance of all the others. With my bag of tricks I was still able to largely prevail, until the nuclear weapons came out...
Not long after this (November 1988) I was working on one of our Sun machines when suddenly everything mysteriously slowed down - the computers were being attacked by the first "internet worm". It turned out I was very close to the epicenter of this event, and one of my colleagues was a good friend of Robert Morris, the student who launched the attack, which exploited vulnerabilities in some standard UNIX system services. The era of computer viruses and worms was upon us. Morris was taking advantage of bugs in major computer systems just as I had exploited bugs in the Empire software to gain advantage in that game.
Bugs with destructive power in themselves or available for exploitation by the unscrupulous are almost inevitable consequences of our efforts at automation and removing humans from low-level oversight and decision-making in any system. Even in systems where humans ostensibly make the decisions, if human actions are governed by rigid rules (whether or not they function well under ordinary circumstances) or are taken with incomplete understanding of what they are doing, the extent to which such a system becomes a "machine" with predictable responses almost inevitably invites a quest for "bugs" to exploit for personal advantage. Infamous hacker Kevin Mitnick found social engineering (tricking people into giving him their passwords) at least as effective as anything else in breaking in to computers.
The problem extends far beyond the domain of computer systems. Economic, media, legal and political systems have become highly complex "machines" in modern times, governed by rigid rules and understood by few of those who depend on them. Vital decisions are often made by poorly paid bureaucrats (on regulation enforcement, say) or low-status workers (those mortgage "robo-signers", for instance). The process can be mystifying to the outsider, but to somebody who works to understand it, "bugs" in the system open up enormous (what most would regard as immoral, but often perfectly legal) opportunities for great riches or power.
The 2007-2008 financial crisis is very much a case in point, at least as I understand it from my recent reading of Michael Lewis' account, "The Big Short: Inside the Doomsday Machine". A number of people became enormously wealthy while bankrupting their own companies, their customers, or large swathes of the general public. The way they managed this was through exploitation of a handful of real "bugs" in US and international systems of finance. Some of these bugs have been addressed; some I'm less confident will be, exhibiting further bugs in our political and media systems.
Is a sense of shame, embarrassment, fear at being discovered central to what makes us fully human? The old story in Genesis suggests so - (from the King James translation):
And they were both naked, the man and his wife, and were not ashamed. [...] And the serpent said [...] in the day ye eat thereof, then your eyes shall be opened, and ye shall be as gods, knowing good and evil. [...] And the eyes of them both were opened, and they knew that they were naked; and they sewed fig leaves together, and made themselves aprons. [...] And the Lord God called unto Adam, and said unto him, Where art thou? And he said, I heard thy voice in the garden, and I was afraid, because I was naked; and I hid myself.
The first emotion of the couple awakened by knowledge of good and evil was not joy, but shame. To be human is to err, to make mistakes. To do things that are truly embarrassing. It's part of who we are. But it is not just the mistake-making, it's recognizing those mistakes. To actually be embarrassed, to feel shame, to be afraid of the consequences of what we've done. If we routinely do stupid things but don't even recognize that we've done anything wrong, we're living a paradise of ignorance, a pre-fall state of unconscious bliss.
It's become clear to me this is the state which US political and media culture is aiming for. A return to the Garden of Eden, where nobody can do anything wrong, and it is impossible for any public figure to ever feel shame. Where the media has no ability and feels no responsibility to distinguish between right and wrong, truth and falsehood, good and evil - except on those rare occasions where the bad guys are foreigners and everybody on our side can agree.
There are still occasional exceptions. Eliot Spitzer, briefly. You have to give the guy credit for disappearing from public view for a while after that business - and he was our governor too, a really powerful figure. He really, sincerely looked ashamed and embarrassed at the exposure of his sinful behavior. John Edwards is a worse story - and I and a lot of other people feel embarrassed about that one; I was a big supporter of the guy before his mess became public. But at least Edwards quit his campaign and really did look highly ashamed of his own behavior.
There have been a couple of minor congressmen resigning recently for stuff of that nature, but it seems to be pretty few and far between. Why is David Vitter still in the US Senate? John Ensign? Why did that Idaho senator and S. Carolina governor hang on until their terms ended? Yeah, yeah, Bill Clinton too. Though at least you could sort of feel his pain at times.
And those are just the sex scandals. When it comes to more serious wrongdoing with respect to their actions in office, there seems to be no sense of shame at all, ever. John Murtha completely unrepentant. Charlie Rangel. Jack Abramoff, Tom Delay and friends. Ted Stevens in Alaska. The many scandals of the Bush administration (and Reagan before). Did any of those folks ever admit that anything they did was wrong? Did you notice how governor Walker of Wisconsin reacted when the recent prank Koch call became public?
I am sure that reality shows and celebrity obsessions are a contributing factor in our culture of shamelessness. Perhaps that form of popular entertainment is a real source of harm, but I am far more concerned about the harm done by the failure to recognize truth and falsehood, right and wrong, among our political and media classes. It almost seems that brazenness in the face of what would, to regular people, be highly embarrassing is an asset for politicians.
My evidence that we have now reached the pinnacle of shamelessness in US politics is this markup hearing of the House Energy and Commerce Committee, held on Tuesday March 15th. The subject of the hearing was House Resolution 910, titled the "Energy Tax Prevention Act". The committee, chaired by Fred Upton, makes the claim that their work to stop the EPA from regulating greenhouse gases will reduce gasoline prices.
Politifact examined Upton's claim on this and found it to be False. But at least the claim that regulations cause prices to rise, and are therefore a little like a future "tax" (despite not involving taxes at all), has some rationale. So the bill title, while distorting the truth, isn't perhaps a completely implausible and blatant lie.
And one can understand that representatives might vote against something that on its face sounds like a good idea for legitimate reasons, or conversely for something that otherwise sounds like a terrible idea. Though the vote in committee on the final bill was pretty clearly against the facts of the matter, as this editorial in Nature put it:
It is hard to escape the conclusion that the US Congress has entered the intellectual wilderness, a sad state of affairs in a country that has led the world in many scientific arenas for so long. Global warming is a thorny problem, and disagreement about how to deal with it is understandable. It is not always clear how to interpret data or address legitimate questions. Nor is the scientific process, or any given scientist, perfect. But to deny that there is reason to be concerned, given the decades of work by countless scientists, is irresponsible.
Worse than the title and distorted spin from the committee chair are the actual words uttered by some of our duly elected representatives during that hearing. This is not just an intellectual wilderness, this is a moral wilderness where completely provably false statements are just accepted, treated as legitimate points of view, where nobody needs to apologize for being wrong, ever.
Eli Rabett railed on journalists recently, and particularly (the bulk of) science journalists, for practicing "churnalism" - just writing stuff that's almost copy-paste, without even thinking about it, without seeming to put any effort in. Another part of the problem is what Jay Rosen has termed the view from nowhere - the attempt by the establishment press to appear neutral and claim objectivity, when in fact what they are doing is a form of unrecognized ideology in itself (that "both sides are equal", and all that follows).
John Broder has occasionally written some excellent pieces on the interaction between science and politics for the New York Times (though I've just been browsing their archives and haven't found a good example from the last year. Hmm. A large number of the ones I thought good turned out to have required corrections after initial publication. Oops.). In this latest piece, however, he provides the perfect exemplification of both Rabett's and Rosen's complaints. This has got to be one of the laziest, most egregious false-balance stories I have ever come across in the national press. The central point of Broder's piece is that "both sides" claim to be standing for science. Broder bases this on his claim that:
Democrats rounded up five eminent academic climatologists who defended the scientific consensus that the planet is warming and that human activities like the burning of fossil fuels are largely responsible. [...]
Republicans countered with two scientific witnesses who said that while there was strong evidence of a rise in global surface temperatures, the reasons were murky and any response could have adverse unintended effects.
I watched most of the hearing yesterday, and I was very surprised at Broder's claim that 5 of the witnesses were called by Democrats, and only 2 by Republicans. The Republican congressmen, in questioning, almost universally queried 3, not 2 of the witnesses, looking for favorable responses (some of them also tried to get responses on very loaded questions from Richard Somerville, called by the Democrats). Why would Broder think that only 2 of the witnesses were called by Republicans, when in fact 3 were? As Joe Romm pointed out in this piece, John Christy and Roger Pielke Sr. have been called upon to testify by Republicans before, so those two weren't exactly a surprise. The DDT-advocate was a bit of a surprise, but clearly there for the Republican "side".
In some followup discussion with Barry Bickmore (by email) and Kevin C (at Skeptical Science) it became clear we were missing something in the analysis of Roy Spencer's climate model. Specifically, Bickmore's Figure 10:
differed significantly from Spencer's 20th century fit even though he was ostensibly using the same parameters. If you look at the year 1900 in particular, Bickmore's Figure 10 has a temperature anomaly slightly above zero, while Spencer's is pegged at -0.6 C. Bickmore did use a slightly different PDO index than Spencer for his 1000-year graph (figure 10) - but more importantly, he took Spencer's -0.6 C temperature as the starting point in the year 993 AD, rather than as a constraint on temperature in the year 1900, as it actually was in Spencer's analysis. It turns out that to actually match Spencer's 20th century temperature fit the starting temperature in 993 AD needs to be extraordinarily, far beyond impossibly, low. We'll get to the details shortly.
BYU geologist Barry Bickmore recently reviewed Roy Spencer's recent book, "The Great Global Warming Blunder", finding a number of true "blunders" by the author. In particular he found some very peculiar properties of the simplified physical model that Spencer made a central feature of the book, finding that Spencer's curve-fitting allows infinitely many solutions beyond the one Spencer somehow settled on, along with a number of related issues.
I tangled with Spencer over an earlier model like this which he was promoting more than 3 years ago. What he didn't seem to realize about that first model was that it was essentially trivial, a linear two-box model with two time constants (a subject I explored in detail here a while back). I tried explaining this, but he seems not to have gotten my point that such a model inherently contains no interesting internal dynamics, just relaxation on some (in this case two) time scales. Which seems to go completely against the point I thought he was trying to make, that some sort of internal variability was responsible for decadal climate change.
So it was something of a surprise to me that Spencer based his "Great Blunder" book on an even more simplified version of this model, with just 1 effective time constant. He even tried to get a paper published using this essentially trivial model of Earth's climate. As Bickmore outlined in his part 1, the basic equation Spencer uses is:
(1) dT/dt = (Forcing(t) – Feedback(t))/Cp
where T is the temperature at a given time t, Forcing is a term representing the input of energy into the climate system (there is a standard definition for this in terms of radiation at the "top of the atmosphere") and Feedback is a term that itself depends on temperature as
(2) Feedback(t) = α (T(t) - Te)
with α a linear feedback parameter and Te an equilibrium temperature in the absence of forcing (Bickmore and apparently Spencer don't actually use absolute temperature T and equilibrium value Te, but rather write the equations in terms of the difference ΔT = T - Te, which amounts to the same thing, but obscures an important point we'll return to later).
The final term Cp is the total heat capacity involved. Each of forcing, feedback and heat capacity is potentially a global average, but would normally be expressed as a quantity per unit area, for example per square meter. Since the bulk of Earth's surface heat capacity that would respond to energy flux changes on a time-scale of a few years is embodied in the oceans (about 70% of the surface), Cp should be defined essentially as 0.7 times the heat capacity of water per cubic meter, multiplied by the relevant ocean depth in meters (h):
(3) Cp = 0.7*4.19*10^6 J/(m^3 K) * h = c * h
where c = 2.9 MJ/(m^3 K) (Spencer and Bickmore seem to have forgotten the factor of 0.7, so use a slightly larger value for c, which means their h values are probably smaller than the actual ocean depth such a heat capacity would be associated with).
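To make equations (1)-(3) concrete, here's a minimal forward-Euler integration of the one-box model. The depth, feedback parameter, and forcing below are illustrative placeholders of my own choosing, not Spencer's fitted numbers:

```python
# Integrate dT/dt = (F(t) - alpha*(T - Te)) / Cp with Cp = c*h, Eqs. (1)-(3).
c = 2.9e6            # J/(m^3 K): 0.7 x volumetric heat capacity of water
h = 50.0             # assumed effective ocean depth, m
alpha = 3.0          # W/(m^2 K), assumed feedback parameter
Cp = c * h           # J/(m^2 K)

F = 1.0              # constant step forcing, W/m^2
dt = 30 * 86400.0    # one-month time step, s
T = 0.0              # anomaly T - Te, starting at equilibrium

history = []
for step in range(50 * 12):          # integrate for 50 years
    T += dt * (F - alpha * T) / Cp
    history.append(T)

# With just one box there are no internal dynamics: T simply relaxes
# toward F/alpha with a single time constant Cp/alpha (~1.5 yr here).
print(history[-1], F / alpha, Cp / alpha / 3.15e7)
```

That single-time-constant relaxation is the "essentially trivial" behavior referred to above: whatever wiggles the model produces come entirely from the forcing series fed into it, not from any internal variability.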
Growing up, my mother seemed to worry about everything. What we ate, how we exercised, school grades, future employability, crossing the street. Trees creaking in a storm, us climbing trees and rocks, my father's driving, the least speck of dust in the house, all seemed to her signs of impending disaster. Perhaps it was in reaction that I acquired such a worry-free attitude about life in general, and in particular about the future. To me just about every step to the future seems a wonderful bright beacon to a better purer world, where human beings are fully valued for what they can really contribute, where drudgery is gone, abuse of other humans and the natural world are a thing of the past, all are enlightened and wise...
Now, I don't consider myself a techno-utopian like Ray Kurzweil. I've critiqued his hyper-optimism elsewhere - if Kurzweil is doubly-exponentially optimistic about the future, I'd limit myself to a single exponential, or even (as I think we eventually must) a power law. But I still think overall the future has to be, on net, more positive than negative. Maybe it's just in my nature.
Andy Revkin has a nice interview with optimistic "World-Changer" Alex Steffen, whose latest book "Worldchanging 2.0" (an expanded edition of the first version, of which I have a quite inspirational copy) was just released. As I've noted here before, Steffen's optimistic view of the future is one I largely share, despite much evidence that our present world has some very serious troubles, now and ahead of us.
On what I believe is a private discussion site I was asked a number of questions about the climate problem. I'm copying my answers here (with some minor corrections of typos and for context) as they may be found helpful for others... or at least as a reminder to myself of what I know.
Q. 1 "I look forward to any insight you can provide into the real verifiable evidence of the human footprint."
I'm not quite sure what you mean by "verifiable evidence". You acknowledge climate seems to be changing. There are two distinct pieces of knowledge that go into "blaming" it on us humans, each of which has been substantiated from multiple observations and physical understanding. These are:
(1) Humans have caused atmospheric CO2 levels to increase considerably over the past century. This youtube video shows the wide range of observations of that increase in considerable detail, also showing how it compares with past changes:
Ray Pierrehumbert's recent brief but excellent exposition of radiative transfer theory in Physics Today and scienceofdoom's continuing effort to clearly explain atmospheric radiation and the "Greenhouse" effect inspired me to do a little playing around of my own with the underlying theory and equations, to get a better feel for some of the expected behavior, and perhaps illuminate another aspect of the problem to a public audience.
I have happily used Google Reader in recent years to collect RSS feeds from news sources and blogs around the web, to provide me with a pretty complete up-to-date view of what's going on in science, technology and general world news. The average day has two to three hundred news items, the bulk of them on topics or people I'm interested in. It's an investment of effort to keep up, but I do at least give a brief glance to most of the items that come through the reader.
One of my regular sources has been New Scientist, a UK-based news organization. I've found them a useful source of up-to-date science reporting, despite a tendency to over-hype things. But apparently they've started including non-fact-checked blogs in their news stream - either that, or their editorial process has developed some very lax standards. Because, a few days ago, I was startled to run across this piece by Fred Pearce, reporting on a so-called "reconciliation" conference on climate, held in Lisbon.
The final 2010 Goddard Institute for Space Studies (GISS) global land+ocean temperature numbers are out and just as I predicted, this year was the hottest in their record. Once again my prediction (really mostly a guess) was within just a few hundredths of a degree of the final number for the year:
| Year | Arthur's Feb 2008 prediction | GISS - January 2011 | Difference |
I've long been interested in issues of trust and meaning, particularly in regard to scientific information. The importance of context, the "who, when, where, why" of any piece of information, is critical in determining first whether we even learn of that information, and second the degree to which we accept it as part of our base of knowledge about the world.
Political scientist Francis Fukuyama wrote a book on the subject (titled Trust). Ironically, while I felt some interest in it, I haven't read it because of my reaction to an earlier book of his (The End of History) - which I also didn't read. But anybody who could write a book with that title and the apparent thesis that all the interesting debates and conflicts regarding forms of government were somehow in the past was, I decided, not really worth my time. Thanks to just that cue, my level of trust in his ideas fell essentially to zero, and I haven't read what might otherwise have been very interesting to me. Or perhaps not, if my judgment was justified.
Trust is fragile, hard to gain and easily lost. Which was why I found a recent post on science and journalism by Scientific American blogger Bora Zivkovic (who I've followed for a long time as @BoraZ on twitter) a little annoying.
APS Staff were recently asked about our thoughts on the future, to help with a planning exercise for coming years. The following are somewhat frank comments I submitted in response to two of the questions, on biggest challenges and opportunities for the future. I doubt they represent the average views of APS staff right now, but they do capture a number of my concerns and ideas at the present so I thought I'd share a bit more widely... I'd certainly be interested in others' thoughts on these and other ideas for what the relatively near future may hold.
Five biggest (but perhaps not most likely) challenges:
1. Retaining the trust of the physics community as a filter and enabler of physics communication (journals, meetings, new media). Trust is fragile; mistakes that drain that trust could come from any front; openness and honesty in all dealings with the community and society at large are paramount.
There exists a widely quoted story about [18th century philosopher/mathematicians] [Denis] Diderot and [Leonhard] Euler according to which Euler, in a public debate in St. Petersburg, succeeded in embarrassing the freethinking Diderot by claiming to possess an algebraic demonstration of the existence of God: "Sir, (a+b^n)/n = x; hence God exists, answer please!"
This story turns out to be (at least in detail) false, but it was likely invented, and resonated, because it embodies an underlying truth almost any of us in the sciences have seen: once a mathematical equation comes out, it tends to blind the naive, and even the experienced will often skip over the equations on a first reading of any complex argument. A minor error in a mathematical expression (a forgotten minus sign being the most common example!) can completely change its meaning, and reasoning about such things requires detailed understanding; it's intellectually demanding, requiring time and mental effort. Sometimes we are willing to put that time in, but more often than not we just don't have the time, or the requisite background, and simply skip over the math, hoping that it makes sense to somebody else.
Of course, there is an equation that's proof of amazingly beautiful self-consistency in mathematics, that some have taken as evidence for God:
e^(iπ) + 1 = 0
but the beauty of that expression isn't something I intend to get into right now.
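For the skeptical, the identity is easy to check numerically; here is a minimal sketch using Python's standard cmath module (the tiny residual left over is just floating-point rounding, not a flaw in the mathematics):

```python
import cmath

# Euler's identity: e^(i*pi) + 1 = 0.
# cmath.exp accepts complex arguments, so we can evaluate the
# left-hand side directly and confirm it vanishes to within
# floating-point rounding error.
value = cmath.exp(1j * cmath.pi) + 1

print(value)               # a residual on the order of 1e-16
print(abs(value) < 1e-12)  # True
```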
What brought the apocryphal story about Euler and Diderot to my mind was a pair of recent posts by Dr. Judith Curry, whom I've criticized here before. The first post seemed in some ways to finally be a response to my earlier queries about the no-feedbacks question - about which more below. But in the second she oddly chose to highlight 3 comments which claimed the whole thing was ill-defined, one of them chock-full of equations that seem to have blinded her and others to the fact that it made no more sense than Euler's apocryphal equation, ending with the claim that it's all nonsense:
... it is impossible to evaluate these 2 integrals because they necessitate the knowledge of the surface temperature field which is precisely the unknown we want to identify.
The parameter dTa/dFa is a nonsense
which is the sort of language that should remind my few regular readers of our friends Gerlich and Tscheuschner...
Ever since I can remember I've enjoyed putting together jigsaw puzzles. Even as an adult it's a fun diversion for me; of course there have been lots of opportunities to do simple ones with the kids over the years. Every once in a while since we've been married, Shelly and I have gotten out one of the tougher ones and done it together - ones with 3000 pieces, or unusual shapes, or other challenges. Each is different, to some degree unique: dominant colors may greatly help in finding the right piece for one puzzle, while textures, element size, and focus help with another; sometimes you just have to go by piece shape.
Joe Romm has an excellent perspective on the last year, since "Climategate" - focusing on the developments in the science of climate that make our situation only that much more alarming. The real story of "Climategate" is not the frank discussions between climate scientists revealed in stolen emails, and at least so far not the Watergate-like computer break-in whose perpetrators and sponsors have still not been revealed (though I am sure one day that will prove a very interesting story). As Romm emphasizes, the real tragedy of "Climategate" is the media circus that chased this shiny new conflict-driven nothing of a story when there were far, far more momentous issues regarding the reality of climate at hand. If even one of the 9 scientific claims of the past year reviewed by Romm holds up under further research - and in my judgment very likely at least 4 or 5 of these, possibly 7 or 8, are real - the future for my children will be a far less happy place than I had anticipated even just a year ago.
Andy Revkin's coverage of the climate email hack at the NY Times, for example this early Dot Earth post, was an unfortunate example of the herd mentality among journalists on the subject - I've gone back and forth myself on whether Revkin was to some extent responsible for leading the herd. It was around that time I decided his "Dot Earth" blog, which largely launched my interest in climate science, was just not worth my time any more. But even the usually science-friendly George Monbiot thought what was revealed by the emails was serious. Other than the possibly illegal freedom-of-information suppression request by a flustered Phil Jones (who I'd never heard of before), it was not, as Monbiot later confessed.
The strongest lingering widespread meme raised by "Climategate" seems to be along the lines of climate scientists being cliquish and "mean", saying nasty things about their critics. But all of science is like that "under the covers" - science is a relentlessly tough intellectual endeavor, and scientists don't waste their time being polite to people they see as wrong. I work for research journals and see communications between scientists criticizing one another's science day after day; a lot of it seems very harsh, hardly the dispassionate image we have of the objective scientist. I looked through a random sample of such commentary recently, selecting a few relatively generic comments (i.e. leaving out criticisms that were very specific to a particular piece of scientific work) and have posted them below - if the climategate emails seem overly harsh, well, we see just as bad day in, day out, around here!
It's hard to believe that Professor Judith Curry can spend so much time writing blog posts and not seem to have the time to make sense. I've not bothered to follow the drama in any detail; my earlier interaction with her proved rather pointless - she appeared to learn nothing from it, even repeating essentially the same provably false claims about the bare no-feedbacks response in this Scientific American profile.
Anyway, this is intended as a very brief post just to highlight some of the people who've tried to understand Dr. Curry in recent weeks, and found her claims completely without foundation, as I did in the no-feedbacks case. I strongly recommend Coby Beck's latest post getting to the essence of her conspiracy-theory mindset:
there is another plausible explanation for the formation of the IPCC, the rise in funding of climate science and the emergence of the very strong consensus that climate change is happening, human caused and going to get worse. That explanation is this: science revealed a potential problem for human society, society used its institutions to watch for and investigate this problem, honest research has found strong evidence that the problem is real and serious, and virtually all experts, using their best and sincere judgment, have advised the world that the problem is deadly serious.
No conspiracies, no ulterior motives, no malfeasance, just geeks doing science. I know it is not Hollywood material, but sometimes reality is just that dull.
Hal Lewis, emeritus physics professor at UCSB, has just published an open letter to the American Physical Society announcing his resignation, presumably as both a member and Fellow of the society. Lewis' complaint regards the way the society has treated the issue of climate change; he was the second signatory on this open letter published in Nature last year, so has certainly been known to have strong opinions on the matter.
Of course, I've written a bit here before on the genesis of the 2009 APS Council discussion and my thoughts on the "commentary" that was proposed as a way to satisfy some of the complaints.
On May 6th, 2010, climate skeptic Christopher Monckton testified before the US House Select Committee on Energy Independence and Global Warming. The hearing was supposed to be on The Foundations of Climate Science. Democrats on the committee invited 4 leading scientists from major US scientific institutions to testify; Monckton was the only witness on the Republican side. Committee membership is listed online and includes James Sensenbrenner of Wisconsin as the ranking member on the Republican side, who presumably was responsible for selecting Monckton. It's hard to believe the Republicans were even taking the hearing seriously with that choice. Nevertheless, many climate scientists and other observers were quite disturbed by the misinformation provided as testimony by Monckton to the hearing. A comprehensive rebuttal of Monckton's claims was recently put together by leading climate scientists, with sample comments from them including "very misleading", "profoundly wrong", "simply false", "chemical nonsense", and "cannot be supported by climate physics".
Unfortunately, this is far from the first time congressional Republicans have perpetrated this sort of stunt. An example with far worse implications is the story of the "Wegman Report", supposedly an analysis of the science underlying scientist Michael Mann's early work on reconstruction of historical temperatures, the so-called "hockey stick" (to the right), which showed that temperatures now are warmer than they have been for many centuries, and getting hotter very fast. Mann's results have been repeatedly reproduced with differing methods by himself and other researchers. A 2006 review by the prestigious National Academy of Sciences determined that, according to data and methods available at that time:
- It can be said with a high level of confidence that global mean surface temperature was higher during the last few decades of the 20th century than during any comparable period during the preceding four centuries. This statement is justified by the consistency of the evidence from a wide variety of geographically diverse proxies.
- Less confidence can be placed in large-scale surface temperature reconstructions for the period from A.D. 900 to 1600. Presently available proxy evidence indicates that temperatures at many, but not all, individual locations were higher during the past 25 years than during any period of comparable length since A.D. 900. The uncertainties associated with reconstructing hemispheric mean or global mean temperatures from these data increase substantially backward in time through this period and are not yet fully quantified.
- Very little confidence can be assigned to statements concerning the hemispheric mean or global mean surface temperature prior to about A.D. 900 because of sparse data coverage and because the uncertainties associated with proxy data and the methods used to analyze and combine them are larger than during more recent time periods.
This NAS report, chaired by Gerald North of Texas A&M University, was actually requested by a congressional Republican - Sherwood Boehlert of NY, who was at the time chair of the House Science Committee. He has since retired - one of the last sensible Republicans in congress.
But other congressional Republicans were not interested in what the scientists in the field had to say, and came up with an alternative report on the "hockey stick". Representatives Joe Barton and Ed Whitfield requested a separate report from statistician Edward Wegman, and had Wegman and colleagues testify to congress about the results. The Wegman report seemed to raise a lot of questions about the "hockey stick" and climate science in general. People who have examined the report in isolation may have been quite impressed by some of its claims - just as those lacking expertise might be impressed by Monckton's song and dance. Now, 4 years after its release, computer scientist John Mashey has done an in-depth study of the report and found a large number of indicators of shockingly low academic quality, and further indications that the report was actually tailored as part of a public relations campaign orchestrated by the usual anti-global-warming suspects (conservative and libertarian think tanks, sponsored by fossil fuel concerns).