Securing against the wrong risk

This week’s Economist includes an unusually poor article on security. It explains that the upcoming Swiss election will be using quantum cryptography to transmit the results from polling stations to central tabulation centres. It alleges that this makes the whole electoral process more secure. This is wrong.

What the article is essentially saying is that there would otherwise be a risk of the data being manipulated in transit. The chief polling officer at one station might send a set of figures that get altered by a malicious agent en route to the tabulation centre. Having an encrypted link prevents this man-in-the-middle attack. It does not prevent the polling officer from lying, or the person at the tabulation centre from manipulating the results they input into the counting machines. It doesn’t prevent ballot-stuffing, vote buying, or the compromise of the computer systems used to collect or tally votes. In short, it provides no security for the parts of the electoral process that are actually vulnerable to attack. In the absence of good security at those more vulnerable points, using quantum cryptography is like putting a padlock on a paper bag.

Hopefully, they will print my brief letter taking them to task for allowing themselves to be seduced by technology, rather than thinking sensibly about security.

[Update: 29 October 2007] Bruce Schneier has written about this. Unsurprisingly, he agrees that using quantum cryptography does not increase the security of the Swiss election.

Knowledge brokers get the Nobel

Meaghan Beattie and Milan Ilnyckyj

The hot news today is that the Intergovernmental Panel on Climate Change and Al Gore (though not Sheila Watt-Cloutier) have been awarded the Nobel Peace Prize. While some have questioned the appropriateness of awarding the prize on the basis of achievements not directly related to armed conflict, it does seem that the conflict potential connected with migration, water scarcity, and so forth makes this less of a stretch than some previous awards.

What is most notable about all this, for me, is that neither Gore nor the IPCC has actually contributed original research to climatic science. The IPCC exists to review the published academic literature on climatic science and agree upon a consensus position; Gore has acted as an effective advocate and representative, though his contribution has been far more in the area of information transmission than information generation.

What this shows is how vitally important the layer between scientists and policy-makers or the general public is. Scientists are looking (with great skill and detail) at the individual elements that make up the climatic system. Translating that into a comprehensive understanding of relationships and risks – of the sort that can guide policy development – is critical and challenging. As such, these Nobel prizes are well earned.

Hot Air

Meaghan Beattie and Tristan Laing

Hot Air: Meeting Canada’s Climate Change Challenge is a concise and virtually up-to-the-minute examination of Canadian climate change policy: past, present, and future. Jeffrey Simpson, Mark Jaccard, and Nic Rivers do a good job of laying out the technical and political issues involved and, while one cannot help taking issue with some aspects of their analysis, this book is definitely a good place to start when seeking to evaluate Canada’s climate options.

Emission pathways

Hot Air presents two possible emissions pathways: an aggressive scenario that cuts Canadian emissions from 750 Mt of CO2 equivalent in 2005 to about 400 Mt in 2050, and a less aggressive scenario that cuts them to about 600 Mt. For the sake of contrast, Canada’s Kyoto commitment (about which the authors are highly critical) is to cut Canadian emissions to 6% below 1990 levels by 2012, which would mean emissions of 563 Mt five years from now. The present government has promised to cut emissions to 20% below 2006 levels by 2020 (600 Mt) and by 60 to 70% by 2050 (225 to 300 Mt). George Monbiot’s extremely ambitious plan calls for a 90% reduction in greenhouse gas emissions by 2030 (75 Mt for Canada, though he is primarily writing about Britain).
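
To make the arithmetic behind these figures explicit, here is a minimal sketch that derives each target level from its stated base year and percentage cut. The 1990 base-year figure is a back-calculation from the 563 Mt Kyoto number rather than a value from the book; the 750 Mt base is the 2005 figure quoted above.

```python
# Back-of-the-envelope check of the emission targets quoted above.
# Base-year figures are approximate: ~599 Mt (1990) is inferred from the
# 563 Mt Kyoto target; 750 Mt is the book's 2005 figure.

def target(base_year_emissions_mt, percent_cut):
    """Emissions level (Mt CO2e) remaining after cutting the base level by percent_cut."""
    return base_year_emissions_mt * (1 - percent_cut / 100)

BASE_1990 = 599   # Mt CO2e, inferred
BASE_2005 = 750   # Mt CO2e, from the book

print(f"Kyoto (6% below 1990, by 2012):    {target(BASE_1990, 6):.0f} Mt")
print(f"Gov't (20% below 2006, by 2020):   {target(BASE_2005, 20):.0f} Mt")
print(f"Gov't (60-70% below, by 2050):     {target(BASE_2005, 70):.0f}-{target(BASE_2005, 60):.0f} Mt")
print(f"Monbiot (90% cut, by 2030):        {target(BASE_2005, 90):.0f} Mt")
```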

While Monbiot’s plan aims to reach stabilization by 2030, a much more conventional target date is around 2100. It is as though the book presents a five-decade plan to slow the rate at which water is leaking into the boat (greenhouse gases accumulating in the atmosphere), but doesn’t actually specify how to plug the hole before the boat sinks (greenhouse gas concentrations overwhelm the ability of human and natural systems to adapt). While having the hole half-plugged at a set date is a big improvement, a plan that focuses only on that phase seems to lack an ultimate purpose. While Hot Air does not continue its projections that far into the future, it is plausible that extending its policies for a further 50 years would achieve that outcome, though at an unknown stabilization concentration. (See this prior discussion.)

Policy prescriptions

Simpson, Jaccard, and Rivers envision the largest reductions being achieved through fuel switching (for instance, from coal to natural gas) and carbon capture and storage. Together, these account for well over 80% of the anticipated reductions in both scenarios, with energy efficiency improvements, agricultural changes, waste treatment changes, and other efforts making up the difference. As policy mechanisms, the authors support carbon pricing (through either a cap-and-trade scheme or the establishment of a carbon tax) as well as command-and-control measures including tightened mandatory efficiency standards for vehicles, renewable portfolio standards (requiring a larger proportion of energy to be renewable), carbon management standards (requiring a larger proportion of CO2 to be sequestered), and tougher building standards. They stress that information and subsidy programs are inadequate to create significant reductions in emissions. Instead, they explain that an eventual carbon price of $100 to $150 a tonne will make “zero-emissions technologies… frequently the most economic option for business and consumers.” This price would be reached by means of a gradual rise ($20 in 2015 and $60 in 2020), encouraging medium- and long-term investment in low-carbon technologies and capital.
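
The book gives only a few points on that price trajectory. As an illustration of the kind of schedule implied, the sketch below interpolates linearly between them; both the interpolation and the assumption that the $100 level is reached around 2030 are mine, not the authors’.

```python
# Illustrative carbon price schedule built from the figures quoted above.
# Linear interpolation is my assumption; the book does not specify the path,
# nor the year the $100-150 ceiling is reached (2030 below is a placeholder).

SCHEDULE = [(2015, 20), (2020, 60), (2030, 100)]  # (year, $ per tonne CO2e)

def carbon_price(year):
    """Interpolate a carbon price for the given year from the schedule."""
    if year <= SCHEDULE[0][0]:
        return SCHEDULE[0][1]
    if year >= SCHEDULE[-1][0]:
        return SCHEDULE[-1][1]
    for (y0, p0), (y1, p1) in zip(SCHEDULE, SCHEDULE[1:]):
        if y0 <= year <= y1:
            return p0 + (p1 - p0) * (year - y0) / (y1 - y0)

for y in (2015, 2018, 2020, 2025, 2030):
    print(y, f"${carbon_price(y):.0f}/tonne")
```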

Just 250 pages long, with very few references, Hot Air takes a decidedly journalistic approach. It is very optimistic about the viability and affordability of carbon capture and storage, as well as about the transition to zero emission automobiles. Air travel is completely ignored, while the potential of improved urban planning and public transportation is rather harshly derided. The plan described doesn’t extend beyond 2050 and doesn’t reach a level of Canadian emissions consistent with global stabilization of greenhouse gas concentrations (though it would put Canada on a good footing to achieve that by 2100). While the book’s overall level of detail may not satisfy the requirements of those who want extensive technical and scientific analysis, it is likely to serve admirably as an introduction for those bewildered by the whole ecosystem of past and present plans and concerned with understanding the future course of policy.

The true price of nuclear power

Maple leaf

Several times this blog has discussed whether climate change is making nuclear power a more acceptable option (1, 2, 3). One element of the debate that bears consideration is the legacy of contamination at sites that form part of the nuclear fuel cycle: from uranium mines to post-reactor fuel processing facilities. The Rocky Flats Plant in the United States is an especially sobering example.

Insiders at the plant started tipping off the FBI about unsafe conditions sometime in 1988. Late that year, the FBI began clandestinely flying light aircraft over the area and noticed that the incinerator was apparently being used late into the night. After several months of collecting evidence, both from workers and by direct measurement, they informed the DOE on June 6, 1989 that they wanted to meet about a potential terrorist threat. When the DOE officers arrived, they were served with papers. Simultaneously, the FBI raided the facilities and ordered everyone out. They found numerous violations of federal anti-pollution laws, including massive contamination of water and soil, though none of the original charges that led to the raid were substantiated.

In 1992, Rockwell was charged with minor environmental crimes and paid an $18.5 million fine.

Accidents and contamination have been a feature of facilities handling nuclear materials worldwide. Of course, this does not suffice to show that nuclear energy is a bad option. Coal mines certainly produce more than their share of industrial accidents and environmental contamination.

The trickiest thing, when it comes to evaluating the viability of nuclear power, is disentangling exactly what government subsidies exist now, have existed, and will exist in the future. These subsidies are both direct (paid straight to operators) and indirect (soft loans for construction, funding for research and development). They also include guarantees that the nuclear industry is only liable for a set amount of money in the event of a catastrophic accident, as well as the implicit cost that any contamination which corporations cannot legally be forced to correct will either fester or be fixed at taxpayer expense. Plenty of sources claim to have a comprehensive reckoning of these costs and risks, but the various analyses seem to be both contradictory and self-serving.

Before states make comprehensive plans to embrace or reject nuclear power as a climate change mitigation option, an extensive and impartial study of the calibre of the Stern Review would be wise.

Carbon pricing and local food

Ottawa Hostel Outdoor Club

I have been hearing a lot about food miles lately. While it is good for people to be aware of the productive processes that support them, I also have issues with the shape of the local food debate. Just because something is produced closer to where you live does not mean it is more ecologically friendly or less harmful to the climate. To take an extreme case: people living in very cold regions may find that it is far more environmentally sound to import food from warmer places than to grow it in greenhouses nearby. My focus here is on greenhouse gas emissions, but similar arguments can be made regarding water use, pesticides, etc.

What this debate really demonstrates is the lack of proper carbon pricing in the market. If CO2 emissions were included in the price of tomatoes or bananas, producers could choose to base production in whichever location is most efficient when carbon emissions (along with other factors) are taken into account. Until proper carbon pricing exists, there is justification for the intelligent application of the food miles concept. That said, I think the energy of environmentally aware people is much better spent advocating carbon taxes or cap-and-trade schemes than on encouraging people to spend their time buying local zucchini, rather than whatever sort is at the supermarket. By all means, attend farmers’ markets if you like them, but I think it is deluded to think they can make a meaningful contribution to reducing human emissions to 5,000 Mt of CO2 equivalent.
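
To make the mechanism concrete, here is a toy comparison of two ways of supplying a kilogram of tomatoes under a carbon price. Every number in it is hypothetical, chosen only to show how pricing emissions lets the lower-impact option win without anyone counting food miles.

```python
# Hypothetical comparison of two ways to supply a kilogram of tomatoes.
# All figures are invented for illustration; the point is only that a carbon
# price makes the full emissions of each option show up in its sticker price.

CARBON_PRICE = 100 / 1000  # $ per kg CO2e (i.e. $100 per tonne)

options = {
    # name: (production + transport emissions in kg CO2e per kg, pre-carbon price $ per kg)
    "local, heated greenhouse": (3.5, 3.00),
    "imported, field-grown":    (1.0, 2.50),
}

for name, (emissions, base_price) in options.items():
    total = base_price + emissions * CARBON_PRICE
    print(f"{name}: ${total:.2f}/kg ({emissions} kg CO2e/kg)")
```

Reverse the made-up emission figures and the local option would come out ahead instead; either way, the buyer never needs to know where the tomatoes came from.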

I can already feel the dissenting opinions coming for this post…

Jeffersonian trivia

Little known facts:

  1. Former American President Thomas Jefferson was an avid amateur palaeontologist.
  2. In an attempt to mock him, his political opponents gave him the nickname “Mr. Mammoth” during the 1808 election.
  3. He is credited with the discovery of an enormous ground sloth, larger than an elephant, that inhabited North America during the late Pleistocene.
  4. The creature now bears his name: Megalonyx jeffersonii.

These and many other entertaining facts come from the marvellous recent book The World Without Us, which has leapt to the top of my reading pile. I will post a full review when I finish it.

Sputnik at 50

Bridge on the Rideau Canal

Even the Google logo has been altered to commemorate the 50th anniversary of the launch of Sputnik 1: the first artificial satellite. As someone who spends a very considerable amount of time thinking about how things are going to be in 2050 and 2100, it is remarkable to reflect upon both how different the world is from that of 1957 and how similar it is. The big changes that did occur have often been in areas whose importance few, if any, people would have anticipated back then. Areas of great enthusiasm, such as nuclear power and space exploration, have only progressed incrementally since the 1950s and 60s.

I mentioned one Sputnik-related irony in a paper published back in 2005:

At the end of August, 1955, the Central Committee of the Communist Party approved the Soviet satellite program that would lead to Sputnik and authorized the construction of the Baikonour Cosmodrome. This facility, the largest of three Soviet launch sites that would eventually be built, was the launching place of Sputnik I (and subsequent Sputniks), and the launch site for all Soviet manned missions…

This former stretch of Kazakhstani desert was also, fatefully, the place to which Nikifor Nikitin was exiled by the Czar in 1830 for “making seditious speeches about flying to the moon.” He might have taken cold comfort in the fact that in 1955, the Central Committee gave control of the site to the new Soviet ‘Permanent Commission for Interplanetary Travel.’

For all the drama, it remains unclear to me that manned spaceflight serves any useful scientific or practical purpose at this point in time (see previous). In that sense, perhaps Sputnik – rather than John Glenn – was the true template for humanity’s future involvement in space: an 83.6 kg ball of metal with a radio transmitter.

PS. My thesis mentions one somewhat surprising connection between Sputnik and climatic science:

A fortuitous bit of funding produced one of the most famous graphs in the climate change literature: the one tracking CO2 concentrations at Mauna Loa in Hawaii. Examining it closely, a gap can be seen in 1957, where David Keeling’s funding for the project ran out. The Soviet launch of Sputnik I on 4 October 1957 led to a marked concern in the United States that American science and technology had fallen behind. One result of the subsequent surge in funding was the resumption of the CO2 recording program, which continues to the present day.

This graph is the jagged, upward-sloping line that Al Gore devotes so much attention to near the beginning of An Inconvenient Truth.

Vermont’s regulatory victory

Well known as a progressive place, Vermont seems to have recently struck a notable blow in the fight to develop regulatory structures to address climate change. A heated court case had developed between car manufacturers and the state government about whether the latter could impose tough emission limits on cars and light trucks. William Sessions, a federal judge, found in favour of the state’s right to do so. You can read the entire judgment here: PDF, Google Cache.

Among the arguments brought forward by the automakers (and rejected by Sessions) were claims that the regulations were unconstitutional, impossible to meet with existing technology, economically disastrous, ineffective, and anti-consumer. The case also involved a reasonably complex jurisdictional issue regarding California’s special exemption, which allows it to set stricter environmental standards than other states.

There seem to be a suspicious number of cases where industries have followed this trajectory in relation to new regulations: first saying that they are unnecessary, then that they would be financially ruinous, then quietly adapting to them with little fuss once they come into force. The phase-out of CFCs in response to the Montreal Protocol is an excellent example. This trend is explicitly recognized in the ruling:

Policy-makers have used the regulatory process to prompt automakers to develop and employ new, state-of-the-art technologies, more often than not over the industry’s objections. The introduction of catalytic converters in the 1970s is just one example. In each case the industry responded with technological advancements designed to meet the challenges…

On this issue, the automotive industry bears the burden of proving the regulations are beyond their ability to meet…

In light of the public statements of industry representatives, history of compliance with previous technological challenges, and the state of the record, the Court remains unconvinced automakers cannot meet the challenges of Vermont and California’s GHG regulations.

The fact that Chinese cars have to meet better emission standards than American ones strongly suggests that the objections of industry are bogus. Given the price inelasticity of demand for gasoline (people keep buying about the same amount when the price goes up), regulating fuel efficiency and emissions does seem like an efficient way to reduce GHG emissions in the transport sector.
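
A rough calculation shows why regulation can do more than a price signal when demand is inelastic. The elasticity value below is an illustrative assumption, not a figure from the ruling or from any particular study, and the comparison ignores any rebound effect from cheaper driving.

```python
# Illustrative comparison of a fuel price increase vs. an efficiency standard.
# The elasticity value is an assumption for illustration only.

PRICE_ELASTICITY = -0.25     # assumed short-run price elasticity of gasoline demand
price_increase = 0.20        # a 20% rise in the price of gasoline
efficiency_gain = 0.20       # a 20% improvement in fleet fuel economy

fuel_cut_from_price = -PRICE_ELASTICITY * price_increase       # ~5% less fuel bought
fuel_cut_from_standard = 1 - 1 / (1 + efficiency_gain)         # ~17% less fuel, same driving

print(f"20% price rise:          ~{fuel_cut_from_price:.0%} less fuel")
print(f"20% efficiency standard: ~{fuel_cut_from_standard:.0%} less fuel")
```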

The folly of Apollo redux

In an earlier post, I discussed the wastefulness of manned spaceflight. In particular, plans to return to the Moon or go to Mars cannot be justified in any sensible cost-benefit analysis. The cost is high, and the main benefit seems to be national prestige. Human spaceflight is essentially defended in a circular way: we need to undertake it so that we can learn how human beings function in space.

A post on Gristmill captures it well:

Let me be clear. There is a 0 percent chance that this Moon base or anything like it will ever be built, for the following reason: the moon missions in the ’60s and early ’70s cost something like $100 billion in today’s dollars. There is no way that setting up a semipermanent lunar base will be anything other than many times more expensive. That would put the total cost at one to a few trillion dollars.

Assuming that this taxpayer money needs to be lavished on big aerospace firms like Lockheed anyhow, it would be much better spent on satellites for the study of our planet (Some comprehensive temperature data for Antarctica, perhaps? Some RADAR analysis of the Greenland icecap? Some salaries for people studying climatic feedbacks?) or on robotic missions to objects of interest in the solar system.

HCFC phaseout

While international negotiations on climate change don’t seem to be going anywhere at the moment, some further tightening has been agreed within the regime that combats substances that deplete the ozone layer (the Vienna Convention and Montreal Protocol). The parties have decided to speed up the elimination of hydrochlorofluorocarbons (HCFCs), which were permitted as temporary substitutes for the chlorofluorocarbons (CFCs) that destroy ozone most energetically.

The BBC reports that:

The US administration says the new deal will be twice as effective as the Kyoto Protocol in controlling greenhouse gas emissions.

This seems quite implausible to me. HFCs, PFCs, and SF6 collectively contribute about 1% of anthropogenic warming. As such, their complete elimination would have a fairly limited effect. In addition, the Vienna Convention process always envisioned their elimination, so there is nothing substantially new about this announcement, other than the timing. An agreement for eliminating HCFCs has been in place since 1992:

1996 – production freeze
2004 – 35% reduction
2010 – 65% reduction
2015 – 90% reduction
2020 – 99.5% reduction
2030 – elimination

While it does seem that this timeline isn’t being followed, it remains to be seen whether this new announcement will have any effect on that.

The Kyoto Protocol targets six different greenhouse gases, most importantly the carbon dioxide that constitutes 77% of anthropogenic climate change. If it had succeeded in reducing emissions among Annex I signatories by 5.2%, as planned, it would have been both a significant contribution and an important starting point.
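
As a crude sanity check of the “twice as effective as Kyoto” claim, consider the back-of-the-envelope comparison below. It deliberately treats shares of warming and shares of emissions as roughly comparable, ignores gas lifetimes and global warming potentials, and assumes an Annex I share of global emissions; it is a plausibility test, nothing more.

```python
# Crude sanity check of the "twice as effective as Kyoto" claim.
# Shares of warming and shares of emissions are mixed here as a rough proxy;
# gas lifetimes and global warming potentials are ignored entirely.
# The Annex I share of global emissions is an illustrative assumption.

minor_gas_share = 0.01   # HFCs, PFCs and SF6: ~1% of anthropogenic warming (from the text)
annex1_share = 0.55      # assumed Annex I share of global GHG emissions
kyoto_cut = 0.052        # Kyoto's planned reduction for Annex I parties

best_case_new_deal = minor_gas_share      # even total elimination caps out here
kyoto_if_met = annex1_share * kyoto_cut   # roughly 2.9% of global emissions

print(f"Accelerated phase-out, generous upper bound: ~{best_case_new_deal:.1%}")
print(f"Kyoto targets, if met:                       ~{kyoto_if_met:.1%}")
print(f"For the claim to hold, Kyoto's effect would have to be below ~{best_case_new_deal / 2:.1%}")
```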

None of this is to say that we shouldn’t welcome the HCFC phaseout. If nothing else, it should help with the recovery of the ozone layer. We just need to be cautious about accepting statements like the one quoted.