Mechanism design theory

Window and shadows in Montreal

The 2001 Nobel Prize in Economics was awarded to George Akerlof, Michael Spence, and Joseph Stiglitz for their work on asymmetric information. One standard assumption in neoclassical economic models is that all participants in a transaction have ‘perfect information’ about the goods or services being exchanged. The field of information economics seeks to deepen such models, so that they can better reflect the kind of dynamics that exist in real markets.

Asymmetric information is a key factor in the functioning of real markets. When you buy a used car, the person at the lot probably knows more about it than you do. The salesperson knows more about used cars in general, may have spoken with the original seller, and may have investigated this specific car. Conversely, you know more about your health risks than your health insurer (provided you live somewhere where health insurance is private). You might know, for instance, that all your relatives die of heart attacks on their 35th birthdays and that you personally drink 3L of whisky per day.

This year’s Nobel Prize in Economics was awarded to Leonid Hurwicz, Eric S. Maskin, and Roger B. Myerson for their work on mechanism design theory. The basic purpose of the theory is to deal with problems like those of asymmetric information: take a situation where people would normally have an incentive to behave badly (lie, cheat, etc.) and establish rules that make it no longer in their interest to do so. We might, for instance, require used car salespeople to provide some sort of guarantee, or we might allow health insurers to void the policies of individuals who lie about their health when premiums are being set.

Reading about mechanism design feels a bit like watching engineers try to create religious commandments. This section from the Wikipedia entry illustrates what I mean.

Mechanism designers commonly try to achieve the following basic outcomes: truthfulness, individual rationality, budget balance, and social welfare. However, it is impossible to guarantee optimal results for all four outcomes simultaneously in many situations.

While it does seem a bit counterintuitive to try to achieve these things through economic means, doing so is probably more durable than simply drilling axioms into people’s heads. That is especially true when the counterparty people are dealing with is some distant corporation; those who would never cheat someone standing right in front of them are much more willing to deceive or exploit a distant and amorphous entity.
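For the curious, the classic illustration of a mechanism engineered for truthfulness is the Vickrey (second-price) auction: the highest bidder wins, but pays only the second-highest bid, which makes honestly reporting one’s valuation the best strategy regardless of what other bidders do. A minimal sketch, with invented bidders:

```python
# Minimal sketch of a Vickrey (second-price) sealed-bid auction.
# The winner pays the second-highest bid, so no bidder can gain by
# misreporting their true valuation -- the mechanism is 'truthful'.

def vickrey_auction(bids):
    """bids: dict mapping bidder name -> bid amount.
    Returns (winner, price_paid)."""
    if len(bids) < 2:
        raise ValueError("need at least two bidders")
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner = ranked[0][0]
    price = ranked[1][1]  # second-highest bid sets the price
    return winner, price

# A bidder whose true valuation is 100 does no better by bidding 120
# (risks overpaying) or 80 (risks losing a win that was profitable).
print(vickrey_auction({"alice": 100, "bob": 90, "carol": 70}))
# -> ('alice', 90)
```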

Cleaner coal

Coal is a witches’ brew of chemicals, including hydrocarbons, sulphur, and other elements and molecules. Burning it is a dirty business, producing toxic and carcinogenic emissions including arsenic, selenium, cyanide, nitrogen oxides, particulate matter, and volatile organic compounds. Coal plants also produce large amounts of carbon dioxide, thus contributing to climate change. That said, some coal plant designs can reduce both toxic and climatically relevant emissions to a considerable extent. Given concerns about energy security – coupled with the vast coal reserves in the United States, United Kingdom, China, and elsewhere – giving some serious thought to cleaner coal technology is sensible.

Integrated Gasification Combined Cycle (IGCC) plants are the best existing option for a number of reasons. Rather than burning coal directly, they use heat to convert it into syngas, which is then burned. Such plants can also produce syngas from heavy petroleum residues (think of the oil sands) or biomass. One advantage of this approach is that it simplifies the use of carbon capture and storage (CCS) technologies, which seek to bury carbon emissions in stable geological formations. This is because the carbon can be removed from the syngas prior to combustion, rather than having to be separated from hot flue gases before they go out the smokestack.

The problems with IGCC include a higher cost (perhaps $3,593 per kilowatt, compared with less than $1,290 for conventional coal) and lower reliability than simpler designs (this diagram reveals the complexity of IGCC systems). In the absence of effective carbon sequestration, such plants will also continue to emit very high levels of greenhouse gasses. If carbon pricing policies emerge in states that make extensive use of coal for energy, both of these problems may be reduced to some extent. In the first place, having to pay for carbon emissions would reduce the relative cost of lower-emissions technologies. In the second place, such pricing would induce the development and deployment of CCS.
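To see why carbon pricing changes the comparison, a rough back-of-the-envelope sketch helps. All the figures in it (generation costs per megawatt-hour, emissions intensities) are illustrative assumptions, not data from the sources above:

```python
# Back-of-the-envelope: how a carbon price narrows the cost gap between
# conventional coal and IGCC with carbon capture. All figures here are
# illustrative assumptions for the sketch, not data from the post.

def cost_per_mwh(base_cost, emissions_intensity, carbon_price):
    """base_cost: generation cost ($/MWh) before carbon pricing.
    emissions_intensity: tonnes of CO2 emitted per MWh generated.
    carbon_price: $ per tonne of CO2 emitted."""
    return base_cost + emissions_intensity * carbon_price

for carbon_price in (0, 20, 60, 100):
    conventional = cost_per_mwh(50, 0.95, carbon_price)  # ~1 t CO2/MWh
    igcc_ccs = cost_per_mwh(75, 0.15, carbon_price)      # most CO2 captured
    print(f"${carbon_price}/t: conventional ${conventional:.0f}/MWh, "
          f"IGCC+CCS ${igcc_ccs:.0f}/MWh")
```

Under these assumptions, IGCC with capture starts out more expensive but becomes the cheaper option somewhere between a $20 and $60 carbon price.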

One way or another, it will eventually be necessary to leave virtually all of the carbon that is currently trapped in coal in the ground, rather than letting it accumulate in the atmosphere. Whether that is done by leaving the coal itself underground or simply returning the carbon once the energy has been extracted is not necessarily a matter of huge environmental importance (though coal mining is a hazardous business that produces lots of contamination). That said, CCS remains a somewhat speculative and unproven technology. ‘Clean coal’ advocates will be on much stronger ground once a single electricity-generating, economically viable, carbon-sequestering power plant has been constructed.

Hot Air

Meaghan Beattie and Tristan Laing

Hot Air: Meeting Canada’s Climate Change Challenge is a concise and virtually up-to-the-minute examination of Canadian climate change policy: past, present, and future. Jeffrey Simpson, Mark Jaccard, and Nic Rivers do a good job of laying out the technical and political issues involved and, while one cannot help taking issue with some aspects of their analysis, this book is definitely a good place to start when seeking to evaluate Canada’s climate options.

Emission pathways

Hot Air presents two possible emissions pathways: an aggressive scenario that cuts Canadian emissions from 750 Mt of CO2 equivalent in 2005 to about 400 Mt in 2050, and a less aggressive scenario that cuts them to about 600 Mt. For the sake of contrast, Canada’s Kyoto commitment (about which the authors are highly critical) is to cut Canadian emissions to 6% below 1990 levels by 2012, which would mean emissions of 563 Mt five years from now. The present government has promised to cut emissions to 20% below 2006 levels by 2020 (600 Mt) and by 60 to 70% by 2050 (225 to 300 Mt). George Monbiot’s extremely ambitious plan calls for a 90% reduction in greenhouse gas emissions by 2030 (75 Mt for Canada, though he is primarily writing about Britain).
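For those who want to check the arithmetic, these targets can be computed directly from their baselines. A quick sketch (the roughly 599 Mt 1990 baseline is back-calculated from the 563 Mt Kyoto figure; the 2006 baseline is implied by the 600 Mt target):

```python
# The emissions targets mentioned above, computed from their baselines.
# Baselines are those cited (or implied) in the post.

emissions_2005 = 750         # Mt CO2e, from the post
baseline_1990 = 563 / 0.94   # ~599 Mt, inferred from the Kyoto figure
baseline_2006 = 750          # Mt, implied by the 600 Mt 2020 target

print("Kyoto (6% below 1990, by 2012):", round(baseline_1990 * 0.94), "Mt")
print("Government 2020 (20% below 2006):", round(baseline_2006 * 0.80), "Mt")
print("Government 2050 (60-70% below 2006):",
      round(baseline_2006 * 0.30), "-", round(baseline_2006 * 0.40), "Mt")
print("Monbiot 2030 (90% cut):", round(emissions_2005 * 0.10), "Mt")
```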

While Monbiot’s plan aims to reach stabilization by 2030, a much more conventional target date is around 2100. It is as though the book presents a five-decade plan to slow the rate at which water is leaking into the boat (greenhouse gasses accumulating in the atmosphere), but doesn’t actually specify how to plug the hole before the boat sinks (greenhouse gas concentrations overwhelm the ability of human and natural systems to adapt). While having the hole half-plugged by a set date is a big improvement, a plan that focuses only on that phase seems to lack an ultimate purpose. While Hot Air does not continue its projections that far into the future, it is plausible that extending its policies for a further 50 years would achieve that outcome, though at an unknown stabilization concentration. (See this prior discussion.)

Policy prescriptions

Simpson, Jaccard, and Rivers envision the largest reductions being achieved through fuel switching (for instance, from coal to natural gas) and carbon capture and storage. Together, these account for well over 80% of the anticipated reductions in both scenarios, with energy efficiency improvements, agricultural changes, waste treatment changes, and other efforts making up the difference. As policy mechanisms, the authors support carbon pricing (through either a cap-and-trade scheme or a carbon tax) as well as command-and-control measures: tightened mandatory efficiency standards for vehicles, renewable portfolio standards (requiring a larger proportion of energy to be renewable), carbon management standards (requiring a larger proportion of CO2 to be sequestered), and tougher building standards. They stress that information and subsidy programs are inadequate to create significant reductions in emissions. Instead, they argue that an eventual carbon price of $100 to $150 a tonne will make “zero-emissions technologies… frequently the most economic option for business and consumers.” This price would be reached by means of a gradual rise ($20 in 2015 and $60 in 2020), encouraging medium- and long-term investment in low-carbon technologies and capital.

Just 250 pages long, with very few references, Hot Air takes a decidedly journalistic approach. It is very optimistic about the viability and affordability of carbon capture and storage, as well as about the transition to zero emission automobiles. Air travel is completely ignored, while the potential of improved urban planning and public transportation is rather harshly derided. The plan described doesn’t extend beyond 2050 and doesn’t reach a level of Canadian emissions consistent with global stabilization of greenhouse gas concentrations (though it would put Canada on a good footing to achieve that by 2100). While the book’s overall level of detail may not satisfy the requirements of those who want extensive technical and scientific analysis, it is likely to serve admirably as an introduction for those bewildered by the whole ecosystem of past and present plans and concerned with understanding the future course of policy.

The true price of nuclear power

Maple leaf

Several times this blog has discussed whether climate change is making nuclear power a more acceptable option (1, 2, 3). One element of the debate that bears consideration is the legacy of contamination at sites that form part of the nuclear fuel cycle: from uranium mines to post-reactor fuel processing facilities. The Rocky Flats Plant in the United States is an especially sobering example.

Insiders at the plant started “tipping” the FBI about the unsafe conditions sometime in 1988. Late that year the FBI started clandestinely flying light aircraft over the area and noticed that the incinerator was apparently being used late into the night. After several months of collecting evidence both from workers and by direct measurement, they informed the DOE on June 6, 1989 that they wanted to meet about a potential terrorist threat. When the DOE officers arrived, they were served with papers. Simultaneously, the FBI raided the facilities and ordered everyone out. They found numerous violations of federal anti-pollution laws including massive contamination of water and soil, though none of the original charges that led to the raid were substantiated.

In 1992, Rockwell was charged with minor environmental crimes and paid an $18.5 million fine.

Accidents and contamination have been a feature of facilities handling nuclear materials worldwide. Of course, this does not suffice to show that nuclear energy is a bad option. Coal mines certainly produce more than their share of industrial accidents and environmental contamination.

The trickiest thing, when it comes to evaluating the viability of nuclear power, is disentangling exactly what sorts of governmental subsidies exist now, have existed, and will exist in the future. These subsidies are both direct (paid straight to operators) and indirect (soft loans for construction, funding for research and development). They also include guarantees that the nuclear industry is only responsible for a set amount of money in the event of a catastrophic accident, as well as the implicit cost that any contamination that corporations cannot be legally forced to correct after the fact will either fester or be fixed at taxpayer expense. Plenty of sources claim to have a comprehensive reckoning of these costs and risks, but the various analyses seem to be both contradictory and self-serving.

Before states make comprehensive plans to embrace or reject nuclear power as a climate change mitigation option, some kind of extensive and impartial study of the calibre of the Stern Review would be wise.

A banking analogy for climate

[Update: 22 January 2009] Some of the information in the post below is inaccurate. Namely, it implies that some level of continuous emissions is compatible with climate stabilization. In fact, stabilizing the climate requires humanity to have zero net emissions in the long term. For more about this, see this post.

Every day, new announcements are made about possible emission pathways (X% reduction below year A levels by year B, and so forth). A reasonable number of people, however, seem to be confused about the relationship between emissions, greenhouse gas concentrations, and climatic change. While describing the whole system would require a huge amount of writing, there is a metaphor that seems to help clarify things a bit.

Earth’s carbon bank account

Imagine the atmosphere is a bank account, denominated in megatonnes (Mt) of carbon dioxide equivalent. I realize things are already a bit tricky, but bear with me. A megatonne is just a million tonnes, or a billion kilograms. Carbon dioxide equivalent is a way of recognizing that different gasses produce different degrees of warming (by affecting how much of the energy arriving from the sun is radiated by the Earth back into space). You can think of this as being like different currencies. Tonne for tonne, methane produces more warming, so it is like British pounds compared to American dollars. CO2 equivalent is akin to expressing values in the ‘currencies’ of different gasses in terms of the most important one, CO2.
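For the technically inclined, the ‘currency conversion’ is done with global warming potential (GWP) factors. A small sketch, using the 100-year GWP values from the IPCC’s Fourth Assessment Report and invented emission quantities:

```python
# Converting emissions of different gasses into CO2 equivalent using
# 100-year global warming potentials (GWPs). GWP values follow the IPCC
# Fourth Assessment Report; the emission quantities are made up.

GWP_100 = {"CO2": 1, "CH4": 25, "N2O": 298}

def co2_equivalent(emissions):
    """emissions: dict of gas -> tonnes emitted. Returns tonnes of CO2e."""
    return sum(tonnes * GWP_100[gas] for gas, tonnes in emissions.items())

# One tonne of methane 'deposits' as much into the atmospheric bank
# account as 25 tonnes of CO2.
print(co2_equivalent({"CO2": 1000, "CH4": 10, "N2O": 1}))  # -> 1548
```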

Clearly, this is a bank account where more is not always better. With no greenhouse gasses (GHGs), the Earth would be far too cold to support life. Too many and all the ice melts, the forests burn, and things change profoundly. The present configuration of life on Earth depends upon the absence of radical changes in things like temperature, precipitation, air and water currents, and other climatic factors.

Assuming we want to keep the balance of the account more or less where it has been for the history of human civilization, we need to bring deposits into the account in line with withdrawals. Withdrawals occur when natural systems remove GHGs from the atmosphere. For instance, growing forests convert CO2 to wood, while single-celled sea creatures turn it into pellets that sink to the bottom of the ocean. One estimate for the total amount of carbon absorbed each year by natural systems is 5,000 Mt. This is the figure cited in the Stern Review. For comparison’s sake, Canadian emissions are about 750 Mt.

Biology and physics therefore ‘set the budget’ for us. If we want a stable bank balance, all of humanity can collectively deposit 5,000 Mt a year. This implies very deep cuts. How those are split up is an important ethical, political, and economic concern. Right now, Canada represents about 2% of global emissions. If we imagine a world that has reached stabilization, one possible allotment for Canada is 2%, or 100 Mt. That is much higher than a per-capita division would produce, but it would still require us to cut our present emissions by about 87%. If we only got our per-capita share (based on present Canadian and world populations), our allotment would be 24.5 Mt, about 3.2% of what we currently emit. Based on estimated Canadian and world populations in 2100, our share would be 15 Mt, or about 2% of present emissions.
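The arithmetic behind these allotments is simple enough to sketch. The population figures below are rough 2007-era assumptions (about 33 million Canadians out of 6.7 billion people):

```python
# Dividing up the planet's ~5,000 Mt annual absorptive capacity, as in
# the paragraph above. Population figures are rough assumptions.

GLOBAL_SINK = 5000    # Mt CO2e absorbed by natural systems per year
CANADA_NOW = 750      # Mt CO2e: current Canadian emissions

# Allotment equal to Canada's current ~2% share of global emissions:
share_allotment = 0.02 * GLOBAL_SINK                 # 100 Mt
cut_required = 1 - share_allotment / CANADA_NOW      # ~87% cut

# Per-capita allotment, based on present populations:
per_capita_allotment = GLOBAL_SINK * (33e6 / 6.7e9)  # ~25 Mt

print(f"2% share: {share_allotment:.0f} Mt (a {cut_required:.0%} cut)")
print(f"Per-capita share: {per_capita_allotment:.1f} Mt "
      f"({per_capita_allotment / CANADA_NOW:.1%} of current emissions)")
```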

Note: cutting emissions to these levels only achieves stabilization. The balance in the bank no longer changes year to year. What that balance is depends upon what happened in the years between the initial divergence between deposits and withdrawals and the time when that balance is restored. If we spend 100 years making big deposits, we are going to have a very hefty balance by the time that balance has stabilized.

Maintaining a balance similar to the one that has existed throughout the rise of human civilization seems prudent. Shifting to a balance far in excess of that carries considerable risks of massive global change, on the scale of ice ages and ice-free periods of baking heat.

On variable withdrawals

Remember the 5,000 Mt figure? That is based on the level of biological GHG withdrawal activity going on now. It is quite possible that climate change will alter the figure. For example, more CO2 in the air could make plants grow faster, increasing the amount withdrawn from the atmosphere each year. Alternatively, it is possible that a hotter world would make forests dry out, grow more slowly, and burn more. However the global rate of withdrawal changes, our rate of deposit would have to change as well, to maintain a stable atmospheric balance.

Here’s the nightmare possibility: instead of absorbing carbon, a world full of burning forests and melting permafrost starts to release it. Now, even cutting our emissions to zero will not stop the global atmospheric balance from rising. It would be akin to being in a speeding car with no control of the steering, acceleration, or brakes. We would just carry on forward until whatever terrain in front of us stopped the motion. This could lead to a planetary equilibrium dramatically unlike anything human beings have ever inhabited. There is a reasonable chance that such runaway climate change would make civilization based on mass agriculture impossible.

An important caveat

In the above discussion, greenhouse gasses were the focus. They are actually only indirectly involved in changes in global temperature. What is really critical is the planetary energy balance. This is, quite simply, the difference between the amount of energy that the Earth absorbs (almost exclusively from the sun) and the amount the Earth emits back into space.

Greenhouse gasses alter this balance because they absorb some of the infrared radiation the Earth emits, stopping it from escaping into space. The more of them in the atmosphere, the less energy the Earth sheds, and the hotter it becomes.

They are not, however, the only factor. Other important aspects include surface albedo, which is basically a measure of how shiny the planet is. Big bright ice-fields reflect lots of energy back into space; water and dark stone reflect much less. When ice melts, as it does in response to rising global temperatures, this induces further warming. This is one example of a climatic feedback, as are the vegetation dynamics mentioned previously.
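To make the albedo effect concrete, consider the textbook energy balance calculation: the planet’s effective emission temperature follows from equating absorbed sunlight with the energy radiated back to space under the Stefan-Boltzmann law. A sketch with standard values (note that it ignores the greenhouse effect entirely, which is why the result comes out well below the actual surface average of about 288 K):

```python
# Textbook planetary energy balance: absorbed solar energy equals energy
# radiated to space (Stefan-Boltzmann law), giving an effective emission
# temperature. No greenhouse effect is included, which is why the result
# is well below Earth's actual ~288 K surface average.

SOLAR_CONSTANT = 1361.0  # W/m^2 of sunlight at Earth's orbit
SIGMA = 5.67e-8          # Stefan-Boltzmann constant, W/m^2/K^4

def effective_temperature(albedo):
    """Temperature (K) at which outgoing radiation balances sunlight."""
    absorbed = SOLAR_CONSTANT * (1 - albedo) / 4  # /4: sphere vs. disc
    return (absorbed / SIGMA) ** 0.25

print(effective_temperature(0.30))  # ~255 K, Earth's present albedo
print(effective_temperature(0.28))  # slightly less shiny planet -> warmer
```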

In the long run, factors other than greenhouse gasses that affect the energy balance certainly need to be considered. In the near term, as is well demonstrated in the various reports of the IPCC, it is changes in atmospheric greenhouse gas concentrations that are the primary driver of changes in the energy balance. Things that alter the Earth’s energy balance are said to have a radiative forcing effect. (See page 4 of the Summary for Policymakers of the IPCC’s Fourth Assessment Report, Working Group I.)

What does it mean?

To get a stable atmospheric balance, we need to cut emissions (deposits) until they match withdrawals (what the planet absorbs). To keep our balance from getting much higher than it has ever been before, we need to do this relatively quickly, and on the basis of a coordinated global effort.

Liability and computer security

One of the major points of intersection between law and economics is liability. By setting the rules about who can sue brake manufacturers, in what circumstances, and to what extent, lawmakers help to set the incentives for quality control within that industry. By establishing what constitutes negligence in different areas, the law tries to balance efficiency (encouraging cost-effective mitigation on the part of whoever can do it most cheaply) with equity.

I wonder whether this could be used, to some extent, to combat the botnets that have helped to make the internet such a dangerous place. In brief, a botnet consists of ordinary computers that have been taken over by a virus. While they may not seem altered from the perspective of their users, they can be maliciously employed by remote control to send spam, attack websites, carry out illegal transactions, and so forth. There are millions of such computers, largely because so many unprotected PCs with incautious and ignorant users are constantly connected to broadband connections.

As it stands, there is some chance that an individual computer owner will face legal consequences if their machine is used maliciously in this way. It would be a lot more efficient to pass part of the responsibility to Internet Service Providers (ISPs). That is to say, ISPs whose networks transmit spam or viruses outwards could be sued by those harmed as a result. These firms have the staff, expertise, and network control to act. Given the right incentives, they could require users to run up-to-date antivirus software, which the ISPs would provide. They could also screen incoming and outgoing network traffic for viruses and botnet control signals. They could, in short, become more like the IT department at an office. ISPs with such obligations would then lean on the makers of software and operating systems, forcing them to build more secure products.

As Bruce Schneier has repeatedly argued, hoping to educate users as a means of creating overall security is probably doomed. People don’t have the interest or the incentives to learn, and the technology and threats change too quickly. To do a better job of combating those threats, our strategies should change as well.

Types of goods

In economic theory, most things you can buy are ‘normal goods’: people buy more of them as their incomes rise and, following the law of demand, less of them as the price rises. This is all quite intuitive, but it is interesting to note that there are other types of goods that behave differently.

The most common example may be inferior goods. The richer people get, the less they spend on inferior goods. This includes most kinds of discount items: once people can afford something better, they make the switch. Inferior goods reflect this property both at the micro level (an individual gets a big raise and buys less cheap IKEA furniture) and at a macro level (the mean income in a state rises and demand for low-cost gruel falls). Long distance bus trips are a classic example of an inferior good, as anyone who has spent more than twelve hours in a smelly, noisy coach can easily understand.

A somewhat perverse counterpoint to inferior goods can be found in Veblen goods. Named after the economist Thorstein Veblen, these are products for which demand actually rises as the price does. This is essentially on account of their exclusivity. People buy Veblen goods (such as Rolls-Royce cars and $50,000 cell phones) precisely to demonstrate that they can. Of course, this makes them a godsend for those hoping to part status-conscious rich suckers from some of their wealth.

A final possibility, which may not actually exist, is the Giffen good. To qualify, the good must be inferior (in the sense described above), there must be a lack of close substitutes, and the good must comprise a significant share of the purchaser’s budget. With these goods, price rises also lead to people buying more, though for a rather different reason: when the price of a staple rises, poor households can no longer afford better options, and fall back on buying even more of the staple (the classic hypothetical is bread or potatoes, with meat as the abandoned alternative). The failure of economists to find any well-defended empirical examples suggests that this kind of good may exist only in the minds of academics.

Both Giffen goods and Veblen goods exist because of characteristics of the buyer, rather than of the good itself. Whereas Giffen goods are easy to reconcile with ‘rationality’ as understood by economists, Veblen goods can be reconciled with it only when they are viewed as inputs in the manufacture of the commodity actually sought: social status or prestige.
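Put another way, these categories come down to the signs of two demand responses: how quantity demanded changes as price rises, and how it changes as income rises. A sketch of the resulting classification, under the textbook definitions used above:

```python
# Classifying a good by the signs of its demand responses, following the
# textbook definitions above. 'price_response' is the change in quantity
# demanded when the price rises; 'income_response' is the change when
# the buyer's income rises.

def classify_good(price_response, income_response):
    if price_response > 0 and income_response < 0:
        return "Giffen (inferior, yet bought more as the price rises)"
    if price_response > 0:
        return "Veblen (exclusivity makes higher prices attractive)"
    if income_response < 0:
        return "inferior (abandoned as buyers get richer)"
    return "normal"

print(classify_good(-1, +1))  # most goods
print(classify_good(-1, -1))  # long distance bus trips
print(classify_good(+1, +1))  # $50,000 cell phones
print(classify_good(+1, -1))  # the elusive Giffen good
```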

People wanting to read even more about goods and economic theory can look into the distinction between rivalrous and non-rivalrous goods and between excludable and non-excludable goods. Together, the two ideas define public goods and common property goods, the existence of which makes even the most hard-nosed economist recognize the efficiency of governmental action to regulate markets.

Not so jolly: the economics of gift giving

Victoria Island, Ottawa

As anyone who has ever been disappointed by what they found under the wrapping paper knows, gift-giving can lead to the misallocation of resources. Gift givers misjudge the value a particular thing will have for the recipient, and thus devote more resources to the purchase than the recipient would. Joel Waldfogel, writing in The American Economic Review back in 1993, discussed this and other related economic issues in a notorious article called “The Deadweight Loss of Christmas.” (Available through JSTOR and Google Scholar)

Imperfect knowledge and non-ideal choices

The paper includes the gloomy conclusion that “gift giving destroys between 10 percent and a third of the value of gifts.” On this basis, the paper estimates that the deadweight loss of holiday giving in the United States in 1992 was between $4 billion and $13 billion. The article does note one possible saving grace: when recipients are ill informed about the existence of things they might enjoy, a gift can be worth more than a transfer of the equivalent quantity of cash. Of course, providing the cash and the information would achieve the same effect, without the risk that the gift will differ from what the recipient would have done with the money themselves.

Gifts from friends and significant others are most efficient (largely because such givers know the preferences of the recipient best), while “noncash gifts from members of the extended family” are most likely to be valued by the recipient at less than their cost of purchase. Recipients value gifts from friends at 98.8% of their purchase cost, while those from significant others are valued at 91.7%. Parents and siblings give gifts worth 85% of their cost, while aunts and uncles manage only 64.4% and 62.9%, respectively. These conclusions were reached largely on the basis of surveys given to Yale undergraduates (favourite targets for psychological and economic experiments). Waldfogel notes that a social stigma can exist against giving cash gifts, but it is weakest where aunts, uncles, and grandparents are concerned – not coincidentally, the least effective choosers of gifts.
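Waldfogel’s yield figures translate straightforwardly into a deadweight loss estimate. A sketch using the percentages quoted above (the dollar amounts spent per giver category are invented for illustration):

```python
# Deadweight loss of gift giving, using the yield figures quoted above
# from Waldfogel's survey. The dollar amounts spent per giver category
# are invented for illustration.

yields = {                      # recipient's value as a share of cost
    "friends": 0.988,
    "significant others": 0.917,
    "parents and siblings": 0.850,
    "aunts": 0.644,
    "uncles": 0.629,
}
spending = {"friends": 100, "significant others": 150,
            "parents and siblings": 200, "aunts": 50, "uncles": 50}

loss = sum(spending[g] * (1 - yields[g]) for g in yields)
total = sum(spending.values())
print(f"Deadweight loss: ${loss:.2f} of ${total} spent ({loss/total:.1%})")
```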

The thought counts

I have a more wide-ranging response of my own. Thankfully, there is a phenomenon that partially offsets the losses from imperfect gift choice: the extent to which the very status of something as a gift increases its value in the eyes of the recipient. I can think of scores of cases where a product or service that would not have been particularly gratifying if purchased for myself was especially welcome and meaningful when received from someone else. In many cases, this creates utility significantly greater than what could be achieved through personal spending of an equivalent sum.

I was reminded of all this when I saw Waldfogel’s article mentioned on Marginal Revolution, an interesting economics blog.

A (very) partial response to David Suzuki

Last night, I saw David Suzuki speak at a conference on health and the environment. To my surprise, I was far from impressed with most of what he said. He essentially presented a false binary: conspicuous consumption on the one hand, or the preservation of pristine nature on the other. While I certainly acknowledge that a lot of consumption is unnecessary, that doesn’t mean that all sacrifices are of the same moral variety as his choosing not to fly to Australia.

The view that pesticides should not be used in farming was broadly echoed at the conference. No doubt, pesticides can be abused, and there is a human and ecological cost associated with employing them. That said, it hardly seems that we can take a message of pesticide abandonment to a world of six billion people, in which one and a half billion live in extreme poverty. Calling for an end to economic growth means something rather different in Canada than it does in Brazil or Bangladesh or Bolivia. Likewise, not everyone in Canadian society can switch to more ecological (and expensive) options while making only trivial sacrifices.

As a public relations figure, Suzuki obviously has to simplify his messages and present things in a form that is fairly easily repeated and absorbed. That said, the parks-versus-SUVs form of environmentalism doesn’t have much chance of being relevant outside the thinking of a privileged global elite.

Public broadcasters and the web

The existence of the internet changes the economic logic of public broadcasting. Where the BBC was once a collection of channels, each showing one piece of its vast archive at a time, much of that archive is now online. That creates a huge database of materials, paid for by taxpayers, and ideally free to be accessed without copyright concerns. Being able to view documentaries like Dangerous Knowledge on demand is a notable benefit, and one that private sector content producers cannot be expected to provide, since societal benefits that do not show up in their profits do not concern them.

If all the world’s national broadcasters and other public generators of knowledge were to open up their libraries comprehensively, it could make the internet an even more valuable thing than it already is. Unfortunately, that process seems likely to be piecemeal and marked by setbacks. Witness the BBC iPlayer dispute.