Reading fiction aloud

Saint Catherine’s College, Oxford

I attended a sustainability forum in Wadham tonight, followed by a fancy dinner. I even got to see a well-situated and previously unexplored room in college. Much more enjoyable, however, was spending a couple of hours later in the night reading aloud from Stanislaw Lem’s Mortal Engines, Simon Singh’s Fermat’s Last Theorem, Vladimir Nabokov’s Lolita, Jack Kerouac’s On the Road, and chapters 2-47 of Mark Haddon’s The Curious Incident of the Dog in the Night-Time.

I really love fiction, and quite enjoy reading aloud. With unfamiliar text, it can be quite challenging, even in the best of circumstances. You need to develop an intuition for the shape of an author’s phrases, so that you can start speaking the first portion without having read the end. Perhaps that explains why I appreciate Nabokov so much and never enjoyed Faulkner. I don’t think you could read the latter aloud, except in halting steps where an entire sentence was decoded before the first syllable was uttered.

Making a hash of things

The following is the article I submitted as part of my application for the Richard Casement internship at The Economist. My hope was to demonstrate an ability to deal with a very technical subject in a comprehensible way. This post will be automatically published once the contest has closed in all time zones.

Cryptography
Making a hash of things

Oxford
A contest to replace a workhorse of computer security is announced

While Julius Caesar hoped to prevent the hostile interception of his orders through the use of a simple cipher, modern cryptography has far more applications. One of the key drivers behind that versatility is an important but little-known tool called a hash function. These are algorithms that take a particular collection of data and generate a smaller ‘fingerprint’ from it. That fingerprint can later be used to verify the integrity of the data in question, which could be anything from a password to digital photographs collected at a crime scene. Hash functions are used to protect against accidental changes to data, such as those caused by file corruption, as well as intentional efforts at fraud. Cryptographer and security expert Bruce Schneier calls hash functions “the workhorse of cryptography” and explains: “Every time you do something with security on the internet, a hash function is involved somewhere.” As techniques for digital manipulation become more accessible and sophisticated, the importance of such verification tools becomes greater. At the same time, the emergence of a significant threat to the most commonly used hashing algorithm in existence has prompted a search for a more secure replacement.

Hash functions transform data subject to two conditions: that it be impossible to work backward from the transformed or ‘hashed’ version to the original, and that multiple originals not produce the same hashed output. As with standard cryptography (in which unencrypted text is passed through an algorithm to generate encrypted text, and vice versa), the standard of ‘impossibility’ is really one of impracticability, given available computing resources and the sensitivity of the data in question. The hashed ‘fingerprint’ can be compared with a file and, if they still correspond, the integrity of the file is affirmed. Likewise, computer systems that store hashed versions of passwords do not risk yielding all user passwords in plain text if the files containing them are accidentally exposed or maliciously infiltrated. When users enter passwords to be authenticated, the passwords can be hashed and compared with the stored versions, without the unencrypted forms ever needing to be stored. Given the frequency of ‘insider’ attacks within organizations, such precautions benefit both the users and the owners of the systems in question.

Given this wide range of uses, the integrity of hash functions has become important for many industries and applications. For instance, they are used to verify the integrity of software security updates distributed automatically over the Internet. If malicious users were able to modify a file in a way that did not change the ‘fingerprint,’ as verified through a common algorithm, it could open the door to various kinds of attack. Alternatively, malicious users who could work backward from hashed data to the original form could compromise systems in other ways. They could, for instance, gain access to the unencrypted form of all the passwords in a large database. Since most people use the same password for several applications, such an attack could lead to further breaches. The SHA-1 algorithm, which has been widely used since 1995, was significantly compromised in February 2005 by a team led by Xiaoyun Wang and based primarily at China’s Shandong University. The team had previously demonstrated attacks against MD5 and SHA, hash functions that preceded SHA-1. Their success has prompted calls for a more durable replacement.

The need for such a replacement has now led the U.S. National Institute of Standards and Technology to initiate a contest to devise a successor. The competition is to begin in the fall of 2008 and continue until 2011. Such contests have a promising history in cryptography. Notably, the Advanced Encryption Standard, devised as a more secure replacement for the prior Data Encryption Standard, was chosen by means of an open competition among fifteen teams of cryptographers between 1997 and 2000. At least some of those disappointed in that contest are now hard at work on what they hope will become one of the standard hash functions of the future.
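For the technically inclined, here is a minimal sketch of the two uses the article describes, written in Python with the standard-library hashlib module. The function names, example strings, and the choice of SHA-256 are mine, purely for illustration; a real system would use whichever algorithm its designers selected.

    import hashlib

    def fingerprint(data: bytes) -> str:
        """Return the hexadecimal hash ('fingerprint') of some data."""
        return hashlib.sha256(data).hexdigest()

    # 1. Integrity checking: recompute the fingerprint later and compare.
    update = b"software security update, version 1.2"
    published = fingerprint(update)
    assert fingerprint(update) == published                  # file unchanged
    assert fingerprint(update + b"tampering") != published   # file altered

    # 2. Password storage: keep only the hashed form, never the plain text.
    #    (Real systems also mix in a random per-user 'salt', omitted here.)
    stored_hash = fingerprint(b"my secret password")

    def authenticate(attempt: bytes) -> bool:
        return fingerprint(attempt) == stored_hash

    assert authenticate(b"my secret password")
    assert not authenticate(b"wrong guess")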

Somewhat harried

Jesus College, Oxford

Largely because of the thesis, I have fallen behind in what are normally two everyday activities: reading fiction and keeping up with my correspondence. The latter is the more embarrassing lapse, as messages of various kinds accumulate. Still, if I am to have a chapter written by Wednesday, I need to keep cracking. This one will lay out the early history of both climate change and POP research, so it involves quite a few finicky details that need to be related to the major themes and questions of the work. It also requires a broader appreciation of the roles played by different individuals and organizations. This has been fairly easy to untangle for the Stockholm Convention, despite its relative obscurity, but the sheer scale of the discussions about climate change makes it quite hard to gain a rigorous sense of who was acting most effectively and why. I am still searching for a really good history of the UNFCCC, IPCC, and Kyoto processes.

PS. Many thanks to Meaghan and Lindi for the postcards. I will reply soon, possibly with postcards from Wales.

Go

Anyone who has ever been curious about the game of Go should try this interactive tutorial. Wikibooks also has an introduction, though it does not seem to have been fully written yet. The game is an attractive-looking and tricky one, employed as a dramatic device in the film A Beautiful Mind. Notably, it is also a game in which the best human players can consistently beat very powerful computers. Unlike in chess, it would appear, sheer number-crunching ability is not enough to succeed at Go.

Go is normally played on a grid of 19×19 intersecting lines, and its objectives are to capture enemy stones while also surrounding territory. Players take turns placing stones on the board, in any position except one where the stone would be immediately captured. Stones or groups of stones that are encircled, such that no empty paths or ‘liberties’ extend from them, are captured. A player also may not make a move that would return the board to how it was immediately before their opponent’s last move (the ‘ko’ rule). The rules according to which stones are placed are very simple, making it initially surprising that the complexity of the game can be so great. The game ends when both players pass their turn, indicating that neither sees a possibility for further gains.
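To make the capture rule a little more concrete, here is a rough sketch in Python: it counts the ‘liberties’ of the group containing a given stone, using a board representation and names of my own invention. It illustrates that single rule, and is not an implementation of the game.

    def liberties(board, row, col):
        """Count the empty points ('liberties') adjacent to the group at (row, col)."""
        colour = board[row][col]
        size = len(board)
        seen, frontier, libs = set(), [(row, col)], set()
        while frontier:
            r, c = frontier.pop()
            if (r, c) in seen:
                continue
            seen.add((r, c))
            for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
                if 0 <= nr < size and 0 <= nc < size:
                    if board[nr][nc] == ".":         # empty point: a liberty
                        libs.add((nr, nc))
                    elif board[nr][nc] == colour:    # same colour: same group
                        frontier.append((nr, nc))
        return len(libs)

    # A 5x5 corner position: the white stone at (0, 0) is hemmed in by
    # black stones and has no liberties left, so it would be captured.
    board = [list("WB..."),
             list("B...."),
             list("....."),
             list("....."),
             list(".....")]
    print(liberties(board, 0, 0))   # prints 0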

For a beginner, playing on a 13×13 board is recommended. The standard-sized board has more than twice as many intersections to contest, and is probably too much for someone without a developed sense of the game to manage. Learning how to play decently is one project that I will need to suspend until more pressing tasks are complete.

Nicholas Stern on climate change

Saint Edmund’s Hall, Oxford

During the initial coverage of Nicholas Stern’s report on the economics of climate change, I wondered why the media was paying so much attention. After all, the man is an economist reporting on something that scores of scientists have addressed comprehensively through the IPCC process. Now that I have heard him lecture, and spoken briefly with him personally, I have a much better sense. The man is what Karen Litfin calls a ‘knowledge broker,’ translating scientific data into policy options.

His basic position is the realistic liberal optimist one:

  1. Climate change is real and potentially devastating
  2. It is essentially a massive economic externality
  3. Regulating greenhouse gas (GHG) emissions is the way to stop it
  4. This can be done at moderate cost (1% of GDP) and without a massive change in (a) the basis of economic activity within the developed world or (b) the way in which people choose to live their lives.

He acknowledges that the energy balance needs to shift dramatically. In order to be responsible, he says, we need to shift all electrical production in the rich world to carbon neutral forms (renewables, nuclear, and possibly hydrocarbons with sequestration) by 2050. By that time, land transport should also be based on power sources that do not emit GHGs, whether because they are using stored electricity, or because they use fuels that are GHG neutral. India and China need to be encouraged to sequester the CO2 emitted from their coal stations, probably at the expense of the rich world. All in all, rich states should bear 60-80% of the costs of mitigation.

He focused a great deal on atmospheric CO2 levels. His target is to stabilize between 450ppm and 550ppm. This would lead to a likely scenario in which mean global temperature rises by about 2 degrees Celsius (though by much more at the poles, given the nature of the climatic system). On the basis of a ‘business as usual’ projection, we will hit 450ppm in eight to ten years. To stabilize at 450ppm, we would need to slow the rate of growth in GHG emissions immediately, having them peak in 2010, and then reduce emissions by about 6-10% a year thereafter. If we delayed the peak to 2020, we would likely end up at the 550ppm end of the range: a prospect about which the German head of climate change policy expressed grave concern during the question session. Stern himself said that 550ppm is the “absolute upper bound” which it would be “outrageous” to exceed.

As for his very controversial decision about discounting rates, I think he defended himself admirably. He broke the question into two parts: the possibility that there will be no future generations beyond some date (a 0.1% chance per year was ascribed to an event, such as a comet impact or gamma ray burst, that would simply snuff humanity out) and the strong likelihood that people in the future will be richer. The latter means that it may be economically efficient to delay some of the costs of dealing with climate change, especially given the probability that new technology will emerge.
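For those who want the reasoning in symbols: what Stern described corresponds to the standard Ramsey formula for the social discount rate,

    r = \delta + \eta g

where \delta is the pure rate of time preference (reflecting the roughly 0.1% annual chance that humanity simply does not survive), \eta is the elasticity of the marginal utility of consumption, and g is the expected growth rate of per-capita consumption (the sense in which future people will be richer). The parameter values usually cited from the published review, \eta = 1 and g of about 1.3% a year, give a discount rate of roughly 1.4% a year; I am supplying those figures from the review itself rather than from the lecture.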

I need to move on to other work, though I could discuss his comments for many thousands of words. I will transfer my handwritten notes to the wiki later this evening and link them here: notes from Nicholas Stern’s 21 February 2007 address to Oxford University.

PS. A few weeks ago, my default thesis music was Jason Mraz’s superb album “Live at Java Joe.” Now, I am listening to Enter The Haggis’ frantic song “Lannigan’s Ball” from their album Aerials over and over again.

Michaelmas 2006 supervision report

Today, I received Dr. Hurrell’s assessment of my performance in the first term of this year:

He is taking the IR of the Developing World paper this term and tells me that he is enjoying it and that it is going well. He also gave a presentation to the MPhil thesis seminar. He is making good progress with the MPhil thesis: the core question is getting narrowed down and he certainly has a range of incisive and very interesting ideas. He should have the two overview chapters of his two case studies by early in the New Year. The task in the new phase is to relate the general issues in the argument as far as possible to specific details of the cases – rather than back to more general issues. I would also note that he has continued to face quite severe financial problems resulting from the fact that he has received less in the way of student loans than he had expected.

A good assessment, all told. The last bit is probably meant to signal that he had said helpful things to the university and college bursary committees when they approached him. Next week, they will decide whether to help cover my student loan shortfall.

My first substantive chapter is to be finished by next Wednesday. There is really very little slack in the system now. The second substantive chapter will be due by the 15th of March, with the third due at the end of March. That will be the last opportunity to discuss anything with Dr. Hurrell. Then, I run off to Dorset for a week of frantic editing (possibly with no internet access – gasp!). Then, I will have the remainder of April prior to the 22nd to finish editing, have the thing printed and bound, and collapse in a heap, quite possibly driven to madness by the stress of the whole thing.

My tutorial reports for last year were blogged previously: Michaelmas, Hilary, and Trinity. Having put all this online in the most searchable and comprehensible way possible, I hope it will (a) help at least one person decide whether or not to pursue the M.Phil in International Relations here and/or (b) help people later in the program navigate their way through it.

Scribble, scribble and fancy dinners

Brasenose College, Oxford

This is going to be a busy week. I have one project due on Thursday, a thesis chapter due next Wednesday, and a great many smaller things to get through besides. Rumour has it that I have some kind of international law seminar ongoing, as well. This week, we are discussing: “International Law-Making: Treaties, Custom and Beyond.” This is also going to be my second week at Oxford that involves a trio of formal dinners: the Strategic Studies Group dinner in New College tonight, my Senior Scholarship high table dinner in Wadham on Wednesday, and the Wadham College Stahl dinner on Friday. The last of those apparently commemorates a rich benefactor of the college and involves fellows and a handful of students. Hopefully, I will see some of the people who I have met once or twice over the course of the year, but very rarely see week-to-week.

With the entire thesis due – printed, bound, and dropped off – in nine weeks’ time, I don’t see subsequent weeks being any less busy. At least there is the four-day exception of the Snowdonia trip to look forward to, not to mention the monastic thesis completion retreat to Dorset at the beginning of April.

Back to reading and scribbling…

[Update: 21 February 2007] Since no fellows are dining tonight or tomorrow, it seems this will be the first week this year where I don’t get a dinner in Wadham as a Senior Scholar. It will be a shame to break my thirteen week run of them, but I suppose the Stahl dinner on Friday is a good substitute.

PS. I have decided on a topic for my Richard Casement internship application. I just need to edit it, as well as come up with proper Economist opening and closing sections. Ideally, you want to open the article with something interesting, esoteric, but seemingly unrelated and then close it with a further clarification that reveals the analogy in a witty way.

The road to Kyoto plus, lessons from ozone

A lot of people seem to despair about the possibility of effective regulation of greenhouse gas emissions around the world, but the more I read about the cases of persistent organic pollutants and CFCs, the more plausible it seems, provided that a few specific and important progressions take place.

The first is the process of scaling upwards in policy levels, as seen very distinctly with CFCs. The Rowland and Molina paper that first suggested that CFCs cause stratospheric ozone degradation was published in 1974. By 1975, two US states had already banned their use as aerosol propellants (Oregon and New York). Hopefully, the progression from there to national and international regulation is one that can be emulated. Already, lots of American cities and states have signaled that they are serious about climate change, and willing to use regulation to combat it.

The second important dynamic has to do with industry expectations. Six years before CFCs were regulated internationally, DuPont – the largest manufacturer – canceled its program for developing alternatives. When it became clear that regulation was forthcoming, the company was able to field some alternatives within six months, and a comprehensive range within a few years. Up to the point where regulation seemed inevitable, it continued to claim that alternatives could not easily be developed. The point here is twofold. First, it shows that the existence of solutions to environmental problems is not independent of regulation and of industry expectations about future regulation. Secondly, industries that anticipate national legislation (as American industry began to in the mid-1980s on the CFC issue) become a powerful lobby pushing government towards completing an international agreement. It is far worse for American industry to be put at a disadvantage because domestic rules are tougher than global ones than it is simply to comply with new rules that apply to everyone.

Thus, an American administration that takes up the baton from the many states that have initiated their own efforts to deal with climate change might be able to create the same kind of expectations in industry. Some firms are already asking for regulation to “guide the market”: specifically, decisions about which technologies and forms of capital to invest in. From there, it is at least possible that the US could play a key role in negotiating a successor treaty to Kyoto that begins the process of stabilizing and reducing greenhouse gas emissions.

A related point has to do with the extent to which environmental politics is heavily influenced by images and symbols. According to Karen Litfin, the Antarctic ozone hole was one of the major factors that led to the Montreal Protocol. She calls it an ‘anomaly,’ unpredicted by the atmospheric science that had been done up to that point, and thus capable of making scientists and politicians more aware of the possibility of unanticipated risks.

At his talk yesterday, Henry Shue said he is hoping for some iconic moment in climate change to play a similar galvanizing role (a bare-topped Kilimanjaro, the Larsen B collapse, drowning polar bears, and Hurricane Katrina don’t seem to have done it yet, though the connection between climate change and the last of those is not entirely established). Some spectacular and distressing (but hopefully non-lethal) demonstration of the profound effects human greenhouse gas emissions are having may be necessary to generate an urgent and powerful drive towards effective responses.

Studio photography on the (very) cheap

Antonia Mansel-Long, bounce-lit

Something useful learned tonight: using standard-height white ceilings, a glossy white St Antony’s College laundry card, and the on-camera flash on a Canon PowerShot A510 digital camera, you can pull off some tolerable bounce-lit flash photography. A hand-held mirror is even better, though I would recommend using a relatively matte ceiling with that arrangement. The flash is only really adequate for this role in the wide-angle range, due to its low power rating, but bouncing does make it dramatically less unflattering, through the dual benefit of eliminating completely overexposed white patches and removing the unnatural shadows that arise from a flash too close to the lens.

Attempts to make diffusers out of Sainsbury’s receipts, onion-skin paper, and other miscellaneous translucent materials were less successful. I look forward to eventually having a proper off-camera flash with a diffuser, not to mention the chance to do some real studio work. If only this pesky thesis wasn’t getting in the way of various hobbies.

Coffee, sandwiches, and bibliographies: the blocks from which theses are made

Hertford College, Oxford

There was a talk in Corpus Christi today that was a kind of grad student slam dunk. Organized by Cinnamon Carlane and given by Henry Shue, the talk was on the ethics of climate change. Firstly, it involved free sandwiches (fully 2/3 of which were vegetarian). Secondly, as with most of Professor Shue’s talks, it involved the distribution of a comprehensive bibliography. With a thesis upcoming, you can never have too many articles of assuredly high quality to include in your discussion and, perhaps more importantly, your bibliography. Thirdly, the room was packed with people interested in environmental politics: an elusive variety of student, spread across every program and department, who only come together under unusual circumstances.

Shue’s moral argument is, of course, very well thought out and compelling. The biggest flaw, I think, is that he is not focused enough on the policy course that would be required to deal with climate change effectively, and the secondary moral phenomena that arise from that. That said, being able to make a strong foundational case that climate change is a problem upon which we are morally obligated to act may be an important step in the generation of the requisite level of political will.

Those interested in this stuff will probably appreciate knowing that Professor Sir Nicholas Stern is talking about his report on the economics of climate change in the exam schools, this Wednesday at 5:00pm.