Kessler Syndrome

When an atom of uranium undergoes fission in the right circumstances, it produces multiple neutrons which in turn induce fission in other uranium atoms, creating a chain reaction. Today, due to a lack of caution on the part of governments, there is a risk of something similar happening to satellites in orbit around the Earth. When satellites collide or are blown up, they produce large quantities of fast-moving debris, which can in turn cause additional satellites to disintegrate.

The nightmare scenario is one resembling a nuclear chain reaction, in which a small number of initial collisions produce debris, additional collisions, and more debris in an escalating cycle, until certain orbits are no longer safe or usable. This scenario is called Kessler Syndrome, a possibility first identified in 1978 by NASA scientist Donald J. Kessler.
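These dynamics can be illustrated with a toy model. The sketch below is purely illustrative – every parameter in it is invented for demonstration rather than drawn from orbital-debris research – but it captures the key feature: below a critical debris density the population decays away, while above it collisions feed on themselves:

    # Toy model of a debris cascade. All parameters are invented for
    # illustration; none come from real orbital-debris data.
    def simulate(fragments, years=200):
        collision_rate = 1e-9      # collisions per fragment^2 per year (assumed)
        new_per_collision = 100.0  # fragments created by each collision (assumed)
        decay_fraction = 0.02      # share of fragments deorbiting per year (assumed)
        for year in range(years):
            collisions = collision_rate * fragments ** 2
            fragments += new_per_collision * collisions - decay_fraction * fragments
            if fragments > 1e8:    # treat this as a runaway cascade
                return f"runaway cascade by year {year}"
        return f"{fragments:,.0f} fragments after {years} years"

    # The critical density here is decay_fraction / (collision_rate *
    # new_per_collision) = 200,000 fragments; below it, orbital decay wins.
    print(simulate(100_000))   # population shrinks steadily
    print(simulate(300_000))   # crosses into runaway growth within decades

With these made-up numbers, the smaller population dwindles while the larger one explodes – a crude picture of why debris density, not just the number of collisions so far, is what determines the risk.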

The risk of this outcome can be reduced by improving procedures in the future. For instance, satellites can be designed so that they fall out of orbit once they cease to be useful. Financial incentives could also be put in place to encourage operators to dispose of satellites responsibly at the end of their working lives – for instance, a bond that is paid when the satellite is launched and refunded when it burns up in the atmosphere. If a satellite is simply abandoned in orbit, the bond money can be put into an insurance fund to compensate the owners of any satellite it collides with.

It may also be possible to deal with some of the existing junk in space, using a variety of methods such as lasers, the after-the-fact attachment of small disposal rockets to existing satellites, or possibly the use of some kind of membrane to catch small pieces of debris.

Non-nuclear EMP

Several fictional portrayals have drawn attention to the possibility of an electromagnetic pulse (EMP) being used as a weapon, capable of disabling or destroying electronic equipment over a wide area. Such pulses can be created by detonating nuclear weapons at high altitude, though doing so in a war would provoke international outrage. To get around that, the United States and possibly others have developed non-nuclear EMP generators:

One such weapon uses a small charge of explosive to ram an armature down the axis of a current-carrying coil, squeezing its magnetic field so violently in the process that it emits a powerful burst of electromagnetic energy over distances of several hundred metres. Another type employs a Marx generator (a machine used for simulating lightning strikes) to dump a large electrical charge stored in a bank of capacitors into a specially shaped antenna.
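To get a feel for the flux-compression idea, consider some rough numbers. Everything in this sketch is assumed for illustration and not taken from any real device; the point is only the scaling:

    # Back-of-envelope flux compression. All values are assumed.
    B_initial = 5.0       # seed field from the capacitor bank, tesla (assumed)
    area_initial = 0.05   # initial coil cross-section, m^2 (assumed)
    area_final = 0.0005   # cross-section after explosive compression, m^2 (assumed)

    # Flux (B * A) is approximately conserved as the armature squeezes the
    # coil, so the field scales with 1/A, and magnetic energy density
    # (B^2 / 2*mu0) scales with the square of that.
    B_final = B_initial * area_initial / area_final
    energy_density_gain = (B_final / B_initial) ** 2

    print(f"field: {B_initial:.0f} T -> {B_final:.0f} T")         # 5 T -> 500 T
    print(f"energy density gain: {energy_density_gain:,.0f}x")    # 10,000x

In this made-up example, a hundredfold reduction in area yields a hundredfold increase in field strength and a ten-thousandfold increase in energy density – which is why such generators can release so much electromagnetic energy so abruptly.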

American defence forces have converted a number of cruise missiles to function as non-nuclear EMP generators. Apparently, cars parked up to 300 metres away have had their alternators, ignition coils and engine controls disabled this way. Such e-weapons are said to have been used in Kosovo, the Persian Gulf and Afghanistan.

Intriguingly, a pair of such devices has recently broken cover. The Counter-Electronics High-Power Microwave Advanced Missile Project (CHAMP) is an unmanned aircraft fitted with a microwave pulse generator—presumably for disrupting enemy communications. The Pentagon has also announced that it is deploying an electromagnetic weapon, believed to be called Max Power, for detonating roadside bombs and disabling enemy vehicles. Both CHAMP and Max Power mimic the electromagnetic pulse of a nuclear explosion—albeit over a narrowly focused area and without the geomagnetic effect.

Such weapons could be useful for reducing civilian casualties in war, particularly in situations where military targets are located in civilian areas. For example, if a state put an air defence radar station in a residential area, an EMP weapon could disable it at less risk to the civilian population than conventional munitions would pose.

Apparently, electromagnetic pulses can also be used to punch holes through steel for industrial purposes.

Bedbugs proliferating

I have had one nasty personal experience with these fast-spreading bloodsuckers, and hope to never have another. Alas, that may be an unrealistic hope, given how they are spreading all over the world. According to the BBC, the last big outbreak happened before World War II: “[i]n the 1930s there were large swaths of London where every house was infested.” Eradication with DDT after 1946 pushed that outbreak back, but such pesticides are restricted now because of their health and environmental effects.

Apparently, bedbugs have also grown resistant to DDT, so bringing it back probably wouldn’t help address the current problem. The pesticides currently used against bedbugs may likewise be losing effectiveness as resistance spreads, and increased domestic and international travel may also be contributing.

Personally, I have taken to adopting a few precautions:

  • When staying in hostels and hotels, I check for the fecal spots, moults, and blood smears they leave behind – signs that are especially evident when an infestation is severe (as it is at the Sous Bois Hostel in Montreal).
  • Keeping luggage off the floor and away from upholstered furniture is also a good idea.
  • When I found that I had stayed somewhere with bedbugs, I put everything I had with me through either a high-temperature wash or three weeks of sub-zero temperatures.
  • I will no longer purchase or accept used furniture.

Thankfully, these horrible creatures don’t seem to spread disease. They are revolting, however, and extremely expensive and difficult to eradicate. As such, it pays to be cautious.

Deployability of nuclear weapons

Being able to build a device that can produce a nuclear explosion is a significant challenge in itself. Also challenging is building such a device in a self-contained way which does not require difficult last-minute assembly, and which can be stored in a usable state for years. The first American bombs certainly did not meet this standard.

Captain William Parsons, a U.S. Navy weapons expert with the 509th Composite Group (the B-29 unit that dropped the atomic bombs on Japan during WWII), described the complex and hazardous operation in a letter intended to convince his superiors that dummy devices were required for practice runs:

It is believed fair to compare the assembly of the gun gadget [the uranium bomb] to the normal field assembly of a torpedo, as far as mechanical tests are involved… The case of the implosion gadget [the plutonium bomb] is very different, and is believed comparable in complexity to rebuilding an airplane in the field. Even this does not fully express the difficulty, since much of the assembly involves bare blocks of high explosives and, in all probability, will end with the securing in position of at least thirty-two boosters and detonators, and then connecting these to firing circuits, including special coaxial cables and high voltage condenser circuit… I believe that anyone familiar with advance base operations… would agree that this is the most complex and involved operation which has ever been attempted outside of a confined laboratory and ammunition depot.

Rhodes, Richard. The Making of the Atomic Bomb. p.590 (paperback)

The reason the bomb had to be so substantially assembled right before use probably had to do with the initiator – a sub-component at the very centre of the bomb, designed to produce a handful of neutrons at the critical moment to initiate fission. At the same time, it was essential that the initiator not produce even a single neutron before the bomb was to be used.

In early American bombs, initiators apparently consisted of the alpha-particle emitter polonium-210 (half-life 138.4 days) sandwiched between metal foils to keep it from reacting prematurely with the nearby beryllium. When the high-explosive shell wrapped around the implosion bomb’s natural uranium tamper and plutonium core detonated, the components of the initiator would mix and react, producing neutrons at the same moment the explosives were producing compression.
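The short half-life quoted above also helps explain why such a bomb could not simply be stored fully assembled for years: the initiator’s neutron-producing potential fades within months. A quick decay calculation – a sketch using only the half-life figure from the text – makes the point:

    HALF_LIFE_DAYS = 138.4   # polonium-210 half-life

    def fraction_remaining(days):
        # Standard exponential decay: N(t) = N0 * 2^(-t / half-life)
        return 2 ** (-days / HALF_LIFE_DAYS)

    for days in (30, 138, 276, 365):
        print(f"after {days:3d} days: {fraction_remaining(days):.0%} of the Po-210 remains")

After a year on the shelf, only about a sixth of the original polonium-210 is left, so initiators – and the bombs built around them – would have needed regular servicing or replacement.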

Details on initiators are still classified, so we can only speculate on how the implosion primaries in modern bombs function.

The whole issue of deployability is relevant to questions of nuclear proliferation insofar as it is more difficult to make a stable, battlefield-usable bomb than to make a device capable of generating a nuclear explosion. That being said, many of the technical details of bomb manufacture have been made available to states contemplating the development of nuclear weapons. That has partly been the product of clandestine activities like the operation of the A.Q. Khan proliferation network. It has also been the consequence of states being insufficiently cautious when it comes to safeguarding knowledge, materials, and equipment.

Reforming the IPCC

Alternative title: What to do when everybody ignores you?

In the wake of the University of East Anglia email scandal, there has been yet another review of the work of the Intergovernmental Panel on Climate Change (IPCC). This one was chaired by Harold Shapiro, a Princeton University professor, and concluded that “[t]he U.N. climate panel should only make predictions when it has solid evidence and should avoid policy advocacy.”

The IPCC has certainly made some mistakes: issuing some untrue statements and evaluating some evidence imperfectly. That being said, the details it got wrong were largely minor. The core claims of the IPCC reports – that climate change is real, caused by humans, and dangerous – remain well justified. The trouble is that governments aren’t willing to take action on anything like the appropriate scale.

The situation is akin to a doctor giving a patient a diagnosis of cancer, after which the patient decides that he will try to cut down on his consumption of sugary drinks. That might improve the patient’s health a bit, but it is not an adequate response to the problem described. At that point, it would be sensible for the doctor to engage in a bit of ‘policy advocacy’ and stress how the proposed solution is dangerously inadequate.

It can be argued that the IPCC works best when it presents the bare facts and leaves others to make policy decisions. The trouble is that people don’t take the considered opinions of this huge group of scientists sufficiently seriously. They are happy to let crackpots tell them that there is no problem or that no action needs to be taken. While scientists should not be saying: “Here is what your government’s climate change policy should be,” they should definitely be saying: “Here are the plausible consequences of the policy you are pursuing now, and they don’t match the outcomes you say you want to achieve (like avoiding over 2°C of temperature increase).” They could also very legitimately say: “If you want to avoid handing a transformed world over to future generations, here is the minimum that must be done.” James Hansen accomplishes this task rather well:

Today we are faced with the need to achieve rapid reductions in global fossil fuel emissions and to nearly phase out fossil fuel emissions by the middle of the century. Most governments are saying that they recognize these imperatives. And they say that they will meet these objectives with a Kyoto-like approach. Ladies and gentlemen, your governments are lying through their teeth. You may wish to use softer language, but the truth is that they know that their planned approach will not come anywhere near achieving the intended global objectives. Moreover, they are now taking actions that, if we do not stop them, will lock in guaranteed failure to achieve the targets that they have nominally accepted.

Scientists don’t lose their integrity when they present scientific information in a way that policy-makers and citizens can understand. Indeed, it can be argued that they show a lack of integrity when they hide behind technical language that keeps people from grasping the implications of science.

Climate and the timing of emissions

Climatologist James Hansen emphatically argues that cumulative emissions are what really matter – how much warming the planet experiences depends on what proportion of the world’s fossil fuels get burned.

One reason for this is the long lifetime of CO2 in the atmosphere: a substantial fraction of any pulse of emissions remains even after thousands of years. That being said, the model simulation I have seen shows concentrations dropping sharply at first and then tapering off with time.
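That shape – a sharp initial drop followed by a very long tail – can be sketched with a sum-of-exponentials impulse response. The coefficients below are my own illustrative assumptions, loosely patterned on published carbon-cycle fits rather than taken from any specific model:

    import math

    # (fraction of the pulse, e-folding time in years); the first term is
    # effectively permanent on human timescales. All values are assumed.
    TERMS = [(0.22, float("inf")), (0.26, 300.0), (0.34, 35.0), (0.18, 4.0)]

    def airborne_fraction(years):
        # Fraction of an emitted CO2 pulse still airborne after `years`
        return sum(f if math.isinf(tau) else f * math.exp(-years / tau)
                   for f, tau in TERMS)

    for t in (0, 10, 50, 100, 500, 1000):
        print(f"after {t:4d} years: {airborne_fraction(t):.0%} of the pulse remains")

With these assumed coefficients, roughly half of an emitted pulse is gone within a century, but about a fifth is still airborne after a thousand years – the sharp drop and long taper just described.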

It seems like it would be helpful to put together that chart with this one, showing historical and expected CO2 concentration increases:

[Chart: Atmospheric concentration of CO2]

A combined chart on the same scale would illustrate what would happen to CO2 concentrations if we stopped emitting at some point soon, specifically what the next few decades would look like.

It seems at least logically possible that the timing of emissions could matter. Imagine, for instance, that crossing a certain concentration threshold was what really mattered. If so, spreading out human emissions so that ocean absorption of CO2 kept the concentration below that cap could be quite beneficial.
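A toy one-box experiment makes the idea concrete. Everything in this sketch is invented for illustration: two pathways emit the same cumulative total, but one burns it quickly while the other spreads it out, and a fixed share of the excess concentration is absorbed each year:

    def peak_excess(annual_emissions, uptake=0.01):
        # Peak excess CO2 (arbitrary units); `uptake` is the assumed share
        # of the excess absorbed by oceans and other sinks each year.
        excess = peak = 0.0
        for e in annual_emissions:
            excess = (excess + e) * (1 - uptake)
            peak = max(peak, excess)
        return peak

    TOTAL = 1000.0
    fast = [TOTAL / 50] * 50 + [0.0] * 150   # burn everything in 50 years
    slow = [TOTAL / 200] * 200               # spread it over 200 years

    print(f"fast pathway peak excess: {peak_excess(fast):.0f}")   # ~780
    print(f"slow pathway peak excess: {peak_excess(slow):.0f}")   # ~430

In this made-up setup, the spread-out pathway peaks roughly 45 per cent lower despite identical cumulative emissions – exactly the kind of difference that would matter if a threshold were critical, and that wouldn’t matter if only the cumulative total did.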

It seems an important question to sort out, given that the whole BuryCoal project is focused on limiting total human emissions rather than trying to space them out.

The Pleasure of Finding Things Out

Probably the most problematic thing about writing associated with Richard Feynman is repetition. Both his books and books about him tend to be at least quasi-biographical, and often feature the same stories, examples, explanations, and even bits of writing.

The Pleasure of Finding Things Out certainly suffers from this flaw, at least for those who have read one or two Feynman books before. It includes, for instance, his appendix to the Challenger inquiry report, which also formed a major part of What Do You Care What Other People Think? It also features Feynman’s thoughts on ‘cargo cult science’, which have been reproduced elsewhere.

All that said, the book does contain some interesting materials that do not seem to be widely available elsewhere, particularly on the subject of nanotechnology. Going back to first principles, Feynman considers what lower size limits exist for things like motors, computer processors, and data storage systems. He concludes that there is ‘plenty of room at the bottom’ and thus enormous scope for improving our capabilities in computing and other fields by relying upon very small machinery and techniques like self-assembly.
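The data storage argument in particular lends itself to a back-of-envelope check. The numbers below are my own assumptions, in the spirit of Feynman’s estimates rather than quoting them:

    # How many bits fit in a one-centimetre cube if each bit is stored in
    # a 5 x 5 x 5 block of atoms? All parameters are assumed.
    atom_spacing_m = 0.25e-9   # rough interatomic spacing (assumed)
    atoms_per_bit = 5 ** 3     # 125 atoms per bit (assumed)
    cube_side_m = 0.01         # one centimetre

    atoms_per_side = cube_side_m / atom_spacing_m   # 4e7 atoms along an edge
    total_atoms = atoms_per_side ** 3               # 6.4e22 atoms in the cube
    bits = total_atoms / atoms_per_bit

    print(f"about {bits:.1e} bits (~{bits / 8 / 1e18:.0f} exabytes)")

Even at a generous 125 atoms per bit, a sugar-cube-sized volume holds on the order of tens of exabytes – vastly more than any conventional storage medium, which is the sense in which there is ‘plenty of room at the bottom’.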

Torpedoes, Pearl Harbor, and the atomic bomb

One of the most interesting things about Richard Rhodes’ detailed history of the making of the atomic bomb is the way it gives the reader a better sense of context. This is especially true when it comes to things happening in very different places and spheres of life. It would take an unusual facility with dates, for instance, to realize how the timeline of research into the abstract physical questions about the nature of atoms lined up with political, economic, and military developments.

One grim but interesting example comes from the end of Chapter 12. In November 1941, Franklin Delano Roosevelt had just committed the United States to the serious pursuit of an atomic bomb based upon enriched uranium (U-235), and three methods for producing the substance were to be attempted: gaseous diffusion, electromagnetic separation, and centrifuges (the approach Iran is using now). On December 7th of that year, the Japanese Navy attacked the American base at Pearl Harbor.

Rhodes describes how Japanese research into atomic weapons began with the personal research of Takeo Yasuda – director of the Aviation Technology Research Institute of the Imperial Japanese Army – in 1938, and expanded into a full report on the possible consequences of nuclear fission in April 1940. Rhodes also recounts a somewhat grim coincidence involving Japan, the United States, and atomic weapons: ordinary torpedoes would not have worked for the Pearl Harbor attack, because the water was insufficiently deep. As such, the torpedoes used had to be modified with a stabilizer fin and produced in sufficient quantity for the pre-emptive strike to be successful:

Only thirty of the modified weapons could be promised by October 15, another fifty by the end of the month and the last hundred on November 30, after the task force was scheduled to sail.

The manufacturer did better. Realizing the weapons were vital to a secret program of unprecedented importance, manager Yukiro Fukuda bent company rules, drove his lathe and assembly crews overtime and delivered the last of the 180 specially modified torpedoes by November 17. Mitsubishi Munitions contributed decisively to the success of the first massive surprise blow of the Pacific War by the patriotic effort of its torpedo factory in Kyushu, the southernmost Japanese island, three miles up the Urakami River from the bay in the old port city of Nagasaki. (p.393 paperback)

That attack – launched partly in response to the American embargo of aviation fuel, steel, and iron going into Japan – sank, capsized, or damaged eight battleships, three light cruisers, three destroyers, and four other ships. The two waves also destroyed or damaged 292 aircraft, killed 2,403 Americans, and wounded another 1,178. More than 1,000 people were killed in the sinking of the U.S.S. Arizona alone.

CO2 is plant food

One of the many things that fall into the category of ‘things that climate change deniers say that are true, but deeply misleading’ is the claim that carbon dioxide (CO2) is ‘plant food’ and thus beneficial to the planet.

This video does a nice job of smashing that argument.

Ironically, in the very long term, life on Earth actually is imperiled by the possibility of insufficient CO2 – though only on a timescale of a billion years or so, not one that human beings need to worry about now.

Climate change ‘winners’

Today’s Globe and Mail makes a good point about the ongoing Russian heatwave and wildfires, namely that they are a partial counter to the argument that northern countries like Russia and Canada would benefit from a warmer climate:

Russia’s summer heat wave has dimmed prospects that northern countries will “win” from climate change thanks to factors such as longer crop-growing seasons or fewer deaths from winter cold, experts say.

Canada, Nordic countries and Russia have been portrayed as among a lucky few chilly nations where moderate climate change will mean net benefits such as lower winter heating bills, more forest and crop growth and perhaps more summer tourism.

“It’s not a matter of a benign shift to a longer growing season” for northern nations, said [Kevin Trenberth, head of climate analysis at the U.S. National Center for Atmospheric Research in Boulder, Colorado]. Russia’s heat wave doubled death rates in Moscow, wrecked a quarter of Russia’s grain crop and may cut $14-billion from gross domestic product.

It is certainly odd to see climate change deniers who – in the course of the same speech or article – will claim that climate change isn’t happening at all, that it is perfectly natural, that it is actually going to be beneficial, and that it is all China’s fault for building too many coal plants.

The fact is, all of our infrastructure was designed for the kind of climatic conditions in which human civilization emerged. While it is certainly likely that a few people will benefit from climate change, for the most part it will mean that roads, buildings, agricultural systems, and so on are increasingly poorly suited to the areas where they are situated.

I wrote before about climate change and Australian bushfires.