Hydrogen and AAs

Steel bridge struts

At a party this weekend, I had a conversation with someone who believed that the energy needs of the future would be solved by hydrogen. Not hydrogen as the input for nuclear fusion, but hydrogen as a feedstock for fuel cells and combustion engines. It’s not entirely surprising that some people believe this. For years, car companies have been spouting off about hydrogen-powered vehicles that will produce only water vapour as emissions. The Chevron game mentioned earlier lets you install ‘hydrogen’ electricity-generating capacity. The oversight, of course, is that hydrogen is just an energy carrier. You might as well say that the energy source of the future will be AA batteries.

AA batteries are obviously useful things. They offer a portable 1.5 volt source of electricity that can drive all manner of gadgetry, but they are hardly an energy system unto themselves. The chemicals inside them that create their electrical potential had to be extracted, processed, and combined into a usable form, and that process inevitably required more energy than the finished batteries contain. The loss of potential energy is a good trade-off, because we get usable and portable power, but there is no sense in which we can say that AA batteries are an energy system.

A similar trade-off may well eventually be made with hydrogen. We may break down hydrocarbons, sequester the CO2 produced in that process, and use the hydrogen generated as fuel for cars. Alternatively, we might use gobs of electricity to electrolyse water into hydrogen and oxygen. Then, we just need to find a way to store a decent amount of hydrogen safely in a tank small, durable, and affordable enough to put in vehicles; build fleets of vehicles with affordable fuel cells or hydrogen-powered internal combustion engines; and develop an infrastructure to distribute hydrogen to all those vehicles.
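To see why the electrolysis route is no free lunch, a back-of-envelope sketch helps. Hydrogen’s higher heating value of roughly 39.4 kWh per kilogram is a standard figure; the electrolyser and fuel cell efficiencies below are illustrative assumptions in a plausible range, not data from any real system:

```python
# Back-of-envelope sketch of hydrogen's round-trip energy losses.
# Efficiency figures are illustrative assumptions, not measured values.

HHV_H2 = 39.4            # kWh per kg of hydrogen (higher heating value)
ELECTROLYSER_EFF = 0.70  # assumed electrolyser efficiency
FUEL_CELL_EFF = 0.55     # assumed fuel cell efficiency

electricity_in = HHV_H2 / ELECTROLYSER_EFF   # kWh needed to make 1 kg of H2
electricity_out = HHV_H2 * FUEL_CELL_EFF     # kWh recovered in the vehicle
round_trip = electricity_out / electricity_in

print(f"Electricity in:  {electricity_in:.1f} kWh/kg")
print(f"Electricity out: {electricity_out:.1f} kWh/kg")
print(f"Round trip:      {round_trip:.1%}")
```

Under these assumptions, well over half the electricity fed into the system never reaches the wheels, which is the basic reason rechargeable batteries look attractive by comparison.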

When you think about it, hydrogen seems less like a solution in itself, and more like the possible end-point of solving a number of prior problems. As far as ground vehicles go, it seems a safer bet to concentrate on improvements to rechargeable battery technology.

Betting on a long shot

Civilization Museum and Parliament

While it is unwise to place too much hope in unproven technologies like carbon capture and sequestration or nuclear fusion as mechanisms to address climate change, there is also a good case to be made for expanded research and development in promising areas. As such, it is more than a bit regrettable that Canada withdrew participation from the largest international fusion research effort back in 2003. It may be a long shot and it may take fifty years or more to reach the point of commercial deployment, but fusion does seem to be one possible long-term option.

In addition to providing electrical power, fusion plants could also be used to produce hydrogen for vehicles by means of electrolysis. Depending on their ultimate ability to scale production up and down, they could also be important for peak power management. Even if we accept that 50 years may be an ambitious period for fusion technology to mature, it is possible that the first commercial fusion plants could be coming online just as coal plants built today are reaching the end of their lives.

Betting on a long shot isn’t always a bad idea – especially when it is one strategy among many alternatives.

New ideas in genetics

Adobe building, Ottawa

The high school biology version of genetics we all learned seems to be faring increasingly poorly, though that is no real surprise. The first complete genome of a single individual was sequenced recently. It belongs to J. Craig Venter, founder of Celera Genomics: the private firm that competed with the Human Genome Project to map the human genome first. The earlier draft genomes from both groups were assembled from genetic material drawn from multiple subjects, using mathematical tools that may have underplayed the level of genetic diversity that exists in human DNA.

Meanwhile, RNA is getting a lot more attention.

Some half-related earlier posts: the Global Ocean Sampling Expedition and the Human Microbiome Project.

Shrimponomics

Ashley Thorvaldson and Marc Gurstein

Here is an interesting blog post analyzing theories about why people are eating more shrimp than was previously the case. In short, people without training in economics seem to focus more on the demand side than people with such training.

One response that surprised me was “a rise in the number of vegetarians who will eat shrimp.” Now, if you are a vegetarian because you think it is wrong to kill cows and chickens for food, eating shrimp may be consistent with that position. If you are a vegetarian for general reasons of ecological sustainability, it is much harder to justify. As fisheries go, shrimp is one of the worst when it comes to bycatch. The UN Food and Agriculture Organization says that the present shrimp catch is at least 50% above the maximum sustainable level. Shrimp also tend to be caught by bottom trawling, a process in which heavy steel rollers are dragged across the ocean floor, smashing and killing everything in their path.

Shrimp aquaculture is arguably even worse. There are all the problems attendant to intensive aquaculture – close quarters, disease, antibiotic use, and the unsustainable harvesting of other creatures to turn into feed – and then there is the fact that mangrove swamps are ideal for conversion into shrimp farms. The UN Environment Programme estimates that about a quarter of the total destruction of these important ecosystems has been brought about by shrimp farming.

From an ecological standpoint, vegetarianism (and probably veganism) remains a far preferable option, compared to eating meat.

Random numbers

Truly random numbers are hard to find, as patterns tend to abound everywhere. This is problematic, because there are times when a completely random string of digits is necessary: whether you are choosing the winner of a raffle or generating the one-time pad that secures the line from the White House to the Kremlin.

Using atmospheric radio noise, random.org promises to deliver random data in a number of convenient formats (though one should naturally be skeptical about the security of such services). Another page, by Jon Callas, provides further information on why random numbers are both necessary and surprisingly tricky to generate.
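Python’s standard library makes the distinction concrete: the `random` module is a deterministic pseudo-random generator, while the `secrets` module draws unpredictable bytes from the operating system’s entropy pool. A minimal sketch:

```python
import random
import secrets

# random.Random is deterministic: seed it the same way twice and you
# get the same "random" stream – fine for simulations, fatal for a
# one-time pad, since anyone who learns the seed learns everything.
a = random.Random(42)
b = random.Random(42)
assert [a.randrange(256) for _ in range(8)] == [b.randrange(256) for _ in range(8)]

# secrets pulls unpredictable bytes from the operating system instead,
# making it suitable for keys, tokens, and pads.
pad = secrets.token_bytes(16)    # 16 bytes of one-time-pad material
winner = secrets.randbelow(100)  # fair raffle pick from 0..99
print(pad.hex(), winner)
```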

This comic amusingly highlights another aspect of the issue.

The Great Dying

Elephant statue, National Gallery of Canada

251.4 million years ago, the Earth experienced the most severe extinction event known. The Permian-Triassic (P-Tr) extinction event (informally referred to as the Great Dying) involved the loss of about 90% of all species then in existence, including roughly 96% of marine species and 70% of terrestrial vertebrate species.

There are a number of theories about what caused the event:

  1. A comet or meteor impact
  2. Massive volcanic activity
  3. Continental drift
  4. A supernova destroying the ozone layer
  5. Methane clathrate release

Some combination of these factors may well be responsible. Regardless of the initial cause, one of the defining elements of the P-Tr event was a high degree of global warming: mean global temperatures increased by about 6°C, with much larger increases at the poles. The period also saw a large-scale failure of ocean circulation, leaving nutrients concentrated at the ocean bottom and an acute lack of oxygen in the sea. The latter was the product both of decreased circulation and of the large-scale die-off of the kinds of phytoplankton that produce much of the planet’s oxygen today.

The study of such historical occurrences is useful, largely because it helps to improve our appreciation for how climatic and biological systems respond to extreme shifts. Just as the re-emergence of life after a forest fire and a clearcut may have some common properties, perhaps the patterns of decline and reformation after the P-Tr event can offer us some insight into macro level processes of ecological succession after traumatic climatic events.

Aurigid meteor shower

For those who missed the annual Perseid meteor shower, there is another chance to see some debris vaporizing in our atmosphere this week. The Aurigids are a much rarer shower, produced by dust shed by comet Kiess (C/1911 N1) when it passed near the sun around 4 C.E. Gravitational tugs from the Earth and the other planets sometimes shift these dust trails so that they intersect the Earth’s orbit, and the night of August 31st will be one such occasion.

Regrettably, there will not be much to see from eastern Canada. Even in the countryside, the incidence of meteors will peak at less than ten an hour. Around Vancouver, however, observers away from city lights can expect a sharp peak of activity between 4:00am and 5:00am, during which more than 200 meteors per hour should be visible. If you are looking for an excuse to escape all that city light pollution, this is an excellent one.
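Hourly rates like these scale with sky quality and the height of the shower’s radiant above the horizon. The standard zenithal hourly rate (ZHR) correction – a widely used rule of thumb, not the method behind any particular prediction – shows how quickly bright skies eat into the count. The input values below are hypothetical:

```python
import math

# Standard ZHR correction: the quoted ZHR assumes a perfectly dark sky
# (limiting magnitude 6.5) and the radiant directly overhead.
def observed_rate(zhr, radiant_elev_deg, limiting_mag, r=2.5):
    """Rough meteors per hour an observer can actually expect."""
    sky_penalty = r ** (6.5 - limiting_mag)          # light pollution
    elevation = math.sin(math.radians(radiant_elev_deg))
    return zhr * elevation / sky_penalty

# A strong outburst (ZHR 200) under bright suburban skies (limiting
# magnitude ~5) with the radiant at 40 degrees shrinks dramatically:
print(round(observed_rate(200, 40, 5.0)))
```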

Aspiring amateur astronomers will find this page very informative. It includes tips on viewing, as well as a neat applet that lets you calculate the incidence of meteor activity in your location.

Greenhouse gas flowchart

Terry Fox statue

The World Resources Institute has produced an excellent flowchart showing the activities that generate greenhouse gas emissions and the magnitude of those flows in terms of CO2 equivalence.

The data is from 2000, but I would expect the relative magnitudes to be reasonably similar now. This graphic provides a powerful and intuitive view into where the problem lies, and suggests areas where the greatest improvements could be made.

Real time ocean monitoring

Emily Horn in front of Parliament

The Neptune project, led by the University of Victoria, is quite a considerable undertaking. The first stage of the plan is to lay an 800 km loop of fibre-optic and electrical cable and use it to connect five living-room-sized automated underwater data collection systems called ‘nodes.’ These will track fish stocks and undersea earthquakes, while collecting other kinds of data on an ongoing basis. This will be the first cabled ocean observatory with multiple nodes.

Ultimately, the system will expand to include 3,000 km of powered fibre-optic cable connecting a larger number of nodes, all capable of returning data in real time. Compared with systematic collection (go to a spot at set intervals and check what is happening) or sporadic collection (use whatever data happens to become available when people are in a place), real-time data allows for different sorts of analysis and more comprehensive evaluations. The nodes will carry instruments including temperature meters, conductivity meters, pressure gauges, acoustic Doppler instruments and hydrophones, current meters, wave sensors, electrometers, seismometers, cameras, nutrient monitors, sample storage containers, and autonomous robots.
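A toy example, with entirely hypothetical numbers, shows one thing continuous monitoring buys: a brief event that periodic site visits would simply miss.

```python
# Toy illustration of continuous versus periodic ocean monitoring.
# One simulated reading per minute for a day, with a brief
# three-minute "earthquake" spike hidden in the record.
readings = [1.0] * 1440
for t in range(600, 603):
    readings[t] = 9.0

# Periodic survey: visit the site once every six hours.
periodic = [readings[t] for t in range(0, 1440, 360)]

# Real-time cabled node: every reading streams back as it happens.
spike_seen_periodic = any(r > 5.0 for r in periodic)
spike_seen_realtime = any(r > 5.0 for r in readings)

print(spike_seen_periodic, spike_seen_realtime)  # → False True
```

The periodic survey samples only four moments in the day and never overlaps the spike, while the continuous stream catches it as a matter of course.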

The system should offer some useful data on migratory fisheries and whale movements, as well as the ominous rumblings of the Juan de Fuca plate, extending from British Columbia down to Oregon. It will also contribute to a more systematic understanding of ocean geology and ecology in general.