The norovirus – way more common than you probably think

When people say that they are sick with ‘the flu,’ they often mean they have gastroenteritis. Influenza is caused by viruses in the family Orthomyxoviridae and is the stuff of flu vaccines and avian flu worries. It usually presents with fever, sore throat, muscle pains, severe headache, coughing, weakness, and general discomfort. Gastroenteritis is defined as inflammation of the gastrointestinal tract, involving both the stomach and the small intestine. While it can be caused by bacteria including Salmonella and E. coli, gastroenteritis is usually caused by norovirus or rotavirus. In North America, only the common cold (usually caused by a picornavirus or coronavirus) occurs more frequently.

Norovirus is of particular interest, since it causes about 50% of all food-related cases of gastroenteritis. In total, it causes about 90% of non-bacterial gastroenteritis worldwide. Norovirus is transmitted either directly from person to person or through the faecal contamination of food or water. It is highly susceptible to bleach, though more resistant to alcohol and soap. Susceptibility to the virus is genetically linked: about 29% of the population have two recessive copies of the (1,2)fucosyltransferase gene, which means they do not form a ligand the virus requires in order to bind. Lucky people. Those with at least one dominant copy of the gene get only a brief period of partial immunity after they fight off the virus. Your best bet is to wash your hands often, avoid salads and shellfish prepared in places with poor hygienic standards, and avoid exposure to people who are already infected.
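
As a rough illustration of what that 29% figure implies, here is a short Python sketch assuming the textbook Hardy-Weinberg model (an assumption of mine, not something drawn from the studies mentioned above):

```python
import math

# Hedged sketch: under Hardy-Weinberg equilibrium, the share of people with
# two recessive copies of a gene equals q^2, where q is the recessive allele
# frequency in the population.
resistant_fraction = 0.29          # homozygous recessive, per the figure above

q = math.sqrt(resistant_fraction)  # recessive allele frequency
p = 1 - q                          # dominant allele frequency

# Everyone with at least one dominant copy (p^2 + 2pq = 1 - q^2) is susceptible.
susceptible_fraction = 1 - resistant_fraction

print(f"recessive allele frequency q = {q:.2f}")          # about 0.54
print(f"dominant allele frequency p = {p:.2f}")           # about 0.46
print(f"susceptible share = {susceptible_fraction:.0%}")  # 71%
```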

Hydroelectricity and bare winter mountaintops

Hydroelectricity is a crucial energy source for Canada, providing 59% of the national electricity supply (and 97% in Quebec), as well as energy for things like the Kitimat Aluminium Smelter. As such, there is good reason to be concerned about changes in mountain glaciers and snowpack arising from climate change. Ideally, you want snow and ice to accumulate in the mountains during the winter. That reduces the flow of water into reservoirs during the wet season, which helps avoid having to release large quantities of water simply because the dam is at capacity. Then, during the spring and summer, you want the ice to melt, keeping the water level in the reservoir relatively steady and allowing the continuous production of energy without threatening riverflow-dependent wildlife or downstream water users.
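
The buffering role of snowpack can be made concrete with a toy monthly water-balance model. Everything below (precipitation figures, melt rate) is invented purely for illustration:

```python
# Toy water balance: precipitation that falls as snow is held back in winter
# and arrives later as meltwater, smoothing reservoir inflows across the year.
precip = [90, 80, 70, 60, 50, 30, 20, 20, 40, 70, 90, 100]  # mm/month, invented
frozen_months = [True, True, True, False, False, False,
                 False, False, False, False, True, True]

snowpack = 0.0
MELT_FRACTION = 0.5   # share of the snowpack melting per warm month (assumed)
inflows = []

for p, frozen in zip(precip, frozen_months):
    if frozen:
        snowpack += p             # precipitation stored as snow, little inflow
        inflows.append(0.0)
    else:
        melt = MELT_FRACTION * snowpack
        snowpack -= melt
        inflows.append(p + melt)  # rain plus meltwater

print([round(x) for x in inflows])
# Spring and summer inflows are boosted by melt; without snow storage, the
# same precipitation would hit the reservoir all at once in the wet season.
```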

Climate change is upsetting this dynamic in several ways. Warmer winters bring less snowfall, so dams are overwhelmed during the wet season while frozen reserves fail to build up. Hot summers increase evaporation from reservoirs, as well as water usage by industry and individuals. Some scientific evidence also suggests that climate change is intensifying both rainy and dry periods, further undermining the stability of water levels and the ability of dams to produce baseload energy reliably.

Mike Demuth, a glaciologist working for Natural Resources Canada, predicts the disappearance of all small to mid-sized glaciers in the Rockies within the next 50 to 100 years. The Athabasca and 29 other glaciers feed the Columbia River, which in turn provides 60% of the electricity used in the western United States (generated by the Grand Coulee Dam, Chief Joseph Dam, and others). The low cost of energy in the area has even led companies like Google to locate their server farms in the region. Not only is the loss of our mountain cryosphere likely to cause domestic problems, it is highly likely to eventually provoke a pretty serious international conflict.

Will technology save us?

All sensible commentators acknowledge that asking people to make big voluntary sacrifices to fight climate change is a strategy unlikely to succeed. People will fight to keep the benefits they have acquired, as well as their capacity to acquire yet more in the future. They will turf out or overthrow leaders who demand heavy sacrifices from them – especially if people in other places are not making the same ones.

If we accept that contention, we are left with a number of possible outcomes:

  1. Painless technological triumph: technological advances allow us to stabilize greenhouse gas concentrations without big sacrifices in current or future standard of living.
  2. Disaster-provoked changed priorities: a big and undeniably climate-related cataclysm convinces people to buckle down for the sake of their own safety.
  3. Inaction with fairly benign climate change: people do little or nothing, and it turns out that climate change is not as harmful as predicted.
  4. Unmitigated disaster: people do nothing or act too late and slowly, causing global disaster.

Intermediate outcomes are clearly also possible. The differences between several of these have to do with unknown facts about the climate system. Will it throw up a few big and undeniable disasters before a slippery slope is reached? What is the actual sensitivity of the climate to greenhouse gas concentration increases, once feedback and adaptive effects are included?

The first option is certainly the one most popular among politicians. Virtually everyone likes technology and progress: it creates jobs and economic growth while increasing the welfare of those already alive. What big technologies are people hoping might make the difference?

  1. Renewables: sound in theory and partially demonstrated in practice. New transmission capacity and incremental improvements in efficiency required. Potentially high land use.
  2. Biofuels: politically popular but increasingly scientifically discredited. There may be hope for cellulosic fuels.
  3. Nuclear fission: works in practice, with big non-climatic risks.
  4. Nuclear fusion: promising in theory, but nobody has made it work.
  5. More efficient machines: highly likely to occur, unlikely to be sufficient, may not cut total energy use.
  6. Carbon capture and storage: theoretically viable, undemonstrated in practice. May divert attention from technologies with longer-term potential.
  7. Geoengineering: a desperate last-ditch option, unlikely to work as predicted.

The question of whether climate change can be tackled without a substantial reduction in standard of living remains open. So does the question of whether climate change mitigation can be compatible with the elevation of billions in the developing world to a higher level of affluence. Given the above-stated unwillingness of anyone to undergo avoidable sacrifice, we should be hoping that technology does a lot better than expected, or that some potent force changes the balance of risks and opportunities in the perception of most people.

Yet more biofuel doubts

The buzz on all the energy and environment blogs today is a new article in Science raising further doubts about the green credentials of biofuels. Searchinger et al. report that:

Using a worldwide agricultural model to estimate emissions from land use change, we found that corn-based ethanol, instead of producing a 20% savings, nearly doubles greenhouse emissions over 30 years and increases greenhouse gases for 167 years. Biofuels from switchgrass, if grown on U.S. corn lands, increase emissions by 50%.

The major contribution made by this study is a quantitative estimate of how land use change in response to biofuel production affects total greenhouse gas emissions. If the displacement of alternative land uses for corn ethanol produces net carbon emission increases, you can bet your life that clearing tropical rainforest to make palm oil is markedly worse.
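
The logic behind figures like ‘167 years’ is a simple carbon-payback calculation: converting land releases a one-time pulse of carbon, which the annual savings from displaced fossil fuel must gradually repay. A minimal sketch, using placeholder numbers rather than the study’s own estimates:

```python
# Carbon payback sketch in the spirit of Searchinger et al. Both numbers
# below are hypothetical placeholders, not figures from the study.
carbon_debt = 100.0    # tonnes CO2 per hectare released by land conversion
annual_saving = 0.6    # tonnes CO2 per hectare per year from displaced gasoline

payback_years = carbon_debt / annual_saving
print(f"carbon payback period: about {payback_years:.0f} years")
# If repaying the debt takes longer than a century, the biofuel increases
# emissions over any policy-relevant time horizon.
```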

Peak coal

The common account of ‘peak oil’ is straightforward enough. Oil is a non-renewable resource; as such, every barrel taken from the ground means one less for the future. The depletion of current reserves is temporarily offset by the discovery of new reserves and the development of better technology to extract more oil from the reserves we know about. A higher price for oil stimulates both exploration and technological development, creating a negative feedback loop that, to some extent, moderates scarcity and long-run prices. Given the finite nature of oil, it is a logical necessity that extraction will eventually exceed new discoveries and technological improvement, provided we continue to extract oil. The controversial question is when this will occur. Some people argue it already has; others, that it will not take place for decades. You have to be a real optimist to think we can continue to expand or maintain present levels of oil extraction for a century.
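
The standard way to formalize this story is a Hubbert-style logistic model, in which cumulative extraction approaches an ultimately recoverable total and the annual rate rises, peaks, and declines. The parameters below are illustrative, not calibrated to any real data:

```python
import math

URR = 2000.0      # ultimately recoverable resource, billion barrels (assumed)
K = 0.05          # logistic growth parameter (assumed)
T_PEAK = 2010     # assumed peak year

def production(year: float) -> float:
    """Annual extraction implied by logistic cumulative depletion:
    Q(t) = URR / (1 + exp(-K * (t - T_PEAK))), so dQ/dt peaks at T_PEAK."""
    x = math.exp(-K * (year - T_PEAK))
    return URR * K * x / (1 + x) ** 2

for year in (1970, 1990, 2010, 2030, 2050):
    print(year, round(production(year), 1))
# Output rises to a maximum of URR * K / 4 at the peak year, then declines
# symmetrically: discoveries and technology delay the peak but cannot remove it.
```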

In the conventional story, the next fossil fuel in line is coal. This is bad for a lot of reasons, including the damaging nature of coal mining and the high pollution and greenhouse gas emissions associated with burning coal. At least, the conventional wisdom says, coal is plentiful. The U.S. Energy Information Administration estimates that 905 billion tonnes exist in recoverable reserves: enough to satisfy the present level of usage for 164 years. The World Energy Council estimates reserves at a somewhat more modest 847 billion tonnes. Combining the idea of peak oil with the reality of dirty coal has led many environmentalists to fear a world where cheap oil runs out and people switch to coal, with disastrous climatic consequences.

An article in the January issue of New Scientist challenges this orthodoxy. The article argues that official reserves have fallen over the last 20 years to an extent far greater than usage, suggesting that the estimates were over-generous. It asserts further that the ratio of official reserves to annual coal extraction worsened by one third between 2000 and 2005. This is attributed primarily to increased demand in the developing world.
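
The reserves-to-production arithmetic behind such ratios is easy to reproduce. Using the EIA figures quoted above, with the demand growth rate as an assumption for illustration:

```python
import math

reserves = 905.0             # billion tonnes (EIA figure quoted above)
rp_years = 164               # static reserves-to-production ratio quoted above
usage = reserves / rp_years  # implied extraction: about 5.5 billion tonnes/year

# With extraction growing at a steady rate g, exhaustion arrives when
# usage * (e^(g*T) - 1) / g = reserves, i.e. much sooner than the static ratio.
g = 0.03                     # assumed annual growth in extraction
depletion_years = math.log(1 + g * rp_years) / g

print(f"implied usage: {usage:.1f} billion tonnes/year")
print(f"static horizon: {rp_years} years; at {g:.0%} growth: {depletion_years:.0f} years")
```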

The article predicts that the combination of higher rates of usage and smaller-than-stated reserves may cause coal to “peak as early as 2025 and then fall into terminal decline.” If this is true, it massively changes the logic of coal power and carbon capture and storage. The only reason anybody wants to use coal is because they perceive it to be a relatively inexpensive and amply provisioned fossil fuel that can be obtained from stable and friendly countries. If coal plants being built today with a fifty-year lifespan are going to face sharply increased feedstock prices in a few decades, their economic competitiveness compared to renewable energy may be non-existent. This is especially true of plants with carbon capture and storage (CCS) technology, since they require about 20 to 40% more fuel per unit of electricity in order to power the separation and sequestration equipment.
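
That fuel penalty compounds any increase in coal prices, since a CCS plant buys more coal for every unit of electricity sold. A minimal sketch of the arithmetic, with the per-megawatt-hour coal use and price as hypothetical placeholders:

```python
COAL_PER_MWH = 0.4   # tonnes of coal per MWh without CCS (assumed)
COAL_PRICE = 50.0    # dollars per tonne (assumed)

for penalty in (0.20, 0.40):  # the 20 to 40% range quoted above
    fuel_cost = COAL_PER_MWH * (1 + penalty) * COAL_PRICE
    print(f"{penalty:.0%} penalty -> fuel cost ${fuel_cost:.2f}/MWh")
# Every dollar added to the coal price raises the CCS plant's fuel cost
# 20 to 40% more than it raises a conventional plant's.
```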

One article does not make for a compelling case, especially given the poor overall record people have had of predicting energy trends and prices across decades. The article acknowledges the scepticism surrounding the idea of peak coal:

The idea of an imminent coal peak is very new and has so far made little impact on mainstream coal geology or economics, and it could be wrong. Most academics and officials reject the idea out of hand. Yet in doing so they tend to fall back on the traditional argument that higher coal prices will transform resources into reserves – something that is clearly not happening this time.

Regardless of whether this particular analysis proves to be accurate or not, it does a service in questioning an important assumption behind a fair bit of energy policy planning. The idea of peak coal has a complex relationship with climate change. On one hand, it might reduce the incentive to develop CCS, making whatever coal is burned more climatically harmful. On the other hand, awareness that coal reserves are more limited than assumed might prompt more investment in renewable energy, the only option that is sustainable in the long term.

Even if world coal reserves are significantly smaller than the official estimates above, there is a good chance that burning all that is available would have extremely adverse climatic consequences. We know the approximate level of emissions that would maintain stable atmospheric concentrations of greenhouse gases, and we know that we are way above it. What we don’t know is the shape of the damage curve associated with increased concentrations, increased radiative forcing, and further increased mean temperatures. Even if there aren’t sharp transitions within the next 150 ppm or so, extensive further use of coal will inevitably push us further into unknown and potentially dangerous territory.

Canada’s anti-superbug initiative

Canada’s federal government is launching an initiative to combat antibiotic-resistant bacteria. This is a very sensible thing to do, given that bacterial evolution is creating resistant strains faster than we are inventing new antibiotics. MRSA and its relatives could well signal a return to a world in which morbidity and mortality from bacterial illness shift back towards the levels prevalent before antibiotics were widely available.

We largely have ourselves to blame for the existence of these bugs. Every time a doctor prescribes unnecessary antibiotics in order to get a patient out of their office, we give bacteria another chance to get stronger. The same goes for patients who stop taking an antibiotic when they feel better, rather than finishing the course, potentially leaving a few of the most resistant bugs behind to infect others. The same is true of all the ‘antibacterial’ soaps and cleaning products out there. Putting triclosan in soap is pretty poor prioritization: outside the body, it makes the most sense to kill bugs with things they cannot evolve resistance to, like alcohol or bleach. Using the precious chemicals that kill them but not us to clean countertops is just bad thinking. Finally, there is the antibiotic–factory farming connection discussed extensively here before.

The federal plan involves a number of prudent steps, many of them specifically targeted to MRSA and Clostridium difficile. These include more active patient screening, better sanitization of hospital rooms, use of prophylactics like gloves and masks, and the isolation of patients with resistant strains. Given that there were 13,458 MRSA infections in Ontario hospitals in 2006, it seems that such an initiative is overdue. It would be exceedingly tragic if we comprehensively undermined one of the greatest discoveries in the history of medicine through carelessness and neglect.

SpaceShipTwo

Virgin Galactic – Richard Branson’s space company – has released the design of its next-generation craft: SpaceShipTwo. The machine will carry passengers into the upper atmosphere after being lifted to an altitude of about 15 km by a larger mothership. After spending time at an altitude of 110 km, the vehicle will re-enter the atmosphere. While the technology is new and doubtless interesting, there is good reason to ask whether it serves any valuable purpose.

The three aims commonly described for the technology are delivering extremely urgent packages, launching small satellites, and entertaining rich people. While it can certainly be argued that manned spaceflight has not generally been a valuable undertaking, this sort of rollercoaster ride does seem like an especially trivial use of technology. For about $200,000, you get a few minutes in microgravity, the view out the windows, and bragging rights thereafter. Satellite launching could be a lot more useful, though the Virgin group has yet to demonstrate the capability of its vehicles to do so – a situation that applies equally to the idea of making 90-minute deliveries anywhere in the world.

The Economist provides an especially laughable justification for the whole undertaking, arguing:

When space becomes a democracy—or, at least, a plutocracy—the rich risk-takers who have seen the fragile Earth from above might form an influential cohort of environmental activists. Those cynics who look at SpaceShipTwo and think only of the greenhouse gases it is emitting may yet be in for a surprise.

Hopefully, they won’t become ‘environmental activists’ of the Richard Branson variety: investing in airplanes and gratuitous spacecraft while hoping someone will develop a machine that will somehow address the emissions generated.

Common descent and biochemistry

Despite the dizzying array of life on Earth – if you doubt that, watch the BBC’s excellent Planet Earth series – there is a remarkable degree of biochemical consistency between all living things. This is one of the strongest arguments in favour of common descent: the idea that all living things are descended from the first replicators so evocatively described in the opening chapter of Dawkins’ The Selfish Gene. The very strongest evidence of that thesis comes not from the universality of the really essential mechanisms of life, but from the universality of arbitrary conventions common to all living things.

Some of the more astonishing elements of life are universal: the storage of constitutive information in strands of DNA or RNA, the use of three-nucleotide codons to refer to amino acids, and the dominant role of proteins in cellular architecture. These are common to animals and plants, fungi, bacteria, and archaea. It is difficult to imagine how living things would look if they were based on alternatives to this basic system.

Then, there are elements of common biology which need to be in place, but are somewhat arbitrary. For example, there is the metabolization of glucose for energy and the use of adenosine triphosphate (ATP) as an energy carrier. Something needs to play these roles, but there are presumably other molecules that could serve the purpose. Also, unlike the consistencies in the first category, life would not be staggeringly different if different molecules served these purposes.

Finally, there are what might be considered arbitrary conventions – things that were established at the origin of life, are common to all life, but which could just as well be another way or a patchwork of different ways. This includes the use of only 20 amino acids to make proteins, and the fact that the L-isomers of these acids are used. It also includes how cells establish a lower concentration of sodium inside themselves than exists in the surrounding medium, with a higher concentration of potassium inside. It could just as well have been the other way around.
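
The codon convention is easy to make concrete: it amounts to a lookup table from three-letter RNA codons to amino acids, shared by everything from bacteria to blue whales. Below is a small excerpt of the standard genetic code (the full table has 64 entries), with a toy translation function:

```python
# A few entries from the (near-)universal genetic code.
CODON_TABLE = {
    "AUG": "Met",   # methionine, also the usual start signal
    "UUU": "Phe", "UUC": "Phe",
    "GGU": "Gly", "GGC": "Gly", "GGA": "Gly", "GGG": "Gly",
    "AAA": "Lys", "AAG": "Lys",
    "UGG": "Trp",
    "UAA": "STOP", "UAG": "STOP", "UGA": "STOP",
}

def translate(rna: str) -> list[str]:
    """Read an RNA string three letters at a time until a stop codon."""
    protein = []
    for i in range(0, len(rna) - 2, 3):
        amino = CODON_TABLE.get(rna[i:i + 3], "?")
        if amino == "STOP":
            break
        protein.append(amino)
    return protein

print(translate("AUGUUUGGCAAAUGA"))  # ['Met', 'Phe', 'Gly', 'Lys']
```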

In a sense, it is the third category that provides the best evidence of common descent. It is like language: pretty much any language will need a way to refer to objects and to actions performed upon them. As such, the inclusion of these aspects in different languages isn’t really evidence of relation. When you find a language that has a number of arbitrary conventions in common with another (say, an alphabet), you have more reason to think they both evolved from something older.

While statistics suggest that life with an entirely independent origin is highly likely to exist, it would nonetheless be rather thrilling to actually find it, somewhere out among other planets or distant stars.

Australia’s geothermal potential

For a country that relies on coal for 83% of its power and produces 25.9 tonnes of carbon dioxide equivalent per person, Australia’s Innamincka desert could prove a blessing. This is not because of the sunshine hitting it, but because of the geothermal energy that has suffused the granite beneath it.

Initial tests have found that the granite is at approximately 250°C, meaning that each cubic kilometre can yield as much energy as 40 million barrels of oil. If it proves viable to use this heat to boil water and drive turbines, the share of Australian power derived from renewables could increase considerably. According to Emeritus Professor John Veevers of Macquarie University’s Department of Earth and Planetary Sciences, the rocks could “supply, without emissions, the baseload electrical power at current levels of all consumers in Australia for 70 years.”
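
The ‘40 million barrels per cubic kilometre’ figure can be sanity-checked from first principles. The granite constants below are textbook values and the usable temperature drop is my assumption, so this is an order-of-magnitude check rather than a reproduction of the tests:

```python
VOLUME = 1e9            # cubic metres in one cubic kilometre
DENSITY = 2700.0        # kg/m^3, typical for granite
SPECIFIC_HEAT = 790.0   # J/(kg*K), typical for granite
DELTA_T = 115.0         # usable cooling in kelvin (assumed)

heat = VOLUME * DENSITY * SPECIFIC_HEAT * DELTA_T   # joules of extractable heat
BARREL = 6.1e9                                      # joules per barrel of oil

print(f"stored heat: {heat:.2e} J = {heat / BARREL / 1e6:.0f} million barrels")
# Roughly 2.5e17 J, i.e. about 40 million barrels of oil equivalent,
# consistent with the figure quoted above.
```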

Naturally, it is not sufficient simply to have hot rocks within drilling distance. It will also have to be economical to construct the power generation equipment. There will be a need for water to use as a heat carrier. Finally, it will be necessary to build transmission capacity to link new facilities with Australian cities.

In a sense, a geological formation like this is like the oil sands in reverse. Both exist in large countries with economies that depend to a considerable degree on primary commodities. Likewise, both exist in states with shockingly high per-capita greenhouse gas emissions. There are questions about commercial viability and water usage of both projects, but the broader issue with Innamincka is how many megatonnes of carbon dioxide can be kept out of the atmosphere, rather than how much will be produced through a bituminous bonanza.

Improving energy efficiency through very smart metering

With existing technology, it is entirely possible to build houses that make their owners dramatically more energy aware. For instance, it would be relatively easy to build electrical sockets connected to a house network. It would then be possible to see, graphically or numerically, how much power is being drawn by each socket. It would also be easy to isolate the energy use of major appliances – furnaces, dishwashers, refrigerators – allowing people to make more intelligent choices about the use and possible replacement of such devices. In an extreme case, you could have a constantly updating spreadsheet identifying every use of power, the level being drawn, the associated cost, and historical patterns of usage.
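
As a sketch of what the software side might look like, each networked socket could report its instantaneous draw to a small aggregator that keeps running totals and costs. All names, rates, and the randomized ‘readings’ here are invented for illustration:

```python
import random
import time

PRICE_PER_KWH = 0.10  # dollars per kWh, assumed flat tariff

class MeteredSocket:
    """Hypothetical networked outlet reporting instantaneous power draw."""

    def __init__(self, name: str):
        self.name = name
        self.kwh_total = 0.0

    def read_watts(self) -> float:
        """Stand-in for a real sensor reading."""
        return random.uniform(0.0, 1500.0)

    def record(self, watts: float, seconds: float) -> None:
        self.kwh_total += watts * seconds / 3.6e6  # watt-seconds -> kWh

sockets = [MeteredSocket(n) for n in ("fridge", "dishwasher", "furnace fan")]

for _ in range(3):                       # three one-second polling cycles
    for s in sockets:
        s.record(s.read_watts(), seconds=1.0)
    time.sleep(1.0)

for s in sockets:
    print(f"{s.name}: {s.kwh_total:.5f} kWh, ${s.kwh_total * PRICE_PER_KWH:.5f}")
```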

Being able to manage electrical usage through a web interface could also be very helpful. People could shift some of their power use to low-demand times of day. They could also lower the temperature in their houses while away and have it rise again in time to be comfortable when they get home. Such controls would also be very useful to people who have some sort of home generating capacity, such as an array of solar panels: a web interface could provide real-time information on the level of energy being produced and the quantity stored.
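
A scheduling layer on top of that metering is equally simple in outline. The tariff windows and temperature setpoints below are invented for illustration:

```python
OFF_PEAK_HOURS = set(range(0, 7)) | {22, 23}   # assumed cheap-electricity hours
ARRIVAL_HOUR = 18                              # when the occupants get home
AWAY_SETPOINT, HOME_SETPOINT = 16.0, 21.0      # degrees Celsius, assumed

def run_flexible_load(hour: int) -> bool:
    """Defer appliances like dishwashers until demand (and price) is low."""
    return hour in OFF_PEAK_HOURS

def thermostat_setpoint(hour: int) -> float:
    """Let the house cool while empty; start reheating an hour before arrival."""
    return HOME_SETPOINT if hour >= ARRIVAL_HOUR - 1 else AWAY_SETPOINT

for hour in (3, 12, 17, 23):
    print(hour, run_flexible_load(hour), thermostat_setpoint(hour))
```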

While all of these things are entirely possible, there do seem to be two big barriers to implementation. The first is convincing people to install such systems in new houses or while retrofitting old ones. The second is making the systems intuitive enough that non-technical people can use them well. The first of those obstacles could be partially overcome through building codes and carbon pricing. The second is mostly a matter of designing good interfaces. Perhaps an Apple iHome is in order.