When talking about climate change mitigation, people often invoke the Manhattan Project, arguing that we need a massive, technology-focused governmental effort to solve the problem. This historical example can, however, be read another way. During the final stages of WWII, the United States was preparing to invade Japan. Given the fierce resistance encountered during the island-hopping campaigns in the Pacific, American planners expected an extremely difficult battle to capture the Japanese home islands. Ultimately, those preparations were rendered unnecessary when the atomic bombings of Hiroshima and Nagasaki helped to produce a Japanese surrender.
In the climate context, the equivalent of the atomic bomb might be some miraculous new set of technologies that allows us to deal with climate change at low cost and with few real sacrifices: algae-based biofuels, next-generation fission or fusion reactors, carbon capture and storage, and so on. Counting on the emergence of such technologies is akin to betting on the atomic bombs ending the war, long before it was certain that they would work or be developed in a timely manner. While we may be lucky and see breakthrough technologies emerge in the decades ahead, we need to do what the Americans did and plan to deal with the problem through the difficult practice of old-fashioned slogging. We need a plan to stabilize greenhouse gas concentrations at a safe level, and to do so with the technologies and technical resources that exist today, not those that may exist in the future.
The future of our planet and of all future generations depends on avoiding catastrophic climate change. Faced with that burden, we cannot simply invest in a few technological long shots and then rest easy. We need to prepare to address the problem, no matter how costly, painful, and difficult doing so may ultimately prove to be.