Wednesday, December 21, 2011

Security in Flu Study Was Paramount, Scientist Says

original article: http://www.nytimes.com/2011/12/22/health/security-in-h5n1-bird-flu-study-was-paramount-scientist-says.html?hp

The National Science Advisory Board for Biosecurity, concerned about bioterrorism and a worldwide pandemic, has for the first time ever urged scientific journals to keep details out of reports that they intend to publish on a highly transmissible form of the bird flu called A(H5N1), which has a high death rate in people. Working with ferrets, researchers on the virus at two medical centers — Erasmus Medical Center in Rotterdam, in the Netherlands, and the University of Wisconsin-Madison — are investigating genetic changes that may make the virus more easily transmittable to people. Doreen Carvajal spoke with Ron A. M. Fouchier, the lead researcher at the Erasmus Center. An edited and condensed version of the conversation follows.

Q. What was your reaction to efforts to censor the research?

A. The draft recommendations reached us at the end of November, and since that time we have been working with the journals and the international organizations to figure out a way to deal with it, because this is an unprecedented issue in science.

In principle, we of course understand the statement by the National Science Advisory Board for Biosecurity and the United States government. This is dual-use research, meaning research that can be used for good and bad purposes.

The N.S.A.B.B. advice is that we can share this in a restricted form.

We would be perfectly happy if this could be executed, but we have some doubts. We have made a list of experts that we could share this with, and that list adds up to well over 100 organizations around the globe, and probably 1,000 experts. As soon as you share information with more than 10 people, the information will be on the street. And so we have serious doubts whether this advice can be followed, strictly speaking.

Q. So what is the solution?

A. This is very important research. It raises a number of important issues that need to be shared with the scientific community. And because we cannot keep this confidential with such a large group, I think the only solution is to publish in detail.

Q. How do you sum up the most vital information that should be shared?

A. There are three aspects that need to be shared.

The first part of the work can be shared without detail. The message is that H5N1 can go airborne between mammals. Of course, we have also shown how this virus can go airborne, and which mutations cause it to go airborne. And the information about those mutations needs to get into the hands of people who are doing research — for instance, the people who are doing surveillance in countries affected by H5N1. If those mutations were detected in the field, then the affected countries should act very aggressively to stamp out the outbreaks, to protect the world.

So if we can stamp this virus out before it actually emerges, then we prevent a pandemic. And I think that is what we all want.

But even if we were not able to prevent a pandemic — and let’s assume that there is a very small chance that the virus will emerge in nature — then our last resort would be drugs and vaccines.

Now, drugs and vaccines are normally evaluated with bird flu viruses that are not adapted to mammals. The question now is whether those vaccines are effective against the mammal-adapted virus. And so by doing this research, we are able to get ahead of this virus emerging in the field and test whether our last resort would be functional.

So the three things are: one is the simple fact that it can go airborne. That means that all the advice from the scientific community to outbreak countries now can be more unanimous that H5N1 is a very big risk to human health. The second thing is surveillance, and the third thing is preparation by evaluating vaccines and antivirals.

Q. What were the precautions that you took, if any, in the course of your research to guard against terrorism?

A. This experiment was not designed overnight. We started planning for these experiments 10 years ago, consulting with experts nationally and internationally about how to do this safely. We built special facilities to protect people against the virus and the virus against the people.

Q. What was special about your facilities, in the Netherlands?

A. The biosafety information can be found on our Web site. As for biosecurity, I cannot release any information.

Q. Over that period, were there any safety issues?

A. Everything was smooth. There were layers upon layers upon layers of biosecurity measures. The design of this type of facility was such that it would be very unlikely for all barriers to break at the same time.

Q. How did you conduct the research?

A. I cannot disclose the methods, because the methods are supposed to be a recipe for bioterrorism.

We mutated the virus and then performed a natural selection for additional mutations. We were testing on ferrets. We designed the experiment over the course of 10 years. We have been doing hands-on work on the experiments for the last two years, testing on dozens of ferrets.

Q. Is the research finished?

A. We are continuing the work. We need to evaluate vaccines, and we need to evaluate antiviral drugs and how well they work against this virus. We also need to have a more general understanding of whether this virus could acquire abilities of airborne transmission in other ways.

Q. Have you seen any sign that government authorities or anyone else was monitoring you because of concerns about terrorism?

A. I am sure I am being monitored by many governments — and not only rogue countries, but the usual states as well. If they are monitoring me, they are doing a good job of staying out of my sight.

Q. How easy is it to recreate this virus?

A. It is not very easy. You need a very sophisticated specialist team and sophisticated facilities to do this. And in our opinion, nature is the biggest bioterrorist. There are many pathogens in nature that you could get your hands on very easily, and if you released those in the human population, we would be in trouble.

And therefore we think that if bioterror or biowarfare were a problem, there are so many easier ways of doing it that nobody would take this H5N1 virus and go through this very difficult process.

You could not do this work in your garage if you are a terrorist organization. But what you can do is get viruses out of the wild and grow them in your garage. There are terrorist opportunities that are much, much easier than genetically modifying H5N1 bird flu virus, and that are probably much more effective.

Q. How difficult would it be to recreate it?

A. If we get this in the hands of labs that can already do it — such as the C.D.C. or N.I.H. laboratories — they would be able to repeat our work in a matter of weeks. But for rogue countries or terrorist groups, this would take years of work.

Q. So why such concern — aren’t you offering information that will protect countries?

A. That’s a question you should address to the advisory board. That is our opinion: we think this work should have been published in detail.

Q. What is your next step?

A. We will respect this advice, because this is the consensus for now. And we will work toward publishing a manuscript without the details, and we will wait to see how the N.S.A.B.B. and the United States government envisage sharing the information in a classified way. As I said, we have doubts this is possible.

Q. Did you consider publishing anyway?

A. Yes, we could even launch it on our own Web site. We could do that. Of course, that’s not the smart way to move. There is an intense debate in our field, and it would be silly for us to act on our own on this. It’s better to have this discussion in the scientific and health community and see where it goes. If everybody agrees that this is the way to go, then we will respect that.

Q. What was the reaction from colleagues?

A. The only people who want to hold back are the biosecurity experts. They show zero tolerance to risk. The public health specialists do not have this zero tolerance. I have not spoken to a single public health specialist who was against publication. So we are going to see an interesting debate over the next few weeks between biosecurity experts and public health experts who think this information should be in the public domain.

Monday, November 14, 2011

Study links Parkinson's disease to industrial solvent

original article:

http://www.bbc.co.uk/news/health-15639440

snippets:

An international study has linked an industrial solvent to Parkinson's disease.

Researchers found a six-fold increase in the risk of developing Parkinson's in individuals exposed in the workplace to trichloroethylene (TCE).

Although many uses for TCE have been banned around the world, the chemical is still used as a degreasing agent.

The research was based on analysis of 99 pairs of twins selected from US data records.

Parkinson's can result in limb tremors, slowed movement and speech impairment, but the exact cause of the disease is still unknown, and there is no cure.

Research to date suggests a mix of genetic and environmental factors may be responsible. A link has previously been made with pesticide use.

'Significant association'

The researchers, from institutes in the US, Canada, Germany and Argentina, wanted to examine the impact of solvent exposure - specifically six solvents including TCE.

They looked at 99 sets of twins, one twin with Parkinson's, the other without.

Because twins are genetically very similar or identical and often share certain lifestyle characteristics, twins were thought to provide a better control group, reducing the likelihood of spurious results.

The twins were interviewed to build up a work history and calculate likely exposure to solvents. They were also asked about hobbies.

The findings are presented as the first study to report a "significant association" between TCE exposure and Parkinson's, and suggest that exposure to the solvent is likely to result in a six-fold increase in the chances of developing the disease.
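In a matched twin design like this, the risk estimate comes from the discordant pairs only — pairs in which exactly one twin was exposed. A minimal sketch of the conditional (McNemar-type) odds ratio for matched pairs; the pair counts below are hypothetical, chosen only to show how a six-fold figure can arise, and are not the study's data.

```python
def matched_pair_odds_ratio(case_exposed_only: int, control_exposed_only: int) -> float:
    """Conditional odds ratio for 1:1 matched pairs (e.g., discordant twins).

    Only pairs discordant on exposure carry information:
      case_exposed_only    -- affected twin exposed, unaffected twin not
      control_exposed_only -- unaffected twin exposed, affected twin not
    """
    return case_exposed_only / control_exposed_only

# Hypothetical counts (NOT the study's data): 12 pairs where only the
# twin with Parkinson's was exposed to TCE, 2 pairs where only the
# unaffected twin was exposed.
print(matched_pair_odds_ratio(12, 2))  # a six-fold increase in the odds
```

Concordant pairs (both twins exposed, or neither) drop out of the calculation, which is why matching on genetics and upbringing makes a relatively small sample of 99 pairs informative.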

The study also judged that exposure to two other solvents, perchloroethylene (PERC) and carbon tetrachloride (CCl4), "tended towards significant risk of developing the disease".

No statistical link was found with the other three solvents examined in the study - toluene, xylene and n-hexane.

"Our study confirms that common environmental contaminants may increase the risk of developing Parkinson's, which has considerable public health implications," said Dr Samuel Goldman of The Parkinson's Institute in Sunnyvale, California, who co-led the study published in the journal Annals of Neurology.

He added: "Our findings, as well as prior case reports, suggest a lag time of up to 40 years between TCE exposure and onset of Parkinson's, providing a critical window of opportunity to potentially slow the disease before clinical symptoms appear."
Water contaminant

TCE has been used in paints, glue, carpet cleaners, dry-cleaning solutions and as a degreaser. It has been banned in the food and pharmaceutical industries in most regions of the world since the 1970s, due to concerns over its toxicity.

In 1997, the US authorities banned its use as an anaesthetic, skin disinfectant, grain fumigant and coffee decaffeinating agent, but it is still used as a degreasing agent for metal parts.
A computer image of affected neurons in the brain of Parkinson's patients

Groundwater contamination by TCE is widespread, with studies estimating up to 30% of US drinking water supplies are contaminated with TCE. In Europe, it was reclassified in 2001 as a "category 2" carcinogen, although it is still used in industrial applications.

PERC, like TCE, is used as a dry-cleaning agent and degreasing agent, and is found in many household products. CCl4's major historical use was in the manufacture of chlorofluorocarbons for use as refrigerants, but it has also been used as a fumigant to kill insects in grain.

Commenting on the paper, Dr Michelle Gardner, Research Development Manager at Parkinson's UK, said: "This is the first study to show that the solvent TCE may be associated with an increased risk of developing Parkinson's.

"It is important to highlight that many of the previous uses of this solvent were discontinued for safety reasons over 30 years ago, and that safety and protection in workplaces where strong chemicals such as this solvent are used have greatly improved in recent years."

She also called for more research to confirm the link between these solvents and Parkinson's.

"Further larger-scale studies on populations with more defined exposures are needed to confirm the link," she said.

Wednesday, November 9, 2011

The Hidden Toll of Traffic Jams


original article:

http://online.wsj.com/article/SB10001424052970203733504577024000381790904.html?mod=WSJ_hp_MIDDLENexttoWhatsNewsThird

snippets:

Congested cities are fast becoming test tubes for scientists studying the impact of traffic fumes on the brain.

As roadways choke on traffic, researchers suspect that the tailpipe exhaust from cars and trucks—especially tiny carbon particles already implicated in heart disease, cancer and respiratory ailments—may also injure brain cells and synapses key to learning and memory.

New public-health studies and laboratory experiments suggest that, at every stage of life, traffic fumes exact a measurable toll on mental capacity, intelligence and emotional stability...

No one knows whether regular commuters breathing heavy traffic fumes suffer any lasting brain effect. Researchers have only studied the potential impact based on where people live and where air-pollution levels are highest. Even if there were any chronic cognitive effect on drivers, it could easily be too small to measure reliably or might be swamped by other health factors such as stress, diet or exercise that affect the brain, experts say.

Recent studies show that breathing street-level fumes for just 30 minutes can intensify electrical activity in brain regions responsible for behavior, personality and decision-making, changes that are suggestive of stress, scientists in the Netherlands recently discovered. Breathing normal city air with high levels of traffic exhaust for 90 days can change the way that genes turn on or off among the elderly; it can also leave a molecular mark on the genome of a newborn for life, separate research teams at Columbia University and Harvard University reported this year.

Children in areas affected by high levels of emissions, on average, scored more poorly on intelligence tests and were more prone to depression, anxiety and attention problems than children growing up in cleaner air, separate research teams in New York, Boston, Beijing, and Krakow, Poland, found. And older men and women long exposed to higher levels of traffic-related particles and ozone had memory and reasoning problems that effectively added five years to their mental age, other university researchers in Boston reported this year. The emissions may also heighten the risk of Alzheimer's disease and speed the effects of Parkinson's disease.

Reviewing birth records, Dr. Volk and her colleagues calculated that children born to mothers living within 1,000 feet of a major road or freeway in Los Angeles, San Francisco or Sacramento were twice as likely to have autism, independent of gender, ethnicity and education level, as well as maternal age, exposure to tobacco smoke or other factors. The findings were published this year in the journal Environmental Health Perspectives.

Exhaust fumes can extend farther from roadways than once thought. Traffic fumes from some major L.A. freeways reached up to 1.5 miles downwind—10 times farther than previously believed....

Scientists believe that simple steps to speed traffic are a factor in reducing some public-health problems. In New Jersey, premature births, a risk factor for cognitive delays, in areas around highway toll plazas dropped 10.8% after the introduction of E-ZPass, which eased traffic congestion and reduced exhaust fumes, according to reports published in scientific journals this year and in 2009...

Scientists are only beginning to understand the basic biology of car exhaust's toxic neural effects, especially from prenatal or lifetime exposures. "It is hard to disentangle all the things in auto exhaust and sort out the effects of traffic from all the other possibilities," says Dr. Currie, who studies the relationship between traffic and infant health.

Researchers in Los Angeles, the U.S.'s most congested city, are studying lab mice raised on air piped in from a nearby freeway. They discovered that the particles inhaled by the mice—each particle less than one-thousandth the width of a human hair—somehow affected the brain, causing inflammation and altering neurochemistry among neurons involved in learning and memory.

To study the effect of exhaust on expectant mothers, Frederica Perera at Columbia University's Center for Children's Environmental Health began in 1998 to equip hundreds of pregnant women with personal air monitors to measure the chemistry of the air they breathed. As the babies were born, Dr. Perera and colleagues tested some of the infants and discovered a distinctive biochemical mark in the DNA of about half of them, left by prenatal exposure to high levels of polycyclic aromatic hydrocarbons in exhaust.

By age 3, the children who were exposed prenatally to high exhaust levels were developing mental capacities fractionally more slowly. By age 5, their IQ scores averaged about four points lower on standard intelligence tests than those of less exposed children, the team reported in 2009. The differences, while small, were significant in terms of later educational development, the researchers said.

By age 7, the children were more likely to show symptoms of anxiety, depression and attention problems, the researchers reported this year in Environmental Health Perspectives.

"The mother's exposure—what she breathed into her lungs—could affect her child's later behavior," Dr. Perera says. "The placenta is not the perfect barrier we once thought."

Tuesday, August 23, 2011

A Helium Shortage?


original article:

http://www.wired.com/wired/archive/8.08/helium.html

There are two kinds of stable helium. You know the first one: It puts lift in birthday balloons, Thanksgiving Day parades, the Goodyear blimp.

The other kind, an isotope called helium-3, may not be as familiar. It's a naturally occurring, but very rare, variant of helium that is missing a neutron. Helium-3 is the fuel for a form of nuclear fusion that, in theory, could provide us with a clean, virtually infinite power source.

Gerald Kulcinski, director of the University of Wisconsin's Fusion Technology Institute, is already halfway there. Kulcinski is in charge of an "inertial electrostatic confinement device," an experimental low-power reactor that has successfully performed continuous deuterium-helium-3 fusion - a process that produces less waste than the standard deuterium-tritium fusion reaction.

The next step, pure helium-3 fusion (3He-3He), is a long way off, but it's worth the effort, says Kulcinski. "You'd have a little residual radioactivity when the reactor was running, but none when you turned it off. It would be a nuclear power source without the nuclear waste."

If we ever achieve it, helium-3 fusion will be the premier rocket fuel for centuries to come. The same lightness that floats CargoLifter's CL160 will allow helium to provide more power per unit of mass than anything else available. With it, rockets "could get to Mars in a weekend, instead of seven or eight months," says Marshall Savage, an amateur futurist and the author of The Millennial Project: Colonizing the Galaxy in Eight Easy Steps.

The problem? We may run out of helium - and therefore helium-3 - before the fusion technology is even developed.

Nearly all of the world's helium supply is found within a 250-mile radius of Amarillo, Texas (the Helium Capital of the World). A byproduct of billions of years of decay, helium is distilled from natural gas that has accumulated in the presence of radioactive uranium and thorium deposits. If it's not extracted during the natural gas refining process, helium simply soars off when the gas is burned, unrecoverable.

The federal government first identified helium as a strategic resource in the 1920s; in 1960 Uncle Sam began socking it away in earnest. Thirty-two billion cubic feet of the gas are bunkered underground in Cliffside, a field of porous rock near Amarillo. But now the government is getting out of the helium business, and it's selling the stockpile to all comers.

Industrial buyers use the gas primarily for arc welding (helium creates an inert atmosphere around the flame) and leak detection (hydrogen has a smaller atom, but it usually forms a diatomic molecule, H2). NASA uses it to pressurize space shuttle fuel tanks: The Kennedy Space Center alone uses more than 75 million cubic feet annually. Liquid helium, which has the lowest boiling point of any element (-452 degrees Fahrenheit), cools infrared detectors, nuclear reactors, wind tunnels, and the superconductive magnets in MRI equipment. At our current rate of consumption, Cliffside will likely be empty in 10 to 25 years, and the Earth will be virtually helium-free by the end of the 21st century.
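The 10-to-25-year window is just linear depletion arithmetic on the 32-billion-cubic-foot stockpile. A quick sketch: the stockpile figure is from the article, but the annual draw rates below are illustrative assumptions chosen to bracket the article's range, not reported numbers.

```python
# Rough helium-stockpile depletion sketch.
# 32 Bcf stockpile comes from the article; annual draw rates are
# illustrative assumptions, not reported figures.
STOCKPILE_BCF = 32.0  # billion cubic feet bunkered at Cliffside

def years_until_empty(annual_draw_bcf: float) -> float:
    """Simple linear depletion: years = stockpile / annual draw."""
    return STOCKPILE_BCF / annual_draw_bcf

# Assumed draws of roughly 1.3 to 3.2 Bcf/year reproduce the
# article's 10-to-25-year window.
for draw in (1.3, 3.2):
    print(f"{draw} Bcf/yr -> empty in about {years_until_empty(draw):.0f} years")
```

The linear model ignores demand growth and private production, so it is a floor-to-ceiling estimate rather than a forecast.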

"For the scientific community, that's a tragedy," says Dave Cornelius, a Department of Interior chemist at Cliffside. "It would be a shame to squander it," agrees Kulcinski.

For helium-3's true believers - the ones who think the isotope's fusion power will take us to the edge of our solar system and beyond - talk of the coming shortage is overblown: There's a huge, untapped supply right in our own backyard.

"The moon is the El Dorado of helium-3," says Savage, and he's right: Every star, including our sun, emits helium constantly. Implanted in the lunar soil by the solar wind, the all-important gas can be found on the moon by the bucketful.

Associate professor Tim Swindle and his colleagues at the Lunar and Planetary Laboratory at the University of Arizona have already begun prospecting. Swindle has mapped likely helium-3 deposits on the moon by charting the parts of the lunar landscape most exposed to solar wind against the locations of mineral deposits that best trap the element.

But, says Swindle, when we really want a lot - when we're rocketing to the Red Planet and back for Labor Day weekend - the best place to gas up won't be the moon: "The really big source of it is way out." In our quest for helium-3, we'll travel to Uranus and Neptune, whose helium-rich atmospheres are very similar in chemical composition to the sun's. If futurists like Swindle and Savage are right, the gas will be our reason for traveling to our solar system's farthest reaches - and our means of getting there.

-Emily Jenkins

Note: The Darth Vader balloon is filled with hot air, not helium.

Laser Advances in Nuclear Fuel


original article:

http://www.nytimes.com/2011/08/21/science/earth/21laser.html?scp=1&sq=lasers&st=cse

snippets:


Scientists have long sought easier ways to make the costly material known as enriched uranium — the fuel of nuclear reactors and bombs, now produced only in giant industrial plants.

One idea, a half-century old, has been to do it with nothing more substantial than lasers and their rays of concentrated light. This futuristic approach has always proved too expensive and difficult for anything but laboratory experimentation.

Until now.

In a little-known effort, General Electric has successfully tested laser enrichment for two years and is seeking federal permission to build a $1 billion plant that would make reactor fuel by the ton.

That might be good news for the nuclear industry. But critics fear that if the work succeeds and the secret gets out, rogue states and terrorists could make bomb fuel in much smaller plants that are difficult to detect.

Iran has already succeeded with laser enrichment in the lab, and nuclear experts worry that G.E.’s accomplishment might inspire Tehran to build a plant easily hidden from the world’s eyes.

Backers of the laser plan call those fears unwarranted and praise the technology as a windfall for a world increasingly leery of fossil fuels that produce greenhouse gases.

But critics want a detailed risk assessment. Recently, they petitioned Washington for a formal evaluation of whether the laser initiative could backfire and speed the global spread of nuclear arms.

“We’re on the verge of a new route to the bomb,” said Frank N. von Hippel, a nuclear physicist who advised President Bill Clinton and now teaches at Princeton. “We should have learned enough by now to do an assessment before we let this kind of thing out.”

New varieties of enrichment are considered potentially dangerous because they can simplify the hardest part of building a bomb — obtaining the fuel....

For now, the big uncertainty centers on whether federal regulators will grant the planned complex a commercial license. The Nuclear Regulatory Commission is weighing that issue and has promised G.E. to make a decision by next year.

The Obama administration has taken no public stance on plans for the Wilmington plant. But President Obama has a record of supporting nuclear power as well as aggressive efforts to curtail the bomb’s spread. The question is whether those goals now conflict.

The aim of enrichment is to extract the rare form of uranium from the ore that miners routinely dig out of the ground. The process is a little like picking through multicolored candies to find the blue ones.

The scarce isotope, known as uranium 235, amounts to just 0.7 percent of mined uranium. Yet it is treasured because it splits easily in two in bursts of atomic energy. If concentrations are raised (or enriched) to about 4 percent, the material can fuel nuclear reactors; to 90 percent, atom bombs.
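The effort needed to raise those concentrations is conventionally priced in separative work units (SWU). A minimal sketch of the standard SWU bookkeeping, using the article's 0.7 percent natural and 4 percent reactor-grade figures; the 0.3 percent tails assay is a typical assumed value, not from the article.

```python
import math

def value(x: float) -> float:
    """Standard separative-work value function V(x) = (2x - 1) ln(x / (1 - x))."""
    return (2 * x - 1) * math.log(x / (1 - x))

def swu_per_kg_product(xp: float, xf: float, xw: float) -> tuple[float, float]:
    """Separative work (SWU) and feed (kg) needed per kg of enriched product.

    xp, xf, xw: U-235 fractions in product, feed, and tails.
    Mass balance on uranium and on U-235 fixes the feed per kg of product:
        F = (xp - xw) / (xf - xw)
    """
    feed = (xp - xw) / (xf - xw)   # kg of natural feed per kg of product
    tails = feed - 1.0             # kg of depleted tails per kg of product
    swu = value(xp) + tails * value(xw) - feed * value(xf)
    return swu, feed

# Article's figures: 0.7% natural uranium enriched to 4% reactor fuel;
# 0.3% tails assay is an assumed, typical value.
swu, feed = swu_per_kg_product(xp=0.04, xf=0.0071, xw=0.003)
print(f"~{feed:.1f} kg of natural uranium and ~{swu:.1f} SWU per kg of 4% fuel")
```

Roughly 9 kilograms of natural uranium and 5 SWU per kilogram of reactor fuel — which is why any technology that cuts the cost per SWU, lasers included, matters so much to the economics below.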

Enrichment is so difficult that successful production is quite valuable. A pound of reactor fuel costs more than $1,000 — less expensive than gold but more than silver.

The Laser Race


The first laser flashed to life in 1960. Soon after, scientists talked excitedly about using the innovation to shrink the size of enrichment plants, making them far cheaper to build and run.

The plan was to exploit the extraordinary purity of laser light to selectively excite uranium’s rare form. In theory, the resulting agitation would ease identification of the precious isotope and aid its extraction.

At least 20 countries and many companies raced to investigate the idea. Scientists built hundreds of lasers.

Ray E. Kidder, a laser pioneer at the Livermore nuclear arms lab, estimated that the overall number of scientists involved globally ran to several thousand.

“It was a big deal,” he said in an interview. “If you could enrich with lasers, you could cut the cost by a factor of 10.”

The fervor cooled by the 1990s as laser separation turned out to be extremely hard to make economically feasible.

Not everyone gave up. Twenty miles southwest of Sydney, in a wooded region, Horst Struve and Michael Goldsworthy kept tinkering with the idea at a government institute. Finally, around 1994, the two men judged that they had a major advance.

The inventors called their idea Silex, for separation of isotopes by laser excitation. “Our approach is completely different,” Dr. Goldsworthy, a physicist, told a Parliamentary hearing.

An old black-and-white photograph of the sensitive technology — perhaps the only image of its kind in existence publicly — shows an array of pipes and low cabinets about the size of a small truck.

‘Game Changing’ Technique

In May 2006, G.E. bought the rights to Silex. Andrew C. White, the president of the company’s nuclear business, hailed the technology as “game-changing.”

Mr. Monetta of Global Laser Enrichment, the G.E.-Hitachi subsidiary, said the envisioned plant would enrich enough uranium annually to fuel up to 60 large reactors. In theory, that could power more than 42 million homes — about a third of all housing units in the United States.

The laser advance, he added, will promote energy security “since it is a domestic source.”

In late 2009, as G.E. experimented with its trial laser, supporters of arms control wrote Congress and the regulatory commission. The technology, they warned, posed a danger of quickening the spread of nuclear weapons because of the likely difficulty of detecting clandestine plants.

Experts called for a federal review of the risks. In early 2010, the commission resisted.

Late last year, the American Physical Society — the nation’s largest group of physicists, with headquarters in Washington — submitted a formal petition to the commission for a rule change that would compel such risk assessments as a condition of licensing....

This year, thousands of citizens, supporters of arms control, nuclear experts and members of Congress wrote the commission to back the society’s effort. Many of them cited well-known failures in safeguarding secrets and detecting atomic plants.

But the Nuclear Energy Institute, an industry group in Washington, objected. It said new precautions were unnecessary because of voluntary plans for “additional measures” to safeguard secrets.

A commission spokesman said the petition would be considered next year. In theory, the risk-assessment plan, if adopted, could slow or stop the granting of a commercial license for the proposed laser plant or could result in design improvements.

A Positive Assessment

G.E., seizing the initiative, did an assessment of its own. It hired Dr. Kerr, the former director of Los Alamos and a former senior federal intelligence official, to lead the evaluation. He and two other former government officials concluded that the laser secrets had a low chance of leaking and that a clandestine laser plant stood a high chance of being detected.

“It’s a major industrial facility,” Dr. Kerr said of the planned Wilmington complex in an interview. “Our observation was this was not something that would sit in a garage or be easily hidden.”

Mr. Monetta added that the technical complexity and “significant size” of the laser plant were major barriers to its covert adoption abroad.

Global Laser Enrichment plans to build its complex on more than 100 acres at the Wilmington industrial park, with the main building covering nearly 14 acres. That, like Iran’s main enrichment plant, is roughly half the size of the Pentagon.

But critics say a clandestine bomb maker would need only a tiny fraction of that vast industrial ability — and thus could build a much smaller laser, perhaps like the modest apparatus in the old photograph. Each year, they note, the enrichment powers of the Wilmington plant would be great enough to produce fuel for more than 1,000 nuclear weapons.

When experts cite possible harm from the commercialization of laser enrichment, they often point to Iran. The danger, they say, lies not only in pilfered secrets, but also in the public revelation that a half-century of laser failure seems to be ending.

Their concern goes to the nature of invention. The demonstration of a new technology often begets a burst of emulation because the advance opens a new window on what is possible.

Arms controllers fear that laser enrichment is now poised for that kind of activity. News of its feasibility could spur wide reinvestigation.

Dr. Slakey of the American Physical Society noted that the State Department a dozen years ago warned that the success of Silex could “renew interest” in laser enrichment for good or ill — to light cities or destroy them.

That moment, he said, now seems close at hand.

Friday, June 24, 2011

Astronomical!



original article:

http://www.nytimes.com/2011/06/24/opinion/global/24iht-june24-ihtmag-das-32.html?src=recg


An international team of astronomers recently presented compelling evidence that our galaxy is teeming with lonely Jupiter-sized planets adrift between stars. Alone in the void, unattached to any parent sun, these cosmic orphans appear to fill the heavens in vast numbers. Extrapolating from what they observed, Takahiro Sumi, an astrophysicist at Osaka University, and his colleagues reported in the journal Nature that there could be as many as 400 billion of these lonely wanderers in our Milky Way galaxy alone....

As if on cue, NASA then announced that its Kepler spacecraft, two years into a three-and-a-half year mission to find Earth-size planets around nearby stars, had found a totally unexpected profusion of candidates. Of the 1,235 suspected planets spotted so far, moreover, about a third were in multiplanet solar systems like ours. Judging from these discoveries, it would appear that planets out there are as numerous as grains of sand. Twenty-five years ago, when I was a student in high school, only nine planets were known, all in our solar system. We learned their names and sequence from the sun, from the fleet-footed Mercury to icy Pluto. We learned of the runaway greenhouse effect that had stoked Venus to blistering temperatures and read about the giant storm that is Jupiter’s red spot, and we gazed at pictures of the rings of Saturn that the Voyager spacecraft had sent back....

It may come as a surprise that it was only in 1995 that a planet beyond our solar system was first sighted. The discovery by the Swiss astronomers Michel Mayor and Didier Queloz was confirmed soon after by an independent team in the United States. I was a graduate student then and remember the great excitement this stirred among astronomers. Like Kant, many had believed that the processes that gave rise to our solar system were not unique, and that there were other planets in the universe. Now, observations had finally caught up with belief.

Finding “exoplanets” (short for “extrasolar planets,” as planets outside our solar system are now known) is no easy matter. Planets emit no light of their own; they only reflect the light of their stars. Given the interstellar distances involved, even the stars nearest to us appear only as pinpoints, so it’s a technological challenge to identify a planet thousands of times dimmer.

Mayor and Queloz met the challenge by using a spectrograph at the Haute-Provence Observatory in southeastern France to observe the rhythmic wobble of a sun-like star known as 51 Pegasi, a wobble created by the gravitational tug of an orbiting planet. This “radial velocity” technique has since been used to find many planets, but its reliance on spotting the wobble of a star tends to pick out larger planets close to their parent star — like the Jupiter-sized one Mayor and Queloz reported — which most scientists think are incapable of supporting life.

There are ways to detect smaller planets, and the Kepler spacecraft launched on March 7, 2009, was specifically designed, according to NASA, “to survey a portion of our region of the Milky Way galaxy to discover dozens of Earth-size planets in or near the habitable zone and determine how many of the billions of stars in our galaxy have such planets.” Kepler continuously monitors 145,000 stars in the Milky Way for the brief dimming of light that would indicate a “planetary transit” — a planet crossing the face of the star....

The team that discovered the wandering orphan planets, led by Takahiro Sumi and including David Bennett from the University of Notre Dame, used an even more arcane technique — gravitational microlensing — to spot these otherwise totally invisible bodies. Based on Einstein’s premise that gravity bends light, the technique can see dark objects in the sky by measuring the light they bend from stars behind them. The astrophysicists thus saw 10 drifters, and estimated that there may be one or two of them for each of the approximately 200 billion stars in the Milky Way.

That’s a quantum leap from the nine I knew in high school (reduced to eight after Pluto was demoted to a “dwarf planet” in 2006 by the International Astronomical Union), and even from the 500 or so exoplanets confirmed as of early this year. And if Jupiter-size planets, which are easier to spot, are numbered in the billions, surely there must be many Earth-size planets out there, spinning around their stars at just the right distance to support life? It is time to rewrite the texts.

You may wonder at this point why something so Earth shattering as the discovery of innumerable planets has not caused more excitement in the broad public....

The confirmation that planets are a dime a dozen is really the culmination of the scientific revolution started by Copernicus, Galileo and Kepler more than four centuries ago, a revolution in which our home planet lost its special place at the center of the universe. The prevailing cosmology before Copernicus — codified by the astronomer Claudius Ptolemy in the second century A.D. and, though dead wrong, accepted for the next 1,400 years — held that the Sun, Moon and planets (the five known ones) all revolved around Mother Earth, under a canopy of stars. It was a rational and well-organized universe, in which the Roman Catholic Church could point with authority to heaven above and hell below.

Then Nicolaus Copernicus, a timid Polish canon, put forward an alternate, heliocentric system in which the Sun replaced Earth at the center. In 1543, just before he died, Copernicus finally summoned the courage to publish his treatise, De Revolutionibus Orbium Coelestium (“On the Revolutions of Heavenly Spheres”), which would inspire Galileo Galilei and Johannes Kepler to pursue the studies that became modern astronomy. In an age when science was inextricably linked to religion, the Catholic Church did not surrender lightly its geocentric universe. To challenge it was “false and contrary to Scripture,” Galileo was told by the Inquisition, and even though he disowned his ideas, he spent his last years under house arrest. (In 2000, Pope John Paul II formally apologized for Galileo’s trial).

But there was no turning back. Within a few decades, Isaac Newton confirmed Kepler’s ideas on planetary motion and described the natural laws that have shaped our view of the cosmos ever since. Once the Earth had been displaced from the center of the universe, it was only a matter of time before the Sun was reduced to a garden variety star in a remote spiral arm of the Milky Way galaxy; the Milky Way itself to one of a hundred billion galaxies; and our planet to a speck of cosmic dust.

Wednesday, June 8, 2011

The Gas Is Greener


original article:

http://www.nytimes.com/2011/06/08/opinion/08bryce.html?hpw

snippets:

IN April, Gov. Jerry Brown made headlines by signing into law an ambitious mandate that requires California to obtain one-third of its electricity from renewable energy sources like sunlight and wind by 2020.

But there’s the rub: while energy sources like sunlight and wind are free and naturally replenished, converting them into large quantities of electricity requires vast amounts of natural resources — most notably, land.

Consider California’s new mandate. The state’s peak electricity demand is about 52,000 megawatts. Meeting the one-third target will require about 17,000 megawatts of renewable energy capacity. Let’s assume that California will get half of that capacity from solar and half from wind. Most of its large-scale solar electricity production will presumably come from projects like the $2 billion Ivanpah solar plant, which is now under construction in the Mojave Desert in southern California. When completed, Ivanpah, which aims to provide 370 megawatts of solar generation capacity, will cover 3,600 acres — about five and a half square miles.

The math is simple: to have 8,500 megawatts of solar capacity, California would need at least 23 projects the size of Ivanpah, covering about 129 square miles, an area more than five times as large as Manhattan.

Wind energy projects require even more land. The Roscoe wind farm in Texas, which has a capacity of 781.5 megawatts, covers about 154 square miles. Again, the math is straightforward: to have 8,500 megawatts of wind generation capacity, California would likely need to set aside an area equivalent to more than 70 Manhattans. Apart from the impact on the environment itself, few if any people could live on the land because of the noise (and the infrasound, which is inaudible to most humans but potentially harmful) produced by the turbines.
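The land-use arithmetic in the two paragraphs above can be checked with a quick sketch using only the article's figures. Manhattan's land area of roughly 23 square miles is an assumed value, not stated in the article:

```python
# Back-of-the-envelope check of the article's land-use figures.
ACRES_PER_SQ_MILE = 640
MANHATTAN_SQ_MILES = 23  # assumed; not given in the article

# Solar: 8,500 MW from Ivanpah-sized projects (370 MW on 3,600 acres).
solar_projects = 8500 / 370                                  # ~23 projects
solar_sq_miles = solar_projects * 3600 / ACRES_PER_SQ_MILE   # ~129 sq mi

# Wind: 8,500 MW from Roscoe-sized farms (781.5 MW on ~154 sq mi).
wind_farms = 8500 / 781.5                                    # ~11 farms
wind_sq_miles = wind_farms * 154                             # ~1,675 sq mi

print(round(solar_projects), round(solar_sq_miles))          # 23 129
print(round(wind_sq_miles / MANHATTAN_SQ_MILES))             # 73
```

The wind figure, about 73 Manhattans, matches the article's "more than 70 Manhattans."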

Unfortunately, energy sprawl is only one of the ways that renewable energy makes heavy demands on natural resources.

Consider the massive quantities of steel required for wind projects. The production and transportation of steel are both expensive and energy-intensive, and installing a single wind turbine requires about 200 tons of it. Many turbines have capacities of 3 or 4 megawatts, so you can assume that each megawatt of wind capacity requires roughly 50 tons of steel. By contrast, a typical natural gas turbine can produce nearly 43 megawatts while weighing only 9 tons. Thus, each megawatt of capacity requires less than a quarter of a ton of steel.
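The steel figures above can be verified directly. The 3.5 MW turbine size is an assumed midpoint of the article's "3 or 4 megawatts":

```python
# Tons of steel per megawatt of capacity, from the article's figures.
wind_tons_per_turbine = 200
wind_mw_per_turbine = 3.5      # assumed midpoint of "3 or 4 megawatts"
gas_tons_per_turbine = 9
gas_mw_per_turbine = 43

wind_tons_per_mw = wind_tons_per_turbine / wind_mw_per_turbine  # ~57 tons/MW
gas_tons_per_mw = gas_tons_per_turbine / gas_mw_per_turbine     # ~0.21 tons/MW

print(round(wind_tons_per_mw), round(gas_tons_per_mw, 2))  # 57 0.21
print(round(wind_tons_per_mw / gas_tons_per_mw))           # 273
```

On these numbers, a megawatt of wind capacity needs over 250 times the steel of a megawatt of gas capacity.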

Such profligate use of resources is the antithesis of the environmental ideal. Nearly four decades ago, the economist E. F. Schumacher distilled the essence of environmental protection down to three words: “Small is beautiful.” In the rush to do something — anything — to deal with the intractable problem of greenhouse gas emissions, environmental groups and policy makers have determined that renewable energy is the answer. But in doing so they’ve tossed Schumacher’s dictum into the ditch.

Monday, May 16, 2011

CDC Says Lemon Eucalyptus As Effective As DEET



original articles:

http://www.treehugger.com/files/2011/05/cdc-confirms-lemon-eucalyptus-oil-as-effective-as-deet.php?campaign=daily_nl

http://mattermore.org/2011/05/02/cdc-says-lemon-eucalyptus-as-effective-as-deet

snippets:

In two recent scientific publications, when oil of lemon eucalyptus was tested against mosquitoes found in the US, it provided protection similar to repellents with low concentrations of DEET.

Dr. Mohammed Abou-Donia of Duke University studied lab animals' performance of neuro-behavioural tasks requiring muscle co-ordination. He found that lab animals exposed to the equivalent of average human doses of DEET performed far worse than untreated animals.

Children with DEET toxicity reported lethargy, headaches, tremors, involuntary movements, seizures, and convulsions though the amount that led to this toxicity was unreported, according to the CDC.

Another plus: lemon eucalyptus doesn’t have the oily feel and unpleasant smell of DEET products. Look for products that contain the active ingredient p-Menthane-3,8-diol, such as Cutter Lemon Eucalyptus Pump (4 oz., $7.95), which can repel mosquitoes and ticks for up to six hours.

Thursday, May 5, 2011

Drumbeat of Nuclear Fallout Fear Doesn’t Resound With Experts


Original Article:

http://www.nytimes.com/2011/05/03/science/03radiation.html?src=recg

Snippets:

The nuclear disaster in Japan has sent waves of radiation and dread around the globe, prompting so many people to buy radiation detectors and potassium iodide to fend off thyroid cancer that supplies quickly sold out.

The fear is unwarranted, experts say. People in Japan near the Fukushima Daiichi nuclear power plant may have reason to worry about the consequences of radiation leaks, scientists say, and some reactor workers, in particular, may suffer illness. But outside of Japan, the increase in radiation is tiny compared with numerous other sources, past and present.

In the world’s oceans, thousands of decomposing drums of radioactive waste pose bigger dangers than the relatively small amounts of radioactive water released from the Fukushima Daiichi plant. And natural radiation from rocks, cosmic rays and other aspects of the environment, experts say, represents the biggest factor of all — far bigger than all the man-made emissions, including the current increase from the crippled Japanese reactors.

Dr. Dale Dewar, executive director of Physicians for Global Survival, a group that advocates the abolition of nuclear arms, said the accident meant future generations would live in a world with higher levels of background radiation.

During the cold war, for example, more than 500 detonations pumped the global atmosphere full of deadly radioactive materials, some of which are still emitting radiation.

Figures from the United Nations put the total bomb radiation from decades of atmospheric testing at almost 70 billion curies. By contrast, the 1986 accident at the Chernobyl nuclear power plant released about 100 million curies of the most dangerous materials.

As for Fukushima Daiichi, Japanese officials said on April 12 that the reactor complex had released about 10 million curies. In 1979, the reactor accident at Three Mile Island released about 50 curies into the environment.
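Putting the article's curie figures side by side gives the sense of scale the experts are drawing on:

```python
# Radiation releases cited in the article, in curies.
bombs = 70e9        # Cold War atmospheric testing
chernobyl = 100e6   # Chernobyl, 1986 (most dangerous materials)
fukushima = 10e6    # Fukushima Daiichi, as of April 12
tmi = 50            # Three Mile Island, 1979

print(bombs / chernobyl)      # 700.0  (testing ~700x Chernobyl)
print(chernobyl / fukushima)  # 10.0   (Chernobyl ~10x Fukushima to date)
print(fukushima / tmi)        # 200000.0 (Fukushima ~200,000x Three Mile Island)
```

Cold War testing thus released roughly 7,000 times as much as Fukushima had by mid-April.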

Additionally, many experts say, the threat to the Japanese people is probably low because — unlike the radioactive fallout from the cold war and the Chernobyl accident — most of the radiation is believed to have blown out to sea on the prevailing winds.

The ocean has received many radiological blows over the decades. From 1946 to 1994, when the practice was banned, governments around the globe dumped many thousands of drums of radioactive waste into the abyss, as well as reactors and derelict submarines.

Scientists estimate the dumping in total involved about four million curies of radioactive materials, with the Soviet Union doing a vast majority of the disposal. Decay has lowered the level of that radiological threat over the decades, even as the rotting of drums and barrels has raised the risk of environmental contamination.

At a nuclear dump site near the Farallon Islands off San Francisco, surveys have revealed many fractured drums and evidence that some radioactive materials have spread to sea life. The Environmental Protection Agency found that sponges bore “readily measurable” amounts of plutonium 239 and plutonium 240 — types of man-made radioactive materials that seldom exist in nature. The former has a half-life of 24,360 years, and the latter 6,560 years.
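Half-life arithmetic shows why decades of decay barely touch the plutonium found near the Farallones: the fraction of a radionuclide remaining after t years is 0.5 raised to t divided by the half-life. A sketch using the article's half-lives and an assumed ~50 years since the dumping era:

```python
def fraction_remaining(years, half_life):
    """Fraction of a radionuclide left after `years`, given its half-life."""
    return 0.5 ** (years / half_life)

# Half-lives from the article; ~50 years since Farallon dumping (assumed).
print(fraction_remaining(50, 24360))  # Pu-239: ~0.9986 still present
print(fraction_remaining(50, 6560))   # Pu-240: ~0.9947 still present
```

Shorter-lived isotopes in the dumped waste do fade over decades, but the plutonium is effectively undiminished.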

Tuesday, April 12, 2011

Studies Say Natural Gas Has Its Own Environmental Problems


original article:

http://www.nytimes.com/2011/04/12/business/energy-environment/12gas.html


snippets:

Natural gas, with its reputation as a linchpin in the effort to wean the nation off dirtier fossil fuels and reduce global warming, may not be as clean over all as its proponents say....

The problem, the studies suggest, is that planet-warming methane, the chief component of natural gas, is escaping into the atmosphere in far larger quantities than previously thought, with as much as 7.9 percent of it puffing out from shale gas wells, intentionally vented or flared, or seeping from loose pipe fittings along gas distribution lines. This offsets natural gas’s most important advantage as an energy source: it burns cleaner than other fossil fuels and releases less carbon dioxide.

“These are huge numbers,” he said. “That the industry would let what amounts to trillions of cubic feet of gas get away from us doesn’t make any sense. That’s not the business that we’re in.”

Methane leaks have long been a concern because while methane dissipates in the atmosphere more quickly than carbon dioxide, it is far more efficient at trapping heat. Recent evidence has suggested that the amount of leakage has been underestimated. A report in January by the nonprofit journalism organization ProPublica, for example, noted that the Environmental Protection Agency had recently doubled its estimates for the amount of methane that is vented or lost from natural gas distribution lines.

The study combined these emissions with studies of other methane losses along the processing and distribution cycle to arrive at an estimated total methane loss range from 3.6 to 7.9 percent for the shale gas industry.

When all is factored together, Mr. Howarth and his colleagues conclude that the greenhouse gas footprint of shale gas can be as much as 20 percent greater than, and perhaps twice as high as, coal per unit of energy.
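A rough sketch of the accounting behind that comparison. The 20-year global warming potential of 105 for methane is an assumption (the value reported for Howarth's analysis), and the 44/16 CO2-to-methane mass ratio is combustion stoichiometry; this is an illustration of the logic, not the study's actual model:

```python
# CO2-equivalent footprint of delivered natural gas, counting fugitive methane.
GWP_CH4_20YR = 105            # assumed 20-year GWP for methane
CO2_PER_CH4_BURNED = 44 / 16  # kg CO2 per kg CH4 combusted (stoichiometry)

def footprint_per_kg_burned(leak_rate):
    """kg CO2-equivalent per kg of methane actually burned, when
    `leak_rate` of all produced methane escapes unburned."""
    leaked_per_burned = leak_rate / (1 - leak_rate)
    return CO2_PER_CH4_BURNED + leaked_per_burned * GWP_CH4_20YR

print(round(footprint_per_kg_burned(0.036), 1))  # ~6.7 kg CO2e (low-end leak)
print(round(footprint_per_kg_burned(0.079), 1))  # ~11.8 kg CO2e (high-end leak)
```

At the high-end leak rate, fugitive methane roughly quadruples the 20-year climate impact of each kilogram of gas burned, which is how shale gas can rival coal despite burning cleaner.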

Mr. Hawkins also said that too little was known about just how much methane was being lost and vented, and that studies like Mr. Howarth’s, while needed, relied on too slim a data set to be considered the final word.

“This is a huge and growing industry, and we just don’t have the information we need to make sure that this resource is being developed as cleanly as it can be,” Mr. Hawkins said.

Wednesday, March 23, 2011

Bill Gates Bets On Next-Gen Nuclear


original article:

http://www.forbes.com/2010/03/24/nuclear-power-innovation-technology-ecotech-bill-gates.html

snippets:

Bill Gates announced his plans to fund a viable, next-generation nuclear technology called a traveling-wave reactor.

Traveling-wave reactors have been discussed for decades as a cheaper and safer alternative to typical fission reactors, but until now the supercomputers required to make such technology possible were simply not affordable.

The prototype developed by TerraPower will rely upon Microsoft's supercomputing prowess and a whole lot of computer hardware--1,024 Xeon core processors assembled on 128 blade servers offering "over 1,000 times the computational ability as a desktop computer."

Instead of requiring enriched uranium, it can burn depleted uranium and other low-grade radioactive fuel stocks. It can also burn them for a long, long time. With this new reactor, a long-term reaction is created in which the waste from breeding the fuel is recombined to create more fuel inside the reactor. Theoretically, a nuclear reactor could operate for 100 years without changing the fuel rods, and the resultant waste would be much less radioactive than the waste of our modern-day reactors.

Even if Gates' billions combined with Toshiba's know-how do result in a full-scale industrial version of the traveling-wave reactor, it will be 10 years before one is constructed. The construction process itself will take five years.

Monday, March 14, 2011

Sperm Whales May Have Names


original article:

http://www.wired.com/wiredscience/2011/03/sperm-whale-names/?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed:+wired/index+(Wired:+Index+3+(Top+Stories+2))

snippets:

Subtle variations in sperm-whale calls suggest that individuals announce themselves with a discrete personal identifier. To put it another way, they might have names.

The findings are preliminary, based on observations of just three whales, so talk of names is still speculation. But “it’s very suggestive,” said biologist Luke Rendell of Scotland’s University of St. Andrews. “They seem to make that coda in a way that’s individually distinctive.”

Rendell and his collaborators...have for years studied the click sequences, or codas, used by sperm whales to communicate across miles of deep ocean. In a study published last June in Marine Mammal Science, they described a sound-analysis technique that linked recorded codas to individual members of a whale family living in the Caribbean.

In that study, they focused on a coda made only by Caribbean sperm whales. It appears to signify group membership. In the latest study...they analyzed a coda made by sperm whales around the world. Called 5R, it’s composed of five consecutive clicks, and superficially appears to be identical in each whale. Analyzed closely, however, variations in click timing emerge. Each of the researchers’ whales had its own personal 5R riff....

Rendell stressed that much more research is needed to be sure of 5R’s function. “We could have just observed a freak occurrence,” he said....“This is just the first glimpse of what might be going on.”

That individual whales would have means of identifying themselves does, however, make sense. Dolphins have already been shown to have individual, identifying whistles. Like them, sperm whales are highly social animals who maintain complex relationships over long distances, coordinating hunts and cooperating to raise one another’s calves.

Sperm-whale coda repertoires can contain dozens of different calls, which vary in use among families and regions, as do patterns of behavior. At a neurological level, their brains display many of the features associated in humans with sophisticated cognition. Many researchers think that sperm whales and other cetacean species should be considered “non-human persons,” comparable at least to chimpanzees and other great apes.

Is That a Banana in Your Water?


original article:

http://news.nationalgeographic.com/news/2011/03/110311-water-pollution-lead-heavy-metal-banana-peel-innovation/

snippets:

Banana peels are no longer just for composting or comedy shows: New science shows they can pull heavy metal contamination from river water.

Traditionally, water quality engineers have used silica, cellulose, and aluminum oxide to extract heavy metals from water, but these remediation strategies come with high price tags and potentially toxic side effects of their own.

Bananas, on the other hand, appear to be a safe solution. Banana peels also outperform the competition, says Gustavo Castro, a researcher at the Biosciences Institute at Botucatu, Brazil, and a coauthor of a new study on this use of the fruit’s peel.

For the study, Castro and his team dried and ground banana peels, then combined them in flasks of water with known concentrations of metals. They also built water filters out of peels and pushed water through them.

In both scenarios, “the metal was removed from the water and remained bonded to the banana peels,” Castro said, adding that the extraction capacity of banana peels exceeded that of other materials used to remove heavy metals.

Previous work has shown that other plant parts—including apple and sugar cane wastes, coconut fibers, and peanut shells—can remove potential toxins from water.

Climate-smart agriculture is needed

original article:

http://www.nature.com/news/2011/110302/full/news.2011.131.html?s=news_rss

snippets:

Between 70% and 80% of agricultural greenhouse-gas emissions, such as nitrous oxide, come from the production and use of nitrogen fertilizers. So future rises in food production must be achieved without corresponding boosts in fertilizer use, added Gordon Conway, professor of international development at Imperial College London.

Conway heralded the ‘fertilizer tree’ Faidherbia albida as the future, particularly for farmers in Africa. These trees, which reintroduce nitrogen to the soil, have been shown to quadruple African maize yields in soils with no artificial fertilizer added.

But David Powlson, a retired soil scientist with a visiting professorship at the University of Reading, UK, urged caution. He says that countries' fertilizer use should differ according to their stages of development, particularly in Africa, which has soils that are starved of key nutrients such as nitrogen and phosphorus.

The overuse of nitrogen fertilizers elsewhere in the world, such as in China, "should not be used as an excuse not to give nitrogen fertilizers to Africa", he says.

Wednesday, March 9, 2011

When Energy Efficiency Sullies the Environment


Original Article:

http://www.nytimes.com/2011/03/08/science/08tier.html?_r=1&src=recg

Snippets:

Energy-efficiency standards have been embraced by politicians of both parties as one of the easiest ways to combat global warming. Making appliances, cars, buildings and factories more efficient is called the “low-hanging fruit” of strategies to cut greenhouse emissions.

But a growing number of economists say that the environmental benefits of energy efficiency have been oversold. Paradoxically, there could even be more emissions as a result of some improvements in energy efficiency, these economists say.

The problem is known as the energy rebound effect. While there’s no doubt that fuel-efficient cars burn less gasoline per mile, the lower cost at the pump tends to encourage extra driving. There’s also an indirect rebound effect as drivers use the money they save on gasoline to buy other things that produce greenhouse emissions, like new electronic gadgets or vacation trips on fuel-burning planes....

“Efficiency advocates try to distract attention from the rebound effect by saying that nobody will vacuum more because their vacuum cleaner is more efficient,” Mr. Shellenberger said. “But this misses the picture at the macro and global level, particularly when you consider all the energy that is used in manufacturing products and producing usable energy like electricity and gasoline from coal and oil. When you increase the efficiency of a steel plant in China, you’ll likely see more steel production and thus more energy consumption....”

But if your immediate goal is to reduce greenhouse emissions, then it seems risky to count on reaching it by improving energy efficiency. To economists worried about rebound effects, it makes more sense to look for new carbon-free sources of energy, or to impose a direct penalty for emissions, like a tax on energy generated from fossil fuels. Whereas people respond to more fuel-efficient cars by driving more and buying other products, they respond to a gasoline tax simply by driving less.

Friday, February 18, 2011

It's Time for Millennium Consumption Goals


original article:

http://blogs.worldwatch.org/transformingcultures/mcgs/

Snippets:

“a Sri Lankan scientist is calling for the drafting of “Millennium Consumption Goals” to [help] rich countries curb their climate-damaging consumption habits, in the same way the poor have Millennium Development Goals (MDGs) to get them out of poverty.”

For those unfamiliar with the Millennium Development Goals, these are a set of 8 goals for “underdeveloped” societies to halve poverty, lack of access to clean water, illiteracy, and other key indicators of underdevelopment by 2015... As the scientist, Mohan Munasinghe, noted, consumption is at the heart of overdeveloped countries’ environmental burden so tackling this issue head-on is key.

1. Halve obesity and overweight rates by 2020 (we’re starting the MCGs later than the MDGs). This will reduce mortality, morbidity, and economic costs, as well as reduce ecological pressures driven by overconsumption of food.

2. Halve the work week from the current 40+ hours to 20 hours per week. This will better distribute jobs and wealth, promote healthier living, and reduce economic activity, which is essential in our ecologically taxed world.

3. Better distribute wealth by raising taxes on the wealthiest members of society. That one will get me in trouble with the American Tea Party but let’s dust off the idea of Noblesse Oblige: to those given much, much is expected in return. The days of extreme wealth spent on luxurious living must draw to a close. The Earth can’t handle it any longer.

4. Double the rate of use of non-motorized transport (bikes, walking, etc.). Increasing these forms of transport will improve health, reduce fossil fuel and material use, and make for safer cities.

5. Guarantee access to health care for all. Yes, another minefield in the USA, but standard procedure in most industrial countries so that’ll be an easy goal for most countries to achieve.

Wednesday, February 16, 2011

Population Growth and Ecological Footprint - It's About Equity, Environment & Preventing Collapse by Matthew McDermott



Original Article:

http://www.treehugger.com/files/2011/02/population-growth-ecological-footprint-about-equity-environment-collapse.php?campaign=daily_nl

Snippets:

As it stands now, 500 million people on the planet (about 7% of the world population) are responsible for 50% of all CO2 emissions. At the other end of the scale, the bottom 3 billion people are responsible for just 6% of the total. The United States leads the world, with its 5% of world population roughly responsible for one-third of all global expenditures on goods and services.

If that level of resource consumption were extended globally, the planet could support just 1.4 billion people.

To equitably support current population levels and not continue to degrade the ability of the planet to support us, we'd all have to live like the average person in Thailand or Jordan--roughly $5,000 a year's worth of consumption.

And remember that population growth is expected to continue until we hit about 9 billion people (or perhaps more, if recent UN warnings bear out), which means that the global resource pie gets sliced into smaller and smaller equal pieces--or relatively equal at least; I'm not advocating absolute equality as the ideal.

The hard part of this should be obvious if we hold on to equity as a virtue (and make no mistake, I think we should): if a minority of the world's people consume the vast majority of the world's resources, and do so in an ecologically unsustainable way, while a large group of people claims a right to what that minority has, a recipe for collapse is quickly created.

Beyond the usual (and valid) suggestions of increasing women's education and reproductive freedom, creating more gender equity, and lifting people out of absolute poverty, there are several things to do, both concrete and conceptual, that may be able to prevent this:

Greater Efficiency Can Help, But Not Solve This Problem

Efficiency and waste: improving the former and reducing the latter can certainly help everyone do more with less resource consumption. Will it make up for the fact that the ecological footprint of the average citizen of the United States is roughly four times the carrying capacity of the planet, and that even the average person in China has an unsustainable footprint? Probably not, but both are critically important.

Until Environmental Damage Is Incorporated Into Economics We Will Continue Making Bad Choices

Start measuring more than GDP, incorporating well-being and environmental factors (like depleting natural capital), and reporting this as the baseline of national worth. Plenty of more articulate people than I have written extensively on alternate economic indicators such as the Genuine Progress Indicator.

'Developed' Is A Bad Term For What Industrial Nations Are

Getting rid of the terms 'developed' and 'developing' as applied to nations. Even 'emerging economy' is problematic. 'Developed' and 'developing' signify completeness, or the process of reaching it; 'emerging' is a variation on the same theme. All imply an inherent goodness in consuming resources the way those of us in the US, Europe, Japan, Australia, and so on do, when in fact, ecologically speaking, that level of consumption extended globally is a disaster in the making.

Continued Idealization of Economic Growth Is Delusion

Going along with this is moving our expectations and language away from growth economics to steady-state economics. We have been so indoctrinated with the notion that growth is always good that this may be difficult, but given what we now know about the ecological limits of the planet, and how in the highest-consuming nations of the world any further growth is likely uneconomic, continuing to talk about economic growth as an unqualified good is simply delusional.

We Have to Be Able to Calmly Talk About Population Growth & Resource Consumption

Break the taboo that treats any talk of population growth as an infringement on personal liberty...If we cannot even have a nuanced discussion of how our personal reproductive choices, the group reproductive choices of nations, and our individual and collective consumer choices combine into environmental and social impact, we will surely continue down the path of collapse.

Friday, February 4, 2011

A 375-Mile Battery Range: Too Good to be True?



Full Article:

http://wheels.blogs.nytimes.com/2011/02/03/a-375-mile-battery-range-too-good-to-be-true/

Snippet:

Last October, a Kolibri-powered Audi A2, converted by DBM Energy GmbH and Lekker Energie with funding from the German economy ministry, traveled from Munich to Berlin, around 375 miles, which the car covered in about seven hours without recharging. Upon arrival, its 115-kWh pack was only around 80 percent depleted, implying a total range of more than 400 miles from a pack weighing just 770 pounds. For comparison, the Tesla Roadster’s pack, which claims 245 miles of range, weighs 990 pounds.

If verified — and DBM states on its Web site that the inspection organization DEKRA checked the vehicle and also cites 30 eyewitnesses — it would be a world record. A specially designed battery-powered Daihatsu Mira went 623 miles on a track last May, but averaged only 25 miles per hour. The 375-mile journey by the Lekker Mobil is notable because it was done on public roads in wet weather at an average speed of 55 m.p.h.

As for the controversy? The A2 disintegrated in a December fire while parked in a warehouse, though DBM claims that a makeshift battery unit, and not the one used during the supposed record run, was installed at the time. The fire is under police investigation, but it has prompted skeptics to further question whether DBM has anything to hide.
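As a quick sanity check on the figures quoted above (a sketch using only the article's numbers; the calculation itself is mine, not DBM's), the implied full-pack range and pack-level specific energy can be worked out:

```python
# All inputs are figures quoted in the article.
PACK_KWH = 115          # claimed Kolibri pack capacity, kWh
DEPLETION = 0.80        # fraction of the pack used on the trip
TRIP_MILES = 375        # Munich-to-Berlin distance, miles
PACK_LB = 770           # claimed pack weight, pounds

energy_used_kwh = PACK_KWH * DEPLETION              # ~92 kWh consumed
wh_per_mile = energy_used_kwh * 1000 / TRIP_MILES   # ~245 Wh per mile
implied_range = TRIP_MILES / DEPLETION              # ~469 miles on a full pack
wh_per_kg = PACK_KWH * 1000 / (PACK_LB * 0.4536)    # pack-level specific energy

print(round(wh_per_mile), round(implied_range), round(wh_per_kg))
```

The implied ~469-mile full-pack range is consistent with the article's "more than 400 miles," but the roughly 329 Wh/kg pack-level specific energy is far beyond typical lithium-ion packs of the era (the Tesla Roadster's was around 120 Wh/kg), which is a big part of why skeptics wanted independent verification.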

Wednesday, January 26, 2011

What's Mine Is Yours: The Rise of Collaborative Consumption



Overview:

Collaborative Consumption describes the rapid explosion in swapping, sharing, bartering, trading and renting being reinvented through the latest technologies and peer-to-peer marketplaces in ways and on a scale never possible before. If you've used a car sharing service like Zipcar, experienced peer-to-peer travel on Airbnb, given away or found something on Freecycle or lent money through Zopa, you're already part of the rise of Collaborative Consumption.

Full Interview:

http://www.treehugger.com/files/2011/01/rachel-botsman-explains-how-collaborative-consumerism-will-change-our-world-interview.php?campaign=daily_nl

Snippets:

We are just in the nascent stages of Collaborative Consumption. We have already seen examples like Netflix, eBay and Zipcar become household names but that has taken a decade - technology and consumer values were playing catch-up. But I think the current massive cultural and technological shift is accelerating the next wave of Collaborative Consumption at an astonishing rate.

I think it's critical for more big brands to enter the space. BMW, Daimler and Peugeot have all recently launched car sharing models. Amazon just announced its 'Buy Back' scheme of second-hand unwanted books. I would love to see a big bank enter the social lending space; for a retail giant like Target to launch an innovative rental model; for a brand like Zappos to create a shoe swapping and repair platform....

Swapping sites for goods with limited value or that fulfill a temporary need, such as baby goods, books and DVDs, are growing at a staggering rate; Peer-to-peer space rental sites (homes, gardens, parking spaces, storage etc.) such as AirBnb, Landshare and Parkatmyhouse are exploding in mainstream popularity; Bike sharing is the fastest growing form of transportation in the world; Co-working spaces are popping up in the world's major cities; I think 2011 is the year that we start to see skill or 'favor' share communities such as TaskRabbit, Skillshare and Hey Neighbor start to take off.

Big picture (and I am talking in 10-20 years time), I think we will see the way we measure 'wealth', 'growth' and 'happiness' being completely redefined. We are already seeing countries such as the UK, Canada and France looking at reinventing measures beyond GDP that give a picture of the holistic well-being of a nation. As Sarkozy commented, "So many things that are important to individuals are not included in GDP."

Tuesday, January 25, 2011

Polar bear's epic nine day swim in search of sea ice



full article:

http://news.bbc.co.uk/earth/hi/earth_news/newsid_9369000/9369317.stm

snippets:

"This bear swam continuously for 232 hours and 687 km and through waters that were 2-6 degrees C," says research zoologist George M. Durner.

"We are in awe that an animal that spends most of its time on the surface of sea ice could swim constantly for so long in water so cold. It is truly an amazing feat."
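Quick arithmetic on the figures quoted above (the numbers are from the article; the conversion is mine) shows just how steady that pace was:

```python
# Figures quoted in the article.
SWIM_KM = 687        # total distance swum
SWIM_HOURS = 232     # continuous time in the water

avg_kmh = SWIM_KM / SWIM_HOURS    # average speed, ~3 km/h (a brisk walking pace)
days = SWIM_HOURS / 24            # duration in days, ~9.7

print(round(avg_kmh, 1), round(days, 1))
```

That works out to about 3 km/h sustained for nearly ten days straight, which matches the "nine day swim" of the headline.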

By fitting a GPS collar to a female bear, researchers were able to accurately plot its movements for two months as it sought out hunting grounds.

The scientists were able to determine when the bear was in the water by the collar data and a temperature logger implanted beneath the bear's skin.

The bears hunt their prey on frozen sea ice: a habitat that changes according to temperature.

"This dependency on sea ice potentially makes polar bears one of the most at-risk large mammals to climate change," says Mr Durner.