Study: multiple-fault ruptures can trigger stronger earthquakes

A new study by Stanford University researchers could improve future seismic hazard predictions.

The new research reveals how the rupture of multiple faults can lead to stronger earthquakes. Based on the new findings, the 1812 Southern California earthquake was likely caused by slip on one fault triggering the rupture of a second fault.

Previously, scientists blamed the San Andreas Fault alone for the magnitude 7.5 Southern California quake. However, the study reveals the nearby San Jacinto Fault to be an accomplice.

The San Jacinto Fault's ability to produce serious quakes in tandem with the San Andreas Fault has been underestimated.

“This study shows that the San Jacinto Fault is an important player that can influence what the San Andreas Fault is doing and lead to major earthquakes,” said Julian Lozos, author of the study published in ‘Science Advances’ and an assistant professor of geological sciences at California State University. “It’s important that we learn more about how activity on any single fault can affect other faults.”

This new study can improve future seismic hazard predictions. (Photo courtesy of www.morguefile.com)

According to evidence found by the study, the San Jacinto Fault slipped first between the cities of San Jacinto and Moreno Valley. The rupture then travelled north and crossed over to the San Andreas Fault near the Cajon Pass, where the two faults run within about 1.5 kilometres of each other. Together, the two ruptured faults caused the Southern California earthquake on that ill-fated December morning.

“Understanding this earthquake in particular, and complex earthquakes in general, is important to quantifying seismic hazard,” said geophysicist Greg Beroza, the Wayne Loel Professor at Stanford’s School of Earth, Energy & Environmental Sciences.

Lozos’ research could help inform the Uniform California Earthquake Rupture Forecast (UCERF), which is used to prepare for future earthquakes. In earlier UCERF reports, the estimated chance of a magnitude 8 or larger earthquake striking California within 30 years was about 4.7 percent. In the latest report, which takes the effect of multi-fault ruptures into account, that estimate has grown to about 7 percent.

Lozos also hopes his research increases earthquake awareness among the general public, especially the millions of Californians living in the Inland Empire, which is underlain by both the San Andreas and San Jacinto faults.

“People shouldn’t just be thinking about the San Andreas Fault,” Lozos said. “There are lots of other faults, so it’s important for everyone in the regions at risk to have their earthquake kits ready.”

Stanford nuclear expert gives three lessons, five years after Fukushima disaster

An expert on nuclear materials at Stanford University says we need to reassess natural disaster risks, acknowledge the links between nuclear energy and renewables and rethink the language used when referring to these disasters.

According to Rodney Ewing, the Frank Stanton Professor in Nuclear Security and a senior fellow at the Center for International Security and Cooperation in the Freeman Spogli Institute, the meltdown was not an accident, as it has been described in the media and in various scientific papers, but rather the result of a failure of safety analysis.

In the event of a powerful earthquake, power plants automatically shut down their reactors, and backup generators start immediately to keep coolant circulating over the nuclear fuel, preventing overheating and a possible meltdown. At Fukushima, however, the tsunami flooded the diesel generators, which were installed low and close to the coast, cutting off power to the cooling system. The poorly placed generators led to the catastrophic tragedy at Fukushima. “This is why when I refer to the tragedy at Fukushima, it was not an accident,” said Ewing.

“The Japanese people and government were certainly well acquainted with the possibility of tsunamis.”

His second lesson is to rethink the meaning of ‘risk’. He says that calling an earthquake or tsunami a rare event, when geological records show it has happened before and will happen again, reduces the sense of urgency to prepare in advance.

Stanford expert says we need to reassess natural disaster risks. (Photo courtesy of www.morguefile.com)

“It can be that the risk analysis works against safety, in the sense that if the risk analysis tells us that something’s safe, then you don’t take the necessary precautions,” he said. The assumption that the reactors were safe in an earthquake prevented further preparation for a tsunami.

He said that in the case of the reactors at the Fukushima Power Plant, one should not assess the risk of a single reactor being hit by a tsunami, but rather the risk of a tsunami hitting any one of the reactors over the plant’s lifetime. Framed this way, the probability of a reactor being hit by a tsunami increases, especially if the geological evidence of past tsunamis is also considered.
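To illustrate the point about lifetime risk, here is a minimal sketch (not from Ewing or the study; the annual probability and plant lifetime are assumed values chosen purely for illustration) of how a small yearly tsunami probability compounds over decades of operation:

```python
# Illustrative sketch only: the annual tsunami probability and plant lifetime
# below are assumed values, not figures from Ewing or the study.
annual_prob = 0.005    # assumed 0.5% chance of a damaging tsunami in any given year
lifetime_years = 40    # assumed operating lifetime of the plant

# Probability that at least one such tsunami occurs during the lifetime,
# treating years as independent: 1 - (chance of no event in every year).
lifetime_prob = 1 - (1 - annual_prob) ** lifetime_years

print(f"Annual probability:   {annual_prob:.1%}")    # 0.5%
print(f"Lifetime probability: {lifetime_prob:.1%}")  # roughly 18%
```

Under these assumed numbers, a 0.5 percent annual chance becomes roughly a one-in-five chance over a 40-year lifetime, which is the shift in framing Ewing is describing.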

The third lesson, according to Ewing, is the strong link between nuclear energy and the future of renewables. Since the tragedy, Ewing has observed its continuing effect throughout the nuclear industry, and he believes this impact will in turn greatly affect the future of renewable energy resources.

The Nuclear Regulatory Commission in the United States has required all reactor sites to reassess their risk from natural disasters, particularly sites in the central United States. However, this response was not shared globally.

“In countries like Germany and Switzerland, the Fukushima tragedy was the last straw,” Ewing said. “This was particularly true in Germany, where there has always been a strong public position against nuclear power and against geologic waste disposal. Politically, Germany announced that it will shut down its nuclear power plants.”

Ewing says Germany is a telling example, since it is a technologically advanced country trying to move away from nuclear energy while also trying to reduce its carbon emissions. The move toward renewable energy sources will be costly, but it is a price many Germans are willing to pay.

Study: Companies and environmentalists working together can slow deforestation

According to a Stanford University study, collaborative efforts to reduce deforestation are more than twice as effective as “confrontational” programs implemented by either nongovernmental organizations or industry.

Various eco-certifications inform consumers of their impact on deforestation. However, until now there has been little research on their effectiveness.

The study finds that these certifications have substantially improved forest product sustainability.

According to “Impacts of Nonstate, Market-Driven Governance on Chilean Forests,” published in the Proceedings of the National Academy of Sciences, market-driven efforts have markedly reduced deforestation, with multi-party collaborations having the greatest impact.

“Our research shows that these market-based conservation efforts have reduced deforestation in Chile,” said lead author Robert Heilmayr, a recent graduate of Stanford’s Emmett Interdisciplinary Program in Environment and Resources.

A comparison of the conservation outcomes of CERTFOR, a largely industry-developed certification program; the Joint Solutions Project (JSP), an NGO-instigated deforestation moratorium; and the Forest Stewardship Council (FSC), a collaboration between industry and nongovernmental organizations, provided insight into this question.

FSC was most successful in reducing deforestation. (Photo courtesy of www.morguefile.com)

While CERTFOR participation was associated with an average 16 percent reduction in deforestation, JSP-only participants saw an average reduction of 20 percent. With a 43 percent reduction in deforestation, FSC was the most successful.

According to Heilmayr and co-author Eric Lambin, the George and Setsuko Ishiyama Provostial Professor in the School of Earth, Energy & Environmental Sciences, FSC’s leading success came from balancing strict environmental requirements with cost-effective solutions. This balance gives participants the sense that their interests are protected, so they follow through on the requirements.

The analysis also suggests that, in contrast to government policies, private, market-driven programs are better at lowering deforestation rates in places where deforestation pressure is high.

“Traditional conservation policies like national parks often protect remote, pristine locations,” Heilmayr said. “Agreements between companies and environmentalists can reduce deforestation in more threatened forests.”

“In the globalization era, deforestation is increasingly associated with consumption in distant, international markets,” said Lambin. “We need new approaches to environmental governance that regulate the impact of international actors.”

Intense ocean turbulence discovered near sea floor in equatorial Pacific

Stanford scientists say waves hitting the equatorial seafloor create centimetre-scale turbulence that is essential to driving ocean circulation on a global scale.

The new findings, published in the journal Geophysical Research Letters, could eventually lead to improved climate forecasts.

“Climate models don’t do a great job of simulating global ocean circulation because they can’t simulate the small scales that are important for deep ocean mixing,” said study co-author Ryan Holmes, a graduate student at Stanford’s School of Earth, Energy & Environmental Sciences, in a statement.

The meridional overturning circulation (MOC) is a global conveyor belt in which surface waters cooled at high latitudes sink and flow toward the equatorial regions. These waters eventually mix with warmer, less dense water and rise back to the surface, then flow toward the higher latitudes to complete the cycle. A single cycle takes hundreds to thousands of years.

Until now, scientists didn’t know exactly where in the tropical oceans this mixing took place. They believed that intense deep-ocean mixing required water to flow over rugged terrain.

James Moum, a professor of physical oceanography at Oregon State University, and Holmes, who had been investigating equatorial mixing in ocean models for his PhD, went on a five-week research cruise in the equatorial Pacific to better understand mixing in tropical oceans.

The team encountered strong turbulence along a 700-metre vertical stretch of water close to the ocean floor. This turbulence was unexpected as this part of the ocean floor was relatively smooth.

“This was the first time that anyone had observed turbulence over smooth topography that was as strong as that found over rough topography,” said Holmes.

Using computer simulations, Holmes and Leif Thomas, an expert in the physics of ocean circulation at Stanford, modelled how winds blowing across the ocean surface create ‘internal waves’. However, their model did not produce the turbulence necessary for abyssal mixing. Instead, the internal waves bounced between two vertical bands of water and the smooth sea floor without breaking.

It wasn’t until Thomas incorporated the horizontal component of Earth’s spin into the simulation that everything began to fall into place. “It occurred to me that internal waves at the equator, where the effects of the horizontal component of Earth’s spin are most pronounced, could experience an analogous behavior when the waves reflect off the seafloor,” said Thomas.

Holmes says that including this horizontal component changed the physics of waves in the deep equatorial ocean, likely driving them to break and cause turbulence and mixing.

Thomas says the new findings point out the important role the deep equatorial waters play for Earth’s climate system.

“Scientists have long known that the equatorial upper ocean is critical for the physics of internal variations in climate such as El Niño,” he said. “Our new study suggests the abyssal ocean at the equator could impact the climate system on much longer timescales.”

Breakthrough in psychology may lead to new treatments for depression

A new study has shown, for the first time, a link between noradrenergic neurons and susceptibility to depression.

The study, published in the journal ‘Nature Neuroscience’, comes from the team of Bruno Giros, a researcher at the Douglas Mental Health University Institute and a professor of psychiatry at McGill University.

“We know that a small cerebral structure, known as the ventral tegmental area, contains dopaminergic neurons that play a key role in vulnerability to depression,” said Bruno Giros, whose team is part of the CIUSSS de l’Ouest-de-l’Île-de-Montréal research network.

By mimicking stressful events in animal models, the researchers found that an increase in dopaminergic activity increases the occurrence of depression.

The dopaminergic neuron is controlled by the noradrenergic neuron. “It is this control that steers the body’s response toward resilience or toward vulnerability to depression,” said Giros.

Giros’ team showed that animals incapable of releasing noradrenaline are more likely to develop depression following chronic stress. However, the reverse does not hold: increasing noradrenaline production does not lead to higher resilience and less depression.

Noradrenergic neurons are found in a cerebral structure called the locus coeruleus. These neurons communicate via a neurotransmitter molecule called noradrenaline, which regulates emotions, sleep and mood – and now, Giros believes, is also involved in resilience and depression.

Stressful life events like job loss, an accident or the death of a loved one can cause major depression in some people but not in others. A determining factor is resilience, a biological mechanism that enables an individual to bounce back from a traumatic or stressful event. However, researchers are still working out exactly how resilience plays its role.

“Beyond this discovery about the brain mechanisms involved in depression, our results help explain how adrenergic drugs may work and could be used to treat major depression,” said Giros.

Quality of school impacts economic growth in developing countries: expert

The quality, rather than the quantity, of education is what improves a nation’s economy in the developing world, according to a Stanford University expert.

“If there is going to be inclusive economic development across the world, attention must focus on school quality and having all students achieve basic skills,” wrote Eric Hanushek, a Stanford economist in a new study published in Science magazine.

The implications are especially important for developing countries, which lack knowledge-based economies.

Many believe rates of school attendance, student enrollment and years of schooling to be the major factors affecting the economy. However, the study argues that economic growth depends far more on students’ basic skills in math and literacy.

Increasing human capital – the combined economic value of the skills, knowledge and experience within a community – has been misinterpreted in its implementation, according to Hanushek. He said it has led to policies looking to increase head counts, enrollment and retention in schools, ignoring important issues related to increasing skills and knowledge among students.

“We argue that too much attention is paid to the time spent in school, and too little is paid to the quality of the schools and the types of skills developed there,” Hanushek wrote with co-author Ludger Woessmann, an economics professor at the University of Munich.

The study indicates that ‘knowledge capital’ (the cognitive skills of the population), rather than the human capital of ‘school attainment’ (the highest level of education completed), is the key factor in economic development. The study also finds that student enrollment measures have no correlation with how much students are learning.

Track records show that the world’s leading economies are shifting to knowledge-based economies.

“They require both the skills to innovate and a highly skilled workforce to execute new designs. As economies move from agriculturally based to manufacturing based to knowledge based, the importance of cognitive skills becomes magnified,” he said.

He also added that job markets change rapidly in growing economies, requiring individuals to adjust to new demands.

New technique could rewire and grow new neurons

A new technique could potentially create new neurons and allow them to reconnect in people with central nervous system damage.

A research team led by McGill University and the Montreal Neurological Institute has created new functional connections between neurons for the first time.

These artificially grown neurons grow 60 times faster, but are otherwise identical to naturally growing neurons in the human body.

(Courtesy of: McGill University)

“It’s very exciting, because the central nervous system doesn’t regenerate,” said Montserrat Lopez, a McGill post-doctoral fellow who spent four years developing, fine-tuning and testing the new technique, in a statement. “What we’ve discovered should make it possible to develop new types of surgery and therapies for those with central nervous system damage or disease.”

Making healthy neuronal connections that transmit electrical signals the way naturally grown neurons do requires precise manipulation and specialized instruments, because the neurons are minute, about 1/100th the width of a single strand of hair. An atomic force microscope is used to stretch the transmitting part of one neuron and reconnect it with the part of another neuron that acts as a receiver.

Margaret Magnesian, a neuroscientist at the Montreal Neurological Institute and an author on the paper “Rapid Mechanically Controlled Rewiring of Neuronal Circuits,” says, “This technique can potentially create neurons that are several [millimetres] long, but clearly more studies will need to be done to understand whether and how these micro-manipulated connections differ from natural ones.”

A new design could bring Internet access to the entire globe

More than three billion people across the globe don’t have Internet access – imagine connecting to the web just by attaching a thin panel to the back of a tablet.

Professor George Eleftheriades and his team in The Edward S. Rogers Sr. Department of Electrical & Computer Engineering at the University of Toronto have created a metamaterial surface – an engineered material not found in nature – that focuses electromagnetic waves into a concentrated beam, optimizing the way the antenna works.

The work was originally published in the journal “Nature Communications.” The prototype is an inexpensive, thin antenna, similar to a patterned ceiling tile, that allows the transmission of a signal such as broadband Internet directly from space.

“The beams that come off of this surface are like lasers – we can send this energy very far, maybe even all the way to a satellite in orbit,” said Eleftheriades in a statement.

Cavity-excited Huygens’ metasurface antenna. (Courtesy of: www.Nature.com)

A typical satellite dish requires a tripod-shaped structure at its centre, held a certain distance from the surface to focus the beam, which results in a large, bulky setup.

The new design instead produces a thin, flat, uniformly illuminated antenna, in contrast to a bulky rooftop satellite dish.

“With this design, we’ve optimized the way the antenna works to overcome the traditional compromise between the size of low-profile aperture antennas, and the strength of their beams,” said Eleftheriades.

Currently their structure is two centimetres thick, and their goal is to design a thinner and more sharply focused panel.

“Many companies are working toward providing Internet to the rest of the world,” explained Eleftheriades. “They’re looking for low-cost, low-profile antennas to communicate with satellites, and they have to be portable. We think this design is a step toward that.”

Greenland Ice Sheet melting at an alarming rate: report

Researchers have found the Greenland Ice Sheet is melting three times as fast as it did during the entire 20th century.

The Greenland Ice Sheet is the second largest in the world. Satellite images reveal it has been melting since 1997 at an alarming rate, contributing to a significant rise in global sea levels.

Climate researchers from the Centre for GeoGenetics, along with national and international colleagues, collected about a century’s worth of data.

This is the first time information was collected through observation rather than through computer model-generated estimates.

The United Nations Climate Panel (IPCC), however, did not include the Greenland Ice Sheet as a contributing factor in its latest report, from 2013. The reason is the lack of direct observations – information on thermal expansion of ocean water was also missing for similar reasons.

“If we do not know the contribution from all the sources that have contributed towards global sea level rise, then it is difficult to predict future global sea levels. In our paper we have used direct observations to specify the mass loss from the Greenland Ice Sheet and thereby highlight its contribution to global sea level rise,” said Kristian K. Kjeldsen from the Centre for GeoGenetics at the University of Copenhagen in a statement.

The loss from the Greenland Ice Sheet between 1900 and 2010 accounts for 10 to 18 percent of total sea level rise, according to the results published in the journal Nature.

Associate Professor Shfaqat Abbas Khan of the Technical University of Denmark (DTU) says the average melting rate over the past decade has been the highest in the 115 years of observation.

“We are one step further in mapping out the individual contributions to global sea level rise. In order to predict future sea level changes and have confidence in the projections, it is essential to understand what happened in the past,” said senior author Professor Kurt H. Kjær, from the Centre for GeoGenetics, in a statement. Kjær also says the data will be an important contribution to future IPCC reports.

 

Video cameras give insight into tool use among New Caledonian crows

Ecologists from the University of St Andrews in the UK were able to observe and record the tool-assisted foraging behaviour among New Caledonian (NC) crows in the wild.

NC crows are the only non-human species known to create hooked tools in the wild to extract embedded prey. Little is known about how the tools are made and used, owing to the shy nature of the species. This study was the first to estimate how much time NC crows spend foraging with and without tools.

Miniature video cameras were attached to 19 wild crows. The footage of 10 birds was recovered and analyzed after about a week.

The study, published in the journal Biology Letters, states: “across all 10 birds, it was estimated that tool-related behaviour occurred in 3% of total observation time, and accounted for 19% of all foraging behaviour.”
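Read together, and as a rough back-of-the-envelope inference rather than a figure stated in the study, those two percentages imply how much of the birds’ recorded time was spent foraging at all:

```python
# The 3% and 19% figures are quoted from the study; the derived share of
# time spent foraging is an illustrative inference, not a reported result.
tool_share_of_total = 0.03      # tool-related behaviour as a share of total observation time
tool_share_of_foraging = 0.19   # tool-related behaviour as a share of all foraging behaviour

# If tool use is 3% of all observed time and 19% of foraging time, then
# foraging (with or without tools) is roughly 3% / 19% of the total time.
foraging_share_of_total = tool_share_of_total / tool_share_of_foraging
print(f"Estimated share of time spent foraging: {foraging_share_of_total:.0%}")  # about 16%
```

In other words, foraging of any kind occupied roughly a sixth of the observed time, with tool use making up about a fifth of that foraging.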

Footage from the University of St Andrews of the New Caledonian crow’s behaviour.

The recordings revealed that the birds made the hooked tool by snapping a twig just above and below a branching node. The crow then stripped the bark and leaves from the longer, thinner branch and created a hook at the node.

This research will further help ecologists determine the reason behind this little understood behaviour of NC crows.

However, due to short recording periods, the study cannot determine if individual crows are similar in their reliance on tool-assisted foraging.

Previously observed tool-use behaviour of the New Caledonian crow in a lab setting.