Study: Mushrooms could be used as sustainable building material

According to a new UBC study, mushrooms could take on a new role as a sustainable building material. Who could imagine mushrooms in their furniture? In a cutting-edge design project, six stylish new benches have been placed outside the UBC bookstore, assembled from light-coloured honeycomb-shaped bricks set beneath tops of clear acrylic. The bricks are very much alive, grown from a mix of oyster mushroom spores and alder sawdust packed into moulds.

Joe Dahmen, an assistant professor at the UBC School of Architecture and Landscape Architecture, and his partner in work and life, Amber Frid-Jimenez, Canada Research Chair in Design and Technology at Emily Carr University of Art and Design, came up with the design while expecting their second child. They were working on an architectural installation made of fabricated polystyrene blocks – not the most benign material – when they decided to look into more eco-friendly options. “Amber couldn’t get near the thing because it was so toxic,” Dahmen said. “It got me thinking that there must be a more natural material that would still enable a similar range of expression.”

In their search, Dahmen and Frid-Jimenez discovered the world of mycelium biocomposites. The result is a durable material with qualities similar to polystyrene foams. Mycelium biocomposites are at risk of contamination by mould and bacteria, however, if they grow thicker than half a metre. To overcome this obstacle, Dahmen created a new process inspired by wasps’ nests. “I was really amazed at the honeycomb structure, because it’s a highly efficient way of occupying space,” he said. “It’s scalable, it can go in any direction, and it’s extremely spatially efficient.”

“Their biggest application in the long run is in architecture and construction,” said Dahmen. “The average age of commercial buildings in North America is under 40 years. If we could imagine construction materials that add positive value to ecosystems as they break down, we have a whole new paradigm for the way we approach buildings, at a time when we’re demolishing most buildings long before they wear out.” According to Dahmen, mycelium biocomposites could replace polystyrene in applications from packaging to building insulation. “Styrofoam is a material that functions for a short amount of time as packaging, and then spends hundreds, if not thousands, of years in a landfill,” he added. Mycelium biocomposites not only require less energy to produce but also decompose completely when composted. They even help break down other materials in the waste stream and make them accessible to other organisms. An American company recently signed a contract to supply Ikea with mycelium-based packaging. The method has yet to be used in Canada.

Study: multiple-fault ruptures can trigger stronger earthquakes

A new study by Stanford University could improve future seismic hazard predictions.

The new research reveals how the rupture of multiple faults can lead to stronger earthquakes. Based on the new findings, the 1812 Southern California earthquake was likely caused by the slippage of one fault triggering the rupture of a second fault.

Previously, scientists blamed the San Andreas Fault alone for the magnitude 7.5 quake. However, the study reveals the nearby San Jacinto Fault to be an accomplice.

The San Jacinto Fault’s potential to cause serious quakes in tandem with the San Andreas Fault has been underestimated.

“This study shows that the San Jacinto Fault is an important player that can influence what the San Andreas Fault is doing and lead to major earthquakes,” said Julian Lozos, author of the study published in Science Advances and currently an assistant professor of geological sciences at California State University. “It’s important that we learn more about how activity on any single fault can affect other faults.”

This new study could improve future seismic hazard predictions.
(Photo courtesy of www.morguefile.com)

According to evidence found by the study, the San Jacinto Fault slipped first between the cities of San Jacinto and Moreno Valley. The rupture then travelled north and crossed over to the San Andreas Fault near Cajon Pass, where the two faults run within 1.5 kilometres of each other. Together, the two ruptured faults caused the Southern California earthquake on that ill-fated December morning.

“Understanding this earthquake in particular, and complex earthquakes in general, is important to quantifying seismic hazard,” said geophysicist Greg Beroza, the Wayne Loel Professor at Stanford’s School of Earth, Energy & Environmental Sciences.

Lozos’ research could help the Uniform California Earthquake Rupture Forecast (UCERF) prepare for future earthquakes. In earlier UCERF reports, the estimated chance of a magnitude 8 or larger earthquake striking California within 30 years was about 4.7 percent. In its latest report, after taking into account the effect of multi-fault ruptures, that estimate has grown to about 7 percent.

Lozos also hopes his research increases earthquake awareness among the general public, especially the millions of Californians living in the Inland Empire, which is underlain by both the San Andreas and San Jacinto faults.

“People shouldn’t just be thinking about the San Andreas Fault,” Lozos said. “There are lots of other faults, so it’s important for everyone in the regions at risk to have their earthquake kits ready.”

Stanford nuclear expert gives three lessons, five years after Fukushima disaster

An expert on nuclear materials at Stanford University says we need to reassess natural disaster risks, acknowledge the links between nuclear energy and renewables and rethink the language used when referring to these disasters.

According to Rodney Ewing, the Frank Stanton Professor in Nuclear Security and senior fellow at the Center for International Security and Cooperation in the Freeman Spogli Institute, the nuclear meltdown was not an accident, as the media and various scientific papers have described it, but rather a failure of the safety analysis.

In the event of a powerful earthquake, power plants automatically shut down their reactors, and generators start immediately to keep coolant circulating over the nuclear fuel to prevent overheating and possible meltdown. At Fukushima, however, the tsunami flooded the diesel generators, which were installed low and close to the coast, cutting off the power supply to the cooling system. The poorly placed generators led to the catastrophic tragedy at Fukushima. “This is why when I refer to the tragedy at Fukushima, it was not an accident,” said Ewing.

“The Japanese people and government were certainly well acquainted with the possibility of tsunamis.”

His second lesson is to rethink the meaning of ‘risk’. Referring to an earthquake or tsunami as a rare event, he says, when geological records show it has happened and will happen again, reduces the sense of urgency to prepare in advance.

Stanford expert says we need to reassess natural disaster risks.
(Photo courtesy of www.morguefile.com)

“It can be that the risk analysis works against safety, in the sense that if the risk analysis tells us that something’s safe, then you don’t take the necessary precautions,” he said. The assumption that the reactors were safe during an earthquake prevented planners from anticipating a tsunami.

He said that in the case of the reactors at the Fukushima power plant, one should not assess the risk of one of its reactors being hit by a tsunami at a given moment, but rather the risk of a tsunami hitting any one of the reactors over the plant’s lifetime. In the latter case, the probability of a reactor being hit by a tsunami is considerably higher, especially once the geological record of past tsunamis is also considered.
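
A minimal sketch of that distinction, using hypothetical numbers rather than anything from Ewing’s analysis: a hazard that looks negligible in any single year accumulates into a substantial risk over a facility’s lifetime.

```python
# Hypothetical illustration of per-year vs. lifetime risk
# (the numbers below are assumptions, not figures from Ewing or Fukushima).

annual_p = 0.005        # assumed 0.5% chance of a severe tsunami in any given year
lifetime_years = 40     # assumed reactor operating lifetime

# Chance of at least one severe tsunami over the lifetime, assuming
# independent years: one minus the chance of zero events every year.
lifetime_p = 1 - (1 - annual_p) ** lifetime_years

print(f"per-year risk: {annual_p:.1%}, lifetime risk: {lifetime_p:.1%}")
# per-year risk: 0.5%, lifetime risk: 18.2%
```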

The third lesson, according to Ewing, is the strong link between nuclear energy and the future of renewables. Since the tragedy, Ewing has observed its continuing effect throughout the nuclear industry, an impact he believes will in turn greatly affect the future of renewable energy.

The Nuclear Regulatory Commission in the United States has required all reactor sites to reassess their risk from natural disasters, especially in the central United States. However, this reaction was not shared globally.

“In countries like Germany and Switzerland, the Fukushima tragedy was the last straw,” Ewing said. “This was particularly true in Germany, where there has always been a strong public position against nuclear power and against geologic waste disposal. Politically, Germany announced that it will shut down its nuclear power plants.”

Ewing says Germany is a telling example: a technologically advanced country trying to move away from nuclear energy while also trying to reduce its carbon emissions. The move towards renewable energy sources will be costly, but it’s a price many Germans are willing to pay.

Study: Companies and environmentalists working together can slow deforestation

According to a Stanford University study, collaborative efforts to reduce deforestation are more than twice as effective as “confrontational” programs implemented by either nongovernmental organizations or industry alone.

Various eco-certifications inform consumers of their impact on deforestation. Until now, however, there has been little research on their effectiveness.

The study finds that these certifications have substantially improved forest product sustainability.

Market-driven attempts have reduced deforestation considerably, with multi-party collaborations having the greatest impact.

“Our research shows that these market-based conservation efforts have reduced deforestation in Chile,” said lead author Robert Heilmayr, a recent graduate of Stanford’s Emmett Interdisciplinary Program in Environment and Resources. The findings appear in “Impacts of Nonstate, Market-Driven Governance on Chilean Forests,” published in the Proceedings of the National Academy of Sciences.

The study compared conservation outcomes among CERTFOR, a largely industry-developed certification program; the Joint Solutions Project (JSP), an NGO-instigated deforestation moratorium; and the Forest Stewardship Council (FSC), a collaboration between industry and nongovernmental organizations.

FSC was most successful in reducing deforestation. (Photo courtesy of www.morguefile.com)

While CERTFOR participants saw an average 16 percent reduction in deforestation and JSP-only participants an average reduction of 20 percent, FSC was the most successful, with a 43 percent reduction.

According to Heilmayr and co-author Eric Lambin, the George and Setsuko Ishiyama Provostial Professor in the School of Earth, Energy & Environmental Sciences, FSC’s leading success came from balancing strict environmental requirements with cost-effective solutions. This balance gives participants a sense that their interests are protected, making them more likely to follow through on requirements.

The analysis also suggests that, in contrast to government policies, private, market-driven programs are better at lowering deforestation rates in places where deforestation pressure is high.

“Traditional conservation policies like national parks often protect remote, pristine locations,” Heilmayr said. “Agreements between companies and environmentalists can reduce deforestation in more threatened forests.”

“In the globalization era, deforestation is increasingly associated with consumption in distant, international markets,” said Lambin. “We need new approaches to environmental governance that regulate the impact of international actors.”

Intense ocean turbulence discovered near sea floor in equatorial Pacific

Stanford scientists say waves hitting the equatorial seafloor create centimetre-scale turbulence which is essential in driving ocean circulation on a global scale.

The new findings, published in the journal Geophysical Research Letters, could eventually lead to improved climate forecasts.

“Climate models don’t do a great job of simulating global ocean circulation because they can’t simulate the small scales that are important for deep ocean mixing,” said study co-author Ryan Holmes, a graduate student at Stanford’s School of Earth, Energy & Environmental Sciences, in a statement.

The meridional overturning circulation (MOC) is a global conveyor belt in which surface waters cooled at high latitudes sink and flow toward the equatorial regions. These waters eventually mix with warmer, less dense water and rise back to the surface, then flow toward the higher latitudes to complete the cycle. A single cycle takes hundreds to thousands of years.

Until now, scientists didn’t know exactly where in the tropical oceans this mixing took place. They believed that intense deep ocean mixing required water to flow over rugged terrain.

James Moum, a professor of physical oceanography at Oregon State University, and Holmes, who had been investigating equatorial mixing in ocean models for his PhD, went on a five-week research cruise in the equatorial Pacific to better understand mixing in tropical oceans.

The team encountered strong turbulence along a 700-metre vertical stretch of water close to the ocean floor. The turbulence was unexpected because this part of the ocean floor is relatively smooth.

“This was the first time that anyone had observed turbulence over smooth topography that was as strong as that found over rough topography,” said Holmes.

Using computer models, Leif Thomas, an expert in the physics of ocean circulation at Stanford, and Holmes simulated how winds blowing across the ocean surface create ‘internal waves’. However, their model did not produce the turbulence necessary for abyssal mixing. Instead, the internal waves bounced between two vertical bands of water and the smooth sea floor without breaking.

It wasn’t until Thomas incorporated the horizontal component of Earth’s spin into the simulation that everything began to fall into place. “It occurred to me that internal waves at the equator, where the effects of the horizontal component of Earth’s spin are most pronounced, could experience an analogous behavior when the waves reflect off the seafloor,” said Thomas.
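
For context, a textbook sketch of why that horizontal component matters most at the equator (standard geophysical fluid dynamics, not taken from the paper itself):

```latex
% At latitude \varphi, Earth's rotation rate \Omega contributes two Coriolis terms:
\[
  f = 2\Omega \sin\varphi \qquad \text{(vertical, ``traditional'' component)}
\]
\[
  \tilde{f} = 2\Omega \cos\varphi \qquad \text{(horizontal, ``non-traditional'' component)}
\]
% Most ocean models keep only f. At the equator (\varphi = 0), f vanishes
% while \tilde{f} reaches its maximum of 2\Omega, so equatorial internal
% waves are precisely where neglecting the horizontal term costs the most.
```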

Holmes says that including this horizontal component changed the physics of the waves in the deep equatorial ocean, likely driving them to break and cause turbulence and mixing.

Thomas says the new findings point to the important role the deep equatorial waters play in Earth’s climate system.

“Scientists have long known that the equatorial upper ocean is critical for the physics of internal variations in climate such as El Niño,” he said. “Our new study suggests the abyssal ocean at the equator could impact the climate system on much longer timescales.”

An exploration into China’s pollution resolution

In the closing months of 2015, China’s air pollution problem was hitting the headlines, a constant reminder of its persistence and severity. Yet although 10 Chinese cities were issued red alerts last December, with it declared unsafe for citizens to remain outdoors for prolonged periods, Greenpeace’s 2015 data reveal that PM 2.5 levels (concentrations of fine particulate matter, much of it from coal combustion) in China had in fact dropped by 10 percent.

Premier Li Keqiang was also said to be waging a “war on pollution”. Despite this unexpected reversal, most major Chinese cities still endure dangerous levels of smog and poor air quality. So is China actually on track to meet international air quality standards, and what measures are due to be implemented in the near future?

To nobody’s surprise, as the world left 2015 behind and prepared for the dawn of 2016, Beijing’s smog levels struck an all-time high, clearing the streets and bringing out the gas masks. China’s pollution problem has remained tenacious because of excessive coal-burning in local factories and power plants.

The multiple red alerts and smog-filled photographs of last year grabbed more international attention than ever. However, following Greenpeace’s 2015 report on China’s improving air quality, China announced it would stop approving new local coal mines for the next three years and would close up to 1,000 existing mines in response to its persistent pollution problem. The coal ban and declining coal consumption are persuasive signs of progress on China’s journey to cleaner and safer air.

Morning smog makes things a little less clear (Courtesy of flickr.com)


What are some of China’s measures for improving air quality?

China has devised multiple plans to keep smog levels in check, which may help explain the decreasing levels of particulate matter. Apart from the central government’s efforts, provincial governments have also been joining the fight against pollution, an ongoing process since 2014.

The Aviation Industry Corporation of China has presented flying parafoil drones: unmanned craft with wings and chemical-dispersing parachutes equipped with particulate-clearing technologies. The drones will be tested in major Chinese cities and are also useful for surveillance, disaster relief, and agriculture. Furthermore, the central government has been encouraging citizens to abandon their cars in favour of cycling and walking, while implementing 25-year environmental laws and tax breaks to boost the market for eco-friendly green cars.

If China has all these proposals under its belt, relieving pollution and improving air quality should be quicker and more effective than it has been. Why have red alerts for hazardous smog levels been issued across the country even after these measures have taken effect?

The Diplomat states that China still “faces problems in implementing and enforcing these proposals”, pointing to the improper operation of pollution control units and the difficulty of catching and convicting non-compliance scattered nationwide. The Tianjin explosion last summer is a case in point: it released toxic cyanide chemicals into a populated residential and commercial area of the city. Tianjin’s State Administration of Work Safety had noted discrepancies in the warehouse’s storage content reports, and authorities did not find out what the warehouse actually contained until after the incident.

Greenpeace’s data show that China has been making progressive efforts, which led to a perceivable drop in its PM 2.5 levels last year. It is clear that although China has had ideas and measures in place since 2014 to lower pollutant levels and clear the atmosphere of smog and dirt, implementing them will require rigorous effort, careful organization, and increased funding.

Greenland Ice Sheet melting at an alarming rate: report

Researchers have found the Greenland Ice Sheet is melting three times as fast as it did during the entire 20th century.

The Greenland Ice Sheet is the second largest in the world. Satellite images reveal it has been melting since 1997 at an alarming rate, contributing to a significant rise in global sea levels.

Climate researchers from the Centre for GeoGenetics, along with national and international colleagues, collected about a century’s worth of data.

This is the first time the information has been collected through direct observation rather than through computer model-generated estimates.

The United Nations climate panel (IPCC), however, did not include the Greenland Ice Sheet as a contributing factor in its latest report, from 2013. The reason was the lack of direct observations; information on thermal expansion of ocean water was also missing for similar reasons.

“If we do not know the contribution from all the sources that have contributed towards global sea level rise, then it is difficult to predict future global sea levels. In our paper we have used direct observations to specify the mass loss from the Greenland Ice Sheet and thereby highlight its contribution to global sea level rise,” said Kristian K. Kjeldsen from the Centre for GeoGenetics at the University of Copenhagen in a statement.

The loss from the Greenland Ice Sheet between 1900 and 2010 accounts for 10 to 18 percent of total sea level rise, according to the results published in the journal Nature.

Associate Professor Shfaqat Abbas Khan of the Technical University of Denmark (DTU) says the average melting rate of the past decade has been the greatest in 115 years of observation.

“We are one step further in mapping out the individual contributions to global sea level rise. In order to predict future sea level changes and have confidence in the projections, it is essential to understand what happened in the past,” said senior author Professor Kurt H. Kjær, from the Centre for GeoGenetics, in a statement. Kjær also says the data will be an important contribution to future IPCC reports.


Editorial: Small-scale farmers set hopes high for Paris climate change conference

UN member countries, non-governmental organizations, lobby groups and UN agencies represented their interests in the development of a universal agreement on climate change at the COP21 conference in Paris, with many expecting great results.

Fairtrade International, one of the largest fair trade certification groups – part of a movement asking consumers to pay a little more for a product to help the people who produce it – is one such organization.

In a short video series featuring farmers from across the globe, from Kenya to Peru to India, Fairtrade sparks a discussion on the monumental importance of reducing climate change.

What’s so important about COP21?

Climate change has become a front-and-centre issue for both developed and developing countries; meetings like COP serve as valuable opportunities for the global community to work together towards the common goal of reducing the effects of global warming.

Within COP21 is the Lima-Paris Action Agenda (LPAA), a platform where countries and non-governmental actors discuss their respective interests in order to reach an agreement on cooperative climate change action.

The LPAA highlights in a statement both the mounting threat climate change poses to agriculture and the fact that agriculture accounts for 24 percent of greenhouse gas emissions, a major contributor to global warming. This is a key concern for Fairtrade because many of its partners’ livelihoods are agricultural.

The LPAA proposed initiatives focusing on four areas: soils in agriculture, food losses and waste, sustainable production, and resilience of farmers. Partnerships developed within the LPAA will commit money and technical knowledge towards supporting farmers in developed and developing countries to become key actors in the global drive to reduce climate change.
[vimeo 147620976 w=500 h=281]

Fairtrade farmers

The farmers featured in Fairtrade’s video series are members of cooperatives in their countries – jointly owned businesses that share profits with members – which partner with Fairtrade to ensure that members receive fair payment for their goods.

Generally, Fairtrade partners are farmers or artisans who join the organization to counter the highly competitive nature of free trade, which would otherwise pay them prices too low to survive on.

With agriculture as their livelihoods, they understandably have many concerns about climate change and how COP21 decisions will directly affect them.

Mabraat Kabbada harvesting coffee cherries.

Their thoughts on COP21

The farmers in Fairtrade’s videos express a sense of urgency about COP21. From Kenya to India to Mexico, climate change is affecting small-scale farmers in devastating ways.

[vimeo 147471403 w=500 h=281]

“It doesn’t rain when it should, and it rains when it shouldn’t,” says Luis Martínez Villanueva, a representative from Mexico. Changing weather patterns are problematic for farmers, driving down production by causing droughts and crop diseases. With falling production, farmers’ incomes are falling, too.

Facing such grim realities, farmers set their hopes high for COP21. Victor Biwot, a tea farmer from Kenya, says he’d like to see more activities spearheaded by developed countries to reduce greenhouse gas emissions and to support African farmers in adopting energy-efficient methods.

Benny Mathew, a representative of the Suminter India Organic Farmers Consortium, demonstrates that solar power allows farmers to dry 500 g of seeds using 250 kg of firewood instead of 1,500 kg – a sixfold reduction in fuel. By using a more sustainable energy source, farmers in Kerala, India are able to do more work at less cost to the environment.

Making moves towards greener energy use and reducing greenhouse gas emissions may seem like moving a mountain, but to small-scale farmers, it will be life-changing.

“We need to have high expectations even if we don’t reach them,” Villanueva continues. “Small changes in big countries mean that small countries can have high hopes.”

Thousands attend climate march in downtown Vancouver

Thousands attended the Global Climate March in Vancouver, filling the streets with messages for world leaders arriving in Paris.

The UN climate summit kicks off on Monday, when discussions on national limits on greenhouse gas emissions will begin.

People began gathering in the early afternoon in front of the Vancouver Art Gallery, then marched through the downtown core. Climate change activists held protests in many other cities as well.

Check out how some people described the events on Twitter:


The debate over a “mini ice age” in 2030

Warnings of a “mini ice age” have been circulating in the media. The news came after Valentina Zharkova, a professor of mathematics at Northumbria University in England, looked into variations in solar radiation and predicted a significant drop of 60 percent in solar activity between 2030 and 2040.

Scientists first noticed about 170 years ago that the Sun’s activity varies over a cycle lasting around 10 to 12 years. Cycles vary, but researchers have yet to create a model that fully explains these fluctuations.

“The waves fluctuate between the northern and southern hemispheres of the Sun. Combining both waves together and comparing to real data for the current solar cycle, we found that our predictions showed an accuracy of 97%,” said Zharkova in a statement.

During Cycle 26, which covers the decade from 2030 to 2040, the two waves will become out of sync – causing a significant reduction in solar activity.

Image: Yohkoh/ISAS/Lockheed-Martin/NAOJ/U. Tokyo/NASA.

“When they are out of phase, we have solar minimums. When there is a full phase separation, we have conditions last seen during the Maunder minimum, 370 years ago,” explains Zharkova.
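
To illustrate the general mechanism (a textbook superposition sketch, not Zharkova’s actual model), two waves of nearby frequencies combine into a slowly drifting envelope:

```latex
% Sum-to-product identity for two waves of nearby frequencies:
\[
  \sin(\omega_1 t) + \sin(\omega_2 t)
    = 2\cos\!\left(\frac{(\omega_1 - \omega_2)\,t}{2}\right)
       \sin\!\left(\frac{(\omega_1 + \omega_2)\,t}{2}\right)
\]
% When \omega_1 \approx \omega_2, the cosine factor is a slow envelope;
% each time it passes through zero, the combined amplitude collapses,
% loosely analogous to the deep minima Zharkova describes.
```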

The Maunder minimum was a period in the 1600s and early 1700s, during the “Little Ice Age”, when Europe and North America experienced cooler-than-average temperatures.

Dr. John Innes, a professor at the University of British Columbia, doesn’t think we should be jumping to any conclusions: “the direct link between the minimum in sunspot activity and the temperature cooling is not quite so definite.”

Innes explains in an email statement that there may have been other factors at work during the Maunder minimum, such as increased volcanic activity, which would have generated ash that reduced the amount of energy reaching the Earth’s surface.

“They are virtually all based on models, and models are often wrong … the idea is interesting, and worth looking at more carefully,” says Innes.