Archive for the ‘Green’ Category

Hostess Liquidation Sets Off Online Twinkie Run

Wednesday, December 5th, 2012

As Hostess Brands, Inc. – the 80-plus-year-old baker of such iconic brands as Twinkies, HoHos, Hostess CupCakes, DingDongs and Wonder Bread – enters into liquidation and probable acquisition, the online price of the sugar-filled delicacies is soaring.

As soon as Hostess announced the end of production, Twinkie lovers started scrambling to buy every Twinkie they could find.  Some of those buyers then turned to eBay and Craigslist to sell their hoards – presumably at a significant profit.  Price tags for a box of 10 Twinkies hit four and even five digits; the retail price is in the $5 range.  One enterprising eBay seller, who is asking $10,000 for a box of Twinkies, has vowed to donate 10 percent of the proceeds to local charities.  Despite the conventional wisdom about the longevity of Twinkies, the majority purchased in mid-November will be past their sell-by date in approximately one month.

Chances are that another company will buy out Hostess Brands and breathe new life into its products.  The most likely purchaser is C. Dean Metropoulos and Co., a private-equity firm that specializes in resurrecting troubled heritage brands, such as Pabst Blue Ribbon beer, Bumble Bee Tuna, Chef Boyardee pasta, and PAM cooking spray.  Of greatest value to any purchaser are Hostess’ intellectual property rights, which allow the manufacture and sale of the brand’s flagship products.

Hostess’ intrinsic brand value means that acquiring the name is a good business decision for the eventual buyer. According to Michael J. De La Merced, “One company’s castoffs can be another company’s golden goose — or in this case, cream-filled confection.  It is a common trade in bankruptcy court.  Sellers are hoping to drum up cash for their creditors, and buyers are betting they can revive the brands.  Even though consumers are increasingly crunching on kale, Twinkies and other sugary snacks still make loads of money.  Hostess makes billions of dollars of sales each year.”

Ancient Harappan Civilization a Victim of Climate Change

Wednesday, July 11th, 2012

Climate change isn’t new. A recent study found that it destroyed an ancient civilization approximately 4,000 years ago. The gradual eastward movement of monsoons across Asia at first supported the formation of the Harappan civilization in the Indus Valley by allowing large-scale agricultural production, then wiped out the civilization as water supplies disappeared.  This is the likely explanation for why the Indus Valley flourished for 2,000 years, became home to large cities and a civilization the size of modern Egypt and Mesopotamia, then dwindled to small villages and isolated farms.

The Harappan civilization, named after its largest city, Harappa, evolved approximately 5,200 years ago and reached its pinnacle between 4,500 and 3,900 years ago, occupying what is now Pakistan, northwest India and eastern Afghanistan.  An urban society with major cities, a distinctive style of writing and extensive trade, the society accounted for roughly 10 percent of the world’s population at its height and equaled Egypt in its power.  The Harappans’ downfall came because they did not attempt to develop irrigation to support agriculture but relied on the yearly monsoons.  The civilization was largely forgotten until the 1920s, when researchers began studying it in depth.

“Antiquity knew about Egypt and Mesopotamia, but the Indus civilization, which was bigger than these two, was completely forgotten until the 1920s,” said Liviu Giosan, a geologist at Woods Hole Oceanographic Institution in Massachusetts.  “There are still many things we don’t know about them.”

Nearly 100 years ago, researchers found many remains of Harappan settlements along the Indus River and its tributaries and in a vast desert region.  There were signs of sophisticated cities, sea links with Mesopotamia, internal trade routes, arts and crafts, and writing that has not yet been deciphered.  “They had cities ordered into grids, with exquisite plumbing, which was not encountered again until the Romans,” Giosan said.  “They seem to have been a more democratic society than Mesopotamia and Egypt — no large structures were built for important personalities like kings or pharaohs.  Until now, speculations abounded about the links between this mysterious ancient culture and its life-giving mighty rivers,” Giosan said.

“Our research provides one of the clearest examples of climate change leading to the collapse of an entire civilization,” Giosan said.  The researchers first analyzed satellite data of the landscape influenced by the Indus and neighboring rivers.  Between 2003 and 2008, the researchers gathered samples of sediment from the Arabian Sea coast, the irrigated valleys of Punjab and the northern Thar Desert to find their source and ages and create a timeline of landscape changes.  “It was challenging working in the desert — temperatures were over 110 degrees Fahrenheit all day long,” Giosan said.

After collecting the necessary data, “we could reexamine what we know about settlements, what crops people were planting and when, and how both agriculture and settlement patterns changed,” said researcher Dorian Fuller, an archaeologist with University College London.  “This brought new insights into the process of eastward population shift, the change towards many more small farming communities, and the decline of cities during late Harappan times.”

“The insolation — the solar energy received by the Earth from the sun — varies in cycles, which can impact monsoons,” Giosan said.  “In the last 10,000 years, the Northern Hemisphere had the highest insolation from 7,000 to 5,000 years ago, and since then insolation there decreased.  All climate on Earth is driven by the sun, and so the monsoons were affected by the lower insolation, decreasing in force.  This meant less rain got into continental regions affected by monsoons over time.”

For the next several centuries, Harappans seem to have fled along an escape route toward the Ganges basin, where monsoon rains remained reliable.  “We can envision that this eastern shift involved a change to more localized forms of economy — smaller communities supported by local rain-fed farming and dwindling streams,” Fuller said.  “This may have produced smaller surpluses, and would not have supported large cities, but would have been reliable.”

Public Transport Booms in the Recession

Monday, July 2nd, 2012

Soaring gas prices lured Americans out of their cars and onto public transportation, adding up to a five percent increase in ridership in the first three months of 2012, the most significant first-quarter increase since 1999.  According to the American Public Transportation Association (APTA), Americans took almost 125 million more rides on public transit in the first three months of 2012 than they did in the same timeframe last year.  Ridership declined following the September 11, 2001, terrorist attacks and had remained more or less stagnant until last year.

“More people are choosing to save money by taking public transportation when gas prices are high,” said Michael Melaniphy, APTA president.  Gas prices aren’t the only reason for the growth, Melaniphy said.  With local economies rebounding, more people are commuting to new jobs, some of them on public transportation, he said.  “As we look for positive signs that the economy is recovering, it’s great to see that we are having record ridership at public transit systems throughout the country.”

“There are a number of reasons why more Americans are using public transportation,” according to Melaniphy.  “For example, public transportation systems are delivering better, more reliable service, and the use of real-time technology, which many systems use, makes it easy for riders to know when the next bus or train will arrive.”  Melaniphy said the growing public transit ridership should encourage lawmakers to reach a deal on transportation spending.  Congress has been negotiating a possible compromise on transportation spending for some time.  “As Congress is negotiating a federal surface transportation bill that is now more than 2 ½ years overdue, our federal representatives need to act to ensure that public transportation systems will be able to meet the growing demand,” Melaniphy said.  “It’s obvious from the surge in public transit ridership in the first quarter that Americans need and want public transportation.”

The federal Energy Information Administration said that U.S. gas demand fell to an 11-year low in the first quarter, at less than 8.5 million barrels a day; that was 1.5 percent less than the first-quarter 2011 average.  The drop came as the average nationwide price of regular-grade gasoline set a first-quarter record of $3.60 a gallon, an increase of 9.6 percent compared with last year.

APTA reported that all public transit modes saw first-quarter increases, with light-rail use up by 6.7 percent and subway and elevated-train ridership up by 5.5 percent.  Commuter rail ridership rose 3.9 percent, while bus ridership was up approximately 4.5 percent.

The fact that people are finding new jobs helped, said Melaniphy.  Ridership on heavy rail — subways and elevated trains — rose in 14 of 15 systems.  Use of light rail — streetcars and trolleys — rose in 25 of 27. And 34 of 37 large cities saw increases in bus ridership.  “It’s nationwide,” Melaniphy said, resulting in fuller trains and buses straining system capacity.

Fully 12 cities saw their highest ridership ever, including New York; Boston;  Oakland; San Diego; Charlotte; Tampa; Indianapolis; Ann Arbor, MI; Fort Myers, FL; Ithaca, NY; and Olympia, WA.

Germany Runs Half the Country on Solar Power

Tuesday, June 26th, 2012

During a spell of extremely sunny weather on Saturday, May 26, Germany set a solar-energy record by sourcing nearly 50 percent of its daytime electricity needs from sunshine.  According to Germany’s Institute of the Renewable Energy Industry (IWR), solar power plants produced an unprecedented 22 gigawatts of electricity, approximately the same amount generated by 20 nuclear power stations operating at full capacity.  Even on days when it’s not setting records, Germany has formidable solar numbers.  On Friday, May 25, while its citizens were at work and its power-hungry factories were running, one-third of Germany’s power was produced by solar plants.  The German government plans to phase out nuclear power entirely by 2022.

Germany decided to abandon nuclear power after the 2011 Fukushima nuclear disaster, closing eight plants immediately.  They will be replaced by renewable energy sources such as wind turbines, solar and bio-mass.

“Never before anywhere has a country produced as much photovoltaic electricity,” said Norbert Allnoch, IWR director.  “Germany came close to the 20 gigawatt (GW) mark a few times in recent weeks.  But this was the first time we made it over.”  Germany has nearly as much installed solar-power generation capacity as the rest of the world combined and gets nearly four percent of its annual electricity needs from the sun alone.  Its goal is to cut its greenhouse gas emissions by 40 percent from 1990 levels by 2020.

According to critics, renewable energy is neither reliable enough nor available in sufficient capacity to power major industrial nations like Germany.  Chancellor Angela Merkel disagrees, noting that Germany is eager to demonstrate that it is indeed possible.  The jump above the 20 GW level was due to increased capacity and bright sunshine nationwide.

Government-mandated support for renewables has helped Germany become a world leader in renewable energy.  The incentives provided through the state-mandated feed-in tariff (FiT) are not without controversy, however. The tariff is the main support for the industry until photovoltaic prices fall to levels comparable with conventional power production.

Utilities and consumer groups have complained that the tariffs for solar power add about two cents per kWh on top of electricity prices in Germany that are already among the highest in the world, with consumers paying about 23 cents per kWh, compared with New York at 17.5 cents per kWh or Phoenix at 9.9 cents per kWh.  German consumers pay about €4 billion per year on top of their electricity bills for solar power, according to a 2012 report by the country’s environment ministry.  Critics also complain that because Germany is not renowned for its sunny climate, rising levels of solar power make the national grid less stable due to fluctuations in output.
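The scale of that surcharge is easy to sanity-check from the two-cent figure. A minimal sketch, assuming a typical household consumption of roughly 3,500 kWh per year (an illustrative figure, not from the article):

```python
# Rough sanity check of the solar surcharge's cost to one household.
# The ~2 euro-cent/kWh surcharge is quoted above; the consumption
# figure is an assumed typical value for a German household.
SURCHARGE_EUR_PER_KWH = 0.02
HOUSEHOLD_KWH_PER_YEAR = 3500  # assumed

annual_cost = SURCHARGE_EUR_PER_KWH * HOUSEHOLD_KWH_PER_YEAR
print(f"Extra cost per household: about EUR {annual_cost:.0f} per year")
```

At that rate the per-household burden is modest, on the order of €70 a year, which is why critics point to the aggregate €4 billion figure instead.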

2012 CoreLogic Storm Surge Report Contains Some Surprises

Monday, June 18th, 2012

Which American city is at the greatest financial risk from a hurricane?  If you think it’s New Orleans or Miami, you’re wrong.  According to CoreLogic, a data analysis firm, it’s New York City that is at the greatest risk, both from the number of properties impacted and the dollar value of the damage.  The area also includes Long Island and northern New Jersey.

“The summer of 2011 gave us some startling insight into the damage that even a weak storm can cause in the New York City metro area,” CoreLogic vice president Howard Botts said.  “Hurricane Irene was downgraded to a tropical storm as it passed through New Jersey and New York City, but the impact of the storm was still estimated at as much as $6 billion.  Economic losses mounted swiftly as businesses shuttered, the New York City mass transit system came to a sudden halt and emergency response teams were called into action to prepare for the worst.”

A hurricane is more likely to make landfall in Miami than New York, according to Colorado State University research.  The odds of a landfall in any given year are 5.3 percent for Miami and 0.2 percent for New York City.  Over the next half century, the cumulative odds rise to 95.5 percent for Miami and 6.6 percent for New York.  Despite the discrepancy in numbers, the risk exists, especially from flooding.  While most people associate hurricane damage with wind, the storm surge from rising waters caused by cyclones has the greatest impact.
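The leap from single-year odds to 50-year odds is a compounding calculation. A back-of-the-envelope sketch, assuming each year is an independent trial (CSU's actual model surely differs, so this only roughly tracks the quoted figures):

```python
# Chance of at least one hurricane landfall over n years, treating
# each year as an independent trial with the quoted annual odds.
def cumulative_odds(annual_p: float, years: int) -> float:
    return 1.0 - (1.0 - annual_p) ** years

miami = cumulative_odds(0.053, 50)  # roughly 0.93, near the quoted 95.5%
nyc = cumulative_odds(0.002, 50)    # roughly 0.10, versus the quoted 6.6%
print(f"Miami: {miami:.1%}, New York: {nyc:.1%}")
```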

That fact became evident to residents of the Northeast after last summer’s Hurricane Irene.  Although the insured impact of Irene on New York City was relatively mild, one of the insurance industry’s nightmares has always been a major hurricane traveling up the Hudson River and striking the city and its suburbs.  Some estimates suggest that an event of this magnitude could cause $100 billion in insured losses alone, with even greater economic damage.  According to CoreLogic estimates, the property at risk in the New York City area is worth some $168 billion.

CoreLogic said that more than four million homes in the United States are at risk from hurricane-related flood damage, with more than $700 billion in property potentially vulnerable.  There are 2.2 million homes worth more than $500 billion at risk along the Atlantic coast; another 1.8 million homes worth $200 billion are imperiled along the Gulf coast.  Approximately 35 percent of the at-risk homes are in Florida and another 12 percent or so in Louisiana, the firm said.  In terms of value of property, more than 40 percent of the risk is concentrated in Florida and New York.

The 2012 CoreLogic Storm Surge report provides the first-ever analysis of residential property hurricane risk along the Atlantic and Gulf coasts broken down by region and by state, as well as a snapshot of the risk by metro areas.

“Though more frequently impacted states like Florida, Texas and Louisiana get the most attention when it comes to hurricane vulnerability and destruction, Hurricane Irene made it very clear last summer that hurricane risk is not confined to the southern parts of the country,” Botts said.  “That’s why we felt it was important this year to highlight storm-surge risk in a brand new way to establish a better understanding of exposure throughout the states that are most at risk of a direct hurricane hit.  As we got a glimpse of during Irene, our 2012 report shows even a Category 1 storm could cause property damage in the billions along the northeastern Atlantic Coast and force major metropolitan areas to shut down or evacuate.”

CoreLogic created its Storm Surge Report to improve understanding of the risk that storm surge poses to homes in areas prone to tropical storms.  Storm surge is triggered by the high winds and low barometric pressure associated with hurricanes, which cause water to mass inside a storm as it moves across the ocean before releasing as a powerful rush overland when the hurricane moves onshore.

“The data we compile is useful for insurance providers and financial services companies, to help them better understand potential exposure to damage for homes — particularly those that do not fall into designated FEMA Special Flood Hazard Areas,” Botts said.  “Homeowners who live outside of high risk flood zones are not required to carry flood insurance under the National Flood Insurance Program (NFIP), and may not be fully aware of the risk storm surge poses to their home or property.  When a storm strikes the coast, storm-surge flooding can inundate homes far inland and cause significant losses from powerful surge waters, damaging debris and standing water left behind.”

Bonn Climate Change Summit Has Its Own Storm Clouds

Monday, June 4th, 2012

Disagreement emerged early during the latest round of international climate change talks in Bonn, with the European Union (EU) and developing countries clashing over the future of the Kyoto protocol.  Under the terms of last year’s Durban Platform, the EU had agreed to sign an extension of the Kyoto protocol before it lapses at the end of this year in return for an agreement from all nations that a new binding treaty will be completed by 2015 and enacted by 2020.

Climate negotiators want to build on the progress achieved in Durban last year, like the agreement on a second commitment period for the Kyoto Protocol, a treaty which limits the emissions of most developed countries but which expires at the end of this year.  The length of the second commitment period is one of the issues under discussion in Bonn.  Unfortunately, Kyoto plays an increasingly marginal role in the climate-change issue because it doesn’t include the biggest emitters of carbon dioxide and other gases that contribute to global warming.  The United States exited Kyoto, claiming it was unfair because it didn’t impose any emissions reductions on fast-growing developing nations such as China and India.  Canada also said last year that it would withdraw from the treaty.

Last year’s United Nations (U.N.) climate talks in Durban supported a package of measures which would ultimately force the world’s polluters to take legally binding action to slow the pace of global warming.  Delegates agreed on the “Durban Platform for Enhanced Action” – a process that would apply to all parties under the U.N.’s climate convention.  A clear timetable and targets have not yet been set.  “Parties need to think between now and Doha how they want to organize their work between now and 2015 and how they will move towards that legal agreement,” Christiana Figueres, executive secretary of the U.N.’s Framework Convention on Climate Change, said.  “My hope is they will establish milestones along the way so they are able to measure their progress.”

Figueres cited new research predicting that, on current pledges, the Earth’s temperature could rise by as much as five degrees Celsius (nine degrees Fahrenheit) from pre-industrial levels.  “We still have a gap remaining between intent and effort,” Figueres said.
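The Fahrenheit equivalent is worth spelling out, because a temperature change converts differently from a temperature reading: the 32-degree offset cancels, leaving only the 9/5 scale factor. A minimal check:

```python
# Converting a temperature *difference* from Celsius to Fahrenheit.
# The +32 offset used for absolute readings cancels out, so only
# the 9/5 scaling applies.
def delta_c_to_f(delta_c: float) -> float:
    return delta_c * 9.0 / 5.0

print(delta_c_to_f(5.0))  # 9.0 -- a 5 C rise is a 9 F rise
```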

Additional issues discussed in Bonn, and at a larger climate change conference in Qatar later this year, include implementing an extension to the Kyoto Protocol and determining how long it will last; raising ambition on emissions-cut pledges; and raising long-term financing to help vulnerable countries adapt to the harmful effects of climate change.

The treaty currently being negotiated would require all nations to curb warming.  Identifying those requirements is the primary challenge, which is why negotiators are focusing on solving incremental, less contentious issues before moving on.  “First and foremost we have to ensure that there is no backtracking on what was agreed in Durban,” said Christian Pilgaard Zinglersen, a Danish official representing the European Union.  Climate activists warned that potentially disastrous consequences of global warming, including floods and droughts and rising sea levels, will be impossible to prevent unless the pace of negotiations accelerates.  “If you look at the science, we’re spending time we don’t have,” said Tove Ryding, Greenpeace’s climate policy coordinator.

“We have all the means at our disposal to close the gap, and the long-term objectives of governments remain attainable,” Figueres said.  “But this depends on stronger emissions reduction efforts, led by industrialized countries.  A sufficient level of ambition to support developing country action, concrete and transparent implementation, today, tomorrow and into the foreseeable future, is the answer.  Progress here in Bonn can give countries the confidence they need to push ahead with national climate policies.  In turn, many countries are beginning to adopt ambitious climate change legislation, which is sending good signals to the international negotiations.  All of this can give society and businesses confidence to act faster themselves.”

Antarctic Ice Melting Faster Than Thought

Wednesday, May 16th, 2012

In a sign that global warming is a reality, a new study reveals that ice shelves in western Antarctica are melting at a faster pace than previously known. Data collected by a NASA ice-watching satellite show that the ice shelves are being eaten away from below by ocean currents, which have been growing warmer even faster than the air above.  Launched in January of 2003, NASA’s ICESat (Ice, Cloud and Land Elevation Satellite) studied the changing mass and thickness of Antarctica’s ice from polar orbit.  An international research team used more than 4.5 million surface height measurements collected by ICESat’s GLAS (Geoscience Laser Altimeter System) instrument between October of 2005 and 2008.  The conclusion was that 20 of the 54 shelves studied — nearly half — were losing thickness.

Melting of ice by ocean currents can take place when air temperature remains cold, maintaining a steady process of ice loss — and ultimately a rise in the sea level.  “We can lose an awful lot of ice to the sea without ever having summers warm enough to make the snow on top of the glaciers melt,” said Hamish Pritchard of the British Antarctic Survey and the study’s lead author.  “The oceans can do all the work from below.”  The study also found a shift in Antarctica’s winds as a result of climate change.  “This has affected the strength and direction of ocean currents,” Pritchard said.  “As a result warm water is funneled beneath the floating ice.  These studies and our new results suggest Antarctica’s glaciers are responding rapidly to a changing climate.  We’ve looked all around the Antarctic coast and we see a clear pattern: in all the cases where ice shelves are being melted by the ocean, the inland glaciers are speeding up.  It’s this glacier acceleration that’s responsible for most of the increase in ice loss from the continent and this is contributing to sea-level rise.”

Antarctica contains enough ice to raise sea levels by approximately 187 feet, although it’s unlikely to melt for thousands of years, according to the United Nations.  Some ice shelves are thinning by a few meters a year, and glaciers in response are draining billions of tons of ice into the sea, Pritchard said.  “Most profound contemporary changes to the ice sheets and their contribution to sea level rise can be attributed to ocean thermal forcing that is sustained over decades and may already have triggered a period of unstable glacier retreat.”

“This supports the idea that ice shelves are important in slowing down the glaciers that feed them, controlling the loss of ice from the Antarctic ice sheet,” Pritchard said.

While conducting the study, the researchers measured how ice shelf height changed, using computer models to account for changes in ice thickness due to natural snow accumulation.  Additionally, they used a tide model that eliminated height changes due to rising tides.  “This study shows very clearly why the Antarctic ice sheet is currently losing ice, which is a major advance,” said Professor David Vaughan, the leader of ice2sea.  The study is significant because it identifies a key to predicting how an ice sheet might change in the future.  “Perhaps we should not only be looking to the skies above Antarctica, but also into the surrounding oceans,” Vaughan added.

Tom Wagner, cryosphere program scientist at NASA, said that the study demonstrates how “space-based, laser altimetry” can expand scientists’ understanding of the Earth.  “Coupled with NASA’s portfolio of other ice sheet research using data from our GRACE mission, satellite radars and aircraft, we get a comprehensive view of ice sheet change that improves estimates of sea level rise.”

“When ice shelves completely collapse — and we’ve seen that before — the grounded glaciers behind them will speed up; we know that,” said co-author Helen Amanda Fricker of Scripps Institution of Oceanography at the University of California San Diego.  “But what this study is showing, which is very new, is that you don’t need to lose the shelf entirely for this to happen; just a reduction in the thickness of the ice shelf is enough to allow more of the grounded ice behind it to flow off the continent.”

EPA Putting the Lid on Coal-Fired Power Plants

Monday, April 16th, 2012

The Environmental Protection Agency (EPA) announced new greenhouse-gas standards for power plants, following through with the authority conferred by a 2007 Supreme Court ruling declaring carbon dioxide a pollutant under the Clean Air Act.  The new regulation effectively bans new coal-fired power plants unless they capture and sequester carbon dioxide.  Advanced natural-gas plants would meet the standard without mitigation, while existing power plants would be grandfathered in.  The regulation would require new power plants to emit no more than 1,000 pounds of CO2 per megawatt‐hour of electricity generated.

What are the implications?  It is clear that the short-term impact will be minimal: cheap natural gas derived from plentiful shale deposits is already overtaking coal as a source of power.

An average coal-fired plant generates more than 1,700 pounds of carbon dioxide per megawatt-hour. The majority of natural-gas-fired plants – and the bulk of power plants currently under construction – emit less than the new standard, approximately 800 pounds per megawatt-hour.
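Checking a plant against the 1,000-pound standard reduces to multiplying the fuel's carbon content by the plant's heat rate. A sketch using illustrative emission factors and heat rates (the specific numbers are assumptions, chosen to land near the figures above):

```python
# lb CO2 per MWh = fuel emission factor (lb CO2 per MMBtu of fuel)
#                  x heat rate (MMBtu of fuel burned per MWh generated).
# The factors and heat rates below are assumed illustrative values.
STANDARD_LB_PER_MWH = 1000

def emissions_rate(lb_co2_per_mmbtu: float, heat_rate: float) -> float:
    return lb_co2_per_mmbtu * heat_rate

coal = emissions_rate(210.0, 10.0)  # ~2,100 lb/MWh: well over the limit
gas = emissions_rate(117.0, 6.9)    # ~807 lb/MWh: comfortably under
print(coal > STANDARD_LB_PER_MWH, gas < STANDARD_LB_PER_MWH)  # True True
```

The gap between the two fuels is wide enough that, absent carbon capture, the standard functions as the de facto coal ban the article describes.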

Environmentalists praised the proposed restrictions, while the coal industry warned that the change would lead to higher electricity prices.  “Today we’re taking a common-sense step to reduce pollution in our air, protect the planet for our children, and move us into a new era of American energy,” said EPA Administrator Lisa Jackson.  “We’re putting in place a standard that relies on the use of clean, American-made technology to tackle a challenge that we can’t leave to our kids and grandkids.”  Currently, there is no consistent national limit on the amount of carbon emissions that new power plants can release.  According to an EPA fact sheet, the agency was obliged by the landmark 2007 Supreme Court ruling “to determine if (the emissions) threaten public health and welfare.”  In December of 2009, the EPA formally confirmed that greenhouse gases “endanger the public health and welfare of current and future generations.”

Older coal plants have already been going offline, thanks to low natural gas prices and weaker demand for electricity. Nevertheless, some accused the Obama administration of clamping down on low-priced, domestic energy sources and said the regulation raises questions about the seriousness of the president’s pledge for an “all-of-the-above” energy policy.  “This rule is part of the Obama administration’s aggressive plan to change America’s energy portfolio and eliminate coal as a source of affordable, reliable electricity generation,” said Representative Fred Upton (R-MI), chairman of the House Energy and Commerce Committee.  “EPA continues to overstep its authority and ram through a series of overreaching regulations in its attacks on America’s power sector.”

“There are areas where they could have made it a lot worse,” said Scott Segal, director of the Electric Reliability Coordinating Council, a coalition of power companies.  Nevertheless, “the numerical limit allows progress for natural gas and places compliance out of reach for coal-fired plants” not planning to capture carbon dioxide.  Steve Miller, CEO and President of the American Coalition for Clean Coal Electricity, a group of coal-burning electricity producers, was more negative about the proposal.  “The latest rule will make it impossible to build any new coal-fueled power plants and could cause the premature closure of many more coal-fueled power plants operating today,” Miller said.

Writing for Reuters, John Kemp, Senior Market Analyst, Commodities and Energy, notes that “Because natural gas is currently so much cheaper than coal, the agency projects gas-fired units will be the facilities of choice until at least 2020.  ‘Energy industry modelling forecasts uniformly predict that few, if any, new coal-fired power plants will be built in the foreseeable future,’ according to the proposed rule.  The key word is ‘foreseeable’.  No one can predict the economics of natural gas as far ahead as 2020, let alone 2030.  Recent development of abundant gas reserves through fracking may have caused prices to plunge, leading to a ‘golden age of gas’, but just seven years ago the industry was gripped by panic about gas production peaking and thought America stood on the brink of needing to import increasing quantities of expensive gas.”

Jeff Goodell of Rolling Stone writes: “So this new rule is, at best, a baby step in the right direction.  As always with the climate crisis, physics is moving much faster than politics.  Just yesterday top scientists warned that global warming is close to irreversible now. In the biggest sense, we’re still doing next to nothing to confront this crisis.  Global carbon pollution is rising faster than ever, and the weather – to say nothing of our future climate – is getting wilder.  The urgency of our situation just underscores the need for an economy-wide price on carbon, or a cap-and-trade system, which would impact all major emissions sources and actually limit the amount of carbon we dump into the atmosphere, rather than just speeding the shift from coal to gas.  Still, this is an important moment, a small sign of progress.  Goodbye, Mr. Coal.  Don’t let the door hit you on the way out.”

Office Buildings Have to Get on the Smart Grid

Monday, April 9th, 2012

Back in the day, hot weather that overtaxed the power grid meant that office buildings had to turn down the air conditioning to save electricity, while employees noticed their surroundings getting appreciably warmer.  Today, because more buildings are connected to a smarter grid, fewer such adjustments need to be made; one option is switching to a secondary power source, such as roof-mounted solar panels.

Writing in National Real Estate Investor magazine, Managing Editor Susan Piperato says that “Smart grids are digital networks connecting utilities, power-delivery systems and buildings.  Traditionally, when local energy grids become overtaxed, what’s referred to as a ‘demand event,’ a utility company must fire up additional power stations to meet increased demand or ask its biggest users to reduce power consumption.  If a building that isn’t connected to the smart grid receives notice of a demand event, the building engineer must manually reduce the building’s power usage in some way.  When the event is over, the engineer manually revs up the building’s energy consumption level again.  But by connecting to smart grids and utilizing demand response (DR) systems, a building can determine automatically through its building management system (BMS) how much electricity it needs at various times of day.

“DR systems manage buildings’ consumption of electricity in response to supply conditions and respond to a utility company’s demand event by automatically reducing the amount of power being used or starting on-site power generation through, say, a solar panel array or wind turbine.”
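The automated response Piperato describes can be sketched in a few lines. The following Python sketch is purely illustrative: the `BuildingState` fields and the handler are hypothetical names of our own, not any real vendor's BMS or DR API. It captures the basic logic of covering a requested reduction first with on-site generation and then by shedding load.

```python
# Hypothetical sketch of a demand-response (DR) handler for a building
# management system (BMS). All names here are illustrative assumptions.

from dataclasses import dataclass


@dataclass
class BuildingState:
    grid_draw_kw: float           # power currently drawn from the utility grid
    onsite_generation_kw: float   # unused capacity from solar panels / turbines


def handle_demand_event(state: BuildingState,
                        requested_reduction_kw: float) -> BuildingState:
    """Respond to a utility demand event: first substitute on-site
    generation for grid power, then shed whatever reduction remains
    (e.g., by raising HVAC setpoints or dimming lights)."""
    # Cover as much of the reduction as possible with on-site generation.
    from_onsite = min(requested_reduction_kw, state.onsite_generation_kw)
    remaining = requested_reduction_kw - from_onsite
    # Both substitution and load shedding lower the draw from the grid.
    new_draw = max(0.0, state.grid_draw_kw - from_onsite - remaining)
    return BuildingState(
        grid_draw_kw=new_draw,
        onsite_generation_kw=state.onsite_generation_kw - from_onsite,
    )
```

A real system would, of course, negotiate the event with the utility and ramp equipment gradually; the point is that the decision logic runs without a building engineer touching anything.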

Unfortunately, most commercial building owners are not using these advanced technologies.  Surprisingly, a preliminary CoR Advisors survey found that most commercial building owners and managers aren't even considering a connection.

According to the survey, a mere 19 percent of buildings have some type of automated connection to the smart grid, while 32 percent are using DR systems.  CoR Advisors President and CEO Darlene Pope said that 68 percent of building owners are not planning to connect their buildings to automated smart grid systems within the next three to five years.  The survey, commissioned by the Continental Automated Buildings Association and conducted by CoR Advisors, asked 25,000 commercial building owners and managers about "their attitude about smart buildings and the smart grid," according to Pope.  It covered approximately 12,000 buildings comprising some 1.2 billion square feet.

A Federal Energy Regulatory Commission rule allows smart buildings in DR programs to recoup some of their investment by shedding load during times of non-peak demand and selling that excess capacity at spot pricing, Pope said.  “So if the price of electricity is $200 a kilowatt hour in Dallas in the middle of August because there’s such a demand for capacity…and you have a building in New Jersey that you want to shed load, and you want to sell it in Texas, you can do that.  While you might be paying 17 cents per kilowatt hour in New Jersey, you can sell it to people in Dallas for $200.”
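Taking Pope's quoted figures at face value, the payoff of shedding load at home and selling the freed capacity into a remote market is simple arithmetic: the spot revenue plus the local cost avoided. A minimal illustrative calculation (the function name is ours, and real DR settlements involve far more than this):

```python
def dr_arbitrage_profit(kwh_shed: float, home_rate: float,
                        spot_price: float) -> float:
    """Total benefit of shedding `kwh_shed` kilowatt-hours locally and
    selling that capacity at a remote spot price: the spot revenue plus
    the cost of electricity not purchased at home."""
    spot_revenue = kwh_shed * spot_price   # paid for the capacity sold
    avoided_cost = kwh_shed * home_rate    # electricity not bought locally
    return spot_revenue + avoided_cost
```

With the quoted numbers, a New Jersey building paying 17 cents per kilowatt-hour that sheds 100 kWh into a Dallas market pricing at $200 per kilowatt-hour would see a benefit of roughly $20,017.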

Why don't more building owners take advantage of these new technologies?  According to John Bredehorst, executive vice president with WSP Flack+Kurtz, his firm's "more responsible clients" are already participating, but such proactive clients are the exception, not the rule.  The smart grid is still nascent, and "No one wants to be a guinea pig."  Many existing buildings lack the infrastructure to support the technology, meaning that their owners don't have the ability to cut electrical consumption by switching to a secondary power source.

California is showing the most interest in smart-grid buildings, according to WSP Flack+Kurtz's Clark Bisel, senior vice president.  He is managing the construction of 350 Mission Street in downtown San Francisco, which aims for LEED Platinum certification and will have a smart grid connection and DR systems.  It's a multifaceted project, but Bisel sees 350 Mission as "a very good idea from the building owner's perspective.  Demand response is becoming more of a discussion topic," he said, with the big electrical utilities directly approaching customers and offering competitive rates for DR programs.

“California is a hotbed for this,” Bisel said.  “Frankly, the utility crisis of 2000 is still in people’s minds, so that may be more of a reason why (smart grid and DR) is active out here.”  If there was more new construction, according to Bisel, the smart grid and DR would take off even faster.  With new construction, “You have architects, engineers, and developers all very engaged in the whole dialogue so when a utility company approaches, they find a very receptive audience.”  When it comes to existing buildings, operations are decentralized and owners are “more interested in making tenants comfortable.”

WSP Flack+Kurtz's Bredehorst says it will take 10 years for the smart grid and DR to catch on.  With more smart buildings being constructed, he said, existing buildings will need to make alterations to stay competitive.  Additionally, utility companies need to become more proactive: "They [need to] make it easy for building owners to be able to upgrade their building's control system and tie it in with a utility," he said.  "They need to make it so that it'll be enough of an incentive for (buildings) to reduce their load, but also easy enough that (building owners) don't have to tie (DR) in with their entire infrastructure and change out a lot of equipment."

Want an Energy Efficient Home? Push the Green Button

Wednesday, March 28th, 2012

Want more control over electrical use in your home?  The Green Button Initiative might be the answer. "Imagine being able to shrink your utility bill, or knowing the optimal size and cost-effectiveness of solar panels for your home, or verifying that energy-efficiency retrofit investments have successfully paid for themselves over time," said Aneesh Chopra, Chief Technology Officer for the United States.  "Far too often these and similarly important — and potentially money-saving — opportunities are unavailable to us.  Why?  Because consumers haven't had standard, routine, easy-to-understand access to their own energy usage data."

To help solve this problem, the Obama Administration recently announced a major step forward.  According to Chopra, "I announced the launch of the Green Button initiative, an Administration-led effort based on a simple, common-sense goal: provide electricity customers with easy access to their energy usage data in a consumer-friendly and computer-friendly format via a 'Green Button' on electric utilities' websites.  With this information in hand, customers can take advantage of innovative energy apps to help them understand their energy usage and find ways to reduce electricity consumption and shrink bills, all while ensuring they retain privacy and security."

Access to household energy use data is key to helping consumers conserve energy and save money. Because Green Button is available to everyone, it is already driving innovation among website and software developers interested in using that standard to provide innovative services –  from information about how to save energy or choose appropriately sized solar panels to fun Facebook apps.  Additionally, the Green Button is likely to support a new generation of interactive thermostats and virtual energy audits that will recommend retrofits that will improve efficiency in homes and businesses.
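Because Green Button data follows a published XML schema (the NAESB ESPI format, delivered as an Atom feed), the app developers mentioned above can consume it with ordinary tools. Here is a minimal sketch using Python's standard library; the XML below is a trimmed, illustrative fragment of interval data, not a complete utility export:

```python
# Sketch of reading Green Button interval data. Real exports are Atom
# feeds wrapping NAESB ESPI elements; this sample keeps only the core
# interval block for illustration.

import xml.etree.ElementTree as ET

SAMPLE = """<IntervalBlock xmlns="http://naesb.org/espi">
  <IntervalReading>
    <timePeriod><duration>3600</duration><start>1332630000</start></timePeriod>
    <value>1200</value>
  </IntervalReading>
  <IntervalReading>
    <timePeriod><duration>3600</duration><start>1332633600</start></timePeriod>
    <value>950</value>
  </IntervalReading>
</IntervalBlock>"""

NS = {"espi": "http://naesb.org/espi"}


def total_wh(xml_text: str) -> int:
    """Sum the energy values (watt-hours) of every interval reading."""
    root = ET.fromstring(xml_text)
    readings = root.findall(".//espi:IntervalReading/espi:value", NS)
    return sum(int(v.text) for v in readings)
```

An energy-audit app might aggregate these hourly readings into daily or monthly totals and compare them against weather data or appliance profiles.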

“Green Button marks the beginning of a new era of consumer control over energy use, and local empowerment to cut waste and save money,” Chopra said.  “With the benefits of open data standards, American app developers and other innovators can apply their creativity to bring the smart grid to life for families — not only in California but in communities all across the nation.”

Writing for the View on Energy blog, Jeanne Roberts says that “What it means for consumers is a way to monitor and take charge of their home energy use, via computer technology, and hopefully to lower monthly utility bills as a result. In short, a little bit of ‘green’ technology that could allow consumers to save a lot of green if used wisely.  As an added advantage, energy use reduction nationwide by residential consumers could help the nation reach Obama’s stated goals of energy security (by reducing dependence on oil) and energy efficiency, both of which lead to a ‘clean’ energy future.”

Philip Henderson has an interesting take on the Green Button after perusing his difficult-to-read electric bill.  "If my bill were entered in a Worst Utility Bill contest, it probably would not win — I've seen some that are worse.  This is NOT to denigrate my utility company — it is a power company, after all, not a design shop.  This is why the 'Green Button' project is so interesting and important.  It's based on a simple concept — give the customer his or her utility billing information in a form that is actually usable.  Click a green button on the utility's website and billing data is delivered and can be used by various apps.  With my data, I will be able to use any billing presentation system I want — I can find the one that suits me best.  (Can't you hear the iPhone developers tapping away to create cool new tools?)"