- New Battery System Could Reduce Buildings’ Electric Bills
- Insurance Firms Increasingly Investing in Wind Energy in Europe
- Record-Breaking Welsh Wind Farm Approved
- Most-Polluted Cities in US (American Lung Association Infographic), & Find Out Your City’s Grade
- 3 Charts Showing that Solar Power Has Hit a Tipping Point
- U.S. Navy Rides the Terahertz Wave to Next-Gen Electronics
- 500 Years of Underground Carbon Storage Mapped across North America
- Eight Automakers Agree on Standardized Electric Vehicle Charging
- Turning Tons of Food into Energy
- Toyota Unveils First All-Electric SUV
- L.A. Can Generate 5 GW of Solar Power Using Virtually No Additional Land, Minimal Backup
- One Plug to Rule them All: New EV Fast-Charging System
- FLOW Energy Winners
- Solar Schools Goes National (UK)
- Largest Low-Concentrating Solar PV Power Plant in World Now Online
Posted: 09 May 2012 07:16 AM PDT
An operating prototype zinc-anode battery system has been developed and is now housed in the basement of Steinman Hall on The City College of New York campus. It consists of 36 individual 1-kWh nickel-zinc flow-assisted cells strung together and operated by an advanced battery management system (BMS) that controls the charge/discharge protocol. Refinements to that protocol could eventually yield a battery capable of 5,000 to 10,000 charge cycles and a useful life exceeding ten years.
"This is affordable, rechargeable electricity storage made from cheap, non-toxic materials that are inherently safe," said Dr. Sanjoy Banerjee, director of the CUNY Energy Institute and distinguished professor of engineering in CCNY's Grove School of Engineering. "The entire Energy Institute has worked on these batteries – stacking electrodes, mounting terminals, connecting to the inverters – and they are going to be a game changer for the electric grid."
The prototype is currently being expanded to 100 kWh, with another 200 kWh expected to be installed later this year. At that point it will be capable of meeting more than 30 percent of Steinman Hall’s peak-demand power needs, saving the college $6,000 or more per month.
A battery such as this could be installed in industrial facilities and large commercial properties, and could be produced for approximately $300 to $500 per kWh, which amounts to a payback period of three to five years for many applications.
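The quoted payback range follows from simple arithmetic. Here is a minimal sketch; note that the 300-kWh capacity and the $30,000/year savings figure are illustrative assumptions chosen to show how a three-to-five-year payback can arise, not numbers from the article:

```python
def payback_years(capital_cost_usd, annual_savings_usd):
    """Simple payback period: upfront cost divided by yearly savings."""
    return capital_cost_usd / annual_savings_usd

# Hypothetical 300-kWh installation at the quoted $300-$500/kWh:
capacity_kwh = 300
low_cost = capacity_kwh * 300    # $90,000
high_cost = capacity_kwh * 500   # $150,000

# Annual savings are site-specific; $30,000/year is an illustrative figure.
annual_savings = 30_000
print(payback_years(low_cost, annual_savings))   # 3.0 years
print(payback_years(high_cost, annual_savings))  # 5.0 years
```

Actual payback depends heavily on local utility rates and demand charges, which is why the article hedges with "for many applications."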
Source: City College of New York
Posted: 09 May 2012 07:06 AM PDT
Wind energy is cheap. As such, it’s a good long-term investment for companies with the capital… such as insurance companies. A new analysis from PwC, European Power & Renewables Deals: Quarterly M&A Outlook, finds that insurers and other financial bodies are increasingly looking to wind energy in Europe for secure, long-term investment.
“Insurers, infrastructure funds and other financial investment entities are becoming active bidders in sales of European power assets, particularly electricity and gas distribution network companies but, also increasingly, existing or 'just operational' windpower projects,” PwC Russia writes. “At a time of low interest rates and market investment uncertainty, such assets offer stable, long-term and predictable returns.”
PwC predicts that solar will start to fill this role and receive these investments to a much larger degree soon as well.
“Looking ahead, solar PV looks set to breakthrough and feature more prominently in renewables deal flow. The sector has been hit by subsidy cuts but, as subsidies are being cut, prices are actually falling more rapidly. This is enabling the sector to remain attractive as a source of predictable long-term returns in jurisdictions where investors can have confidence that any future changes will not be applied retrospectively.”
Posted: 09 May 2012 06:50 AM PDT
While this 299-MW wind farm just approved for South Wales doesn’t compare with the insane 20,000-MW Gansu Wind Farm slowly being built in China, or even the 3,000-MW Alta Wind Energy Center that leads the US (currently at 1,020 MW of capacity), the Pen Y Cymoedd Wind Energy Project is huge by most standards. When completed by Vattenfall, it will be the largest wind farm in England and Wales.
Energy minister Charles Hendry has just approved the project. Located between Neath and Aberdare, the wind farm will include 76 wind turbines and is expected to generate enough electricity to supply up to 206,000 homes a year.
The wind farm is expected to increase renewable electricity generation in Wales by a considerable 37%.
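As a back-of-envelope check on the "206,000 homes" figure, the implied per-household consumption can be computed from the nameplate capacity. The 30% capacity factor assumed below is a typical value for UK onshore wind, not a figure from the article:

```python
# Rough sanity check of the homes-supplied claim.
capacity_mw = 299
capacity_factor = 0.30   # assumed; typical for UK onshore wind
hours_per_year = 8760

annual_output_mwh = capacity_mw * capacity_factor * hours_per_year
homes_served = 206_000
mwh_per_home = annual_output_mwh / homes_served

print(round(annual_output_mwh))   # 785772 MWh/year
print(round(mwh_per_home, 1))     # 3.8 MWh per home per year
```

Roughly 3.8 MWh per household per year is in line with typical UK domestic electricity use, so the claim is plausible under that assumed capacity factor.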
“Onshore wind plays an important role in enhancing our energy security,” said Hendry. “It is the cheapest form of renewable energy and reduces our reliance on foreign fuel. This project in South Wales will generate vast amounts of home-grown renewable electricity and provide a significant benefits package for the local community.”
The wind farm, likely to be completed and turned on in 2016, is projected to be in operation for 25 years and is expected to pump £1 billion into the Welsh economy in its three years of construction.
Vattenfall still needs to review the energy secretary’s letter of consent, but a positive decision and final construction plans are expected soon.
Neither of the communities directly affected by the wind farm was opposed to it.
Posted: 09 May 2012 06:27 AM PDT
But first… in case your city isn’t on the infographic below and you’re curious about what grade it got, you can enter your zip code in the widget below to find out.
Now, on to the infographic and guest post:
Good News is in the Air (And Some Not-so-good, Too)
Ever wonder how safe the air is to breathe where you live? As a pulmonologist and as chair of the American Lung Association's National Board of Directors, I'm pleased to share with you the best tool we have for answering that question: the American Lung Association's State of the Air 2012 report.
We just released this, our 13th annual report card on April 25. With the latest report, we found a mix of really good and not-so-good news.
But first, let me give you 5 quick reasons why healthy air should matter to you:
The State of the Air 2012 report covers the years 2008-2010, the most recent quality-reviewed data available. So what did we find? Here's the good news:
And now, the not-so-good news:
As you can see, we have plenty of work to do to ensure that all Americans, not just some of us, breathe clean, healthy air.
Too many people continue to develop life-threatening health conditions, and many even die prematurely, as a result of inhaling dirty air. Unfortunately, some in Congress are steadfast in their efforts to weaken the Clean Air Act.
We urge you to join in our fight for healthy air. Start by finding the grade your community got in our latest State of the Air report. Type in your zip code in the report card above.
Contact your members of Congress and ask them to support the Clean Air Act—including standing up against any actions to weaken, block or delay full implementation of this lifesaving law.
We also want to hear why you want healthy air. Share with us why healthy air means so much to you and your family. Learn more about how air pollution impacts your community by visiting: www.stateoftheair.org.
Together, we can have truly healthy air in our future, but that all depends on our actions today. We look forward to hearing from you.
Albert A. Rizzo, MD
Posted: 09 May 2012 05:48 AM PDT
As the economics of solar PV continue to improve steadily and dramatically, McKinsey analysts conclude that the total “economic potential” of solar PV deployment could reach 600-1,000 gigawatts (1 million megawatts) by 2020.
In the year 2000, the global demand for solar PV was 170 megawatts.
That doesn’t mean 1 million megawatts will get installed by 2020; it’s just an estimate of the economic competitiveness of solar PV. When factoring in real-world limitations like the regulatory environment, availability of financing, and infrastructure capabilities, the actual yearly market will be closer to 100 gigawatts in 2020.
That could bring in more than $1 trillion in investments between 2012 and 2020.
The McKinsey report, appropriately named "Darkest Before Dawn," highlights three crucial factors that are giving the solar industry so much momentum — even with such a violent shakeout occurring in the manufacturing sector today.
1. Because solar mostly competes with retail rates, the economic potential for the technology in high resource areas is far bigger than actual deployment figures would suggest. McKinsey predicts that the cost of installing a commercial-scale solar PV system will fall another 40 percent by 2015, growing the "unsubsidized economic potential" (i.e. the economic competitiveness without federal subsidies) of the technology to hundreds of gigawatts by 2020.
2. The most important cost reductions in the next decade will come not through groundbreaking lab-scale improvements, but through incremental cost reductions due to deployment. The McKinsey analysis shows how dramatically these cumulative cost improvements can change the economics of solar. (For more, see: Anatomy of a Solar PV System: How to Continue "Ferocious Cost Reductions" for Solar Electricity.)
3. Solar is already competitive in a variety of markets today. As the chart below illustrates, there are at least three markets where solar PV competes widely today: Off-grid, isolated grids, and the commercial/residential sectors in high-resource areas. Of course, the competitiveness of the technology varies dramatically depending on a variety of local factors. But this comparison shows just how steadily the cross-over is approaching.
Wait, solar is actually competitive? Didn't the death of Solyndra mean the death of the solar industry? Addressing the solar skeptics, the McKinsey analysts counter the notion that the solar sector is down for the count:
The short-term picture for solar is extraordinarily challenging, particularly for manufacturers trying to figure out how to make a profit with such a massive oversupply of panels on the market. But this is not an industry in its death throes; these are natural pains for a disruptive, fast-growing industry. The tipping point is upon us.
This article was originally published on Climate Progress and is republished with permission.
Posted: 09 May 2012 05:00 AM PDT
The U.S. Navy is behind a push to exploit one of the "hottest" areas of the electromagnetic spectrum, the terahertz band. The Office of Naval Research contributed to a breakthrough project at Lawrence Berkeley National Laboratory last fall with the help of graphene nanoribbons, and just last month a team of ONR-funded researchers at the University of Notre Dame announced another new milestone.
The attraction of the terahertz band
Terahertz waves are situated between the microwave and optical light frequencies, at the "farthest end of the far infrared." In communications, they could transmit far greater amounts of information than either radio waves or microwaves.
In imaging, terahertz frequencies could lead to the development of diagnostic equipment that avoids the health risks of x-rays.
However, expanding the real-world applications of this part of the spectrum has been stuck for want of a material that can be used to manipulate terahertz waves with precision.
Graphene and terahertz waves
The terahertz worm began to turn in 2004, when a team of researchers in the U.K. literally used sticky tape to lift a one-atom thin sheet of carbon from a chunk of graphite.
Called graphene, the new material possesses outsized strength and unique electrical properties, which have made it the focus for bringing about the next generation of super fast, super small, flexible and even transparent electronic devices.
As Notre Dame researcher Berardi Sensale-Rodriguez explained in a prepared statement:
"A major bottleneck in the promise of THz technology has been the lack of efficient materials and devices that manipulate these energy waves. Having a naturally two-dimensional material with strong and tunable response to THz waves, for example, graphene, gives us the opportunity to design THz devices achieving unprecedented performance."
Graphene nanoribbons to the rescue
Last fall's breakthrough at Lawrence Berkeley involved the fabrication of graphene nanoribbons, made by etching patterns into a sheet of carbon atoms laid over a silicon oxide substrate. An overlay of ion gel was used to complete the gated structure of a semiconductor system.
The team was able to "tune," or manipulate, the ribbons to control the movement of electrons. This collective movement, or oscillation, of electrons is referred to as a plasmon.
According to Berkeley researcher Feng Wang, plasmons can be observed by eye in the unique glow of medieval-era stained glass, which is caused by electrons oscillating on the surface of metal nanoparticles such as gold and copper.
A similar effect occurs in graphene but at lower frequencies, which are not visible to the naked eye.
The Berkeley team discovered that altering the width of the graphene nanoribbons will cause the electron waves to "slosh" back and forth at different frequencies, which makes the ribbons absorb different frequencies of light.
The demonstration marked a step along the way to practical, real-world applications partly because the team was able to measure the difference in absorption rates at room temperature, in contrast to other research tracks that require temperatures in the absolute zero range.
The findings of the Notre Dame team, published in mid-April, also involved practical, room-temperature operation. The team was able to demonstrate proof of concept for a graphene-based modulator, building on previous research into the use of an electron gas to manipulate terahertz waves.
The idea of using an electron gas dates all the way back to 2006, so, given the pace of research in both the Berkeley and Notre Dame cases, a practical graphene/terahertz device is far from bouncing out of the laboratory door and onto retail shelves.
Aside from challenges within the research itself, the commercialization of graphene devices depends on the development of cost effective methods for fabricating mass quantities of graphene, and sticky tape will only get you so far. At this point there have been some promising developments, but the goal has proved elusive.
Not to worry, though – the Navy is all over that one, too. Through a separate ONR-funded program, researchers at Rice University are developing a simple, one-step process for creating nanoscale graphene discs.
Follow me on Twitter: @TinaMCasey.
Posted: 09 May 2012 02:24 AM PDT
North America has at least 500 years of underground carbon dioxide (CO2) storage capacity, according to the North American Carbon Storage Atlas (NACSA). The project, a joint venture between the United States, Canada, and Mexico, is the first-ever atlas to map out potential storage sites.
While the atlas includes high- and low-range estimates, the low (and more realistic) end finds 136 billion metric tons of storage in oil and gas fields, 65 billion metric tons in coal fields, and 1.7 trillion metric tons in saline reservoirs. Combined, these sites represent over 500 years of storage.
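The "500 years" headline implies an assumed annual injection rate, which can be backed out of the low-range estimates above. The implied rate below is my own arithmetic, not a figure stated by the atlas:

```python
# Low-range storage estimates from the atlas, in billion metric tons (Gt) of CO2.
oil_and_gas = 136
coal = 65
saline = 1700   # 1.7 trillion metric tons

total_gt = oil_and_gas + coal + saline   # total low-end storage capacity
implied_rate = total_gt / 500            # Gt/year that would fill it in 500 years

print(total_gt)                # 1901 Gt
print(round(implied_rate, 1))  # ~3.8 Gt of CO2 sequestered per year
```

For scale, that implied rate is on the order of the combined annual emissions of North America's large stationary sources, which is presumably how the atlas arrives at its headline figure.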
Key Data Merged
In addition to mapping out potential storage sites, NACSA also plots the locations of 2,250 large, stationary carbon dioxide sources, mainly large fossil-fuel burning power plants. The combined data has been used to create an online viewer and website, and integrates contributions from the 400 organizations in the U.S. Department of Energy's (DOE) Regional Carbon Sequestration Partnerships.
By overlaying the two sides of the carbon capture and sequestration (CCS) equation — emissions sources and storage sites — the atlas may help facilitate building the infrastructure necessary to capture emissions while all three countries transition to clean energy sources.
Cost Hurdles Remain
The feasibility of CCS has long been debated, but it would be a critical tool for slowing climate change. CCS covers many different types of technologies, but the basic theory is that CO2 emissions are captured at large point sources (like power plants) and chilled to a liquid form. Once converted, the CO2 would be piped to suitable locations and safely sequestered underground.
Adding the requisite equipment to existing smokestacks and building new pipelines would cost millions of dollars per site, a significant hurdle to the technology best embodied by the DOE's oft-delayed FutureGen project.
Some Testing Completed
However, a related project may be reducing the gap between potential and reality by testing potential CO2 storage sites. DOE recently announced test drilling had been completed at three potential underground storage sites, with two located in proximity to significant emissions sources.
The Newark basin, which runs under a heavily industrialized section of New York, New Jersey, and Pennsylvania, is estimated to have a storage capacity of up to 10 billion metric tons. The Rock Springs Uplift, in southwestern Wyoming, is located near several of the state's largest emissions sources and has a storage capacity of 23 billion metric tons.
Source: Green Car Congress
Posted: 08 May 2012 06:52 PM PDT
This marks the first step toward harmonizing the electric vehicle market by creating one charging option to suit them all. This way, there will be no proprietary charging systems requiring a specific charging setup. Now, one charging station will charge multiple vehicles.
This removes one more barrier to the electric car becoming a widely adopted mode of transport for the common household.
The new system integrates one-phase AC-charging, fast three-phase AC-charging, DC-charging at home, and ultra-fast DC-charging at public stations into one inlet.
“The system will optimize customer ease of use and will accelerate more affordable deployment of electrified vehicles and charging infrastructure,” said Ford in a statement.
“The system maximizes the capability for integration with future smart grid developments through common broadband communication methods regardless of the global location of the charging system,” General Motors said in a statement of its own, and went on to add that the combined charging approach will reduce development and infrastructure complexity, improve charging reliability, reduce the total cost-of-ownership for end customers and provide low maintenance costs.
Live charging demonstrations will be conducted during the Electric Vehicle Symposium 26 (EVS26) May 6-9.
Posted: 08 May 2012 06:49 PM PDT
American River Packaging in the Sacramento, CA area will soon begin using an anaerobic digestion system to convert 7.5 tons of daily food waste into 1,300 kWh of renewable energy per day. About 37% of the company’s electricity will be generated by the waste-to-energy technology. Converting the large amounts of food waste will also divert about 2,900 tons of waste from landfills each year.
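The per-ton energy yield implied by those figures is worth spelling out. This is my own arithmetic from the numbers quoted above:

```python
# Energy yield implied by the American River Packaging figures.
food_waste_tons_per_day = 7.5
energy_kwh_per_day = 1300

kwh_per_ton = energy_kwh_per_day / food_waste_tons_per_day
annual_tons = food_waste_tons_per_day * 365

print(round(kwh_per_ton))  # ~173 kWh per ton of food waste
print(annual_tons)         # 2737.5 tons/year from digestion alone
```

The daily intake works out to roughly 2,740 tons per year, slightly below the ~2,900-ton diversion figure, which presumably includes other diverted material or higher-throughput days.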
Bacteria are used in anaerobic digestion to break down biodegradable waste into energy in the form of biogas. Components of this fuel source are methane, carbon dioxide, and trace amounts of hydrogen, carbon monoxide, and nitrogen. Other useful byproducts are compost, water, and natural fertilizer.
Anaerobic digestion begins when one group of microorganisms breaks down organic material into compounds that other organisms convert into organic acids. Anaerobic bacteria then consume these acids, completing the decomposition process.
The anaerobic digestion system being used at American River Packaging is the result of ten years of research by Ruihong Zhang, a UC Davis professor of biological and agricultural engineering. Her technology has been licensed by the start-up Clean World Partners, which focuses on waste management systems that employ anaerobic digestion to generate energy and divert some of the millions of tons of organic matter currently going into landfills.
“I applaud Professor Zhang for this tremendous accomplishment,” said UC Davis Chancellor Linda P.B. Katehi. ”Scientists like Professor Zhang are helping UC Davis address the most pressing global problems of our time. Her work brings us a giant step closer to the sustainable future we all hope for.”
Posted: 08 May 2012 06:44 PM PDT
The RAV4 EV is expected to go on sale in late summer of 2012 through select dealers in four major Californian metropolitan markets: Sacramento, the San Francisco Bay Area, Los Angeles/Orange County, and San Diego. Its price will be $49,800, and Toyota expects to sell around 2,600 units over the next three years.
"It's all about blending the best of two worlds," said Bob Carter, group vice president and general manager of the Toyota division. "The all-new RAV4 EV marries the efficiency of an EV with the versatility of a small SUV – in fact, it is the only all-electric SUV on the market."
According to the Toyota press release, “the RAV4 EV combines a Tesla designed and produced battery and electric powertrain with Toyota's most popular SUV model.”
“The front wheel drive RAV4 EV allows drivers to select from two distinctly different drive modes, Sport and Normal. In Sport mode, the vehicle reaches 0-60 mph in just 7.0 seconds and has a maximum speed of 100 mph. Normal mode achieves 0-60 mph in 8.6 seconds with a maximum speed of 85 mph. Maximum output from the electric powertrain is 154 HP (115kW) @ 2,800 rpm.”
"We believe that the RAV4 EV will attract sophisticated early technology adopters, much like the first-generation Prius," said Carter. "It's designed for consumers who prioritize the environment and appreciate performance. We look forward to seeing how the market responds.”
Posted: 08 May 2012 06:31 PM PDT
Benefits of Distributed Rooftop Solar Schemes Like This
Because rooftop solar schemes involve only a few or several kilowatts of solar panels per rooftop, rooftop solar panels are spread out (distributed) over a much larger area than they would be if installed in a typical solar power plant.
Typical utility-scale solar power plants contain many solar panels clustered just a few feet apart, and this exposes them to the worst reliability issue that solar power plants have — clouds, which I discuss in the “Reliability and Cost Benefits” section below.
The reliability issue also creates a need for a large amount of backup from other non-solar power plants, or from batteries, which are a more expensive alternative.
Reliability and Cost Benefits
At a conventional solar power plant, storm clouds can cover most of the solar panels and significantly reduce their power output.
When this happens, natural gas peaking power plants may be switched on to compensate, or modern baseload power plants (natural gas, coal, nuclear, geothermal) may be ramped up or down. GE’s recent combined cycle natural gas (CCNG) plant, for example, can adjust its power production by 50 MW per minute to compensate for those pesky clouds.
Now, for the good news: Distributed solar schemes have the potential to reduce the need for backup generators and energy so much that only a small fraction of the solar panels would need backup at any given moment. In other words, little energy storage or backup is required, which saves a massive amount of money.
Large storms are an exception to this rule, but they are not the norm.
Why: Spreading solar panels out over hundreds or even thousands of miles (across the U.S. or large U.S. regions, for example) means that when a few houses are overcast by clouds, less than 20 kW of solar panels are affected, compared to an example 20,000 kW of panels at a utility-scale solar power plant.
The concentrated plant’s exposure is up to 1,000 times worse!
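That ratio, plus the statistical smoothing that comes from many independent sites, can be sketched in a few lines. The 1,000-site count below is illustrative, chosen to match the example figures in the text:

```python
import math

# Cloud exposure: one utility-scale plant vs. a few overcast rooftops.
plant_kw = 20_000          # capacity a storm can cover at a single plant
rooftops_affected_kw = 20  # capacity affected when a few houses are overcast

exposure_ratio = plant_kw / rooftops_affected_kw
print(exposure_ratio)  # 1000.0 -- the "up to 1,000 times worse" figure

# More generally, for N sites with independent cloud cover, the variability
# of the aggregate output shrinks roughly with the square root of N:
n_sites = 1000
relative_variability = 1 / math.sqrt(n_sites)
print(round(relative_variability, 3))  # ~0.032 of a single site's variability
```

Real cloud cover is spatially correlated, so the square-root smoothing is an upper bound on the benefit, but it captures why distributed arrays need far less backup.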
Another potential cost benefit: land does not have to be purchased for rooftop solar panels, because they are put on existing rooftops. Most rooftop space is not put to use, so putting solar panels on it isn’t a waste of space; the panels generate clean electricity while occupying no additional land.
h/t PR Newswire
Posted: 08 May 2012 05:23 PM PDT
Ford, GM, Chrysler and five top German car makers are on board with a new standard connecting system that can fast-charge an electric vehicle in as little as 15 minutes. It's a killer combination of standardization and convenience that could break the U.S. electric car market wide open.
Standardization is the linchpin of the gasoline powered auto industry – imagine if you had to hopscotch over half a dozen gas stations to find one where the nozzle could fit into your tank – and it is even more critical for the nascent EV sector, which is in hot competition to win a foothold in the mainstream car market.
Advantage of standard fast-charging system
A standardized charging system would help to lower manufacturing costs and simplify operation at the consumer end, as somewhat dryly explained by GM:
"The combined charging approach will reduce development and infrastructure complexity, improve charging reliability, reduce the total cost-of-ownership for end customers and provide low maintenance costs."
The new standard will also speed the transition to a two-way, interactive power grid that incorporates vehicle batteries as a significant energy source. For example, car owners could use their vehicle battery for auxiliary power at home or to run other equipment, or they could sell excess power back to the grid.
The new standard has been adopted by the Big Three U.S. auto makers along with Audi, BMW, Daimler, Porsche and Volkswagen, which are demonstrating the new charging equipment this week at the Electric Vehicle Symposium 26 in Los Angeles.
The system will also be adopted throughout Europe beginning in 2017.
So, what’s this new Combined Charging System?
Called DC Fast Charging with a Combined Charging System or "combo connector" for short, the standard was developed by the Society of Automotive Engineers International.
The combo connector is an adaptation of an existing J1772 connector with roots in the 1990s. According to SAE, the original J1772 standard was updated in 2010 to a five-pin connector to accommodate charging at 120 and 240 volts.
The latest J1772 charging port has two parts. The upper section retains the configuration of the 2010 standard, which means that slow-charging EVs already on the market can transition seamlessly to the new connector.
The lower section contains a second set of pins to accommodate fast-charging battery technology that was not commercially available in 2010. Altogether, the combo connector will enable charging at up to 500 volts.
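A rough sense of what the 15-minute fast-charge claim implies at 500 volts: the 24-kWh pack size below is a hypothetical example (roughly a 2012-era EV battery) and charging losses are ignored; only the voltage and charge-time figures come from the article:

```python
# What a "15-minute fast charge" implies for power and current.
pack_kwh = 24        # hypothetical battery pack size
charge_hours = 0.25  # 15 minutes
voltage = 500        # upper limit of the combo connector

avg_power_kw = pack_kwh / charge_hours          # average charging power
avg_current_a = avg_power_kw * 1000 / voltage   # average current draw

print(avg_power_kw)   # 96.0 kW
print(avg_current_a)  # 192.0 A
```

Sustaining currents on that order is exactly why fast charging needs the dedicated high-power DC pins rather than the original five-pin AC section.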
Final approval for the new standard is expected by August 2012, and SAE expects the eight U.S. and German car makers to begin production of vehicles equipped with the new J1772 in 2013.
GM learns from past EV lessons
GM pushed hard for the new global standard after its experience developing the ill-fated EV1 in the 1990s, according to the company's Director of Infrastructure Planning, Britta Gross. The EV1, which went into limited production but was soon pulled from the market, is the subject of the documentary "Who Killed the Electric Car?"
In an SAE article last week by Patrick Ponticel, Gross explained:
"We [GM] learned a lot of lessons on the EV1, and we have vowed to make sure some of the hard lessons learned don't happen again. One lesson is that we can't go it alone on infrastructure, and on the standard for infrastructure… So we vowed on the [Chevrolet] Volt program to not proceed until the industry had condensed around charging infrastructure."
GM apparently took no chances when developing the Volt. According to Ponticel, GM's Engineering Specialist for Global Codes and Standards Development, Gery Kissel, also chairs the SAE International J1772 Task Force.
A glitch in the global EV standardization scheme
The global picture for standardization is still complicated by Japan, which has its own fast charging system called CHAdeMO.
So far the J1772 standard hasn't stopped Japan from positioning itself to lead in the U.S. EV market, since car makers such as Nissan and Mitsubishi offer models with ports for both CHAdeMO and J1772 charging.
However, Gross suggests that the single-port configuration will give the U.S. and German car makers an advantage in production costs and consumer convenience. There may also be some marginal savings in maintenance and replacement costs.
The split in charging standards comes as car makers vie to gain an edge in lucrative new markets, not only in the U.S. but also in China, where GM just announced that it and several affiliates reached the million-car mark for the 2012 selling year in April, the earliest it has ever reached that goal.
On the other hand, wireless EV charging could make all of this a moot issue when the next generation of EVs rolls off the assembly line.
Image: Courtesy of GM.
Follow me on Twitter: @TinaMCasey.
Posted: 08 May 2012 04:48 PM PDT
As the DOE noted in its press release on the matter: “The Energy Department's National Clean Energy Business Plan Competition (NCEBPC) for university students is part of Startup America, the White House campaign to inspire and promote entrepreneurship. In mid-June 2012, the six DOE-sponsored regional student competition winners will compete in Washington, D.C. This national initiative enables student participants to gain the skills required to build new businesses and transform promising innovative energy technologies from U.S. universities and National Laboratories into innovative new energy products that will solve our nation's energy challenges, spur business creation, create American jobs, and boost American competitiveness.”
Here’s a little info on the three FLOW winners (via an email from the folks at Caltech), certainly some top cleantech startups to keep an eye on:
1st Place – “Stanford Nitrogen” ($100,000 Prize)
A start-up company committed to recovering energy from waste products. The Stanford Nitrogen Group has developed a new wastewater treatment process for the removal and recovery of energy from waste nitrogen (i.e., ammonia). This process improves the efficiency and lowers the cost of N-treatment, and to our knowledge it is the first wastewater treatment process to recover energy from nitrogen. The process is termed the Coupled Aerobic-anoxic Nitrous Decomposition Operation (CANDO) and consists of two principal steps: (1) biological conversion of ammonia to N2O gas, and (2) combustion of a fuel (i.e., biogas) with N2O to recover energy.
Currently, wastewater treatment facilities face dual financial pressures: rising energy costs and increasingly stringent nitrogen discharge regulation. Wastewater treatment imposes a 3% load on the U.S. energy supply and is often the largest energy expenditure of municipalities. The discharge of ammonia to water bodies causes dead zones; as a result, regulators have imposed low N-discharge limits that many wastewater treatment facilities cannot currently meet economically. Wastewater treatment facilities therefore have a strong interest in energy-efficient, low-cost N-treatment processes. CANDO has the potential to meet these needs.
For the treatment of wastewater, CANDO reduces the cost of treating nitrogen by at least 50%, improves energy efficiency by recovering energy from waste nitrogen and enabling increased methane recovery from organic matter, decreases the production of biosolids, mitigates the release of the greenhouse gas N2O, and improves water quality.
And in areas around the globe with nitrogen pollution or dead zones — where depleted oxygen causes fish and wildlife to perish — Stanford Nitrogen’s new processes can be utilized.
2nd Place – “Greenbotics” ($60,000 Prize)
This new company plans to manufacture automated robotic cleaners to clean dirty solar panels, increasing their output and efficiency by 15%. They project that this new economical technology will yield a 5% increase in solar energy production in the future.
3rd Place – “Xite Solar” ($40,000 Prize)
This start-up proposes to develop new high-efficiency, low-cost solar cells and to boost the performance of current solar cells by using earth-abundant, nontoxic semiconductor materials. Their new heterojunction process has already received two patents.
Other competition finalist clean energy ideas included:
Posted: 08 May 2012 03:13 PM PDT
And if you’re a school insider, you can do more:
Again, would love to see this idea spread to the US — if it doesn’t soon, maybe I’ll just have to be the one to get it rolling!
Image via 10:10 Solar Schools email
Posted: 08 May 2012 02:13 PM PDT
“Owned and operated by Convert Italia, the solar tracking power plant combines Convert Italia's MX1 horizontal tracking system with Solaria's advanced low concentration photovoltaic modules,” a news release on the matter stated.
The solar power plant is 2 megawatts (MW) in size, which isn’t huge but isn’t small either, and it is apparently the largest for this class of solar power. (Note: I’ve asked for a photo of the solar power plant and will add it here if provided.)
"Solaria's innovative modules are a perfect fit with our Convert MX1 tracker," said Giuseppe Moro, president of Convert Italia. "Tracking systems, which follow the sun's transit across the sky, increase sunlight capture, generating up to 30% more electricity than fixed PV arrays. The result combines industry-leading performance and reliability with compelling system economics."
"Convert Italia's trackers unlock the unique advantages of Solaria modules,” stated Dan Shugar, Solaria CEO. “Our proprietary cell multiplication technology enables our high efficiency modules to use less than half the silicon of other PV modules."
Image: Solaria solar panels
You are subscribed to email updates from CleanTechnica.