
[Images: solar tower with heliostats; solar trough; Stirling dish; solar power plant in Queensland (annexure 1)]

It is a fact that solar energy is emerging as a key source of future energy as the climate change debate rages all over the world. Solar radiation can meet the world’s energy needs completely in a benign way and offers a clear alternative to fossil fuels. However, solar technology is still maturing, with new technologies and solutions emerging. Though PV solar is a proven technology, the levelised cost of electricity from such plants is still much higher than from fossil-fuel-powered plants, because the initial investment in a PV solar plant is much higher. For example, a gas-based power plant can be set up at less than $1,000/kW, while the cost of PV solar is still around $7,000/kW and above. However, solar thermal is emerging as an alternative to PV solar. The basic difference between the two technologies is that PV solar converts the light energy of the sun directly into electricity and stores it in a battery for future use, while solar thermal plants use reflectors (collectors) to focus sunlight and heat a thermic fluid or molten salt to a high temperature. The high-temperature thermic fluid or molten salt is then used either to generate steam to run a steam turbine (Rankine cycle) or to heat compressed air to run a gas turbine (Brayton cycle) to generate electricity. Solar towers using heliostats and mirrors are predicted to offer the lowest cost of solar energy in the near future as the cost of heliostats falls and molten salts with higher eutectic points are developed. Such high-eutectic-point molten salts are likely to transform a range of high-temperature industrial applications. When solar thermal plants with molten salt storage can approach temperatures of 800°C, many fossil fuel applications can be substituted with solar energy. For example, it is expected that by using solar thermal energy 24×7 in the sulfur-iodine cycle, hydrogen can be generated on a large commercial scale at a cost of about $2.90/kg. Research and development is focused on achieving this, and it may become a commercial reality in the near future.
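To make the capital-cost argument concrete, here is a minimal levelised-cost sketch in Python. The $1,000/kW and $7,000/kW figures come from the paragraph above; the capacity factors, lifetime, discount rate and fuel/O&M costs are illustrative assumptions, not data from any particular plant.

```python
# Minimal sketch (not a full LCOE model): how up-front capital cost per kW
# drives the levelised cost of electricity. Financing and fuel/O&M numbers
# are hypothetical assumptions.

def simple_lcoe(capex_per_kw, capacity_factor, lifetime_yr, discount_rate,
                fuel_om_per_kwh):
    """Levelised cost in $/kWh: annualised capex over annual output, plus fuel/O&M."""
    # Capital recovery factor annualises the up-front investment.
    crf = discount_rate * (1 + discount_rate) ** lifetime_yr / \
          ((1 + discount_rate) ** lifetime_yr - 1)
    annual_kwh_per_kw = capacity_factor * 8760  # hours in a year
    return capex_per_kw * crf / annual_kwh_per_kw + fuel_om_per_kwh

# Gas plant: cheap to build but pays for fuel; PV: expensive to build, free "fuel".
print(f"gas : {simple_lcoe(1000, 0.85, 25, 0.08, 0.04):.3f} $/kWh")  # ~0.053
print(f"PV  : {simple_lcoe(7000, 0.20, 25, 0.08, 0.01):.3f} $/kWh")  # ~0.384
```

Even with free sunlight, the heavy capital cost spread over fewer operating hours keeps the PV figure several times higher in this toy comparison.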

“The innovative aspect of CSP (concentrated solar power) is that it captures and concentrates the sun’s energy to provide the heat required to generate electricity, rather than using fossil fuels or nuclear reactions. Another attribute of CSP plants is that they can be equipped with a heat storage system to generate electricity even when the sky is cloudy or after sunset. This significantly increases the CSP capacity factor compared with solar photovoltaics and, more importantly, enables the production of dispatchable electricity, which can facilitate both grid integration and economic competitiveness. CSP technologies therefore benefit from advances in solar concentrator and thermal storage technologies, while other components of the CSP plants are based on rather mature technologies and cannot be expected to see rapid cost reductions. CSP technologies are not currently widely deployed. A total of 354 MW of capacity was installed between 1985 and 1991 in California and has been operating commercially since then. After a hiatus in interest between 1990 and 2000, interest in CSP has been growing over the past ten years. A number of new plants have been brought on line since 2006 (Müller-Steinhagen, 2011) as a result of declining investment costs and LCOE, as well as new support policies. Spain is now the largest producer of CSP electricity and there are several very large CSP plants planned or under construction in the United States and North Africa. CSP plants can be broken down into two groups, based on whether the solar collectors concentrate the sun’s rays along a focal line or on a single focal point (with much higher concentration factors). Line-focusing systems include parabolic trough and linear Fresnel plants and have single-axis tracking systems. Point-focusing systems include solar dish systems and solar tower plants and include two-axis tracking systems to concentrate the power of the sun.

Parabolic trough collector technology:

The parabolic trough collectors (PTC) consist of solar collectors (mirrors), heat receivers and support structures. The parabolic-shaped mirrors are constructed by forming a sheet of reflective material into a parabolic shape that concentrates incoming sunlight onto a central receiver tube at the focal line of the collector. The arrays of mirrors can be 100 meters (m) long or more, with a curved aperture of 5 m to 6 m. A single-axis tracking mechanism is used to orient both solar collectors and heat receivers toward the sun (A.T. Kearney and ESTELA, 2010). PTC are usually aligned North-South and track the sun as it moves from East to West to maximize the collection of energy. The receiver comprises the absorber tube (usually metal) inside an evacuated glass envelope. The absorber tube is generally a coated stainless steel tube, with a spectrally selective coating that absorbs the solar (short wave) irradiation well but emits very little infrared (long wave) radiation, which helps to reduce heat loss. Evacuated glass tubes are used because they also help to reduce heat losses.

A heat transfer fluid (HTF) is circulated through the absorber tubes to collect the solar energy and transfer it to the steam generator or to the heat storage system, if any. Most existing parabolic troughs use synthetic oils as the heat transfer fluid, which are stable up to 400°C. New plants under demonstration use molten salt at 540°C for heat transfer and/or as the thermal storage medium. High temperature molten salt may considerably improve the thermal storage performance. At the end of 2010, around 1 220 MW of installed CSP capacity used the parabolic trough technology and accounted for virtually all of today’s installed CSP capacity. As a result, parabolic troughs are the CSP technology with the most commercial operating experience (Turchi, et al., 2010).
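The appeal of molten salt storage follows from the sensible-heat relation Q = m·cp·ΔT. The sketch below sizes a two-tank salt store; the salt properties, tank temperatures and power-block draw are typical literature values assumed for illustration, not figures from the article.

```python
# Back-of-envelope sizing of a two-tank molten-salt store (a minimal sketch,
# all parameter values assumed).

CP_SALT = 1.5e3        # J/(kg*K), approx. specific heat of "solar salt" (NaNO3/KNO3)
T_HOT, T_COLD = 540.0, 290.0   # deg C, hot/cold tank temperatures (assumed)
TURBINE_MW_TH = 125.0  # MW thermal drawn by the power block (assumed)
HOURS = 7.5            # hours of full-load storage desired

energy_j = TURBINE_MW_TH * 1e6 * HOURS * 3600      # thermal energy to store
mass_kg = energy_j / (CP_SALT * (T_HOT - T_COLD))  # m = Q / (cp * dT)
print(f"salt inventory ~ {mass_kg/1e6:.1f} thousand tonnes")  # ~9.0
```

Note how the hot-cold temperature difference sits in the denominator: the hotter the salt loop can run, the less salt (and tankage) a given number of storage hours requires.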

Linear Fresnel collector technology:

 Linear Fresnel collectors (LFCs) are similar to parabolic trough collectors, but use a series of long flat, or slightly curved, mirrors placed at different angles to concentrate the sunlight on either side of a fixed receiver (located several meters above the primary mirror field). Each line of mirrors is equipped with a single-axis tracking system and is optimized individually to ensure that sunlight is always concentrated on the fixed receiver. The receiver consists of a long, selectively coated absorber tube.

Unlike parabolic trough collectors, the focal line of Fresnel collectors is distorted by astigmatism. This requires a mirror above the tube (a secondary reflector) to refocus the rays missing the tube, or several parallel tubes forming a multi-tube receiver that is wide enough to capture most of the focused sunlight without a secondary reflector. The main advantages of linear Fresnel CSP systems compared to parabolic trough systems are that:

»»LFCs can use cheaper flat glass mirrors, which are a standard mass-produced commodity;

»»LFCs require less steel and concrete, as the metal support structure is lighter. This also makes the assembly process easier;

»»The wind loads on LFCs are smaller, resulting in better structural stability, reduced optical losses and less mirror-glass breakage; and

»»The mirror surface per receiver is higher in LFCs than in PTCs, which is important, given that the receiver is the most expensive component in both PTCs and LFCs.

These advantages need to be balanced against the fact that the optical efficiency of LFC solar fields (referring to direct solar irradiation on the cumulated mirror aperture) is lower than that of PTC solar fields due to the geometric properties of LFCs. The problem is that the receiver is fixed and in the morning and afternoon cosine losses are high compared to PTC. Despite these drawbacks, the relative simplicity of the LFC system means that it may be cheaper to manufacture and install than PTC CSP plants. However, it remains to be seen if costs per kWh are lower. Additionally, given that LFCs are generally proposed to use direct steam generation, adding thermal energy storage is likely to be more expensive.
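The cosine-loss point can be illustrated with a toy calculation: the effective collecting area of any aperture scales with the cosine of the beam's incidence angle. The geometry below is a deliberately simplified 2-D sketch, not a real optical model.

```python
import math

# A horizontal, fixed LFC-like aperture sees the sun at incidence angle
# (90 - elevation) degrees, so low morning/afternoon sun is collected poorly;
# a trough rotated to face the sun keeps incidence near zero. Toy geometry.
for elevation in (15, 30, 60, 90):
    fixed = math.cos(math.radians(90 - elevation))  # fixed horizontal aperture
    print(f"sun at {elevation:2d} deg elevation: fixed aperture {fixed:.2f}, "
          f"tracking aperture 1.00")
```

At 15° elevation the fixed aperture intercepts only about a quarter of the beam a tracking aperture would, which is exactly the morning/afternoon penalty described above.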

Solar tower technology:

Solar tower technologies use a ground-based field of mirrors to focus direct solar irradiation onto a receiver mounted high on a central tower, where the light is captured and converted into heat. The heat drives a thermodynamic cycle, in most cases a water-steam cycle, to generate electric power. The solar field consists of a large number of computer-controlled mirrors, called heliostats, that track the sun individually in two axes. These mirrors reflect the sunlight onto the central receiver, where a fluid is heated up. Solar towers can achieve higher temperatures than parabolic trough and linear Fresnel systems, because more sunlight can be concentrated on a single receiver and the heat losses at that point can be minimized. Current solar towers use water/steam, air or molten salt to transport the heat to the heat-exchanger/steam turbine system. Depending on the receiver design and the working fluid, the upper working temperatures can range from 250°C to perhaps as high as 1 000°C for future plants, although temperatures of around 600°C will be the norm with current molten salt designs. The typical size of today’s solar tower plants ranges from 10 MW to 50 MW (Emerging Energy Research, 2010). The solar field size required increases with the annual electricity generation desired, which leads to a greater distance between the receiver and the outer mirrors of the solar field. This results in increasing optical losses due to atmospheric absorption, unavoidable angular mirror deviation due to imperfections in the mirrors, and slight errors in mirror tracking.
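For a rough feel of the numbers, the receiver's thermal power is approximately DNI × mirror area × field optical efficiency. The values below are assumptions chosen to land in the 10-50 MW plant range quoted above, not data for any real plant.

```python
# Order-of-magnitude estimate of heliostat-field receiver power (a sketch,
# every parameter value assumed).

DNI = 950.0             # W/m^2, direct normal irradiance at design point (assumed)
MIRROR_AREA = 150e3     # m^2 heliostat aperture, e.g. ~1250 mirrors x 120 m^2 (assumed)
ETA_OPTICAL = 0.60      # cosine, shading, blocking, reflectivity, attenuation (assumed)
ETA_POWER_BLOCK = 0.40  # steam-cycle efficiency at tower temperatures (assumed)

receiver_mw_th = DNI * MIRROR_AREA * ETA_OPTICAL / 1e6
print(f"receiver thermal power ~ {receiver_mw_th:.0f} MW_th")              # ~86
print(f"electric output        ~ {receiver_mw_th * ETA_POWER_BLOCK:.0f} MW_e")  # ~34
```

The optical-efficiency term is where the field-size penalty described above shows up: as the field grows, the outer heliostats suffer larger cosine, attenuation and tracking losses, dragging the 0.60 figure down.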

Solar towers can use synthetic oils or molten salt as the heat transfer fluid and the storage medium for the thermal energy storage. Synthetic oils limit the operating temperature to around 390°C, limiting the efficiency of the steam cycle. Molten salt raises the potential operating temperature to between 550 and 650°C, enough to allow higher efficiency supercritical steam cycles although the higher investment costs for these steam turbines may be a constraint. An alternative is direct steam generation (DSG), which eliminates the need and cost of heat transfer fluids, but this is at an early stage of development and storage concepts for use with DSG still need to be demonstrated and perfected.

Solar towers have a number of potential advantages, which mean that they could soon become the preferred CSP technology. The main advantages are that:

»»The higher temperatures can potentially allow greater efficiency of the steam cycle and reduce water consumption for cooling the condenser;

»»The higher temperature also makes the use of thermal energy storage more attractive in order to achieve schedulable power generation; and

»»Higher temperatures will also allow greater temperature differentials in the storage system, reducing costs or allowing greater storage for the same cost.

The key advantage is the opportunity to use thermal energy storage to raise capacity factors and allow a flexible generation strategy to maximize the value of the electricity generated, as well as to achieve higher efficiency levels. Given this advantage and others, if costs can be reduced and operating experience gained, solar towers could potentially achieve significant market share in the future, despite PTC systems having dominated the market to date. Solar tower technology is still under demonstration, with a 50 MW scale plant in operation, but could in the long run provide cheaper electricity than trough and dish systems (CSP Today, 2008). However, the lack of commercial experience means that this is by no means certain, and deploying solar towers today involves significant technical and financial risks.
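The capacity-factor benefit of storage is simple arithmetic. The sketch below assumes a 50 MW turbine, roughly 8 sun-hours of direct solar operation per day, and some hours of stored full-load generation after sunset; all three figures are illustrative assumptions.

```python
# Why storage raises the capacity factor: more full-load hours per day
# against the same 8760-hour year. A toy model with assumed inputs.

RATED_MW = 50.0

def capacity_factor(direct_hours_per_day, storage_hours_per_day):
    generated_mwh = RATED_MW * (direct_hours_per_day + storage_hours_per_day) * 365
    return generated_mwh / (RATED_MW * 8760)  # fraction of theoretical maximum

print(f"no storage : {capacity_factor(8, 0):.0%}")   # ~33%
print(f"6 h storage: {capacity_factor(8, 6):.0%}")   # ~58%
```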

Stirling dish technology:

The Stirling dish system consists of a parabolic dish-shaped concentrator (like a satellite dish) that reflects direct solar irradiation onto a receiver at the focal point of the dish. The receiver may be a Stirling engine (dish/engine systems) or a micro-turbine. Stirling dish systems require the sun to be tracked in two axes, but the high energy concentration onto a single point can yield very high temperatures. Stirling dish systems are yet to be deployed at any scale. Most research is now focused on using a Stirling engine in combination with a generator unit, located at the focal point of the dish, to transform the thermal power to electricity. There are currently two types of Stirling engines: kinematic and free piston. Kinematic engines work with hydrogen as a working fluid and have higher efficiencies than free piston engines. Free piston engines work with helium and do not produce friction during operation, which enables a reduction in required maintenance. The main advantages of Stirling dish CSP technologies are that:

»»The location of the generator – typically, in the receiver of each dish – helps reduce heat losses and means that the individual dish-generating capacity is small and extremely modular (typical sizes range from 5 to 50 kW) and suitable for distributed generation;

»»Stirling dish technologies are capable of achieving the highest efficiency of all types of CSP systems;

»»Stirling dishes use dry cooling and do not need large cooling systems or cooling towers, allowing CSP to provide electricity in water-constrained regions; and

»»Stirling dishes, given their small footprint and the fact that they are self-contained, can be placed on slopes or uneven terrain, unlike PTC, LFC and solar towers. These advantages mean that Stirling dish technologies could fill an economically valuable niche in many regions, even though the levelised cost of electricity is likely to be higher than for other CSP technologies. Apart from costs, another challenge is that dish systems cannot easily use storage. Stirling dish systems are still at the demonstration stage and the cost of mass-produced systems remains unclear. With their high degree of scalability and small size, Stirling dish systems could be an alternative to solar photovoltaics in arid regions.”

(Source: IRENA 2012)

 


A new concept known as “hydraulic fracturing” to enhance the recovery of landfill gas from new and existing landfill sites has been tested jointly by Dutch and Canadian companies. They claim it is now possible to recover such gas economically and liquefy it into Bio-LNG for use as a fuel for vehicles and to generate power.

Most biofuels around the world are now made from energy crops like wheat, maize, palm oil and rapeseed oil, and only a minor part is made from waste. Such a practice is not sustainable in the long run, considering the food shortages anticipated due to climate change. The EU wants to ban biofuels that use too much agricultural land and encourage production of biofuels made not from food material but from waste. There is therefore a need to collect the methane gas emitted by landfill sites more efficiently and economically, so that it can compete with fossil fuels.

There are about 150,000 landfills in Europe with about 3–5 trillion cubic meters of waste (Haskoning 2011). All landfills emit landfill gas; the contribution of methane emissions from landfills is estimated to be between 30 and 70 million tons each year. Landfills in the USA contributed an estimated 450 to 650 billion cubic feet of methane per year (in 2000). One can either flare landfill gas or generate electricity with it, but it is more prudent to produce the cleanest and cheapest liquid biofuel, namely “Bio-LNG”.

Landfill gas generation: how do these bugs do their work?

Researchers had a hard time figuring out how methane production gets started, because landfills do not start out as a friendly environment for the organisms that produce methane. Now new research from North Carolina State University points to one species of microbe that paves the way for the other methane producers. The starting bug has been found, which opens the door to engineering better landfills with better production management. One can imagine a landfill with real economic prospects beyond getting the trash out of sight. The NCSU researchers found that an anaerobic bacterium called Methanosarcina barkeri appears to be the key microbe. The steps involved in the formation of landfill gas are shown in the diagram:

Phase 1: oxygen disappears, and nitrogen declines.

Phase 2: hydrogen is produced and CO2 production increases rapidly.

Phase 3: methane production rises and CO2 production decreases.

Phase 4: methane production can rise to 60%.

Phases 1-3 typically last for 5-7 years.

Phase 4 can continue for decades, with the rate of decline depending on the landfill’s content.
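The long Phase 4 tail is commonly modelled with a first-order decay equation, the form used in tools like USEPA's LandGEM. Below is a minimal sketch; the decay constant k, generation potential L0 and waste tonnage are illustrative assumptions, not site data.

```python
import math

# First-order decay model of landfill gas generation: Q(t) = k * L0 * M * exp(-k*t).
# Generation peaks when waste is placed, then decays over decades.
K = 0.05       # 1/yr, decay-rate constant (climate/waste dependent, assumed)
L0 = 100.0     # m^3 CH4 per tonne of waste, methane generation potential (assumed)
WASTE_T = 1e6  # tonnes of waste placed in a single year (assumed)

def methane_m3_per_year(years_since_placement):
    return K * L0 * WASTE_T * math.exp(-K * years_since_placement)

for t in (1, 5, 10, 20, 40):
    print(f"year {t:2d}: ~{methane_m3_per_year(t)/1e6:.2f} million m^3 CH4")
```

Under these assumptions the gas flow falls to roughly an eighth of its initial value after 40 years, which is why old sites produce too little gas for their original collection hardware.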

Installation of landfill gas collection system

A number of wells are drilled and interconnected with a pipeline system. Gas is guided from the wells to a facility, where it is flared or burnt to generate electricity. A biogas engine achieves 30-40% efficiency. Landfills often lack access to the grid, and there is usually no use for the heat.
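To get a feel for the scale, here is a rough conversion from gas flow to electrical output using the 30-40% engine efficiency quoted above. The flow rate and methane content are assumed; the heating value of methane is a standard figure (~10 kWh per Nm³).

```python
# Rough landfill-gas-to-electricity conversion (a sketch, flow and methane
# fraction assumed).

FLOW_NM3_H = 500.0   # Nm^3/h of landfill gas, a typical "small" flow (assumed)
CH4_FRACTION = 0.50  # Phase-4 gas is roughly half methane (assumed)
LHV_KWH_NM3 = 9.97   # lower heating value of methane per normal cubic metre
ETA_ENGINE = 0.35    # mid-range biogas engine electrical efficiency

electric_kw = FLOW_NM3_H * CH4_FRACTION * LHV_KWH_NM3 * ETA_ENGINE
print(f"electrical output ~ {electric_kw:.0f} kW")  # ~870 kW
```

Sub-megawatt outputs like this explain why a grid connection is often not worth building, and why liquefying the gas on site is an attractive alternative.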

The alternative: make bio-LNG instead and transport the bio-LNG for use in heavy-duty vehicles and ships or applications where you can use all electricity and heat.

Bio-LNG: what is it?

Bio-LNG is liquid bio-methane (also: LBM), made from biogas. Biogas is produced by anaerobic digestion: all organic waste can rot and produce biogas; the bacteria do the work. Biogas is therefore the cheapest and cleanest biofuel, one that can be generated without competing with food or land use. For the first time there is a biofuel, bio-LNG, of better quality than fossil fuel.

The bio-LNG production process

Landfill gas is produced by anaerobic fermentation in the landfill. The aim is to produce a constant flow of biogas with a high methane content. The biogas must be upgraded, i.e. H2S, CO2 and trace components removed; landfill gas also contains siloxanes, nitrogen and Cl/F gases. The bio-methane must then be purified (maximum 25/50 ppm CO2, no water) to prepare it for liquefaction. The cold box liquefies the pure biomethane to bio-LNG.

Small-scale bio-LNG production using smarter methods:

•Use upgrading modules that do not consume much energy.

•Membranes which can upgrade to 98-99.5% methane are suitable.

•Use a method for advanced upgrading that is low on energy demand.

•Use a fluid / solid that is allowed to be dumped at the site.

•Use cold boxes that are easy to install and low on power demand.

•Use LNG tank trucks as storage and distribution units.

•See if co-produced CO2 can be sold and used in greenhouses or elsewhere.

•Look carefully at the history and present status of the landfill.

What was holding back more projects?

Most flows of landfill gas are small (hundreds of Nm3/hour), so the economy of scale is generally not favorable. Technology in upgrading and liquefaction has evolved, but for such small flows the investment cannot be paid back even over decades.

Now there is a solution: enhanced gas recovery by hydraulic fracturing. Holland Innovation Team and Fracrite Environmental Ltd. (Canada) have developed a method to increase gas extraction from landfills 3-5 times.

Hydraulic fracturing increases landfill gas yield and therefore economy of scale for bio-LNG production

The method consists of a set of drillings from which, at certain depths, the landfill is hydraulically fractured. This means a set of circular horizontal fractures is created from the well at preferred depths. Sand or other materials are injected into the fractures. Gas gathers from below in the created interlayer and flows into the drilled well. In this way a “guiding” circuit for landfill gas is created. With a 3-5 fold quantity of gas, economy of scale for bio-LNG production is reached rapidly. Considering the multitude of landfills worldwide, this hydraulic fracturing method in combination with containerized upgrading and liquefaction units offers huge potential. The method is cost effective, especially at virgin landfills, but also at landfills with decreasing amounts of landfill gas.
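A toy payback comparison shows why a 3-5 fold yield matters. Every figure below is a hypothetical placeholder except the €400/ton target price quoted later in this post; the capex, baseline yield and margin are not the companies' numbers.

```python
# Toy payback model: fixed containerized plant cost vs. revenue that scales
# with gas yield. All inputs assumed for illustration.

CAPEX_EUR = 4e6          # containerized upgrading + liquefaction plant (assumed)
BIO_LNG_PRICE = 400.0    # EUR/tonne, target price quoted later in the post
YIELD_T_PER_YEAR = 2000  # tonnes bio-LNG/yr from the un-fractured landfill (assumed)
MARGIN = 0.5             # fraction of revenue left after operating costs (assumed)

for multiplier in (1, 3, 5):
    annual_margin = BIO_LNG_PRICE * YIELD_T_PER_YEAR * multiplier * MARGIN
    print(f"{multiplier}x yield: payback ~ {CAPEX_EUR / annual_margin:.1f} years")
```

Under these assumptions the payback drops from about ten years at baseline yield to two or three years at 3-5x, consistent with the "few years" payback claimed at the end of this post.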

Landfill gas fracturing pilot (2009).

• Landfill operational from 1961-2005

• 3 gas turbines, only 1 or 2 in operation at any time due to low gas extraction rates

• Only 12 of 60 landfill gas extraction wells still producing methane

• Objective of pilot was to assess whether fracturing would enhance methane extraction rates

Field program and preliminary results

Two new wells were drilled into municipal wastes and fractured (FW60, FW61), with sand fractures at 6, 8, 10 and 12 m depth in the wastes and a fracture radius of 6 m. Balance gases are believed to be due to oxygenation effects during leachate and groundwater pumping.

Note: this is entirely different from the deep fracking used for shale gas!

Conceptual Bioreactor Design

The conceptual design is shown in the figures. There are anaerobic conditions below the groundwater table, but permeability decreases because of compaction of the waste. Permeability increases after fracking, and so do the quantities of landfill gas and leachate.

Injecting the extracted leachate above the groundwater table will introduce anaerobic conditions in an area where, until then, oxygen prevailed and prevented landfill gas formation.

It can also be done in such a systematic way that all leachate which is extracted will be disposed of in the shallow surrounding wells above the groundwater table.

One well below the groundwater table is fracked, and the leachate is injected at the corners of a square around the deeper well. Sewage sludge and bacteria can be added to increase the yield further.

Improving the business case further

A 3-5 fold increase in biogas flow will improve the business case through increasing economy of scale. The method will also improve landfill quality and prepare the landfill for other uses.

When the landfill gas stream dries up after 5 years or so, the next landfill can be served by relocating the containerized modules (cold boxes and upgrading modules). The company is upgrading with a new method developed in-house, and improving landfill gas yield by fracking with smart materials. EC recommendations to count landfill gas four-fold towards renewable fuel targets, and the superior footprint of bio-LNG production from landfills, are beneficial for immediate start-ups.

Conclusions and recommendations

Landfills emit landfill gas. Landfill gas is a good source for production of bio-LNG. Upgrading and liquefaction techniques are developing fast and decreasing in price. Hydraulic fracturing can improve landfill gas yield such that economy of scale is reached sooner. Hydraulic fracturing can also introduce anaerobic conditions by injecting leachate, sewage sludge and bacteria above the groundwater table. The concept is optimized to extract most of the landfill gas in a period of five years and upgrade and liquefy this to bio-LNG in containerized modules.

Holland Innovation Team and Fracrite aim at a production price of less than €0.40 per kilo (€400/ton) of bio-LNG, which is now equivalent to fossil LNG prices in Europe and considerably lower than LNG prices in Asia, with a payback time of only a few years.

(Source: Holland Innovation Team)

 

The recent debate between the presidential nominees in the US election has revealed their respective positions on policies for an energy-independent America. Each of them has articulated how he will increase oil and gas production to make America energy independent, which will also incidentally create a number of jobs in an ailing economy. Each will first spend a billion dollars driving his message to the voting public. Once elected, he will explore oil and gas aggressively to make America energy independent, while also exploring solar and wind energy potential to bridge any shortfall. Their policies seem unconcerned with global warming and the impact of GHG emissions; rather, they aggressively pursue an energy-independent America at the cost of unabated GHG emissions in the future. Does it mean an ‘energy independent America’ will spell doom for the world, including the US?

The best option for America to become energy independent would be to focus on the energy efficiency of existing technologies and systems, combining a renewable-fossil fuel energy mix, base-load renewable power and storage technologies, and substituting gasoline with hydrogen made using renewable energy sources. Future investment should be based on sustainable renewable energy sources rather than fossil fuels. But the current financial and unemployment situation in the US will force the new president to increase conventional and unconventional oil and gas production rather than renewable energy production, which, though initially expensive with long payback periods, would eventually meet the energy need in a sustainable way. The net result of the current policies will be enhanced GHG emissions and an acceleration of global warming. Yet the energy projections in the U.S. Energy Information Administration’s (EIA’s) Annual Energy Outlook 2012 (AEO2012) project reduced GHG emissions.

According to Annual Energy Outlook 2012 report:

“The projections in the U.S. Energy Information Administration’s (EIA’s) Annual Energy Outlook 2012 (AEO2012) focus on the factors that shape the U.S. energy system over the long-term. Under the assumption that current laws and regulations remain unchanged throughout the projections, the AEO2012 Reference case provides the basis for examination and discussion of energy production, consumption, technology, and market trends and the direction they may take in the future. It also serves as a starting point for analysis of potential changes in energy policies. But AEO2012 is not limited to the Reference case. It also includes 29 alternative cases, which explore important areas of uncertainty for markets, technologies, and policies in the U.S. energy economy. Many of the implications of the alternative cases are discussed in the “Issues in focus” section of this report.

Key results highlighted in AEO2012 include continued modest growth in demand for energy over the next 25 years and increased domestic crude oil and natural gas production, largely driven by rising production from tight oil and shale resources. As a result, U.S. reliance on imported oil is reduced; domestic production of natural gas exceeds consumption, allowing for net exports; a growing share of U.S. electric power generation is met with natural gas and renewables; and energy-related carbon dioxide emissions stay below their 2005 level from 2010 to 2035, even in the absence of new Federal policies designed to mitigate greenhouse gas (GHG) emissions.

The rate of growth in energy use slows over the projection period, reflecting moderate population growth, an extended economic recovery, and increasing energy efficiency in end-use applications.

 

Overall U.S. energy consumption grows at an average annual rate of 0.3 percent from 2010 through 2035 in the AEO2012 Reference case. The U.S. does not return to the levels of energy demand growth experienced in the 20 years before the 2008-2009 recession, because of more moderate projected economic growth and population growth, coupled with increasing levels of energy efficiency. For some end uses, current Federal and State energy requirements and incentives play a continuing role in requiring more efficient technologies. Projected energy demand for transportation grows at an annual rate of 0.1 percent from 2010 through 2035 in the Reference case, and electricity demand grows by 0.7 percent per year, primarily as a result of rising energy consumption in the buildings sector. Energy consumption per capita declines by an average of 0.6 percent per year from 2010 to 2035 (Figure 1). The energy intensity of the U.S. economy, measured as primary energy use in British thermal units (Btu) per dollar of gross domestic product (GDP) in 2005 dollars, declines by an average of 2.1 percent per year from 2010 to 2035. New Federal and State policies could lead to further reductions in energy consumption. The potential impact of technology change and the proposed vehicle fuel efficiency standards on energy consumption are discussed in “Issues in focus.”
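As a quick arithmetic check, the compound rates quoted above can be replayed over the 25-year projection window, using only the report's own figures:

```python
# Compound the annual rates quoted in the report text over 2010-2035.
growth_total = 1.003 ** 25  # +0.3%/yr total energy use
intensity = 0.979 ** 25     # -2.1%/yr energy per dollar of GDP
per_capita = 0.994 ** 25    # -0.6%/yr energy use per person

print(f"total energy use 2035 vs 2010 : {growth_total:.2f}x (~+8%)")
print(f"energy intensity 2035 vs 2010 : {intensity:.2f}x (~-41%)")
print(f"per-capita use 2035 vs 2010   : {per_capita:.2f}x (~-14%)")
```

In other words, a seemingly small 2.1% annual efficiency gain compounds to roughly a 40% drop in energy intensity by 2035, even as total consumption edges up slightly.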

Domestic crude oil production increases

Domestic crude oil production has increased over the past few years, reversing a decline that began in 1986. U.S. crude oil production increased from 5.0 million barrels per day in 2008 to 5.5 million barrels per day in 2010. Over the next 10 years, continued development of tight oil, in combination with the ongoing development of offshore resources in the Gulf of Mexico, pushes domestic crude oil production higher. Because the technology advances that have provided for recent increases in supply are still in the early stages of development, future U.S. crude oil production could vary significantly, depending on the outcomes of key uncertainties related to well placement and recovery rates. Those uncertainties are highlighted in this Annual Energy Outlook’s “Issues in focus” section, which includes an article examining impacts of uncertainty about current estimates of the crude oil and natural gas resources. The AEO2012 projections considering variations in these variables show total U.S. crude oil production in 2035 ranging from 5.5 million barrels per day to 7.8 million barrels per day, and projections for U.S. tight oil production from eight selected plays in 2035 ranging from 0.7 million barrels per day to 2.8 million barrels per day (Figure 2).

With modest economic growth, increased efficiency, growing domestic production, and continued adoption of nonpetroleum liquids, net imports of petroleum and other liquids make up a smaller share of total U.S. energy consumption

U.S. dependence on imported petroleum and other liquids declines in the AEO2012 Reference case, primarily as a result of rising energy prices; growth in domestic crude oil production to more than 1 million barrels per day above 2010 levels in 2020; an increase of 1.2 million barrels per day crude oil equivalent from 2010 to 2035 in the use of biofuels, much of which is produced domestically; and slower growth of energy consumption in the transportation sector as a result of existing corporate average fuel economy standards. Proposed fuel economy standards covering vehicle model years (MY) 2017 through 2025 that are not included in the Reference case would further cut projected need for liquid imports.

Although U.S. consumption of petroleum and other liquid fuels continues to grow through 2035 in the Reference case, reliance on imports of petroleum and other liquids as a share of total consumption declines. Total U.S. consumption of petroleum and other liquids, including both fossil fuels and biofuels, rises from 19.2 million barrels per day in 2010 to 19.9 million barrels per day in 2035 in the Reference case. The net import share of domestic consumption, which reached 60 percent in 2005 and 2006 before falling to 49 percent in 2010, continues falling in the Reference case to 36 percent in 2035 (Figure 3). Proposed light-duty vehicle (LDV) fuel economy standards covering vehicle MY 2017 through 2025, which are not included in the Reference case, could further reduce demand for petroleum and other liquids and the need for imports, and increased supplies from U.S. tight oil deposits could also significantly decrease the need for imports, as discussed in more detail in “Issues in focus.”

Natural gas production increases throughout the projection period, allowing the United States to transition from a net importer to a net exporter of natural gas

Much of the growth in natural gas production in the AEO2012 Reference case results from the application of recent technological advances and continued drilling in shale plays with high concentrations of natural gas liquids and crude oil, which have a higher value than dry natural gas in energy equivalent terms. Shale gas production increases in the Reference case from 5.0 trillion cubic feet per year in 2010 (23 percent of total U.S. dry gas production) to 13.6 trillion cubic feet per year in 2035 (49 percent of total U.S. dry gas production). As with tight oil, when looking forward to 2035, there are unresolved uncertainties surrounding the technological advances that have made shale gas production a reality. The potential impact of those uncertainties results in a range of outcomes for U.S. shale gas production from 9.7 to 20.5 trillion cubic feet per year when looking forward to 2035.
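The implied compound growth rate behind the shale-gas projection is easy to extract from the quoted 2010 and 2035 figures (pure arithmetic, no new data):

```python
# Implied CAGR from 5.0 tcf (2010) to 13.6 tcf (2035), the figures quoted above.
cagr = (13.6 / 5.0) ** (1 / 25) - 1
print(f"implied shale gas growth: {cagr:.1%} per year")  # ~4.1%/yr
```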

As a result of the projected growth in production, U.S. natural gas production exceeds consumption early in the next decade in the Reference case (Figure 4). The outlook reflects increased use of liquefied natural gas in markets outside North America, strong growth in domestic natural gas production, reduced pipeline imports and increased pipeline exports, and relatively low natural gas prices in the United States.

Power generation from renewables and natural gas continues to increase

In the Reference case, the natural gas share of electric power generation increases from 24 percent in 2010 to 28 percent in 2035, while the renewable share grows from 10 percent to 15 percent. In contrast, the share of generation from coal-fired power plants declines. The historical reliance on coal-fired power plants in the U.S. electric power sector has begun to wane in recent years.

Over the next 25 years, the share of electricity generation from coal falls to 38 percent, well below the 48-percent share seen as recently as 2008, due to slow growth in electricity demand, increased competition from natural gas and renewable generation, and the need to comply with new environmental regulations. Although the current trend toward increased use of natural gas and renewables appears fairly robust, there is uncertainty about the factors influencing the fuel mix for electricity generation. AEO2012 includes several cases examining the impacts on coal-fired plant generation and retirements resulting from different paths for electricity demand growth, coal and natural gas prices, and compliance with upcoming environmental rules.

While the Reference case projects 49 gigawatts of coal-fired generation retirements over the 2011 to 2035 period, nearly all of which occurs over the next 10 years, the range for cumulative retirements of coal-fired power plants over the projection period varies considerably across the alternative cases (Figure 5), from a low of 34 gigawatts (11 percent of the coal-fired generator fleet) to a high of 70 gigawatts (22 percent of the fleet). The high-end of the range is based on much lower natural gas prices than those assumed in the Reference case; the lower end of the range is based on stronger economic growth, leading to stronger growth in electricity demand and higher natural gas prices. Other alternative cases, with varying assumptions about coal prices and the length of the period over which environmental compliance costs will be recovered, but no assumption of new policies to limit GHG emissions from existing plants, also yield cumulative retirements within a range of 34 to 70 gigawatts. Retirements of coal-fired capacity exceed the high-end of the range (70 gigawatts) when a significant GHG policy is assumed (for further description of the cases and results, see “Issues in focus”).

Total energy-related emissions of carbon dioxide in the United States stay below their 2005 level through 2035

Energy-related carbon dioxide (CO2) emissions grow slowly in the AEO2012 Reference case, due to a combination of modest economic growth, growing use of renewable technologies and fuels, efficiency improvements, slow growth in electricity demand, and increased use of natural gas, which is less carbon-intensive than other fossil fuels. In the Reference case, which assumes no explicit Federal regulations to limit GHG emissions beyond vehicle GHG standards (although State programs and renewable portfolio standards are included), energy-related CO2 emissions grow by just over 2 percent from 2010 to 2035, to a total of 5,758 million metric tons in 2035 (Figure 6). CO2 emissions in 2020 in the Reference case are more than 9 percent below the 2005 level of 5,996 million metric tons, and they still are below the 2005 level at the end of the projection period. Emissions per capita fall by an average of 1.0 percent per year from 2005 to 2035.
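A short cross-check of the emissions figures, using only numbers quoted in the report text:

```python
# Replay the emissions arithmetic quoted above.
level_2035 = 5758.0  # Mt CO2 projected for 2035
level_2005 = 5996.0  # Mt CO2 in 2005
implied_2010 = level_2035 / 1.02  # "grow by just over 2% from 2010" to 2035

print(f"2035 vs 2005: {level_2035 / level_2005 - 1:.1%}")  # ~ -4%
print(f"implied 2010 level: ~{implied_2010:.0f} Mt CO2")   # ~5,645 Mt
```

The 2035 projection thus sits about 4 percent below the 2005 peak, consistent with the report's statement that emissions stay below the 2005 level throughout the projection.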

Projections for CO2 emissions are sensitive to such economic and regulatory factors due to the pervasiveness of fossil fuel use in the economy. These linkages result in a range of potential GHG emissions scenarios. In the AEO2012 Low and High Economic Growth cases, projections for total primary energy consumption in 2035 are, respectively, 100.0 quadrillion Btu (6.4 percent below the Reference case) and 114.4 quadrillion Btu (7.0 percent above the Reference case), and projections for energy-related CO2 emissions in 2035 are 5,356 million metric tons (7.0 percent below the Reference case) and 6,117 million metric tons (6.2 percent above the Reference case)”. (Ref: U.S. Energy Information Administration)

Those who studied chemistry and conducted laboratory experiments in university will be familiar with the precautionary measures taken to avoid accidents. Aprons, gloves, goggles and fume cupboards with exhaust fans are a few examples of protection from flames, hot plates and fumes. The blue color of the flame indicated the hotness of the Bunsen burner flame; the pungent smell pointed to the ‘gas plant’ that generated ‘water gas’ for the Bunsen burners. The familiar smells of chemicals bring back nostalgic memories of college days. Each bottle of chemicals displayed a warning sign, ‘Danger’ or ‘Poison’. We could recognize and identify even traces of gases, fumes or chemicals immediately. Those impressions are embedded deeply in memory, and I remember them vividly even decades after leaving university.

I could smell traces of chlorine in the air even at a distance of 20 miles from a chloralkali plant in the sixties, when air pollution controls were not stringent. People who lived around the factory had probably lived with that smell for generations; many families had never breathed air without traces of chlorine in their lifetimes. They lived all their lives in the same place because agriculture was their profession. Many developed breathing problems in old age and died of asthma and tuberculosis. The impact of these fumes is not felt in months or years, but it certainly can be felt after decades, especially in old age, when the body’s immune system deteriorates. The Bhopal gas accident in India is a grim reminder of how chemical accidents can contaminate air, water and earth and degrade human lives. But have we learnt any lessons from those accidents?

During the experimental nuclear explosions conducted in the Australian desert by the British army, people were directly exposed to nuclear radiation. Many of those who witnessed the explosions developed some form of cancer later in their lives, yet they were treated as heroes at the time. Several decades after the incident, many of those exposed are now demanding compensation from the current British government. Have we learnt any lessons from those incidents? Many politicians still advocate ‘nuclear energy as a safe and clean energy’. Yes, until we meet with another accident!

We human beings identified the chemicals present in Nature and used them for our scientific development. We identified fossil fuels as ‘hydrocarbons’ and burn them to generate power and run our cars. We emit toxic gases and fumes every second of our lives, whenever we switch our lights on or start our cars. Imagine the quantity of gases and fumes emitted every day, all over the world, by billions of people, over several decades. It is simple common sense that we are responsible for these emissions and that we contaminate the air we breathe. Nature does not burn hydrocarbons every day or every month or every year. In fact, Nature buried these hydrocarbons deep in the earth, the way we bury our dead.

Can people who breathed chlorine for decades and died of asthma or tuberculosis prove that they died due to constant inhalation of chlorine emitted by the chloralkali plant? The courts and authorities will demand ‘hard evidence’ that the chlorine emitted by chloralkali plants caused these diseases. We use science when it suits us and become skeptics when it does not. They know it is almost impossible to prove such cases in our legal system, and they can get away scot-free. The same argument applies to greenhouse gas emissions and global warming.

We contaminate our air, water and earth with our population explosion, industrialization and lifestyles. Yet major industrialized countries are not willing to cut their emissions, wanting to carry on their ‘economic growth’. These countries have got it completely wrong. In chemical experiments, one draws conclusions by ‘observation’ and ‘inference’; inference is a scientific tool, not guesswork. From the overwhelming evidence of natural disasters occurring around the world, one can ‘infer’ that human activities cause these disasters. Nature is now demonstrating this by devastating the business and economic interests of nations, because that is the only way governments learn lessons. They need no ‘harder evidence’ than monetary losses. According to recent reports:

“The monetary losses from 2011’s natural catastrophes reached a record $380 billion, surpassing the previous record of $220 billion set in 2005. The year’s three costliest natural catastrophes were the March earthquake and tsunami in Japan (costing $210 billion), the August-November floods in Thailand ($40 billion), and the February earthquake in New Zealand ($16 billion).

The report notes that Asia experienced 70 percent, or $265 billion, of the total monetary losses from natural disasters around the world—up from an average share of 38 percent between 1980 and 2010. This can be attributed to the earthquake and tsunami in Japan, as well as the devastating floods in Thailand: Thailand’s summer monsoons, probably influenced by a very intensive La Niña situation, created the costliest flooding to date, with $40 billion in losses.”

The US Environmental Protection Agency (EPA) regulated power plant emission standards for oxides of nitrogen and sulfur in the past, but not for greenhouse gas emissions into the atmosphere. However, after President Obama took office, the EPA began using the Clean Air Act to regulate the emissions of all gases, including GHGs, from new stationary power plants. These rules are projected to prevent over 230,000 early deaths in the US alone by 2020. According to the rules:

1. Starting in January 2011, large industrial facilities that must already obtain Clean Air Act permits for non-GHGs must also include GHG requirements in these permits if they are newly constructed and have the potential to emit 75,000 tons per year of carbon dioxide equivalent (CO2e) or more, or modify and increase GHG emissions by that amount.

2. Starting in July 2011, in addition to the facilities described above, all new facilities emitting GHGs in excess of 100,000 tons per year CO2e, and facilities making changes that would increase GHG emissions by at least 75,000 tpy CO2e and that also exceed 100/250 tons per year of GHGs on a mass basis, will be required to obtain construction permits that address GHG emissions (regardless of whether they emit enough non-GHG pollutants to require a permit for those emissions).

3. Operating permits will be needed by all sources that emit at least 100,000 tons of GHGs per year on a CO2e basis, beginning in July 2011.

4. Sources emitting less than 50,000 tons of GHGs per year on a CO2e basis will not be required to obtain permits for GHGs before 2016. (Source: CleanTechnica)

According to Stanford scientist Mark Jacobson, there is a definite link between carbon dioxide and increasing deaths. While the argument between believers in global warming and skeptics continues, it is clear that carbon pollution kills people without discrimination. Any gaseous emission into the atmosphere eventually spreads across national borders and becomes a global issue.

Environmental regulators in every country should pass similar legislation to curb GHG emissions, at least to protect their people, if not to curtail global warming. What is most surprising is that some scientists still want more ‘scientific data’ before accepting that GHGs cause global warming. One need not be a rocket scientist to conclude that chemical pollution is slowly poisoning the air, water and earth. Hundreds of chemicals we used in the past were abandoned due to their harmful effects: asbestos, DDT, chlorine for disinfecting drinking water, coal tar dyes, nicotine and fluorocarbon refrigerants, to name a few. We can choose to ignore the warnings of Nature and carry on business as usual in the name of science. But we cannot ignore people claiming their legitimate right to live and breathe quality air and lead a normal life. It is a human rights issue, not an issue to be debated and decided by the scientific community alone.

The WHO should classify ‘quality air’ as a fundamental human right with great urgency. Governments around the world can pass a ‘Clean Air Act’ similar to that of the US. They may not levy a carbon tax or offer new incentives to promote green energy, but they can regulate the indiscriminate emission of GHGs into the atmosphere, which passively kills millions of people around the world; this is nothing but a ‘weapon of mass destruction’ operating passively, on a grander scale. If ‘passive smoking’ is a serious health issue, carbon emission too is a serious health issue. It is the duty of industries to incorporate carbon pollution prevention measures through scientific innovation.

Hydrogen has been accepted as a source of clean energy for many reasons. Hydrogen can eliminate anthropogenic greenhouse gas emissions into the atmosphere and stop global warming. It has a higher energy content per kilogram than any of the fossil fuels we currently use, making it an efficient fuel, and its only combustion product is water, which is recyclable. Many people, governments and institutions around the world are trying to develop cheaper methods of generating hydrogen from various sources, both renewable and non-renewable. The non-renewable sources are supposed to facilitate a smooth transition from a fossil fuel economy to a hydrogen economy.

However, all attempts to generate hydrogen at a cost lower than the $2.50 per kg target projected by the DOE have so far been unsuccessful, even though many recent technologies are promising. Meanwhile, massive investments are being made in renewable energy, including wind, solar and biological sources, all over the world. Generating hydrogen from water using a solid polymer membrane electrolyzer powered by renewable energy sources is a known technology, and one can readily deploy such systems for commercial applications even though they are currently expensive.

Many people and institutions are also claiming ‘free energy’ sources, with or without generating hydrogen. In some cases researchers claim an abnormal production of hydrogen using ‘cold plasma’ or ‘plasma electrolysis’ of water, as much as 800% more than the theoretical values. Some companies claim low energy consumption using photocatalysts to generate hydrogen from direct sunlight and water. Hydrogen generation using renewable sources is a distinct possibility for cutting the cost of hydrogen in the long run. However, the world is in a hurry to develop a cheap and sustainable method of hydrogen generation without any greenhouse gas emissions.

One US-based company claims to have discovered a new state of the hydrogen atom that has not been reported before in the literature. According to the inventor, this new form of hydrogen is called the ‘hydrino’. He has presented a detailed theory, called the ‘Grand Unified Theory’, that predicts catalysts allowing energy to be extracted from a lower energy state of the hydrogen atom. The company has demonstrated the process using a prototype in the laboratory, and it says its claims have been validated by an independent laboratory after conducting trial runs and analyzing the results using spectrum analysis and other techniques.

The process involves the generation of hydrogen by electrolysis of water. The resulting hydrogen is then reacted with a proprietary solid catalyst developed by the company. According to the company:

“Since certain proprietary catalysts cause the hydrogen atoms to transition to lower-energy states by allowing their electrons to fall to smaller radii around the nucleus with a release of energy that is intermediate between chemical and nuclear energies, the primary application is as a new primary energy source. Specifically, energy is released as the electrons of hydrogen atoms are induced by a catalyst to transition to lower-energy levels (i.e. drop to lower base orbits around each atom’s nucleus). The lower-energy atomic hydrogen product called “hydrino” reacts with another reactant supplied to the reaction cell to form a hydride ion bound to the other reactant to constitute a novel proprietary compound. Alternatively, two hydrinos react to form a very stable hydrogen-type molecule called molecular hydrino. Thus, rather than pollutants, the byproducts may have significant advanced technology applications based on their stability characteristics. For example, hydrino hydride ions having extraordinary binding energies may stabilize a cation (positively charged ion of a battery) in an extraordinarily high-oxidation state as the basis of a high-voltage battery. Further, significant applications exist for the corresponding molecular hydrino wherein the excited vibration-rotational levels could be the basis of a UV laser that could significantly advance photolithography and line-of-sight telecommunications. A plasma-producing cell based on the extraordinarily energetic Process has also been developed that may have commercial applications in chemical plasma processing and as a light source.”

The company claims that the average generating capacity of a system will be 1,000 kW, with an installed cost of $1,000/kW, a fuel cost of less than $0.001 per kWh, and zero greenhouse emissions. The solid catalyst is regenerated and recycled. The cost of the hydrogen from electrolysis becomes insignificant because of the large excess of thermal energy generated, which is used to generate power.

The above claims are too attractive to ignore, and this could be a game changer in the energy industry. The output energy claimed is more than the theoretical value, which would violate the laws of thermodynamics; the excess energy is attributed to the presence of the ‘hydrino’. However, one has to be open to new ideas, because science is ever-changing, and even well-established theories and concepts are challenged as science evolves with new discoveries and inventions.


Stanley Meyer, a freelance inventor from the USA, demonstrated a car that ran on water, according to an Equinox programme televised in 1995. Meyer’s dune buggy ran 100 miles on 1 gallon of water. He claimed that water would be the fuel that could revolutionize the auto industry in America. However, his tragic death in 1998 brought the issue to a closure. Many people and institutions are still trying to replicate his invention, at least partly, and claiming success. He received a number of patents based on his inventions. He worked nearly 30 years on his invention before he began work on a book titled “With the Lord, There Is a Purpose”, describing his “faith walk” with the Lord to fulfill end-time prophecy, and he continued with his speaking engagements throughout the world. However, such ‘free energy’ devices are still not getting the approval of the larger scientific community or of government agencies, for some reason or other.

According to Stanley Meyer, “the law of Physics establishes a proven function based on ‘pre-set’ conditions… change any of the conditions and the law no longer applies… A new law emerges in the consciousness of physics. Why? Because atoms possess intelligence, performing ‘what if’ logic functions under different ‘preset’ conditions.” His claims were based on scientific principles and explanations. Based on his invention, many ‘electrolyzing devices’ appeared in the market. Their suppliers offer do-it-yourself kits that can be fitted into a car to cut gasoline consumption, but they do not entirely substitute gasoline as Stanley Meyer demonstrated; there are still missing pieces of information or claims.

He was able to show and claim a “Hydrogen fracturing process to disassociate water molecules by way of voltage stimulation, ionization of combustible gases by electron ejection, and then preventing water formation during thermal ignition, releasing a thermal explosive energy beyond ‘normal gas burn’ levels under a controlled state… such an atomic energy process is environmentally safe”. He did not use ‘heavy water’ (deuterium) but normal water under a controlled state, and showed that the covalent bond of water can be broken using an electronic circuit with water as the dielectric medium of a capacitor. It uses a high voltage but a low current, and the process is instantaneous. It differs from Faraday’s law of electrolysis in the conventional sense.

The scientific community seems a little more understanding and open-minded about such ‘free energy’ concepts and devices in recent times than in the past. ‘Resonance electrolysis’ has been reported by a few institutions and people as an alternative to conventional water electrolysis to cut energy consumption. Decomposition of water into its constituent molecules otherwise requires temperatures above 3000°C, using a process known as ‘pyrolysis’, and a technique to keep the decomposed molecules from recombining into water. Prof. Mizuno of Hokkaido University in Japan and his coworkers demonstrated ‘plasma electrolysis’ in an experiment that showed an anomalous evolution of hydrogen and oxygen, sometimes as much as 80 times more than the gas generation of normal Faraday electrolysis. Though such a reaction requires a very high temperature, they could not successfully measure the reaction temperatures during the experiments. They used a platinum anode and a tungsten cathode, with a provision to separate the hydrogen and oxygen gases.
They concluded at the end of the experiment that the input voltage and the current efficiency were critical parameters. On increasing the voltage to several thousand volts, they said, the current efficiency can exceed unity. The anomalous release of gases indicates that the electrolysis is not normal electrolysis but something beyond it. (Ref: Mizuno, T., T. Akimoto, and T. Ohmori, “Confirmation of anomalous hydrogen generation by plasma electrolysis”, in 4th Meeting of Japan CF Research Society, 2003, Iwate, Japan: Iwate University.) In all these experiments the gases coming out of the system are not at high temperature but at normal room temperature. The chemistry of water molecule decomposition and plasma pyrolysis is not fully understood. After all, ‘cold fusion’ seems to be plausible under certain conditions, and it may be a panacea for the world’s energy problems. When our energy requirements exceed a limit due to population explosion and industrialization, finding a solution becomes a daunting task. Mohandas Gandhi said: “There is enough for everybody’s need but not for everybody’s greed. Be the change you wish to see in the world.”
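For reference, Faraday's law fixes the baseline against which such "80 times" claims are measured: an ordinary electrolyser cannot produce more hydrogen per coulomb of charge than m = ItM/(nF) allows. A minimal sketch with an assumed cell current:

```python
# Faraday baseline for water electrolysis: moles of H2 = I*t / (n*F),
# with n = 2 electrons per H2 molecule. Cell current is an assumption.

F = 96485.0  # C/mol, Faraday constant
I = 10.0     # A, cell current (assumed)
T = 3600.0   # s, one hour of electrolysis

mol_h2 = I * T / (2 * F)   # 2 electrons transferred per H2 molecule
litres = mol_h2 * 22.414   # ideal-gas molar volume at 0 degC, 1 atm
print(f"10 A for 1 h: {mol_h2:.3f} mol H2, ~{litres:.1f} L at STP")  # ~4.2 L
```

Any yield genuinely above this baseline would mean the extra gas is not being driven by the electrolysis current alone, which is precisely why the anomalous plasma-electrolysis results attract such scrutiny.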
