The Buying and Selling of Greenhouse Gases

Andrea Polli 2007

The accelerating climate change crisis and the realization that humans are its primary cause have raised questions about ownership and responsibility. Who ‘owns’ the climate change crisis, and who is responsible for mitigating and, if possible, reversing it? The overwhelming response to these questions by governments internationally has been to propose a market solution—in essence, to sell the atmosphere. This paper explores the idea of air for sale from an economic, political and cultural perspective.

In the nineteenth century Alfred Russel Wallace called the atmosphere the 'great aerial ocean'; today Tim Flannery calls it the 'energetic onion skin.' [1] This thin, moving coating of the Earth is actually made up of four layers. The lowest layer is the troposphere, which extends seven miles above the Earth and contains about 80% of all the atmosphere's gases; the lowest third of this layer contains half of the Earth's gases and is the only breathable part. The troposphere is warmest at the bottom and is expanding due to greenhouse gases. The boundary between the troposphere and the layer above it, the ozone-rich stratosphere, is called the tropopause, and it has been rising, both because the troposphere is expanding and because the stratosphere is cooling and shrinking as ozone is depleted. Above the stratosphere, some 30 miles above the surface of the Earth, is the mesosphere, and above this is a slight trickle of gas that extends into outer space called the thermosphere. Most of this paper will discuss gases in the troposphere specifically, except in the parts referring to the ozone layer, which concern the stratosphere.

There are two classifications of anthropogenic compounds of concern in the atmosphere: pollutants and greenhouse gases. The US Environmental Protection Agency (EPA) classifies emissions that are or could be harmful to people as 'criteria pollutants.' The criteria pollutants are: carbon monoxide (CO), lead (Pb), nitrogen dioxide (NO2), ozone (O3), particulate matter (PM), and sulfur dioxide (SO2). There is also a large number of other compounds, determined to be hazardous, called ‘air toxics.’ All told, in the US, 188 gases are currently classified as pollutants.

Greenhouse gases are a class of gases that trap heat near the Earth's surface. There are about 30 greenhouse gases, but the primary ones are: carbon dioxide (CO2, which is used as the yardstick for measuring the warming potential of other greenhouse gases, in 'CO2 units'), methane (CH4), and water vapor (H2O). Some greenhouse gases are classified as pollutants, but others, including CO2, are not.

Most of the atmosphere is nitrogen (N2), about 78%, and oxygen (O2) makes up between 20 and 21%. Water vapor makes up approximately 0.25% of the full atmosphere by mass, but about 1-4% by volume near sea level. The rest of the compounds are present in much smaller amounts: argon (Ar) at 0.9%, carbon dioxide at about 0.04%, methane at 1.745 parts per million by volume (ppmv), and ozone at between 0.01 and 0.07 ppmv. The atmosphere also contains varying trace amounts of sulfur dioxide, chlorofluorocarbons (CFCs), hydrochlorofluorocarbons (HCFCs), particulate matter and other compounds, including the full range of criteria pollutants and air toxics.

In this section, I will outline the human causes and effects of these particular compounds: methane, ozone, water vapor, chlorofluorocarbons and hydrochlorofluorocarbons, particulate matter, sulfur dioxide, and carbon dioxide.

Methane is an important anthropogenic greenhouse gas, produced primarily by microbes in animal waste and decaying biological matter. Its concentration has doubled in the last few hundred years, and as a greenhouse gas it is 60 times more potent at capturing heat than CO2, though it remains in the atmosphere for a much shorter period of time. It is estimated that methane will be the cause of 15-17% of future warming. [2]

Ozone is both a pollutant and a greenhouse gas. The US EPA regularly monitors surface ozone and provides public warnings on days with very high levels. Ozone is an extremely reactive gas that irritates the respiratory system: studies have found that it causes shortness of breath and coughing, triggers asthma attacks, and at high levels increases emergency room visits and hospital admissions. It can kill people with severe respiratory problems, and studies show that as ozone levels increase, the risk of premature death increases.

In the troposphere, ozone forms when nitrogen oxides and hydrocarbons released into the atmosphere interact with one another, typically in the presence of sunlight, heat and stagnant air. Ozone is the major component of smog, and over the past hundred years there has been a global increase in ozone at ground level. In the stratosphere, however, ozone occurs naturally and is essential in shielding the surface from dangerous UV radiation. Ozone depletion in the stratosphere contributes to higher rates of skin cancer, cataracts, and other vision problems. A motto the EPA uses to describe this ozone distribution is 'good up high, bad nearby.'

Water vapor is an important greenhouse gas: in the air, it traps heat. At higher temperatures, the atmosphere can hold more water vapor and therefore traps more heat, accelerating global warming. However, by blocking sunlight, water vapor in the form of clouds can also have a cooling effect. High, thin clouds promote warming; low, thick clouds promote cooling. The unpredictability of clouds contributes most to the uncertainty of the science of climate change. [3] In 2001, meteorologist Richard Lindzen of MIT put forward the hopeful but controversial idea that clouds regulate the Earth's heat by blocking more radiation at higher temperatures, acting like the 'iris' of an eye or camera; unfortunately, no evidence for this has been found to date.

The CFC and HCFC family of chemicals are powerful greenhouse gases that also destroy ozone in the stratosphere. Chlorofluorocarbons are compounds containing chlorine, fluorine and carbon only (they contain no hydrogen), while hydrochlorofluorocarbons are a class of haloalkanes in which not all hydrogen has been replaced by chlorine or fluorine. These human-invented compounds were used widely as refrigerants, propellants, and cleaning solvents, but because of their effects on the ozone layer, much of their use was prohibited by the Montreal Protocol. HCFCs are used primarily as CFC substitutes, as their ozone-depleting effects are only about 10% those of the CFCs. [4]

SO2 (sulfur dioxide) is not a greenhouse gas; it actually has a cooling effect. Yet it is a dangerous pollutant that causes acid rain by reacting with water to create sulfuric acid. Acid rain kills plants and animals, makes lakes acidic, and damages tree growth, sometimes killing whole forests. In addition, SO2 can affect human health, particularly in people suffering from asthma and chronic lung diseases. SO2 is released primarily when low-quality coal is burned, and it lasts a few weeks in the atmosphere.

Particulate pollution (identified in pollution measurements as PM2.5 and PM10) is a mix of soot, smoke and other tiny particles formed in the atmosphere primarily from sulfur dioxide, nitrogen oxides, and ammonia. Not considered a greenhouse gas, it lasts only a few weeks in the atmosphere and can have a cooling effect: particulate pollution scatters and absorbs light, cutting sunlight by up to 10% and lowering temperatures. This cooling effect was illustrated dramatically during the three days after September 11, 2001, when planes were grounded over the US. Scientists found that actual temperatures were higher than expected during those three days, and many now attribute this to the absence of contrails, the particle-laden condensation trails left by airplanes. Particulate pollution has also been shown to induce heart attacks and strokes, cause lung cancer, trigger asthma attacks and increase the need for medical care and hospital visits. People with cardiovascular diseases, children and the elderly are most vulnerable to the health risks associated with particle pollution, as are people who suffer from chronic lung disease.

Carbon Dioxide (CO2) is the most important greenhouse gas because it is so widely produced, and it stays in the atmosphere for a very long time—56% of all the CO2 that has ever been created by burning fossil fuel is still in the air. CO2 is produced through burning any carbon-based material and through decomposition, and of the CO2 caused by burning fossil fuels, 41% is from coal, 39% is from oil and 20% is from gas. Burning a ton of coal creates 4 tons of CO2. [5]
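The rough chemistry behind that last figure can be checked with a few lines of arithmetic. The sketch below assumes coal that is about 90% carbon by mass (a figure supplied here for illustration, not from the text); each carbon atom burned binds two oxygen atoms from the air, which is why the CO2 produced outweighs the coal itself.

```python
# Stoichiometry of coal burning: C + O2 -> CO2.
# Assumption (illustrative, not from the text): coal is ~90% carbon by mass.
M_C = 12.011            # molar mass of carbon, g/mol
M_O = 15.999            # molar mass of oxygen, g/mol
M_CO2 = M_C + 2 * M_O   # ~44 g/mol: each carbon atom picks up two oxygens

carbon_fraction = 0.9
tons_co2_per_ton_coal = carbon_fraction * M_CO2 / M_C

print(round(tons_co2_per_ton_coal, 2))  # about 3.3; nearer 4 for the purest coals
```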

CO2 plays a very important role in the Earth’s ecosystem. It is necessary for maintaining warm temperatures on the Earth, and it is necessary for plant growth, although plants grown experimentally in high-CO2 environments have been found to have reduced nutritional value. In terms of food production, trees benefit much more from increased CO2 than the grasses that are the staples of the human diet (rice, wheat, corn). Warming caused by CO2 will also decrease crop yields because of increased ozone and changes in moisture. There is 50 times more CO2 in the ocean than in the atmosphere, and too much CO2 in the ocean causes acidification. [6]

In spring, plants cause a decrease in total CO2, and in autumn decomposition causes an increase. Paul Crutzen named the current era the 'Anthropocene' age, the age of humanity, in which human activity alters the global chemistry of the Earth. Crutzen dates this age from around 1800 AD, the beginning of the industrial age, because CO2 levels have been steadily rising since then. But Bill Ruddiman places the beginning of the dramatic human effect on the environment earlier, locating the Anthropocene’s start at the beginning of agriculture.

There are presently nearly 4 parts per 10,000 of CO2 in the Earth's atmosphere (approximately 380 ppm), and this amount has been steadily rising (2-3 ppm each year). [7] The amount is very small, but it has a profound effect on warming: if it were to increase to only 1% of the atmosphere, the oceans would boil. [8] Scientists estimate that the climate could stabilize at a maximum of 550 ppm, which, according to the Intergovernmental Panel on Climate Change, would require a reduction in CO2 emissions of 60-80% from present levels. [9]
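Those figures allow a quick back-of-the-envelope calculation: dividing the remaining headroom below the ceiling by the quoted growth rate shows that, on a deliberately naive straight-line model, the 550 ppm mark is only decades away.

```python
# How long until CO2 reaches the 550 ppm stabilization ceiling, assuming the
# present 2-3 ppm/year growth rate simply continues (a deliberately naive model)?
current_ppm, ceiling_ppm = 380, 550

for rate in (2, 3):  # ppm per year, the range quoted above
    years = (ceiling_ppm - current_ppm) / rate
    print(f"at {rate} ppm/yr: about {years:.0f} years")
```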

Air quality is monitored using a number of different methods, the most accurate being a combination of ground-based flask sampling and continuous monitoring, flask sampling at tall towers, and remote sensing from aircraft and satellite. The National Oceanic and Atmospheric Administration (NOAA) operates an international system of flask sampling in which volunteers from 60 sites around the world travel to coastlines, hike up mountains or walk miles into the desert once a week in order to fill two glass flasks with air using a battery-operated pump and compressor. After several weeks, these volunteers deliver the flasks to a mailroom in a US Embassy, a nearby meteorological agency or a university department, from which the flasks are returned to a laboratory in Boulder, Colorado. There, at the NOAA Earth System Research Laboratory (ESRL), scientists in the Global Monitoring Division analyze the air to determine the global mix of greenhouse gases. ‘We get about 15,000 [flasks] a year,’ said Russell Schnell, director of observatory and global network operations in the Global Monitoring Division. [10]

On the local level, the state of California recently implemented a comprehensive air monitoring system. A bill passed by the state legislature in 2007 requires California to reduce its carbon emissions to 1990 levels by 2020, a reduction of 25%. By 2050, carbon emissions must be reduced to 80% below 1990 levels. Given that California has the fifth largest economy in the world, with among the highest greenhouse gas emissions of any state—estimated at around 400 million metric tons per year—both targets require substantial reductions.

Lawrence Berkeley National Laboratory developed a monitoring system for the state of California, called CALGEM (the California Greenhouse Gas Emissions Project), to help the state achieve its emissions reduction goals. CALGEM is a good example of a system that works through a combination of ground-based continuous monitoring, flask sampling at tall towers, and remote sensing by aircraft and satellite, which provides estimates of totals using spectral analysis. [11]

‘Economic super-powers have been as successful today in their disproportionate occupation of the atmosphere with carbon emissions as they were in their military occupation of the terrestrial world in colonial times.’ Andrew Simms, The New Economics Foundation [12]

In 1997, the Kyoto Protocol was adopted, requiring 35 industrialized countries and the EU to reduce greenhouse gas emissions by an average of 5% below 1990 levels by 2012. Despite being the world's biggest emitter of greenhouse gases, the United States is not a party to the protocol. In the Kyoto Protocol, the main 'basket' of greenhouse gases to be reduced comprises carbon dioxide, methane, nitrous oxide, hydrofluorocarbons, perfluorocarbons, and sulphur hexafluoride. In addition, the protocol identifies 'annex' gases (or indirect greenhouse gases) that developed countries must also monitor: carbon monoxide, nitrogen oxides, non-methane volatile organic compounds and sulfur oxides.

The Kyoto Protocol includes a global greenhouse gas emissions trading system that has now been in place in Europe for over two years. This is a ‘cap and trade’ system, and it works as follows: in the first year, credits are generous: the total amount of emissions for each company is determined, and each company receives close to that amount. Each subsequent year, the number of credits allotted to each company is reduced, allowing companies to lower emissions gradually.
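The declining-cap mechanism can be sketched in a few lines. In the sketch below, the baseline figure and the 5% annual cut are my own illustration; actual reduction schedules are negotiated per scheme.

```python
# A minimal sketch of a declining cap: allowances start near the measured
# baseline and shrink by a fixed fraction each year. The 5% cut is illustrative.
def allowances(baseline_tons, years, annual_cut=0.05):
    """Yearly emissions allowances for one firm under a simple declining cap."""
    return [baseline_tons * (1 - annual_cut) ** y for y in range(years)]

caps = allowances(100.0, 4)
print(", ".join(f"{c:.1f}" for c in caps))  # the firm's shrinking annual cap
```

A firm that cuts emissions faster than its cap declines has surplus credits to sell; one that lags must buy them, which is where the market price comes from.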

Where does the idea of turning a problem with emissions into a market-driven economy come from? I took a look at some historical models:

The Chicago Climate Exchange

The most recent model I looked at is the Chicago Climate Exchange (CCX), a voluntary global emissions trading system launched in 2003 that covers six greenhouse gases. Member companies commit to reduce their baseline greenhouse gas emissions by six percent between 2007 and 2010. The CCX traded over 1 million tons of CO2 in its first six months of carbon trading, but as a voluntary system it operates nowhere near the scale needed to make the necessary reductions.

Sulfur Dioxide Trading

In 1990, the US launched a cap-and-trade program in sulfur dioxide as an amendment to the Clean Air Act. An initial objective of the program was to reduce sulfur dioxide emissions from utilities by 8.5 million tons from 1980 levels. To accomplish this, electric utility plants above a certain size were given an initial allocation of emissions allowances based on historical patterns. The Act included stiff penalties for excess emissions, valued at more than ten times the cost of reduction. The program achieved a very high degree of compliance, and the US Congress considers it a success. The current European greenhouse gas trading system is modeled on it. [13]

CFC Trading

Another model is the Montreal Protocol on Substances That Deplete the Ozone Layer, an international treaty that entered into force on January 1, 1989, designed to protect the ozone layer by phasing out the production of a number of substances believed to be responsible for ozone depletion, in particular CFCs and halons. The Montreal Protocol created a system for the international trading of allowances. In the protocol, trading is combined with a tax, to offset any large profits from allowances that might discourage the reduction of CFCs. Since the Montreal Protocol came into effect, the atmospheric concentrations of the most important CFCs and related chlorinated hydrocarbons have either leveled off or decreased, and Kofi Annan, former Secretary-General of the United Nations, has called the protocol ‘perhaps the single most successful international agreement to date.’ [14]

However, by 1997, smuggling of CFCs from developing nations, where they were still permitted, into the US and other developed nations had become big business. In Miami in 1997, CFC smuggling was believed to be second in value only to cocaine smuggling. The problem has still not ended: although the protocol's goal was to completely phase out these gases by 2000, as recently as 2005 several companies in Eastern China were found to be involved in illegal international trading of CFCs. [15]

The difference between a CO2 market and the CFC and sulfur dioxide markets is that the technology exists to clean up both CFC and SO2 emissions, and in any case those markets involve only a relatively small number of companies with outdated technology. The CO2 system, on the other hand, involves thousands of companies, and there is no existing technology to make coal burn more cleanly or to sequester carbon emissions safely.

Another model for distributing air rights is the licensing of the airwaves, or broadcasting rights. The US Federal Communications Commission (FCC), created by the Communications Act of 1934, regulates all non-government wire and wireless communications. The Act specified only that broadcasting be in the hands of American citizens, and left it to the FCC to decide how to license broadcast rights. In the past, applicants were required to describe their programming plans, which were judged on practicality and general usefulness to the public. In practice this was a combination of private and governmental control, with a strong emphasis on private control.

However, in 1993, Congress gave the FCC authority to use competitive bidding, and since 1994 the FCC has conducted auctions of licenses for the electromagnetic spectrum, open to any eligible company or individual that submits an application and an upfront payment. According to the FCC, the auctions assign licenses more effectively than the previously used hearings or lotteries and have substantially reduced the time from initial application to license grant. In 1997 Congress passed legislation requiring the FCC to use auctions for all licensing unless exemptions applied, for example for public safety, educational and public broadcasting. For a commercial broadcaster, purchasing through auction became the only way to gain rights to the airwaves. [16]

In 1996, the FCC relaxed the rules restricting broadcasters from owning several radio or television stations in one market, allowing a broadcaster, for example, to own an unlimited number of radio stations. This move increased advertising prices and caused an unexpected outcry from communities who found that the consolidation of the media in the hands of a few large corporations resulted in a loss of quality local programming, with corporations instead polluting the airwaves with homogenized, computer-controlled broadcasts. The lesson that might apply to a future market in greenhouse gases is that deregulation of the market, allowing the unrestricted purchase of air rights, be it for polluting or broadcasting, could well result in a disaster for the quality of the air and the airwaves.

The New York City noise code can also be looked at as a model for emissions control on a local level. Like air pollution, loud noise poses health risks. The Environmental Protection Agency warns against exposure to sounds above 75 decibels, while most New York City traffic is at 85 decibels, an ambulance siren is at 120, and the subway is at 95. Hearing loss is not the only health problem that can be caused by constant exposure to noise; the resulting stress has been shown to increase the risk of heart attacks. [17]

In 1972, New York became the first city in the US to adopt a noise code, and on July 1, 2007, the first update to the code in 35 years went into effect. Police are equipped with noise meters, and the code limits loud air conditioners, fans, car alarms, music, construction and barking dogs. Policing the noise code is difficult because, like air pollution, noise can be elusive: honking horns or loud car stereos are often gone by the time police arrive, it is difficult to pinpoint the source of noise, and individual sound sources can combine into loud noise. In the case of a group of air conditioners, for example, each unit may not exceed noise regulations, but in combination they can be deafening. The right to produce noise is not traded on a market, but as in the emissions markets, the biggest producers of noise get special allowances: airplanes and trains are exempt from the New York noise code.
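The air conditioner example reflects how sound levels combine: decibels are logarithmic, so simultaneous sources add in power rather than in decibels. A short calculation (the 70 dB figure is illustrative, not from the noise code) makes the point:

```python
import math

# Decibel levels of simultaneous, incoherent sources add in power, not in dB.
def combined_db(levels):
    """Total sound level, in dB, of several sources heard together."""
    return 10 * math.log10(sum(10 ** (level / 10) for level in levels))

# Ten units at 70 dB each, all below a hypothetical limit, together reach 80 dB.
print(round(combined_db([70] * 10)))  # 80
```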

Another environmental market system that could serve as a useful model is one used in some global fisheries, known as ‘catch quotas.’ In most US fisheries, the government controls catch amounts by limiting the number of days fishermen operate and how much they collect each trip, but in Alaska since 1995 a market system called ‘dedicated access privileges’ has been in place, which grants shares in each fishery to individual fishermen, who can then buy and sell their shares. Since 2005 there has been a move to expand this program throughout the US, which has angered many environmentalists who see it as the privatization of a public resource. [18]

This brings us to a fundamental aspect of the greenhouse gas emissions trading system, the issue of property rights to the air. The idea of an ecological economics came from the understanding that environmental resources are finite, and since these resources can be destroyed, there should be incentives for protecting them. Ecological economics provides both a mechanism for the valuation of environmental resources and an incentive for keeping within an established environmental ‘budget.’ In 1997, the US Congress described it in this way:

From an economic perspective, pollution problems are caused by a lack of clearly defined and enforced property rights. Smokestack emissions, for example, are deposited into the air because the air is often treated as a common good, available for all to use as they please, even as a disposal site. Not surprisingly, this apparently free good is overused. A primary and appropriate role for government in supporting the market economy is the definition and enforcement of property rights. Defining rights for use of the atmosphere, lakes, and rivers is critical to prevent their overuse. Once legal entitlement has been established, markets can be employed to exchange these rights as a means of improving economic efficiency. [19]

Emissions trading systems have long been criticized. At the 1992 Earth Summit in Rio, the NGO Global Forum emphasized avoiding pollution trading schemes that ‘perpetuate or worsen inequities hidden behind the problem or have a negative impact.’ Later, arguments escalated, calling trade in greenhouse gases a new form of colonialism. Arguments for and against this system range from concerns about flaws and possible abuses of the system to criticism of its fundamental assumptions about ownership.

The biggest beneficiaries of a greenhouse gas emissions trading scheme are the banking industry (in the US, this industry is currently lobbying heavily for the implementation of a carbon cap and trade system) and the nuclear power industry, which stands to gain from a loosening of restrictions on the construction of new power plants.

However, despite the support of the banks, not all economic experts endorse the use of such a system. Many would rather see a carbon tax in place. In practice, a carbon tax functions very much like a trading system: polluting companies either pay a tax or pay for carbon credits for their emissions. Both systems will also raise consumer prices on fossil fuels. However, critics of the tax system say that it doesn’t provide the incentive or ‘race for the pot of gold’ that the carbon trading system provides by financially rewarding companies that can substantially cut emissions. [20]

Nicole Gelinas of The Wall Street Journal argues against a trading system in the US from the perspective of global competition. She notes that if US-based energy companies can't limit emissions, the international cap and trade system would allow them to buy emissions credits from other countries. She calls this a ‘direct subsidy to developing nations by paying for their power-plant upgrades.’ In other words, a Federal carbon tax, an alternative to the international carbon cap and trade system, would provide revenue to the US government that could then be used to subsidize US power plant upgrades, while an international cap and trade system could put this revenue in the hands of other countries, particularly those that have already reduced emissions. Gelinas's views seem to be in line with those of the US government, as one of the stated reasons the US opposes the Kyoto Protocol is that it provides exceptions for developing countries. [21]

In The Weather Makers, Tim Flannery takes issue with this position, pointing out that the Montreal Protocol was very successful even though developing nations were not bound by it. In general, however, Flannery is mixed on the subject of carbon trading. He discusses the case of eastern European countries whose economies have collapsed since the 1990s and which therefore produce approximately 25% less CO2 than they did in 1990. Since the Kyoto trading system requires that emissions be reduced by only 8% from 1990 levels, these countries end up with a surplus of carbon credits, known as ‘hot air’ to critics of the protocol. [22]

Flannery also argues that creating a new global currency is too risky, since the foundation of any currency is trust—in this case trust that the seller will lower emissions—and he sees no guarantee that this will happen. However, he observes that emissions trading is cost-effective, and as a tool to reduce pollution it has been successful in the past, as in the example of sulfur dioxide trading.

In Carbon Trading, Larry Lohmann quotes Flannery's work when outlining the potential effects of climate change, but is much more pessimistic than Flannery with regard to markets. Looking at the seriousness of the problem, he predicts that many markets will collapse. He gives the example of the insurance business, quoting insurance specialists who estimate that chaotic climate change could cause insured losses to grow several times faster than the world economy, creating a situation in which the world economy cannot sustain the losses and collapses.

Lohmann is critical of the hope, held by many industrialized countries, that technological developments allowing the continued use of fossil fuels, such as carbon sequestration, are the solution. He devotes a chapter to analyzing how these technological solutions are a smokescreen to distract public attention from governments' lack of necessary action. He sees this as a ‘second strategy,’ the first being denial of the existence of anthropogenic climate change.

The first strategy works to reshape or suppress understanding of the climate problem so that public reaction to it will present less of a political threat to corporations. The second strategy appeals to technological fixes as a way of bypassing the debate over fossil fuels while helping to spur innovations that can serve as new sources of profit. The third strategy appeals to a 'market fix' that secures the property rights of heavy Northern fossil fuel users over the world's carbon-absorbing capacity while creating new opportunities for corporate profit through trade. [23]

In Earth in the Balance, Al Gore embraces in part this ‘second strategy,’ proposing an SEI (Strategic Environment Initiative) which, like the existing US SDI (Strategic Defense Initiative), would be a major national effort, but with a focus on the environment rather than the military. Although he does emphasize that any new technologies should be evaluated, the focus of his proposed SEI is on developing new technologies to combat climate change.

Finally, Lohmann is very critical of what he calls the third strategy, the ‘market fix,’ primarily because of the property rights issue: the privatization of the air. Like Flannery, he is concerned that the creation of a ‘top down’ greenhouse gas emissions market without public debate will ensure a market filled with distrust—a lack of faith in both its structure and its implementation.

The bottom line, according to Lohmann, is that the current emissions trading structure gives the most allowances to the biggest emitters, effectively handing billions of dollars to the most egregious polluters and providing them with incentives to keep polluting.

In The Great Emissions Rights Give-away, Andrew Simms, policy director of the NEF (New Economics Foundation) and Feasta (the Foundation for the Economics of Sustainability), proposes an alternative structure for EU carbon trading. In this structure, the EU's emissions allowances would be divided on an equal per capita basis and distributed to every EU resident. Residents would then be able to sell these allowances to companies, or keep them off the market to promote cleaner air.

Simms compares this approach to a number of alternatives. Under Dr. David Fleming’s Tradable Energy Quotas proposal, the government issues each citizen a portion of carbon units corresponding to the general public’s share of fossil fuel use; citizens can then use the units to purchase energy and fuel, or sell unused units on an open market. Carbon units used by industry are sold at an auction similar to the FCC broadcast rights auction.

The Sky Trust in the US proposes a system in which any money made from the sale of emissions credits is kept in a trust, ensuring that financial gains are used to benefit the public. The proposals of both the Sky Trust and NEF/Feasta are based on the fundamental principle that the atmosphere belongs to all people equally, and not to governments or corporations.
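The per-capita principle behind these proposals reduces to simple division: the total allowable emissions are split equally, and anyone living below their share has something to sell. A sketch, with entirely hypothetical figures:

```python
# Equal-per-capita allocation: divide a national emissions cap by population.
# Both numbers below are hypothetical, chosen only to show the arithmetic.
def per_capita_share(cap_tons, population):
    return cap_tons / population

share = per_capita_share(400_000_000, 36_000_000)   # tons of CO2 per person
surplus = share - 9.0   # a resident emitting 9 tons could sell the remainder
print(f"share: {share:.1f} t/yr, sellable surplus: {surplus:.1f} t/yr")
```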

One might think that the idea of 'air for sale,' however absurd, exists only in the abstract arena of the market. After all, no one would actually pay for the air they breathe. Culturally, this is no longer the case, as evidenced by the rising popularity of oxygen bars and canned air.

In the case of both oxygen bars and canned air, oxygen is touted as a cleansing and medical 'therapy.' Advertisements focus on the healing power of air, invoking aromatherapy and something marketers call 'oxygen therapy.' Advertisers promote the air as energizing for exercise, effective in combating cigarette smoke and as a hangover cure. They sell the idea of purity, freshness and cleanliness, and many promote it as an escape from the smog of city life. At an oxygen bar, customers pay for a session of five minutes or so in which they can relax and breathe clean, sometimes scented, air.

The oxygen bar started as a trend in the 1990s in Japan, Mexico and South America and quickly spread to nightclubs, spas, casinos and malls in Europe and the US. In 2003, the oxygen bar at Olio!, a restaurant at the MGM Grand Hotel in Las Vegas, boasted 200 to 400 customers per day. [24] Portable canned air is becoming just as popular and widespread. In Japan, a recent large-scale commercial venture is O2supli, a portable can of oxygen. The oxygen comes in two flavors, ‘strong mint’ (called the brain can) and ‘grapefruit’ (called the body can), and costs 600 yen a can.

The idea behind the product is to allow buyers to replenish their oxygen levels anytime they feel a lack of it due to stress, fatigue, or other factors … Each can contains enough oxygen for 35 two-second inhalations, meaning each can lasts for roughly a week if it is used five or six times a day. At first the canned oxygen will be sold in Tokyo … then at all 11,000 of Seven-Eleven Japan’s nationwide stores. [25]

How could our global culture have reached a point where the absurd notion of buying and selling the air is acceptable on any level, corporate or individual?

‘When art becomes idea, idea becomes commodity’ [26]

Perhaps the arts, specifically contemporary conceptual artworks, have played a role in making the purchase of air culturally acceptable. As creative works, art and architecture have value in society, not just cultural value (although they have that too), but monetary value. In the 50s and 60s, Yves Klein's idea of Air Architecture challenged the definitions of art and architecture, but on a wider scale may have contributed to the idea of commodifying the public resource of air. Klein was interested in the ways that humans can use science and technology to conquer the ephemeral, to the point of turning even air and fire into building materials. Klein saw science and technology as the savior of architecture, promoting new forms and structures made by sculpting the air and other ‘immaterial-materials.’ He believed that Air Architecture would actually improve the environment, saying that ‘Air Architecture must be adapted to the natural conditions and situations, to the mountains, valleys, monsoons, etc., if possible without requiring the use of great artificial modifications.’ [27]

In the late 1960s a group of artists including Robert Barry became associated with art dealer Seth Siegelaub and started producing work that questioned the limits of art. Barry's work, known as 'invisible' art, included The Inert Gas Series (1969), in which specific amounts of gases such as neon, xenon and helium were released 'from measured volume to indefinite expansion' in the Mojave Desert. [28]

Lucy Lippard observed in Six Years: The Dematerialization of the Art Object that 'novelty is the fuel of the art market,' and at the time of Robert Barry's Inert Gas Series this 'fuel' was being burned at a rapid pace, constantly stretching the boundaries of the definition of art. This 'dematerialization' attempted to remove art from its status as commodity by creating 'objects' of art, like the natural expansion of gas, that would be absurd to commodify; yet the work did not remove itself from the art market. In fact, the accelerating movement was driven in large part by the art market's search for newer and more avant-garde ideas. [29]

These ties to the market, necessary for the creation of the art and the survival of the artist, nonetheless create a paradoxical situation in which the immaterial is moved into the object realm. The artist's critical stance toward the art market is lost through the positioning of the work within that very market.

Tue Greenfort's 2005 Bonaqua Condensation Cube is an homage to Hans Haacke's Condensation Cube of 1963. The contemporary work uses Bonaqua, a popular brand of bottled water, as the substance of condensation. Greenfort directly addresses the issue of ownership: what was considered a public resource in 1963 had become a commercial product by 2005. Like the earlier work, the piece is positioned as a gallery artwork, an object with at least the expectation of being given a monetary value. Also like the earlier work, it pokes fun at the absurdity of the system while, problematically, remaining a part of that system.

Some recent works related to air reside not in the gallery context but in the context of public art. Laurie Palmer's 2005 'Hays Woods / Oxygen Bar' project at Carnegie Mellon University draws attention to the natural processes that create air and highlights air as a public resource.

Want to breathe some pure oxygen courtesy of plants from Hays Woods? Try the Oxygen Bar, a project of artist A. Laurie Palmer. The oxygen bar is a mobile breathing machine, offering free hits of ‘natural’ oxygen on a first-come, first-serve basis. This oxygen is produced by the photosynthetic work of green plants (from Hays Woods) and is offered as a public service. It reproduces in miniature the beneficial cleansing and refreshing effects of city green spaces on the air we breathe. The oxygen bar anticipates the imminent loss of public resources that filter Pittsburgh’s dirty air and replenish it with oxygen, in particular, Hays Woods. At the same time, the oxygen bar anticipates the active participation of citizens of Allegheny County in land use decisions affecting our public health. [30]

The premise of Amy Balkin’s 2007 work Public Smog lies in the economic system of emissions trading. The 'global public' purchases as many emissions allowances as possible on the emissions market. These carbon offsets are then retired, in other words taken off the market, making them unavailable to polluting corporations. By openly embracing the free market for the public good, Balkin presents a sharp critique of the system the project must operate within. The 'public space' in which her work operates is the emissions trading market.

Balkin's work clearly questions the emissions trading system. I believe the solution she proposes is meant to be absurd. But in the context of contemporary culture the solution seems viable; in fact, it is very similar to the structure proposed by Feasta, in which people buy and sell emissions credits on the market, except that Feasta would distribute a certain amount of emissions credits to citizens for free. A group called TheCompensators* proposes the same solution as Balkin, claiming to have retired over 1500 EU emissions allowances. [31] These solutions create potential problems grounded in the very idea of the market. Healthy markets grow, and if people decide to buy emissions credits or clean air, more emissions credits may simply be issued to balance the market and meet demand, in effect forcing the public to pay ever higher prices for clean air.
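The dynamic described above can be illustrated with a toy model. All of the numbers and the pricing rule here are hypothetical, purely for illustration: if the issuing body expands the cap whenever public retirement tightens supply, and emissions demand keeps growing, the public pays more each round to retire the same number of allowances.

```python
# Toy cap-and-trade model (hypothetical rules and numbers, for illustration only).
# If the issuer re-expands supply whenever public retirement tightens it,
# the public pays ever more to retire the same share of allowances.

def price(supply, demand, base_price=10.0):
    """Simple illustrative price rule: price rises as demand approaches supply."""
    return base_price * demand / supply

supply = 1000.0          # allowances on the market
industry_demand = 900.0  # allowances polluters want each round

spent = []
for year in range(5):
    p = price(supply, industry_demand + 100)  # public also tries to retire 100
    spent.append(100 * p)                     # public's cost to retire 100
    supply += 100                             # issuer expands the cap to meet demand
    industry_demand += 120                    # emissions demand keeps growing

for year, cost in enumerate(spent):
    print(f"year {year}: public pays {cost:.0f} to retire 100 allowances")
```

Under these assumed rules the cost of retirement rises every year, sketching the trap the paragraph above describes: the public buys clean air, and the market responds by selling more of it.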

The difference between Balkin’s Public Smog and TheCompensators* projects lies not only in that one identifies itself as an art project and the other as an environmental project, but in the underlying metaphors that support the work. By using the metaphor of a public park, Balkin’s work allows viewers and participants to look at the system of emissions trading through a familiar lens. Most viewers, understanding the difference between public and private property in the context of land and the implications of privatizing public land, gain a greater understanding of the implications of an emissions trading system through this metaphor.

The message of Public Smog becomes even clearer when seen in relation to one of Balkin's earlier works, This is the Public Domain. In 2003, Balkin purchased a 2.5-acre parcel of land in Tehachapi, California, intended as a permanent, international commons, free to everyone and held in perpetuity. Since there is no precedent for creating such a commons in the current legal system, Balkin's project involves a complicated legal process that explores solutions in both real property law and copyright law. Through this process, Balkin questions the foundations of the existing laws.

Although Balkin identifies her projects as art projects (in the case of This is the Public Domain, this classification is essential to her legal approach, which treats the land as a creative work of art), there is a broader movement of people who move fluidly between the roles of artist, environmentalist and activist. Ben Engebreth's work Personal Kyoto offers individuals in various US cities the chance to personally comply with the Kyoto Protocol. Engebreth himself doesn't identify purely as an artist or call the project an art project, and the project operates outside of the art market on the internet. Personal Kyoto analyzes electric usage information and calculates an energy reduction goal modeled on what the Kyoto Protocol requires. It allows individuals to monitor their electric use with the goal of reducing their personal greenhouse gas emissions.
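A reduction goal of this kind can be sketched in a few lines. The 7% figure below is borrowed from the reduction the Kyoto Protocol would have required of the US; the function names and the baseline figure are assumptions for illustration, not Personal Kyoto's actual calculation.

```python
# Hypothetical sketch of a Kyoto-style personal reduction goal.
# The Kyoto Protocol would have required the US to cut emissions to
# 7% below baseline levels; here that rate is applied to electric use.

KYOTO_REDUCTION = 0.07  # illustrative rate borrowed from the US Kyoto target

def monthly_target(baseline_kwh, reduction=KYOTO_REDUCTION):
    """Return a monthly kWh goal: baseline use minus the reduction rate."""
    return baseline_kwh * (1 - reduction)

def on_track(usage_kwh, baseline_kwh):
    """True if this month's usage meets or beats the reduction goal."""
    return usage_kwh <= monthly_target(baseline_kwh)

baseline = 600.0  # hypothetical average monthly use, in kWh
print(round(monthly_target(baseline), 1))  # 558.0
print(on_track(550.0, baseline))           # True
print(on_track(590.0, baseline))           # False
```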

Personal Kyoto and other projects that focus on personal responsibility embrace the philosophy of 'voluntary' emissions reduction, like the Chicago Climate Exchange but on an individual scale. Does this focus on personal and corporate choice and responsibility detract from the urgent need to curb emissions now? Is the American individualist ideal, the ideal behind the problematic proposal the US recently made to the EU promoting voluntary emissions reduction, going to promote the changes needed? Voluntary emissions reductions may be a great idea on an individual level, but will they work on a corporate scale?

The questions raised here do not represent a criticism of those artworks—they should be praised for bringing up such complex issues. The paradoxical problems that are expressed through them are a function of the system in which the works exist: either within the art world, with a gallery economy based on the buying and selling of works, or within the public art world, in which works are supported by government or private interests, including those works which operate in the semi-public forums of the market or internet. In the context of climate change, the works bring up larger questions about the potential of art in a time of global environmental crisis, and more specifically the potential of art and science collaboration.

Airlight is the name given to a visible white smog caused by the illumination of fine dust particles in the air. The term is often used in Los Angeles, where car exhaust fumes create airlight, what author Lawrence Weschler describes as 'a billion tiny suns.' [32]

The Airlight series first began as Airlight Taipei in summer 2006. Summer in Taipei is unbearably hot and humid, forcing residents to stay in air-conditioned buildings most of the day. The city is crowded, with over six million people in the greater Taipei area. Although public transportation is excellent, several elevated highways cut through the city, like contrails cutting through the dense air. Taipei's geography works against its air quality: the city is located at the base of a bowl, surrounded on all sides by small mountains with only one small outlet for the stagnant air, which often remains trapped for days. In addition, Taipei is located downwind of Southern China, where the energy demands of recent modernization have meant the development of more coal-burning power plants. Wind flow from west to east shifts a large amount of the pollution from China's coal industry into Taipei's air. The effects of poor air quality are visible in the faces of its citizens—or rather, over the faces of citizens, as dust masks have become a fashion item, color-coordinated with clothing and motorbike helmets.

During a residency at the Taipei Artist Village, I had the great fortune to meet and collaborate with Dr. Chung-Ming Liu, Director of the Global Change Research Center and Professor of the Department of Atmospheric Sciences at National Taiwan University. For our project, Dr. Liu gathered and formatted real-time Taipei air quality data for almost 20 sites around the city and uploaded the results onto a website. This allowed me to automatically download statistics gathered hourly on particle pollution, ozone, and other pollutants in the atmosphere and convert this information in real-time into a changing rhythmic visual and sonic landscape, translating the ‘noise’ of the pollutants into a kind of rhythmic ‘noise’ that expressed what Dr. Liu called the ‘daily variation’ of air quality in the city. The traffic engineering office of Taipei city offers a large number of public traffic cams, so I was able to synchronize the sound of the air quality with live traffic webcam images. I used the pollutant levels to make the images break apart, appearing and disappearing with rising and lowering pollutant levels.
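The data flow described above can be sketched roughly as follows. The URL, the feed format, and the normalization range are all assumptions for illustration; the actual feed Dr. Liu prepared is not documented here.

```python
# Rough sketch of the Airlight data pipeline (all names hypothetical):
# poll an hourly air quality feed, then map pollutant readings onto
# a 0..1 range suitable for driving sound and image parameters.

import json
from urllib.request import urlopen

FEED_URL = "http://example.org/taipei_air_hourly.json"  # placeholder URL

def fetch_hourly(url=FEED_URL):
    """Download the latest hourly readings as a list of site records."""
    with urlopen(url) as response:
        return json.load(response)

def normalize(value, low, high):
    """Clamp and scale a pollutant reading onto 0..1."""
    return max(0.0, min(1.0, (value - low) / (high - low)))

# Example: an ozone reading of 90 ppb on an assumed 0-120 ppb scale
intensity = normalize(90.0, 0.0, 120.0)
print(intensity)  # 0.75
```

The normalized intensity is the single value that the sound and image mappings consume each hour.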

The idea of ‘noise’ was central to the structure of both the sound and image in this work. I used a source sound that had a wide frequency spectrum—in other words, a very ‘noisy’ sound. I then used the levels of pollutants to amplify and filter frequencies in this noise, picking out certain frequencies to represent levels of pollutants and creating an effect of high-pitched screeches like automobile brakes when pollutant levels were high. Despite this noisy structure the resulting sound would loop, changing only slightly each hour with a new value. This repetitive structure created a rhythmic, ambient sound that functioned very much like background noise. The imagery was also structured around the idea of noise. The original image was an unaltered traffic cam image that would pixellate based on the levels of pollutants in the air. This had the effect of a rhythmic blurring and focusing of the image, in time with the sound. This rhythmic blurring and focusing lent a sense of quivering or breathing, giving the image a kind of life. In discussing ephemeral and process-based art, Steven Connor says that ‘in much recent art, air has become the marker, not of the difference between art and life, but of the aspiration of art to trespass beyond its assigned precincts, to approach and merge into the condition of “life”.’ [33] In the Airlight series, I have attempted to give a kind of ‘life’ to the air quality data being collected, creating an alarming scream and image blur that increases in intensity as the levels of pollutants increase. [34]
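The two mappings can be sketched as follows. All parameters (the boosted band, the gain, the block sizes) are assumptions for illustration, not the actual Airlight code: pollutant level scales a high band of a broadband noise spectrum, and sets the block size used to pixellate the webcam image.

```python
import numpy as np

# Sketch of the two Airlight mappings (parameters are assumptions):
# 1. boost a high-frequency band of a noise spectrum by pollutant level;
# 2. pixellate an image with a block size that grows with pollutant level.

def filter_noise(noise, level, band=slice(800, 1024)):
    """Boost a high band of the noise spectrum in proportion to pollution."""
    spectrum = np.fft.rfft(noise)
    spectrum[band] *= 1.0 + 10.0 * level  # louder 'screech' as level rises
    return np.fft.irfft(spectrum, len(noise))

def pixellate(image, level, max_block=16):
    """Average square blocks; block size grows with the pollutant level."""
    block = max(1, int(level * max_block))
    h, w = image.shape
    out = image.copy()
    for y in range(0, h, block):
        for x in range(0, w, block):
            out[y:y+block, x:x+block] = image[y:y+block, x:x+block].mean()
    return out

rng = np.random.default_rng(0)
noise = rng.standard_normal(2048)        # broadband 'noisy' source sound
frame = rng.random((64, 64))             # stand-in for a traffic cam frame
loud = filter_noise(noise, level=0.9)    # high pollution: screeching band
blurred = pixellate(frame, level=0.9)    # high pollution: coarse blocks
```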

Since Airlight Taipei was first shown, I have been able to create similar projects in two other locations: Southern California and Boulder, Colorado. The first project was Airlight Socal, for which I was able to work with live webcams from the California Department of Transportation (DOT) and hourly updated levels of O3 and NO2 for various locations in Southern California, provided by the South Coast Air Quality Management District (AQMD) with the help of Kevin Durkee. [35] For Airlight Boulder, I integrated hourly air quality data provided by the AQI and VSI Air Quality Reporting Systems of the Colorado Department of Public Health and Environment with webcam images from the Colorado DOT.

Air quality data is most often presented to the public through pre-formatted webpages that include charts, graphs and color-coded alerts. For the Airlight series, access to the raw data is necessary, and each organization I worked with was able to provide this raw data to me on an individual basis. Through the Boulder project, I came in contact with the AIRNow Data Management Center, which collects and distributes air quality data from over 2000 sites in the US. They were very interested in the work I was doing, and in ways in which the data being collected can be communicated to the public more effectively. I communicated extensively with AIRNow staff about the kind of data formatting I needed for the project, and they used the Airlight project as a model for implementing a way to provide raw data access to artists and developers around the world through a web interface. Presently, developers need to contact AIRNow for the specific URLs to access the raw data, but hopefully a more open platform will be created in the future as the public realizes the importance of air quality issues.

To me, this paper raises more questions than it answers. Weather and climate research has a certain amount of unavoidable uncertainty. This uncertainty has been used to discredit science and exploited to support various political agendas. Art often presents a personal interpretation of information. How does the personal expression element of art intersect with the uncertainties of weather and climate science? Can this intersection help or hinder public understanding of science? What is the responsibility of the artist to present 'correct' science in art/science collaborations? With an element of uncertainty, how is 'correct' science to be presented?

Currently in the US, science is highly politicized, particularly the science of climate change. How do these politics affect art and science collaborations dealing with climate change? Can art help science out of the political quagmire?

Because artworks operate within various economies, whether they are gallery works or works of public art, how does this positioning in the marketplace affect the message and effectiveness of the work? In particular, how does the marketplace affect the interpretation of works addressing the urgent issue of climate change? Must artworks always be seen in the context of some market? Is this beneficial for addressing issues related to air quality and climate change? What other mechanisms exist or can be created for art and art/science collaborations so that artwork can be experienced without a market bias?

[1] Tim Flannery (2005) The Weather Makers. New York: Grove Press, p. 5
[2] Flannery, p. 30
[3] Flannery, p. 28
[4] The US Environmental Protection Agency, October 11, 1997 <>
[5] Flannery, p. 70
[6] Flannery, p. 32
[7] Flannery, p. 5
[8] Flannery, p. 24
[9] Flannery, p. 29
[10] USINFO interview, August 24, 2007 <>
[11] Science @ Berkeley Lab, April 2007 <>
[12] Larry Lohmann (2006) 'Carbon Trading: A Critical Conversation on Climate Change, Privatisation and Power' Development Dialogue no. 48, p. 19
[13] The Joint Economic Committee Study, US Congress, July 1997 <>
[14] 'Stratospheric ozone depletion 10 years after Montreal' Natural Science, October 8, 1997 <>
[15] 'Corruption stalls government attempts to curb CFC trade' China Development Brief, December 15, 2005 <>
[16] The Federal Communications Commission <>
[17] 'Noise Raises Heart Attack Risk' CBS News, November 28, 2005 <>
[18] Juliet Eilperin 'Bush Aims for Market Approach to Fishing' The Washington Post, September 20, 2005, p. A21 <>
[19] The Joint Economic Committee Study, US Congress, July 1997 <>
[20] Deborah Solomon 'Climate Change's Great Divide' The Wall Street Journal, September 12, 2007
[21] Nicole Gelinas 'A Carbon Tax Would Be Cleaner' The Wall Street Journal, August 23, 2007
[22] Flannery, p. 225
[23] Lohmann, p. 54
[24] Jean Rimbach 'Oxygen Bar in Atlantic City, N.J., Feeds Fresh Air to Tourists' Free Republic, December 31, 2003 <>
[25] Mainichi Daily News, May 14, 2006
[26] Alexander Alberro and Sabeth Buchmann (Eds.) (2006) Art After Conceptual Art. Cambridge: MIT Press
[27] Yves Klein (2004) Air Architecture, edited by Peter Noever and Francois Perrin. Los Angeles: MAK Center for Art and Architecture
[28] Paul Wood 'Movements in Modern Art: Conceptual Art' pp. 35-6
[29] Lucy Lippard and John Chandler (1968) 'The Dematerialization of Art.' Art International, February, p. 31
[30] Proposal materials, Laurie Palmer (2005) 'Hays Woods / Oxygen Bar' project, Carnegie Mellon University
[31] TheCompensators* <> October 16, 2007
[32] Lawrence Weschler (2004) Vermeer in Bosnia. New York: Pantheon
[33] Steven Connor (2007) 'Next to Nothing: The Arts of Air' talk given at Art Basel
[34] See <>

Main Resources

Tim Flannery (2005) The Weather Makers. New York: Grove Press
Larry Lohmann (2006) 'Carbon Trading: A Critical Conversation on Climate Change, Privatisation and Power' Development Dialogue no. 48
The Intergovernmental Panel on Climate Change, Climate Change 2007: The Physical Science Basis, Summary for Policymakers, February 2007
The Intergovernmental Panel on Climate Change, Working Group II Contribution to the Fourth Assessment Report: Climate Change 2007: Climate Change Impacts, Adaptation and Vulnerability, Summary for Policymakers, unedited, 2007
'The Great Emissions Rights Giveaway.' Feasta, The New Economics Foundation, March 2006

Additional Sources

Michael Pollan (2007) The Omnivore's Dilemma. New York: Penguin
Al Gore (2006) Earth in the Balance. New York: Rodale Books
Julie Sze (2007) Noxious New York: The Racial Politics of Urban Health and Environmental Justice. Cambridge: MIT Press
James Kanter 'Banks Urging U.S. to Adopt the Trading of Emissions' The New York Times, September 26, 2007
Alfred Russel Wallace (1903) Man's Place in the Universe: A Study of the Results of Scientific Research in Relation to the Unity or Plurality of Worlds. London: McClure, Phillips & Co.
Marc L. Fischer, William J. Riley, and Shaheen Tonse, 'Development of an Implementation Plan for Carbon Monitoring in California,' prepared for the California Energy Commission Public Interest Energy Research (PIER) Program
Lucy Lippard (1973) Six Years: The Dematerialization of the Art Object from 1966 to 1972. New York: Praeger
Alex Steffen, ed. (2006) World Changing: A User's Guide for the 21st Century. New York: Abrams
