Everything You Need to Know About Your Car’s Carbon Footprint
So what exactly is a carbon footprint? In the simplest terms, a carbon footprint is the total amount of greenhouse gases produced by a given human activity within a given time frame, usually measured in tons of carbon dioxide (CO2) released into the atmosphere. Your home has a carbon footprint if you use oil, coal or gas to heat it. That cheeseburger at your favorite restaurant has a carbon footprint: raising the beef, growing the wheat and operating the restaurant all involve greenhouse gas emissions. Your car has a carbon footprint too, determined by its fuel consumption and how far you drive it.
Transportation accounts for 31% of annual carbon dioxide emissions in the United States. With total U.S. emissions around 5 million kilotons a year, that means transportation is responsible for roughly 1,550,000 kilotons of CO2 annually. By comparison, electricity generation accounts for 37% of total emissions, industry for 15%, and residential and commercial use for 10% — less than a third of transportation's share.
Why does a carbon footprint matter? Carbon dioxide emissions are offset by natural features of the environment, like forests, which remove CO2 from the atmosphere. However, large amounts of CO2 emissions outstrip the ability of these natural sinks to absorb it. Changes in the amount of CO2 in the atmosphere change the amount of heat retained by the earth's atmosphere (the "greenhouse effect"), which in turn can drive changes in the global climate.
Can the car you drive, the condition the car is in, or even choosing to drive it at all make a difference in carbon footprint? The answer is yes. Every gallon of gas burned releases about 20 pounds of CO2 into the atmosphere (closer to 24 pounds once fuel production and delivery are counted). Taking the bus, walking, or riding your bike instead could save hundreds of pounds of carbon dioxide every month. The average American releases nearly 20 tons of CO2 into the atmosphere every year. Cutting even a fraction of that could make a huge difference.
If you do drive, however, fuel efficiency is the biggest indicator of your car’s carbon footprint. The more miles per gallon your car gets, the less carbon dioxide it will pump into the air. In general, this means that smaller, more fuel-efficient cars will have a much smaller carbon footprint than a large SUV.
The true carbon footprint of a vehicle can be a little more complicated than that, however. Common wisdom suggests that newer cars generally have a smaller carbon footprint and are easier on the environment. This is certainly true for hybrids and electric cars, which have been gaining in popularity. But studies have shown that the manufacturing of a car accounts for about 30% of its carbon footprint over its lifespan.
This means that driving an older used car might actually have less of an impact on the environment than buying new. And though hybrids and electric cars provide an appealing alternative to traditional transportation, neither is perfect. Hybrid batteries in particular carry a significant environmental cost to manufacture. Electric cars might seem emission-free on the surface, but that's only true if the electricity used to recharge them comes from renewable sources.
Carbon emissions have a demonstrable effect on the environment. More CO2 in the atmosphere means the atmosphere retains more heat. This, in turn, leads to warmer land and sea temperatures, which in turn have led to melting glaciers and the global rise of sea levels. Consequences of climate change include heat waves, flooding, and extreme weather, as well as an impact on the availability of fresh water.
How does this relate to you and your car? It all comes back to the average carbon footprint. Worldwide, the average per capita carbon footprint is about 6 tons per year. In the U.S. the average is around 20 tons per year, and roughly a third of that comes from automobile emissions — meaning your car alone can put out more than 6 tons of CO2 a year, as much as the entire footprint of the average person worldwide.
A carbon footprint comes down to choices. Your carbon footprint is a rough measure of how many tons of carbon your choices (whether related to your car, your diet, or your environment) put into the atmosphere. Factors will include:
The number of people in your household
What kind of home you live in
The efficiency of your lighting, heating and cooling
How much meat and organic food you have in your diet
Your recycling and waste management habits
What kind of car you own and how much you drive.
There are many free online tools that make estimating your carbon footprint easy.
While there are many tools for factoring your car into your overall carbon footprint, the basic equation stays the same: the average car produces about 20 pounds of CO2 for every gallon of gas burned. To estimate your car's footprint yourself, determine your gas mileage and figure out how many gallons of gas you burn in a typical day, week or year; multiply by 20 and you have your car's basic carbon footprint in pounds. Rail, bus and airline travel have their own carbon footprints, which can range from smaller to much larger.
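The estimate described above is easy to script. Here is a minimal sketch in Python, using the article's roughly-20-pounds-per-gallon figure; the mileage and annual-distance inputs are illustrative assumptions, not prescribed values.

```python
# A minimal sketch of the footprint estimate described above, assuming the
# article's figure of roughly 20 lb of CO2 per gallon of gasoline burned.

LBS_CO2_PER_GALLON = 20.0   # tailpipe CO2 per gallon of gasoline
LBS_PER_US_TON = 2000.0

def annual_car_footprint_tons(miles_per_year: float, mpg: float) -> float:
    """Estimated CO2 emissions, in U.S. tons per year, from fuel burned."""
    gallons = miles_per_year / mpg
    return gallons * LBS_CO2_PER_GALLON / LBS_PER_US_TON

# A fairly typical U.S. driver: ~12,000 miles a year in a 25-mpg car.
print(annual_car_footprint_tons(12_000, 25))  # 4.8 tons/year
```

Doubling your fuel economy halves the result, which is why the article calls fuel efficiency the biggest indicator of a car's footprint.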
Getting your first look at your carbon footprint can be confusing and demoralizing. The good news is, there are many ways you can reduce the carbon footprint of your car, and most of them are pretty easy.
For example: driving the speed limit, maintaining a steady speed, and accelerating and decelerating gradually can all save gas and thus reduce your carbon footprint. Keeping your car well maintained also helps: replace your oil and air filters, check the tire pressure and keep your tires properly inflated. Consider switching to a low-mileage car insurance plan — if you drive less and take public transportation (or your bike) more often, your carbon footprint will naturally go down!
Finally, if you are in the market for a new car, consider your options. A new fuel-efficient car might be a good choice, but a used fuel-efficient model might be even better. Hybrids and electrics, though each has its own challenges, provide eco-friendly alternatives to gas-burning vehicles. When selecting a new car, choose the one with the lowest proven carbon footprint.
Caring for the environment is everyone’s responsibility, but with a little planning and some careful choices, it doesn’t have to be a job.
Yes, we have seen some encouraging developments: a promised reduction in the use of antibiotics by Tyson Foods and McDonald’s; a marginal wage increase by McDonald’s for a small portion of its worst-paid workers; a reduction in the use of artificial colors by Nestlé, Kraft and others; the elimination of aspartame in some diet drinks by Pepsi (to be replaced by different artificial sweeteners, of course); a more sweeping (and credible) announcement on additives by Panera; and Chipotle’s claim to have all but eliminated foods produced using genetic engineering.
Wage increases and reduced antibiotics are welcome developments; the rest of this barely registers in its significance. Replacing aspartame with sucralose or high fructose corn syrup with sugar is rearranging the deck chairs. (The Panera move seems exceptional, but Panera is a new-wave company and has shown some principles from the start. They’re not reacting to pressure from the food movement but from their enlightened chief executive, Ron Shaich.)
This isn’t a case of perfect being the enemy of good enough, but one of not getting carried away by what amounts to a little greenwashing. We need to change that.
I recently had occasion to review some of the columns I’ve written over the last four-plus years in order to collect them for my new book, “A Bone to Pick.” The first one, from 2011 — “A Food Manifesto for the Future” — is a good yardstick to measure progress over that period. In it, I made a number of suggestions with which I think it’s safe to say most food activists would agree. Among these were:
• end subsidies to processed food
• break up the Department of Agriculture and empower the Food and Drug Administration
• outlaw concentrated animal feeding operations
• encourage and subsidize home cooking
• tax the marketing and sale of unhealthful foods
I would create a similar list now, and would add “remove the routine use of antibiotics from food production,” “radically improve and expand the school lunch program” and “ensure that there is land for people who want to grow real food on it,” but it wouldn’t have mattered much. I have written about those issues repeatedly, and we’ve seen minimal movement on any of these fronts.
What have we seen? Some movement toward labeling foods produced with genetically engineered seeds, which — if it were to lead to greater transparency — would be a good thing. But this is not a burning issue; better would be labeling that addresses antibiotics, pesticides and the treatment of workers and animals. We’ve also seen a strong push, via the 2010 Healthy, Hunger-Free Kids Act, to improve school lunches — food-wise, the most vital act of the (Michelle) Obama administration. And, as mentioned before, there have been some promises on the part of a couple of corporations to reduce antibiotic use.
But the real action has been local-level success in increasing the minimum wage. And this is for obvious reasons: One, it’s not strictly a food issue, though the largest number of low-wage workers in the country is in the food system. Two, real organizers are involved. And three, the food workers and their organizers have formed alliances with traditional unions, which — if only barely — remember how to fight.
That’s what the rest of us need to learn: How to use basic organizing skills and how to fight. We need to prioritize one or more issues, we need to unite on those issues, and we need to gather others to apply pressure on politicians at every level and directly on corporations when possible.
The Sierra Club’s “Beyond Coal” campaign gives us a model: a clear goal (close all coal plants by 2030), nearly 200 organizers (and nearly two dozen lawyers), real funding and real results: New coal plant construction has been halted and 188 plants have been closed or scheduled to close since the beginning of the campaign in 2010.
The still-forming food movement must narrow its focus to a few winnable issues. A short list from which to pick might include restricting the use of antibiotics, which is winnable with the right president; eliminating concentrated animal feeding operations, or CAFOs, which are comparable to coal mines and might be winnable with a strong legal strategy and a series of local fights (imagine the excitement if even a single CAFO were shut down); curbing the marketing of junk food to children (around which a number of groups are strategizing, but so far without a strong presence); and fair treatment of workers. (There are other possibilities, of course; I’m not the decider.)
Even before there’s an organization willing to do Sierra Club-like work on one or more of these issues, there is something everyone who cares about food can do, starting right now: Push political candidates running for every single office to take a stand on food issues like these. Eighteen months from now we’re all electing House members, a third of us will choose senators, and there is, as you might have heard, a presidential race.
To my knowledge, and with the exception of the wage fights and Bernie Sanders, no presidential candidate has spoken about any of the above issues. Even Bernie Sanders, by far the most principled and thoughtful of the lot, may not be aware of the importance of these. (Sanders has come out in favor of G.M.O. labeling, however.) Yet he’s a natural ally, and could (and should) be pushed to bring them into any potential debates with Hillary Clinton.
Of course, few of us are going to have much access to Bernie or Hillary. We can, however, reach the people running for Congress from our districts, and it’s time to start asking them questions like these: Where do you stand on getting the routine use of antibiotics out of our food supply? On polluting our land, water and air, on using precious resources to raise tortured animals? On making sure my kids grow up eating decent food? And so on.
I’ll believe there’s a food movement when Hillary Clinton and Jeb Bush are forced to talk directly about food issues. I’ll believe we’re effective when I see the routine use of antibiotics outlawed and when that first CAFO closes. I’ll know we’ve started to win when anyone who wants to farm real food has land on which to do it, when there are high-quality school lunches that are free for all, when we’ve started talking about providing that same quality dinner to anyone who needs it. Until then, we have a lot of work to do.
Anger Translation: Why Obama was Ranting against GOP Climate Policy at Press Dinner
The Comedy Central show Key & Peele made famous a bit in which Keegan-Michael Key plays "Luther," President Obama's "anger translator," playing on the president's laid-back approach to controversial issues. But at the annual White House Correspondents' Dinner, Obama outdid Luther himself, ranting against climate denialism while the irascible Luther, taken aback, told the president he needed "counseling."
As with all good comedy, the bit laid out a key truth about politics: The major problem the world faces in avoiding climate disruption is not economic or technological. It is plain old-fashioned greed and mule-headedness.
Geography matters. Texas and Iowa have a lot of wind. Georgia does not. All three have a lot of sun. But the point is that renewables have reached a price point where they make perfect sense in many markets even without subsidies. Indeed, in some places wind is so cheap that the utility might have to be subsidized for producing it.
So what is the problem?
There are 600 coal plants in the US, and they produce a significant share of our carbon emissions. They are already there, in operation; train lines already bring them coal, and the electricity grid has already been built to carry the power they generate to consumers in cities. So the technical cost of generating a kilowatt hour by coal (about 5 cents) is not the most important thing: the facility and infrastructure costs were sunk years ago. For a long time it will be more expensive to build a new wind farm and connect it to the grid than simply to go on burning coal.
In part, the problem is that coal receives massive hidden subsidies. The Environmental Protection Agency has routinely ignored coal plants' violations of the Clean Air Act and the Clean Water Act. The plants have been allowed to spew acid rain and mercury (a nerve poison) and to give people bronchitis and lung cancer. They also cause chemical and coal-ash pollution of our drinking water. Not to mention that they've been allowed to put billions of metric tons of deadly greenhouse gases into the atmosphere, which will boomerang on humanity big time. Some scientists put the real cost of coal energy at 44 cents a kilowatt hour if you figure in all the damage. That the government has let the industry skate on its damage is a form of subsidy. (The 80,000 coal miners will find other occupations. There were nearly that many workers in Blockbuster video stores a decade ago, and streaming video put them all out of work. Installing solar panels on the roofs of American buildings will create many new jobs; there are 120,000 solar energy workers already.)
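The size of that implicit subsidy falls straight out of the two per-kilowatt-hour figures cited above. Here is a quick back-of-the-envelope sketch; the 900 kWh/month household usage is an assumed, roughly typical U.S. value, not a figure from the article.

```python
# Gap between coal's market price and one cited estimate of its full social
# cost, using the per-kWh figures above. The 900 kWh/month household usage
# below is an illustrative assumption.

market_cost = 0.05   # $/kWh, technical cost of coal generation
full_cost = 0.44     # $/kWh, estimated cost including health and climate damage

hidden_subsidy = full_cost - market_cost
print(f"Unpriced damage: ${hidden_subsidy:.2f} per kWh")              # $0.39
print(f"Per household-month (900 kWh): ${hidden_subsidy * 900:.0f}")  # $351
```

On these numbers, the unpriced damage dwarfs the sticker price of the electricity itself, which is the sense in which letting the industry "skate" amounts to a subsidy.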
The same arguments can be made for natural gas plants. That they are less polluting than coal isn’t saying much, since coal is very, very polluting, and so is natural gas.
So the tremendous fall in the price of wind and solar electricity can't enlist market forces to replace deadly fossil fuels by itself. We need public policy. We need electric lines to be built out from where the wind energy is (in Michigan, for example, which has almost no renewable generation even though the state is rich in wind). And we need sin taxes on fossil fuels, just as states have put them on cigarettes, to recognize their fatal human health impacts.
We are in a race. We’ll soon be locked in to an average 3.6 degree Fahrenheit (2 degrees C.) rise in world temperatures. That average includes the oceans, which are cold. So the land average in the temperate zone will be higher. But if we go on spewing carbon dioxide and methane on this scale, we can easily go higher, to a 7 degrees F. average increase, which is really 15 degrees for a lot of cities. At that level of increased heat, like setting off millions of atomic bombs in the atmosphere, we can’t be sure how the weather patterns will change, and they could go chaotic, endangering human life.
So we’ve solved the technological problems already. We’ve solved the economic problems. We haven’t solved the policy problems, and it is because we don’t care enough. Some two-bit thugs in Syria can announce themselves fundamentalists and cut off a few heads, and the US public will suddenly demand that billions of dollars be spent bombing them. But we’re not demanding sin taxes on deadly hydrocarbons, which are already killing thousands of Americans annually and are poised to kill millions, even though there are inexpensive renewable substitutes for them that could be implemented for a per capita cost of a couple of lattes a month for a few years.
And that is why the calm and laid-back Obama went into his comedic rant. He knows all this. He knows where the problem lies. It lies in the hold that Big Oil, Big Coal and Big Gas have on the US Congress. And that hold is increasingly a death grip.
The future in which power company customers transitioning to solar electricity generation can choose to either maintain grid connection or cost-effectively generate off-grid has arrived. The implications of this reality, however, are only just beginning to dawn.
Rising on the horizon of a new solar energy era, concerns about stranded assets are keeping power company stakeholders up at night. Equally disconcerting, low- and fixed-income electricity consumers with no financial ability to upgrade to solar will find themselves unable to handle higher utility bills as electricity prices rise due to increased grid defection.
Projecting the Rising Solar Energy Era
Only with an informed roadmap of the economic implications of America’s rising love affair with solar energy, can we navigate a socially just and economically viable way forward. The Economics of Load Defection, a new report released by Rocky Mountain Institute (RMI), offers this roadmap.
Noting that retail prices for grid electricity are climbing while costs for solar PV and batteries are declining, The Economics of Load Defection projects the coming electricity load and revenue loss that utilities could well face in the coming 10–15 years. Implications for utility companies and regulators are clearly detailed in the new report, as well as possible paths forward.
Solar-plus-battery systems are expected to play a major role in America’s future electricity grid. But exactly what that role will be is not yet clear. Retail pricing structures, utility business models, and regulatory frameworks are all evolving at a steady pace, and outcomes of these evolutionary processes will largely determine which trajectory the grid will follow into the rising solar energy era.
Focusing on Grid Load and Sales Revenue Economics
As the report authors point out, solar-plus-battery systems are increasingly becoming a cost-effective option for property owners. And, as these utility customers determine and adopt economically optimal solar energy configurations, their dependence on the grid for meeting electricity needs decreases significantly, while solar PV rises to supply the majority of their needs at significantly reduced cost. Although this sounds alarmingly like an S.O.S. from a sinking ship, the report authors offer this for reassurance:
“While the presence of such customer choice has important implications, the number of customers who would actually choose to defect is probably small. The far more likely scenario is customer investment in grid-connected solar-plus-battery systems. Since such systems would benefit from grid resources, they could be more optimally sized, thus making them smaller, less expensive, economic for more customers sooner, and adopted faster. More specifically how system configurations and economics would evolve over time, and what magnitude of customers, load, and revenue that could represent, are the focus of this analysis.”
James Mandel, RMI Principal and report author, notes, “These findings should be compelling for customers and technology providers.” Mandel continues, “No matter how expensive retail electricity gets in the future, customers that invest in these grid-connected systems can contain their electricity costs at or below a ‘peak price,’ yielding significant savings on their monthly utility bill.”
“This is not all risk,” explained RMI manager and report coauthor Leia Guccione. “Because these solar-plus-battery systems are grid-connected, they can offer value and services back to the grid. We need not see them only as a threat.”
The Implications of Increased Electricity Load Defection
Nevertheless, exploring the implications of increased load defection, the report points out that “even if only a fraction of customers adopt such systems, utilities could face lost kWh sales from central generation, potentially undermining revenue needed for ongoing grid investment and maintenance. For example, in the Northeast United States, by 2030 maximum residential and commercial load defection could total 140 million MWh and $35 billion per year.”
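Those Northeast figures are internally consistent, and dividing one by the other reveals the average retail rate the projection implies. A quick cross-check, using only numbers cited in this article:

```python
# Cross-checking the report's Northeast 2030 projection: $35 billion/year of
# revenue at risk on 140 million MWh of defected load implies an average
# retail rate of about 25 cents per kWh.

lost_mwh = 140_000_000          # projected maximum annual load defection, MWh
lost_revenue = 35_000_000_000   # projected annual revenue at risk, $

implied_rate = lost_revenue / (lost_mwh * 1_000)  # convert MWh to kWh
print(f"Implied average retail rate: ${implied_rate:.2f}/kWh")  # $0.25/kWh

# The residential and commercial components quoted later in the article
# (~58 and ~83 million MWh) sum to ~141 million MWh, matching the total.
print(58_000_000 + 83_000_000)  # 141000000
```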
In reality, however, the report authors appear to favor utility company adoption of solar-friendly regulatory reform and solar energy grid-connection strategies as a way of slowing the arrival of the inevitable death of the grid. According to the report, even the prevalent “net energy metering” (NEM) strategies and more recently proposed “fixed energy charges,” are not strong enough measures to control ultimate load defection:
Net energy metering is a contentious yet prevalent policy that has successfully supported distributed solar PV’s growth in the U.S. Some argue that it hastens load loss from the grid (net-metered solar PV customers quickly reach effectively zero net grid purchases) and that abolishing net metering will preserve grid load.
Our findings suggest that eliminating net metering merely delays inevitable significant load loss. Grid-connected solar-plus-battery systems will gradually but ultimately cause a near-total load loss even in net metering’s absence. However, fixed charges — which some utilities have recently proposed — don’t ‘fix’ the problem. Similar to our “with” and “without” NEM scenarios, residential fixed charges would likely alter (i.e., delay) the economics for grid-connected solar and solar-plus-battery systems, but likely wouldn’t alter the ultimate load defection outcome. Customers might instead wait until economics and other factors reach a tipping point threshold and more dramatically “jump” from grid dependence to off-grid solar-plus-battery systems that offer better economics for electric service.
Major Findings of The Economics of Load Defection
The major findings reported in The Economics of Load Defection are as follows:
• Solar-plus-Battery Systems Rapidly Become Cost Effective
From the utility customer’s economic perspective, a grid-only system configuration evolves in the near term to grid-plus-solar, and then to grid-plus-solar-plus-batteries in the longer term. The report states, “Grid-connected systems of this analysis become economic for customers much sooner, with substantial utility load loss well within the economic life and cost recovery period for major assets. Smaller solar-only systems are economic today in three of our five geographies, and will be so for all geographies within a decade. New customers will find solar-plus-battery systems configurations most economic in three of our geographies within the next 10–15 years.”
• Solar PV Supplants the Grid Supplying the Majority of Customers’ Electricity
The utility customer initially receives the majority of their electricity supply from the grid. “Over time as retail electricity prices from the grid increase and solar and battery costs decrease, customers logically reduce their grid purchases until the grid takes a backup-only role. Meanwhile, solar-plus-battery systems eventually provide the majority of customers’ electricity. For example, in Westchester County, NY, our analysis shows the grid’s contribution shrinking from 100% today for commercial customers to ~25% by around 2030 to less than 5% by 2050. Inversely, solar PV’s contribution rises significantly to make up the difference.”
• Potentially Large kWh Defection Could Undermine Revenue for Grid Investment Under Current Rate Structure and Business Models
The report authors estimate that the grid requires an approximate investment of $100 billion a year, or $2 trillion between 2010 and 2030. This annual investment is expected to be recovered through electricity sales revenue. They point out, however, that a large impact on system economics can come from a relatively small decline in kWh sales revenue. “Notably, our analysis shows that grid-connected solar-plus-battery systems become economic for large numbers of customers, and those systems have the potential to supply greater and greater portions of customers’ electricity. Assuming customer adoption follows optimal economics, the magnitude of potential kWh defection from the grid is large.”
As an example, the report projects what the maximum possible kWh sales erosion might be in the US Northeast by 2030, only 15 years away:
• Residential: ~58 million MWh annually (50% of utility residential kWh sales), representing 9.6 million customers
• Commercial: ~83 million MWh annually (60% of utility commercial kWh sales), representing 1.9 million customers
Significant Implications vs Emerging Opportunities
While implications from the above findings could be very large, the report also recognizes emerging opportunities. Although grid-connected customers are projected to represent significant electricity load loss, the customers’ grid-connected solar-plus-battery systems “can potentially provide benefits, services, and values back to the grid, especially if those value flows are monetized with new rate structures, business models, and regulatory frameworks.” Crucially, grid-connected customers are projected to maintain their connection to the grid, as long as grid defection isn’t encouraged by punitive charges and/or changes to retail electricity rate structures.
Participants in the electricity system market and other stakeholders are facing profound impacts which, according to the report authors, come with the following considerations:
• For customers that invest in solar PV and solar-plus-battery systems, the emergence of choice is good news.
Report analysis suggests that with smart solar-plus-battery investments, customers could effectively cap their costs at a “peak price,” insulating themselves from rising prices for grid-supplied electricity. Traditional grid-supplied customers, by contrast, would face rising retail prices, and defected (off-grid) customers would need larger, more expensive stand-alone solar-plus-battery systems.
• For owners and operators of central generation and transmission (such as independent power producers and merchant power plants), report findings are likely bad news.
The report analysis predicts that the decline of sales from central generation will accelerate with the adoption of solar-plus-battery systems, and the risk of stranded assets is real. As noted in the report, “Existing assets still within their economic life and cost recovery period will serve a smaller and smaller remaining load, requiring price increases to cover costs and returns. Meanwhile, assets in the planning pipeline won’t see the future demand to justify their capacity and generation output.” Reductions in peak price spikes are also likely in deregulated markets, and solar-plus-battery systems are additionally expected to encroach on markets for ancillary services.
• For distribution grid operators (such as wires-only utilities), the emergence of distributed solar PV and batteries is good news.
Distribution grid customers with solar and battery systems are anticipated to provide value to the grid including upgrade deferrals, congestion relief, and ancillary services. However, the report notes that in order to fully capitalize on these opportunities, new business models, pricing schedules, and regulatory reforms need to evolve.
• For vertically integrated utilities, these systems will strain current business models, and adjustments will be necessary.
New business models are needed to fully capitalize on the rising adoption of solar PV and batteries. The report authors anticipate that similar challenges will be faced by distribution utilities whose revenue depends on volumetric sales of electricity.
Stranded Assets vs Integrated Grid
Jules Kortenhorst, CEO of Rocky Mountain Institute and Carbon War Room, notes, “Today’s electricity system is at a metaphorical fork in the road. Down one path are pricing structures, business models and regulatory environments that favor eventual grid defection.” The report authors explain that this path results in an upward price spiral and, ultimately, the stranding of grid assets serving a dwindling load.
Down this path, an off-grid solar-plus-battery option becomes appealing to growing numbers of customers, leading to skyrocketing prices for those remaining on the grid. Low- and fixed-income customers in particular will be forced to bear a disproportionate share of the rising retail price of electricity. Resources on both the grid side and the customer side end up overbuilt and underutilized: a classic case of stranded assets, leaving excess capital on both sides of the electric meter.
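The upward price spiral works as a feedback loop: fixed grid costs spread over shrinking sales force rates up, and higher rates push still more customers off-grid. The toy model below illustrates the mechanism only; the cost, sales, defection-rate and elasticity values are all assumptions chosen for demonstration, not figures from the RMI report.

```python
# Toy model of the "upward price spiral": a utility must recover fixed grid
# costs through per-kWh rates, and every rate hike pushes more load off-grid.
# ALL numbers here are illustrative assumptions.

fixed_costs = 1_000_000_000     # $/year of grid costs to recover (assumed)
sales_kwh = 10_000_000_000      # annual kWh sales (assumed)
base_defection = 0.05           # 5% of load leaves each year for solar (assumed)
elasticity = 0.5                # extra fractional load loss per unit rate rise (assumed)

rate = fixed_costs / sales_kwh  # starting rate: $0.10/kWh
for year in range(1, 6):
    sales_kwh *= 1 - base_defection         # baseline defection to solar
    interim_rate = fixed_costs / sales_kwh  # rates must rise to recover costs
    extra_loss = elasticity * (interim_rate / rate - 1)
    sales_kwh *= 1 - extra_loss             # higher rates drive more defection
    rate = fixed_costs / sales_kwh
    print(f"year {year}: {rate * 100:.1f} cents/kWh")
# Rates climb from 10.8 to about 14.8 cents over five years in this sketch.
```

Even with modest assumed defection, the remaining customers face compounding rate increases, which is why the report argues for valuing grid-connected distributed resources instead of pricing them away.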
On the other hand, RMI CEO Kortenhorst explains this possible future, “Down another road, those same factors are appropriately valued as part of a transactive grid with lower system-wide costs and the foundation of a reliable, resilient, affordable and low-carbon grid of the future in which customers are empowered with choice.”
This alternative path favors business models, regulatory reforms, and stable price structures in which, as the report authors suggest, “distributed energy resources [DERs] such as solar PV and batteries — and their inherent benefits and costs — are appropriately valued as part of an integrated grid.” Such an integrated grid offers a future, according to the report, where grid and customer-side resources collaborate with far greater efficiency in the generation and usage of both capital and physical assets.
Optimizing the Future Grid of the Solar Energy Era
These two pathways into the rising solar energy era are not set in stone. There is ample room for innovative thinking, entrepreneurial planning, and socially just strategies. However, the decisions being made today are likely to set us on a course that becomes more and more difficult to correct. The time frame for optimizing the future grid is relatively short, and growing shorter and more urgent for some geographical regions where solar options are already prevalent and appealing.
Kortenhorst summarizes this urgent need for determining the best path forward: “That’s why RMI is focused on new utility business models, regulatory reform in places like New York, and accelerated adoption of rooftop solar and other DERs—so that the grid of the future can provide customers reliable, clean, affordable power for decades to come.”
Founded in 1982, Rocky Mountain Institute is an “independent, nonprofit think-and-do tank.” Engaging with businesses, communities, and institutions, RMI promotes advanced market-based solutions to drive a cost-effective shift from fossil fuels to efficiency and renewables. Their work aims to accelerate and scale replicable solutions that transform global energy use for a clean, prosperous, and secure future.
Aisha Abdelhamid is a native of Long Beach, California, residing in Egypt. Besides writing for SolarLove.org, she is also the Site Director and writer for InspiredEconomist.com, and writes for EdenKeeper.org and Planetsave.com. A retired Computer Engineer with the U.S. Dept. of Defense, her latest work published for the DoD was “Personal Financial Management.” Commissioned by Congress, this award-winning 10-course training set is hosted by NFL Hall of Famer Ronnie Lott, and is mandatory financial training for every branch of the US Military.
A drone built by Agribotix, a Boulder startup, flies over a farm in Weld County, Colo. The drone has a camera that snaps a high-resolution photo every two seconds. From there, Agribotix stitches the images together, helping the farmer see what’s happening in a field. Photo: Luke Runyon/Harvest Public Media/KUNC
Colorado is famous for its beer and its beef. But what about its farm drones?
In the last several years, Boulder and Denver have become hubs for tech startups, and companies in the state’s Front Range are on a tear, patenting new technologies in irrigation, food science and plant genetics. Public scientists are keeping pace, publishing research articles in agricultural science in record numbers.
That’s prompted local economists to make some bold predictions.
“We’re poised, if we play our cards right, both as a state government, as a land grant institution [Colorado State University], as an industry, to become the Silicon Valley for agriculture in the 21st century,” says Greg Graff of Colorado State University.
But at the first Colorado State University Agricultural Innovation Summit, held Mar. 18-20, Governor John Hickenlooper didn’t start by trumpeting the state’s farmers or scientists or entrepreneurs. He started instead by touting the accomplishments of a European country one-sixth the size of Colorado.
“The Netherlands isn’t very big. And they don’t have a whole lot of people,” Hickenlooper said. But, he noted, the Dutch economy has become a powerhouse in growing vegetables, producing dairy products and processing poultry.
What they lack in manpower, they make up for in science and cooperation. Dutch universities pass research on to farmers. Food processing companies have staked headquarters there. Small tech start-ups pop up to solve nagging problems. They do it all as neighbors, in a tightly knit area called the Dutch Food Valley.
“What’s interesting is we’re doing that exact same kind of innovation right here in Colorado,” Hickenlooper said. That’s why Hickenlooper and economists are increasingly talking about Colorado’s potential to become the Silicon Valley of agriculture.
The equation for the growth sounds something like: universities plus entrepreneurs minus regulation multiplied by high quality of life equals innovation. That’s according to The Emergence of an Innovation Cluster in the Agricultural Value Chain along Colorado’s Front Range, a report by Graff published in November.
“To borrow a phrase from real estate, the three most important factors in driving innovation in any industry are: talent, talent and talent. And we have a quality of life here in the Colorado Front Range that attracts and retains world class management and scientific talent,” Graff says.
All that scientific research and talent is concentrated along the northern Front Range, leading to new ideas and new businesses, he says. Colorado’s food and ag industries have been growing two to four times faster than the state’s economy overall, the report notes. The state’s plains may be where the corn is grown and cattle are raised, but Graff said it’s Denver where agriculture is being transformed.
“The urban core is in fact the heart of agricultural innovation in the state of Colorado,” Graff said.
“We’re seeing this industry grow exponentially in Denver,” said the city’s mayor Michael Hancock. “Small businesses are going into incubators and they’re coming out as stronger businesses ready to contribute to the marketplace.”
Denver’s also home to some of the biggest players in food processing, hosting headquarters for the largest maker of mozzarella cheese in the world, Leprino Foods, and the country’s biggest flour milling company, Ardent Mills. Greeley is home to JBS USA, the North American arm of the largest meat packing company in the world. Boulder has become a hub for the production and processing of organic and natural foods with companies like Celestial Seasonings and Justin’s Nut Butter.
Governor Hickenlooper said unlike other sectors in the state, the food industry seems to be stable.
“[Agricultural] innovation is going to create high-paying jobs that are long-lived. It’s not going to be some of this boom and bust stuff that we’ve seen in the past,” Hickenlooper said, in a not-so-subtle dig at the energy industry and their history of boom-and-bust cycles in the state.
All this movement within the state’s agricultural economy came as a bit of surprise to former Larimer County commissioner Kathay Rennels. She’s now with CSU and said no single person or organization can take credit for Colorado’s burgeoning ag innovation hub.
“We have a research corridor here that grew organically,” Rennels said. “It grew by itself and it probably grew because nobody saw it, so they couldn’t screw it up.”
But screwing it up is still a possibility. The same report that identified the ag innovation cluster said it’ll take a concerted effort to nurture the fledgling sector, and that Colorado’s movement to corner the market on ag innovation likely won’t be realized for more than a decade.
Luke Runyon reports from Colorado for KUNC and Harvest Public Media, a public radio reporting collaboration that focuses on agriculture and food production issues.
High tunnels can bring benefits to farmers and schools
by Pete Huff, originally published by Institute for Agriculture and Trade Policy
High tunnels—also known as hoop houses or passive solar greenhouses—are an increasingly common feature on farms throughout the Upper Midwest, where they provide a valuable extension to the region’s short growing season. Local food markets—including farm to school—stand to benefit from the year-round availability of fruits and vegetables that wider high tunnel use makes possible. IATP’s new report, Extending the Growing Season: High Tunnels Use and Farm to School in the Upper Midwest, explores this relationship further. By looking at best practices in high tunnel use and farm to school activities, the report identifies innovative approaches with the potential to link the two practices more effectively. These ideas drive its recommendations for more comprehensive support both for increased on-farm implementation of high tunnels and for farm to school activities throughout the Upper Midwest.

The release of this report is timely, as critical federal funding and resources for the expansion of high tunnel use in the Upper Midwest and the nation are at risk. President Obama’s budget for fiscal year 2016, currently being considered by the House and Senate Budget Committees in their budget resolution processes, requests a cut of $373 million from the United States Department of Agriculture (USDA) Environmental Quality Incentives Program (EQIP). Among other things, this program helps fund the popular Seasonal High Tunnel Initiative, which provides financial and technical support for farmers interested in implementing or expanding high tunnels on their farms. High tunnels are low-cost and flexible tools that, when integrated and managed successfully, give farmers greater control over growing conditions and create an opportunity to lengthen the growing season for specialty crops.
The Obama Administration’s proposed cut could result in a slowdown on the expansion of high tunnel use on Upper Midwestern farms, which would reduce the amount of early and late season fruits and vegetables available to supply burgeoning local food markets, including farm to school. The growing season of the Upper Midwest is naturally limited – typically lasting from mid-May to early October – and, as the U.S. Environmental Protection Agency notes, increasingly volatile due to climate change, potentially offsetting any gains attributed to anthropogenic warming. Such climatic realities pose challenges for farm to school activities throughout the region. While such activities are increasingly popular options for successfully enhancing student, farmer and community well-being, building alignment between the growing season and the school year is a perennial challenge that is exacerbated by the new climate reality for farmers in Minnesota, Wisconsin and Iowa. The increased use of high tunnels by fruit and vegetable farms offers an opportunity to create better alignment between local farmers and their neighboring schools.
Often, stringent budgetary limitations for K-12 food services can put fruits and vegetables produced in high tunnels out of reach of the cafeteria tray. Such early and late season production is typically geared toward higher margin direct sale markets, such as farmers’ markets and restaurants, which maximize the return for the farmer and the return on investment for the high tunnel. While there are instances where seasonal produce is purchased from local farmers by K-12 schools, the primary benefit of high tunnel use is more indirect. Research on the productivity gains provided by high tunnels and the price premium that season-extended produce commands in high-margin direct markets indicates that increased high tunnel use can increase farmer incomes and will, in turn, encourage increased participation in farm to school markets as farms seek to diversify their secondary markets or uphold their commitment to the social values that farm to school activities yield.
Snug Haven Farm in Wisconsin is an excellent example of this in action. While not subsidized by EQIP’s high tunnel program, the farm’s CSA for premium winter spinach allows subscribing members to pay a little extra in order to help subsidize the farm’s supply of the same spinach to Farm to School Snack programs. The success of the farm also allows it to continuously support farm to school activities in other ways, upholding the broader values of the farm. The work of Snug Haven Farm demonstrates how balancing high-margin markets with lower-margin markets can result in healthy, locally produced food showing up in classrooms while ensuring the financial success of the farmers.
Since the inception of the EQIP Seasonal High Tunnel Initiative, the number of EQIP-funded high tunnels in the Upper Midwest has increased dramatically, with Minnesota, Wisconsin and Iowa each averaging more newly constructed tunnels than the national 2010-2013 average. USDA data shows that of the 10,273 high tunnels funded nationally in 2010-2013, just over 12 percent were in these states. Despite this success, the high tunnel funding “eggs” remain primarily in the federal budget “basket.” This puts the future of high tunnel support at risk from cuts such as those proposed by the Obama administration. It is critical that dedicated EQIP funding for high tunnels be preserved in the federal budget.
At the same time, we need to encourage diverse options for farmers to gain access to resources to offset the startup costs of constructing high tunnels on the state and local level. The Hoop Houses for Health program run by the Michigan Farmers’ Market Association (MIFMA) provides an excellent example for encouraging high tunnel use amongst farmers while also encouraging participation in farmers’ markets and farm to school markets. Participating farmers are able to repay their high tunnel construction loans by providing free produce to qualifying low-income individuals via farmers’ markets and farm to school programs. In doing so, the program promotes these markets in low-income communities while simultaneously increasing access to fresh food and reducing the financial burden of high tunnel uptake for farmers. It is a model that should be replicated throughout the country – particularly in states with limited growing seasons, such as those in the Upper Midwest.
Health Professionals Call: Ban Fracking for Five Years
Donna Ann Ward, co-founder of CoWatchingOil LA, overlooks the Murphy Oil fracking site in residential Los Angeles. She and many other Angelenos believe that fracking wells in the city are responsible for severe public health impacts. Photo: Sarah Craig / Faces of Fracking via Flickr (CC BY-NC-ND).
The arguments against fracking on public health and ecological grounds are overwhelming. There are clear grounds for adopting the precautionary principle and prohibiting fracking.
Medact, the UK-based public health group concerned with the social and ecological determinants of health, have published their long-awaited report on the impacts of fracking upon public health.
This last study is notable in that, despite its relatively early date, its screening exercise determined from the available evidence on environmental effects that many aspects of shale gas were “high risk”.
And yet, when Public Health England reviewed AEA Technology’s report for the UK Government they concluded that the risks were likely to be low. This dismissal of risk by Public Health England, in favour of the Government’s belief in ‘gold-plated regulation’, is the starting point for Medact’s review.
This view was outlined in their letter to the British Medical Journal, signed by the report’s authors and 18 other UK public health professionals, regarding the need for precautionary action to prohibit ‘fracking’ rather than regulate it:
“Fracking is an inherently risky activity that produces hazardous levels of air and water pollution that can have adverse impacts on health. The heavy traffic, noise and odour that accompanies fracking, as well as the socially disruptive effects of temporary ‘boomtowns’ and the spoilage of the natural environment are additional health hazards …
“The arguments against fracking on public health and ecological grounds are overwhelming. There are clear grounds for adopting the precautionary principle and prohibiting fracking.”
Government turning a blind eye to fracking waste and toxins
In assessing the evidence on the impacts of shale gas extraction, Medact found that there were far more numerous and complex effects upon the environment and public health than the Government’s reviews have been willing (or politically constrained) to acknowledge.
While there has been discussion about earthquakes and water pollution, there has been little consideration given to the trace contaminants of shale gas – and how these affect public health.
Nor is there any realistic assessment of the waste management implications of shale gas development – and how the large volumes of toxic solids, liquids and gases generated by the process will be safely dealt with.
Given their significance, Medact’s report includes a supplementary paper specifically on the toxic contaminants associated with the process.
The practical problem, and the implicit strength of the Government’s call for ‘regulation’ in response to criticism is that it defers the need to produce reasoned solutions today – allowing the policy to proceed unhindered.
This approach, as the report outlines, stores up a whole range of uncertainties over health and environmental impacts which, at present, have no quantifiable answers.
The ‘known unknowns’ of fracking
The Medact report poses three key ‘known unknowns’ in relation to the Government’s policy on ‘fracking’:
Firstly, we have incomplete knowledge, caused by a lack of information about fracking operations in the US and elsewhere which has hampered attempts to conduct evidence-based public health studies;
Secondly, fracking is a relatively new activity and, as acknowledged by The Council of Canadian Academies, evidence about the longer term cumulative impacts of fracking are generally not yet available and difficult to predict reliably; and
Thirdly, human exposure to the risks and hazards associated with ‘fracking’ will vary from site to site, depending on a host of geological, social, demographic, agricultural and economic factors – meaning that health impacts will be unevenly and often unfairly distributed between and within local communities.
The report goes on to state: “For these reasons, although one can state categorically that fracking poses threats to human health, the precise level of risk cannot be known with certainty.
“Assessing the level of risk requires careful judgement based on the available evidence and an appropriate attitude towards the precautionary principle, whilst considering contextual factors and the potential benefits of fracking.”
In addition to Medact’s own report, they commissioned an additional study produced jointly by the UCL Energy Institute, Warwick Business School and UK Energy Research Centre. This sets out ten evidence-based conditions which unconventional gas production needs to satisfy in order to demonstrate its viability and sustainability.
As this paper states in relation to this list of conditions, “most if not all of them are not [met] at present”. The paper concludes:
“Given the current incomplete state of knowledge about shale gas and its potential role in a low-carbon transition, we suggest that policy makers should take as their basis for energy policy that there will be no shale gas produced domestically and plan their gas security strategy accordingly.”
A five-year public health moratorium
In the conclusions to their report, Medact highlight the inconsistencies between the reviews carried out by various medical and scientific organisations from different countries, and the paucity of evidence underpinning the UK Government’s policy conclusions.
They also note the growing number of state or national governments who have concluded, on the basis of the presently available evidence, that the risks and harms associated with ‘fracking’ outweigh the potential benefits.
Given the state of our knowledge of ‘fracking’ today, Medact consider it prudent and responsible to call for a five year moratorium on all activities related to shale gas development. During this time public health agencies should review all new published research, and carry out a debate on the uncertainties which are identified.
In the present media tit-for-tat of claim and counter-claim, Medact’s report is a positive contribution to the evidence-based debate over ‘fracking’ in Britain. I am sure that those campaigning against the process will find much in it that will inform and improve their work.
The positive point is that the industry-funded Task Force on Shale Gas (TFSG) has just begun its own review of the health and environmental impacts of unconventional gas extraction. Medact’s report, given its scope and source material, sets a high bar for the TFSG to reach if their report is to be considered an honest and reasoned review of current evidence.
The Government’s policy on ‘fracking’ is not based upon a reasoned consideration of evidence. It is an ideologically-driven policy – based upon a mistaken pursuit of economic growth at all costs, and which supports fossil fuels irrespective of their consequences for human health, climate change and wider environmental sustainability.
Medact’s report adds to the growing indictment of the UK’s current energy policy, and its implications for our future well-being.
Paul Mobbs is an independent environmental consultant, investigator, author and lecturer, and maintains the Free Range Activism Website (FRAW).
On March 24, the Texas House of Representatives’ Energy Resources Committee passed a bill that would rescind the fracking ban in Denton and other efforts by local Texas municipalities to protect themselves from the oil and gas industry. Once language in the bill is finalized, which could happen today, the legislation will make its way to the full Texas Senate for a vote.
“The oil and gas industry are getting what they always wanted – to get these pesky cities out of the way. They’re utilizing the lack of diligence and gullibility of state government – who are bought and paid for by industry, by using the Denton fracking ban to get what they want,” Denton Councilman Kevin Roden told DeSmogBlog.
“It is a political cliché to take advantage of a good crisis. And the fracking ban gave them a good crisis.” Roden said.
Instead of fighting the ban in the courts, industry made a preemptive move to eliminate local ordinances altogether by pushing representatives to pass laws against ordinances in their way.
On March 23, hundreds turned up to speak out against State Rep. Drew Darby‘s (R – San Angelo) proposed House Bill 40 at a hearing in Austin that lasted more than eight hours. A vote was not taken on HB40 then. However, the next day, the Texas Senate Natural Resources & Economic Development Committee voted unanimously to approve SB 1165, a similar bill that would assert the state’s preemptive right to regulate oil and gas development.
Senate Bill 1165 is pretty much the same as HB40, according to Kathy McMullen, head of the Denton Drilling Awareness Group. “It is a common tactic to submit two bills that are nearly identical in hopes one of them goes through, and that is what happened,” McMullen told DeSmogBlog. “The day following the marathon hearing, citizens and local politicians had to go home, but industry stayed and got what it wanted,” McMullen said.
If passed and ultimately signed, the bill would make it Texas state policy to “fully and effectively exploit oil and gas resources,” and limit local restrictions to whatever industry considers “commercially reasonable.” Local governments will no longer be able to “enforce an ordinance or other measure, or an amendment or revision of an existing ordinance or other measure that bans, limits, or otherwise regulates an oil and gas operation within its boundaries or extraterritorial jurisdiction.”
“This bill would shift the standard to the operator’s interests by requiring validity to be determined by a test of what is commercially reasonable. So, the question shifts from, ‘Is this reasonable in terms of protecting the community?’ to ‘Is this reasonable in terms of allowing operators to fully exploit minerals?’” Adam Briggle, co-founder of the Denton Drilling Awareness Group, told DeSmogBlog.
“For over a decade, more than 300 cities have come up with their own ordinances to do things how they see fit, a right the Texas constitution gives them,” McMullen said. “Now they are all on the chopping block, since the bill gives industry and the state the power to decide what is commercially reasonable.”
The bill would be retroactive, making it impossible to enforce all the ordinances created in the last decade in more than 300 cities, according to the Texas Municipal League.
If the Senate passes the bill, “ten years of work I have done has gone down the drain,” Sharon Wilson, Earthworks’ Gulf Regional Organizer and an outspoken anti-fracking activist, told DeSmogBlog. She began helping local groups create ordinances that help keep their cities livable after she moved away from an area with heavy drilling herself.
Lance Irwin, a founding member of the Mansfield Group, said this new bill would undo all the work they just did to get the little bit of an improvement they got. “If the state is going to come in and undo everything the municipalities have been fighting for, this is going to be a war,” he told DeSmogBlog.
Mansfield’s city council left its setback ordinance at 600 feet, the same distance set in Fort Worth and Arlington, two cities in the Barnett Shale where fracking took hold at the start of the boom. However, there are exceptions to the 600-foot setback rule in certain cases, including situations where the well existed before the ordinance was enacted.
The ordinances in Fort Worth and Arlington were praised by committee members at the hearing. Fort Worth “has done this right. They have done this consistently,” Rep. Darby said. “A lot of people say we should adopt [Fort Worth’s] ordinance, and say that is the best practices. You are to be congratulated for that.”
The Committee asked Fort Worth representatives to help them rewrite the bill to mirror that city’s ordinances.
“New peer-reviewed research shows the serious health effects of living close to drilling and fracking. Our ordinance needs to be strengthened in light of this new information. We need much bigger setbacks, better continuous monitoring of emissions for starters, and tough penalties for violators. We also now have a few years of lived experience and anecdotal information from residents who live close to fracking, and we need to pay attention to it. Our ordinance cannot become the state’s model for regulation of oil and gas. It has failed us,” Bhandari said.
Kyev Tatum, a pastor and civil rights activist from Fort Worth, doesn’t think his city is a model to go by either. “Fort Worth emissions are horrible,” he told DeSmogBlog. “You cannot allow 3,000 natural gas wells to be drilled inside an inner city area and not expect it to have an environmental, economical, and physical impact. Fracking is causing more sickness. We have the highest asthma rate and the highest infant mortality rate in the state,” Tatum said.
Tatum took part in the march celebrating the 50-year anniversary of the civil rights movement in Selma.
“This is the same mess, different address. Different time, same tactics. We took a giant step forward in the 1960s, but now, 50 years later, we have taken a giant step back,” Tatum said.
The hypocrisy of what is happening in Austin struck many, including Denton Councilman Kevin Roden. “The whole tone is very anti-city. They are against the plastic bag ordinance, the tree ordinance, anti-smoking, texting while driving ordinances, too. Nationally, it is a conservative principle to fight for local control, yet this Republican-led coalition is doing the opposite,” Roden said.
“If you can’t convince Texas citizens this whole energy revolution on the back of fracking is a great thing, that is a problem for the whole country,” Roden pointed out. “It is unclear how it will play out,” he added. “Amendments could be brought to the floor or a middle ground could be found, but I don’t feel too optimistic, based on the rhetoric I heard from the committees up to this point.”
Denton’s attorneys will be making suggestions, along with the mayor, as well as attorneys representing cities across Texas.
Pastor Tatum hopes those in Denton don’t feel defeated. “Now is the time to discourage the governor and the senators from signing the bill, not to give up. And if it doesn’t work, it will be time to challenge the law,” he said.
“We have to call our representatives and senators, and tell them not to vote for the bills,” McMullen said. “If this bill works in Texas, other states will try this against their constituents who want stronger gas ordinances, like Colorado and Wyoming.”
Mailie Bush, a Denton resident, mother of two, and member of the Denton Drilling Awareness Group, was offended that Austin thinks it knows more than the local municipalities do.
“They are trying to take away control from us to make our own decisions. They don’t know what it is like to live this close to oil and gas development. Maybe if they spent some time here they would see why we needed to pass the ban,” she told DeSmogBlog.
“It is discouraging to see everything you worked for to protect your family going down the tubes. However, it also motivates; this cannot stand. It energizes me and makes me ask: what can I do now? What do I have to do to get them to listen to me?” Bush said.
Wilson, who began her testimony against the bill in Austin by stating that she wished she had worn her waders to the hearing, scarcely believing what she was listening to, said:
“I don’t know what exactly our next steps are going to be, but I know we are going to have a whole lot more people taking those steps with us. And most of them are going to be Republicans.”
Roden expressed similar sentiments:
“I don’t think Austin knows how much it has made the problem worse. There is going to be a huge political backlash. The more extreme they get the more awakened the average citizen will get and see this is crazy.”