Wednesday, December 28, 2011

The Postal Service Is Fighting for Its Life and Should Be Saved.


Mail carrier Mike Gillis delivers mail Tuesday, Dec. 6, 2011, in Montpelier, Vt. First-class U.S. mail will slow even more by next spring under plans by the cash-strapped U.S. Postal Service to eliminate more than 250 processing centers. (Toby Talbot / AP Photo)

 

Blogger's note:
Chris Hayes, of MSNBC's "Up", says that if the Postal Service were in the Fortune 500, it would be #29!!



By John Avlon



The U.S. Postal Service, crippled by rising costs and stifling regulations, loses $5 billion a year, but it can and should be saved.

A crowded post office is part of the scenery of the season--long lines, arms full of packaged presents, spare Christmas decorations hanging under the humming lights.

But the post office as we know it is in peril, bleeding $5 billion a year in losses. The culprit isn’t just the rise of the Internet as an alternative to envelopes and stamps. It’s a series of stifling regulations imposed on the organization by Congress, such as requiring the U.S. Postal Service to fully fund all its future pension obligations outright--a measure that would bankrupt any city, state, or business.

It is a solvable problem, but unless our dysfunctional divided Congress takes action in the New Year, post office closures and reduced hours will be the least of our worries. Without reform, America faces the very real possibility that the USPS will not be able to make its payroll in the summer of 2012.


In some ways the U.S. Postal Service is an exemplary civic institution, fulfilling its foundational mandate “to bind the Nation together through the personal, educational, literary, and business correspondence of the people.” And perhaps contrary to the impulse of reflexive government cutters, it does not currently cost the U.S. taxpayer a dime. Part of the genius of the place is that it is self-funded by the price of stamps and services provided, recording year-end profits and losses. Imagine if more government worked along those logical lines.

But it’s not a business. “If we were a private sector company, we would have already declared bankruptcy,” says USPS press spokeswoman Susan McGowan.


The amount of mail circulated has been in decline since 2006, despite America’s rising population. The USPS has taken proactive steps to cut costs, reducing its workforce by more than 120,000 employees over the past four years and slashing operating expenses by more than $12 billion--all while reaching 150 million American households and businesses each day and generating almost $1 trillion in economic activity. But the long-term trend is not its friend, and Postmaster General Patrick Donahoe has been fighting for greater flexibility to meet the challenge of change.
“You know that phrase ‘speed kills’? Well, the lack of speed will kill the Postal Service,” Donahoe said in a November 21 speech at the National Press Club. “That’s the stark choice: a more flexible business model that allows us to control costs quickly, or very large losses that will ultimately burden the taxpayer.”

To that end, Donahoe has been pushing a plan to consolidate 260 processing centers while lobbying Congress to change its absurd pension funding requirements. Another option would be to reduce the current six-day delivery schedule to five days a week. Donahoe says the route to fiscal stability requires cutting $20 billion in annual costs by 2015.

Shutting large numbers of local post offices would seem to be the least desirable route, especially from a citizens’ perspective. These still serve as touchstones for communities, a place where neighbors gather and in many cases the most direct daily contact they have with their federal government. Changing the funding formula for pensions is the most logical and painless solution. Some union representatives might object, but the stark alternative is a looming GM-style bailout for the U.S. Postal Service.

This is a scenario that the USPS leadership seems determined to avoid. “Congress can increase the Postal Service's borrowing limit, but that is not something we are seeking because it does not offer a long-term solution to our financial crisis,” says Susan McGowan. This is an unusually responsible approach by a federal bureaucracy.

The good news is that there is a bipartisan bill--co-sponsored by centrist Senators Joe Lieberman, Tom Carper, Scott Brown, and Susan Collins--that would modernize the U.S. Postal Service and deliver many of the changes that the postmaster general is requesting in terms of cutting operating costs and creating the flexibility to increase revenue. Known as the 21st Century Postal Service Act, it was proposed in November and received the requisite hearings before the Homeland Security and Government Affairs Committee. But the bill still needs to be given a green light by Senate Majority Leader Harry Reid to go to the Senate floor for a vote.

This should be filed under “No Brainer.” Only senators completely in the thrall of unions or those who might cynically see political benefit in the symbolism of the U.S. Postal Service unable to meet payroll in the heat of a presidential election summer would oppose it. But the pervasive political brinksmanship on Capitol Hill has displayed a repeated ability to screw up a two-car parade. Failure to act could do what rain, sleet, and snow have failed to do--stop the delivery of the mail.

Let’s hope this urgent and worthy effort to save the USPS from insolvency is exempted from the usual polarized paralysis. The citizen reaction to an avoidable post-office crisis would be swift and merciless, pushing congressional approval ratings to new lows. The alternative--reasoning together in a spirit of long-term fiscal responsibility--could restore some faith in Congress while strengthening the core, constitutionally mandated civic institution that is the United States Postal Service.
~~~~~~~~~~~~

Friday, December 23, 2011

The 13 Best Political Films of 2011.

AlterNet / By Julianne Escobedo Shepherd

 

Looking back at the movies that moved us most.
 
  We Were Here
Photo Credit: wewerehere.com
This year was defined by anxiety: the economy roiled, the GOP was increasingly hostile, the government careened towards shutdown more than once. And while these things all still seem to loom, 12 months later, there is a landscape of renewed hope and empowerment. The Arab Spring set off revolutions across the Middle East, which first inspired the Western world to rise up into Occupy Wall Street. Now the ripple effect of people power travels further, as we see the germination of the Russian Winter. Culturally, we’re gearing for a seismic shift: In 2012, expect to see the effects of the year manifested in film, music, and art. But in 2011, we felt the tremors, and a clutch of political films and documentaries both presaged and inspired the increasing awareness and resolve we’ve seen smattering across the globe. You’ll see some of these in the Oscar nomination lineup, but all of them are must-see.


1. Margin Call (dir. JC Chandor)



As Occupy Wall Street was congealing—and the scrutiny surrounding Wall Street's robbery and subsequent bailout was occupying America's consciousness—an intensely disquieting thriller called Margin Call was released. Set over the course of 24 hours inside an ostensibly fictional Wall Street firm in the hot zone that was 2008, it's an intimate look at the decision-making that precipitated the financial crisis, "inspired" by real events, including the ultimate meltdown of mortgage securities. The all-star ensemble cast is collectively brilliant at portraying the nuance of the morality, and lack of it, that these firms displayed—Zachary Quinto's troubled math genius acts as a compass against the supreme evil embodied in the CEO and other top-level employees, portrayed by Jeremy Irons, Simon Baker, and Demi Moore, whose Machiavellian greed leads them to sacrifice not only company employees, but the American people. Though the technical aspects of the financial crisis can sometimes seem arcane, Margin Call threaded together an idea of how it could happen—as interpreted by writer/director JC Chandor, who'd never made a film before this one—and gave us a clearer view into what exactly we were protesting. Stunning. (Currently in theaters.)


2. We Were Here (dir. David Weissman, Bill Weber)





Perhaps this year’s dramatic bigotry against gay works of art dating back or relating to the AIDS crisis seeped into the collective consciousness, because this is one of two important films looking back at AIDS activism in the 1980s and ’90s. (The second, How to Survive a Plague, debuts at Sundance in February.) We Were Here focuses on San Francisco as it began to feel the early effects of what was then called “the gay cancer,” tapping into five people who were there and their profound, unfading memories. As frightening and depressing as the documentary is, the incredible community that mobilized to fight both the disease and the perception of those infected is the true story here. It’s activism at one of its most inspiring moments. (Currently playing in Denver, Tulsa, and various cities in Canada; more dates to come.)


3. Into the Abyss (dir. Werner Herzog)




A Werner Herzog romp is always fun — resplendent with his pithy, absurdist observations and pleasurable deadpan — but this film takes him in a more somber, more serious direction, as he examines the case of Michael Perry, a Texas man on death row for the murder of three people, and what it means when a democratic society enacts an eye for an eye. In the wake of the Troy Davis execution, it’s an important, self-searching look at a society gone haywire, one that questions the nature of humanity with typical Herzog objectivity. He does not judge, only poses questions — it’s philosophically impactful as a result. (Opened wide in November.)


4. The Adjustment Bureau (dir. George Nolfi)



Philip K. Dick left so much work that was dying to be made into film. The Adjustment Bureau, based on his short story, is a perfect example why: when done well, his paranoia (and his prescience) translates thrillingly to the big screen, and in the hands of a great actor like Matt Damon, just gets more electric. It's the tale of an ambitious would-be Senator who's been tracked for the Presidency by a mysterious organization called the Adjustment Bureau, which not only fixes fate for those it works with, but proves to have psychic abilities about them, removing any semblance of free will. Damon's character reluctantly accepts this (mostly out of fear), but once the Adjustment Bureau threatens to come between himself and his potential love (Emily Blunt), he's no longer having it. It's complete romantic sci fi, but through its fantastical lens raises questions about the powerful and often nefarious mechanisms that currently exist in politics, from lobbyists to campaign managers. (Shades of the 2000 election, too.) Plus Damon and Blunt work great together, their chemistry and talent playing off each other. (Currently available for home viewing.)


5. The Ides of March (dir. George Clooney)




Speaking of electoral politics, Matt Damon's partner in progress George Clooney shined in this as both actor and director. Focusing on a hotly contested presidential campaign (the screenwriter, Beau Willimon, worked with Howard Dean in 2004), the "ides" refers to the dirty politics that unseat a deputy campaign manager and sully an entire presidential election. Ryan Gosling, as said disgraced employee, imparts a savvy toughness that alludes to a character in his other breakthrough film this year, Drive, but this one inflicts just as many bruises with virtually no physical violence. (Currently in theaters.)


6. Miss Representation (dir. Jennifer Siebel Newsom)



In this highly critical look at media representations of women that value us solely for our tits and ass and never for our intellect, director Newsom pulls no punches in exposing the most damaging and disgusting comments made toward us through the years. Most importantly, she looks at how young girls are affected by the portrayals, and how their potential is truncated by a sexist landscape — but can be reawakened by positive representation and women writing our own history. Alternately depressing and inspiring! (Aired on Oprah's OWN Network in October, with screenings across the US currently scheduled.)


7. The Black Power Mixtape, 1967–1975 (dir. Göran Olsson)



In the late ’60s, Swedish journalists, fascinated and compelled by the Black Power movement in the United States, began documenting its most famous and outspoken purveyors more diligently than any American media, filming interviews with the likes of Angela Davis, Harry Belafonte, and Stokely Carmichael. Roundly dissed by stateside institutions and propagandist journalists, they nevertheless continued their work for several years, showing the true anguish and dignity that propelled one of the country’s most resolute and beautiful moments — and hopefully reframing it in the American consciousness from a “violent” movement to one that sought to empower until it was sabotaged by the government. (Currently in theaters across America.)


8. Paradise Lost 3: Purgatory (dir. Joe Berlinger, Bruce Sinofsky)



Similarly to Into the Abyss, Paradise Lost 3 takes a long, hard look at the justice system in America, but this is the third in a trilogy about the West Memphis Three, teenagers who spent almost two decades in prison for the murder of three boys despite evidence to the contrary — and who were finally cleared after DNA evidence proved their innocence. A story with a strong following, from normal justice activists to the likes of Johnny Depp and Pearl Jam’s Eddie Vedder, the finale offers a denouement to a broken justice system. (Opened in October; will air on HBO beginning January 12.)


9. Rise of the Planet of the Apes (dir. Rupert Wyatt)



The ever-canny James Franco stars in this sci fi thriller as a well-meaning scientist, but obviously it's the apes who steal his absurdist shine: led by a super-intelligent super-chimp, a cache of experimented-upon apes revolts against humanity, taking to the streets in protest and, you know, smashing cars in the process. An incredibly fun film to watch, and if you let yourself be distracted from insane special effects, you'll get the union subtext and the limits of science, along with some commentary on how, if we can't co-exist naturally with nature, it will make a concerted effort not to have to co-exist with us. (Out now for home viewing.)


10. If a Tree Falls: A Story of the Earth Liberation Front (dir. Marshall Curry)



How urgent is it to avert environmental disaster, and how far would you go to do it? This intense documentary follows the 2005 prosecution of one operating cell of the ELF under what the government termed America’s “No. 1 domestic terrorist threat” — despite the fact that their actions did not physically harm anyone. The ELF’s tactic was to set businesses it deemed environmentally damaging alight, including SUV dealerships and timber companies, which was counteracted by the government’s invocation of corporate personhood. In the interim, riot cops cracked down on protests — violent ones, the kind we’re seeing a lot of today — which are painfully and shockingly depicted here. (Aired on PBS in September; DVDs available.)


11. Addiction Incorporated (dir. Charles Evans Jr)



The true story of Victor DeNoble, who presented nicotine-free tobacco to Philip Morris after learning that the drug led to both heart disease and addiction — and who fought them when they opted for more addictive additives. Standing up against the tobacco industry has been documentary fodder before, but this look at one man’s resolve is inspiring. (Airing now in New York; opening wide in January.)


12. The Loving Story (dir. Nancy Buirski)



In 1958, Richard and Mildred Loving were arrested for marrying interracially, which was illegal in the state of Virginia. Banned from ever visiting their families together, they eventually sued the state with the help of the ACLU (and at the behest of Robert Kennedy). Their suit eventually made it to the Supreme Court, which in 1967 overturned all anti-miscegenation laws, stating they were in violation of the Fourteenth Amendment. This documentary, complete with incredible archival footage of the Lovings, examines the history — and the present, including President Obama — of interracial marriage, but also tells a beautiful tale of love that couldn’t be kept down. (Opened in November; HBO will air it in February.)


13. Battle for Brooklyn (dir. Michael Galinsky, Suki Hawley) Oscar shortlisted



For over six years, residents of downtown Brooklyn battled Bruce Ratner, one of the largest real estate developers in New York, for the heart of the neighborhood. After the state and Mayor Michael Bloomberg, invoking eminent domain, rezoned and seized the area known as Atlantic Yards, Ratner began developing his vision: a huge sports arena for the Brooklyn Nets, several skyscrapers (including mixed-income housing, the “mixed-income” part of which was eventually scrapped), and an area for mass retail, which some fretted would attract national chains and dilute the community reliance that’s been a part of downtown Brooklyn for decades. But most devastatingly, the land on which the new development was proposed already held apartment buildings and other living units. With those buildings set to be razed, their occupants and neighbors, some of whom had lived there for decades, embarked on a battle for their lives and principles. This compelling documentary chronicles the fight. (Upcoming screenings all across America, available on DVD soon.)
~~~~~~~~~~~~~
Julianne Escobedo Shepherd is an associate editor at AlterNet and a Brooklyn-based freelance writer and editor. Formerly the executive editor of The FADER, her work has appeared in VIBE, SPIN, New York Times and various other magazines and websites.

Wednesday, December 21, 2011

The Best, Worst and Most Overlooked Food Stories of 2011.





AlterNet / By Ari LeVaux


Here's a look at what (sometimes baffling) stories Americans found most compelling and several important ones they missed.
Every December for the last nine years, the Hunter PR firm has announced the results of a nationwide survey of Americans' picks for the top ten food news stories of the year. The list says as much about the media that writes the headlines as it does about the people who remember them.

The survey also investigated how Americans respond to the news, and found that 61 percent of those surveyed changed their food habits based on news coverage. Forty-five percent were influenced to cook more at home. Who can blame them?

The FDA Food Safety Modernization Act was signed on January 4, a milestone that took sixth place on the Hunter survey. The bill was in response to contamination events from previous years, but it set the tone for the year to come as well. The year's No. 1 story was the cantaloupe-borne listeria that killed 30 people, while Cargill's 36-million-pound turkey recall took fourth.

The food safety bill has yet to stem the tide of factory farm-borne disease, but it's already created problems for small farmers, who are finding themselves overwhelmed with the so-called Good Agricultural Practices the bill mandates. County and university extension agents are scrambling to set up web pages to help deal with the surge of annoyed farmers trying to follow the new rules.

The most baffling entry on the Hunter list may be a food-safety issue of a different sort: the USDA lowered the internal temperature requirements for commercially served pork from 160 to 145 degrees. Perhaps the masses are anticipating moister pork loin whilst out on the town. I doubt many members of the general public even own a meat thermometer for home cooking. Thus, they've probably been eating undercooked pork at home all along. But nonetheless, something about those 15 degrees really captivated readers.

What does it say about America that medium rare pork is bigger news than the tens of thousands of North Africans who starved this year from a harsh mix of drought and war? But then, most Africans probably wouldn't rank Michelle Obama's MyPlate nutritional guide as their No. 2 news story of the year, either. It's to be expected that people are most focused on what directly affects them.

The only place where North African starvation intersects with the Hunter list is in position #3: record-breaking global food prices. And prices might just go higher. The world's population is growing, the land base isn't, speculation on food commodities is virtually unregulated, we're eating more meat, and severe weather events are wreaking havoc on crops with greater frequency than ever.

Half of Hunter's top ten involved nutritional issues. This can be encouraging and frustrating. It's important to get people thinking about nutrition, and mandatory nutritional labeling of chain restaurant menus (#5), for example, may encourage that. But we still have to apply critical thinking to the numbers, and even understanding the numbers can be derailed by a faulty premise. MyPlate, for example, is smudged with corporate fingerprints, like the dairy industry's recommendation that adult humans should eat or drink cow milk products three times a day. This isn't nutritional guidance so much as political armbending.

Two of the most envelope-pushing nutrition stories on the Hunter list evolved from court cases. In slot #9, General Mills is being sued for marketing sugary fruit leather as health food, when such formulations are in fact recipes for obesity.

In another child obesity story, which took #8 on the list, an Ohio court removed a 200-pound 8-year-old boy from his Cleveland home. The move was justified on the basis of imminent health risk, including diabetes, heart problems, and other forms of early death and disability. Poor nutrition, according to the court, can equal neglect.

And now, here's a rundown of a few important stories that escaped the Hunter survey's radar. Prices fetched by Midwestern agricultural land hit record heights, with choice parts of Iowa breaking $20,000 an acre thanks in part to the market for corn-based ethanol. Today farmers can essentially grow bushels of gasoline in their cornfields. But the writing is on the wall for the industry: political support for corn-based ethanol subsidies is crumbling, and $6 billion in subsidies are in danger of being dropped from next year's farm bill.

Dramas over biotechnology provided no shortage of important headlines this past year. Despite overwhelming opposition from public comments, agency scientists, and even a few pesky court rulings, USDA and FDA only increased their efforts to improve the bottom lines of genetically modified (GM) crop companies. Such agency advocacy included the approval of GM alfalfa and sugar beets, which both have the potential to destroy important sectors of the organic industry.

Agency support for biotech grew even as evidence came to light of the health and environmental hazards of genetically modified crops. Several studies found that consumption of GM corn and soybeans causes significant organ disruptions in rats and mice. And there is so much evidence that Monsanto's rootworm-resistant corn is breeding rootworms resistant to GM corn, you'd think former Monsanto lawyers were writing the USDA's regulations. Which they are.

Recent surveys have shown that more than 90 percent of Americans want labels on their food indicating whether it includes genetically modified ingredients. I wouldn't be surprised if in 2012 this vast majority finally gets its wish. A broad coalition of organizations, led by the Center for Food Safety, has launched Just Label It, a campaign aiming to either convince the FDA to mandate labeling, or convince President Obama to make the agency do it. The campaign has momentum, public support, and an election year on its side.

This year saw the food police empowered by the FDA Food Safety Modernization Act, and they repeatedly clashed with locavores. The Rawesome food-buying club was raided and shut down by federal and Los Angeles County officials for selling raw milk, a crime that has been prosecuted in various ways elsewhere around the nation. And Southern Nevada officials in November shut down a "farm to table" dinner at a Community Supported Agriculture farm for a number of supposed food safety infractions. Regulations designed to address the profit-chasing ways of big food corporations don't currently leave much room for small farmers and consumers to operate. Producers are being strangled by red tape, while the people looking to buy their food can't do so without breaking some law.

This kind of meddling in our mouths won't fly in America. Expect such clashes to continue until food safety laws are modified to allow small-scale, local agriculture to thrive in peace, unmolested by bureaucrats.
~~~~~~
Ari LeVaux writes a syndicated weekly food column, Flash in the Pan.

Wednesday, December 14, 2011

Feds Link Water Contamination to Fracking for the First Time.

The agency's findings could be a turning point in the heated national debate about fracking.



In a first, federal environment officials today scientifically linked underground water pollution with hydraulic fracturing, concluding that contaminants found in central Wyoming were likely caused by the gas drilling process.

The findings by the Environmental Protection Agency come partway through a separate national study by the agency to determine whether fracking presents a risk to water resources.

In the 121-page draft report released today, EPA officials said that the contamination near the town of Pavillion, Wyo., had most likely seeped up from gas wells and contained at least 10 compounds known to be used in frack fluids.

“The presence of synthetic compounds such as glycol ethers … and the assortment of other organic components is explained as the result of direct mixing of hydraulic fracturing fluids with ground water in the Pavillion gas field,” the draft report states. “Alternative explanations were carefully considered.”

The agency’s findings could be a turning point in the heated national debate about whether contamination from fracking is happening, and are likely to shape how the country regulates and develops natural gas resources in the Marcellus Shale and across the Eastern Appalachian states.

Some of the findings in the report also directly contradict longstanding arguments by the drilling industry for why the fracking process is safe: that hydrologic pressure would naturally force fluids down, not up; that deep geologic layers provide a watertight barrier preventing the movement of chemicals towards the surface; and that the problems with the cement and steel barriers around gas wells aren’t connected to fracking.

Environmental advocates greeted today’s report with a sense of vindication and seized the opportunity to argue for stronger federal regulation of fracking.

“No one can accurately say that there is ‘no risk’ where fracking is concerned,” wrote Amy Mall, a senior policy analyst at the Natural Resources Defense Council, on her blog. “This draft report makes obvious that there are many factors at play, any one of which can go wrong. Much stronger rules are needed to ensure that well construction standards are stronger and reduce threats to drinking water.”

A spokesman for EnCana, the gas company that owns the Pavillion wells, did not immediately respond to a request for comment. In an email exchange after the EPA released preliminary water test data two weeks ago, the spokesman, Doug Hock, denied that the company’s actions were to blame for the pollution and suggested it was naturally caused.

“Nothing EPA presented suggests anything has changed since August of last year– the science remains inconclusive in terms of data, impact, and source,” Hock wrote. “It is also important to recognize the importance of hydrology and geology with regard to the sampling results in the Pavillion Field. The field consists of gas-bearing zones in the near subsurface, poor general water quality parameters and discontinuous water-bearing zones.”

The EPA’s findings immediately triggered what is sure to become a heated political debate as members of Congress consider afresh proposals to regulate fracking. After a phone call with EPA chief Lisa Jackson this morning, Sen. James Inhofe, R-Okla., told a Senate panel that he found the agency’s report on the Pavillion-area contamination “offensive.” Inhofe’s office had challenged the EPA’s investigation in Wyoming last year, accusing the agency of bias.

Residents began complaining of fouled water near Pavillion in the mid-1990s, and the problems appeared to get worse around 2004. Several residents complained that their well water turned brown shortly after gas wells were fracked nearby, and, for a time, gas companies operating in the area supplied replacement drinking water to residents.

Beginning in 2008, the EPA took water samples from residents' drinking water wells, finding hydrocarbons and traces of contaminants that seemed like they could be related to fracking. In 2010, another round of sampling confirmed the contamination, and the EPA, along with federal health officials, cautioned residents not to drink their water and to ventilate their homes when they bathed because the methane in the water could cause an explosion.

To confirm their findings, EPA investigators drilled two water monitoring wells to 1,000 feet. The agency released data from these test wells in November that confirmed high levels of carcinogenic chemicals such as benzene, and a chemical compound called 2-butoxyethanol (2-BE), which is known to be used in fracking.

Still, the EPA had not drawn conclusions based on the tests and took pains to separate its groundwater investigation in Wyoming from the national controversy around hydraulic fracturing. Agriculture, drilling, and old pollution from waste pits left by the oil and gas industry were all considered possible causes of the contamination.

In the report released today, the EPA said that pollution from 33 abandoned oil and gas waste pits – which are the subject of a separate cleanup program – is indeed responsible for some degree of shallow groundwater pollution in the area. Those pits may be the source of contamination affecting at least 42 private water wells in Pavillion. But the pits could not be blamed for contamination detected in the water monitoring wells 1,000 feet underground.

That contamination, the agency concluded, had to have been caused by fracking.

The EPA’s findings in Wyoming are specific to the region’s geology; the Pavillion-area gas wells were fracked at shallower depths than many of the wells in the Marcellus shale and elsewhere.

Investigators tested the cement and casing of the gas wells and found what they described as “sporadic bonding” of the cement in areas immediately above where fracking took place. The cement barrier meant to protect the well bore and isolate the chemicals in their intended zone had been weakened and separated from the well, the EPA concluded.

The report also found that hydrologic pressure in the Pavillion area had pushed fluids from deeper geologic layers towards the surface. Those layers were not sufficient to provide a reliable barrier to contaminants moving upward, the report says.

Throughout its investigation in Wyoming, the EPA was hamstrung by a lack of disclosure about exactly what chemicals had been used to frack the wells near Pavillion. EnCana declined to give federal officials a detailed breakdown of every compound used underground. The agency relied instead on more general information supplied by the company to protect workers’ health.

Hock would not say whether EnCana had used 2-BE, one of the first chemicals identified in Pavillion and known to be used in fracking, at its wells in Pavillion. But he was dismissive of its importance in the EPA’s findings. “There was a single detection of 2-BE among all the samples collected in the deep monitoring wells. It was found in one sample by only one of three labs,” he wrote in his reply to ProPublica two weeks ago. “Inconsistency in detection and non-repeatability shouldn't be construed as fact.”

The EPA’s draft report will undergo a public review and peer review process, and is expected to be finalized by spring.

Abrahm Lustgarten is a former staff writer and contributor for Fortune, and has written for Salon, Esquire, the Washington Post and the New York Times.
~~~~~~~~~~~~~
Nicholas Kusnetz has written for The Nation, Miller-McCune, The New York Times and other publications. He is a graduate of UC Berkeley’s Graduate School of Journalism.

Seven Diseases Big Pharma Hopes You Get in 2012.




Supply-driven marketing not only turns the nation into pill-popping hypochondriacs, it distracts from Pharma's drought of real drugs for real medical problems.

It used to be joked that a consultant is someone who borrows your watch to tell you what time it is. These days, the opportunist is Big Pharma, which raises your insurance premiums and taxes while providing you "low-priced" drugs that you paid for.

How did Pharma get a good third of the United States taking antidepressants, statins, and Purple Pills, albeit at low prices? By selling the diseases of depression, high cholesterol, and gastroesophageal reflux disease, or GERD. Supply-driven marketing, also known as "Have Drug — Need Disease and Patients," not only turns the nation into pill-popping hypochondriacs, it distracts from Pharma's drought of real drugs for real medical problems.


Of course, not all diseases are Wall Street pleasers. To be a true blockbuster disease, a condition must (1) really exist but have huge diagnostic "wiggle room" and no clear-cut test, (2) be potentially serious with "silent symptoms" said to "only get worse" if untreated, (3) be "underrecognized" and "underreported" with "barriers" to treatment, (4) explain hitherto vague health problems a patient has had, (5) have a catchy name — ED, ADHD, RLS, Low T or IBS — and instant medical identity, and (6) need an expensive new drug that has no generic equivalent.

Here are some potential blockbuster diseases Pharma hopes you get in 2012.

Adult ADHD
Everyday problems labeled as "depression" sailed Pharma through the last two decades. You weren't sad, mad, scared, confused, remorseful, grieving, or even exploited. You were depressed, and there was a pill for that. But depression peaked just like the Atkins Diet and the Macarena. Luckily, there is adult ADHD (Attention Deficit Hyperactivity Disorder), which has doubled in women 45 to 65 and tripled in men and women 20 to 44, according to the Wall Street Journal.

Like depression, adult ADHD is a catch-all category. "Is It ADHD or Menopause?" asks an article in Additude, a magazine devoted exclusively to ADHD. "ADD and Alzheimer's: Are These Diseases Related?" asks another article in the same magazine.

"I'm Depressed. Could it be ADHD?" says an ad in Psychiatric News, showing a pretty but pouting young woman. In the same publication, another ad titled "Broken Promises" says, "Adults with ADHD were nearly 2x more likely to have been divorced," while exhorting doctors to "screen for ADHD."

Adults with ADHD are often "less responsible, reliable, resourceful, goal-oriented, and self-confident, and they find it difficult to define, set, and pursue meaningful internal goals," says an article cowritten by Harvard child psychiatrist Dr. Joseph Biederman, who is credited with putting "pediatric bipolar disorder" on the map. They "show tendencies to being self-absorbed, intolerant, critical, unhelpful, and opportunistic," and "tend to be inconsiderate of other people's rights or feelings," says the article, describing most people's brothers-in-law.

Adults with ADHD will have trouble keeping a job and get worse without treatment, says WebMD, tapping into the second requirement of a blockbuster disease — symptoms worsen without pills. "Adults with ADHD may have difficulty following directions, remembering information, concentrating, organizing tasks, or completing work within time limits," according to the website, whose original partner was Eli Lilly.

How did Pharma get five million kids and now, maybe, their parents on ADHD meds? Ads on 26- by 20-foot screens in Times Square that ask "Can't focus? Can't sit still? Could you or your child have ADHD?" four times an hour couldn't hurt. (Bet no one had trouble focusing on that!)

Still, convincing adults they aren't sleep deficient or bored but have ADHD is only half the battle. Pharma also has to convince kids who grew up diagnosed as ADHD not to quit their meds, says Mike Cola of Shire (which makes the ADHD drugs Intuniv, Adderall XR, Vyvanse, and the Daytrana patch). "We know that we lose a significant number of patients in the late teen years, early 20s, as they kind of fall out of the system based on the fact that they no longer go to a pediatrician."

A Shire ad in Northwestern University's student paper this year takes the issue head on. "I remember being the kid with ADHD. Truth is, I still have it," says the headline splashed across a photo of Adam Levine, the lead singer of Maroon 5. "It's Your ADHD. Own It," was the tagline. (Was "Stay Sick" the runner-up?)

Of course, pushing speed on college kids (or anyone, for that matter) isn't too hard. Why else do meth dealers say, "First taste free"? But Pharma is so eager to retain its pediatric ADHD market, it has funded for-credit courses for doctors, such as "Identifying, Diagnosing, and Managing ADHD in College Students" and "ADHD in College: Seeking and Receiving Care During the Transition From Child to Adult."

To make sure no one thinks ADHD is a made-up disease, WebMD shows color-enhanced PET scans of the brains of a normal person and an ADHD sufferer (flanked by an ad for Vyvanse). But it is doubtful the scans are really different, says psychiatrist Dr. Phillip Sinaikin, author of Psychiatryland. And even if they are, it proves nothing.

"The crux of the matter is that there is simply no definitive understanding of how neuronal activity is related to subjective consciousness, the age-old unsolved body/mind relationship," Sinaikin told AlterNet. "We have not advanced beyond phrenology, and this article in WebMD is simply the worst kind of manipulation by the drug industry to sell their overpriced products, in this case a desperate effort by Shire to maintain a market share when Adderall goes generic."

Rheumatoid Arthritis
Rheumatoid arthritis is a serious and dangerous disease. But so are Pharma's immune-suppressing biologic drugs like Remicade, Enbrel, and Humira, which are pushed to treat it. While RA attacks the body's own tissues, leading to inflammation of the joints, surrounding tissues, and organs, immune suppressors can invite cancers, lethal infections, and activate TB.

In 2008, the FDA announced that 45 people on Humira, Enbrel, Remicade, and Cimzia died from fungal diseases, and investigated Humira's links to lymphoma, leukemia, and melanoma in children. This year, the FDA warned that the drugs can cause "a rare cancer of white blood cells" in young people, and the Journal of the American Medical Association (JAMA) warned of "potentially fatal Legionella and Listeria infections."

Immune-suppressing drugs are also dangerous to the pocketbook. One injection of Remicade costs up to $2,500; a month's supply of Enbrel costs $1,500; and a year's supply of Humira costs up to $20,000.

Once upon a time, RA was diagnosed from the presence of "rheumatoid factor" and inflammation. But, thanks to Pharma's supply-driven marketing, stiffness and pain are all that are required for the diagnosis today. (Athletes and people born before 1970 — line forms to the left!)

In addition to diagnostic wiggle room and a catchy name, RA has other blockbuster disease requirements. It will "only get worse" if untreated, says WebMD, and it is often "misdiagnosed" and underreported, says Abbott's Heather Mason, because "people often don't know what they have for a while."

So serious a disease, it costs over $20,000 a year to treat but so subtle you may not know you have it? RA sounds like a blockbuster.

Fibromyalgia
Another underreported disease is fibromyalgia, characterized by widespread, unexplained bodily pain. Fibromyalgia is "almost a textbook definition of an unmet medical need," says Ian Read of Pfizer, which makes the first drug approved for fibromyalgia, the seizure pill Lyrica. Pfizer gave nonprofit groups $2.1 million in 2008 to "educate" doctors about fibromyalgia and financed PSAs (pharma service announcements) depicting sufferers describing their symptoms without mentioning a drug. Lyrica now makes $3 billion a year.

Still, Lyrica has to fight Cymbalta, the first antidepressant approved for fibromyalgia. Eli Lilly pre-positioned Cymbalta for the physical "pain" of depression in a campaign called "Depression Hurts" before the fibromyalgia approval. Treatment of a fibromyalgia patient with either Lyrica or Cymbalta hovers around $10,000, say medical journals.

Pharma and Wall Street may be happy with fibromyalgia drugs, but patients aren't. On askapatient.com, the drug-rating website, patients on Cymbalta reported chills, jaw problems, electrical "pings" in their brain, and eye problems. This year, four patients reported the urge to kill themselves, a frequently reported side effect of Cymbalta. Lyrica users on askapatient reported memory loss, confusion, extreme weight gain, hair loss, impaired driving, disorientation, twitching, and worse. Some patients take both drugs.

SLEEP DISORDERS
Middle of the Night Insomnia
Sleep disorders are a goldmine for Pharma because everyone sleeps — or watches TV when they can't. To churn the insomnia market, Pharma rolls out subcategories of insomnia, such as chronic, acute, transient, initial, delayed-onset, terminal, early-morning, menopausal, and the master category of nonrestful sleep. This fall, Pharma rolled out a new version of Ambien for "middle-of-the-night" insomnia called Intermezzo, even though Ambien is paradoxically notorious for middle-of-the-night awakenings: people "waking up" in an Ambien blackout and walking, talking, driving, making phone calls, and eating food.

Many became aware of Ambien's "lights-on-nobody-home" effect when former Rhode Island Representative Patrick Kennedy drove to Capitol Hill to "vote" at 2:45 a.m. in 2006 on Ambien and crashed his Mustang. But it was Ambien's EWI effect — eating while intoxicated — not DWIs that gave the pill its worst rap. Fit and sexy people awoke amid mountains of pizza, Krispy Kreme, and Häagen-Dazs cartons, their contents consumed by their evil twin on Ambien.

Excessive Sleepiness and Shift Work Sleep Disorder
Needless to say, people with insomnia won't be bright-eyed and bushy-tailed the following day — whether they didn't sleep or whether they have sleeping-pill residues in their system. In fact, they are actually suffering from the underrecognized and underreported epidemic of Excessive Daytime Sleepiness. The main medical causes of EDS or ES are sleep apnea and narcolepsy, but last year Pharma rolled out a lifestyle-caused "Shift Work Sleep Disorder." (No, it doesn't mean you can't sleep because your partner "shifts" in his or her sleep.) Ads for Provigil, a Schedule IV stimulant that, along with Nuvigil, treats EDS, show a judge in his black robe, nodding out on the job, with the headline "Struggling to Fight the Fog?" ("Yo! Your Honor! I'm trying to plead!").

Of course wakefulness agents contribute to insomnia, which contributes to wakefulness problems in a kind of perpetual pharmaceutical jet lag. In fact, the sleeping pill/alertness aid habit is so common, it threatens to create a new meaning for "AA" — Adderall and Ambien!

Insomnia That Is Really Depression
Sleep disorders have also given a new lease on life to antidepressants. Doctors now prescribe more antidepressants for insomnia than they do sleeping pills, according to CNN. They also often combine them, since "insomnia and depression often occur together, but which is the cause and which is the symptom is often unclear."

WebMD agrees with doubling the drugs. "Depressed patients with insomnia who were treated with both an antidepressant and a sleep medication fared better than those treated only with antidepressants," it writes. Ka-ching!

In fact, many of the new blockbuster diseases from adult ADHD and RA to fibromyalgia are treated with new drugs piled on top of existing ones that aren't working, a Pharma contrivance called polypharmacy. It brings to mind the store owner who says, "I know that 50 percent of my advertising is wasted — I just don't know which 50 percent."
~~~~~~~~~
Martha Rosenberg frequently writes about the impact of the pharmaceutical, food, and gun industries on public health. Her work has appeared in the Boston Globe, San Francisco Chronicle, Chicago Tribune, and other outlets.

Monday, December 12, 2011

10 Most Horrifically Unhealthy Cereals.


Many popular cereals contain more sugar than a Twinkie or Chips Ahoy cookies.

The following article first appeared in Mother Jones.

As a kid, I once begged my mom for a product called Ice Cream Cones Cereal. That name really tickled Mom, the sheer audacity of it. It wasn't even trying to sound healthy! Needless to say, Ice Cream Cones never made it into our shopping cart. Apparently, it didn't make it into very many other shopping carts either: According to Wikipedia, it lasted for only a few months in 1987.

I'd always kind of thought that the demise of Ice Cream Cones Cereal proved that even stressed-out parents wouldn't go for such an unapologetic nutritional disaster. But boy was I wrong! In perusing a new report on sugar cereals from the Environmental Working Group, I learned about many modern cereals my seven-year-old self would have been clamoring for, including Smorz, Froot Loops Marshmallows, and Cap'n Crunch's OOPS! All Berries.

In case you couldn't tell from their names, those cereals pack in a lot of sugar (or corn syrup, but as I've said before, basically same dif). And they aren't the only ones: EWG found that three of the most popular kids' cereals (Kellogg's Honey Smacks, Post Golden Crisp, and General Mills Wheaties Fuel) contain more sugar per serving by weight than a Twinkie, and 44 others have as much sugar as three Chips Ahoy cookies.

The top ten worst, ranked by percent sugar by weight:
1 Kellogg's Honey Smacks 55.6%
2 Post Golden Crisp 51.9%
3 Kellogg's Froot Loops Marshmallow 48.3%
4 Quaker Oats Cap'n Crunch's OOPS! All Berries 46.9%
5 Quaker Oats Cap'n Crunch Original 44.4%
6 Quaker Oats Oh!s 44.4%
7 Kellogg's Smorz 43.3%
8 Kellogg's Apple Jacks 42.9%
9 Quaker Oats Cap'n Crunch's Crunch Berries 42.3%
10 Kellogg's Froot Loops Original 41.4%
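EWG's "percent sugar by weight" metric is simply the sugar grams on the nutrition label divided by the serving weight. A minimal sketch of that arithmetic (the 15 g sugar / 27 g serving figures below are illustrative label values, not taken from the report):

```python
def percent_sugar_by_weight(sugar_grams, serving_grams):
    """Percent of a cereal serving's weight that is sugar, to one decimal."""
    return round(sugar_grams / serving_grams * 100, 1)

# Hypothetical label: 15 g of sugar in a 27 g serving
print(percent_sugar_by_weight(15, 27))  # 55.6
```

A serving that is more than half sugar by weight is how a cereal ends up at the top of a ranking like the one above.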
EWG points out that the sugar content in these dessert-like cereals is much greater than federal guidelines recommend:
More than three-quarters of children’s cereals do not meet the federal Interagency Working Group’s proposed nutrition guidelines for 2016. Far more meet the industry’s standards for foods nutritious enough to be marketed to children.
Eighty-two percent of General Mills children’s cereals don’t meet the federal guidelines, but only 5 percent fail to meet the industry’s standards. Not surprisingly, General Mills has joined other food, media, and entertainment companies in calling to replace the government proposal with industry’s more lenient guidelines.
But major cereal makers don’t even take their own industry’s targets seriously; one-fourth of children’s cereals contain too much sugar.
So what's a parent to do? In my house growing up, my folks were partial to a rather dreary cereal called Amaranth Flakes. If you prefer your cereal a bit less austere, these major brands are good choices, says EWG:
  • Kellogg's Mini-Wheats:
    Unfrosted Bite-Size,
    Frosted Big Bite,
    Frosted Bite-Size,
    Frosted Little Bite
  • General Mills Cheerios Original
  • General Mills Kix Original
Even cheaper, and hardly any sugar at all: a bowl of oatmeal.

The EWG has more breakfast factoids and suggestions here.
~~~~~~~~~~~~~
Kiera Butler is an associate editor at Mother Jones. For more of her stories, click here.

Thursday, December 8, 2011

5 Ridiculous Myths People Use to Trash Local Food -- And Why They're Wrong.

 


Articles "debunking" the local food movement are stale, shallow and often incorrect.

It's become predictable. At nearly regular intervals, someone, somewhere, will decide it's time to write another article "debunking" the local food movement. The latest installment is by Steve Sexton, posted on the Freakonomics blog (which also treated us to a previous post called "Do We Really Need a Few Billion Locavores?"). And we must not forget the leading anti-locavore, James McWilliams, who gave us "The Locavore Myth" and many other, similar articles.

The arguments are stale, shallow, and often incorrect. But if you enjoy the flavor of organic heirloom tomatoes, fresh-picked from the farm, here's how to read these articles without being consumed by guilt that your love of local food is doing the planet in and starving people in the Global South.


Myth #1: People who eat local eat the same diet as those who don't.
A favorite anti-locavore argument is that eating local does not reduce oil usage or carbon emissions. Now, if locavores were munching on locally produced Big Macs and other highly processed foods as the rest of the mainstream food system does, this argument might be correct. But that's not the case.


James McWilliams likes to use the example of a study on lamb which shows that eating New Zealand lamb in London actually has a smaller carbon footprint than lamb from the U.K. The New Zealand lamb is raised on pasture, and even when you factor in the carbon emissions from shipping, it is still friendlier to the environment than grain-fed factory farmed U.K. lamb. Well, sure. Only no self-respecting London locavore would dream of eating grain-fed, factory farmed lamb. He or she would find a local farmer raising lamb on pasture instead. Now compare the carbon footprint of that to the New Zealand lamb. With similar production methods and a correspondingly similar carbon footprint, the major difference between the two would be the oil required to ship the New Zealand lamb halfway across the world.


Myth #2: The only reason for eating local is reducing 'food miles.'
Often anti-locavore arguments, such as the one above from McWilliams, are predicated on the notion that locavores only eat local to reduce food miles -- the number of miles the food traveled from farm to fork -- and the reason they do that is to reduce carbon emissions. Since modern shipping methods are relatively efficient, it is then easy to prove that it's very efficient to transport a truckload or train car full of fresh peaches from California around the rest of the U.S., compared to the efficiency of driving a relatively small quantity of peaches to and from a farmers' market. No doubt one can come up with numbers showing that, per pound of peach, transporting large quantities of peaches across the country uses less oil than transporting smaller quantities shorter distances.

But that assumes this is the only benefit to eating local, and it isn't. For one thing, who picked the California peaches? Probably migrant labor. How were they treated? How were they paid? Probably poorly. What was sprayed on those peaches? In 2004, more than 100 different pesticides were used on California peaches, including highly toxic ones like methyl bromide, paraquat, chlorpyrifos, and carbaryl. This amounted to a total of 468,804 pounds of pesticides used on peaches in California alone that year. And how about water usage? What is the rationale for growing the majority of the nation's fruit in a state that lacks the water to do so without heavy irrigation?

Then consider your own enjoyment and nutrition. Wouldn't you rather eat a fresh fruit or vegetable that was just picked? And wouldn't it be nicer to eat a variety that was selected for flavor and not for its ability to withstand shipping and storage? These are not merely hedonistic considerations, as nutrients can degrade over time once produce is harvested. What's more, nutrients that were never in the soil to begin with cannot possibly be present in the food. A farm with healthy soil will also produce healthier -- and more flavorful -- food. Your body is wired up to desire flavorful fruits and vegetables because they are better for you. And when you eat out at a restaurant that serves local food, often the chef can work together with local farms so that the farmers plant the specific varieties of fruits and vegetables that the chef wants to serve.

One last reason for eating local is the relationships one forms within one's community, and the economic multiplier effect that occurs when one buys local. This extends beyond just food to other goods as well. When you spend your money locally, it enriches your community. When you buy from a large grocery chain, some of your money goes to pay the clerk who checked you out and the manager who oversees that clerk, but the rest goes to the grocery store's corporate headquarters, to the truckers, the warehouses, and to the farm that grew your food, far away from where you live. What's more, when you buy from the same local farmers each week, you build relationships with those who grew your food. Thus, your weekly food shopping nourishes your soul as well as your body.


Myth #3: Growing food locally is inefficient.
This is the subject of the latest tirade against eating local. The piece sings the praises of "comparative advantage," noting that it makes the most sense to grow Alabama's potatoes in Idaho, where potato yields far exceed yields in Alabama. Alabama should grow whatever it grows best and then it should ship that to Idaho, right?

This depends on your idea of efficiency. Idaho is no doubt growing Russet Burbank potatoes, the kind used in French fries. These are large, high-yielding potatoes, especially when -- as described by Michael Pollan -- they "have been doused with so much pesticide that their leaves wear a dull white chemical bloom and the soil they're rooted in is a lifeless gray powder." In The Botany of Desire, Pollan describes how an Idaho farmer with 3,000 acres grows potatoes (and nothing but potatoes!). He begins with a soil fumigant, then herbicide. Then he plants his potatoes, using an insecticide as he does. Next, another herbicide -- and so on. For "efficiency" he applies these pesticides by adding them to the water in his irrigation system, water that is then returned to its source, a local river. He also has a crop-dusting plane spray the plants every two weeks, and he applies 10 doses of chemical fertilizer. (With all of these chemicals, the farmer told Pollan he won't eat his own potatoes. He grows a special, chemical-free plot of spuds near his house for his own consumption.) Altogether, in a good year, an Idaho potato farmer will spend $1,950 per acre in order to net $2,000 per acre. Efficient?

Perhaps the Alabama potato farmers who are achieving much lower yields than they might in Idaho are using the same business model. If so, they are bad businessmen, as it would require a lot of costly inputs to produce less of a commodity that is sold by the pound, and they would make rather little money for their trouble. But if any of them subscribe to the locavore model of farming and eating, then this will not be the case.

Hopefully, the Alabama farmers forgo the costly inputs, so that the money earned from the potatoes after their harvest is their own. Potatoes, after all, require little soil fertility. In the Andes, where potatoes were first domesticated, a farmer named Juan Cayo grows potatoes at the end of a four-year rotation on his fields near Lake Titicaca. First, he grows fava beans, which infuse the soil with nitrogen. For the next two years, he grows grains (barley and oats), which use up most of the nitrogen in the soil. Last, he grows potatoes, which are happy enough to grow on whatever fertility is left over.

Crop rotation also serves to deal with the nematodes and insects that the Idaho farmer sprayed for. If any pests find the potato crop and begin to breed, they will have a nasty surprise when, the next year, a different crop is sown in that field and they suddenly have no food. Some of them might find where the farmer is now growing this year's potatoes, but it will take them time to get there. For the bugs that do arrive, there are a number of organic strategies. Least efficient, but always an option, is picking the bugs off by hand. Better yet, a farmer can provide habitat for predatory insects or birds that prey on the pest, or spray Bacillus thuringiensis, a bacterium that naturally kills insects.

Weeds can be suppressed by a heavy layer of mulch, removed via tilling, or pulled by hand if necessary. A common strategy is to allow the weed seeds in the field to germinate before planting your crop, then kill the weeds via tilling, and subsequently plant the crop. You might even be able to convince your chickens to kill the weeds for you. Additionally, some weeds in your field are actually a good thing. Not so many that they choke out your crop, of course, but there are edible weeds, weeds that can be used as medicinal plants, and even weeds that, when allowed to grow sparingly in your field, can boost production of your crop. For example, garden guru John Jeavons recommends dandelion, purslane, stinging nettle and lamb's quarters as edible (and nutritious!) weeds that will help your crops, too.

Last, an organic farmer growing for a local market will plant a number of potato varieties, not just one. The reasons for growing several varieties are many. Some varieties may be best for baking, others for mashed potatoes, and still others for frying. One variety might taste the best or yield the most but lack natural resistance to a fungal disease, whereas another variety with mediocre yields resists the disease naturally. Some varieties mature faster than others, allowing the farmer to harvest and sell potatoes all season long. And if all of the potatoes fail, the farmer is also growing a number of other crops. In short, agrobiodiversity -- growing diverse varieties and diverse species -- provides insurance.


Myth #4: We can't feed a growing population on local (organic) food.
This is the biggest whopper of all. The recent Freakonomics article says it as follows:
"From roughly 1940 to 1990, the world's farmers doubled their output to accommodate a doubling of the world population. And they did it on a shrinking base of cropland. Agricultural productivity can continue to grow, but not by turning back the clock. Local foods may have a place in the market. But they should stand on their own, and local food consumers should understand that they aren't necessarily buying something that helps the planet, and it may hurt the poor."
For an American who has never traveled outside the country -- or who has traveled, but not to agricultural areas where both peasant subsistence agriculture and industrial agriculture can be observed -- Sexton's argument might seem logical. But it ignores the vast expanses of land planted with crops entirely unnecessary for feeding the world: cotton, sugarcane, palm oil, soybeans, corn, rubber, tobacco, and fast-growing trees like eucalyptus for paper production, to name a few. No doubt we need some cotton, sugar, corn, and so on. But the amount of land under these crops, which are then used to produce biofuels, processed foods, factory-farmed meat, paper, clothing, and industrial inputs, is immense, wasteful, and largely (although not entirely) unnecessary.

Prior to the European conquest of the Americas, sugar was reserved for the very wealthy. By the height of the Industrial Revolution, sugar made up a significant percent of calories in the British diet. And it is hardly controversial to note that Americans eat an unhealthy amount of sugar in their diets today. Palm oil, which is now found in 50 percent of processed foods and other items like cleaning products in the United States, was once a local, traditional West African food. Today, palm oil production is ravaging the rainforests of Indonesia and Malaysia and expanding to other areas like Papua New Guinea and Latin America.

Both of these crops, as well as corn, soy, and jatropha, are also grown for biofuels, which do not feed people. Neither does paper, which we in the United States use as a cheap renewable resource, unaware of the enormous areas planted in fast-growing trees, often quite disruptive to the surrounding ecosystems, to meet our needs. And grain-fed meat, as pointed out so many years ago by Frances Moore Lappe in Diet for a Small Planet, is a wasteful use of calories compared to feeding grain directly to people. If we care about feeding the world while using fewer resources, switching to pasture-raised meat -- and less of it per person in the developed world -- is a must. Doing so would likely improve our health as a nation at the same time.

There is much more to say in response to Sexton's claims. As productivity doubled during the 20th century, it did so based on the nonrenewable resources of oil and natural gas. These agricultural methods are, thus, not sustainable. That means they cannot be continued indefinitely into the future even without considering an expanding global population. We in the United States and other wealthy countries must find a new way to feed ourselves no matter what. The "gains" of the 20th century are temporary, and defending them by attacking local food will not create the needed oil or freeze the clock on the climate crisis in order to continue growing food like we do now.

To understand how to feed a growing world population, we must travel outside our bubble, into the Global South, to see how people live in the areas of the world where population is growing fastest. Here, where the people writers like Sexton worry about feeding actually live, the economics and efficiency of local, organic food are completely different from what they are in the United States. These people, who live in adobe huts and grow their food with manual or animal labor, saved seeds, few inputs, and often no irrigation, are not endangering our ability to support the global population with the Earth's resources. Those who bathe and do laundry in the local stream, who store dried grains and beans because they lack refrigerators, and who never go shopping as a leisure activity as we do in the United States will not make or break the planet's ability to provide enough food, fuel, and fiber for human needs. It is we in the United States and other developed countries who will do so.


Myth #5: Eating local (organic) food is elitist.
In the United States, where processed food is artificially cheap and where many people eat what they can afford to buy at the expense of their health, local food is a luxury. For those who do not grow their own food, and especially for those who want to eat in restaurants or buy from a grocery store, local and organic food is expensive. But let's reframe the issue. Instead of asking for cheaper (but unhealthy and environmentally destructive) food, let's ask for living wages so that anyone who works full time can support their family and feed them well. Let's ask for an expanded middle class instead of a growing gap between rich and poor.

We must also note that outside of the United States and Europe, this equation is different. For peasants, local, organic food is cheap and low-risk. Going back to the example of potatoes: in the Andes, a farmer might grow 50 varieties of potatoes. Some varieties cannot be eaten directly because they are bitter or spicy, so they are instead freeze-dried using traditional methods and stored for years as a hedge against bad harvests. Likewise, the Andes are home to more than 3,000 varieties of quinoa. A few animals are kept to eat what humans cannot and to serve as a sort of insurance -- a literal "piggy bank." When income is needed, the farmer can sell a pig or a cow. Farmers grow different varieties and different crops at different altitudes: llamas (for meat) and alpacas (for fiber) in the highest areas, then potatoes, quinoa, and other tubers and grains a little lower, then corn, and citrus and vegetables at lower altitudes. They have done this since pre-Incan times. Farmers from the lowlands trade crops with farmers from the highlands.

This sort of agriculture is not unique to the Andes (although the particular crops and the use of different altitudes are). Using these methods, farmers can avoid going into debt and can protect their families against bad years. With luck, if one variety of a crop fails, another does not. Rare is the year when every single crop fails -- and should that happen, farmers have stores of preserved food from years past and can even subsist on weeds and wild plants and animals. This does not conform to the capitalist model of maximizing yield and profits, but it serves as a low-risk strategy to prevent hunger without exhausting the soil or other local resources. For the billions of peasants in the world, it is purchased, processed foods that are elitist, not local, organic foods.

Therefore, the next time you read a column telling you that your love of fresh, flavorful, healthy local foods is elitist, inefficient, or contributing to world hunger, feel free to shred that article and put it in your compost pile. Then continue enjoying your delicious Green Zebra and Brandywine tomatoes with a little extra virgin olive oil, homegrown basil, and sea salt, without the slightest bit of guilt.
~~~~~~~~~~~~
Jill Richardson is the founder of the blog La Vida Locavore and a member of the Organic Consumers Association policy advisory board. She is the author of Recipe for America: Why Our Food System Is Broken and What We Can Do to Fix It.