Someone Was Wrong On The Internet




Tuesday 21 February 2017

How to Prevent Tar Sand Mining by San Francisco Bay Area Particulate Emissions Regulations

Today’s rant concerns an article in the magazine The Nation, whose web version can be found online.

The article has the title “This Bay Area Proposal Would Strike a Huge Blow to the Dirtiest Forms of Oil Production,” by one Will Parrish, dated 31 January 2017.

The cynic in me is having a field day in pointing out that it is just too easy to cherry pick the misstatements and lack of fact checking made by journalists writing about environmental matters. The problem here is that I find that my inner cynic has a good point.

Now it could be argued that it’s not really fair or even productive to cherry pick and attack the misstatements and lousy fact checking in such articles, because nitpicking isn’t the point of articles reporting on environmental matters that concern all of us. Yes, I concede there is value in good environmental reporting that gives us information on things that can be a danger to public health and the commonweal. My counter is that it’s a real shame there is hardly any good environmental reporting out there.

Let me put it to you in another way (opinion alert!). Regardless of perceived value by people who are interested in articles on any given subject, facts still matter and they should always matter. Even cherry picked mistakes in factual content matter because if enough facts and misstatements exist in a piece of journalism, then it calls into question the quality and credibility of the article and the publication that allowed it into print through its failure to properly vet its content. Having made my position and opinion clear, let us proceed with picking facts and statements like ripe choke cherries before making pie.

The article is about a proposed new emission standard in California. The author shows his bias right out of the gate by making clear that the desirable purpose of this new emission standard is to prevent California refineries from processing heavier grades of crude like dilbit from tar sands - all for the purpose of fighting increasing global emissions of greenhouse gases:

Just hours after President Trump announced his intention to resume construction of the Keystone XL and Dakota Access pipelines, Brown declared, “The science is clear,” and said there is much California can and will do on its own to combat the climate crisis. A coalition of climate-justice advocates and labor groups in the Bay Area have a proposal that they say is a prime example of how California can do this. 

The people and organizations in this coalition

“are pushing to make the San Francisco Bay Area the first place in the world to place limits on oil refineries’ overall greenhouse-gas (GHG) and particulate-matter emissions. The proposal would prevent oil corporations from making the Bay Area a center of tar-sands refining by enforcing a cap based on historic emissions levels.”

This statement is somewhat amazing to me since the single best way to stop the production of fuels from tar sands is to get the Canadians to stop mining the stuff. The second best way to keep tar sands dilbit out of California is to build the Keystone XL pipeline - that would send all the tar sand dilbit to refineries on the Gulf Coast. Building the Dakota Access pipeline isn’t an issue as far as dilbit is concerned since its purpose is the transportation of Williston Basin light crude. The third best way to keep dilbit out of California is to outlaw the rail transport of crudes through urban areas or to tax it out of existence. Using an emissions regulation to keep dilbit out of California is kinda like using a wrench to light a campfire. Opinion Alert: I really feel that the object of such actions is not to serve the stated goal, whether that’s keeping dilbit and other heavy crudes out of California, preserving Lakota sacred spaces, or protecting the water supply of the Standing Rock Indian Reservation; I believe the real aim of such actions is to attack evil Big Oil out of frustration over individuals' lack of control over environmental issues in the face of a perceived uncaring plutocracy...but that’s a blog post for some other day. Let’s get back to misstatement hunting.

“The idea for a cap on oil refinery emissions was born from an incident some 15 miles and a world away from San Francisco’s financial district. In 2012, an explosion and fire at Chevron’s massive refinery complex in Richmond—an industrial East Bay city predominantly composed of low-income and working-class people of color—endangered 19 workers and sent 15,000 neighbors to local hospitals with respiratory ailments. Within months, a coalition of environmental-justice, environmental, and labor groups had organized to oppose the oil companies’ push to refine cheaper, dirtier crudes. “

Wow! There’s so much here to pick on! The aim of opposing Big Oil’s “push” to refine cheaper, dirtier crudes came into being because of the 2012 Richmond Refinery fire? Let’s start with Big Oil’s “push to refine cheaper, dirtier crudes.”

This is chemical engineering 101, folks: heavier crudes are always more expensive to refine than light crudes. There is less profit in refining heavy crudes, including dilbit, always and everywhere. No one in their right mind, even an evil brain-sucking Big Oil executive who rapes and pillages the earth and steals from widows and orphans, would prefer heavy crudes over light ones. The lighter and sweeter the crude, the more profit there is to be had, the cleaner the process, and more importantly, the smaller and simpler the refinery can be, meaning safer and less-regulated operation. The author of this rather astounding statement would do well to take a crash course from a neutral source on crudes and how they are refined. Seriously, thinking heavier crude is cheaper to refine is right up there with an article I read recently where the journalist actually thought that the crude from tar sands that was pumped through pipelines was the same as raw bitumen - I’m saving that article up for a future blog post.

Now let’s consider the 2012 Richmond Refinery fire. I’ve done work there, by the way, both some geotechnical analysis of ground stability for Chevron for the proposed building of a new structure and tank testing of the Navy’s underground fuel bunkers on the back side of the hill on the refinery’s west side where their million gallon aboveground storage tanks are located. One thing I can say about the facilities there and on the Navy’s side of the hill is that both organizations really care about doing their geotechnical engineering correctly. I’ve hunted buried pipeline in the area too using geophysical tools. It’s one of the more fun places I’ve been to do environmental geoscience.

Anyway, about that fire. First, the fire had nothing to do with dilbit or tar sands products. Nothing. Zero. Zip. Most of the crude that arrives in Richmond comes through the marine terminal. There’s no Canadian tar sands dilbit currently shipped on the Pacific coast. There’s no Venezuelan heavy crude, which is almost as bad as Canadian tar sands crude, being shipped to California. The Bay Area doesn’t refine a lot of the really heavy junky crude because such low profit crap crude isn’t shipped to the Bay Area. Most of the heavy junk crudes go to the refineries along the Gulf of Mexico where more lax environmental regulations make it possible to build new refinery capability. The regulatory environment in California is already so severe for refineries that it is currently impossible to build new refinery facilities in the state (ref: fourth paragraph of, accessed 2/21/2017).

The superficial cause of the Richmond Refinery fire was the explosion of a vapor cloud caused by actions during a piping repair on a leaking sidecut pipe off one of the crude oil distillation towers. The underlying causes of the fire were 1) lack of common-sense safety engineering controls on the pipeline (no cut-off valve to shut down product flow into the pipe from the distillation tower); 2) deteriorated internal pipeline lining (lack of adequate inspection and replacement of worn piping components); 3) inadequate safety procedures (attempting to repair the pipe leak with product actively flowing in the pipe instead of shutting down the distillation tower prior to making the repair); and 4) human error (the responders attempting the pipe repair damaged the pipe while trying to work on it but continued the work even after causing the pipe to shift, instead of evacuating and shutting down the distillation tower). In many respects, it was a classic industrial accident made up of cascading causes, the cure of any one of which would have prevented the disaster. Chevron goofed big time. Regardless, the fire was not connected to junky heavy crudes, but to normal operations on the usual medium and light crudes and petroleum gases refined in the Bay Area. If you want to learn more about the fire, I highly recommend the refinery fire website and its narrated animation. Also check out the final investigation report.

Now let’s look at the statement that the fire and the resulting plume of rather nasty refinery-fire smoke “sent 15,000 neighbors to local hospitals with respiratory ailments.” Sounds like local hospitals were suddenly swamped by 15,000 people crowding through the doors gasping for breath. This is a suspect statement right off the bat when you consider hospital capacity in the Bay Area. You can get an idea of Bay Area hospital capacity from the number of hospital beds available. There are approximately 14,000 hospital beds in the Bay Area - and that’s for the whole region, not just the northeast end of the Bay up by Richmond, which is actually rather deprived of hospital facilities compared to the rest of the Bay. Frankly, 15,000 people trying to get to a hospital would flood the roads and cripple the health care response ability of East Bay hospitals. So what is going on here with this statement?

Here’s where I think the journalist who wrote this article got his information: right out of the Chevron Final Investigation Report cited above, which states:

“In the weeks following the incident, approximately 15,000 people from the surrounding communities sought medical treatment at nearby medical facilities for ailments including breathing problems, chest pain, shortness of breath, sore throat, and headaches. Approximately 20 of these people were admitted to local hospitals as inpatients for treatment.”

I will leave it up to the readers of this blog post to make your own decisions as to the author’s use of the information on respiratory complaints during and after the fire, and will keep my personal assessment of misleading hyperbole to myself.

Here’s one more gem from this article. Let us go back to the statement:

“In response, a coalition of groups….are pushing to make the San Francisco Bay Area the first place in the world to place limits on oil refineries’ overall greenhouse-gas (GHG) and particulate-matter emissions.”

I do believe, based on my knowledge of California and US EPA regulations, that there isn’t a standard on CO2 emissions - but in reading this statement, if I didn’t know better I might think that Bay Area refineries were not subject to any particulate-matter emissions limits. Given that I at one time was working on the environmental remediation of spills at the Navy’s no-longer-extant fire fighting school in the middle of the Bay on the Buena Vista, and got my fill of air quality control board and water board regs for the Bay Area, I know better. Particulate matter regulations are very old news. With the proviso that California’s air standards for the Bay Area are much more strict, you can look up the more lenient (but still stringent) US EPA standard for particulate matter online.

Now given the number of misstatements listed so far in just the first five paragraphs of the twenty-seven-paragraph article, would you now give credence to the rest of it as it argues that a new emissions regulation is the way to prevent tar sands dilbit from being shipped by rail to California? Frankly, I find it rather insane that no one is even suggesting better ways to go about this, like bans on rail shipment of crude in urban areas; or regulating the stockpiling, storage and sale of the incredibly filthy petcoke that’s produced by the refinery cokers that are required to refine dilbit; or taxing the importation of high-sulfur crudes. No, it seems clear that the author of this article and the people he supports believe that more emission regs are the way to hurt Big Oil. Whether you agree that such tactics are salutary and worthwhile, instead of finding ways to build more alternative energy infrastructure to replace our dependence on fossil fuels, is your own business.

In case it wasn’t obvious, grumpy science nerd is grumpy today.

Friday 26 September 2014

California Cotton

Here's a question from a Facebook discussion on the California drought among some of my friends:

"I for one am still wondering why anyone is farming below Fresno, and most the farming there is cotton, isn't cotton a 3rd world crop?"

The last bit is why this qualifies as a subject for this blog. Okay, it was just a question but I already wrote most of this post as a reply to that question on Facebook - and I'm not one to waste decent prose. All things considered, it took me about a half an hour to write the text and then two hours to attach decent references to it.

Cotton is not a Third World crop, unless we're serious about labeling the USA as a Third World country. Cotton is an all-world crop. It is grown everywhere in sub-tropical climates (1). There are cotton species that are native to all continents excluding Antarctica (1, 2). The world's leading grower of cotton is China (3), followed by India and the US trading off for the number two spot (2, 4). The US is the world's largest exporter of cotton (2,5). Texas and California trade off on being the top cotton producer in the country (6, 7, 8).

California became the top producer of cotton after the devastation of the deep South's cotton farms by the boll weevil in the first half of the 20th century (9, 10, 11). The USDA spent decades eradicating the boll weevil, so cotton is now grown again in the South, but for many years the South produced only a fraction of its former output (9, 11).

Cotton has moderate drought and saline soil tolerance but it requires irrigation throughout the American southwest, including CA (12). The southern counties of the Central Valley used to be a major producer of grapes, but with the degradation of the soil from irrigation and the introduction of cotton, the vast vineyards south of Fresno are a thing of the past (13, 14, 15). Cotton is now a major crop in Kern, Kings, Tulare, Fresno and Merced counties (8). The Grapevine at the southernmost end of the Central Valley is not named for either of the winding roads that once climbed - or still climb - from the foot of Wheeler Ridge to the Tejon Pass; it got its name in the 19th century for the wild grapes and the subsequent vineyards that once dominated the area (16). Wild grapes still grow there in spots, as I discovered the one time I rode my motorcycle up the now-abandoned path of the original car route up the pass. I suspect you could find some to munch on if you drove that road right now.

Cotton is a heavily subsidized crop in the US (17), and in terms of labor it's a cheap crop to harvest here due to the prevalence of mechanical pickers. Cotton is still picked by hand in the so-called developing countries (1). If the subsidies were rescinded, American cotton farming would likely implode, as we would no longer be competitive on the world market with South American and Eurasian cotton, despite our lower labor costs to harvest the crop. We don't use all the cotton we grow - we export a huge amount every year. It's really just a cash cow except for when the price of cotton on the world market is low, which is often, given that it's grown almost everywhere (e.g., 5).

Cotton requires upwards of 25 inches of precipitation to produce a crop (12). Annual rainfall in the southern San Joaquin Valley is between 5 and 10 inches a year (8). It doesn't take a genius to see that cotton is not a crop we should be growing in CA if we want to use the State's water responsibly. Open-range cattle and sheep are probably the best fit for the water and climate of the southern San Joaquin Valley, but compared with subsidized cotton grown with subsidized irrigation water, open-range livestock are less profitable. Only a fraction of the livestock in the southern Central Valley are ranged, however: most are raised in feedlots in Fresno County, fed on high-water-demand grain crops - as anyone who has ever rolled down their windows on I-5 near Coalinga already knows. But that's a whole other topic...

  1. Encyclopædia Britannica Online, s. v. "cotton", accessed 26 Sept. 2014,
  2. Wikipedia, s. v. "Cotton", accessed 26 Sept. 2014,
  3. International Trade Centre, s. v. "Cotton Exporter's Guide", Chap. 6.2, accessed 26 Sept. 2014,
  4. International Trade Centre, s. v. "Cotton Exporter's Guide", Chap. 1.1, accessed 26 Sept. 2014,
  5. USDA (Sept. 2014), s.v. "Cotton - World Markets and Trade", accessed 26 Sept. 2014,
  6. National Cotton Council of America, s. v. "National & State Cotton Area, Yield and Production", accessed 26 Sept. 2014,
  7. National Cotton Council of America, s. v. "FAQ", accessed 26 Sept. 2014,
  8. California Dept. of Food and Agriculture, s. v. "California Agricultural Statistics Review", accessed 25 Sept. 2014,
  9. Lange, F., Olmstead, A., and Rhode, P. (2008), The Impact of the Boll Weevil, 1892 - 1932, accessed 26 Sept. 2014,
  10. Hunter, W. D., and Coad, B. R. The boll-weevil problem, USDA Farmer's Bulletin 1359, at: UNT Digital Library. Accessed September 26, 2014.
  11. Weber, Devra (1996). Dark Sweat, White Gold: California Farm Workers, Cotton, and the New Deal. University of California Press, ISBN 978-0-520-91847-4.
  12. National Cotton Council of America (1999), s. v. "Cotton Water Use", Cotton Physiology Today, v. 10 no. 2, accessed 26 Sept. 2014,
  13. Parsons, J. J. (1987), Carl Sauer Memorial Lecture: A Geographer Looks at the San Joaquin Valley, accessed 26 Sept. 2014,
  14. Gentry, C. (1968), The Last Days of the Late, Great State of California, ISBN-13: 978-0891740216.
  15. Reisner, M. (1986), Cadillac Desert, ISBN 0-14-017824-4.
  16. The Ridge Route Organization, s. v. "History", accessed 26 Sept. 2014,
  17. Environmental Working Group (2012), s. v. "Cotton Subsidies", accessed 26 Sept. 2014,

Friday 11 October 2013

Population and Meaningless Math

In the daily barrage of needless coverage of TV celebrities, I note with vague disapproval the headline announcing that Michelle Duggar, mother and star of the TV reality show 19 Kids and Counting, has made waves this week with her announcement that she and her hubby Jim Bob are working at conceiving kid #20. Why this is even considered newsworthy is beyond me. Maybe, just maybe, if she were actually pregnant then that might be worth a footnote somewhere from the Hollywood press machine. But really now, do we need to know that Michelle and Jim Bob are having a good time making the bed springs squeak? Seriously folks, we know they are married, we know they are in good health; and we can safely conjecture therefore that they are indeed engaged in activities that may lead to pregnancy. We do not need a press release to inform us of that...

Folks with 19 kids usually aren't fodder for my blog. Really, people in such circumstances are more likely to receive my heartfelt sympathy than anything else, and I have to concede a little admiration for utilizing the size of their family to make a little money, despite my distaste for the tacky genre of reality TV.

I could not fail to note one statement made by Michelle Duggar in the news coverage that she and Jim Bob are looking to produce little kid No. 20 (article accessed 10 Oct 2013). Here's the quote:

Michelle tells Celebrity Baby Scoop that she doesn't believe in overpopulation. "We have studied it and I believe that there is a misconception about overpopulation. I think that the whole mindset of overpopulation is really overrated," the Duggar family matriarch explains. "A few years back, we stated that the whole population of the world could be stood shoulder-to-shoulder in Jacksonville. That may have changed a little bit since we've heard that statistic."

I'm going to ignore the CYA pussy-footing around the world's population today vs. "a few years back" and just use current population figures. After all, I probably won't be off by more than 500 million, which is less than an order of magnitude error, so let's wave our decent-approximation arms in the air and proceed. To calculate how many people we could stuff into an area, I began by gathering population, population density and area data. The US government's websites that provide geographic and population data are down right now due to the government shutdown, so I had to rely on Wikipedia. For references, a list of all the Wikipedia webpages I consulted is appended at the end of this blog post.

The current estimate for the world's population is 7,116,000,000 people. Our first calculation, a very crude one, assumes we can stuff one person in a square foot so long as that person stands with shoulders aligned along the diagonal of the square.

The amount of space needed for the world population is then:

Start with 7,116,000,000 people
Assume 1 person = 1 square foot 
So the world's estimated population will take up 7,116,000,000 sq ft of space

There are 27,878,400 square feet in a square mile. 

We divide through by this amount to figure out how many square miles are needed to fit the world population.

7,116,000,000 sq ft of space
divided by 27,878,400 sq ft per sq mile 
equals 255.3 sq miles

The square mileage of the City of Jacksonville, Florida, is 885 square miles. 
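If you want to check this arithmetic yourself, here's a minimal Python sketch of the same back-of-the-envelope calculation; the only inputs are the population estimate and the Jacksonville area quoted above:

# One person per square foot, shoulders along the diagonal of the square
world_population = 7_116_000_000        # current estimate used in this post
SQ_FT_PER_SQ_MILE = 5280 ** 2           # 27,878,400 square feet in a square mile
jacksonville_sq_miles = 885

area_sq_miles = world_population / SQ_FT_PER_SQ_MILE
print(f"{area_sq_miles:.1f} sq miles needed vs. {jacksonville_sq_miles} sq miles for Jacksonville")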

So far so good, right? Michelle Duggar wouldn't mislead her viewing public, now, would she?

Well, allocating a square foot per person is a bit tight. It would be much more reasonable to expand that a bit. The width of an airline seat in coach class is an average 18 inches. That's not a lot since shoulders, waists and arms are usually wider, especially on guys as the following picture illustrates.

Sean with ruler

The hapless victim, played by my talented spouse in the photo, is holding an accordion ruler expanded out to 30 inches. The ruler width just the other side of his left hand is 24 inches. So a more realistic width for making a gridded area filled with people would be 24 inches. Given that squares are not the most efficient shape for minimizing wasted area, we're going to switch to using the most efficient shape for packing uniform, equidimensional items into the least amount of space, namely the hexagonal cell in the arrangement known as hexagonal closest packing. This should be old hat for all of you who've suffered through crystallography, advanced physical chemistry or solid state physics. For the rest of you, just take my word for it that hexagonal closest packing - or "hcp" - is the way to go, as a look at any honeycomb in a beehive can illustrate.

A hexagon with a width of 24 inches from apex to apex is equivalent in area to 6 equilateral triangles with sides of 12 inches. Using the area-of-a-triangle formula of 1/2 the base times the height (1/2 x 12 inches x 10.39 inches) gives us an area of 62.35 sq inches per triangle, for a total area of 374.1 sq inches per hexagon. Dividing through by 144 square inches per square foot gives us an area of 2.6 square feet per hexagon. So using hexagonal closest packing with 24-inch hexagons gets us the following:

Assume 1 person = 2.6 square feet (24" hexagonal closest packing)

7,116,000,000 people x 2.6 sq ft/person = 18,501,600,000 sq ft

18,501,600,000 sq ft / (27,878,400 sq ft per sq mile) = 663.7 sq miles
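Here's the same check for the hexagon version, a quick Python sketch that derives the ~2.6 sq ft cell from the geometry instead of hard-coding it:

import math

world_population = 7_116_000_000
SQ_FT_PER_SQ_MILE = 5280 ** 2                            # 27,878,400

# A regular hexagon 24 inches across from apex to apex is six equilateral
# triangles with 12-inch sides.
side_in = 12.0
triangle_area_in2 = (math.sqrt(3) / 4) * side_in ** 2    # ~62.35 sq inches
hexagon_area_ft2 = 6 * triangle_area_in2 / 144           # ~2.6 sq ft per person

area_sq_miles = world_population * hexagon_area_ft2 / SQ_FT_PER_SQ_MILE
print(f"{hexagon_area_ft2:.2f} sq ft per person -> {area_sq_miles:.0f} sq miles")

(The tiny difference between the ~663 square miles this prints and the 663.7 above just comes from rounding the cell size to 2.6 sq ft.)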

Okay then, this is still less than the area of the City of Jacksonville, Florida. But of course, there's a catch - and here it is: Jacksonville, Florida, is the city in the entire United States of America with the largest area. Not even that queen of western US urban sprawl, Los Angeles, at 503 square miles, comes close. Let's look at LA real quick. LA is huge. Not only do you have downtown, which is actually quite small, but there's all of Griffith Park, Dodger Stadium, most of the San Fernando Valley, parts of what's considered "Hollywood" including the famous sign, half of the Santa Monica Mountains, and the huge industrial area and rail yards east of downtown. In a state of big sprawling cities, LA is the biggest. So in comparison, LA at 503 square miles, which includes that huge piece of suburbia called The Valley (like totally for sure...), is smaller than Jacksonville at a supersized 885.

Michelle Duggar picked the one city in the country with the biggest area. If the comparison used the second largest city in the country, which is LA, then the population of the world would take up more space on the basis of using hexagonal closest packing. Now we don't know what basis Mrs. Duggar used to estimate how much room a person takes up in terms of ground area but I'm assuming it isn't very different from what I've done here.

Of course, it does look a little suspicious that Mrs. Duggar used the one municipality with a really ridiculously huge area. So what does a more normal city look like? Here's a list of cities and towns with their ground area and their population density for your perusal:

NOTE: (My blogging software doesn't make it easy to do formatted tables so until I get off my butt to write the xhtml code to do the formatting, please forgive the crappy presentation below)

City                  Area (sq miles)    Population density (people/sq mile)

Los Angeles           503                8225
New York              302.6              27550
Hong Kong             426                17024
Groton, Conn.         45.3               890
Salt Lake City        110.4              1666

Groton is where I grew up so I threw it onto the list. It's a typical town in coastal southern New England. I was surprised that Delhi was smaller than places like New York and Beijing. I was blown away that Macau and Manila were so tiny in terms of area, especially given their very large population densities.

Let's talk about population density. What Michelle Duggar's simplistic analysis lacks is the awareness that one cannot judge whether the world is over-populated by examining how many people you can squeeze into the smallest possible area. All that does is tell you, well, how many people you can squeeze into the smallest possible area. It's a useless measure. It's smoke and mirrors, folks. Why? Because it doesn't tell you anything real; for example, it can't tell you about how much area it takes to grow food for 7 billion people or how much fresh water has to fall out of the sky to grow that food and quench everyone's thirst. Stuffing everyone into the smallest possible space doesn't tell you how many people you can fit into an urban environment - an environment where you have to be able to bed all those people; to have transit systems to get them to work and to the market to buy food; to build and maintain roads and rail and canals and quays to move food and goods in and waste out; to construct water mains, storm water drains, sewer drains, water treatment plants and electric power lines; to plan and create parks and theaters and sports arenas.

Figuring out how much room you need to fit the world's population on a tightly-packed grid doesn't tell you anything about how much room people need to actually LIVE. The thing that tells you about how many people you can stuff into a city where they can actually live and work and prosper will be population density.

We really don't know what the maximum population density might be before the critical infrastructure necessary to maintain urban life fails. We do know what the highest population densities are in the world's most populated cities. The fact that thousands of people aren't dying every day in those cities tells us that we have not yet exceeded a population density so great that modern infrastructure fails to provide for our needs.

The number one most crowded urban space in the world right now is Manila in the Philippines, with an astounding 111,002 people per square mile. The high population density of Delhi in India wasn't much of a surprise, but the almost-as-high population density of Paris was, at least to me. That Paris has more than twice the population density of New York City was an eye opener and a bit of humble pie for this Yankee. I put both Vienna and Salt Lake City on the list as examples of moderate-sized urban places with good infrastructure and pleasant habitable environs. Vienna is considered one of the most desirable cities to live in, independent of my personal bias as a former resident. Salt Lake City is probably my favorite city in the left half of the US, with its stunning surroundings, comfortable size, exceptionally clean urban environs, and its active arts scene. I wasn't surprised at the population density in Vienna given that the city limits include the massive greenbelt known as the Wienerwald or Vienna Woods, made famous by the Strauss waltz of the same name. I was surprised at the really low population density of Salt Lake City, but that was before I looked at the city limits on a map. The city includes City Creek Canyon and Grandview Peak in the Wasatch Mountains, all of the airport and much of the salt marsh north of the airport, the industrial area starting at the railyards and extending as far west as the Kennecott tailings, and most of the salt marsh west of the airport all the way out to the shore of the Great Salt Lake. Basically, the developed parts of the city are exceeded by the undeveloped lands also within city limits. It's all that empty land that makes Salt Lake City's population density so low on paper. In a way, looking at population density for cities is a bit misleading since how you draw city limits can distort that number. Salt Lake City and Jacksonville have low population densities compared to places like New York and Hong Kong because both incorporate large amounts of empty land within city limits.

To make a measure that might better reflect the minimum amount of area needed to sustain the world's population, one approach would be to calculate how much land would be used if the world's population were confined to an area with a population density already sustained by one of the world's most populated cities. I have done just that. I took the population densities from the previous list and then, using those numbers, figured out how much room 7,116,000,000 people would take up if we fit them into a space with, for example, the population density of Singapore or LA. Here's the list of those areas:

City                  Area (sq miles)    Density (people/sq mile)    Area for world population at this density (sq miles)

New York              302.6              27550                       258294
Hong Kong             426                17024                       417998
Los Angeles           503                8225                        865167
Vienna                160.1              10366                       686475
Salt Lake City        110.4              1666                        4271308
Jacksonville          885                1100                        6469091
Groton, Conn.         45.3               890                         7995506
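The last column is just population divided by density; here's a small Python sketch with the same inputs, in case anyone wants to re-run it with their own density figures:

world_population = 7_116_000_000
densities = {                       # people per square mile, from the table above
    "New York": 27550,
    "Hong Kong": 17024,
    "Los Angeles": 8225,
    "Vienna": 10366,
    "Salt Lake City": 1666,
    "Jacksonville": 1100,
    "Groton, Conn.": 890,
}
for city, density in densities.items():
    print(f"{city:15s} {world_population / density:12,.0f} sq miles")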

This list doesn't tell us much because there isn't anything to compare these areas against. So what I did next was to rewrite this list with the areas of various US states and some other places for comparison.

Place, Calculated or Actual Area in Sq Miles

  • Ohio                       44825
  • Manila                    64107
  • Missouri                 69704
  • Utah                       84899
  • Colorado                104094
  • Delhi                      107598
  • Paris                       129620
  • Macau                    147966
  • California                163696
  • New York               258294
  • Texas                     268581
  • Singapore              375653
  • Hong Kong             417998
  • Alaska                    663268
  • Vienna                   686475
  • Mexico                  759516
  • Greenland             836297
  • Los Angeles            865167
  • Argentina               1073518
  • Beijing                   2156364
  • Australia                2969907
  • Brazil                    3287597
  • USA                      3537110
  • Canada                 3854085
  • Europe                  3930000
  • Salt Lake City       4271308
  • Antarctica             5300000
  • Jacksonville          6469091
  • South America      6890000
  • Groton                   7995506
  • North America        9540000

What does this tell us? Well, if all the world's 7.116 billion people lived in one city as dense as Manila, that city would take up an area bigger than Ohio but smaller than Missouri.

It's useful at this point to look at Jacksonville, the city that Michelle Duggar invoked in her "research" on the world's population. If all the world's 7.116 billion people lived in one city with the actual population density of Jacksonville, Florida, that city would take up an area bigger than the continent of Antarctica but smaller than South America. Of course, almost all of Antarctica is uninhabitable, but that's beside the point.

We still haven't looked at really important things like how much land you need to feed the world's population and what density you need of roads and rail and water shipping to move food and goods in and garbage out. These are big complex subjects with answers that aren't easy to calculate, as any geographer, earth scientist, or climatologist can tell you. The analysis we did here is really very simplistic. All we really did was show the inadequacy of using Michelle Duggar's approach for saying anything meaningful about over-population. We didn't come close to a real analysis of population sustainability. That would take a book or two, I suspect, and it might take more than that if one were to get really technical in a truly critical and scientific manner.

So, was Michelle Duggar wrong on the internet? I would answer that by first pointing out that she used a city so large in area that her analysis approached the threshold of lying with statistics; and depending on whether her faux pas was intentional, she possibly crossed that threshold of dishonesty. Was her analysis a meaningful measure of population vs. over-population? Well, do the math... oh, wait. We just did that, didn't we?


If subjects concerning population science interest you, a good blog to follow is the one written by Tim De Chant. I must tip my hat to him and note that I stole his really useful idea of expressing world population in terms of area vs. population density for this post. Crediting one's sources is good manners, after all.


References: the Wikipedia pages for the cities, states, countries and continents listed above (all accessed 10 October 2013).

Monday 23 September 2013

Pipeline Pathways and Foot-In-Mouth

Sometimes a journalist or pundit says something so stupid and so amazingly clueless that it takes my breath away. Back in July this happened. The blog software my website provider uses has its moments though and it ate the wonderful blog post I wrote on today's subject. It has taken me two months to return to this, one of the lamest things I have ever seen in print.

How lame is it? Let me preface the target of today's blog post with a little personal history. I have worked in rail yards. When you hang out in rail yards, you learn all sorts of cool things about what gets moved around by rail. One of the things you learn is that railroads are really cool. They move freight cheaper than trucks on the interstate for long hauls greater than a few hundred miles. They are three times cleaner per ton of freight than 18-wheelers and they have really small carbon footprints compared to cars and trucks.

There are some other things I learned about railroads while working at them. I worked as a contractor in the two rail yards around Sacramento owned by Southern Pacific ("SP") back in the late 80s before Union Pacific bought them out. The company I worked for did the environmental engineering and remediation for SP at the time. The Roseville Railyard was a Superfund site back then. I did a lot of environmental stuff there and managed all the environmental activities in the yard for a time, before I flipped the jerks who ran the environmental engineering firm the bird and quit. Having done environmental stuff in a railyard left me with an understanding of the transport of all kinds of hazardous stuff that travels by rail.

On the flip side, I've also worked in and around pipelines, the kind that carry oil and gas, and yes, crude. My first pipeline carried jet fuel from a US Navy dock facility on the central California coast, over the California Coast Ranges, and into the Naval Air Station at Lemoore, in the middle of the San Joaquin Valley. There was an indication it had a small leak somewhere. My work partner and I went leak hunting.

In the environmental geology world, the biggest concern over pipelines, though, is in the context of running drill rigs during environmental investigations. It's not a good thing to accidentally drill through a pipeline. Back when I was running all the field activities at the Roseville Railyard, I had one sampling location along the street right by the railyard's diesel repair shop. There was a PG&E pipeline scant feet from the sample location. In fact, we moved that location away from the pipeline. Regardless, being 4 feet removed was still too close for PG&E, who sent a crew out to dig out the pipeline by hand and then shore their excavation while we drilled our sampling well. That's just a sample of two of the pipelines I've run into in my day.

All of this is germane, which you'll appreciate in just a moment more when I unveil the reason for this post.

So, what is this marvel that compelled me to rant and rail (pun intended) for your benefit herein? Well, it's a rather astounding utterance in print from our friends at the Wall Street Journal in the aftermath of the ongoing tragedy in Lac Megantic.

Y'all remember Lac-Mégantic, Quebec, n'est-ce pas? Where there was that horrific train derailment and explosion, a pile of people dead, a beautiful little town on a lovely lake destroyed? Did I mention anywhere yet that I've been there? I've driven through Lac-Mégantic twice, going to and from Trois Rivieres last October. It was a pretty place surrounded by forests and the last gasps of the northern end of the Appalachian Mountains. The view from downtown out over the lake was breath-taking. I don't look forward to revisiting. By all reports, the entire center of the town is gone. Photo: @sureteduquebec, Twitter

So what was this utterance in the Wall Street Journal? It was in an editorial with the provocative title of "Can Environmentalists Think?" by one Bret Stephens, published on July 8, 2013 (accessed July 8 and September 23, 2013).

The editorial is a rant about all the environmental types wailing over the transport of crude by train vs. pipeline in the aftermath of the Lac-Mégantic disaster. Of course, a lot of hand-wringing was done by the environmentalist types about how Lac-Mégantic shouldn't be used as a reason to support pipelines like the Keystone XL project. The tone of the editorial is rather caustic and the author laments, rightly so, the lack of common sense and practical knowledge of most environmental activists about the real-world compromises that modern society has to make to support our industrial infrastructure. His points were apt, though his sarcasm and tone were distasteful to me. Granted, as someone who has a legitimate claim to having made a living as a professional in environmental science, I can't say that I have a lot of respect for your average environmental eco-idiot. Most of them have little real understanding of the science of the field they claim to champion. I'm not sure who I dislike more: the businesses I've worked for who would let their environmental damage sit and be ignored if not forced by law to do something about it, or the environmental crusaders who have no clue as to the real issues and science behind the causes they espouse. The former are near-criminals and the latter are living examples of my favorite adage that thinking is work and people are lazy...

You can see that I didn't much like the editorial. It was disrespectful in a bad-manners kind of way. In fact, it was insulting and annoying, and I'm not even one of the folks being insulted. It was needlessly nasty in my opinion (yours, of course, may differ). But that's not why I'm dissing this piece of work...

What set me off badly enough to make it the target of my blog? It's the following really clueless statement:

"Pipelines also tend not to go straight through exposed population centers like Lac-Mégantic."

The author was arguing the virtues of pipelines environmentally and this was one of his points about their safety in comparison to railroads. Now, I think I might know a thing about railroads and pipelines both.

So here's a short list of just a few places where pipelines make a beeline straight through exposed population centers larger in size than Lac-Mégantic and its approximately 5,000 souls. Population figures are from the 2010 US Census. To be absolutely rigorous, I have limited this list to places where I can personally go and stand on the pipelines in question. This is not a theoretical list made from using someone else's reference material. These are crude oil pipelines I know myself professionally; I can stand on their paths and point to the downtowns or residential neighborhoods they transit.

  • Conroe, TX pop. 56207
  • Taft, CA pop. 9327
  • South St. Louis, MO pop. 318172
  • Salt Lake City, UT pop. 189314
  • Farmington, UT pop. 18275
  • Bountiful, UT pop. 42522
  • Layton, UT pop. 67311
  • Eureka, CA pop. 27191
  • Rawlins, WY pop. 9259

Did the author of this editorial even bother to do any research on pipelines before suffering from verbal diarrhea?

Wednesday 18 September 2013

Miscaptioning Atrazine

What a difference a few words makes. Today's offering is a figure caption from Wikipedia. Maybe it's unfair to pick on Wikipedia - but since it has become the launching point for many an inquiry, I don't think they should be exempted from scrutiny. All things considered, I think Wikipedia is a good thing. I'm a big fan of not having barriers to knowledge for people outside of academe. Given the open and egalitarian nature of Wikipedia, there's far more that's right with it than wrong. The downside of Wikipedia is that it takes time to craft quality articles from a neutral perspective when anyone at all can contribute to writing and editing. It will never be the Encyclopedia Britannica but it has become a great place to start a research project on the net.

I debated whether to even bother with a post about one small figure caption on Wikipedia. Then I realized that if the same figure caption had shown up in a scientific article that I had been asked to peer review for a journal, I would have no mercy on the article's authors. Why? Because figure captions matter. A lot of science professionals read articles outside their discipline by skimming in the following manner: first one reads the abstract, followed by the figures and figure captions. Depending on the ego and nastiness of any given scientist, some would include a third step, which would be to check the references to see if one had been cited. After all, it really is a publish-or-perish world out there and citations matter.

Basically, figure captions matter. When you consider that journalists and bloggers often lift figures out of journal articles and reprint them in internet or newspaper content, then figure captions matter a whole lot more than one would think. So in this context, I decided that, yes, I would indeed pick on just one short figure caption in Wikipedia.

Earlier today, I was reading a string of comments on Facebook about a murderer and his victims. Someone made a comment speculating that the murderer could have poisoned one of his victims with atrazine. This immediately hit my HUH? filter big time and left me wondering how much atrazine comprised a lethal dose for an adult human.

These days, I tend to look at Wikipedia first for regulatory, physical chemistry and toxicology information since many chemical pages on Wikipedia often include that info. If the Wikipedia page is any good, there will be a link back to a public health, industrial hygiene, health physics or environmental science authority or journal where cited numbers can be verified. For the record, unless I already know a number off the top of my head (for example, I know most of the EPA MCLs for heavy metals by heart), I almost always verify numbers, especially if I'm going to be commenting or blogging about it later. Just as a quick FYI, the CDC is even better than the EPA if you want to look up understandable environmental and toxicological info about pollutants.

Getting back to our main topic here, which is a figure caption on the English-language Wikipedia site for atrazine, I found the comment from Facebook rather odd since herbicides are not popular or widely used poisons for homicides. As I suspected after looking at the toxicology numbers for atrazine, the amount needed to poison someone would be several tablespoons. Nope, atrazine would make a lousy homicide poison on the basis of quantity required. I suspect it would also taste bad too. Arsenic and strychnine are in no danger of being displaced as effective human poisons by atrazine. I'm sure that's a great relief to know! You can sleep better tonight knowing that evil atrazine from the Blue Earth corn fields of Minnesota will not waylay you and bring you to death's door before you wake.
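As a back-of-the-envelope sanity check on that "several tablespoons" remark, here's a rough Python sketch. The LD50 and density numbers below are ballpark assumptions of mine, not figures from the toxicology references, and the only point is that the quantity is absurdly large for a practical poison:

adult_mass_kg = 70
assumed_oral_ld50_mg_per_kg = 2000   # ballpark rodent-scale LD50; an assumption - check real toxicology sources
assumed_density_g_per_ml = 1.2       # rough density assumption for the material
ML_PER_TABLESPOON = 14.8

dose_g = adult_mass_kg * assumed_oral_ld50_mg_per_kg / 1000            # on the order of 140 g
tablespoons = dose_g / (assumed_density_g_per_ml * ML_PER_TABLESPOON)  # roughly 8 tablespoons
print(f"~{dose_g:.0f} g, roughly {tablespoons:.0f} tablespoons - a lousy poison indeed")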

Of course, atrazine has its own little anti-fan club because of its use in American farming, for cereal crops and especially maize, the iconic crop of the Midwest. Like all other things that farmers put on their crops in liquid form, atrazine has infiltrated into drinking water aquifers wherever farming is big. If you believe that atrazine is a danger to public health or the environment, then this is a matter of concern.

Regardless of the real or imagined danger posed by atrazine, having good facts at hand on its spread, prevalence and impact is necessary for meaningful debate. For the people out there who go to Wikipedia - and no farther - for their information, getting the facts right on the page for Atrazine strikes me as highly desirable. Now there are a few things that could use some fixing on this wiki page, but the one and only figure caught my eye immediately. Here's what it looks like, straight off my monitor screen: atrazine2.png

Did you spot the caption below the figure? "Atrazine use in pounds per square mile by county."

I made the mistake of really getting eye tracks all over this figure BEFORE blowing it up for finer inspection. Right off the bat, I thought all that green-level use of atrazine in New England was off-base. Seriously, New England - the home of rocky ground and non-existent top soil - was using that much atrazine? You don't use atrazine if you're farming apples, potatoes, maple syrup, trees or cows - which are the main aggie products in the New England states. Now look at California and southern Idaho - especially southern Idaho, where one of the biggest crops is barley. I would have thought the atrazine use in these areas would be much higher than on the figure.

So I enlarged the figure: atrazine.png

I just love how the highest usage area overlaps the Midwest corn belt. Check out the non-linear scale too. There's all sorts of fun on this figure.

The enlarged figure did two things for me. First, I could actually read the text inside the figure box. I couldn't before because I've reached that point of middle age where I should really be wearing reading glasses and I'm too vain to enlarge the type size like an old person. After enlarging the figure, I could read the rest of the text on the figure and saw that the original caption was "Average annual use of active ingredient (pounds per square mile of agricultural land in county)."

Wow! That's a big difference. Use per square mile of farm land in a county is a lot different than use per square mile of all land in a county! This figure could never convey how much bulk atrazine was being spread around on a per-area basis. It only tells you how likely it is that farms will use atrazine on a county-by-county basis, regardless of how much farm land is in any given county.

The bottom line is that the Wikipedia caption that's big enough for an old person like me to read is misleading. As soon as I can figure out how to send in edits to Wikipedia, I'll try to fix this caption.

The second thing that enlarging the figure did for me was confuse me terribly. If the figure is showing me usage BY COUNTY, then I should be able to discern county shapes in the data but I should not be able to pick up details smaller than counties. The problem here is that there are features in the data that are obviously smaller than whole counties.

For starters, you can pick out pieces of interstates, like I-80 west of Chicago and the I-39 corridor in northern Illinois. You can see the Platte River in eastern and central Nebraska. You can see Columbus, Indianapolis, Peoria, and Cleveland but not Toledo or Des Moines. Cities and rivers are at scales finer than counties. A figure that's captioned as presenting data on a "by county" basis is mislabeled if you're seeing details smaller than counties.

It turns out that the figure really isn't on a per-county basis, in a weird sort of way, but you have to go to the source of the data to find that out. The source of the figure turns out to be respectable and reputable. The data and the figure both are from a very recent USGS report on pesticide usage in the USA. The complete citation is: Thelin, G.P., and Stone, W.W., 2013, Estimation of annual agricultural pesticide use for counties of the conterminous United States, 1992–2009: U.S. Geological Survey Scientific Investigations Report 2013-5009. You can also find it online (accessed 18 Sept 2013). The authors of this USGS report did something kinda strange with their data and I'm left wondering why they bothered, since it strikes me as somewhat counter-intuitive. Here's their explanation from the USGS webpage that explains how they made the pesticide usage maps in their report:

Individual crop types....were reclassified to simply differentiate agricultural land (including pasture and hay) from non-agricultural land....then generalized to one square kilometer cell size and the percentage of agricultural land for each cell was calculated. The proportion of county agricultural land included in each one square kilometer cell was multiplied by the total county use for each pesticide to calculate the proportional amount of use allocated to each cell. To display pesticide use on the annual maps for each compound, all of the cell values nationwide for the entire period were divided into quintiles and a color-coded map was generated for each year; the quintile classes were converted to pounds per square mile.


You follow all of that? They proportioned out the farm land in each county by one kilometer cells, allocated to each cell the amount of pesticide known for the county multiplied by the proportion of farmland in the cell, and then rebinned it all to present it on one national map in units of pounds of pesticide used on a per square mile basis. At the scale of the entire country, this conversion from kilometers to miles is a monstrous amount of work which would not change the level of detail one could see on the maps in their report. For their purpose, the conversion step was essentially superfluous!
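In case the quoted procedure is hard to follow, here's a toy Python sketch of what I understand the USGS to be doing - the county total and the cell-by-cell agricultural fractions are made up for illustration, and this is my reading of their method, not their actual code:

# Spread a county's total pesticide use over 1-square-kilometer grid cells
# in proportion to each cell's share of the county's agricultural land.
county_total_use_lbs = 10_000.0               # hypothetical county total for one pesticide
cell_ag_fraction = [0.0, 0.2, 0.9, 0.5, 1.0]  # hypothetical fraction of ag land in each 1-sq-km cell

county_ag_km2 = sum(cell_ag_fraction)         # total agricultural land in the county, in sq km
SQ_MILES_PER_SQ_KM = 0.386102

for i, ag in enumerate(cell_ag_fraction):
    cell_use_lbs = county_total_use_lbs * ag / county_ag_km2   # proportional allocation to the cell
    lbs_per_sq_mile = cell_use_lbs / SQ_MILES_PER_SQ_KM        # the report's display units
    print(f"cell {i}: {cell_use_lbs:7.1f} lbs in the cell, {lbs_per_sq_mile:8.1f} lbs per sq mile")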

One last thing. If you sit down and actually read this USGS report, you'll discover that the usage numbers for almost all the pesticide and herbicide data broken out by county is estimated based on statewide data.

My brain hurts.

Sunday 7 July 2013

Which is Safer to Transport Crude: Pipeline or Rail?

After the tragedy that's right over the border in Lac-Mégantic, it was only a matter of time before someone squawked about the lack of safety and the horrible environmental harm involved in transporting crude oil by rail. In fact, the Bangor Daily News already covered an aspect of this question last summer in reaction to the increase in crude traveling by rail across Maine to a refinery in New Brunswick. But just a few hours ago, I spotted the first news article questioning the relative safety of pipelines vs. trains, and more are sure to follow as the horror of the Lac-Mégantic disaster sinks in.

Of course, the various professional associations that represent the rail and pipeline businesses have had their say as to which mode of transport is safer. This isn't really a big deal at all for transporting North Dakota crude across Maine to New Brunswick because there aren't any petroleum pipelines in central or northern Maine to compete with the two local rail carriers. Now if you're someone deeply invested for or against the proposed Keystone XL pipeline project, then looking at the relative safety of shipping oil sands dilbit by rail vs. pipeline is a big deal, especially since the controversial US State Department Environmental Impact Statement ("EIS") uses some of those arguments in favor of pipelines.

It will be interesting to see if the State Department reissues the Keystone EIS to include the Lac Megantic train accident as further support for the pipeline.

It is worth taking a look at the safety of shipping crude in pipelines vs. railroad tank cars. This is made difficult since the federal database on train accidents at the Federal Railroad Administration is even worse and more cryptic than the old USGS surface and ground water database - and that's saying something given the notoriety of the infamous USGS water data interface. In fact, after fighting with the database on the Federal Railroad Administration website, I gave up and decided to use some data from the Association of American Railroads ("AAR").

The AAR data is from a two-page info piece that presents the AAR's assertion that shipping crude by rail is safer than shipping via pipeline. To do so, the AAR calculated an average spill rate on a per-barrel and per-mile basis. They dug up numbers for miles of track and miles of pipeline for this analysis. They also confined their analysis exclusively to crude oil. This approach has some major problems. Due to the differences between railroads and pipelines, their numbers are based on the no-no of false comparison. You can look at their two-page analysis for yourself (see the AAR reference at the end of this post).

The first problem with the AAR approach is the focus on crude. It's not really the right thing to focus on. The safety of tanker cars is the important factor in spills, not the contents. The gig with rail transport is the variety of rail cars used. There are many different tank car types, though the major categories are pressurized or unpressurized, and insulated or uninsulated. Petroleum oils are commonly shipped in insulated non-pressurized tank cars. To evaluate tanker car safety for shipping crude, one should look at all hazardous liquid spills involving insulated non-pressurized tank cars. The spill rate for insulated non-pressurized tanker cars will be underestimated if one looks at just crude oil spills. So long as the liquid inside is appropriate for the tank car design, the actual contents by themselves are unlikely to cause a spill. It's the external factors which are important in spills, things like tanker car design, the car's condition, train speed, weather and rail damage.

Any tanker car spill rate which is based solely on crude oil will be lower than the actual spill rate.

Using a spill rate based on the amount of track traveled is also problematic because one can ship multiple liquids over differing routes. Selecting an appropriate amount of track mileage for a crude spill rate is therefore slippery and subjective. Also, consider that there is more track mileage than pipeline mileage. While oil pipeline mileage is essentially a constant with fixed-location end points, rail mileage can vary depending on the route taken. The AAR tried to compare a shipping mode with fixed mileage and one known product against one that can ship any liquid commodity over a wide choice of routes. These differences add up, making any comparison inadvisable. Comparing apples and gorillas is a false comparison.

Comparing spill rates on a per mile basis is questionable since total rail mileage is variable.

Even if spill-per-mile rates are problematic, there is still a way to compare pipelines vs. trains. The trick is to drop per mile measurements entirely and to convert spill rate to a dimensionless quantity. In this case, an appropriate dimensionless spill rate is the volume spilled vs. the total volume shipped. The subjective amount of mileage traveled is now no longer a problem and concerns of disparate infrastructure can no longer bias the results. Gross volume shipped is unaffected by other variables.

Let's look at the numbers. Data are from the AAR, the National Transportation Safety Board, the Association of Oil Pipe Lines and the Pipeline & Hazardous Materials Safety Administration ("PHMSA").

Right off the bat, it's obvious that pipelines do a better job in terms of injuries and death. Using 2011 data, fatalities for five modern transportation modes are:

  • 32367 highway deaths
  • 800 marine deaths
  • 759 railroad deaths
  • 494 aviation deaths
  • 14 pipeline deaths

Pipeline fatalities are lower by an order of magnitude compared to other shipping modes. This is likely due to the fixed nature of pipelines, where the pipe and pumping equipment are stationary. All other shipping modes involve the motion of both heavy machinery and product through space, a complex system of many moving parts. It's a no-brainer that a lot can go wrong in any complex system. In comparison, pipelines are much simpler since only the product moves, not the pipeline.

US pipelines transport approximately 11.3 billion barrels of petroleum per year. Approximately 52% of this is crude so a total of ~5.9 billion barrels of crude oil travel in pipelines.

The rail transport data from AAR was decadal so I converted it to an annual basis instead. Here's the converted AAR data:

  • Average annual barrels shipped by rail: 20,632,788.
  • Average annual barrels spilled by rail: 206.

As already mentioned, that spill rate probably underestimates the risk of spills from tanker cars.

The AAR also reported a spill volume for pipelines:

  • Average annual barrels spilled by pipelines: 43,131.

We can test the validity of the AAR spill amount for pipelines by comparing it to other reported estimates for pipeline spills. The database on the PHMSA website is set up to report on all petroleum products in pipelines, including liquefied gases. Here's the average spilled by pipelines based on that data:

An average of 105,370 barrels of all pipeline products are spilled every year.

Assuming that 52% of those pipeline products are crude oil, then an average of 54,792 barrels of crude in pipelines is spilled every year.

Now we can calculate annual average spill rates on a dimensionless volume-only basis:


Rail (AAR data):
20,632,788 barrels/year shipped by rail
206 barrels/year spilled
fraction spilled = 9.98e-06 = 0.000998%


Pipelines (AAR spill figure):
5,900,000,000 barrels/year of crude in pipelines (52% of total pipeline petroleum volume)
43,131 barrels/year spilled from pipelines (AAR data)
fraction spilled = 7.31e-06 = 0.000731%


Pipelines (PHMSA spill figure):
5,900,000,000 barrels/year of crude in pipelines (52% of total pipeline petroleum volume)
54,792 barrels/year spilled (52% of PHMSA all-products spill volume)
fraction spilled = 9.29e-06 = 0.000929%

There's very little difference between these numbers.
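For anyone who wants to reproduce those fractions, here's a minimal Python sketch using the same annual figures quoted above:

# Dimensionless spill rate: barrels spilled per year / barrels shipped per year
annual_figures = {
    "rail (AAR)":                  (20_632_788,    206),
    "pipeline (AAR spill figure)": (5_900_000_000, 43_131),
    "pipeline (PHMSA, 52% crude)": (5_900_000_000, 54_792),
}
for mode, (shipped_bbl, spilled_bbl) in annual_figures.items():
    fraction = spilled_bbl / shipped_bbl
    print(f"{mode:30s} {fraction:.2e}  ({fraction * 100:.6f}%)")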

If the data were actually available for the miles that every barrel of crude traveled from oil field to refinery, the most rigorous way to calculate spill rates would be on a per-mile-of-actual-travel basis - not the per-mile basis from before, which was calculated using the total fixed track mileage or pipeline mileage (both constants). But that data might be impossible to generate since you would need access to the shipping records of every railroad and pipeline. One can make a strong argument, however, that the number of miles traveled by crude on rail vs. pipeline will not be significantly different. Why? Because the distances between oil fields and refineries will be about the same regardless of shipping mode. Those miles are not going to vary much.

The bottom line here is that the dimensionless volume-basis spill rate is about the same for both rail and pipeline. If this is really true, then we must look at other variables before one can say that pipelines are preferable to rail for moving crude. We already know that shipping by rail is around ten times more lethal than shipping by pipeline. What we don't know, and would like to know, is impact on the environment. We've looked at quantity of spills, but we haven't considered the "quality" of those spills. This is not a stupid question. A leaking pipeline is going to impact the environment differently than a spill during a train accident. Location for oil spills matters! A spill off a rail car into a river is a different beast than a leak from a buried pipeline that may dribble down into a potable aquifer. So given that the volume-basis rate of spills is approximately the same for pipelines and rail, the real question should be which transport mode impacts the environment the least.

I suspect the folks in Lac Megantic already have an opinion on that.

This blog post was edited on 30 July 2013 to improve readability.


Association of American Railroads, "Just the Facts – Railroads Safely Move Hazardous Materials, Including Crude Oil," accessed 7 July 2013.

Association of Oil Pipe Lines (2013), "About Pipelines," accessed 7 July 2013.

Christou, P. A. (2010), "Silhouettes of Rail Cars, Tank Trucks and Chemical Tanks," accessed 7 July 2013.

Federal Railroad Administration Office of Safety Analysis (2013, July 5), accessed 7 July 2013.

National Transportation Safety Board, "34,434 Transportation Fatalities In 2011," accessed 7 July 2013.

Pipeline & Hazardous Materials Safety Administration, "All Reported Pipeline Incidents," accessed 7 July 2013.