Ammonia - The World's Most Important Material?

"Ammonia … deserves the top position as our most important ma­terial. As explained in the previous chapter, without its use as the dominant nitrogen fertilizer (directly or as feedstock for the synthe­sis of other nitrogenous compounds), it would be impossible to feed at least 40 percent and up to 50 percent of today's nearly 8 billion people. Simply restated: in 2020, nearly 4 billion people would not have been alive without synthetic ammonia. No comparable existen­tial constraints apply to plastics or steel, nor to the cement that is required to make concrete (nor, as already noted, to silicon).

"Ammonia is a simple inorganic compound of one nitrogen and three hydrogens (NH3), which means that nitrogen makes up 82 per­cent of its mass. At atmospheric pressure it is an invisible gas with a characteristic pungent smell of unflushed toilets or decomposing animal manure. Inhaling it in low concentrations causes headaches, nausea, and vomiting; higher concentrations irritate the eyes, nose, mouth, throat, and lungs; and inhalation of very high concentrations can be instantly fatal. In contrast, ammonium (NH4, ammonium ion), formed by the dissolution of ammonia in water, is non-toxic and does not easily penetrate cell membranes.

First reactor at the Oppau plant in 1913

"Synthesizing this simple molecule was surprisingly challenging. The history of inventions includes famous cases of accidental dis­coveries; in this chapter on materials, the story of Teflon might be the most apposite example. In 1938, Roy Plunkett, a chemist at DuPont, and his assistant Jack Rebok formulated tetrafluoroethyl­ene as a new refrigerant compound. After storing it in refrigerated cylinders, they found that the compound underwent unexpected polymerization, turning into polytetrafluoroethylene, a white, waxy, slippery powder. After the Second World War, Teflon became one of the best-known synthetic materials, and perhaps the only one that made it into political jargon (we have Teflon presidents, but seemingly no Bakelite presidents -- though there was an Iron Lady).

"The synthesis of ammonia from its elements belongs to the opposite class of discoveries -- those with a clearly defined goal pursued by some of the best -- qualified scientists and eventually reached by a per­severing researcher. The need for this breakthrough was obvious. Between 1850 and 1900 the total population of the industrializing countries of Europe and North America grew from 300 million to 500 million, and rapid urbanization helped to drive a dietary transi­tion from a barely adequate grain-dominated supply to generally higher food energy intakes containing more animal products and sugar. Yields remained stagnant but the dietary shift was supported by an unprecedented expansion of cropland: between 1850 and 1900, about 200 million hectares of North and South American, Russian, and Australian grasslands were converted to grain fields.

"Maturing agronomic science made it clear that the only way to secure adequate food for the larger populations of the 20th century was to raise yields by increasing the supply of nitrogen and phos­phorus, two key plant macronutrients. The mining of phosphates (first in North Carolina and then in Florida) and their treatment by acids opened the way to a reliable supply of phosphatic fertilizers. But, there was no comparably assured source of nitrogen. The mining of guano (accumulated bird droppings, moderately rich in nitrogen) on dry tropical islands had quickly exhausted the richest deposits, and the rising imports of Chilean nitrates (the country has extensive sodium nitrate layers in its arid northern regions) were insufficient to meet future global demand.

"The challenge was to ensure that humanity could secure enough nitrogen to sustain its expanding numbers. The need was explained in 1898 in the clearest possible manner by William Crookes, chemist and physicist, to the British Association for the Advancement of Sci­ence, in his presidential address dedicated to the so-called wheat problem. He warned that 'all civilized nations stand in deadly peril of not having enough to eat,' but he saw the way out: science com­ing to the rescue, tapping the practically unlimited mass of nitrogen in the atmosphere (present as the unreactive molecule N2) and con­verting it into compounds assimilable by plants. He rightly concluded that this challenge 'differs materially from other chemical discover­ies which are in the air, so to speak, but are not yet matured. The fixation of nitrogen is vital to the progress of civilized humanity. Other discoveries minister to our increased intellectual comfort, lux­ury, or convenience; they serve to make life easier, to hasten the acquisition of wealth, or to save time, health, or worry. The fixation of nitrogen is a question of the not far-distant future.'

"Crookes's vision was realized just 10 years after his address. The syn­thesis of ammonia from its elements, nitrogen and hydrogen, was pursued by a number of highly qualified chemists (including Wilhelm Ostwald, a Nobel Prize winner in chemistry in 1909), but in 1908 Fritz Haber -- at that time professor of physical chemistry and electrochem­istry at the Technische Hochschule in Karlsruhe -- working with his English assistant Robert Le Rossignol and supported by BASF, Ger­many's (and the world's) leading chemical enterprise, was the first researcher to succeed. His solution relied on using an iron catalyst (a compound that increases the rate of a chemical reaction without alter­ing its own composition) and deploying unprecedented reaction pressure.

"It was a no smaller challenge to scale up Haber's experimental suc­cess to a commercial enterprise. Under the leadership of Carl Bosch, an expert in chemical as well as metallurgical engineering who joined BASF in 1899, success was achieved in just four years. The world's first ammonia synthesis plant began to operate at Oppau in September 1913, and the term 'Haber-Bosch process' has endured ever since.

"Within a year, the Oppau plant's ammonia was diverted to make the nitrate needed to produce explosives for the German army. A new, much larger, ammonia factory was completed in 1917 in Leuna, but it did little to prevent Germany's defeat. The postwar expansion of ammonia synthesis proceeded despite the economic crisis of the 1930s, and continued during the Second World War, but by 1950 syn­thetic ammonia was still far less common than animal manures.

"The next two decades saw an eightfold increase of ammonia pro­duction to just over 30 million tons a year as synthetic fertilizer enabled the Green Revolution (starting during the 1960s) -- the adoption of new superior wheat and rice varieties that, when supplied with ad­equate nitrogen, produced unprecedented yields. The key innovations behind this rise were the use of natural gas as the source of hydrogen, and the introduction of efficient centrifugal compressors and better catalysts."


How the World Really Works

Now I Know: How To Save a Sinking Church

Winchester Cathedral,
You're falling all down...



You need a diver and a lot of concrete

 

How To Save a Sinking Church

Pictured above is Winchester Cathedral, a massive church in Winchester, England. The cathedral was constructed over the course of five centuries -- builders broke ground in the year 1079, and generations thereafter kept building expansions through 1532. It's an incredible architectural site, and if you want to visit, you won't be alone; more than 350,000 tourists stop by each year.

But a bit more than a century ago, visiting Winchester Cathedral was a fool's errand. It was, literally, falling apart. There were cracks developing in the walls, some so large that owls could roost in them. Chunks of masonry were falling onto the floor. Some pillars were beginning to tilt inward. Something was wrong.

And the only person who could save it was the man pictured below, William Walker.

No, he's not in some sort of Halloween costume. He's wearing an early 20th-century diving suit.

It may seem odd that you'd need a diver to help repair a church; as you can plainly see in the image at the top, the cathedral is built on land, not water. But in this case, the cathedral may as well have been built on a river, because it was built close enough to one for the waterway to have an impact. Not too far from the building is the River Itchen, and the groundwater system associated with that river runs right under the church. The soil beneath the church, as a result, is rather soggy and requires a solid, water-resistant foundation. And unfortunately for those of us who appreciate the architectural value of medieval churches, the builders of a millennium ago used beech logs for that foundation. And, being made of wood, those logs began to rot over time. The church slowly began to sink into the ground below. It took a while, but by 1905, the building was in such bad shape that its administration feared collapse could be imminent.

The good news for fans of the cathedral was that the solution, in theory, was simple: replace the foundation with something more resilient. As Amusing Planet summarizes, "an architect Thomas Jackson and a civil engineer named Francis Fox were swiftly brought in. Fox and Jackson's solution was to dig narrow trenches underneath the walls of the building, remove the decaying beech logs and fill them with concrete to firm up the foundations." It was straightforward enough -- until they put the plan into action. Amusing Planet continues: "when workers dug below the water table, the trenches filled up with water quickly and even a steam pump couldn’t hold it back long enough."

And that's where William Walker comes in. Fox and Jackson hired Walker, a well-regarded Navy diver, to remove the logs and put in the concrete, as demonstrated by the illustration below.

The task was an arduous one. Per the BBC, Walker "worked in almost complete darkness in the peaty water 13 feet above his head which was filled with sediment." The diving suit weighed 200 pounds -- per another BBC report, "because it took him so long to put on and take off his heavy diving suit, when he stopped for a break he would just take off his helmet in order to eat his lunch and smoke his pipe." And while the image above suggests that the amount of space that needed to be filled wasn't enormous, it understates the task by a lot -- builders dug 235 such trenches, and Walker filled all of them. In total, per the BBC, Walker installed "more than 25,000 bags of concrete, 115,000 concrete blocks, and 900,000 bricks."

The process took six years.

Walker completed the job in 1911 but didn't live much longer; his life was claimed by the Spanish Flu epidemic in 1918. Today, he is honored at the cathedral with a statue, which -- like the cathedral itself -- you can still visit, thanks to his efforts.



Bonus fact: The Spanish Flu didn't originate in Spain, and the misattribution of the disease can be fairly blamed on World War I. Because of the war, most European nations were censoring news that could suggest that their citizens were in peril; these nations didn't want other nations to sense potential weakness. Spain, though, was neutral in World War I and, as Wikipedia's editors explain, was "unconcerned with appearances of combat readiness [ . . . ] so its newspapers freely reported epidemic effects, including King Alfonso XIII's illness." As only Spain reported on cases, Spanish citizens and those in neighboring countries alike thought that, at first, only Spain was impacted, suggesting -- incorrectly -- that the virus got its start there. (The actual origin of the disease is still debated today, but there's no reason to think it started in Spain.)

From the Archives: Underwater Repair Men: When the pipes from a reservoir break, how do you fix them?

We Forget Just How Hard Farming Is


Today's selection -- from Cats vs Dogs. Whatever happened to the locust plagues that once swarmed the center of America?
 
"In the summer of 1873, black clouds drifted east from the foothills of the Rocky Mountains towards the newly settled farms of Nebraska, Iowa, Minnesota and the Dakotas. The pioneer families had no warning: the sky went dark at midday, the air filled with a sound like a thousand scissors. Then the clouds fragmented and locusts fell like hail onto crops of corn and wheat. In a few hours, the insects had devoured months of work. Locusts had been invading farms on the American frontier on and off for decades, but the irruption of the mid-1870s entered into legend. Many families gave up farming and fled to the cities. On 26 April 1877, John Pillsbury, the governor of Minnesota, called for a day of prayer to plead for deliverance from the locusts. A few days later, the insects rose up and left as inexplicably as they had come.

"When the Rocky Mountain locusts swarmed, they darkened the skies over vast swathes of the western and central USA, from Idaho to Arkansas. The number of insects was mind-boggling: one reliable eyewitness estimated that a swarm of locusts that passed over Plattsmouth, Nebraska in 1875 was almost 3,000 kilometres long and 180 kilometres wide. And they were devastating. 'You couldn't see that there had ever been a cornfield there,' one farmer said after a swarm passed through his land. Yet between these episodes of frantic fecundity, the locusts seemed to disappear.

"The Rocky Mountain locust (Melanoplus spretus), a big, beefy species of grasshopper, was considered the greatest threat to agriculture in the West. So entomologists tried to learn everything they could about the insects -- what trig­gered them to swarm, what they ate and how they reproduced. But after the spring of 1877, the locusts vanished and never plagued western farmers again. Within thirty years of Minnesota's official day of anti-locust prayer, the insect was extinct. The last live specimen was found by a river on the Canadian prairie in 1902.

"No one mourned the loss, and scientific interest in the locust waned. In the 1940s and 1950s, when farmers began to wage war on their enemies with insecticidal chemicals, a few researchers began to speculate about what could possibly have seen off the locust so spectacularly in those pre-pesticide days.

"During the disastrous outbreaks of the 1870s, farmers fought back with every tool they could find or invent. They deliberately set their fields on fire. They dragged tar-coated hunks of metal through the ground, hoping to trap locust hatchlings in the sticky goo. Nothing helped much. When desperate pioneer women tried to protect their vegetable gardens by draping blankets over them, the locusts ate the blankets before moving on to the vegetables.

"Whatever had done for the locust, it seemed, was some event far beyond the capabilities of nineteenth-century farmers. As the extinction coincided with a time of dramatic environmental change across the West, there were plenty of plausible explanations. Perhaps the locusts had depended on the fires that Native Americans had routinely set to keep the prairies open. Or maybe their most crucial habitat had been shaped by the huge herds of bison that were now all but extinct.

"Most standard entomology texts claimed that extreme fluctuations in population, like those that took place when the locusts swarmed, were a sign of a species in trouble, fighting to recover a balance with its environment. The sweeping changes that came with settlement, some scientists suggested, pushed the locusts through cycles of population explosion and collapse, and in the end wiped the species out.

"When Jeffrey Lockwood, an insect ecologist at the University of Wyoming, was hired to explore the biology of grasshoppers in 1986, the post-mortem on the Rocky Mountain locust had not gone beyond this sort of general speculation. Lockwood wanted to know more, and he hoped that somewhere there were still a few specimens of the long-vanished locust to study. Among the high peaks of the Rockies in Montana and Wyoming were glaciers where swarming insects had fallen, become immobilised by the cold, and died. Some of these glaciers might still hold frozen remains of the Rocky Mountain locust.

"Lockwood and his students spent summers searching in the ice at remote spots high in the Rockies. They began their hunt at Grasshopper Glacier in Montana, hoping it might live up to its name. Sure enough, they found some scattered body parts that might have once belonged to Rocky Mountain locusts, but without whole bodies there was no way to prove these bits had not belonged to some other, still living, species of grasshopper.

"'Finally, after four years of fruitless searching, we found the mother lode,' says Lockwood. On Knife Point Glacier in the Wind River Mountains of Wyoming, they recovered 130 intact bodies of Rocky Mountain locusts, the legacy of a swarm that had risen out of the river valleys of western Wyoming in the early 1600s. The antiquity of the frozen insects -- confirmed by radiocarbon dating -- proved that locusts had irrupted long before European settlers changed the face of the West. The reproductive frenzies, which at times produced enough insects to blanket the entire state of Colorado, were normal events in the history of the locust. Further study of Knife Point Glacier revealed deposits of locust parts throughout the layers of ice, indicating that swarms passed over the mountains at regular intervals during the centuries before the locust's extinction.

"To find more clues to what killed off the locust, Lockwood began to scour the scientific literature of the late 1800s. There he found the writings of Charles Riley, an entomolo­gist who had spent much of the 1870s and 1880s searching for ways to kill the locusts.

"Riley had mapped what he called 'the permanent breeding zone' of the locust, the territory where mating adults and eggs could be found every summer, regardless of whether the locusts were swarming. For a species that could spread across much of the continent during outbreaks, this home base was surprisingly small. Between swarms, the locusts lived only in the river valleys of Montana and Wyoming, where they buried their eggs in the damp ground along the banks of streams. These fertile spots were the same places the incoming settlers chose to farm.

"Riley had experimented with ways to control the number of locusts. Ploughing, he discovered, could push locust eggs so far down into the soil that they would fail to hatch. Flooding the ground where eggs had been deposited also killed many of the young. Riley concluded that agriculture itself -- the processes of ploughing and irrigation -- were the strongest weapons against the locust. But because less than 10 per cent of the land in the western USA was arable, he doubted that farming would ever have had a significant impact on the locust.

"A century after the locust disappeared, Lockwood took Riley's map of locust egg-laying areas in the 'permanent zone' and superimposed it on a map of land under cultiva­tion for corn, wheat or hay in 1880. He found that he had charted the geography of an extinction. In the 1880s, when the locust population had shrunk during an intermission between outbreaks, every corner of its breeding grounds was being farmed. The settlers, Lockwood suggests, had destroyed their nemesis without ever knowing it, simply by ploughing the land and watering their crops. 'The most spectacular "success" in the history of economic entomology -- the only complete elimination of an agricultural pest species -- was a complete accident,' he says." 

Cats vs Dogs: 99 Scientific Answers to Weird and Wonderful Questions about Animals
 
author: New Scientist  
title: Cats vs Dogs: 99 Scientific Answers to Weird and Wonderful Questions about Animals  
publisher: Nicholas Brealey Publishing  
date: Copyright New Scientist 2020  
page(s): 72-77

Citizen Coke

Today's encore selection -- from Citizen Coke: The Making of Coca-Cola Capitalism by Bartow J. Elmore. As it grew to be a national and international company, Coke's future solvency was contingent upon the perpetuation of cheap sugar production:
 "Fortunately for Coke, there was lots of sugar available for purchase when Coke was first invented, as sugarcane fields cov­ered the tropical world, but this was truly an ecological conquest replete with historical contingencies. While today many people think of sugar as a ubiquitous commodity closely linked to the economies of the Caribbean and South America, especially Cuba and Brazil, not a single sugarcane stalk would have been found in these trop­ical regions before the fifteenth century. First domesticated in New Guinea 12,000 years ago, Saccharum officinarum (sugarcane) was native to Southeast Asia and first came to the West via Persian traders in the eighth century AD. Though sugar quickly became a desired spice and a medicinal dietary supplement for wealthy aristocrats, Euro­pean sugarcane cultivation nonetheless remained modest up to the 1400s, confined to rich soils abutting the Mediterranean Sea.

Saccharum officinarum -- sugar cane

"But sugar proved popular, and Westerners soon looked to expand sugarcane cultivation into new regions of the world in order to sati­ate their cravings. As they had with black pepper, cinnamon, and other coveted spices, Old World elites turned to state institutions to help them acquire greater quantities of the 'sweet salt' they desired, and by the fifteenth century, aristocrats in Western Europe secured government financing for colonial sugar cultivation projects in the imperial periphery. For the next three centuries, European powers cultivated sugar throughout the tropical world, relying on the labor of enslaved men and women. As a result, the sweet foodstuff became a cheap commodity by the end of the eighteenth century, available in abundance for the West's working class.

"Sugar offered incredible caloric density, making it the ideal dietary staple for both plantation field hands in the Caribbean colonies and factory laborers working long hours in the burgeoning industrial centers of nineteenth-century Europe. It made factories and planta­tions productive because it kept laborers on their feet. Along with coal, sugar would become a critical fuel feeding capitalist expansion in the nineteenth century.

"The United States government recognized the value of this dense energy source in the early 1800s and offered subsidies to help develop a domestic sugar empire. Beginning in 1803, when Thomas Jefferson executed the Louisiana Purchase, Congress imposed tariffs on imported raw sugar as a means of insulating Louisiana growers from interna­tional competition. Tariff-protected Louisiana growers expanded their operations between the War of 1812 and the 1890s, producing over 17,000 tons of sugar by 1823, with total US imports topping 30,000 tons that year. By the end of the twentieth century, the gov­ernment helped expand sugar cultivation to the American West and Midwest, offering subsidies and tariff protections to sugar beet grow­ers in the temperate climates of California, Colorado, and Nebraska.

"American farmers were not the only ones benefiting from the federal government's sweet sugar deals; industrial refiners got a big boost as well. Refiners developed the infrastructure that turned raw sugar from sugarcane and beet farms into refined white crystals fit for the consumer market. In the early 1800s, these factories were often quite rudimentary, often using open fires to vaporize juices mashed from sugarcane, but by the end of the century, refineries featured steam-powered engines and complex centrifuges capable of distilling lily-white crystals from beets and sugarcane. Investments were substantial and included railroad construction, sugar barrel manufacturing, and processing plant operation. After the Revolu­tionary War, the United States was far behind its European coun­terparts in refining, so in the tariff of 1789, the government placed heavy duties on imported refined sugar in order to help American processors gain a foothold against foreign competitors that could otherwise outsell them by a wide margin. This was part of a larger government tariff initiative to stimulate American manufacturing across all industries.

"Throughout the nineteenth century, the federal government would continue to shield domestic refiners, increasing duties on imported refined sugar to as high as 9 cents. With government protection, over fifty refineries emerged in the United States by 1870, up from just a handful seventy years earlier, but the success of many of these operations would be short-lived. Ultimately, the real beneficiaries were not small businesses but a handful of wealthy elites that gobbled up competitors to form huge monopolies.

"No one did this better than Henry O. Havemeyer, the co-owner, along with his brother, of Brooklyn-based sugar refinery company Havemeyer and Elder, incorporated by Henry's father in 1863. By the 1880s, Havemeyer had taken over leadership of the New York-based firm, which had become one of the biggest refinery concerns in the country. The family had amassed a fortune totaling more than $3 million by the 1880s, and Henry was determined to add to the cof­fers. By the time he became a partner in Havemeyer and Elder in the 1860s, he was dismayed at how competition cut into profits, and he set out to buy out struggling rival firms."

Citizen Coke: The Making of Coca-Cola Capitalism
 
author: Bartow J. Elmore  
title: Citizen Coke: The Making of Coca-Cola Capitalism  
publisher: W.W. Norton & Company  
date: Copyright 2015 by Bartow J. Elmore  
page(s): 78-80