I spent yesterday afternoon at our local National Wildlife Refuge.
I loaded up my wife, son and dog and drove around the place to get out of the house. We got out at the observation towers and looked at the birds and the alligators. It was a beautiful day. I noticed lots of people were making use of the walking trails. There were families out fishing on the lake.
It was this experience that got me thinking and inspired yesterday’s article about conservative liberalism and collective goods. The reason places like this exist in America is that progressive liberalism dominated most of the 20th century. Conservative liberalism was discredited for generations by the Great Depression. The modern conservative movement and libertarianism emerged as a backlash against the New Deal and rose to power through backlash politics after the Civil Rights Movement. The liberal Right has held power for most of the 40 years since the Reagan administration.
I started thinking about how radically different our local economy would be if history had played out otherwise. In the early 20th century, malaria was still endemic in Alabama:
“We analyzed the decline of malaria in Alabama, an archetypical Deep South cotton state that experienced high levels of malaria incidence well into the 1930s. We assessed the theory that movement of tenant farmers away from mosquito breeding grounds caused by the 1933 Agricultural Adjustment Act was the central factor in the decline of Southern malaria. Finding that this theory is not supported by the available evidence, we argue that targeted public health interventions, along with the development of state and local public health infrastructure, represent a more plausible explanation. …
From any perspective, the overall decline in malaria deaths in the South after 1933 is surprising. Malaria is a disease of poverty, and the rural South was hit particularly hard by the Great Depression. Economic dislocation, along with the increase in potential mosquito-breeding grounds where cotton or other cash crops had been grown, might have been expected to further increase the likelihood of malaria outbreaks in the region
We argue that the decline of malaria in the South was a consequence not of out-migration of tenants caused by the AAA but of federally sponsored public health interventions, most notably large-scale drainage efforts. After 1936, the expansion of federal intervention in the development of state and local public health infrastructure helped to ensure that drainage and other public health efforts were better organized and that measures were in place to prevent the disease from reemerging. ...
Between World War I and the beginning of the New Deal, drainage work was viewed as a prohibitively expensive way of attempting to rid the South of malaria. Beginning in the winter of 1933–1934, however, the federal Civil Works Administration put unemployed Southerners to work on drainage projects. Loosely supervised by the PHS, these early efforts appear to have at times been ineffective and even counterproductive.
The Civil Works Administration was dissolved in early 1934, and the drainage program was continued by the Federal Emergency Relief Agency. In 1935, the federal Works Progress Administration (WPA) began providing labor for drainage projects and continued to do so throughout the 1930s. By the end of the decade, public health workers were increasingly focused on the development of methods of lining ditches to make them more permanent.
In 1935, Congress passed the Social Security Act (SSA). Title VI of the SSA authorized federal grants-in-aid of $8 million annually for the development of state and local public health infrastructure and for the training of public health workers. Amendments to the SSA in 1939 raised this figure to $11 million. …”
“Alabamians continued to experience yellow fever throughout the 1860s and into the early 1870s. Outbreaks in Mobile and Montgomery were the most frequent during that time as these two cities were major transportation routes through which outsiders brought in the disease. In 1873, a severe epidemic affected Alabama from Huntsville, Madison County, in the north, south to Mobile Bay, with approximately 124 deaths across the state. The subsequent 1878 epidemic hit New Orleans (4,000) and Memphis (5,150) particularly hard. Alabama recorded approximately 250 deaths that year, many of whom were refugees fleeing other epidemic-plagued areas. Throughout the 1880s and 1890s, Alabama experienced sporadic occurrences of yellow fever. Cases and deaths were documented at Brewton, Escambia County, (1883), Decatur, Morgan County, (1888), Fort Morgan, Baldwin County (1893), and several other cities during 1897.
In 1875, the Alabama Department of Public Health was organized, mostly by members of the Medical Association of the State of Alabama, who recognized the need to combat epidemic diseases, such as yellow fever, more effectively. The state also created county health departments in each of the then 65 counties, making Alabama the first state to have health departments in every county. Therefore, when yellow fever epidemics struck in subsequent years, health departments under the guidance of the Alabama Department of Public Health were able to establish quarantine lines and issue suggestions on how best to combat the disease. Prior to 1875, the Medical Association of the State of Alabama also suggested strict quarantines as a means to prevent yellow fever. Physicians associated with both organizations through the nineteenth century also recommended preventive procedures such as distributing carbolic acid to thwart the spread of the disease. These health organizations also maintained records of statistics of epidemics collected by physicians.
With medical advancements and the introduction of germ theory in the last half of the nineteenth century, physicians and scientists began to study the biological origins of yellow fever. Mobile physician Josiah Clark Nott (1804-1873) lost four children to yellow fever in 1853 and wrote extensively on the disease. But, it was not until decades after his death that scientific study provided an explanation about the transmission of yellow fever. Epidemiologist Carlos Finlay first hypothesized that yellow fever was transmitted by mosquitoes before the Royal Academy of Medical Sciences in Havana, Cuba, in 1881. In 1900, after years of defending his theory, Finlay and an international team of scientists and physicians proved the mosquito theory. Medical pioneers Walter Reed, Aristides Agramonte, James Carroll, and Jesse Lazear worked with Finlay in Cuba to solve the mystery of yellow fever’s transmission.
The last cases of yellow fever verified in Alabama occurred in 1905, a few years after the validation of the mosquito transmission theory. Cases were noted in Castleberry, Conecuh County, Montgomery, and at the Mobile quarantine station. No deaths were recorded.”
The eradication of malaria and yellow fever in Alabama, both mosquito-borne infectious diseases, was one of the greatest public health accomplishments of the 20th century. It is now something we are accustomed to taking for granted in the United States. Had scientific progress and public health intervention not eliminated these diseases, it wouldn’t be possible to enjoy such a day in the park looking at alligators in wetlands. In Africa, malaria killed about 285,000 children before their fifth birthdays in 2016.
“In early 20th century America, pellagra seemingly emerged out of nowhere when, in late summer of 1906 near Mobile, Alabama, Dr. George H. Searcy noted a peculiar malady affecting 88 patients at the Mount Vernon Insane Hospital, a state facility for the “colored insane.” Searcy soon realized that he was facing an outbreak of pellagra with its characteristic four-ds. He quickly reported his findings to his Medical Association of Alabama colleagues in 1907.
Soon, pellagra would become epidemic throughout the South, and thus the discovery of its etiology became crucial and much-debated. Lombroso’s theory of toxic cornmeal resurfaced, while eugenicists, promoting a popular but misguided theory of social “betterment,” suggested that its origin lay in racial or hereditary factors. In 1912 the privately endowed Thompson-McFadden Commission made an on-site study of pellagra in the South Carolina mill village of Spartanburg and concluded that the disease was infectious in nature. This erroneous report would haunt the medical profession for years as they searched for the real cause and cure of pellagra.
In 1914, the U.S. Public Health Service’s Dr. Joseph Goldberger (1874-1929) was already known for his success in fighting U.S. epidemics when he was asked to investigate pellagra. Through observations and experiments at Southern orphanages and prisons, Goldberger found that the disease was not infectious, but instead was caused by a deficiency in the diet. Many poor Southerners consumed a diet solely of meat, meal, and molasses. Low-wages driving high-deficiency diets made the disease economic in origin. Goldberger’s conclusions were correct but unpopular in the South because of their negative implications for the Southern way of life. Also, many in the medical community remained unconvinced since all of Goldberger’s studies were conducted in controlled environments, unlike those of the Thompson-McFadden Commission. In 1918 he rectified this by confirming his conclusions with his own mill village studies, aided by a brilliant statistician, Edgar Sydenstricker.”
Pellagra was a disease of dietary deficiency, and it was solved by the U.S. Public Health Service at the height of the sharecropping economy. Don’t even get me started on how the free market brought about that catastrophe. The disappearance of pellagra is something else we take for granted.
“Up until the early 20th century, deer in the Southeast were excessively pursued. Settlers bought, sold and traded deer parts with Europe and other communities for profit. Subsistence hunting followed by a commercial trade in deer hides peaked around 1700.
Market hunting, over-harvest, subsistence hunting, and lack of effective law enforcement were the main causes that drove the whitetail deer populations in the Southeast to become nearly extirpated. Venison was in high-demand, and with little regulations of deer harvest, the species was facing havoc. During the early 1900s, nearly every southeastern state reached its lowest deer population level.
There were very few, if any, deer that were spotted in the southeastern United States during the early 20th century, and most remnant deer were seen on privately owned land. In addition to over-hunting, habitat change due to timber management, and agricultural purposes throughout the late 19th and 20th centuries dwindled the habitat that deer thrive in.
Timber and agriculture for the market was heightened shortly after the market hunting phenomenon. Timber – a deer’s habitat – was being cut and sold, and agriculture in the South for crops such as corn and cotton was essential to the economy. Deer were losing their habitat and had nowhere to thrive, causing an even bigger downward spiral of the population throughout the region.
Age of Restocking
Hunters, foresters and other outdoor enthusiasts were beginning to recognize the decline of deer, which prompted the implementation of game laws and establishment of state wildlife agencies. Shortly after 1900, wildlife agencies existed in almost every southeastern state, and the replenishing of whitetail deer began to unfold.
When wildlife agencies and game laws were established, southeastern states began their restocking efforts. Much of the restocking took place during the 1940s-1970s with most states implementing a “buck-only” law during the restocking period. It is estimated that many southeastern states had roughly 50,000 deer around the 1940s-1950s, but the exact population levels are unknown. Today, states in the Southeast have anywhere between roughly 720,000 to 1.5 million deer, according to most recent population estimates.
The first restocking occurred during the 1920s. Most deer came from remnant deer herds in Alabama, while a few were obtained from North Carolina.
“Some counties were not restocked, but rebounded with remnant deer,” said Chris Cook, deer program coordinator for Alabama Wildlife & Fisheries.
Today, the deer population in Alabama stands at about 1.5 million deer. Season regulations vary by region and county in the state.
“We are looking at what data we have and what data we need to get,” Cook said. “We are looking at what seasons and bag limits and impacts it takes to make better decisions on where we are at.”
“Without deer hunters, it would possibly be impossible for us to run our agency,” Cook said.
“Before European settlers arrived in North America, there were millions of wild turkeys spread across what are now 39 U.S. states. But by the 1930s, wild turkeys had disappeared from at least 20 states and their total population had dropped to 30,000.
Over the next few decades, a series of reforms, conservation efforts and demographic changes helped bring wild turkeys back from the brink of extinction—making them one of the United States’ biggest wildlife success stories.
Wild turkeys, or Meleagris gallopavo, were not the only native U.S. species that were in danger. By 1889, there were only 541 American bison left. By the 1930s, when wild turkey populations hit their lowest, the passenger pigeon had already become extinct. The crisis in native species populations galvanized conservationists, who helped pass the Federal Aid in Wildlife Restoration Act of 1937, also known as the Pittman-Robertson Act. This act placed a tax on hunting guns and ammunition to pay for wildlife restoration efforts.
The 1930s also saw a major shift among the U.S. population that would end up benefiting wild turkeys, albeit unwittingly. The Great Depression forced many families to abandon their farms, leaving the land open for wild turkeys to expand into. “As these farms slowly reverted to native grasses, shrubs, and trees, wild turkey habitat began to emerge,” according to the National Wild Turkey Federation’s website. …”
In the early 20th century, the whitetail deer and wild turkey had been hunted nearly to extinction in the South; they were brought back only through government intervention and restocking programs.
“Is the Chattahoochee dirty?
In the 1960s and 1970s dissolved oxygen levels in the Chattahoochee River threatened fish. E. Coli levels discouraged government agencies from building access points in some areas, says Jason Ulseth of the Chattahoochee Riverkeeper, a nonprofit that patrols and protects the river. But today, he says, “the river is now cleaner than it has been in decades.” Higher levels of bacteria like E. coli and pollutants are more likely after heavy rains and in the summer when temperatures rise. And though sewage spills into the river have decreased in the last decade or so, they still occur. But there is more to do. Most parts of the Chattahoochee are monitored by the Georgia Environmental Protection Division and United States Geological Survey.
How did it get clean?
In 1973 the state passed the Metropolitan River Protection Act, setting strict rules on new development within 2,000 feet of the river. (Existing structures, such as the ones occupied by restaurants Canoe and Ray’s on the River, were grandfathered in.) After the Riverkeeper filed a lawsuit in 1995 against Atlanta over sewage spills, a federal judge ordered a $3 billion overhaul of the city’s once-decrepit sewer system—and a $20,000 fine per spill. Environmental officials have noticed a sizable decrease in city sewage spills: from 1,452 in 2001 to 627 in 2014. More awareness about pollution has also helped, says Jerry Hightower, a park ranger who’s spent more than 35 years at the recreation area.”
“Atlanta is fairly unique among big American cities in that, because it sits in the headwaters of a major river, its pollution affects the entire length of the river. Atlanta’s sewage discharge has been a primary culprit in the river’s degraded state.
The river was clean enough for swimming in the 1940s. Yet by the 1960s, in large part because of the neglect of Atlanta’s sewer system during the city’s explosive growth, the river had become “grossly polluted,” state environmental officials told Congress.
Despite some significant improvements in the 1970s following the initial passage of the Clean Water Act, by the 1990s, Atlanta’s sewer system had fallen into disrepair. The city’s failure to regularly invest in maintenance and upgrades and to repair thousands of leaks had a disastrous effect on water quality: hundreds of millions of gallons of raw sewage spilled into the Chattahoochee every year, carrying more than 4 million tons of phosphorus. The river often had sewage floating on its surface. The West Point Lake, formed by the Chattahoochee downstream from Atlanta, was said by scientists to be “exhibiting the classic signs of death by pollution” and was completely devoid of oxygen much of the year. …
As a result of the upgrades, more than 400 million gallons of sewer spills per year were eliminated, and, by 2014, the volume of untreated sewage that flowed into the river and its tributaries had been reduced by 99 percent compared with the 1990s.”
By the 1960s, the Chattahoochee River had become grossly polluted by Atlanta’s sewage, which affected everyone living downstream of Atlanta, all the way to Apalachicola in the Florida Panhandle. As a result of a series of environmental laws and federal lawsuits, the Chattahoochee River is now cleaner than it has been in 70 years, which is why so many people come here to fish in the “Big Bass Capital of the World.” In recent years, more people in the Columbus area have started rafting and tubing on the Chattahoochee.
The upshot of all this is that the environment, public health, and the economy are intertwined, and often we don’t see the long-term economic benefits of actions that are economically painful in the short term. In my area, we have reaped tremendous economic benefits from eradicating malaria, restoring the whitetail deer and wild turkey, and cleaning up the Chattahoochee River. Both the Eufaula National Wildlife Refuge and Lake Eufaula, which were created by the U.S. Army Corps of Engineers, bring in tourist dollars. We also still have capitalism and private property, but we are richer than we would be otherwise.
Suppose there had been no public investment in our state and local transportation infrastructure. The beach traffic that comes through here every spring and summer, and which props up the local economy, would similarly dry up, hurting countless small businesses. It is just difficult for me to understand how conservative liberals and libertarians do not see how their systematic neglect of collective public goods (transportation infrastructure, education, public health, environmental conservation, worker safety, etc.) leads to economic underdevelopment in the long term. They have a tendency to conflate collective public goods with socialism and communism.
Are public parks, public lakes, and wildlife refuges synonymous with communism? How about hospitals, state universities, police departments, and fire departments? If it were not for agricultural research at our state universities, what would the economy of the modern South look like today? What about the Post Office? None of these institutions fits easily within the framework of conservative liberalism and libertarianism, which are constantly trying to reduce them to market forces.
Fanaticism in the cause of individual liberty is immoral and socially and economically harmful in the long run. A more balanced society that values a basket of collective goods is a better society.
Note: In a future article, I will expand on how the Great Depression and the Civil Rights Movement were, respectively, the turning points at which conservative liberalism and progressive liberalism went off the rails in their monomaniacal pursuits of individual liberty and equality.