In 1903, the German sociologist Georg Simmel (1858-1918) wrote an essay titled "The Metropolis and Mental Life." In it, he articulated a question that many people had been pondering: Could city living drive you crazy? In North America and Europe, in particular, many people had recently abandoned rural lives of agricultural work for urban ones, working in industry or in the services that supported it. Simmel wondered what effect this momentous change was having on mental health.
In the United States, neurologists such as George Miller Beard (1839-1883) and Silas Weir Mitchell (1829-1914) had been arguing since the 1860s that the stresses of modern life, typified by bustling, hectic urban environments, overtaxed the nervous system in susceptible individuals, resulting in a disorder called neurasthenia. Towering skyscrapers, crowded streets, jangling telephones, staccato telegraphs, and inescapable traffic all frayed the nerves.
Middle-class, white, Protestant city dwellers were thought to be particularly vulnerable to neurasthenia, which caused a wide range of mental and physical symptoms, including fatigue, depression, anxiety, and neuralgia. Others argued that Black Americans, migrating to northern cities from the South, might also struggle to cope in the new environment.
Neurasthenia ceased to be commonly diagnosed after World War One, though it continues to be diagnosed in Asia. But concern with cities remained. By the 1920s, social scientists at the Chicago School of Sociology had put the Windy City under the microscope. Chicago was a particularly appropriate subject for study.
Rebuilt after the Great Fire of 1871, Chicago boasted the world's first skyscrapers, and its new downtown was designed to emphasize commercial, rather than residential, development. A truly modern city.
Chicago School researchers investigated juvenile delinquency, suicide, homelessness, gangs, and mental illness. Although these problems were not restricted to cities, they came to be associated mainly with urban environments. One of these studies, which explored the geography of mental illness in Chicago, found that schizophrenia in particular was concentrated in the deteriorated slum areas (nicknamed "Hobohemia") surrounding the downtown core. The researchers suspected that the chaotic, disorganized, and unstable character of life there, along with the endemic poverty, lay behind these high rates. But were such problems limited to cities? Or could they lead to mental illness anywhere? Two post-World War Two studies would attempt to answer these questions by exploring mental health in two very different places.
The first location was an even more emblematic American city: New York. Researchers in the Midtown Manhattan Study surveyed adults on the Upper East Side to assess how mentally healthy or unhealthy they were. Their findings were something of a shock. The study found that fewer than one in five Manhattanites (18.5 percent) had good mental health. A quarter were incapacitated by their mental health problems, unable to work or function socially. These statistics captured the attention of the nation's media, with headlines blaring, "New York City Living for Nuts Only" and "City Gets Mental Test, Results are Real Crazy."
But did this mean that cities drove people insane? The second study, conducted at roughly the same time and co-led by Alexander Leighton (1908-2007), the same researcher who would complete the Midtown Manhattan Study, suggested otherwise. This was the Stirling County Study (Stirling County was a pseudonym), which did not investigate an urban environment at all, but a very rural one in southwestern Nova Scotia, Canada. Stirling County, which consisted of fishing, forestry, and farming communities, might have been as far removed from the Big Apple as one could get, but its rates of mental illness were strikingly similar.
Rather than blaming cities per se, therefore, the researchers suggested that the real culprits were underlying factors that could be present in both urban and rural environments: poverty, inequality, social isolation, and community disintegration. Other studies, which used rat models to investigate crowding, also suggested that the relationship between cities and mental health was more complex than people thought. But in many places, the baby was already being thrown out with the bathwater.
Urban renewal, justified in part by the belief that crowded inner cities were bad for mental health, was already well underway in North America and Europe. Boston's West End, for example, was razed to the ground, destroying what many observers argued was a vibrant, multi-ethnic community. Some contended that such measures resulted in more, not less, mental illness. Elsewhere, downtown cores emptied as people left for the suburbs. The neighborhoods that remained were often impoverished, disenfranchised, and underfunded, characterized by crumbling infrastructure, fires, and homelessness, none of which was good for mental health.
Perhaps the urban planners should have referred back to Georg Simmel. His conclusion in 1903 was that the relationship between cities and mental health was essentially ambivalent. For people who felt constrained and stifled in close communities where everyone knew their business, cities could be liberating. Cities made others feel detached, lonely, and unimportant. There are city mice and country mice, after all.