Fires have plagued American cities for centuries. During the 18th century, the Great Fire of Boston (1760), the First Great Fire of New York City (1776), the First Great New Orleans Fire (1788), and the Great Fire of Savannah (1796) each destroyed hundreds of buildings and challenged municipal authorities to improve safety in an increasingly risky environment. Beginning in the 19th century, with increasing commerce, rapid urbanization, and the rise of industrial capitalism, fires became more frequent and destructive. Several initiatives sought to reduce the risk of fire: volunteer fire companies emerged in all major cities, fire insurance developed to aid economic recovery, and municipal infrastructure like fire hydrants became ubiquitous to combat blazes. Despite significant efforts to curb this growing urban problem, fire dangers increased in the late 19th century as cities became epicenters of industry and their populations boomed.

The “great” fires of the late 19th and early 20th centuries, like those that took place in Chicago (1871), Boston (1872), Seattle (1889), Baltimore (1904), and San Francisco (1906), fundamentally altered cities. The fires not only destroyed buildings and took lives, but they also unearthed deep-rooted social tensions. Rebuilding in the aftermath of fire further exacerbated inequalities and divided cities.

While fire loss tapered off after 1920, other issues surrounding urban fires heated up. The funneling of resources to suburbs in the post-war white-flight period left inner cities ill-equipped to handle serious conflagrations. In the last few decades, suburban sprawl has created exurban fire regimes, where wildfires collide with cities. Extreme weather events, dependence on fossil fuels, deregulation of risky industries, and a lack of safe and affordable housing have put American metropolitan areas on a path to experience another period of “great” fires like those of the late 19th and early 20th centuries.