When Did Christianity Come To America?
Americans have had a long and complex relationship with Christianity, from the early days of settlement to the present. Christianity arrived in America in the 16th century as a result of colonisation by European settlers. Over the centuries, Christianity has come to dominate the religious landscape of America and has become woven into many Americans' sense of national identity.
Christianity first reached America by way of the Spanish. In 1513, the explorer Juan Ponce de León landed in Florida and claimed the territory for Spain, and he and his men are traditionally said to have held a Roman Catholic Mass, making them among the first Christians to arrive in what is now the United States. Catholicism would remain the dominant form of Christianity in the Spanish territories of Florida and the Southwest for centuries, and Catholic numbers in America surged again with immigration in the mid-19th century.
A more significant wave of Christian missionaries and settlers arrived in the early 17th century from Western and Central Europe. Protestantism, including Puritanism, was the main religion among these settlers, who founded the colonies of Virginia, Massachusetts and Connecticut. Some of the earliest settlements were religiously motivated: the Pilgrims who arrived at Plymouth in 1620 sought the freedom to practise their faith, and John Winthrop, leading the Massachusetts Bay colonists in 1630, famously urged them to build a 'city upon a hill' and establish a shining example of godliness.
In 1776, when the American colonies declared their independence from Great Britain, the new United States of America was founded on the ideal of religious freedom. This freedom resulted in a rapid growth in the diversity and number of denominations. By 1800, there were many different Protestant denominations in existence, each with its own distinct doctrines and teachings.
As the 19th century progressed, Christianity became more solidly entrenched in American culture. Around the turn of the 19th century, the religious revival known as the Second Great Awakening began, and by the 1830s and 40s millions of Americans had adopted a more evangelical, passionate and personal form of Christianity. By 1900, evangelical Christian values had become so deeply rooted in the American way of life that the term 'Protestant Establishment' was used to describe the dominant religious culture.
In the 20th century, Christianity in America entered a period of relative decline, especially with the emergence of secularism and the counterculture of the 1960s. The increasingly liberal social climate of the 1960s and 70s saw many Americans drift away from their Christian faith, and the proportion of Americans who identified as Christian fell from roughly 90% in the early 1980s to about 70% by 2014. Despite this, Christianity remains the dominant religion in America, and Christian values still shape much of American culture and public life.
Religion in The Colonies
Religion played a major role in the lives of colonial Americans, and colonial society was deeply embedded in religious culture. For example, in the Puritan colonies of Massachusetts and Connecticut, religious observance and devotion to Christian values were legally enforced, and deviation from the prescribed doctrine could be severely punished. Days of worship were strictly observed, although the Puritans themselves discouraged festivals such as Christmas, which they regarded as unscriptural; in other colonies, Christmas and Easter were celebrated with great enthusiasm. Religion provided an important moral and ethical framework for colonists, and most of the original 13 colonies had an officially established Protestant church, though colonies such as Pennsylvania and Rhode Island were founded on principles of religious toleration.
Regional and Demographic Differences
The impact of Christianity on American culture varies greatly depending on region, demographics and political leanings. For example, religious values are a much more important element of the culture of the Deep South than they are in the more liberal cities of the West Coast. At the same time, religious values remain a strong force even in some of America's most 'liberal' cities: New York City is often seen as the epitome of liberalism, yet it has a well-established network of churches that draw large congregations.
The Role of Christianity in the 21st Century
Christianity continues to be an important part of American culture in the 21st century, albeit in a subtler, less overt form. Religion still shapes many aspects of American society, from political rhetoric to the language used in public discourse. Perhaps the most significant role of Christianity in American society is the way it shapes the nation's moral and ethical code.
Impact of Globalisation
The role of Christianity in American culture has also been reinforced by the globalisation of faith. Thanks to the rise of the internet, individuals now have access to religious teachings and practices from all around the world. As a result, Christian values and beliefs have become more widely encountered, and even in traditionally non-Christian circles, individuals now have greater exposure to the Christian faith and the teachings of its many denominations.
Impact of Society
Christianity has also been shaped and influenced by the changing nature of American society. For example, the rise of feminism has seen some denominations become more accepting of female religious leaders and, in certain traditions, of the LGBT+ community. Christianity has likewise adapted to the changing nature of work and family life, with the rise of livestreamed Mass, virtual Bible studies and other online forms of worship.
Persecution and Discrimination
The main challenge for Christians in America today is adapting to a society that is becoming increasingly secularised, whilst also facing criticism and, in some quarters, hostility. At the same time, because Christianity remains the dominant religion in much of the United States, some Christian individuals and organisations have themselves been accused of seeking to exclude others on the basis of faith or belief, and have come under fire for their views on social issues such as abortion and same-sex marriage.