How Does Christianity Influence American Culture

Brittany Daigle
Dr. Trahan
English 102
09/16/2011

Christianity

Religions, or belief systems, are perhaps the strongest force in society. Christianity is the main religion in America, and American morals are largely built around it, even though our founding fathers clearly stated, "As the Government of the United States of America is not, in any sense, founded on the Christian religion" (The Treaty of Tripoli). Christianity has influenced American culture for years. Most people view Christianity in modern society as positive, but there are critics who view religion as a whole as negative and believe it can ruin our society. Christianity is a teaching based on the life…
It usually starts at a very young age by bringing the child to church or to Sunday school. Most Christians begin teaching their children the stories of the Bible and how what you do affects you in the afterlife. Most Christians also believe in baptizing children at an early age so they can be cleansed of all sin. In the Christian religion, even if you have lived a life of sin, as long as you repent in the end, you will be saved and given eternal life. This is not so in other religions; Hinduism, for instance, does not hold this belief. In it, for everything you do wrong, you will be punished.

Although Christianity is slowly evolving and trying to become even more appealing to society, there are still many downfalls. Today's society is steadily evolving, and the views on how the universe works today are a totally different story from how people thought it worked back then. Science is starting to explain things about how the universe works that religion used to explain. People are getting more educated and becoming more indifferent to the teachings of Genesis in the Bible. Genesis tells us that the Earth was made in seven days and is only several thousand years old, and that humans were