Truthbook Religious News Blog


How Will Decline Of Christianity Affect The Future Of USA

Is Christianity in decline in America? When you examine the cold, hard numbers, it is simply not possible to come to any other conclusion. Over the past few decades, the percentage of Christians in America has been steadily declining. This has been especially true among young people. As you will see later in this article, there has been a mass exodus of teens and young adults out of U.S. churches. In addition, what "Christianity" means to American Christians today is often far different from what "Christianity" meant to their parents and grandparents. Millions upon millions of Christians in the United States simply no longer believe many of the fundamental principles of the Christian faith.

Churches are shrinking, skepticism is growing, and apathy about spiritual matters seems to be at an all-time high.

See "Link to External Source Article" below to read further.


From The Urantia Book:

99:2.1 Institutional religion cannot afford inspiration and provide leadership in this impending world-wide social reconstruction and economic reorganization because it has unfortunately become more or less of an organic part of the social order and the economic system which is destined to undergo reconstruction. Only the real religion of personal spiritual experience can function helpfully and creatively in the present crisis of civilization.

Please see Weakness of Institutionalized Religion

Link to External Source Article
