With baptism rates in Baptist churches down, mainline denominations in what has been called a “statistical downfall,” conversions to Christianity on the decline, evangelism and mission budgets drastically cut in most churches, and people longing for a revival of old that never seems to come, has the American church kissed faith goodbye? Or has God withdrawn his favor from it?

What do you think needs to happen for many churches in this country to reverse their dismal downward spiral and avert what looks like an inevitable demise?