apondsong - They love telling others "you are going to burn in hell", if they don't believe the same way they do. I think they love telling people that line because that makes them feel safe and special.
Let's take an honest look at that statement, because it makes so many assumptions and incorrectly takes so much as truth.
It is a generalization, which, out of the gate, makes it inaccurate at best.
It also hints at reverse labelling. That is, if someone says "you are going to burn in hell", then they must be a Christian.
With regard to the second part of that statement, "if they (you) don't believe the same way they do" - this is true IMHO of every human being. In other words, we all, at some level, make a judgement of somebody who doesn't believe or act as we do. Sometimes we even judge the differing person as better than ourselves (how weird is that, but it is probably true).
That leads me to the last sentence, which I think has a lot of meat in it and takes a certain amount of integrity to admit its truth, especially the word "safe".
For myself, I could identify with it because I know in my life, when I have not been at peace with a belief, I would bounce it off others, sometimes even preach it, to see if it sticks with them. What a curious behavior, don't you think?
But getting to the subject line of this topic, can anybody specify a time in American history when we could dub our nation a "Christian nation"? Certainly it wouldn't be at its founding. If we looked at some of the lives and behaviours of our founding fathers and brothers, mothers and sisters, we couldn't possibly label them "Christian" according to most modern criteria. And those who deemed themselves "Christian" then would probably see today's Christians, in many sects, as heathens.
So, when has America ever been a "Christian" nation, and what is the proof?