Originally Posted By: 3800Series
The South. Many people think of the South and picture racists and ignorant twats. It's not always the case, but sadly that's the perception. I mainly like New Orleans and the old French Quarter.
Personally I think the South is the heartland of America. Super polite people, always willing to help each other.
A few months back I had a flat, and before I could even get the jack out of my car, two different people had stopped and offered to help. I turned them down but thanked them. I've been up north, and you could be in the middle of nowhere, 300 miles from the nearest town, and people won't even stop for a moment, even at -30 degrees.
I don't think any one thing makes or symbolizes America. I think it's the people who come together to make sure everyone is taken care of, and the people who help ensure others have a good quality of life.
Hmmm... I would say that may be true for parts of the rural South, but it's definitely not representative of all of it. The urban South tends to be pretty bad. Lots of crime, poverty, corruption, etc. My city is on just about every "worst cities" list there is. People can be downright nasty here, and will kill each other over the stupidest stuff.
I think people in rural areas in general tend to be a little more polite, perhaps because they're around fewer people in a typical day.