Is this factual or anecdotal? And if it's the latter, how did you come to this opinion? I went to college in both the South and the North (and visited plenty of schools in the South as well), and in my experience it's rare to find visible, widespread, strongly liberal beliefs at Southern schools, as opposed to Northern schools.
How long has it been since you graduated?
I got my opinion from a survey of college professors; it shows the faculty shifting further left.