Racism in the South

I know that racism exists everywhere, but I'm curious whether it's worse in some places, such as the South in the United States. I ask because my sister-in-law, who is from North Carolina, recently commented that she hates coming to the North (Pennsylvania) because "the blacks do not know their place." When I asked her what she meant, she gave as an example that where she is from, "the blacks step off the sidewalk or move aside to let us pass when they see us coming." She also talked about how people in the South embrace being called a "cracker" because it meant "white people cracked the whip to keep blacks in line."

I was completely shocked by the things she said; I honestly wanted to spit in her face. It really got me wondering whether this is typical in the South... Is racism there worse than in the rest of the country, or about the same?