Civil war was inevitable. The Southerners would never have given up their slaves to anything short of armed conflict. Even though only a small percentage of them actually owned slaves, slavery was so ingrained in their way of life that they couldn't imagine blacks being free; the idea was simply too disgusting to them. But a lot of people in the North felt the same way. They didn't agree with slavery, but only because they didn't like the plantation system; they didn't want black equality either. Lincoln only freed the slaves because he felt it would be the best thing for the post-war country.

After the war the government went pretty soft on the South and let the Southern states pass some blatantly racist legislation, some of which wasn't repealed until the seventies.

Bottom line: war was the only way. Look who started it. Southerners fired on federal troops because they felt their plantation-oriented, agrarian, aristocratic way of life was threatened. Honestly, I could imagine some states still not having full equality if the Civil War hadn't happened, and if the North hadn't been so lenient and had kicked some ass in the South after the war, a lot of later problems could have been avoided. The South needed to get fucked up big time, plain and simple.