Reconstruction (United States)

The Reconstruction era was the period of United States history following the American Civil War during which northern Radical Republicans attempted to integrate freed Black people into American society with protected political and legal rights, reforming the Southern states to this end through military occupation by the US Army.

General references

  • Du Bois, W. E. B. Black Reconstruction in America. 1935.