Imperialism

Did European imperialism make life better in Africa? Imperialism is a policy of extending a country's power and influence through diplomacy or military force. During this period, only a small part of Africa was independent; the rest was controlled by the British, the French, the Spanish, the Portuguese, the Belgians, the Germans, and the Italians.