The history of Namibia and German colonization
After brief Portuguese incursions in the late 15th century and Dutch explorations in the 18th century, it was the Germans who ultimately colonized the territory. German missionaries had been carrying out evangelizing work among the indigenous peoples since the mid-19th century, and German traders took possession of stretches of the coast in 1883. In 1884, the region officially became a German protectorate, known as "German South West Africa" (Deutsch-Südwestafrika).