German ties in West Africa ‘not a charity campaign’

Germany is pulling its military out of the terror-stricken Sahel region. At the same time, it wants to expand ties in West Africa because Europe’s security depends on Africa, Germany’s top diplomat says. The streets of Dakar, the capital of Senegal, are changing. They’re...
