this post was submitted on 11 Aug 2023
World News
While nothing is ever black and white, it's undeniable that the relationship between the West and Africa has been deeply exploitative. It's also fairly well understood what people mean by "the West": vassal states of the US that are politically subservient to it and rely on it for military protection, countries aligned around the failing liberal capitalist ideology the US promotes.
Kicking out Western neocolonial regimes is a prerequisite for any positive change in the countries of Africa. Only once these countries are under the control of the people living there can they chart their own course. As long as they remain under the yoke of Western hegemony, the interests of the empire will always be put above those of the people who live there.