The world turned on its head
Was WWI really the beginning of the end of empires?
The Great War brought about the end of some empires, while others expanded on the ruins of those that fell. Viewing WWI as the end of imperialism is too Eurocentric a notion and needs to be revised.