I actually don’t see any real difference between the American Empire and any other.
Svetlana Voreskova

America certainly became an empire in the formal sense after the Spanish-American War, when it acquired overseas territories, although the tendency was there well before.

But you forget, Svetlana Voreskova, that we are touchy about the term “empire.” Empire connotes “bad guys” and we know we are always the “good guys.” Being one of the good guys is part of our American birthright, like air-conditioning.

Still, if it looks like a duck, quacks like a duck, etc. :)
