The United States Has Never Not Been at War

The United States is a war corporation.
