The American Wartime Normalization

Why is the United States, which has almost never been invaded on its own soil, seemingly always at war?

Allen Huang
Politically Speaking
Jul 9, 2021


With the implementation of the Doha Agreement, reached between U.S. and Taliban leaders in 2020, U.S. forces have been withdrawing combat troops from Afghanistan. The U.S. armed forces have been stationed in this landlocked and unstable nation for 20 years, but they are now transferring command of existing military bases to Afghan national forces. All other NATO forces involved in the war in Afghanistan will be withdrawn alongside the Americans. The U.S. government expects the withdrawal to be complete by September 2021, when this extremely heavy page in U.S. history will officially be turned.

It seems that the War on Terror, announced in former President George W. Bush's 2001 congressional address and eventually developed into a massive military campaign spanning the first two decades of the 21st century, will finally come to an end. With the end of these wars, American society may be able to reflect more thoroughly on a question it has been pondering since the war began: Why is the United States, which has almost never been invaded on its own soil, seemingly always at war?

In her book "War Time: An Idea, Its History, Its Consequences," Emory University School of Law professor Mary Dudziak argues that the beginning of the War on Terror was not only the start of what is commonly thought of as "perpetual war," but also the continuation of "wartime" as a political concept in American society.

Photo by Army Master Sgt. Alejandro Licea of the DoD

It is necessary to explain what is meant by "perpetual war." The idea is that no matter how divided the parties in the United States are on social issues, a part of the political elite in charge of foreign policy always seems to favor military solutions to international crises. Their ideas repeatedly turn into concrete policies and programs, and those policies have changed remarkably little even as the presidency has shifted between conservative Republicans and liberal Democrats. Although often referred to as "neoconservatives," those who held such views were relied upon by four U.S. presidents of varying political ideologies. Leading international relations scholar Stephen Walt has argued that the neoconservatives maintained a close alliance with liberal political forces on foreign affairs in order to ensure that American values remained dominant in society.

American Traditional Wartime

Traditionally, "wartime" has often been defined simply as the period during which a nation is at war. However, because the meanings of both "war" and "time" have shifted over the course of American history, this simple definition no longer suffices. Although the United States has almost never experienced a major invasion on its own soil, it has continually intervened in or provoked wars abroad, and as these wars have dragged on, the American public has come to feel as if they hardly exist. This illusion is dangerous: if the powers granted to the government in wartime are constantly rationalized, then the United States can maintain a state of war in its governing psyche and thereby limit the rights and freedoms of the people. Moreover, if wartime is in fact the norm rather than an exceptional period, then law during war must be seen as the form of law we normally practice, not as a suspension of an idealized understanding of law.

The normalization of war in our daily lives, instead of being something irrelevant or innocuous, would mean an endless assault on American civil liberties.

To understand wartime, Dudziak argues, it is necessary first to understand the role of time as a medium. If time is understood as a medium, then its supposed neutrality comes into question. In fact, time has never been neutral, a fact reflected in the changing ways Americans have used it over the course of history. Although clocks have been used in the United States since the country's founding, for a long time there was no uniform standard of time. Until the end of the 19th century, many towns set their clocks by the sun, which caused great confusion for railroads and other forms of transportation. Efforts to standardize timekeeping eventually produced the four time zones the continental United States has today, which were codified into law during World War I. When Daylight Saving Time, briefly enacted as an energy conservation measure during World War I, was repealed at the federal level, American timekeeping was thrown into chaos again, with no coordination between the places that kept it and those that abandoned it; the confusion did not largely end until Daylight Saving Time was reincorporated into federal law during World War II.

From the creation of American time zones and the adoption of Daylight Saving Time, we can see that time as a medium is not natural but is regularly shaped by human manipulation. Wartime, as a medium, is similar: it is a synthetic set of ideas derived from social life and changed by people's tendencies and behaviors. When war becomes a nation's preoccupation, it changes not only how time is aligned but also what great social changes people expect. People assume that war is important and exceptional, but also that it will end. They likewise assume that time divides cleanly into wartime and peacetime, but this is not the case: while many people enjoy a seemingly peaceful life, the wars do not stop.

In times of war, parts of the law are reinterpreted in the name of war. In Schenck v. United States, widely regarded as the case that defined the modern understanding of the First Amendment's free speech clause, the Supreme Court held that the defendant, Socialist Party of America official Charles Schenck, had no First Amendment right to agitate against the U.S. government's military draft, given the exigencies of war. At the same time, Congress passed the Sedition Act of 1918, which explicitly prohibited publishing speech that obstructed the war effort or criticized the U.S. government, military and institutions during the war. In these episodes, we see for the first time in American history wartime's effects on American law, with the suppression of freedom codified and enforced.

How World War II Challenged American Wartime

For the American public, the rise of fascism in Europe and Asia heightened concern that American values of democracy and freedom were being challenged. For the politically attentive, Hitler's rise was not just a great military threat but a political counterpoint: they saw clearly how far an ambitious demagogue could go when propelled by populist agitation and unchecked by institutional restraints, and they came to appreciate more fully the importance of an independent judiciary.

When Japan bombed Pearl Harbor and the United States formally declared war and entered World War II, the erosion of individual rights under this same judicial system became even more apparent and urgent. As president, Franklin D. Roosevelt signed Executive Order 9066 in 1942, authorizing the Secretary of War to designate specific areas as military zones, thus paving the way for the internment of more than 100,000 Japanese Americans and Japanese nationals in the United States during World War II. The policy eventually led to a legal firestorm: Japanese American Fred Toyosaburo Korematsu was arrested for remaining in California in defiance of the exclusion orders after the U.S. West Coast was designated a military area. Civil liberties advocates opposed to the policy took up Korematsu's case, hoping to overturn the internment. The Supreme Court ultimately ruled that the government's action was not unconstitutional in the "emergency and peril" of war with Japan, and that it was reasonable to detain a group of U.S. citizens because of their "hostile" heritage.

When did World War II officially end as a time of war for the United States? Most would probably say September 2, 1945, when Japan formally surrendered aboard the U.S.S. Missouri, but that was not the view of the top echelon of the U.S. government, led by President Harry S. Truman. The atmosphere of hostility and conflict created by the brutal war was far from dissipated; Truman maintained that a state of war in the legal sense continued, a position the government held into the early 1950s. This was also reflected in law: in 1946, the U.S. Justice Department moved to deport a German citizen living in the United States on the grounds that he was a citizen of a hostile country in a state of war. When he appealed to the Supreme Court, the Court ultimately held that the president's special wartime powers allowed him to decide when the war was over; it did not end simply because the enemy had signed a capitulation.

World War II ended, but the World War II mentality lived on. The entire U.S. legal system, from the 1930s to the 1950s, conceptualized rights in terms of national security and extended or limited them accordingly. Everything from the internment of an entire ethnic group to whether schoolchildren must recite the Pledge of Allegiance was decided by the Supreme Court in the name of national security.

The Cold War and the Perpetuity of Wartime

As the differences in political position between the Western camp, represented by Britain, France and the United States, and the Eastern camp, represented by the Soviet Union, widened to irreconcilable levels, a concept that seems contradictory on its face was born: the Cold War. As a far more ambiguous wartime than World War I or World War II, the Cold War dominated U.S. foreign and social policy for more than 40 years, until the collapse of the Soviet Union.

However, as the United States and the Soviet Union built and expanded their nuclear arsenals and settled into a saber-rattling posture, many in the United States did not consider this wartime ambiguous at all; rather, they believed that all-out war could come at any moment, and that World War III could mean global destruction. This sense of tension spread from the grassroots of society all the way to the political hierarchy in Washington; one regional conflict and proxy skirmish after another drove U.S. domestic and foreign policy, which eventually hardened into an ideological project aimed solely at opposing the Soviet Union. With the transformation of the War Department into the Department of Defense and the relabeling of the military budget as "defense spending," Dudziak argues, the political ecology of the United States entered a codependent state of "war and non-war" during the Cold War.

From a historical perspective, the Cold War seems to be only a specific period of time. Yet even amid the era's anxieties and threats, American social life did not come to a halt, and Americans gradually became accustomed to living under the gaze of a nuclear-armed behemoth with a hostile political ideology. The Smith Act of 1940, which prohibited advocating or disseminating any attempt to overthrow the United States government, led directly to the demise of the far-left political parties in the 1950s, as the country went on a political witch hunt against communism instigated by Senator Joseph McCarthy and others. The "Red Scare" narrative fell out of favor in the late 1950s, and by the 1960s, although civil rights had entered American public discourse, many people remained silent about more radical left-wing political ideas and actions; the wartime mindset, apparently, had accustomed the public to not discussing them. Meanwhile, the U.S. government took the opportunity to interfere in the political and military affairs of countries such as Lebanon, the Dominican Republic and Grenada in the name of fighting communism, while the public silenced itself.

The unbridled expansion of executive power and the weakening of political pluralism in the United States had their roots in the Cold War's wartime mindset, which eroded much of the social welfare apparatus built during Roosevelt's New Deal. Although Democratic President Lyndon B. Johnson implemented some similar social reform measures, significant social and economic resources continued to be diverted in the name of "national security," to the point where the public and the media became increasingly accustomed to it. This mentality persisted after the end of the Cold War, until the American political process entered a whole new phase as a result of a tragic event.

If the War on Terror Fades, Will Wartime Disappear Too?

That tragedy was the 9/11 terrorist attacks, which killed 2,997 people. When civilian airplanes carrying innocent people became the weapons used by religious extremists to commit mass murder, and when New York was buried in dust and wreckage, George W. Bush knew he had to make an important decision, one that would completely redefine wartime.

Prior to 9/11, "terrorism" and "war" were largely unrelated terms. The response to an act of terrorism was different from the response to an act of war; the perpetrators of terrorist acts were punished through the legal system, not answered with a declaration of war. But all that went out the window when Bush solemnly announced before a somber audience of congressmen that he would pursue the "enemies of freedom" through military action in order to catch those behind the attacks, led by al-Qaeda chief Osama bin Laden. The United States went on to launch two major wars, first in Afghanistan and then in Iraq, both then sovereign nations. At the time, nearly all U.S. citizens, regardless of party or ethnicity, had witnessed the collapse of the World Trade Center and accepted the Bush administration's portrayal of it as an act of war. With little need to mobilize public opinion, the United States entered another state of war.

Although the United States did not formally declare war on either country, Congress quickly passed the Authorization for Use of Military Force to define the scope of the war on terror. The AUMF declared that the U.S. military could use "necessary and appropriate force" against anyone deemed to be involved in the attacks. Immediately afterwards, Congress passed the USA PATRIOT Act, a massive expansion of U.S. police powers to search persons and personal data, in the name of strengthening defenses against another extremist religious terrorist attack within the United States, particularly an Islamic terrorist attack.

From September 11, 2001 onward, the American public and media have been almost unanimous in their firm belief that the attacks changed everything. The attacks changed the course of history, and anyone who questioned the actions the United States took afterward was accused of "still living on September 10." From the initial targeting of the Taliban government in Afghanistan for harboring al-Qaeda, to the adoption of a new resolution accusing Iraq of harboring "weapons of mass destruction" and authorizing the use of force against it, American wartime came to be formally recognized as something that could be triggered by ideological differences.

These policies, Dudziak argues, turned the concept of wartime into an argument that later supported actions many believed crossed the line. In order to sustain the counterterrorism narrative, the U.S. government bypassed conventional procedures and entered a war, without thinking twice, against a completely different opponent than before: terrorism itself. Because the perpetrators of terrorism are often not states and often have no political entity behind them, the government was naturally able to define "terrorism" as broadly as possible. From prisoner abuse at Guantanamo and Abu Ghraib to indiscriminate attacks on civilians, the U.S. government has sought to deflect these accusations by using 9/11 as a shield amid growing criticism.

At the same time, laws created "temporarily" because of the wartime situation have become less and less temporary as the war continues. Guantanamo's prolonged detention of terror suspects without trial, justified on the grounds that the base is "not part of U.S. territory," has repeatedly been allowed to continue under the same "national security" rationale as in Korematsu. And with Edward Snowden's revelations about the PRISM program and a series of other mass surveillance operations by the U.S. government, the American public has begun to find itself unable to make fundamental changes to this trend beyond outrage and demonstrations; too many people have simply grown accustomed to it. They can no longer envision a country in which these systems are overturned.

The suspension of normal limits on executive power, which continued long after Bush, who set it in motion, left office, has also led legal scholars to reconsider a post-9/11 America still in wartime. Some scholars point out that as the U.S. military sank into the quagmires of Afghanistan and Iraq without achieving real victory, and as the traumatic memories of 9/11 slowly faded, American society entered a wholly new state: neither wartime nor peace. And this confusing status did not really change even after Osama bin Laden was killed by U.S. Navy SEALs. Although the United States withdrew from Iraq in 2011, the rapid rise of the Islamic State soon drew it back into the Middle East. With Yemen and Syria, also in the region, embroiled in seemingly unstoppable civil wars, it has become difficult for the United States to disengage fully.

Whatever their political positions and philosophies, U.S. presidents have never stopped worrying about terrorism. Even as U.S. troops prepare to leave Afghanistan after 20 years, the U.S. government and military continue to exert strong influence in the Middle East and will still use force when it serves their interests. Today, the U.S. defense budget is still around $700 billion, roughly double the 2001 defense budget; it far exceeds federal spending on infrastructure, education, diplomacy, and housing, and is second only to Medicare and Social Security.

Wartime, as an otherwise special and exceptional state, temporarily changes the pattern of people's lives; but as the definition of war and the international situation have changed, the U.S. government has increasingly abused that state. Now, as the international order the United States created after 9/11 becomes harder and harder to undo, the United States and the world are forced to accept a harsh reality: the United States will henceforth be accompanied by "hostile forces" that can never be completely eliminated, and will remain with them, forever, in wartime.
