From the First World War to the end of the Cold War, the U.S. military saw dramatic changes in the American “way of war” with each conflict. Like shadows on a parade field, military institutions and war reflect in part the society that creates them. Although many Americans view themselves as a peace‐loving people and war as an aberration, war has been a regular part of American history, integral to the way the nation developed. Despite divisions among Americans, the United States has justified its wars as in defense of American lives, property, or ideals. Policymakers have also taken the nation into war for various strategic, economic, and political reasons. But since the idea of Old World balance‐of‐power wars or wars of subjugation over other nations has been anathema to Americans' self‐image, the United States has usually mobilized for war in highly idealistic crusades—for liberty or democracy.
To understand the changes in the U.S. military's “way of war,” it is best to start in the late 1800s and early 1900s, the years when the United States developed its first modern military. During this time the American military fought a series of arduous and harsh wars against Native Americans. Congress had reduced the authorized strength of the Army from more than 57,000 to only 27,000 men, and with this small force the Army was responsible for manning not only frontier posts but also coastal fortifications. As the country expanded farther and farther west, it became increasingly difficult for so few soldiers to battle the Indians and protect the country. What brought an end to the Indian Wars was the massacre at Wounded Knee, along with similar massacres at Sand Creek, Bear River, and other sites. (Allison 199)
These massacres have come to cast the Indian Wars in a bad light. Both the Army and the Indians had been placed in a no-win situation by unfortunate timing and circumstances. For the Indians, the choice was to accept assimilation on reservations or to fight for their independence; neither offered hope for a happy outcome. For the Army, a poorly coordinated and often contradictory policy of pacification and military action made it unpopular with American citizens and soldiers alike. For many soldiers and officers, forcing people to give up their native ways and live on squalid reservations held no honor, nor did fighting an enemy who was usually outgunned and outmanned. When the Indian Wars were finally over, there was a renewed push toward professionalism in the American military. (Allison 200)
The birth of the modern National Guard gave the government a way to help keep U.S. citizens safe during times of need; whether the crisis is a riot or a natural disaster, the National Guard serves as the first responder. Alongside the National Guard, another change was beginning to reshape the small American Army. As William T. Allison writes in American Military History, “With the industrialization in the United States and Western Europe, the advent of new weapons technology, and lessons learned from the Civil War, the American military engaged in another push toward professionalization similar to what it had experienced in the decades following the War of 1812.” (Allison 201) By studying military developments in England and Germany, Army progressives argued that in the modern industrialized world, relying on a skeletal standing army filled out by militia would no longer suffice to defend American security and national interests.
In 1903, the Army adopted a general staff model that promoted better cooperation between the various bureaus and field commands. Along with the change in command structure came changes in technology and tactics. With the development of new weapons in the early 1900s, such as the modern machine gun, the Colt .45, and the .30 caliber Krag-Jørgensen rifle, the Army was beginning a new chapter in how it conducted war. (Allison 203) For the United States to have a successful “way of war,” it also had to build a modern navy. Industrial growth and the corresponding increase in American trade abroad demanded a larger, modern fleet of naval vessels to protect American shipping and trade interests beyond home waters. Moreover, as European navies, particularly the Royal Navy, eagerly adopted coal-fueled, steam-powered vessels complete with steel hulls and huge guns, the coast of the United States seemed at greater risk of foreign attack than ever before. To address these very real concerns, the Navy had to modernize, and in 1907 a modern American battleship fleet departed on its first world tour. These upgrades to the United States Army and Navy form the background to the development of the U.S. way of war during the First World War.
When war began in Europe in 1914, the United States government declared neutrality in the conflict. Instead, the U.S. became involved in the Punitive Expedition of 1916, prompted by an incursion into New Mexico by the Mexican forces of Francisco Villa that included a raid on the town of Columbus in March, resulting in fifteen deaths among its citizens and local soldiers. A force of some 12,000 soldiers, led by Brigadier General John J. Pershing, was sent to fight Villa's band of little more than 500 men. Pershing and his men penetrated over 400 miles into Mexico attempting to destroy Villa's force while trying to preserve the relationship between the U.S. and Mexico. In January 1917, however, the U.S. government ordered the withdrawal of Pershing's forces following the promulgation of a new constitution and the election of Carranza as president of Mexico. The tensions that had characterized American-Mexican relations began to recede, while relations with Germany grew increasingly strained. As Allison writes, “Although the Punitive Expedition had made early use of emergent military technology such as aircraft, and had demonstrated some benefits of the introduction of motorization, the U.S. Army in 1917 was not yet a modern army.” (Allison 220) This meant that the United States' involvement in the First World War would be one of trial and error.
A pattern had emerged in America's wars. War usually began with setbacks, largely because the nation, although willing to go to war, was militarily unprepared. Early defeats were followed by preparation and retaliation, and ultimately decisive redeeming victories. The belief in the inherent righteousness of the cause, in the natural fighting ability of the American citizen-soldier, and in the nation's ability to mobilize its resources gave Americans an extraordinary optimism about what they could achieve militarily. Wars against Indians, Mexicans, and Spaniards in the nineteenth century reinforced these views, as with relatively small loss of life suffered by U.S. citizens the United States gained enough territory to claim overwhelming, if not always total, victory. In World War I, President Woodrow Wilson called for a crusade to “end all wars” and to make the world “safe for democracy.” The American war effort helped defeat the German empire, create a German republic, and make the United States the financial capital of the world. (Russell 4-17)
In April 1917, Germany's renewal of unrestricted submarine warfare led directly to American entry into the World War. Because of the location of the war and the already established direction of the Allied war effort, the American Expeditionary Forces had little choice about conforming to the broad thrust of Allied strategy. This war was the first large-scale, global conflict in which the United States had taken part. The recruitment, deployment, and maintenance of the American Expeditionary Force represented the largest military challenge in American history to that time, and the most complex since the Civil War. The plan called for around three million men on the ground in Europe. (Allison 223)
The fighting on the Western Front was a type of warfare that no country had seen before. Trench warfare posed problems for both sides, and at this stage manpower mattered more than technology. The American way of war therefore had to include learning how to fight in a trench-warfare environment. One problem the Americans had during World War I was their reluctance to learn from others. General Pershing, who led the American Expeditionary Forces, held the Allied armies in low regard, believing that the American Expeditionary Forces had little of value to learn from them. He insisted on high standards of drill, discipline, and marksmanship among American units, which was fine as far as it went, but he missed the point about the value of massed and coordinated fire against area targets rather than the traditional aimed fire against point targets emphasized in American training doctrine.
World War I was the first time that the new American Navy would see combat. The war at sea tends not to receive the attention often lavished on the Western Front, but the control of sea lines of communication and the neutralizing of the German fleet by the Royal Navy were critical to the Allies' capacity to fight and win the war. The ability of the American Expeditionary Force to take the field at all depended on sea transportation, and the greatest threat to this was posed by the successive German submarine offensives blockading Britain and attempting to cut the sea links across the Atlantic. The American industrial effort provided dramatically greater numbers of merchant vessels and escort ships, and American ships played an important part in laying the mine barrage in the North Sea between Scotland and Norway to restrict German submarines' freedom of movement. The U.S. Navy grew to an unprecedented size of almost half a million men and over 2,000 ships, ranging from simple transports to battleships. (Allison 232)
American involvement in the First World War was far more important to Americans than it was to anyone else, largely because the war ended when it did, in November 1918. Had the war gone on into 1919, many believed, the weight of American numbers and the sharp increase in American combat capability that would inevitably have come with broader experience, coupled with the rapid decline in German manpower and military effectiveness, would have proven irresistible. Pershing lamented that the war ended in 1918 because, in his estimation, the American Expeditionary Force was just approaching maturity as an effective fighting force when the armistice suddenly ended the fighting. The American Army had to learn how to fight this new type of war; it was a war of trial and error not only for the United States but for many other countries. (Allison 235)
The American experience in the First World War convinced many in military and political circles that the American military establishment needed another major overhaul. The expansible army concept, dating back to the early 19th century, had not worked well in the massive mobilization effort required in 1917. Industrialized warfare required new and improved military organization. In 1920, Congress passed a new national defense act that radically altered the organizational structure of the Army and provided for a better-trained and better-prepared military force should war again threaten the United States.
As the United States industrialized, optimism about America's fighting ability focused on superior weaponry. At the turn of the century, Adm. Alfred T. Mahan's doctrine of sea power, emphasizing the use of a modern fleet, promised swift and total victory. In the 1920s and 1930s, Gen. Billy Mitchell of the Army Air Service helped develop the doctrine of strategic airpower as a technological means to achieve quick and total victory. In World War II, in response to the Japanese attack on Pearl Harbor and in a crusade against fascism, Americans waged war on land, sea, and air, including conventional and ultimately nuclear bombing of urban areas to achieve decisive victory and unconditional surrender of the enemy. (Russell 43-57)
In 1939, as Nazi Germany invaded Poland, the United States once again proclaimed its neutrality in the conflict growing in Europe. Relations between the United States and Japan had deteriorated steadily after the mid-1930s, largely over Japanese actions in China and the retaliatory sanctions the United States imposed on oil exports to Japan. The Japanese attacks on the American Pacific Fleet at Pearl Harbor and on American and Filipino forces in the Philippines brought the United States into the war on the Allied side once again. The Pacific Fleet was badly damaged by the Japanese, but the vital aircraft carriers were at sea and untouched, and thus still a serious threat to Japan's plans.
While the war in the Pacific remained an overwhelmingly American war, the landscape in Europe and the Mediterranean was much different. To raise a sufficient number of American soldiers, a draft was implemented once again, the third time the U.S. had had to rely on one, after the Civil War and World War I. The Second World War was much different from the first. Technologically advanced weaponry was seen throughout the war: planes capable of delivering powerful bombs played a large role, and improvements to armored vehicles and tanks, as well as better machine guns and artillery, changed the way the war was fought. The American way of war came to depend more on technological advances than on pure manpower. The creation of the first atomic bomb, and the use of two of them on the Japanese cities of Hiroshima and Nagasaki, changed the way wars would be fought forever. (Allison 275) World War II was the greatest and most destructive conflict in history, and one that helped cement the United States' rise to global dominance. The Allies could not have won the war without the involvement of the United States, and with American involvement they ultimately could not lose it, either. (Allison 275)
After World War II there was another change in the American way of war. The Korean War had shown how difficult it could be to achieve limited objectives in the fight against communism without the use of nuclear weapons, without a congressional declaration of war, and without full mobilization. It had also once again exposed the delicate balance between military command in the field and civilian control of policy objectives, as well as the difficulties of matching military objectives with war aims. NSC-68 had been put to the test.
NSC-68 was a 58-page top-secret policy paper issued by the United States National Security Council on April 14, 1950, during the presidency of Harry S. Truman. It was one of the most significant statements of American policy in the Cold War. NSC-68 largely shaped U.S. foreign policy for the next 20 years, making containment of Communist expansion a high priority. The strategy outlined in NSC-68 arguably achieved ultimate victory with the collapse of the Soviet Union and the subsequent emergence of a “new world order” centered on American liberal-capitalist values. Truman officially signed NSC-68 on September 30, 1950, and it was declassified in 1975. (Walter 507-511)
The difficulty of the Cold War was how it challenged the traditional American way of war. Fighting a war without fighting was something the U.S. had never faced before; the slightest wrong move could have started an all-out nuclear war between the United States and the U.S.S.R. Nuclear weapons rapidly reached such destructive levels that “winning” a nuclear war became a questionable concept. The Cold War posed a major challenge to American views of war and the military. Containment of the Soviet Union led to large standing military forces, but even these did not produce a sense of military security, for the USSR also developed intercontinental ballistic missiles and thermonuclear weapons. Before it ended in 1991, with the total collapse of the Soviet empire, the forty‐year Cold War represented an unprecedented period of U.S. uncertainty over national security. During the Cold War, the U.S. government refrained from the use of total military force in Korea and Vietnam. But the policy of limited war clashed with the traditional goal of total victory. The Korean War ended in a frustrating stalemate, the Vietnam War ultimately in defeat. After the United States had fought for more than seven years to prevent it, the Communist victory in Vietnam was a severe blow to Americans' optimism, sense of righteousness, and sense of military prowess, which did not return until the collapse of the USSR and the American victory in the Persian Gulf War of 1991. (Michael)
The evolution of the U.S. military and of the American “way of war” is an enduring process that will not end. From the brute strength of World War I to the technologically driven World War II, wars, like the countries that fight them, are constantly evolving. New weaponry, tactics, and machines have greatly changed the battlefield, and since the end of the Cold War there have been even more changes to the American way of war. This constant evolution will not end unless peace is somehow achieved worldwide; until that day, the U.S. military and the American way of war will continue to change.
Allison, William Thomas, Jeffrey Grey, and Janet G. Valentine. American Military History: A Survey from Colonial Times to the Present. Upper Saddle River, N.J.: Pearson, 2013. Print.
Sherry, Michael. In the Shadow of War: The United States since the 1930s. 1995.
Weigley, Russell F. The American Way of War: A History of United States Military Strategy and Policy. 1973.
Hixson, Walter L. "What Was the Cold War and How Did We Win It?" Reviews in American History 22.3 (1994): 507-511.