Articles

Review: Volume 50 - Second World War

In May 1943, a specially established RAF squadron made its permanent imprint on military aviation history by flying a high-risk, low-level, nighttime attack against German hydro-electric dams vital to the Nazi armaments industry in the Ruhr Valley. A comparatively tiny part of Air Chief Marshal Sir Arthur Harris' four-month-long Battle of the Ruhr, this one raid had an impact out of all proportion to the small number of aircraft involved. It highlights the synergy of science and technology, weapons development and production, mission planning and practice, and unflinching courage in the execution of a highly dangerous bombing raid. Furthermore, it established a legend that still resonates today.

This Osprey Command title looks closely at the early life, military experiences and key battlefield exploits of Field Marshal Bernard Law Montgomery, first Viscount Montgomery of Alamein (1887-1976), perhaps the best-known, most highly respected and most controversial British general of World War II. Monty's reputation was made in command in North Africa, the Mediterranean and then North-West Europe. Arguably his best-known achievement was rebuilding a dispirited and defeated Eighth Army and inflicting a decisive defeat on Rommel at El Alamein. Montgomery's style and exercise of command and his personal reputation were largely shaped by his highly driven, but often difficult and enigmatic, personality. He made an incalculable contribution to the Allied victory in Europe, and his leadership played a crucial role in transforming the British Army into a war-winning weapon.

Alex de Quesada reveals the full history of the US Coast Guard in World War II in this Elite title. In particular, the book draws attention to the little-known story of how the US Coast Guard manned a number of the landing craft on D-Day in 1944, as well as providing crucial anti-U-boat patrols throughout the war years. A number of Coast Guard servicemen were lost in these two campaigns, and their undeniable contribution to the US war effort deserves greater recognition. The Coast Guard also provided aviators and gunners to the Merchant Marine and manned Port Security Services. These roles are all fully explained and illustrated with rare photographs and specially commissioned artwork.

Horatius Murray (1903-1989) was commissioned into the Cameronians (Scottish Rifles) in 1923. He played football for the Army and excelled at many sports. In 1935, he went to Staff College before training with the German Army in 1937. He commanded the 3rd Battalion The Cameron Highlanders in 1940 before being sent to North Africa, where he commanded the 1st Gordon Highlanders. Despite being wounded at El Alamein, he went on to command no fewer than four divisions (as recounted in this memoir). After heading Scottish Command he became C-in-C Allied Forces Northern Europe. After retiring in 1961 he dedicated himself to worthy causes, notably as Chairman of the Royal Hospital for Incurables, Putney.


Germany and the Second World War

Germany and the Second World War (German: Das Deutsche Reich und der Zweite Weltkrieg) is a 12,000-page, 13-volume work published by the Deutsche Verlags-Anstalt (DVA) that took academics from the military history centre of the German armed forces 30 years to complete. [1] [2]


Like the analogously named book about World War I, this small book is replete with excellent maps, great photos, fascinating fact-boxes, and reader-friendly infographics. But of course, limiting a vast subject like “World War II” to “fifty things” is going to leave some gaps.

Perhaps the most significant omission to my mind is the internment of Japanese Americans in the United States. This involved the forced relocation and incarceration in 1942 of between 110,000 and 120,000 people of Japanese ancestry, 62% of whom were U.S. citizens. Men, women, and children were sent to camps in barren, inhospitable locations. As many as 25 persons lived in spaces intended for four. Their belongings, businesses, and savings were confiscated. (Losses were estimated by the government as more than $200 million in 1942.) None were ever found guilty of disloyalty; a 1980 U.S. Government Commission concluded the incarceration had been the product of racism.

Japanese American Children in an American internment camp

Similarly, Britain’s roundup of Italians and Germans (including Jewish citizens from those two countries who had fled to Britain to avoid the Nazis) gets no mention whatsoever.

The author includes a “blurb” on the 1944 Warsaw Uprising of the Polish Resistance, but nothing at all about the 1943 uprising of the Warsaw Jewish Ghetto, one of the more remarkable acts of resistance in modern history. [You can read a summary of what happened here.] Nor is there any mention of the Katyn massacre, a series of mass executions of Polish nationals carried out by the Soviet secret police in 1940 and only acknowledged by Russia in 1990. (Churchill and FDR both knew about what happened at Katyn, but chose not to criticize their Soviet ally.)

Jews from the Warsaw ghetto surrender to German soldiers after the uprising. (Photo by Keystone/Getty Images)

Another omission seemed unfortunate to me. Although the author devotes a relatively large section to the British code breakers of Bletchley Park and Alan Turing, it is a shame he did not take the opportunity to report on how the British government “rewarded” Turing for work acknowledged as “essential” in defeating enemy U-boats and helping the Allies at D-Day. (In fact, by some accounts, the work at Bletchley Park shortened the war in Europe by two to four years. And yes, the post-war world does receive some coverage, so it cannot be said to be outside the purview of the work.) Turing was prosecuted in 1952 for homosexuality, forced to undergo chemical castration in lieu of imprisonment, and died of cyanide poisoning in 1954 (whether self-induced or not has never been conclusively established).

Nevertheless, the author found many ways to include engrossing aspects of a huge subject as well as some “fun facts” (like the derivation of code names for various military operations) and gives a good, if incomplete, overview of what happened during the war. Importantly, I don’t think anyone is going to be bored by the history lessons in this book.

Evaluation: This book does a very good job of introducing the subject of World War II to students. All the eye-popping pictures and facts will no doubt inspire further inquiries, at which time the omitted portions of the history will become clear. Great maps and infographics with plenty of photos will make the time fly as you learn the basics. A brief “who’s who” photo gallery and glossary are at the back of the book.

Rating: 3.5/5

Published in the U.S. by QEB Publishing, 2015

Note: This book, like others about World War II, throws around some pretty big numbers about casualties, but they are not necessarily easy to conceptualize. An excellent animated data visualization by Neil Halloran entitled “The Fallen of World War II” helps translate the abstract numbers into terrifyingly relatable terms. The video first analyzes soldier fatalities by nation, then civilian deaths, and finally offers a perspective on WWII in the context of previous conflicts and those that followed. It is exceptional and unforgettable, and well worth the eighteen minutes. You can watch it here.


ABOVE: Two combatants sit at rest. The K98k Mauser is above the Springfield M1903A4. Their respective armies’ helmets lie next to them.

One definition of a sniper rifle is “a precision rifle used to ensure more accurate placement of bullets at longer ranges than other small arms.” The word “sniper” is derived from the snipe, a bird that was very hard to shoot because its flight path was highly erratic. The military sniper came into being around the turn of the 18th century.

The first true sniper rifle is generally thought to be the British Whitworth rifle, invented in 1854 by Sir Joseph Whitworth under commission from the British War Department. It was a single-shot, muzzle-loaded, .45 caliber, percussion-fired rifle with an accurate range out to 2,000 yards. Around 13,000 of them were built between 1857 and 1865. The Confederate States of America used some of them during the American Civil War.

What made the Whitworth stand out and gave it such excellent accuracy was the design of its rifling – which it didn’t really have. The bore of the barrel was hexagonal, with a twist of one turn in 20 inches. The bullets were long for the caliber and hexagonal instead of round. They fit very tightly in the bore and had much less friction than round bullets fired through standard rifling.

Sniper rifles advanced in design through the Franco-Prussian War of 1870, when the first bolt-action breech-loading rifles became available. The French had the 11mm Chassepot bolt-action rifle; the Prussians, the 15.4mm Dreyse needle gun.

The first skirmish of the First World War took place on August 2, 1914 at a small village named Joncherey in France. This was the initial battle on what was to become the Western Front. By the end of 1914, armies on both sides had stalled their advance. Trenches were dug, with what came to be known as “No Man’s Land” between the opposing forces. So began a war of attrition. During this period snipers were used by both sides to pick off any soldier who exposed any part of his body for as little as three seconds. The War to End All Wars (named in error, as was proven a few years later) saw Germany and Great Britain make great strides in sniping and sniper rifles up to the end of the war in 1918.

As the 1930s closed, it became obvious that the old animosities were leading up to another global conflict – this time much greater than the last one. Germany was re-arming at a rapid rate. Japan was in the process of creating its “Greater East Asian Co-Prosperity Sphere,” announced June 29, 1940. Russia stated that by 1938 six million troops had qualified for the “Voroshilov rifle badge” and that Soviet munitions factories had built over 53,000 Mosin-Nagant sniper rifles. Great Britain still had stocks of its P1914 Mk1 (T) sniper rifle left over from the first conflict. Japan had a number of different sniper rifles, including the 6.5 mm Type 97 and the 7.7 mm Type 99. The United States had dropped most of its sniper programs between the two wars. There was only a small training school at Camp Perry, Ohio. However, the Marine Corps had always prided itself on its marksmen and encouraged target practice on an individual basis. This changed when a sniper program was initiated at Fort Bragg, North Carolina at the end of 1942.

All the world’s major powers were ready to go to war, and they were bringing their snipers with them in a big way.

Mauser Gewehr 98 / Karabiner 98k (K98k)

Service Dates: 1898-1945
Used by: 11 countries including Germany
Wars Since World War I: 1918 German Revolution, Finnish Civil War, Russian Civil War, Turkish War of Independence, Spanish Civil War, World War II, Second Sino-Japanese War, Chinese Civil War
Designed by: Peter Paul Mauser
Patented: Sept 9, 1895
Manufactured by: Mauser, Deutsche Waffen- und Munitionsfabriken, Haenel, Sauer & Sohn, Waffenwerke Oberspree, V. Chr. Schilling Co., Simson, Imperial Arsenals of Amberg, Danzig, Erfurt, Leipzig, and Spandau
Number Built: 16,000,000+ all types
Variants: K98a, K98b, K98k
Weight: 9.0 lb
Length: 49.2 inch
Barrel Length: 29 inch
Cartridge: 7.92x57 mm Mauser
Action: Bolt
Muzzle Velocity: 2,881 fps
Effective Range: 550 yards, 800+ yards with scope
Capacity: Internal 5-round magazine
Sights: Iron
Sights (sniper): 1.5x, 4x, 6x scopes

In World War I, Germany had relied on the Mauser Gewehr 98 (G98) service rifle fitted with scopes. This is the best known of all Mauser rifles: chambered in 7.92 mm (8 mm Mauser), bolt action, with a five-round magazine. In the early months of 1915, the decision was made to produce 18,000 G98 rifles with scope sights as sniper rifles. The rifle wasn’t designed to be used with a scope, so the bolt handle had to be turned down and a recess carved into the stock so that it could be cycled with the scope on the rifle. The mount had to be high enough for the soldier to be able to load the magazine, run the bolt back and forth, and flip the safety catch.

This rifle carried over into World War II in its sniper role. Germany then modified the G98 by changing the rear sight to a flat tangent type, removing the stacking hook, cutting a slot in the buttstock for a sling, and bending the bolt handle down. This became the K98k.

The basic K98k platform was used with several different scope-mounting systems. By the end of World War II five such types had been employed.

1. Type One – short rail system
2. Type Two – ZF 41 and ZF 41/1 scopes
3. Type Three – turret mount system, both high and low
4. Type Four – long rail mount
5. Type Five – claw mount

The various systems were used at different times and sometimes employed concurrently. Various optics companies supplied the scopes: Schneider & Co., Zeiss, Hensoldt, Ajack, and others. Magnification ran from 1.5x to 6x. The 4x and 6x were the most used, as the 1.5x didn’t have enough magnification for combat duties.

The short rail system was attached to the left side of the rifle’s receiver by three screws. This mount was used during the 1930s by German police. In 1941 the High Command ordered it into general combat use. The most common scopes were 4x.

The second type used the Zielfernrohr 41 (ZF 41/ZF 41/1) low-magnification scope, introduced in 1941. Originally intended for sharpshooters, this scope was unpopular with snipers, as its 1.5 power wasn’t adequate for the task at hand. Approximately 100,000 had been manufactured by the end of the war in 1945.

The Type Three low turret/high turret rifles used different mounts. The high turret has a 6.35 mm greater recess depth in the front scope-base cone than the low turret. Other than that, they’re the same rifle.

Type Four – the long rail mount – uses a longer mounting base for better rigidity. This required that a larger flat be milled on the receiver to take the base. The base used three screws and three tapered pins to control flex or movement. This system came into use in 1944.

The claw mount was only in use from late 1943 to 1944. Fewer than 10,000 K98ks were fitted with it. The most common scope was the Hensoldt of Wetzlar.

Other German rifles were converted to sniper duty with varying degrees of success. Mauser built a semi-automatic 10-round rifle called the G 41. It wasn’t a good design and not many were built. Carl Walther then modified the design and produced it in 1943 as the Model G 43. All G 43 rifles were built with flat-sided receivers set up for mounting a scope, so making a sniper rifle consisted of adding a ZF 4 4x scope. No other brand of scope was used during World War II. German snipers still preferred the K98k over the G 43.

Walther Gewehr 43 (G 43)

Service dates: 1943-1945
Used by: Nazi Germany, German Democratic Republic
Wars: World War II
Designer: Walther
Manufactured by: Walther
Number built: 400,000+ (all types)
Variants: G 43, K 43
Weight: 9.7 lb (w/o scope)
Length: 44.5 inch
Barrel length: 21.5 inch
Cartridge: 7.92x57 mm Mauser
Action: Gas-operated (can be bolt operated)
Muzzle velocity: 2,448 fps
Effective range: 875 yards (scope)
Capacity: 10-round detachable magazine
Sights: ZF 4 scope

Short Magazine Lee Enfield (SMLE)

Service dates: 1907-present
Used by: Britain, Australia, Canada
Wars since WWII: Minimum 16
Designer: James P. Lee, Enfield Arsenal
Manufactured by: Enfield Arsenal (Great Britain), Long Branch (Canada), Savage (USA)
Numbers built (all types): 17,000,000+
Variants (sniper): SMLE Sniper (telescopic sights), No.1 Mark III H.T. (Australian), Rifle No. 4 Mark I (T) and Mark I* (T)
Weight: 8.8 lb. (w/o scope)
Length: 44 inch
Barrel length: 25.2 inch
Cartridge: .303 Mk VII ball
Action: Bolt
Muzzle velocity: 2,441 fps
Effective range (optics): 750 yards
Capacity: 10-round magazine
Sights (sniper): Various scopes made by Periscopic Prism Co., Aldis, Winchester, R.E.L., Enfield No. 32

The “Smelley” rears its head. The Short Magazine Lee Enfield (SMLE) was developed before World War I from the original Lee-Metford series. It was designed as a replacement for both the long-barrelled rifle and the shorter-barrelled carbine.

The first sniper rifles were built on the Mark III and Mark III* rifles and designated SMLE Sniper (optical). They were fitted with front and rear optics that, when looked through, gave a 2-3 power magnification. Slightly more than 13,000 SMLEs were converted in 1915. These were occasionally seen during World War II, but very rarely.

Similar to the optical SMLE was the SMLE Sniper (Telescopic Sights). This rifle had conventional telescopic sights made by Periscopic Prism Co., Aldis, Winchester, and others. Around 9,700 of these rifles were converted during World War I and were used into World War II.

As an aside, Britain still used the World War I P1914 Mk I (T) sniper rifle up until 1942, when the Enfield No 4 Mk 1 (T) and No 4 Mk 1* (T) entered the conflict. The main difference between the Mk I and the Mk I* is that the Mk I* was built in the United States, mostly by Savage-Stevens. The bolt head catch was altered for ease of production.

Canada also had the SMLE sniper rifle known as the Long Branch. Some of the Canadian rifles mounted the Lyman Alaskan scope, although fewer than 100 were fitted. These Canadian Long Branch sniper rifles mainly used a scope built in Canada by R.E.L., which was very similar to the Enfield No. 32. R.E.L. also designed and built the No. 67 scope; however, fewer than 100 were mounted on the SMLE.

The No. 1 Mk III H.T. (Australian) came into being towards the end of the war. It used rebuilt actions dating between 1915 and 1918 with a heavy barrel installed. The scopes were the Australian Pattern 1918 (Aus). Both high and low mounts were used. The iron sights remained, and the rifle could be operated using them without removing the scope.

Most of the scopes used during World War Two were built by Enfield and identified as the No. 32 (Mk 1-3). The scope had originally been designed to fit on a BREN machine gun, so robustness wasn’t a problem. It was capable of hits out to about 800 yards, but 600 yards was a more realistic number.

Arisaka Type 97 (38)
Service dates: 1937-1945
Used by: Japan
Wars since WWII: Chinese Civil War, Indonesian National Revolution, Korean War, First Indochina War, Vietnam
Manufactured by: Koishikawa Arsenal, Kokura Arsenal, Nagoya Arsenal
Number built: 22,500 (Type 97), 14,000 (Type 99)
Variants: Type 97, Type 99
Weight: 8.7 lb (Type 97), 8.1 lb (Type 99)
Length: 50.4 inch (Type 97)
Barrel length: 31.5 inch (Type 97), 25.5 inch (Type 99)
Caliber: 6.5x50mm (Type 97), 7.7x58mm (Type 99)
Action: Bolt
Muzzle Velocity: 2,510 fps (Type 97), 2,394 fps (Type 99)
Capacity: Five round magazine (both)
Sights sniper: 2.5 power scope with serial number matched to the rifle. (Weight approximately 2.3 lb with mount)

The Arisaka Type 97 “Sniper’s Rifle” was based on the Type 38 rifle that was first introduced in 1905. The Type 97 first saw service in 1937. Caliber was 6.5x50mm. Recoil was very light and muzzle blast was low. These qualities made for a good sniper rifle platform, and counter-sniping against the Arisaka was difficult. Having a barrel 31 inches long also allowed all the powder inside the mild cartridge to be completely burned so there was little flash or smoke.

It had a 2.5 power scope mounted on the left side of the receiver and offset to the left to allow loading with stripper clips. It was factory mounted and stamped with the rifle’s serial number. The scopes were manufactured by the Tokyo Dai-Ichi Rikugun Zoheisho factory and others. They weren’t adjustable. Each scope was zeroed to its rifle by adjustment at the mount.

The only other change from the Type 38 was the use of a slightly lighter stock with a wire monopod that swiveled at the front sling mount.

With the advent of the 7.7x58mm Type 99 rifle in 1939, it was only a matter of time before it was adapted to sniping duties. The heavier 7.7 mm bullet punched through the air with better ballistics than the Type 97’s 6.5 mm projectile. However, this came at the cost of higher recoil and visible smoke from the shorter 25.5 inch barrel.

Two different scopes were issued with the rifle. The first was the Type 97’s 2.5 power, and the second was a non-adjustable Type 99 4-power. Towards the end of the war some 2,000, give or take, were built with range adjustment. The scopes could easily be detached and carried in a pouch when the sniper changed positions.

There was one variant of the Type 99 that had a bent bolt and the scope fitted above the receiver, which effectively turned it into a single-shot rifle.

Mosin-Nagant 91/30

Service dates: 1931-1945
Used by: Just about everybody
Designer: Sergei Mosin & Léon Nagant
Manufactured by: Izhevsk Arsenal, Tula Arsenal
Number built: 54,000+
Variants: PE or PEM, PU
Weight: 8.8 lb w/o scope
Length: 48.5 inch
Barrel length: 29 inch
Cartridge: 7.62x54R
Action: Bolt
Muzzle velocity: 2,838 fps
Capacity: 5 rounds
Effective range: 730 yards (scope)
Sights: PE & PEM scope, PU scope

Mosin-Nagant 91/30 sniper rifle production began in 1942 and continued through 1944. Two arsenals, Tula and Izhevsk, did the conversion work on accurate infantry 91/30 rifles. The Izhevsk Arsenal produced 53,195 sniper rifles in 1942, and a total of 275,250 had been completed when manufacturing ended in 1958. Numbers for the Tula Arsenal, which only built sniper rifles in 1943 and 1944, are not available, but were certainly much smaller than Izhevsk’s. The Tula rifles are identified by a five-point star with an arrow stamped onto the top of the chamber; the Izhevsk symbol was a hammer and sickle within a wreath in the same place.

There were two variants of the rifle – the PE or PEM, and the PU – determined by which type of scope was mounted (“PE” = unified model; “PEM” = unified model, modernized). Early PE scopes could be focused; later PEM scopes could not. The first sniper rifles were fitted with a 4 power PE or PEM scope that Russia reverse engineered from a Zeiss design. Later, a simpler 3.5 power scope, the PU, was fitted. This scope has no means of focusing, so the sniper had to have perfect, or slightly better, eyesight. Its lower magnification made operation a bit easier, but what you saw was what you got.

To fit the scope, the bolt handle had to be turned down and lengthened. The scope mount attached to the left side of the receiver by a rail. The PE scope was used from 1931 to around 1939 (some sources say longer). The PEM was manufactured from 1937 to 1942. The lower power PU was built from 1942 to 1944.

Tokarev SVT-40

Service dates: 1940-1950
Used by: Russia
Designer: Fedor Tokarev
Number built: 51,710
Variants: SVT-38, AVT-40
Weight: 8.5 lb unloaded
Length: 48.3 inch
Barrel length: 24.6 inch
Cartridge: 7.62x54R
Action: Gas-operated semi-automatic
Muzzle velocity: 2,720 fps
Effective range: 1,100 yards (scope)
Capacity: 10-round detachable magazine
Sights: Iron, 3.5 power PU scope

Fedor Vasilievich Tokarev was a Russian weapons designer and a deputy of the Supreme Soviet of the USSR. Among his many accomplishments was the M1940 SVT (Samozaryadnaya Vintovka Tokareva, Obrazets 1940 Goda – Tokarev Self-loading Rifle, Model of 1940), built from 1940 through the end of the war.

Before the M1940, he designed the M1938 SVT. This rifle was the precursor to the M1940 but had a fair number of problems. It was not able to stand up to combat, as was learned in the Winter War of 1939, when Russia and Finland opened hostilities. Even so, more than 150,000 M1938 rifles were manufactured from 1938 to 1940.

The M1940 was the second most prolific semi-automatic rifle of World War II, after the M1 Garand. It had a complicated gas-operated short-stroke piston driving a tilting bolt that required special tools and depot-level training to service, and the untrained Soviet conscript didn’t have the knowledge or ability to maintain the rifle.

All M1940s had two grooves on top of the rear of the receiver, parallel to the bore, where a scope mount could be clamped. The infamous PU scope was used. This, along with a precision bore, was all that set the sniper rifle apart from the service rifle. Due to the poor quality of Soviet wartime ammunition and the very large muzzle blast from the 24.6 inch barrel (4.4 inches shorter than the Mosin-Nagant’s), the Mosin was preferred; it was the most prolific bolt-action sniper rifle used in World War II.

Springfield M1903A4

Service dates: Army – 1943 to end of war, US Marine Corps – entire war
Used by: US Army, US Marine Corps (M1941 Sniper Rifle)
Designer: Springfield Armory
Wars: World War I & II, Korean War, Vietnam War
Weight: 9.38 lb
Length: 43.21 inch
Barrel length: 24 inch
Cartridge: .30-06 Springfield
Action: Bolt, 5-round magazine
Muzzle velocity: 2,800 fps
Effective range: 600 yards
Sights (Army): Weaver M73B1 2.2x power
Sights (USMC): Lyman 5A 5x power, Unertl 8x power

The M1903 rifle was originally designed by the Springfield Armory in 1901. That version wasn’t accepted by the US Army; it was re-designed, and the 1903 version was accepted. Selected rifles were fitted with telescopic sights from 1907 until after the First World War. Prior to the Second World War, US Army interest in sniping was almost non-existent. Once fighting began in the Pacific Theater, the need for a long-distance sniping rifle became evident. On January 18, 1943, Remington Arms received a contract to take 20,000 M1903A3 Springfield rifles from the production line and convert them to the M1903A4. The first was delivered in February 1943.

The M1903A3 was modified by turning down the bolt handle for scope clearance, removing all iron sights, fitting permanent scope blocks, and installing a 2.2x power Weaver scope designated M73B1. As the war progressed, improved 2.2x scopes (the M81, M82, and M84) were adopted as they became available. The scope sat directly over the magazine, precluding the use of stripper clips when reloading; rounds had to be inserted one at a time. With the scope mounted directly over the barrel, the “Model 03-A3” markings could not be read, so they were moved to the left side. These markings added some confusion when seen on an M1903A4, as no rifles were marked A4. Two types of stocks can be found on the M1903A4: A1 straight stocks and C stocks with a pistol grip.

The Marine Corps had its own sniper rifle based on the 1903 Springfield, designated the “M1941 Sniper Rifle”. It used 03A1 National Match actions and star-gauged or otherwise very accurate barrels. The stock was a Type C pistol-grip design, and the upper handguard was modified to allow the front scope mount to attach to the barrel. The Marines had used similar rifles between the wars, but this new model didn’t see action until November 1943.

The telescopic sight was quite different from the Army’s Weaver scopes. Based on the military version of Unertl’s 8 power target scope, it offered far greater magnification and precision. Hits could be made out to 1,000 yards when the rifle was in the hands of a well-trained Marine.

M1C Garand

Type: Semiautomatic
Service dates: July 1944 to end of war
Used by: US Army
Wars: World War II
Designer: John C. Garand
Manufactured by: Springfield Armory
Effective range: 500 yards
Numbers built: 7,900 approx.
Weight: 11 lb
Length: 43.5 inch
Barrel length: 24 inch
Cartridge: .30-06 Springfield
Sights: Lyman Alaskan – M73, M81, M82, 2.2 power

When the US Army entered World War II in 1941, it did not have a dedicated sniper rifle. The Springfield M1903A4 was pressed into service while the Army Ordnance department evaluated different designs to convert an M1 Garand into an accurate semi-automatic sniper rifle. One of the major problems in the conversion was how to mount the scope. As the magazine had to be fed from an eight-round en-bloc clip from the top, mounting a scope on the rifle’s centerline was out of the question. Many different solutions were tried, and finally a mount that attached to the left side of the receiver and carried an offset scope was ordered from Griffin and Howe. Five holes had to be drilled in the receiver to secure the mount: two for tapered pins to align and steady the mount, and three threaded for screws. To be able to see through the scope, a leather cheek pad had to be attached to the buttstock to position the soldier’s eye properly.

A concern about muzzle flash led to a cone-shaped flash hider being adopted in January 1945. This proved to be of little use and could affect accuracy, so many of them were removed. The Lyman M73 2.2 power scope was originally fitted to the M1C, but as the war progressed the M81, and then the M82, became standard.

Problems with the scope, its mounts, and accuracy badly delayed M1C production. It was not until the final months of the war in the Pacific in 1945 that the M1C entered combat. Fewer than 8,000 saw war service.

The M1D Garand differed from the M1C in scope and mount only. The scope base was permanently attached to the rear of the barrel and drilled and tapped to take a scope mount. A knurled screw allowed the mount with the scope to be easily removed. The scope was designated M84. The cone-shaped flash hider on the M1C was replaced by a slender barrel extension. Almost no M1Ds were manufactured or distributed to combat zones during World War II. In the early 1950s, they were converted from existing service rifles for use in the Korean War.


Extract from Journeying Moon – first published 1958

Shortly after the war I married and tried to settle down to a life in London. Like most of my generation, though, I was infected by restlessness. Early in our lives we had been given a taste for the world of action, and too much adrenalin had gone through our systems for them to adjust easily to the routine of ‘nine to six’.

Our palates had been spoiled for the softer nuances of contentment. The after-lunch doze with the Sunday paper, the clatter of the lawn-mower, and the distant scrape and fiddle of BBC tea-time music seemed insipid after fevered nights in leave-time ports.

Of those who failed to make the adjustment, some emigrated, some took to drink, and some climbed mountains. Others – and I was among them – attempted the return to post-war living, found it unsatisfying, and then cut out new paths for ourselves. The welfare state was designed for the generation that followed us.

London was strange and uneasy in those immediate post-war years. It had something of the same smell about it that conquered Naples had at the time when Naples was the leave centre for our Anzio troops: a little dust, much decay, and the smell of corruption.

I remember the night-clubs thick with black-marketeers: the well-fleshed smiler who knew where you could get whisky, and whose new Bentley echoed nightly with the giggles of loose-legged girls. People never fight for the world they get. They fight for the world they remember. Perhaps that is why so many returned soldiers make poor citizens.

I had an acquaintance, a Labour MP in the post-war government. He had never dined nor dressed so well in all his life before.

‘If you don’t like it,’ he said, ‘why didn’t you get out of it? It’s a big world, my boy.’

You’re right, I thought, I will. Just give me time, just let me save a little money, and I’ll get going.

Four months later we left England for France. We had been waiting for two days inside the bar at Chichester harbour while a spring gale blew itself out down the Channel. The wind was dying now and the barometer was rising. Janet took the tiller while I heaved in the anchor. The voyage had begun.

Our boat Mother Goose was a ten-ton cutter. A Dutch boeier, with a draught of only two feet, she was forty years old, clinker-built of galvanized iron on iron frames. She was as tubby, as solid, and as dependable as a Dutchman’s ideal hausfrau.

Her decks were teak, her curved aerofoil leeboards (which threw great fan-like shadows on the deck) were oak, and her saloon and interior were panelled in polished mahogany. With her dark blue hull, her red sails, her curved gaff, and elaborately carved tiller – which ended in a goose’s head – she was a romantic boat. Some had their doubts about her.

‘I wouldn’t mind her on the Broads,’ said an ocean-racing friend. ‘But I wouldn’t like to be out in her in any real weather.’

‘You’ll never be taking that, midear, any far way from land,’ remarked old Jack, who was coxswain of the Fowey lifeboat. ‘Why, look at them there leeboards! No, midear, you want a good keel under you when you get out to sea. Them there boats is all very well for the Dutch.’

I heard many such arguments. When questioned closely as to where I really intended to take her, I hedged or remarked casually, ‘Well, we might pick a quiet day and run over to France.’ I never disclosed my real intentions. Certainly, even I never realised that within the 30-ft length of Mother Goose we should make our home for two-and-a-half years.

The first time that you make a departure for a foreign coastline in your own boat is as unforgettable as first love. There is a tension and a suppressed excitement about your actions. Even routine details like taking a pair of crossed bearings to fix your point of departure assume a strange and satisfying importance.

Outside the bar we found that the wind had died but the sea was still running lumpily up the Channel. The grey sky was touched with faint light along the edges of the clouds. I sighted along the hand-bearing compass and called out the bearings to Janet who had the chart splayed out in front of her on the saloon table.

‘One nine oh degrees – the Nab Tower.’

‘Two three oh degrees – Culver Head. One degree easterly deviation on the compass.’

Fixed. The simple ‘mystery’ of the navigator’s art now held our small swaying world of food and books and iron and wood and us, located in one pinpoint on the Channel chart. The intersected lines that marked the boat’s position marked the start of our new life. In the act of taking two bearings we had crossed our Rubicon and established for all our lives the point of no return.

The kettle feathered a wisp of steam through the open hatch, and soon we were clasping mugs of hot coffee as we sat in the cockpit and listened to the suck and swallow of the sea against the ship’s side.

The wind died away, and the sails hung empty as a sailor’s pockets. I started our small twin-cylinder diesel engine, waited an anxious moment until its first asthmatic cough had settled down to a steady snore, and then lashed the tiller while the two of us lowered the sails.

Even under power Mother Goose left a clean sweet wake. Her rounded stern settled down or rose to the sea like a bird’s blunt tail. She lifted easily over the swell, and ran down its sides with a smooth, unhurried movement.

We were twenty-four hours out from Chichester Bar when we sighted Le Havre light vessel blinking and groaning in a cold white mist. As the lights of Le Havre faded against the dawn and went out, we altered course for the nearest whistle buoy, whose sigh blended with the melancholy morning. The broken buildings and the stark lines of the reconstructed city came up past the headland. Shipping thronged the fairway, and a Chinese cook, carrying a vast tea-pot along the decks of a merchantman, gave us ‘Good Day’ with a flash of teeth.

Janet and I looked at each other and smiled. The damp night air had crinkled our hands, and it sparkled in our hair.

The first leg of the voyage was over. Now we could confess to each other what we had never confessed to inquisitive longshoremen or even to friends: that this was not just a casual trip to France ‘only if the weather’s fine’. This was the end of one life and the beginning of another. We were bound up the Seine for Paris and beyond – through the canal for Lyons, then down the Rhone to Marseilles. Our course lay eastward to the dolphin-haunted waters, to the islands of thyme and silver rock, and the high noon that leaves no shadows.


Australian Economic History Review

From Economic History Society of Australia and New Zealand

Volume 57, issue 3, 2017
The Rise and Fall of Exceptional Australian Incomes Since 1800, pp. 264-290, David Greasley and Jakob Madsen
Macroeconomic Consequences of Terms of Trade Episodes, Past and Present, pp. 291-315, Tim Robinson, Tim Atkin, Mark Caputo and Hao Wang
The First 100 Years of Tariffs in Australia: the Colonies, pp. 316-344, Peter Lloyd
The Evolution of an Intellectual Community Through the Words of Its Founders: Recollections of Australia's Economic History Field, pp. 345-367, Claire Wright and Simon Ville
Chinese National Income, ca. 1661–1933, pp. 368-393, Yi Xu, Zhihong Shi, Bas van Leeuwen, Yuping Ni, Zipeng Zhang and Ye Ma

Volume 54, issue 2, 2014
Essays in Latin American Business and Economic History: Introduction, pp. 93-94, María Inés Barbero and Andrea Lluch
The Contribution of Exports to the Mexican Economy During the First Globalisation (1870–1929), pp. 95-119, Sandra Kuntz Ficker
American & Foreign Power in Argentina and Brazil (1926–65), pp. 120-144, Norma S. Lanciotti and Alexandre Macchione Saes
Multinational Corporations, Property Rights, and Legitimization Strategies: US Investors in the Argentine and Peruvian Oil Industries in the Twentieth Century, pp. 145-163, Marcelo Bucheli and Gonzalo Romero Sommer
The Evolution of a Socially Committed Business Group in Colombia, 1911–85, pp. 164-182, Carlos Dávila and José Camilo Dávila
Corporate Networks and Business Groups in Argentina in the Early 1970s, pp. 183-208, Andrea Lluch, Erica Salvaj Carrera and María Inés Barbero


The Victoria Cross and the George Cross: The Complete History, review

Britain’s two premier medals for bravery, the Victoria Cross and the George Cross, have been awarded to fewer than 1,800 servicemen and civilians in their century and a half of existence. Given their rarity it is little wonder that individual medals can fetch up to £400,000 at auction.

Yet accurate details about the history of the awards, their recipients and where and how they were won, have been woefully inadequate until the publication of this three-volume authorised history. The fruit of more than a decade of research, and averaging just under 1,000 pages a volume, it is a monumental work of scholarship that will be the standard reference work on these gallantry awards for decades to come.

The Victoria Cross was founded in 1856 to emulate the French introduction of a national award that recognised outstanding bravery in battle. It was the first gallantry medal for servicemen of all ranks and remains “the highest and most prestigious recognition of exceptional valour in the face of the enemy”. Its civilian equivalent, the George Cross, was introduced during the Blitz in 1940. “The need for a decoration to recognise heroism exhibited not immediately in the face of the enemy,” explain the editors, “but on a level with those actions awarded the Victoria Cross was highlighted by the extreme courage shown by civilians and Service personnel involved with bomb and mine disposal at the beginning of the Second World War.”

The editors dispel the rumour that current VCs are no longer made from the bronze cascabels (balancing weights) of Russian cannon captured at Sebastopol in 1855. “There is still enough metal from this source,” they explain, “to make 60 more crosses and the truncated cannon themselves can be seen at the Royal Artillery Museum at Woolwich.” So prone to shattering is this poor-quality bronze that the casting “has to be done in sand and takes much finishing, resulting in minor variations in the design”. No two medals, in consequence, are exactly alike.

The first batch of 62 VCs were awarded to Crimean War veterans by Queen Victoria at a special military review in Hyde Park on June 26, 1857. She was riding her horse Sunset and “wearing, for the first time, a field marshal’s uniform especially adapted in design”. What the editors omit to mention is that Victoria unwittingly pushed the pin of the medal into the chest of the first recipient, a legless veteran of the Battle of the Redan, who bore the eye-watering pain in silence. The queen was none the wiser.

The three volumes are chronological. The first covers 1854 to 1914, the second the First World War (when an astonishing 628 VCs were given) and the third the years since then. The last volume is a mine of fascinating information in the form of appendices that list, among other things, that eight VCs have been forfeited for later “crimes”, the last in 1908; that 323 VCs and 90 GCs were awarded posthumously; that the unit with the most VCs is the Rifle Brigade (27), whereas the SAS has just one; that five VCs and eight GCs have been awarded to children, the youngest an English boy called DC Western who was 10 when he rescued a friend from icy water in 1948; that 13 women have received the GC, including the SOE agent Mme Szabo in 1945; and that medics have won 55 VCs and eight GCs.

But it is the simple yet stirring prose of the official citations that best encapsulates the selfless heroism of people like John Bamford, James Beaton and James Ashworth (see extracts). Bamford was a young mineworker from Newthorpe in Nottinghamshire who suffered terrible burns rescuing his two younger brothers from a fire in 1952; Beaton was the Royal Protection Officer who was shot as he shielded Princess Anne and Captain Mark Phillips from a would-be kidnapper in 1974; and Ashworth was a 23-year-old L/Cpl in the Grenadier Guards who died assaulting an enemy position in Afghanistan in 2012, the last recipient of the VC.

None is more moving than the GC citation for Cpl Mark Wright of the 3rd Paras who was mortally wounded in Afghanistan in 2006 as he tried to save a wounded comrade. Part of it reads: “Corporal Wright spent three-and-a-half hours in the minefield and… for a significant amount of that time he himself was very seriously wounded and in great pain… His complete disregard for his own safety while doing everything possible to retain control of the situation and to save lives constitutes an act of the greatest gallantry.”

It is hard – almost impossible – to read these citations and not shed a tear. The heroic actions of these extraordinary 1,800 or so men, women and children deserve the lasting recognition this fine publication will give them.




Women, Gender, and World War II

The Second World War changed the United States for women, and women in turn transformed their nation. Over three hundred fifty thousand women volunteered for military service, while twenty times as many stepped into civilian jobs, including positions previously closed to them. More than seven million women who had not been wage earners before the war joined eleven million women already in the American work force. Between 1941 and 1945, an untold number moved away from their hometowns to take advantage of wartime opportunities, but many more remained in place, organizing home front initiatives to conserve resources, to build morale, to raise funds, and to fill jobs left by men who entered military service.

The U.S. government, together with the nation’s private sector, instructed women on many fronts and carefully scrutinized their responses to the wartime emergency. The foremost message to women—that their activities and sacrifices would be needed only “for the duration” of the war—was both a promise and an order, suggesting that the war and the opportunities it created would end simultaneously. Social mores were tested by the demands of war, allowing women to benefit from the shifts and make alterations of their own. Yet dominant gender norms provided ways to maintain social order amidst fast-paced change, and when some women challenged these norms, they faced harsh criticism. Race, class, sexuality, age, religion, education, and region of birth, among other factors, combined to limit opportunities for some women while expanding them for others.

However temporary and unprecedented the wartime crisis, American women would find that their individual and collective experiences from 1941 to 1945 prevented them from stepping back into a prewar social and economic structure. By stretching and reshaping gender norms and roles, World War II and the women who lived it laid solid foundations for the various civil rights movements that would sweep the United States and grip the American imagination in the second half of the 20th century.


The wartime arenas where American women witnessed—and often helped to generate—crucial changes and challenges were wage-based employment, volunteer work, military service, and sexual expression. In each of these arenas, women exercised initiative, autonomy, circumspection, caution, or discretion according to their individual needs and the dictates of patriotic duty.

Wage Work and Opportunity

Economic opportunities abounded for women willing and able to seize them. Wage work in war industries offered hourly pay rates much higher than those to which most women had been accustomed, with the best wages paid in munitions plants and the aircraft industry. Women were encouraged to apply for “war work” after President Franklin Roosevelt created the U.S. War Manpower Commission (WMC) to mobilize Americans in various venues for a total war effort. In August 1942, the WMC organized a Women’s Advisory Committee to consider how female employees could be used most effectively toward this end. Late in 1942, the WMC announced a new campaign to recruit women workers after estimating that “the great majority” of some five million new employees in 1943 would have to be women. The WMC also identified one hundred U.S. cities as “Critical War Areas,” with intent to marshal the “widely dispersed” womanpower reserves in these cities. The main targets were local married women who already lived in the designated metropolitan areas, including middle-aged and older individuals who had never worked outside their homes or whose experience was limited to domestic work. A major challenge would be “to remove social stigma attached to the idea of women working,” the WMC literature noted. 1 Since the employment of married women had been a long-standing practice in working-class families and in the middle-class African American community, the WMC propaganda implicitly targeted white middle-class women who had not typically worked for wages.

Madison Avenue advertising agencies designed and produced a variety of propaganda campaigns for the U.S. government, including the WMC’s bold declaration and appeal late in 1942: “Women Workers Will Win the War.” Local U.S. Employment Service offices coordinated efforts to place women in jobs best suited to their skills and family needs. Mothers with children under fourteen were encouraged not to seek employment outside their homes unless other family members or trusted neighbors could offer reliable childcare. 2 The propaganda campaigns generated posters, billboards, films, and radio announcements urging women to join the work force; some touted their domestic skills as advantageous for carrying out defense work, since women were thought to excel at repetitive tasks requiring small operations with fine details. While the images overwhelmingly featured young, white, married women, an occasional entreaty announced, “Grandma’s got her gun,” referring to an elderly worker’s riveting tool. Several corporations with U.S. government contracts proudly sponsored chapters of the War Working Grandmothers of America. In Washington war agencies, the demographic defined as “older” meant “women over 35.” 3 Women of color rarely appeared in advertisements for industrial work, although their accomplishments and workplace awards were widely reviewed in African American newspapers and journals, including the NAACP’s principal publication, The Crisis, and the National Urban League’s Opportunity. Such coverage constituted a vital part of the “Double V” campaign, an effort launched by the black press to defeat racism at home while troops fought fascism abroad. 4

American women became artillery inspectors, aircraft welders, sheet metal assemblers, gear cutters, lathe operators, chemical analysts, and mechanics of all kinds. Length and depth of training varied according to industry, with many forced to learn quickly if not “on the job” itself. By 1944, skilled female workers earned an average weekly wage of $31.21. In spite of federal regulations requiring equitable pay for similar work, their male counterparts in similar positions earned $54.65 weekly. 5 Years of experience in specific jobs accounted for some wage disparity between men and women but could not account for aggregate discrimination during the war years. However unequal their wages compared with men’s, women in defense industries out-earned most “pink collar” employees who held retail, service, or clerical jobs. Constance Bowman, a schoolteacher who spent the summer of 1943 working in a San Diego B-24 bomber factory, earned 68 cents an hour. A beginning sales clerk at the upscale Bullock’s Wilshire Department Store in Los Angeles earned about $20 a week, roughly two-thirds of a factory worker’s pay. 6 If women were able to cross boundaries into the “masculinized” workplaces of heavy industry, they would be remunerated more handsomely than women who remained in safely “feminized” spheres of employment, but they would not always see paychecks matching those of their male co-workers, even when they faced the same workplace challenges and hazards.

The Women’s Bureau (WB) at the U.S. Department of Labor sent field representatives to factories throughout the country to scrutinize working conditions. Among the WB administrators’ gravest concerns were endangered female bodies on factory floors, where safety seemed subordinate to management’s production quotas and workers’ personal style preferences. An alarming New York Times story announced in January 1944 that American “industry deaths” since the attack on Pearl Harbor had exceeded the “number killed in war” by 7,500. 7 The Labor Department tried to convince American women to prioritize safety when choosing work apparel: to wear safety shoes or boots rather than ordinary footwear and to don protective caps or helmets rather than bandanas and scarves. A WB analyst reported that “the most distressing accident” in war industry resulted from long hair catching in machinery. In Rhode Island a woman was “completely scalped” after her hair coiled on an assembly line belt. The Office of War Information (OWI), the U.S. government’s chief propaganda agency, produced documents illustrating proper and improper ways to style and wear hair in industrial jobs. The WB urged factories to adopt rules about head coverings as well as safety shoes and slacks. The Labor Department even designed “fashionable” caps and hats in a variety of shapes and colors, since their research concluded that women did not wish to look exactly like one another in the workplace. 8

More shocking than minimal head protection was the use of substandard footwear, which led U.S. Secretary of Labor Frances Perkins to sound a warning bell at a 1943 “Women in War Industries” conference. In her opening address, Perkins noted that most industrial accidents among women were in the “slip, fall, and stumble categories,” leading her to recommend that work uniforms include “shoes devised particularly to help women prevent” such accidents. 9 Perkins and others concerned about occupational safety had to contend with American shoe retailers—and their representatives in Washington—who insisted that women would want to wear their sandals, moccasins, and espadrilles to work. 10 Retail store managers were told they could assist in recruitment and retention of female defense workers by displaying attractive work clothes that promoted safety, neatness, and good health. 11 In spite of U.S. government war agencies’ directives to defense plants to enforce safety standards on all fronts, some Labor Department inspectors found that corporate managers would not comply until threatened with prosecution. 12

Munitions makers and retailers alike were encouraged to take women employees’ “health and beauty” needs seriously, providing them with cosmetics, soaps, and sanitary supplies to use in workplace restrooms and lounges. Such comfort packages would not merely attract employees but also keep them content and more likely to stay after they had been hired. 13 The Labor Department recommended a sufficient number of showers and lockers on site for particular industries, such as shipbuilding, where women preferred to travel to and from work in their “street clothes.” 14 Working women saw magazine advertisements instructing them to pay particularly close attention to skincare and personal hygiene, lest they lose their “femininity” in the much-altered economic and social landscape of wartime America. 15

Job opportunities and steady wages could not offset for many the hardships of fulltime employment: shift work, long commutes, limited childcare options, and inconvenient shopping hours for food and other necessities. Very few grocery and department store owners chose to accommodate women who needed to do their shopping in the late evening or night hours. That women workers got sick more often than men was attributed to the fact that they were doing, “in effect, two fulltime jobs.” 16 U.S. government promises to organize day care centers in war boom areas went largely unfulfilled, meeting the needs of a mere fraction of the large population of working mothers; the public childcare project was not funded until 1943, and “even then, the centers provided care for only 10 percent of the children who needed it.” 17

While limited training, sore muscles, and exhaustion from the home/work double shift discouraged many women, added burdens for women of color included workplace discrimination and harassment. They endured racial slurs and physical attacks in factories, and disproportionately filled the lowest-paid and least appealing jobs, including janitorial work. The Fair Employment Practices Committee (FEPC)—created by Executive Order 8802 in 1941 to address racial discrimination in industry—lacked the funds to handle the wave of complaints engendered by rapid wartime mobilization. When FEPC cases faced delays, black women searching for work or seeking promotions in their current jobs suffered the most. But women of color, like all American women, found their greatest challenge to be reconciling home life and work life during the war years. Opportunity magazine noted that black women in defense jobs grew “much more irritated than men by periods of standing around and doing nothing,” since they knew they could use the down time running errands for their second shift duties at home. One commentator suggested releasing workers during factory down periods in order to promote “better morale” and to stem the tide of absenteeism, a significant problem among female employees eighteen months into the war. 18

American women were encouraged to consider every job a war job, however irrelevant a particular position might seem with regard to the military effort. Beyond riveting and welding, other tasks required even more hands and minds nationwide. The United States needed farm laborers, telephone operators, laundry workers, food servers, and bus drivers. Three million women cultivated crops in the federal agriculture program known as the Women’s Land Army. And while women had filled clerical positions for nearly half a century in the United States, the war accelerated the trend. Women took certain places as men vacated them, with the U.S. government offering hundreds of thousands of desk jobs to anyone who could file, type, and take dictation. The expanding bureaucratic structure of war was matched by private sector growth, where American businesses were forced to open their doors and offices to female employees. With the military draft taking its share of male, middle-class clerks and salesmen, openings for women abounded in the consumer economy. Radio stations, insurance firms, and advertising agencies hired more women than ever before. Banking, in particular, saw “feminization” in its employment ranks: at the beginning of the war, some sixty-five thousand women worked in banking, but by the end of 1944, approximately one hundred thirty thousand women were bank employees, constituting nearly one half of the industry’s total personnel. 19

Volunteer Work

Beyond those who earned wages, millions of women donated their time, money, or both, especially in the realm of morale work. Those who cultivated a genuine spirit of volunteerism saw their work bear fruit, even though some groups were criticized for their “charity bazaar” approach. Images circulated of the rich snob who sat at a booth for a few hours a week but remained oblivious to real sacrifice. 20 A government handbook for the American Women’s Voluntary Service (AWVS) clarified the organization’s purpose as well as its diverse membership in many states, where women carried out “real hard work.” They took classes on home repair and first aid, helped children, and learned practical wartime skills such as map reading, convoy driving, clinical photography, and Morse code. The AWVS affected every aspect of wartime culture, sending its members to assist military personnel, distribute ration books, sell war bonds, and collect salvage, as well as to recruit blood donors, nurses, farm workers, and child care workers, and to knit, sew, and recondition clothes for military families and relief agencies. 21

AWVS chapters took pride in their “non-sectarian, non-political, non-profit-making” status to encourage women from many backgrounds to join their ranks. Across the country the AWVS made strides in several socially sensitive areas including interracial cooperation. Indeed, African American women urged others to support the organization, because it “transcend[ed] any consideration of race, or color, or class, or caste.” The AWVS became a place where, through their work together, women could understand “each other’s problems and shortcomings and consciously or unconsciously, [develop] an appreciation of each other’s virtues,” one member reported. Interracial volunteer activities among women spurred optimism for a more inclusive postwar America while stimulating the growth of similar organizations where women could meet and serve a larger cause. 22

In the realm of “morale,” the presumed purview of women, one group enjoyed the spotlight above all others—the United Service Organizations (USO). In assisting and entertaining U.S. military troops, USO volunteers were asked to consider their work the female equivalent of military service. Through gender-defined actions and activities, USO volunteers were expected to assume particular mental and emotional postures when dealing with soldiers and sailors. The ideal USO junior hostess’s femininity quotient was determined in part by her ability to yield to a serviceman’s wishes within the boundaries of middle-class American womanhood. How she presented herself would determine the reactions of soldiers and sailors, she was instructed. Patience, general optimism, and good listening skills were a good hostess’s requisite qualities. Since many USO sites provided games, women played table tennis, checkers, and cards, and often allowed their male opponents to win. Such “gendered emotional work” meant women were not to appear too smart or too competitive; to challenge a serviceman’s masculinity undermined the organization’s purpose of supporting male service members’ morale. As historian Meghan Winchell argues, “If a hostess made a serviceman happy, then she had done her job, and this, not meeting her own interests, theoretically provided her with satisfaction.” Her selflessness would presumably reinforce cultural gender norms and uphold social order in the midst of wartime crisis. 23

This requisite “cheerful selflessness” was matched by the initiative of women who chose to relocate near their spouses’ military installations. In packed trains and buses, often with young children in tow, they made their way cross-country to visit or live near their husbands. One observer called them “the saddest and most predictable feature of the crowded train stations and bus terminals.” 24 War brides on the move could easily identify each other and found comfort in their shared condition. 25 African American army wives who accompanied their husbands to Fort Huachuca, Arizona, lived in a squalid “unconverted barrack” outside the camp’s gates; during the day they served the base as secretaries, janitors, cooks, food servers, launderers, and maids in white officers’ homes. But their main priority, according to a reporter for The Crisis, was “the morale of their menfolk.” 26

Military Service

Women who volunteered for military service posed a great challenge to the collective consciousness about gender and sexual norms and clear gender divisions, especially regarding who could be considered a soldier, sailor, or marine. The women in uniform closest to the front lines were nurses, government-sanctioned “angels of mercy” whose work Americans more readily accepted because it reflected expectations that women were natural caregivers. Precedent also helped to secure the public’s approval of women serving in this capacity: both the army nurse corps and navy nurse corps had existed since the early 20th century, with more than twenty thousand military nurses serving during the First World War, half of them in overseas duty. But female volunteers in military organizations founded during World War II faced tougher scrutiny than nurses; their womanhood and femininity were questioned by many detractors, even though the idea of national service for women was not new. As early as 1940, First Lady Eleanor Roosevelt had recommended a required service responsibility (although not specifically a military duty) for all young American women. 27 Roosevelt did not get her peacetime wish, but after the U.S. declared war in December 1941, the mobilization of women as assistants in the army seemed not merely plausible but imperative. U.S. Congresswoman Edith Nourse Rogers’ bill to that effect had languished since May 1941, but in May 1942, Congress approved it and President Roosevelt signed it, creating the all-volunteer Women’s Army Auxiliary Corps.

Three additional military units followed the creation of a women’s army. The women’s naval organization, Women Accepted for Volunteer Emergency Service (WAVES), was founded in July of 1942; the women’s coast guard, Semper Paratus Always Ready (SPAR), followed in November; and finally, the U.S. Marine Corps Women’s Reserve (USMCWR) was established in February 1943. All four of the women’s military groups were designed to release men who held military desk jobs and other stateside responsibilities for combat duty, something many men resented. In addition, because of the expansive mobilization of the military for the war, thousands of new clerical positions emerged in all branches of the armed services, and this too inspired calls for female military personnel. As one colorful recruitment poster directed at women commanded, “Be A Marine. Free A Marine to Fight.” Recruiters had to proceed cautiously with a message whose logic told women that joining a military service organization would send more men to their deaths. Even so, the message reinforced gender differences—women might wear uniforms, march in formation, and be promoted, but only men could face enemy forces at battle sites. Thus, men continued to dominate the most masculine of human activities—warfare—which was further masculinized by U.S. government propaganda in the 1940s. 28

The Women Airforce Service Pilots (WASP) did not receive military status during World War II, but members participated in the American war effort by ferrying planes from factory sites to defense camps and embarkation points. These female aviators also tested new aircraft, hauled cargo, and assisted male pilots in training exercises. In 1944, U.S. Army Air Corps General Henry “Hap” Arnold publicly declared WASP pilots as capable as their male counterparts. Thirty-eight women died serving in the WASP during its two-year existence (1942–44), yet none of the pilots’ families received government support for their funerals because the organization was not officially militarized. 29

Propaganda aimed at enticing women to join one of the military forces touted substantial base pay in addition to food, lodging, clothing, and medical and dental care. But the Office of War Information (OWI) insisted that recruitment messages refrain from appealing “entirely to the self-interest approach.” Women were not supposed to entertain individual needs or wishes, but instead to join for higher, nobler reasons: “patriotism and the desire to help our fighting men,” the OWI instructed. 30 Even so, years later, many female soldiers, sailors, marines, and pilots admitted to volunteering because they wanted an adventure or independence or both. 31

Figure 1. Recruitment poster created by the Office for Emergency Management, Office of War Information-Domestic Operations Branch, Bureau of Special Services, 1944. U.S. National Archives (44-PA-260A).

In 1943, the women’s army group discarded its “auxiliary” status to become an integral part of the U.S. Army and was renamed the Women’s Army Corps (WAC), a move that generated an outpouring of criticism, concern, and derision. Male GIs carried out a smear campaign against the organization. They spread rumors that WAC volunteers served as prostitutes for male soldiers, reinforcing a notion that army life encouraged promiscuity. Some wondered whether incorporating the WAC into the regular army meant that its members would—like their male counterparts—be issued condoms. Would army life encourage sexual activity among female volunteers? 32 Viewed not simply in ethical terms, women’s sexual autonomy was considered transgressive behavior that aligned them too closely with men in uniform, whose masculinity was often measured by their sexual prowess and emphasized during the war years. 33 The blurring or crossing of gender and sexual lines in this realm implied a social disorder that many Americans could not abide.

Worries about women’s sexual independence also inspired rumors of a “lesbian threat” in the WAC. In the 1940s, both American medical opinion and public opinion associated female sexual “deviance” as much with a woman’s appearance as with her actions. Androgyny or, in wartime language, a “mannish” way, could mark a woman as suspect since she challenged the rules of femininity that grounded heterosexuality and secured a traditional social order. As women stepped into previously all-male venues during the war years, gender “disguise” could be interpreted as dangerous. Acutely aware of this, WAC director Colonel Oveta Culp Hobby ordered army women “to avoid rough or masculine appearance which would cause unfavorable public comment.” 34 In the spring of 1944, female mechanics at Ellington Air Base, Texas, attended lectures about “proper dress for work” with a warning not to “roll up” the legs or sleeves of their coveralls. One Ellington mechanic wrote to her parents, “We are now buttoned and covered from tip to toe.” The OWI instructed advertisers and illustrators to show female soldiers in “complete G. I. uniform” and never “smoking or drinking alcoholic beverages,” concerns not voiced about men in uniform. These rules of propriety indicated the preeminent role that clothing played in assigning gender and sexual identities during the war. Even the appearance of impropriety could be grounds for dismissal and a dishonorable discharge. 35

Beyond the role of patriotic duty, the U.S. government’s preeminent recruitment message emphasized gender, declaring: “Women in uniform are no less feminine than before they enlisted.” In fact, officials hoped to appeal to women’s sartorial interests by using fashion plate graphic designs in recruitment literature. Illustrations of female soldiers posing as atelier models and department store mannequins displayed the numerous stylish items in a military wardrobe—from foundations to outerwear—together worth about $250. The idea was not only to recruit women but also to counter critics who railed against the idea of women’s military organizations in the United States. The tactics worked: many volunteers admitted joining one organization or another because they liked the uniforms. 36

Enlistment criteria, training, and job assignments varied widely by organization. The WAC accepted volunteers with a minimum of two years of high school, while the WAVES required a high school diploma, with college “strongly recommended.” Female marines in the women’s reserve (WRs) needed at least two years of college credit. Their respective training models also bespoke their differences. While WAC recruits trained, lived, and worked at army camps, WAVES and WRs took instruction on college campuses. As a result of the varying minimum standards for enlistment in the women’s services, the WAC became home to a more ethnically and racially diverse population, and it enlisted women from a wider range of socio-economic backgrounds, including those who could not afford to attend college. More age-diverse as well, the WAC welcomed women between the ages of 20 and 50 who had no children under 14 years, whereas the WAVES, SPAR, and USMCWR limited their volunteer base to women between the ages of 20 and 36 who had no children under 18. Of the four women’s military services, only the WAC allowed its members to serve overseas. 37

To alert women to the army’s variety of needs and encourage them to volunteer, the WAC advertised “239 kinds of jobs.” Many recruits received specialized army training in radio, chemistry, mechanics, and other fields, while others brought previously honed skills, such as foreign language training, into the army. Bilingual Latinas, for example, were recruited specifically for cryptology and interpretation; a special unit composed of two hundred Puerto Rican WAC volunteers served at the New York Port of Embarkation and other locations dedicated to the shipment of U.S. troops. Nevertheless, some female soldiers were given tasks considered “women’s work” rather than jobs they had been promised or trained to do. WAC officer Betty Bandel discovered low morale among troops whose expectations about their roles were not met. The army had given them domestic tasks, similar to those they had held in civilian life, or it had failed to utilize the professional expertise they brought with them into service. Disappointed at what she and her colleagues interpreted as gender discrimination, Bandel confided to her mother that some Army Air Force units had even requested that Wacs do the pilots’ laundry and provide “troop entertainment.” 38

Women of color who wished to join military units faced steep discrimination. Excluded from the WAVES and SPAR until November 1944, and barred from the wartime marines and the WASP, sixty-five hundred African Americans joined a segregated women’s army. As one of the first female African American army officers, Charity Adams experienced vicious discrimination at Ft. Des Moines on several occasions. Early in her training, a higher-ranking white male officer—a fellow South Carolinian—excoriated Adams for appearing at the officers’ club one evening. Adams stood silently at attention throughout his lengthy peroration as the colonel reminded her about segregation laws, the southern past, racialized slavery, and her “place” in this scheme. 39 Adams persevered at the Iowa base, rising in the ranks to major and commanding an all-black battalion of eight hundred fifty women assigned to a postal unit in Great Britain and France in 1945. But she spent many hours at Ft. Des Moines tending to “extra” duties that fellow soldiers expected of her because she was black; one of those tasks was cultivating the small Victory Garden at their barracks. Other women of color in uniform were assaulted at southern railway stations, denied access to facilities and dining cars on trains, and treated with disdain in towns near their bases and well beyond. 40

Japanese American women, initially barred from joining the Women’s Army Corps, were admitted beginning in November 1943, but organization officials preferred that news outlets not publicize the inductions of Nisei women. 41 The WAVES, the second largest women’s military organization, did not accept Japanese American volunteers during the war. The pervasiveness of anti-Japanese sentiment adversely affected U.S. citizens of Japanese ancestry, many of whom strove to prove their loyalty in the face of embedded racism and a nationwide hatred that took even deeper root among white supremacists as the 1940s wore on. 42

Sex, Marriage, and Motherhood

Loosening sexual mores, skyrocketing marriage rates, and a burgeoning baby boom characterized the war years. Casual sexual relations among the unmarried startled many Americans, who blamed young women—especially those who worked outside their homes—for shifting standards. Government propaganda associated the spread of sexually transmitted diseases, such as syphilis and gonorrhea, with women rather than men by casting disease carriers as female. 43 Among the most vulnerable to infected women, official media suggested, were America’s men in uniform. Posters warned: “She May Look Clean—But” and, in 1941, before the United States entered the war, the May Act declared prostitution near U.S. defense camps a federal crime. Yet the vast wartime mobilization effort combined with the cultural politics of the early 1940s provided American women a wide berth to express and enjoy sexual intimacy in the name of patriotism. Many who migrated to war boom cities and military installations left behind constraints on sexual behavior that had guided them in their home communities. As circumstances “opened up new sexual possibilities,” women more freely explored their erotic desires. 44 For example, lesbians socialized, fell in love, and “began to name and talk about who they were,” contributing to one of the war’s significant legacies, the establishment and reinforcement of lesbian and gay communities. 45 At the same time, shifting social standards made more women open targets for sexual innuendo and unwelcome invitations from strangers; San Diego factory worker Constance Bowman wrote about catcalls and whistles and, on one occasion, a marine stalking her down a street with the persistent entreaty, “How about a little war work, Sister?” 46 The intersections of rapid defense mobilization, loosened social constraints, and greater female sexual autonomy created a home front where women became a “suspect category, subject to surveillance for the duration of the war,” Marilyn Hegarty argues. 47

Paradoxically, in the midst of wartime fear and surveillance of women’s sexuality, female allure and glamour were used to sell everything from laundry detergent to soda pop to troop morale. The World War II years marked the heyday of the “pin up girl” and an unprecedented display of American women’s bodies; movie stars such as Betty Grable, Rita Hayworth, and Lana Turner posed seductively for photographers and other artists, whose prints, posters, and calendars were reproduced in the millions and circulated widely. Ordinary American women copied these poses in photographs that they sent stateside to military camps and overseas to battlefronts. 48 And many women took the next logical step by literally offering their bodies—out of patriotic duty, to cap a brief encounter, or to seal a romantic relationship. 49

High U.S. marriage rates during World War II created a “Wartime Marriage Boom.” Between 1940 and 1943, some 6,579,000 marriages took place, yielding over 1.1 million more marriages than rates in the 1920s and 1930s would have predicted. 50 A “bridal terror” had emerged soon after the Selective Service Act of 1940 initiated the United States’ first peacetime draft, and a rumored “man shortage” took hold of the American imagination midway through the war. Early on it was unclear how marriage and parenthood might affect military deferments, leading couples to tie the knot with expectations of securing extra time. In addition, with the wartime draft extending to males between the ages of 18 and 45, the pool of eligible men for marriage had presumably shrunk. By 1944, rising U.S. casualty figures also contributed to the alarm. In large cities and defense camp areas, where soldiers and sailors congregated before deployment, “the urge to send men away happy meant numerous intimate liaisons, quick marriages, or both.” Many couples barely knew each other before taking their vows. A 1944 U.S. Census Bureau survey revealed that more than 2.7 million young, married women had husbands away in the armed services. The following year, the U.S. Census Bureau reported that more marriages had occurred “in each of the past four years than in any prior year in the history of the United States.” 51 War mobilization encouraged many couples to marry sooner than they had planned and others to marry soon after meeting each other. Many of these long distance relationships unraveled over the war years, with the high wartime marriage rates resulting in the highest divorce rates in U.S. history. 52

A baby boom accompanied the marriage boom, and many young mothers were left alone to care for their children and make ends meet. The more resourceful of them pooled their funds by “tripling up” in apartments, splitting the rent and food costs, and sharing childcare and housekeeping responsibilities. 53 Others found childcare where they could in order to take advantage of defense industry jobs. These working mothers received limited assistance from federally sponsored childcare facilities that had been authorized under the 1940 Lanham Act, an extension of the Depression-era public works projects. Underfunded and concentrated primarily in war boom areas, federal childcare centers served some six hundred thousand children during the war years; yet at their greatest use, they served only 13 percent of children who needed them. Americans’ steadfast belief in a mother’s responsibility to remain at home with her children persisted during World War II; even the war emergency failed to temper this deeply entrenched, middle-class standard. 54 The notable exception to otherwise meager organized childcare assistance came on the west coast, where the Kaiser Shipbuilding Company provided its female employees in Washington, Oregon, and California with reliable, well-staffed facilities. The Richmond shipyards in the San Francisco Bay area oversaw approximately fourteen hundred children daily. 55

Figure 2. Josie Lucille Owens, Kaiser Shipyards, Richmond, California.

Working mothers were forced to make difficult choices during the war years. Some chose second shifts or night shifts so they could be with their children during the day and work while the children slept. Others who worked day shifts were criticized for leaving their children. In several defense boom areas, social workers and school staff speculated that women entering the work force were spurred by “additional income and a too great readiness to evade full responsibility for their children” rather than “patriotic motives.” 56 Pressure on mothers to assume full responsibility for their children intensified during the war years, as reports of increasing juvenile delinquency appeared in magazines and newspapers. In A Generation of Vipers (1942), Philip Wylie criticized “Mom” for many “social discomforts and ills,” particularly the problems of American youth. FBI Director J. Edgar Hoover instructed mothers to stop “the drift of normal youth toward immorality and crime,” telling them not to take war jobs if their employment meant “the hiring of another woman to come in and take care of [their] children.” American society, in spite of the wartime emergency, barely budged on its expectations of working mothers. 57

Figure 3. “And then in my spare time . . .” Bob Barnes for the Office of War Information, ca. 1943. Prints and Photographs Division, Library of Congress (LC-USZ62-97636), digital ID: cph 3b43729.

Mobility, Sacrifice, and Patriotic Duty

Women’s growing independence during World War II was visibly characterized by their mobility. The cities, towns, and camps attracting them were located on both coasts and everywhere in between—Washington, DC, Seattle, Portland, Mobile, Detroit, St. Louis, and numerous other places where the prospects of war work, steady wages, or other opportunities beckoned. Some traveled occasionally to see their sweethearts, sons, and husbands, while others took to the road daily or weekly to punch time clocks in defense factories. Extending and expanding the Great Migration from the rural south to urban, industrial America, black women entered shipyards, ordnance plants, and bomber factories in unprecedented numbers.

Industrial growth and military mobilization allowed women to crisscross the nation in trains and buses, but their new mobility stirred uneasiness and discontent among many Americans. Women who traveled or lived alone were viewed with suspicion, while those who crowded into teeming defense areas, with or without their families, were often treated with scorn by local residents. In Portland, Oregon, community women criticized female shipyard workers who came into town “dirty and tired” at the end of their shifts. In Mobile, Alabama, a woman berated newcomers as “the lowest type of poor whites, these workers flocking in from the backwoods. They prefer to live in shacks and go barefoot . . . Give them a good home and they wouldn’t know what to do with it.” Many were met with the Depression-era epithet, “Okies.” In addition to the contempt they endured, migrants had to tolerate conditions that posed health risks: overcrowded boarding houses, makeshift accommodations, brimming sewers, limited water supplies, and hard-pressed local schools. 58

In the nation’s capital, thousands of women who answered the persistent calls for office workers—a “Girls for Washington Jobs” campaign—created a “spectacle” that “staggered the imagination.” The women arrived in the city to find substandard lodging, if they found it at all. Construction on U.S. government residence halls that had been promised to unmarried female workers lagged months behind schedule, forcing women to find rooms in boardinghouses run by mercenary landlords or strict matrons. 59

Testing a woman’s conscience about her full participation in the war effort was commonplace in home front propaganda. She was supposed to want to undertake defense work, fill volunteer positions, or join a women’s military organization in order to support combat troops and out of a sense of patriotic duty. To use such positions to launch personal independence of any kind—especially financial—could be viewed as selfish or even reckless. African American sociologist Walter Chivers observed in 1943 that black women who thought they had left domestic work behind by seizing defense jobs would once again “have to seek employment in the white woman’s home.” An appeal for more military nurses late in the war asked: “Is Your Comfort as Important as the Lives of 15 Wounded Soldiers?” 60

Women were advised to spend their extra coins and dollars on war bonds or other U.S. government initiatives. The 1942 handbook Calling All Women advised that a ten-cent war stamp would purchase “a set of insignia for the Army” or “five .45 cartridges for the Marine Corps.” The 6th War Bond Drive in 1944 included a “Pin Money War Bond” promotion for women who previously had been unable to afford to buy bonds; whether unemployed or underemployed, they could spend pennies and nickels to fill a “stamp” album that would eventually convert to a war bond. Eleanor Sewall, a Lockheed Aircraft employee whose husband was captured on Bataan, was heralded by the company for her decision to contribute 50 percent of her salary in payroll deductions toward war bonds. Beyond such an investment’s practical value in assisting the government, less disposable income for women would limit paths to financial independence that could be viewed as self-serving. Sacrifice in the cause of patriotic duty would temper desires for—and achievement of—personal autonomy. 61

Among many American women who sacrificed during the war were those who served near the front lines or had family members in military service. The sixty-six nurses who were captured by the Japanese on Corregidor spent three years in Santo Tomas prison camp in Manila. Besides sharing scarce food and limited supplies with three thousand other American and British prisoners, they shared three showers and five toilets with the five hundred other women there. 62 American mothers, wives, sisters, and sweethearts together lost more than four hundred thousand loved ones—the U.S. death casualty count—during the war. The writer Zelda Popkin noted that some women became “widows before they were really wives.” 63

Lasting Changes

Amidst sacrifice and loss, many American women clung to the opportunities extended to them during World War II. Prewar gender expectations had been tested and found wanting. Susan B. Anthony II, great-niece and namesake of the women’s suffrage fighter, argued in 1944 that women had proven their abilities in every field and therefore deserved “equal pay for equal work,” a right grudgingly acceded them during the war. Having worked all three shifts as a grinder in the Washington Navy Yard machine shop, while her fifty-six-year-old mother worked at a Pennsylvania radar factory, Anthony was confident that war’s end would “mark a turning point in women’s road to full equality.” 64

If the Allies’ fight for “freedom” meant personal independence, then American women had embraced it in the early 1940s. Of the “Four Freedoms” articulated by President Roosevelt in 1941, “freedom from want” and “freedom from fear” went a long way in explaining why some American women enjoyed the financial, social, and emotional rewards of the war years. The large number of those who developed skills and carried out new work, who put on military uniforms, married quickly, engaged in sexual activity freely, or moved several hundred miles away from home—or all of these—did so inside the grander framework of national and global crisis. Out of crisis, the most meaningful transformations emanated from the confidence they developed and the independence they felt and exercised. Many feared these would fade or be retracted after the war, and their fears were justified. From popular culture to social commentary to political leadership, powerful voices urged women to “go back home to provide jobs for service men,” despite the fact that the jobs many held were not available to servicemen before the war and that many returning servicemen had not worked for wages regularly in the 1930s. 65 Numerous surveys and polls of female workers found that most wanted to remain in the work force rather than return to their prewar employment conditions. 66 Efforts to “contain” women during the late 1940s and convince them to embrace a middle-class dream where they would play starring roles as domestic goddesses in their own homes eventually backfired. 67 Their wartime experiences combined with collective memory not only affected their daughters, sisters, and friends directly, but also reinforced the deep foundations of the equality crusades—from civil rights to women’s rights to workers’ rights to gay and lesbian rights—that would take center stage in the postwar generations.

Discussion of the Literature

Women featured in a few early histories of the Second World War, but they did not receive much scholarly notice as a group until the late 1970s, after the women’s movement and the field of women’s history had gained traction. The simultaneous influence of social sciences on history contributed to the heightened interest in women as subjects—they could be counted, plotted on graphs, and studied in the aggregate, especially as war workers. Thus the earliest scholarship highlighted women’s contributions to U.S. success in World War II, particularly through their work as builders and inspectors of military equipment. Leila J. Rupp’s book Mobilizing Women for War: German and American Propaganda, 1939–1945 (1978) focused on the U.S. government propaganda campaigns to get women into the factories and other places of employment and to keep them there for the duration. 68

In the 1980s, four landmark works appeared, establishing the vital role of American women in the Second World War and positing an essential question: How did women’s work for wages affect their abilities as wives, mothers, and homemakers? In Wartime Women: Sex Roles, Family Relations, and the Status of Women during World War II (1981), Karen Anderson focused on three of the fastest-growing industrial areas for war production: Detroit, Baltimore, and Seattle. Anderson unveiled the underside of these burgeoning urban workplaces, with their racial tensions and violence, age discrimination, and unfulfilled government promises to working homemakers who needed assistance with shopping, meal preparation, and child care. Susan Hartmann’s The Home Front and Beyond: American Women in the 1940s (1982) launched Twayne’s American Women in the Twentieth Century series, a chronological history organized by decade. That Hartmann analyzed the 1940s, whole and entire, allowed readers to see the social and political forces operating to encourage the maintenance of traditional, clearly defined gender duties in postwar America (1945–1949), namely homemaking and motherhood for women. 69

In 1984, D’Ann Campbell published the cleverly titled Women At War With America: Private Lives in a Patriotic Era, a work that approached various groups of American women in terms of their roles and resources. Using the rich material produced by social scientists and their organizations during the war, Campbell combined the techniques of both the social scientist and the humanist to show that military women, homemakers, stateside service wives, and female industrial laborers, among others, fared much worse on all fronts than one group singled out and heralded because their work fit within acceptable gender parameters: nurses. All of these groups had gone to war, many answering the numerous calls to assist however they could, but Campbell demonstrated that American women remained at war with a nation that extended opportunities to them while simultaneously reining them in. 70

The fourth significant book published in the 1980s, Maureen Honey’s Creating Rosie the Riveter: Class, Gender, and Propaganda during World War II (1984), revealed how high-circulation magazines aimed at particular audiences sought to appeal to women on the basis of class status and values. In addition to these four important works, Alice Kessler-Harris and Ruth Milkman also conducted studies in the 1980s on the challenges women faced during World War II as laborers. By the end of the decade these historians and other scholars generally agreed that the war had offered myriad and measurable opportunities to women of all races and at all socioeconomic levels, but the options proved temporary, resulting in little significant redefinition of cultural gender norms that had cast women primarily as wives and mothers. 71

This early scholarship was enriched by oral history projects begun in earnest in the 1980s, notably Sherna Berger Gluck’s interviews of southern California war workers in Rosie the Riveter Revisited: Women, the War and Social Change (1987), a collection that encouraged scholars to follow Gluck’s lead in focusing on personal narratives of women who now seemed comfortable talking candidly about their wartime experiences. Oral history projects would flourish in the 1990s, as fiftieth anniversary commemorations of U.S. involvement in World War II not only marked specific events but also prompted an urgency to record aging participants’ stories. Scholars’ concentration on particular locales or geographic regions, as well as on specific groups of women or the jobs they carried out, became an organizing principle for a succession of oral history collections, some available online and others in print, such as Cindy Weigand’s Texas Women in World War II (2003) and Jeffrey S. Suchanek’s Star Spangled Hearts: American Women Veterans of World War II (2011). 72

While oral history projects flourished in the 1990s and beyond, Judy Barrett Litoff and David Smith began soliciting, collecting, and publishing as many wartime letters as possible. Their quest, begun in 1990, continues a generation later, with an amassed total of over 30,000 letters written by women. Litoff and Smith’s edited collections remain a starting point for any scholar pursuing the voices of ordinary American women who corresponded during the war. 73

The emerging field of cultural studies influenced scholarship from the 1990s forward, bringing gender and sexuality to the fore. The questions raised by cultural studies required scholars to consider the intersections of race, ethnicity, class, and sexuality as central elements in how women were viewed and what they experienced as a result. In Abiding Courage, Gretchen Lemke-Santangelo surveyed African American women who had migrated to northern California’s East Bay area, where employment in the shipyards and auxiliary industries offered economic opportunities unavailable in the Jim Crow south. Leisa D. Meyer’s Creating GI Jane revealed the myriad challenges, both real and imaginary, posed by a women’s army—notably Americans’ views on who could and should be a soldier and what that meant for a social order dependent on clear-cut gender norms; Meyer was one of the first to analyze lesbian Wacs during WWII. Maureen Honey’s edited collection of primary sources, Bitter Fruit: African American Women in World War II (1999), investigated how women of color were depicted in popular culture, including the African American press, and how they negotiated these characterizations in addition to the challenges of wartime mobility, displacement, and opportunity. 74

In recent years, scholars examining American women during World War II have synthesized and built on the foundations laid by the previous generation, taking further the equations linking gender, sexuality, personal autonomy, and the media’s role in guiding individual and collective self-awareness, behavior, and cultural values. The historians’ titles reveal not only the characterizations of wartime women but also the pressures brought to bear on them during the crisis: Marilyn Hegarty’s Victory Girls, Khaki-Wackies, and Patriotutes: The Regulation of Female Sexuality during World War II (2008), Meghan K. Winchell’s Good Girls, Good Food, Good Fun: The Story of USO Hostesses during World War II (2008), and Melissa A. McEuen’s Making War, Making Women: Femininity and Duty on the American Home Front, 1941–1945 (2011), all pose research questions that uncover uneasy truths about the measured oversight and careful management of American women during a U.S. war inspired by and fought to defend “freedom.” Similar questions remain today as historians still seek to understand how U.S. propaganda agencies, and American media in general, depicted women during the war, and what this meant to them, to those conducting the war effort, and to the nation at large. 75

Primary Sources

Primary sources depicting or targeting American women during World War II—including photographs, posters, cartoons, advertisements, letters, government documents, and oral history interviews—are available in several major collections, most notably at the Library of Congress, the National Archives at College Park, Maryland, and Duke University’s Rubenstein Library.

A good place to initiate any study of women on the home front is with “Rosie Pictures,” a selection of images of wartime workers from the Library of Congress, Prints and Photographs Division. The representative sampling in “Rosie Pictures” hints at what may be found among the library’s vast holdings of visual images, including the invaluable Farm Security Administration-Office of War Information Collection, comprising 175,000 photographs taken by U.S. government photographers who traveled throughout the nation between 1935 and 1944. The collection has been carefully curated, with each item fully described and contextualized, and nearly all of them digitized.

The National Archives Library Information Center (ALIC) has organized information on women topically, so that the subject of war may be pursued from several angles and according to themes such as “women in the military” or “African American women.” Links to a variety of websites containing women’s history materials—though not necessarily items housed in the National Archives—may be found at the ALIC’s reference hub on Women. Millions of the U.S. government’s paper records not yet digitized are available at the College Park research facility, including documents produced by federal agencies created during the Second World War for specific objectives, such as the Office of War Information, the War Manpower Commission, and the War Production Board. At the U.S. Department of Labor, the Women’s Bureau generated countless pages of reports during the war, and all are available to researchers who visit the National Archives.

Duke University’s Rubenstein Library houses a variety of primary source materials in several major collections, including the War Effort Mobilization Campaigns Poster Collection, 1942‐1945, and the extensive Guide to the J. Walter Thompson Company. World War II Advertising Collection, 1940‐1948. Additional collections located in the John W. Hartman Center for Sales, Advertising, and Marketing History at the Rubenstein Library offer such resources as roadside billboard advertisements and department store window displays, designed to appeal to female consumers in the 1940s. Finally, among Duke University Libraries’ Digital Collections is Ad Access, a database of magazine and newspaper advertisements that features over 1,700 items from the war years, including official propaganda and many promotions directed specifically at women.

Three other significant primary source collections deserve attention and offer scholars insight into women’s lives and experiences during World War II. Interview transcripts and video excerpts of interviews conducted for the “Rosie the Riveter WWII American Home Front Project” by the Regional Oral History Office at the University of California, Berkeley, are available at the Bancroft Library site. Northwestern University Library’s World War II Poster Collection contains 338 items, many of them featuring women, each thoroughly identified and contextualized; images are available as high-resolution files to facilitate close analysis. For wartime correspondence, there is no better starting point than the U.S. Women and World War II Letter Writing Project, developed by Professor Judy Barrett Litoff at Bryant University, and housed there in 175 boxes. Several hundred letters are available as PDFs on the project site, along with a helpful Finding Aid to the entire collection, prepared by Litoff.

A number of museums and special exhibits devoted to American women’s roles and contributions in World War II contain valuable primary sources and historical analysis. These include: The Farm Labor Project: Brooklyn College Oral Histories on World War II and the McCarthy Era, Brooklyn College; “Focus on: Women at War,” See & Hear Collections, The National World War II Museum, New Orleans; National WASP World War II Museum, Sweetwater, Texas; “Partners in Winning the War: American Women in World War II,” National Women’s History Museum, Alexandria, Virginia; “Women Come to the Front,” Library of Congress; “WAVES, World War II, Establishment of Women’s Reserve,” Naval History and Heritage Command; and “World War II: Women and the War,” Women in Military Service for America Memorial Foundation, Arlington, Virginia.


Masculinity, Shell Shock, and Emotional Survival in the First World War

The First World War has shaped British imaginings of war for nearly 100 years now. The content of these imaginings has undoubtedly changed over the decades, as a recent flurry of scholarship on the myth and memory of the war has argued, but from the Armistice right up to the present moment, the events of 1914–18 have been a crucial reference point for those seeking to understand not only war, but the world around them. Few would now agree with the sweeping claim in Paul Fussell’s influential The Great War and Modern Memory (1) that the First World War represented an absolute, unbridgeable break with the past, yet few would deny the truth in his assertion that it was the crucible in which the modern world was forged. If nothing else, the fact that the war is still debated in terms of its status as the originating moment of modernity tells us much about its place in the contemporary imagination, and by extension the contemporary cultural and historical landscape; paradoxically, this is proof of the extent to which the world we know now is shaped by that war.

The key verb here is ‘imagine’. In 1990, the literary scholar Samuel Hynes argued that the First World War was an imaginative as well as a military and political event. It fundamentally altered the way men and women thought about the world, and as an imaginative event, it changed reality. In A War Imagined: the First World War and English Culture, Hynes explored this transformation through the prism of literature and literary culture. In the later stages of the book, almost in passing, he suggested that one such fundamental shift was in the stock ‘cast of characters’ in imagined wars. The martial hero no longer took centre stage unchallenged. Instead, he was superseded by, or at least jostled for space with, the coward, the frightened boy, and the shell shock victim.(2) This is not to suggest that the First World War killed off the soldier hero: as Michael Paris has shown, the ‘pleasure culture’ of war continued to exert a powerful hold long after 1918.(3) It is rather that after the First World War, trauma was an ever-present possible outcome of war, even if it was not present in every imagining. After 1918, the warrior hero acquired a shadow self: the broken mental patient in a military hospital, or the silent and haunted veteran, would always be waiting in the wings.

This shadow self has emerged into the light. Indeed, some would argue that it has been over-exposed. Today, the shell shocked soldier holds a central place in British imaginings of the First World War. In the factual reporting of newspapers and history books no less than the fictional recreations of novels and films, traumatised victims of the war claim their place alongside its heroes. The psychological effects of the war were of course widely discussed even while it raged, and forcefully represented in fiction, drama, and autobiography during and after the conflict. Yet the scale of the recent incorporation of shell shocked veterans into narratives of the First World War is new. This is evident from even a brief glance at a chronological record of historical publications on shell shock. We are now so accustomed to viewing shell shock as an integral part of the history of the First World War that it is surprising to realise that it was only in 2002, with Peter Leese’s Shell Shock: Traumatic Neurosis and the British Soldiers of the First World War (4), that the first full-length English language historical monograph on trauma in this conflict was published.

From the mid-1990s onwards, the volume of discussion on the topic in journals, edited collections, and sole-authored monographs on overlapping areas mushroomed. Leese’s book confirmed the status of shell shock as a hot new historical topic, but did not mark the culmination of a trend. Its publication coincided with that of Ben Shephard’s War of Nerves, which fitted the story of shell shock into a coherent, over-arching narrative, and became an indispensable reference work for every student of the subject. Successive monographs followed over the next few years: Paul Lerner’s Hysterical Men (2003), Peter Barham’s Forgotten Lunatics of the Great War (2004), Edgar Jones and Simon Wessely’s Shell Shock to PTSD (2005), and most recently Fiona Reid’s Broken Men: shell shock, treatment and recovery in Britain, 1914-1930 (2010).(5) As some of these titles suggest, the recent surge of interest in shell shock is undoubtedly related to a wider interest in the ‘genealogy’ or historical construction of trauma, which has spawned an enormous body of boundary-crossing research.(6) In turn, many factors have no doubt contributed to the popularity of histories of trauma, including anxieties about the ongoing and long-lasting effects of war past and present, the perceived growth of ‘therapy culture’, and the maturation of the history of psychiatry as a discipline. The result, however, is a renewed attention to shell shock, quite often in the context of a search for the origins of the distinctively modern concept and experience of trauma. In many of these writings, as elsewhere, the First World War is the defining event which spawned modern ways of being.

A quest for the origins of traumatic modernity has provided one impetus to scholarly research on shell shock in the First World War. Another spur was the emergence of gender as a category of analysis from the late 1980s, and the subsequent realization that perceptions and experiences of masculinity, like femininity, were not monolithic, self-evident, or historically constant. An early analysis of shell shock as a gendered diagnosis was made by Elaine Showalter in her The Female Malady: Women, Madness and English Culture, 1830-1980. The chapter on shell shock in this book, which Showalter expanded on in several other articles and essays, has been so influential that the extent to which she was a lone pioneer is easily forgotten.(7) Showalter was frequently cited in the years following publication of The Female Malady; indeed, along with Eric Leed and Martin Stone, she was part of a holy triumvirate which more or less defined historical writing on shell shock for at least a decade. Her work demonstrated that shell shock was fertile ground for historical explorations of gender, but until 1996, when Joanna Bourke’s Dismembering the Male: Men’s Bodies, Britain and the Great War (8) was published, it remained the most in-depth consideration of the topic.

In the meantime, historical studies of masculinity moved on. In 1985, Showalter could write about the Victorian masculine ideal with justified confidence that readers would know what she meant and share her definition: an ideal based on the stiff upper lip, self-control, self-restraint, and will-power.(9) This ideal has not been written out of existence, but it is no longer self-evident. It sometimes seems as though the history of masculinity is written as a series of questions: not only ‘What should historians do with masculinity?’ but ‘what should historians do with heroes?’, and, more demandingly, ‘what have historians done with masculinity?’(10) These echoes of John Tosh’s agenda-setting 1994 essay on masculinity testify to its influence, and to a new sense of maturity within a relatively young discipline. The cumulative tendency of recent scholarship has been to emphasise that in the years leading up to 1900 and beyond Victorian models of masculinity were subject to internal and external redefinitions, generated powerful anxieties regarding male identity, and did not permeate the value structure of large swathes of the working classes.(11) Moreover, studies of gender, war, and identity have uncovered the extent to which women were able to adopt, adapt, and exploit apparently ‘masculine’ ideals of service and patriotic duty in pursuit of personal fulfilment or collective goals.(12) As the certainties surrounding the historical construction of gender have crumbled, and as trauma studies have blossomed, the appeal of shell shock as a window into the effects of war on masculine ideals and male subjectivities has widened.

The four books under review are all manifestations of these historical trends. Mark Micale’s Hysterical Men: the Hidden History of Male Nervous Illness (2008) is a panoramic survey of the history of male hysteria, stretching from c.1900 BC to c.1900 AD. The very existence of this book demonstrates how far histories of psychiatry and masculinity have come in only a few short years. In 1995, when Micale’s magisterial historiographical survey of writings on hysteria, Approaching Hysteria: Disease and its Interpretations was published, only a handful of pages could be devoted to male hysteria because so little had been written on the topic. Much of the short discussion centred on Micale’s own research and Showalter’s essays on shell shock, although Micale could also point to recent research on literary male nervousness and was sanguine that future scholarship would be fruitful.(13) In the intervening years, awareness of the extent to which the construction of mental illness is a gendered process has grown, yet hysteria has continued to be identified as a female malady. Hysterical Men therefore fills a noticeable gap in the literature, and although it stops short of 1914, it is a history with important consequences for understandings of shell shock.

Between writing Approaching Hysteria and Hysterical Men, Micale co-edited one of the most important collections of historical essays on trauma in recent years, Traumatic Pasts: History, Psychiatry and Trauma in the Modern Age, 1870–1930.(14) The other editor of this essential collection, Paul Lerner, is also an expert on male hysteria, although to date the chronological and geographical scope of his studies has been somewhat narrower. Lerner’s Hysterical Men: War, Psychiatry, and the Politics of Trauma in Germany, 1890-1930, first published in 2003, has recently been reprinted in paperback form. It therefore formed part of the wave of works on shell shock in the early 2000s, and the reprint attests to the continued appeal of the topic. In the more affordable format this excellent book, still the only English language monograph on shell shock in Germany, will hopefully reach the wider audience it deserves.(15)

The studies of Micale and Lerner both begin with, and largely focus on, hysteria as a formal medical diagnosis. Each author demonstrates that medical concepts of hysteria were not objective scientific descriptions of natural phenomena, but were rather shaped by prevailing social and cultural mores, and in some times and places, driven by powerful political and economic imperatives. Both also show that medicine has never been able to contain hysteria; it has always also existed as metaphor and cultural trope. This is medical history in its most generous dimensions, firmly embedding notions of physical and psychological health and illness in the broader historical context, and therefore encompassing not only the relations of doctors to the state or to their patients, but also such diverse topics as the formation of class and gender identities and the interplay of medicine, literature, and art.

The next two books under review consider male psychological subjectivity rather than medicine and male mental illness. Michael Roper and Jessica Meyer take very different approaches to this topic, but both books reflect the emphasis on fluidity and adaptability in recent scholarship on the making of modern masculinities. Meyer’s Men of War: Masculinity and the First World War in Britain (2009) examines a range of ‘personal narratives’ – letters home from the front, wartime diaries, letters of condolence, letters from disabled servicemen to the Ministry of Pensions, and post-war memoirs – to explore ‘how British servicemen who fought in the First World War used their experience to define themselves as men, both in relation to other men and to women’ (p. 2). She argues that two identities emerge most clearly as masculine ideals in these texts, the domestic and the heroic, and that these identities were central to social definitions of appropriate masculinity during and after the war, although they were also fraught with tension for individual men. Her focus is therefore on the construction of male identities in different narrative forms, including the different ways in which the expression of these identities was affected by the audience being addressed.

Although Meyer makes only a few explicit comments on his work, Michael Roper is one of the historians who have done most to establish and invigorate the modern history of masculinities in Britain. The volume of essays he co-edited in 1991 with John Tosh, Manful Assertions: masculinities in Britain since 1800, was an important text in confirming within the mainstream of historical thought the notion of masculinity as historically and culturally constructed. Now Roper is once again at the forefront of his field in moving beyond cultural construction. Since the beginning of the decade, he has published a series of articles on the First World War and memory, masculinity, and subjectivity which have explored the immediate and lasting emotional and psychological repercussions of the war for individuals and their families.(16) The Secret Battle: emotional survival in the Great War (2009) extends and deepens this research.

Using many of the same types of source as Meyer, with a particular reliance on letters, diaries, and memoirs, Roper shows that although some soldiers may have felt alienated from the civilian world during wartime, home and front were ‘structurally connected and inter-dependent’ (p. 6). Soldiers relied on families for bodily comforts and for emotional sustenance, and these needs ensured a constant physical and psychological traffic between home and front. Like Meyer, then, Roper stresses both martial and domestic aspects of soldiers’ subjectivities, although he avoids any hint of the bifurcation she suggests: indeed, the whole force of his argument is directed towards showing that the domestic and the soldierly were mutually dependent aspects of identity. In contrast to Meyer’s emphasis on the textual construction of male identity, Roper is concerned with emotional subjectivity and psychological survival, and particularly how family relationships influenced men’s experiences of the Western Front.

Neither of these books is ‘about’ trauma, but it would not be possible to write a full history of the emotional experiences of soldiers, or the effects of war upon masculine identities, without touching upon the subject. Roper and Meyer therefore both tackle shell shock in the course of their books, but are concerned with the experiences of servicemen more generally rather than those of soldier-patients in particular. This is an important distinction: in academic studies as well as in the popular imagination, the coward, frightened boy, and shell shock victim invoked by Hynes as new characters in the cast of war are often conflated or compressed into a single historical actor. Yet there were clearly differences in the experiences of those men so incapacitated that they were removed from active military service, and those who managed to cope sufficiently well to remain with their units. As Roper notes, studies of shell shock often have little to say about ‘the majority who continued to carry out their military duties with at least a minimum of competence, but who suffered from periodic or even chronic emotional disturbances’ (p. 247). These are not the same men who feature in the archives of shell shock, which record the experiences of those formally diagnosed and processed by the medico-military bureaucratic complex. Yet the shell shocked and the ‘merely’ suffering existed on the same experiential, and therefore emotional, spectrum. To understand how the war was experienced by serving soldiers, we must move beyond trauma; to understand trauma, we need to understand how the war was experienced by serving soldiers. The four books under review suggest the many different ways in which such understanding might be achieved.

Hysterical Men

Hysteria is at the heart of historiographical interpretations of gender and shell shock. This is in no small part due to the influence of Elaine Showalter’s work, which is imprinted on virtually every discussion of the war neuroses which makes reference to gender.(17) Showalter’s reading of shell shock was driven by contemporary feminist scholarship on hysteria as ‘the daughter’s disease’.(18) Within this tradition, hysteria was viewed as both the product of female oppression, and a physical and mental rebellion against this repression. Hysteria encapsulated the history of female suffering, protest, and stigmatisation. Showalter’s description of shell shock as ‘an epidemic of male hysteria’ therefore carried heavy ideological freight. In her view, shell shock was both perceived and experienced as emasculating and effeminising its subjects. Shell shocked soldiers felt themselves to be less than men; they were also viewed by others as displaying feminine characteristics. In fact, ‘shell shock’ was a popular term because it provided ‘a masculine-sounding substitute for the effeminate associations of “hysteria”’. Shell shock was, moreover, a ‘disguised male protest’ against both the war and the Victorian masculine ideal. The influence of this model of manliness was also evident in the differential application of diagnostic labels and treatments applied to ranking men and officers. The hysterical soldier was, like the hysterical woman, perceived as ‘simple, emotional, unthinking, passive, suggestible, dependent, and weak’, and was treated with harsh disciplinary therapies. The ‘complex and overworked neurasthenic officer was much closer to an acceptable, even heroic male ideal’, and so was treated with analytic therapies which stressed self-knowledge.(19)

Historians of shell shock have tended to accept Showalter’s account of the relationship between hysteria and shell shock even where they have rejected the finer details of her argument or the validity of her general approach. The standard view, then, is that doctors saw hysteria as a female disorder, and therefore perceived shell shock as a type of feminine behaviour in male subjects; this led to the belief that traumatised soldiers were effeminate or unmanly. Showalter’s interpretation of the war neuroses, which has become the dominant historiographical reading of shell shock as a gendered diagnosis, rests on the identification of hysteria as the quintessential female malady. If hysteria was not historically a female disorder, this version of the relations between medicine, shell shock and gender is much less convincing.

Although Mark Micale’s Hysterical Men constitutes a major revisionist reading of the history of hysteria, it only partially challenges the view that hysteria has been constructed predominantly as a female malady. Micale’s central argument is that at several moments in hysteria’s long history, doctors have borne witness to the existence of the disorder in men. These glimpses of recognition, however, have been repeatedly suppressed from the official discourses of science and medicine. The history of male hysteria is a history of medical ‘anxiety, ambivalence, and selective amnesia’ (p. 7). The book, which is organised chronologically, traces the history of the male hysteria concept from the late Renaissance (when the gynaecological model of the disease favoured by the ancients was first disputed) up to fin-de-siècle Vienna and the work of Freud. In the first century of its existence, between the 17th and 18th centuries, male nervousness flourished in a culture which recognised no hard-and-fast divisions between science, psychology, and literature. Between the late 18th and the mid 19th centuries, however, all this changed. As gender dichotomies were constructed and strengthened within science and culture, female hysteria flourished in medical discourse while male hysteria was submerged. As scientific and artistic commentaries on mind and body gradually diverged, male nervousness disappeared from medical texts and became the preserve of artists and writers.

Micale argues that male physicians had a range of reasons and a number of strategies for distancing themselves from hysteria. The weakness and emotionality hysteria appeared to reveal could not be countenanced by male physicians because it undermined the image of strength and authority they needed to project, and threatened their self-image as rational men of science. Male hysteria did not disappear, but it was feminized, sexualized, pathologized, and portrayed as a moral deviation. It was flatly denied, re-diagnosed, or qualified to the point of non-existence; when acknowledged, the male hysteric was portrayed as effeminate and barely a man at all. Even such an important champion of male hysteria as the French neurologist Jean-Martin Charcot (1825–1893), who consciously sought to subvert the stereotypes attached to the disorder, produced a model of the illness that was subtly and complexly gendered. Perhaps most importantly, one group never appeared in Charcot’s work on male hysteria, even though he treated its members in private practice, and belonged to it himself: wealthy bourgeois men.

The heightened cultural anxiety of fin de siècle Europe, which provided the backdrop to Charcot’s work on male hysteria, encouraged radical explorations of masculinity in a range of new biomedical discourses, including evolutionary biology, eugenics, criminal anthropology, endocrinology, and sexology. Yet although in France discourses of male hysteria flourished, there was still a strong current of resistance to the subject within medicine. Even Freud, who diagnosed himself as hysterical and encountered male hysteria at several crucial points in his early career, chose not to publish case studies of male hysterics. For Micale, this proves ‘how difficult it would be for any male scientist of this era to transcend the inherited categories of masculinity and femininity and to break out of the historical “prison of gender”’ (p. 275).

This is a valuable book which, as the subtitle promises, exposes ‘the hidden history of male nervous illness’. Micale has amassed an impressive quantity of material on male hysteria, and it will be difficult for future scholars to ignore the historical existence of the disorder. There is an impressive tradition of feminist scholarship on hysteria, and the role of feminine ‘nerves’ in cultural constructions of womanhood has long been appreciated by historians of women. Hysterical Men provides an important parallel history of male nervousness and its role in structuring visions of ideal masculinity. In arguing that male physicians ignored hysteria in their own sex, Micale does not challenge the historical association of hysteria with femininity, but he does add a new layer to our understanding of the construction and operation of this association, and shows how it affected men as well as women. This is often achieved through a re-reading of apparently familiar episodes in the history of hysteria, gender, or sexuality: as, for example, when he highlights the importance of male hysteria in Charcot’s work, or when he points out the radical de-stabilisation of ‘male’ and ‘female’ in the work of sexologist Otto Weininger, whose anti-Semitic and misogynist writings are more usually portrayed as upholding gender conservatism. The intersections of the medical, cultural, and political are thoughtfully explored, and he has a gift for placing particular texts or thinkers within the broader sweep of history without over-simplification.

As with any synoptic history, some chapters are stronger than others. The middle part of the book, which deals with the Victorian ‘eclipse’ of male hysteria, the work of Charcot, and fin de siècle explorations of the disorder, shows Micale at his most assured. By contrast, the opening chapter, which ranges from the first mention of a wandering womb in an Egyptian papyrus dating from c.1900 BC up to Samuel Johnson’s Rasselas in 1759, inevitably seems sketchy in places. There are numerous scholars of early modern Europe who would dispute his throwaway assertion, in the course of arguing that demonological theories conceptualised hysteria as archetypally female, that ‘There were no male witches’ (p. 10).

The final chapter, on Freud and the origins of psychoanalysis, poses certain challenges. Micale convincingly argues that male hysteria played an important part in formulating Freud’s thought at critical junctures in the early history of psychoanalysis, but at times seems eager to excuse Freud for imagined ‘flaws’ in his attitudes towards male hysteria, gender, and sexuality. He suggests, for example, that if the more conservative Breuer had not co-written Studies in Hysteria, it might have included case studies of male hysterics. This is presented honestly as pure speculation, but it is unnecessary and unconvincing speculation: if Breuer was one of the forces preventing Freud from making his heartfelt commitment to a gender-neutral theory of hysteria public, why did he not publish any studies of male hysterics after his short-lived association with Breuer ended? This type of special pleading undermines an otherwise thought-provoking addition to the volume of scholarship on Freud, gender, and hysteria.

The decision to end the book with Freud is interesting. As the master historiographer of hysteria, Micale was surely aware that in this respect his book echoes the structure of Ilza Veith’s classic Hysteria: the History of a Disease, first published in 1965 (20), and it is difficult to believe that this was not a deliberate decision. Veith has been criticised for a teleological reading of the history of hysteria, combing earlier medical texts for anticipations of Freudian thought and presenting the birth of psychoanalysis as the end point of the history of hysteria. Micale may have left himself open to a similar set of charges. There is, however, a good case for arguing that twentieth-century hysteria was a different type of beast, not least because within a few decades the Freudian viewpoint became so pervasive, and that this is a logical end point for the book.

A more important question is whether Micale’s central argument stands up: by producing such a mass of medical writing on male hysteria, he makes it harder to maintain that the concept has been rigorously suppressed for most of its history. He convincingly argues that when forced to acknowledge male hysteria, physicians have employed a range of strategies which served to minimise its importance; but if these doctors had been completely unable to countenance the psychological frailty male hysteria threatened, then they would have left no record of it at all, and this history could not have been written. It may just be that the suppression was partial because of the supreme self-confidence of white, heterosexual, middle-class male physicians, and that they were able to successfully ‘other’ male hysterics. This is a possibility denied by Micale in his conclusion because male hysteria was, apparently, ‘a discourse of the self’ rather than ‘a construction of […] collective others’. Yet surely it is only with psychodynamic psychology that hysteria did become a discourse of the self, rather than a collection of symptoms or a sign of hereditary taint? Ultimately, however, although not every reader will agree with Micale’s conclusions, all should appreciate the range of questions he has asked – and, more than this, should find it surprising that these questions have not been asked before. This is not only a fine book, but an essential one.

Although it stops short of 1914, Micale’s work should be of interest to historians of shell shock and gender because it shows that, contrary to assumption, the existence of hysteria in men did not necessarily surprise wartime physicians. Yet although hysteria was not an exclusively female malady, it was a highly feminised and stigmatised disorder, and, if Micale’s central argument is accepted, one which doctors preferred to ignore or evade. To this extent, his research mounts only a limited challenge to conventional views of shell shock as a gendered diagnosis, and even confirms a modified version of the Showalter thesis. Paul Lerner’s Hysterical Men: war, psychiatry, and the politics of trauma in Germany, 1890–1930 is potentially far more disruptive for this argument, although its full implications do not appear to have been realised since its original publication in 2003.

Lerner argues that in late 19th-century Germany, male hysteria was not denied or suppressed. On the contrary, for some ambitious medical men, it became the diagnosis of choice. This is explained by the peculiar history of the male hysteria concept in Germany, where it emerged hand-in-hand with debates on the traumatic neuroses. In the 1880s, as a result of rapid industrialization and coinciding with both Bismarck’s social insurance legislation and Wilhelmine concern with collective health, industrial and railway accidents multiplied. As more and more men involved in these accidents presented hysteria-like symptoms, medical debates raged over the origins of these disorders. The neurologist Hermann Oppenheim argued that the traumatic neuroses originated in material damage to the nervous system, but his critics countered that they were nothing more than hysteria. The question had important political implications: if the symptoms were caused by the pathological mental processes of the hysterical individual, then the state or employers were not liable to pay compensation to the victims of accidents. The association of hysteria with work therefore displaced its traditional gender identity, and made it a preferable diagnosis for employers and the state. When the war broke out and soldiers began to break down, these debates were replicated and reached the same conclusion. German military doctors decided that soldiers were hysterical rather than suffering from traumatic neurosis; this was seen as a patriotic diagnosis which not only minimised the cost to the state, but prevented the individual from developing a crippling ‘pension neurosis’ in pursuit of compensation.

In this thoughtful and comprehensive book, Lerner traces the origins and consequences of the trauma concept from the 1890s into the post-war years. The recurrent theme is how ‘the surrounding political, economic, and social context influence[s] diagnostic change in the history of psychiatry and how scientific ideas can resonate with broader cultural patterns’ (p. 62). He never denies individual suffering, but always shows how it is ‘constructed by larger social and medical forces’ (p. 10). In 1914, German neurologists and psychiatrists greeted the outbreak of hostilities with patriotic fervour, portraying war as a health-giving agent which would cleanse the nation. This encouraged them to see mentally ill soldiers as a threat to national unity, and themselves as loyal servants of the nation performing an essential duty by combating male hysteria. The stigmatization of war hysteria had one benefit for soldiers. Men who had broken down once were perceived as liable to crack under pressure if returned to active service, where they risked spreading hysteria among their units (it was widely perceived as contagious) and putting the lives of healthy German soldiers in danger. This meant that psychiatrists defined cure as the ability to work rather than the return to fighting fitness, and unlike in other combatant nations German war neurotics were not returned to the front. ‘In such a way, health, morality, and productivity were blended into a normative concept of appropriate masculine behaviour’ (p. 127).

Their loyalty to the state also meant that German military doctors were willing to experiment with a range of therapies in the interests of turning broken soldiers into productive workers. These ranged from the relatively benign, such as the suggestive therapies most strikingly illustrated by the neurologist Max Nonne’s theatrical displays of hypnosis, or the programme of work therapy and rehabilitation developed at special neurosis stations, to the wholly unexpected, such as the alliance between the leaders of the psychoanalytic movement and the political and military authorities of the Central Powers (lured by the promise of deeper cures which could return men to the field) which, had the war lasted longer, would have resulted in the creation of special psychoanalytic clinics for the treatment of war neurosis. The dark side to this experimentation was the acceptance of methods which relied on pain and humiliation to succeed, such as the notorious Kaufmann cure (Überrumpelungsbehandlung), which involved the application of strong electrical currents accompanied by relentless verbal suggestion. This proved popular because it was seen to produce rapid cures and to be an easy method to learn even for inexperienced physicians. There were a number of objections to the method – it was inhumane, painful, and there were documented cases in which it produced injury and even death – but use of the Kaufmann technique was still not restricted until the final weeks of the war, when it met with patient protest and popular resistance. The collectivist ethos of medical men meant that the ends were seen to justify the means. In the post-war years, when their wartime activities were heavily criticised, psychiatrists even argued that such unjustified attacks made them the real victims of the war; and in their attempts to deny veterans compensation and thereby to avoid crippling the state with an enormous pensions bill, they portrayed Weimar welfare provision, rather than war, as the true threat to mental health and national efficiency.

It is possible that specialist historians of imperial or Weimar Germany might criticise some of Lerner’s assertions; certainly he is not shy of challenging historiographical constructs such as the notion of the special path (Sonderweg) of German history, or the Weimar ‘culture of trauma’. This historian of shell shock, however, finds it difficult to discover flaws in the book. One of the major achievements of Lerner’s Hysterical Men is its deep ‘historicization of World War I–era psychiatry into its multiple early-twentieth-century contexts’ (p. 4). This means not only a lively and continual awareness of the social and economic dimensions of the male hysteria diagnosis, but a keen appreciation of the motivations of psychiatrists and neurologists, including the jostle for scientific and professional supremacy between the two groups, the social context of psychiatric practice, and the ways in which practice shaped theory and vice versa. He argues, for example, that Hermann Oppenheim’s championing of the traumatic neurosis concept was rooted in nineteenth-century liberal traditions, in which patient complaints were taken seriously and patients themselves were minutely examined. The new, younger generation of doctors did not work in private practice with individual patients, and were likely to take a collectivist approach to social problems in which the individual patient was submerged. The prevalence of anti-Semitism meant that it was easy to depict the Jewish Oppenheim as alien and unpatriotic when his ideas seemed to threaten the full-blooded pursuit of the war. The victory of the male hysteria diagnosis is therefore presented as the concatenation of a series of forces: political and economic imperatives, racial/religious prejudice, generational conflict, and trends within medical culture.

The other major achievement of Lerner’s research is to provide the fullest and most nuanced national history of masculinity and trauma in the First World War. Here, gender is not relegated to a separate chapter or its operation as a historical construct assumed; rather, the importance of gender in shaping medical attitudes and expectations, and consequently patient experiences, permeates every page. Lerner acknowledges his debt to the work of Elaine Showalter, and explains that there has been little written on gender and trauma in the German context. He argues that, in contrast to Showalter’s analysis of shell shock as a gendered diagnosis in Britain, there is an absence of explicit feminization of traumatised patients in German medical writings. The operative opposition here is not between masculinity and femininity, but between healthy masculinity and the pathological lack of male behaviour such as working, fighting, and patriotism. Lerner sees this as evidence of the particular national context of the debates on hysteria, work, and war which he discusses, but it could also be argued that it demonstrates the necessity for more in-depth attention to how male hysteria was constructed, and the concept deployed, in all combatant nations.

Micale’s work shows that male hysteria has a surprisingly long history, and that it has shifted shape in response to political and social developments as well as changes in the doctor-patient relationship; Lerner confirms that the meanings of male hysteria were determined by the perceived needs of the nation, and the complex of relations between state, medicine, and soldier-worker-patients, and that this combination of circumstances was enough to overthrow the traditionally feminine associations of hysteria. Together, these books suggest that more detailed research on the specific contexts (historical, national, and chronological) in which the diagnosis of male hysteria has been deployed may yet alter our understandings of its operation in wartime. If doctors were familiar with male hysteria, or if in one country they believed it was the most appropriate diagnosis for soldiers who had broken down, surely it is time to reconsider how the label is used in histories of masculinity, shell shock and trauma?

Masculinity and Emotional Survival

A number of labels, old and new, were applied to men who were processed by the medico-military bureaucratic complex between 1914 and 1918: not only hysteria, neurasthenia, traumatic neurosis, and shell shock, but also manifold hybrid diagnoses and permutations on the same theme. The precise choice of appellation reflected the symptoms displayed by the patient and the aetiological explanation favoured by the doctor. From the point of view of the medical historian, the differences in these labels, the nuances of opinion they reveal and conceal, are fascinating and integral to understanding the construction of trauma as a diagnosis. For others, the differences in diagnostic terminology are meaningless, and what is important is that these labels have the same subject: men in pain as a result of their exposure to war. Yet of course, these labels do not cover the full spectrum of suffering, or the range of ways in which men could be affected by war. In all combatant nations, the number of men diagnosed and treated for shell shock during the war formed only a small proportion of all those who fought. The trauma suffered by many soldiers was doubtless ignored or misdiagnosed, but the fact remains: although the First World War pushed men to the limits of their endurance, most did endure. Yet stating the fact does not explain it. The questions of how such endurance was achieved, and what the war ultimately cost even those who seemed to emerge relatively unscathed, continue to exert a grim fascination.

‘Endurance’ seems a particularly apt word to describe the qualities required of soldiers in the trenches. Jessica Meyer argues that the notion of endurance as a masculine ideal was actually a product of the war. Victorian and Edwardian models of masculinity had emphasised self-control, but endurance was something a little different. ‘Men who endured were those who controlled their emotions not only in the moment of fear and stress but also when confronted with the on-going horrors of warfare’ (p. 142). As this example suggests, the very terms in which we think about soldiering are a result of the redefinition of masculinity caused by the First World War. Meyer’s work uses a range of personal narratives to examine the complex processes by which masculine identities were constructed and reconstructed during and after the war, and emphasises the fluidity and complexity of male identities in wartime. Civilian and domestic identities were crucial in shaping perceptions of martial masculinity among volunteers and conscripts, and the version of soldiering they presented to their families and friends. The civilian persisted within the soldier. Similarly, the ideal of the soldier hero persisted long after the experience of the trenches had shown most men the difficulty of ever attaining this ideal, although it was also modified (as through the new emphasis on endurance) and had to sit alongside new ‘culturally powerful identities’ such as that of the male victim (p. 5).

The decision to focus on personal narratives is one of the strengths of this book. Meyer shows that although there are potential problems with using such narratives, such as the extent to which any narrative can be held as representative of a mythical unified ‘war experience’, common threads of understanding nevertheless emerge from these sources which enable a greater understanding of the effects of war on representations of male identity. The chapters in Men of War focus on different textual sources and proceed chronologically, allowing Meyer to reflect on the role of memory in shaping narratives of war. The first two chapters deal with wartime material (letters from the front and wartime diaries); a chapter on letters of condolence bridges the wartime and post-war worlds; and the final two chapters, on letters from disabled ex-servicemen to the Ministry of Pensions and on war memoirs, deal with soldiers’ attempts to negotiate the world after the war. There are occasional problems with this structure: soldiers appear to have recorded many of the same concerns in diaries and in memoirs, and although there is clearly a point in comparing the subtle differences in these narratives, the material on horror (for example) seems repetitious in places. For the most part, however, the structure highlights both the particularity of different types of sources and the diverse arenas in which martial masculinities were acted out. It has the benefit of showing not only how soldiers constructed their own masculine identities, but how these varied with intended audience, and how others (mothers, military superiors, pension officials) contested or supported these constructions.

The book’s other great asset is the wealth of original source material, generously quoted, which Meyer has unearthed. Letters of condolence, and letters to the Ministry of Pensions, are excellent and under-explored resources for the social history of the First World War. Although soldiers’ letters, diaries, and memoirs are commonly used by historians of the First World War, it is rarer to find attention given to these as texts governed by particular rules and narrative forms, as well as shaped by intended audience – and, of course, the stories told by these men are thoroughly absorbing. The case study of a particular soldier which ends each chapter illustrates general themes and provides an in-depth analysis of an individual’s construction of his own identity as a soldier or veteran in a particular narrative form, but is particularly welcome as an opportunity for greater engagement with these life stories.

Men of War is therefore undoubtedly an interesting book, but it is also flawed in certain respects. In the introduction, Meyer emphasises that ‘not all men experienced the same war in the same way’; differences in social class, regional background, and the particularity of individual experiences of the war militate against the assumption of one shared war experience (p. 10). Yet throughout the book, all too often she does not provide relevant information about individuals. Their pre-war lives remain a blank. In part, this may be a result of the types of source material – surely few individuals began their letters home by listing all the information a census-taker might require – but any type of personal narrative usually produces some clues as to the writer’s social status or pre-war occupation. The failure to comment on these aspects of identity undermines the analysis in places. For example, the opening sentence of an in-depth analysis of one individual states, ‘One man who was undoubtedly changed by the war was Lt C. S. Rawlins’ (p. 40). The next sentence summarises his war experience from enlistment until the Battle of Loos. As far as the reader is concerned, Rawlins had no life before 1914; it is therefore difficult for Meyer to show, or the reader to judge, how Rawlins was changed by the war.

Meyer’s use of Rawlins also illustrates an occasional tendency to strain too hard to make the evidence fit the thesis. Consider the following quotation from one of Rawlins’ letters:

Our best and fittest men are daily being killed & wounded: all our best blood is going to waste, & our race is bound to suffer terrible depreciation in consequence & we ought to do all in our power to lessen this for the sake of our country’s future … every single man will have to marry ‘after the war’.

According to Meyer, this ‘casting of the problem of manpower in terms of marriage and fatherhood exposes the extent to which the domestic and the military were merged in his view of the world’ (p. 45). Yet surely the extract quoted does not display a concern with military manpower, but rather with racial decline as a result of war? It does not discuss marriage and fatherhood as part of a traditionally domestic identity, but rather as public duties which must be performed for racial regeneration. If these comments prove anything, it is not Rawlins’ negotiation of domestic and martial identities, but the prevalence of discourses of degeneration and eugenics in wartime Britain (it might also be added that they suggest some clue as to the class identity of C. S. Rawlins, never mentioned in Meyer’s discussion of his letters).

As the focus on different types of text suggests, Men of War is concerned with the construction of male identities rather than psychological subjectivity. The emphasis on construction often results in the lack of a sense of identities as inhabited, as lived realities. Men did not simply construct martial and domestic identities: they lived as soldiers, with all the bodily deprivations and psychological turmoil that entailed, and they were fathers, brothers, and sons. To speak of constructing an identity implies a degree of agency and awareness in regard to subjectivity which may be proper to self-representation, but does not cover the inevitable blind spots in self-knowledge. Of course, to a certain extent, all historians deal with retrospective constructions rather than unmediated experience. The text stands between scholar and subject. Some would argue that, properly speaking, the text is the subject and there is no unmediated experience – even the oral historian has to wrestle with the manifold problems of memory. Yet there is nevertheless a gap between experience and representation which is never fully explored or acknowledged in Meyer’s book, and in this she is representative. It is a gap which few historians attempt to negotiate, either because they do not subscribe to social and cultural constructivism, or because they believe it is impossible to find a way round (or through) the text without a retreat to essentialism. Michael Roper may not escape charges of essentialism, but in The Secret Battle: Emotional Survival in the Great War he has produced a work which is resolutely about experience rather than representation, which is methodologically innovative and empirically flawless, and which is theoretically sophisticated yet should appeal to a wide reading public, not only professional historians.

Roper seeks to explain how young British civilian soldiers survived the First World War. He argues that family relationships were ‘a source of practical survival skills and support, and played a crucial role in sustaining the morale of this largely young, amateur army’. During the war these relationships were largely conducted through the long-distance means of letters and parcels, ‘but drew their strength from a much longer history, whose legacy could be seen not only in the soldier’s domestic skills and memories of home, but in his deepest states of mind’ (p. 1). The book is divided into three sections: ‘Mothers and sons’, which examines the psychological and physical interchange between home and trenches in the form of letters and parcels; ‘Mothering men’, which considers the relations between maternal care and domestic survival; and ‘Falling apart’, which explores the emotional experience of trench warfare and its aftermath: grief, terror, horror, dread, anger, alienation, revulsion and love. Throughout, Roper shows that home and the Western Front were not separate spheres, and men’s identities were not split into aspects of the soldier and the civilian. The two fronts were organically blurred, not only through physical and emotional traffic between the man and his family, but in the person of the soldier who might sometimes feel alienated from good old Blighty and all in it, but who nevertheless carried home with him wherever he went.

The men Roper writes about are not martial heroes, or even simply officers or sergeants, but fathers, brothers, nephews, and above all sons. The emphasis throughout the book is on maternal relationships. One explanation for this is the inspiration Roper derives from psychoanalytic ideas, which he explains helped him to understand the impact of trench warfare on the mind as well as how to interpret letters, diaries and memoirs. Another, however, is that the mother-son relationship did govern the emotional lives of most young men in the early 20th century, albeit in ways dependent on the social class and particular circumstances of different individuals. One page into the book, Roper presents us with examples of men crying out for their mothers as they died; he spends the rest of it attempting to explain why this was, and what it tells us about the emotional lives of soldiers. An interest in psychoanalysis might explain the decision to highlight these stories, but it does not explain why the dying word of so many men, of so many different nationalities, was ‘Mother’ (or, for the more cynically-minded, it does not explain where such stories originated or why they were so often repeated). Psychoanalytic theory informs and enriches this history rather than dictating its content, although it is nevertheless integral to the history that Roper has produced. To put it another way: it would be possible to remove most of the explicit references to psychoanalysis in this book without altering the interpretations and conclusions that are arrived at through its use, and still satisfy the most anti-psychoanalytic and empirically-minded historian.

The book gathers force through the accumulation of flashes of revelation which appear to the reader as moments of recognition. Most of these examples involve linking up men’s pre-war and domestic experience with their experiences as soldiers, and they usually demonstrate a keen awareness of the importance of social class in structuring both emotional and bodily experience. In many of these cases, Roper is not unveiling new evidence but placing information in a different context which exposes new dimensions or significance to it. Often, they reveal the extent to which assumptions about gender have governed the questions previous historians have asked and the histories they have written as a result. For example, he not only reminds us of the well-known fact of overcrowding among poor families, but illustrates this with reference to a particular case and suggests how social background differentiated experiences of war:

When Charles Taylor, a tunnel construction worker, signed up in April 1915, fourteen people were living in his mother’s six-roomed house in South London, and he slept on the couch in the front room. Officers sometimes wondered how the men managed to get to sleep in primitive billets or funk holes, but many had never enjoyed the privacy of a separate bedroom. While the clerk might have found it bewildering at first to have to sleep twelve to a bell tent in the Army, the semi-skilled working man Charles Taylor probably did not and he was used to putting away his bedding by day (p. 181).

Elsewhere, he shows that although historians have started to investigate the influence of civilian resources on wartime survival, they have usually focused on the public sphere. For working-class men, the neighbourhood, workplace, and mutual associations have been discussed at great length, but the home has been underplayed. For the officer class, historians have assumed that the experience of looking after boarders at public schools determined the nature of relations with rankers, rather than exploring the influence of relationships with domestic servants. Yet given the youth of British Army recruits, the home rather than the public sphere was the most immediate source of knowledge and point of reference for most. Not only this, but the army itself was also a domestic institution, which fed, clothed and sheltered its recruits. Therefore

[when] the soldier took out his sewing kit or ‘housewife’ to mend his tunic, heated up his rations on a Tommy cooker, or tried to rid himself of the ubiquitous louse, he performed a domestic task, sometimes with help or advice from his mother. How things had been done at home influenced how, in this largely non-professional Army, things would be done at war (pp. 161–2).

It might be speculated that historians have downplayed the influence of the private sphere, or failed to look at the home, because they are so used to conceptualising masculine identities as formed in the public world. Certainly, only rather rigid preconceptions about sex and gender roles can explain why historians have often described the caring roles performed by subalterns as ‘paternal’. Yet as Roper points out, the practical tasks performed by officers (ensuring that men were fed, watered, sheltered and generally kept in the greatest physical comfort possible) were those most usually performed by mothers, not fathers, while the emotional qualities they were required to demonstrate (keeping order and regulating minor punishments, while all the time remaining sympathetic and alive to their charges’ needs) were again maternal rather than paternal. Perhaps most tellingly, subalterns often compared themselves to mothers, not to fathers; yet historians have transmuted these direct comparisons into evidence of ‘paternal’ feeling simply because the tasks were performed by men (pp. 165–6). This is a perfect illustration of Roper’s ability to make the reader see familiar material with fresh eyes, and in doing so to turn old assumptions about gender, masculinity and war upside down while providing an enriched and deepened understanding.

These examples also demonstrate how the book fulfils its overarching aim of moving experience to centre stage. This involves a more thoughtful approach to memory than is often taken in such histories. Roper attacks (elegantly, but nonetheless fatally) the orthodoxy that retrospective accounts are ‘tainted by memory’ and that contemporary sources are somehow closer to the ‘truth’ of the event. He points out that proximity to events could sometimes prevent understanding. ‘What these men experienced was sometimes too disturbing to take in; the very ability to think was under attack’, meaning that experiences ‘were not wholly constituted in language’ (pp. 20–1). In practical terms, this means reading sources such as letters and diaries in a different manner. Roper quotes from a letter written by Captain Herbert Leland: ‘Oh! Such a crump has just fallen. Mud, dust, splinters of wood all over me, but I am hanging on to this piece of paper’. It is easy to imagine another historian using this as evidence of the ability of soldiers to mediate the horror of war through humour, a stance which often falls into arguing that horror was not as extreme as it might first seem to us, because it was not felt as such by those who experienced it. Roper reads the letter quite differently:

‘the jokey tone scarcely conceals his terror of being rent apart, which was less evoked than performed in the very writing […] Only a scrap of paper, his one connection to home, kept Leland sane amidst the shellfire, and if he lost his grip on it, he might not hold together’ (p. 21).

Elsewhere he points out the significance of certain slips of the pencil. Words written by accident and then crossed out show memory running ahead of the ability to put experiences into words (p. 66). The counterpart of this sensitivity to the potential readings of contemporary records is an awareness of the value of reflection in retrospective accounts. These may not tell us exactly how someone believed they felt at the time, but they tell us something of equal value: how someone felt afterwards, when they had a chance to mediate the experience and to reflect on where it fitted into the broader outlines of a war and a life.

The Secret Battle marries the best of social history, psychohistory, and histories of emotion. It is a strident counter to the recent trend within First World War studies to argue that the horror of the war is largely a product of myth, memory, and anti-war sentiment. The fact that not every man who served died does not mitigate the bleakness of the casualty lists. The fact that not all men saw active service, or that even for those who did, ‘going over the top’ was an exceptional rather than daily occurrence, does not make the killing, dying, and dismemberment which did occur somehow less terrible. The ability of human beings to cope in the face of pain and terror does not justify the events which caused these emotions. Of course it is important to realise that the meanings attributed to the war by subsequent generations have often been shaped by the contemporary political and cultural landscape, or that many prominent representations of the war can be read differently, or read alongside other accounts which emphasised different aspects of war experience. But in pursuing these research agendas, the lived experience of this war – its daily discomforts as well as its outermost limits of horror – has sometimes been lost. The emotional experience of the war beyond coping or breaking down, meanwhile, has only rarely been examined, and the continuities between pre-war and wartime emotional experience have never been shown in such a rounded and nuanced fashion. This is a book in a class of its own, which should be read not only by scholars of the First World War, or historians of gender, but by anyone interested in the human mind and human society, past or present.

Our understandings of modern wars are formed within and against the shadow of the First World War. In turn, our perceptions of this originating conflict are shaped by what we have learnt or imagine we now know about the costs of war. This does not mean the ‘real’ meaning of the war has been lost, but that new dimensions of the experience, significance, and lasting influence of the events of 1914–18 are continually being uncovered. The rash of interest in shell shock over the past twenty years certainly tells us something about attitudes to war, trauma and psychiatry in contemporary culture, but as the four books under review amply demonstrate, it also means that we are continually gaining new perspectives on the past. Although all of these books are concerned with the emotional and psychological suffering of men, they are very different to each other. The range of approaches employed, areas addressed, and conclusions reached suggest that there is plenty of mileage yet in the overlapping subjects of male hysteria, shell shock, and the emotional experiences of men at war.

The first two books under review, those by Micale and Lerner, prove that medical history still has much to offer. In recent years historians of shell shock have self-consciously moved away from a reliance on the published medical literature, pleading that to understand the war neuroses, a variety of sources and perspectives must be consulted.(21) Shell shock studies have been dominated by cultural historians over the last decade. The tendency has been to move away from an analysis of the formal contents of diagnostic categories used by doctors and the intellectual framework within which shell shock was constructed. While this has resulted in a secondary literature which is immensely rich, as well as dynamic, the rejection of medical history raises certain problems. Most obviously, it is impossible to write a history of a category such as shell shock, which originated as a medical label applied by doctors to patients (a category therefore enmeshed in the network of medical relations at the most basic level), without including some reference to medical ideas and practices. The books by Micale and Lerner are both concerned with the diagnosis of pathological masculinity and the motivations of doctors in formulating such diagnoses, but also show how medicine reflects and refracts wider socio-economic, political, and cultural aims and assumptions. Their work is clearly of relevance to scholars in a number of fields beyond medical history.

The books by Roper and Meyer, on the other hand, show that histories of trauma are perhaps most productive, and tackle their subject most properly, when they extend beyond the medical. The term shell shock long ago escaped from medical journals into the vernacular; we might speculate that it has survived there because it seems so peculiarly apt as a way of describing not only those individuals who have undergone medical diagnosis and treatment, but others who experienced traumatic events without being subject to such interventions, or even the effects of war on whole societies. Indeed, Susan Kingsley Kent’s recent Aftershocks: politics and trauma in Britain, 1918–1931 (22) even argues that political events in the immediately post-war years are best explained as a result of collective shell shock among Britons. This might be too extreme for some tastes, but both Roper and Meyer show that the war had lasting psychological and emotional repercussions on those who fought it. Not all men were traumatized, but many had borne witness to traumatic events, and the wartime and post-war histories of these men also need to be written.

The history of trauma beyond the purely medical, or of wartime emotional experience beyond trauma, can fruitfully interact with medical histories: what was the difference between a traumatized soldier and a shell shock patient, or between those who were diagnosed and those who were not? What determined the translation of an emotional state into a formal medical diagnosis? Or, just as importantly, what made some men break down while others coped, however well or badly? The last question foxed doctors during the First World War, and we still have no convincing answer to it today. Although this is depressing to acknowledge, it does suggest one reason why shell shock continues to hold us in its thrall: these are live mysteries about the possibilities and limitations of the human mind. In many respects, these are not historical questions, and if they can be answered, it will not be through history. Yet as these books show, albeit in their very different ways, it is important as historians that we keep asking these questions, not only to learn about humans in the past but because the act of asking continually throws up new questions and new ways of seeing.