A story could start almost anywhere. This one begins at a moment startled
by a rocket.
In the autumn of 1957, America was not at war ... or at peace. The threat of nuclear annihilation shadowed every day, flickering with visions of the apocalyptic. In classrooms, “duck and cover” drills were part of the curricula. Underneath any Norman Rockwell painting, the grim reaper had attained the power of an ultimate monster.
Dwight Eisenhower was most of the way through his fifth year in the White House. He liked to speak reassuring words of patriotic faith, with presidential statements like: “America is the greatest force that God has ever allowed to exist on His footstool.” Such pronouncements drew a sharp distinction between the United States and the Godless Communist foe.
But on October 4, 1957, the Kremlin announced the launch of Sputnik, the world’s first satellite. God was supposed to be on America’s side, yet the Soviet atheists had gotten to the heavens before us. Suddenly the eagle of liberty could not fly nearly so high.
Sputnik was instantly fascinating and alarming. The American press swooned at the scientific vistas and shuddered at the military implications. Under the headline “Red Moon Over the U.S.,” Time quickly explained that “a new era in history had begun, opening a bright new chapter in mankind’s conquest of the natural environment and a grim new chapter in the cold war.” The newsmagazine was glum about the space rivalry: “The U.S. had lost its lead because, in spreading its resources too thin, the nation had skimped too much on military research and development.”
The White House tried to project calm; Eisenhower said the satellite “does not raise my apprehension, not one iota.” But many across the political spectrum heard Sputnik’s radio pulse as an ominous taunt.
A heroine of the Republican right, Clare Boothe Luce, said the satellite’s beeping was an “outer-space raspberry to a decade of American pretensions that the American way of life was a gilt-edged guarantee of our material superiority.” Newspaper readers learned that Stuart Symington, a Democratic senator who’d been the first secretary of the air force, “said the Russians will be able to launch mass attacks against the United States with intercontinental ballistic missiles within two or three years.”
A New York Times article matter-of-factly referred to “the mild panic that has seized most of the nation since Russia’s sputnik was launched two weeks ago.” In another story, looking forward, Times science reporter William L. Laurence called for bigger pots of gold at the end of scientific rainbows: “In a free society such as ours it is not possible ‘to channel human efforts’ without the individual’s consent and wholehearted willingness. To attract able and promising young men and women into the fields of science and engineering it is necessary first to offer them better inducements than are presently offered.”
At last, in early February 1958, an American satellite -- the thirty-pound Explorer -- went into orbit. What had succeeded in powering it into space was a military rocket, developed by a U.S. Army research team. The head of that team, the rocket scientist Wernher von Braun, was boosting the red-white-and-blue after the fall of his ex-employer, the Third Reich. In March 1958 he publicly warned that the U.S. space program was a few years behind the Russians.
------------------------------
Soon after dusk, while turning a skate key or playing with a hula hoop, children might look up to see if they could spot the bright light of a satellite arcing across the sky. But they could not see the fallout from nuclear bomb tests, underway for a dozen years by 1958. The conventional wisdom, reinforced by the press, downplayed fears while trusting the authorities; basic judgments about the latest weapons programs were to be left to the political leaders and their designated experts.
On the weekly prime-time Walt Disney television show, an animated fairy with a magic wand urged youngsters to drink three glasses of milk each day. But airborne strontium-90 from nuclear tests was falling on pastures all over, migrating to cows and then to the milk supply and, finally, to people’s bones. Radioactive isotopes from fallout were becoming inseparable from the human diet.
Young people -- dubbed “baby boomers,” a phrase that both dramatized and trivialized them -- were especially vulnerable to strontium-90 as their fast-growing bones absorbed the radioactive isotope along with calcium. The children who did as they were told by drinking plenty of milk ended up heightening the risks -- not unlike their parents, who were essentially told to accept the bomb fallout without complaint.
Under the snappy rubric of “the nuclear age,” the white-coated and loyal American scientist stood as an icon, revered as surely as the scientists of the enemy were assumed to be pernicious. And yet the mutual fallout, infiltrating dairy farms and mothers’ breast milk and the bones of children, was a type of subversion that never preoccupied J. Edgar Hoover.
The more that work by expert scientists endangered us, the more we were informed that we needed those scientists to save us. Who better to protect Americans from the hazards of the nuclear industry and the terrifying potential of nuclear weapons than the best scientific minds serving the industry and developing the weapons?
In June 1957 -- the same month Nobel Prize–winning chemist Linus Pauling published an article estimating that ten thousand cases of leukemia had already occurred due to U.S. and Soviet nuclear testing -- President Eisenhower proclaimed that the American detonations would result in nuclear warheads with much less radioactivity. Ike said that “we have reduced fallout from bombs by nine-tenths,” and he pledged that the Nevada explosions would continue in order to “see how clean we can make them.” The president spoke just after meeting with Edward Teller and other high-powered physicists. Eisenhower assured the country that the scientists and the U.S. nuclear test operations were working on the public’s behalf. “They say: ‘Give us four or five more years to test each step of our development and we will produce an absolutely clean bomb.’”
But sheer atomic fantasy, however convenient, was wearing thin. Many scientists actually opposed the aboveground nuclear blasts. Relying on dissenters with a range of technical expertise, Democratic nominee Adlai Stevenson had made an issue of fallout in the 1956 presidential campaign. During 1957 -- a year when the U.S. government set off thirty-two nuclear bombs over southern Nevada and the Pacific -- Pauling spearheaded a global petition drive against nuclear testing; by January 1958 more than eleven thousand scientists in fifty countries had signed.
Clearly, the views and activities of scientists ran the gamut. But Washington was pumping billions of tax dollars into massive vehicles for scientific research. These huge federal outlays were imposing military priorities on American scientists without any need for a blatant government decree.
------------------------------
What was being suppressed might suddenly pop up like some kind of jack-in-the-box. Righteous pressure against disruptive or “un-American” threats was internal and also global, with a foreign policy based on containment. Control of space, inner and outer, was pivotal. What could not be controlled was liable to be condemned.
The ’50s and early ’60s are now commonly derided as unbearably rigid, but much in the era was new and stylish at the time. Suburbs boomed along with babies. Modern household gadgets and snazzier cars appeared with great commercial fanfare while millions of families, with a leg up from the GI Bill, climbed into some part of the vaguely defined middle class. The fresh and exciting technology called television did much to turn suburbia into the stuff of white-bread legends -- with scant use for the less-sightly difficulties of the near-poor and destitute living in ghettos or rural areas where the TV lights didn’t shine.
On the surface, most kids lived in a placid time, while small screens showed entertaining images of sanitized life. One among many archetypes came from Betty Crocker cake-mix commercials, which were all over the tube; the close-ups of the icing could seem remarkable, even in black and white. Little girls who had toy ovens with little cake-mix boxes could make miniature layer cakes.
Every weekday from 1955 to 1965, the humdrum pathos of women known as housewives could be seen on Queen for a Day. The climax of each episode came as one of the competitors, often sobbing, stood with a magnificent bouquet of roses suddenly in her arms, overcome with joy. Splendid gifts of brand-new refrigerators and other consumer products, maybe even mink stoles, would elevate bleak lives into the stratosphere of what America truly had to offer. The show pitted women’s sufferings against each other; victory would be the just reward for the best, which was to say the worst, predicament. The final verdict came in the form of applause from the studio audience, measured by an on-screen meter that jumped with the decibels of apparent empathy and commiseration; there was one winner per program. Solutions were individual. Queen for a Day was a nationally televised ritual of charity, providing selective testimony to the goodness of society. Virtuous grief, if heartrending enough, could summon prizes, and the ecstatic weeping of a crowned recipient was vicarious pleasure for viewers across the country, who could clearly see America’s bounty and generosity.
That televised spectacle was not entirely fathomable to the baby-boom generation, which found more instructive role-modeling in such media fare as The Adventures of Spin and Marty and Annette Funicello and other aspects of the Mickey Mouse Club show -- far more profoundly prescriptive than descriptive. By example and inference, we learned how kids were supposed to be, and our being more that way made the media images seem more natural and realistic. It was a spiral of self-mystification, with the authoritative versions of childhood green-lighted by network executives, producers, and sponsors. Likewise with the sitcoms, which drew kids into a Potemkin refuge from whatever home life they experienced on the near side of the TV screen.
Dad was apt to be emotionally aloof in real life, but on television the daddies were endearingly quirky, occasionally stern, essentially lovable, and even mildly loving. Despite the canned laugh tracks, for kids this could be very serious -- a substitute world with obvious advantages over the starker one around them. The chances of their parents measuring up to the moms and dads on Ozzie and Harriet or Father Knows Best were remote. As were, often, the real parents. Or at least they seemed real. Sometimes.
Father Knows Best aired on network television for almost ten years. The first episodes gained little momentum in 1954, but within a couple of years the show was one of the nation’s leading prime-time psychodramas. It gave off warmth that simulated intimacy; for children at a huge demographic bulge, maybe no TV program was more influential as a family prototype.
But seventeen years after the shooting stopped, the actor who had played Bud, the only son on Father Knows Best, expressed remorse. “I’m ashamed I had any part of it,” Billy Gray said. “People felt warmly about the show and that show did everybody a disservice.” Gray had come to see the program as deceptive. “I felt that the show purported to be real life, and it wasn’t. I regret that it was ever presented as a model to live by.” And he added: “I think we were all well motivated but what we did was run a hoax. We weren’t trying to, but that is what it was. Just a hoax.”
------------------------------
I went to the John Glenn parade in downtown Washington on February 26, 1962, a week after he’d become the first American to circle the globe in a space capsule. Glenn was a certified hero, and my school deemed the parade a valid excuse for an absence. To me, a fifth grader, that seemed like a good deal even when the weather turned out to be cold and rainy.
For the new and dazzling space age, America’s astronauts served as valiant explorers who added to the elan of the Camelot mythos around the presidential family. The Kennedys were sexy, exciting, modern aristocrats who relied on deft wordsmiths to produce throbbing, eloquent speeches about freedom and democracy. The bearing was American regal, melding the appeal of refined nobility and touch football. The media image was damn-near storybook. Few Americans, and very few young people of the era, were aware of the actual roles of JFK’s vaunted new “special forces” dispatched to the Third World, where -- below the media radar -- they targeted labor-union organizers and other assorted foes of U.S.-backed oligarchies.
But a confrontation with the Soviet Union materialized that could not be ignored. Eight months after the Glenn parade, in tandem with Nikita Khrushchev, the president dragged the world to a nuclear precipice. In late October 1962, Kennedy went on national television and denounced “the Soviet military buildup on the island of Cuba,” asserting that “a series of offensive missile sites is now in preparation on that imprisoned island.” Speaking from the White House, the president said: “We will not prematurely or unnecessarily risk the costs of worldwide nuclear war in which even the fruits of victory would be ashes in our mouth -- but neither will we shrink from that risk at any time it must be faced.”
Early in the next autumn, President Kennedy signed the Limited Test Ban Treaty, which sent nuclear detonations underground. The treaty was an important public health measure against radioactive fallout. Meanwhile, the banishment of mushroom clouds made superpower preparations for blowing up the world less visible. The new limits did nothing to interfere with further development of nuclear arsenals.
Kennedy liked to talk about vigor, and he epitomized it. Younger than Eisenhower by a full generation, witty, with a suave wife and two adorable kids, he was leading the way to open vistas. Store windows near Pennsylvania Avenue displayed souvenir plates and other Washington knickknacks that depicted the First Family -- standard tourist paraphernalia, yet with a lot more pizzazz than what Dwight and Mamie had generated.
A few years after the Glenn parade, when I passed the same storefront windows along blocks just east of the White House, the JFK glamour had gone dusty, as if suspended in time, facing backward. I thought of a scene from Great Expectations. The Kennedy era already seemed like the room where Miss Havisham’s wedding cake had turned to ghastly cobwebs; in Dickens’ words, “as if a feast had been in preparation when the house and the clocks all stopped together.”
The clocks all seemed to stop together on the afternoon of November 22, 1963. But after the assassination, the core of the reputed best-and-brightest remained in top Cabinet positions. The distance from Dallas to the Gulf of Tonkin was scarcely eight months as the calendar flew. And soon America’s awesome scientific capabilities were trained on a country where guerrilla fighters walked on the soles of sandals cut from old rubber tires.
Growing up in a mass-marketed culture of hoax, the baby-boom generation came of age in a warfare state. From Vietnam to Iraq, that state was to wield its technological power with crazed dedication to massive violence.
_____________________________________________________
This is an excerpt from Norman Solomon’s new book “Made Love, Got War: Close Encounters with America’s Warfare State,” published this week. For more information, go to: www.MadeLoveGotWar.com