

The American Testimony, a concise history of the United States Book 9:
The Post-Constitutional Republic Era
(Current Events from 2007 and Beyond)

© Copyright 2009 Bryan Hardesty. All rights reserved.


(NOTE: Because the American Testimony DVD series covers US history through 2006, this text provides a continuation of subsequent historical events and current social issues not included in the video collection.)

THE UNRESOLVED MORAL DEBATE

    Despite continual advancements in science and technology, questions have arisen about the progress of moral character in twenty-first century America.  Whenever the human quest for comfort, convenience, and pleasure has been pursued in the absence of empathy, compassion, and respect for others (“the better angels of our nature,” as Lincoln observed), common outcomes have been the wounding of interpersonal relationships, loss of civility, violation of the social compact, and even the destruction of life.  The desire for “a better world” for one group of people has often led to encroachments against others. To deflect feelings of guilt for such acts, such arguments as “meting out social justice” or “acting for the greater good” have been offered.  To further alleviate one's conscience, offenders may have gone as far as diminishing the very human status of those harmed.

When does constitutionally-protected life begin?    For nearly four decades, the nation's people have wrestled with the issue of on-demand pregnancy termination, and whether or not unborn children should even be considered human beings, meriting the unalienable right to life.  On a moral basis, many Americans refused to accept the Supreme Court's 1973 Roe v. Wade decision, which overrode Tenth Amendment state government powers to regulate and restrict abortion procedures within their own states.  During the early twenty-first century, however, the federal courts went further, permitting pregnancy termination procedures to extend far beyond early extraction of newly-conceived fetuses. Thereafter, the debate expanded to the issue of “partial-birth” and “full-term” abortion, involving the oftentimes slow and torturous dismemberment of fully-formed infants.

    Those Americans who have favored unrestricted abortion "rights" have focused their concerns primarily on the plight of host mothers, many of whom have been young women who, in pursuing love and happiness, unintentionally brought potentially life-changing predicaments upon themselves.  Beyond the fear of shame, doubts about financially supporting a child, along with the prospect of setting aside one’s own aspirations to raise offspring, have served as strong arguments against giving birth. Under extreme duress, even highly moral, kind-hearted people have made choices that ran against their core beliefs. 

Image of developing child in the womb.

    Although no technological method has been developed to either verify or refute theological assertions that one's “soul” is infused into the human egg at the moment of fertilization, advancements in medical imaging have enabled scientists to conclude that infants experience physical pain a mere seven weeks after conception. According to the Centers for Disease Control (CDC), more than forty percent of abortions in the United States have been performed after the eighth week of pregnancy. Thus, of the approximately four thousand abortions currently performed on a daily basis (combined CDC and Planned Parenthood statistics), more than 1,600 infants per day have suffered as their lives ended.

 

    The extent and duration of agony experienced by aborted infants varied by gestational stages.  Nevertheless, all forms of pregnancy termination, regardless of developmental phase, involved either the dismemberment or chemical burning of the unborn child.  (This includes the Suction Method, performed during the earliest weeks of pregnancy, in which powerful vacuums were employed to tear fetuses to pieces.  More developed fetuses had to be cut apart to facilitate passage through the womb.  Only the Saline Method, applied during the final trimester of pregnancy, did not necessarily involve live dismemberment.  However, the highly concentrated salt solution used in this technique caused the infants to be slowly burned to death.)  After the fifth month of pregnancy, with babies fully capable of surviving outside the womb, even secular, atheistic science could no longer offer an objective distinction between abortion and infanticide.  The most efficient method of late-term abortion was the Intact Dilation and Extraction technique (D & X; also known as Late Term D & E), involving the partial, feet-first delivery of living babies partway through the birth canal, followed by the piercing and crushing of their skulls. The heads oftentimes separated from the bodies.  On average, approximately twenty-thousand late-term/partial-birth abortions have been performed in America each year.
 

Late-term abortion

    According to combined CDC and Planned Parenthood figures, ninety-five percent of all abortions have been performed for no other reason than birth control. The remaining five percent have been collectively attributed to rape, incest, fetal abnormality, or poor health of the host mother. No convicted murderer in the United States judicial system has ever faced the “cruel and unusual punishment” of execution by dismemberment, burning, or skull crushing. The same could not be said for the millions of pre-born infants whose only apparent offense was not being wanted.

    The moral debate over abortion divided the American people to a degree reminiscent of nineteenth century pre-Civil War hostilities over slavery and states' rights.  While abortion supporters have routinely compared unborn infants to cancerous tumors, indistinct cell clusters, or blood clots (despite their uniquely human genetic structure), abortion opponents have argued that children developing in the womb were no less human than those born outside of it.  Indeed, at no point during gestation is there a transformation from one species to another.

Right-to-life defenders

   

    Because evangelical Christians traditionally comprised the largest coalition of anti-abortion activists in the United States, abortion supporters have labeled this group “intolerant religious zealots.” However, beyond the scripturally-backed faith assertions that God’s hand knitted human life in the womb from conception, abortion opponents also pointed to the first and foremost right proclaimed in the Declaration of Independence: the right to life. Although the welfare of pre-born infants has stood as their chief concern, “pro-life” advocates have been branded by the angriest abortion supporters as backward, mean-spirited “rednecks” who sought to deny women control over their own bodies.

    Lost or overlooked in most discourses on voluntary pregnancy termination was the pivotal question of a biological father’s responsibility. The choice of ending the baby’s life or raising it in less-than-ideal circumstances has been a dilemma largely faced by the unwed mother alone.  More recent sociological findings, concerning the consequences of childrearing in the biological father’s absence, have further compounded the issue.


RACE, POVERTY, AND THE FATHERLESS PROBLEM

    Although the nation’s people experienced an easing of racial tensions at the onset of the twenty-first century, the overall question of racial equality remained unresolved.  Cursory examinations of prison populations by race have often prompted misguided outcries of continuing racism in the US justice system.  White males have consistently outnumbered those of any other race among the incarcerated, but because Americans of African descent comprised only 12 percent of the nation’s total population, questions arose as to why blacks accounted for 40 percent of those behind bars.
Jail: not a happy place to be.

    Racial diversity in law enforcement and criminal justice did not change prison statistics; thus social scientists began looking beyond skin color to gain a deeper understanding of the phenomenon. Instead of uncovering widespread racial prejudice, they found a direct and irrefutable link between criminal activity and fatherlessness.

 

    Researchers from the University of Chicago, with the aid of the Federal Bureau of Prisons, discovered that 78 percent of all inmates in the nation’s prisons and jails grew up in fatherless households. Furthermore, they concluded that children of fatherless homes were 20 times more likely to be imprisoned (8 percent for murder) than those of two-parent homes, regardless of race. (In these studies, households deemed “fatherless” included those parented by never-married single mothers, divorced-but-not-remarried single mothers, step-father households, and foster homes.)
 

Kids need their dads just as much as their moms.

    In the latter twentieth century, the biological father’s role in childrearing tended to be viewed as unnecessary by pop psychologists and feminist groups. By 2007, nearly half of America’s children were identified as growing up fatherless. Social pundits who devalued fatherhood failed to recognize the uniquely male ability to mentor children in controlling aggressive behavior and managing anger. In a surprising revelation, researchers found that children growing up in two-parent homes were still twice as likely to end up in prison as those of father-only households (11% to 0.5%, respectively). This finding was attributed to the fact that most women instinctively perceived discipline as harsh and unloving, and such views undermined the father’s ability to enact appropriate corrective measures. Researchers, too, have admitted finding it difficult to distinguish between the loving, self-restrained forms of discipline conducted by most biological fathers and the rage-induced physical and emotional abuse meted out by many step-fathers or boyfriends of biological mothers. Unable to discern between the two, outside observers have held most forms of corporal punishment in contempt, crying “abuse” where there has often been none.
 

    Direct correlations have also been established between fatherlessness and poverty. Census statistics confirmed that two-parent families earned three times as much as single-mother households. Researchers also discovered a link between a child’s cognitive development and the presence of the biological father. The American Economic Review determined that children in single-mother families were three times more likely to fail in school than those from intact two-parent homes. Girls in fatherless households were two to three times more likely to become pregnant as teenagers than those living with both parents.
 

A blessing whenever the father is present.

    By objectively examining illegitimate birth statistics in the United States, social scientists were able to pinpoint the root cause of the excessively high rates of violent crime, poverty, and low educational scores among Americans of African descent. The National Center for Health Statistics reported that the out-of-wedlock birth rate in 1965 was 3.1 percent in the white population and 24 percent in the black population. By 2005, illegitimate births among whites rose to 25 percent, while blacks experienced an astounding out-of-wedlock birth rate of 70 percent. Despite the incremental decline in systemic racial discrimination on the public landscape, many Americans of African descent continued to face socioeconomic hardships due to widespread fatherlessness within the culture.
 

    For decades, superficial social pundits have pointed to abortion and sex education as solutions to the rise in illegitimate births. However, in 2007 alone, 40 percent of all births were to out-of-wedlock mothers (source: National Center for Health Statistics).  Additionally, rates of sexually-transmitted disease have skyrocketed.  These trends have occurred long after abortion services and sex education programs became widely available. Indeed, the promotion of condoms to America’s youth has been conducted in such a way as to ignore their high failure rate, giving a false sense of security that sexual intercourse could be “safe.”  Moreover, easy access to abortuaries has tended to allay fears of the consequences of promiscuous sexual activity. Only premarital sexual abstinence has proved 100 percent effective in avoiding unplanned pregnancies and sexually-transmitted diseases, and its feasibility has been established by earlier generations. (For example, during the 1930s and ‘40s, 84 percent of all single adults remained virgins until marriage.)  With levels of crime, poverty, and educational decline statistically linked to each community’s illegitimate birth rates, it has not been unreasonable to view the cultural promotion of abstinence as a viable solution to many of the negative social issues plaguing American cities.


 
PROGRESS IN IRAQ
 

The troop surge in Iraq: a positive turning point for the US.

    Another issue dividing the nation's people was the war in Iraq.  Despite the nation's overall military superiority, historical lessons gleaned from the prior US wars in Korea and Vietnam exposed America's chief weakness: the inability of its people to tolerate long, protracted warfare. For the Iraqi campaign, George W. Bush’s principal blunder was the application of his “compassionate conservative” philosophy toward the conduct of war. His “good intentions” strategy ran counter to the well-proven principle of decisive victory, as observed by the leading military theorist, Prussian General Carl von Clausewitz. In his classic 1832 work, “On War,” Clausewitz wisely observed that “kind hearted people might, of course, think there was some ingenious way to disarm or defeat an enemy without too much bloodshed, and might imagine this is the true goal of the art of war. Pleasant as it sounds, it is a fallacy that must be exposed. War is such a dangerous business that the mistakes which come from kindness are the very worst.”

 

General David Petraeus

    To rectify past miscalculations, President Bush, in January 2007, appointed Lieutenant General David Petraeus to assume top command over US forces in Iraq, replacing General George Casey. Well-schooled in the theories of Clausewitz and other leading military strategists, Petraeus embarked on Bush’s newly-proposed “surge” offensive in Iraq. The president’s request for an additional 20-thousand troops to stem sectarian conflict in Iraq was greeted with derision by the Democrat majority in Congress. On February 16th, the House of Representatives adopted a resolution critical of the Bush surge strategy, only to see the measure blocked by Republicans and conservative Democrats in the Senate.
 

    Much to the chagrin of critics, the surge campaign quickly turned the tide of war, bringing increased stability through the neutralization or outright elimination of insurgency strongholds in Iraq. Thereafter, daily life throughout Iraq was vastly improved, with only occasional outbreaks of violence.

 

FEAR-BASED PUBLIC POLICY

In hindsight, The Population Bomb turned out to be an intellectual dud.

    More than any profit-generating private business enterprise, bureaucratic government agencies have held the potential to wield enormous power over private citizens. Since public consent has been crucial to the creation of federal departments, some sort of cause or crisis has been needed to justify each new agency’s existence.  With the availability of monetary incentives in the form of research grants and other taxpayer-funded development projects, politicians have found it easy to recruit various scholars, scientists, and so-called experts to champion a particular cause.  One such example was the global overpopulation hysteria of the 1970s, triggered by the publication of The Population Bomb by Stanford University professor Paul R. Ehrlich.  The book convinced countless educated people throughout the world that hundreds of millions would starve to death in the 1970s and ‘80s, due to overpopulation.  Although small, localized famines occasionally occurred in the world due to political strife, global food production actually increased on a broad scale.  Other outrageously false Ehrlich predictions included the extinction of all important sea life by 1980; the smog-related deaths of 200 thousand New York and Los Angeles residents by 1973; and the drop in human life expectancy to age 42 by 1980.  Though laughable in hindsight, such assertions were seriously accepted as "scientific fact" at the time they were proclaimed. Ehrlich’s alarmist theories, backed by a number of credible scholars and scientists, served to influence government policy in a number of countries, as well as the United Nations, during the 1970s. It was a case where reactionary public policies, government regulations, and media myths were driven by scientific theory instead of empirical scientific evidence.

 

    In empirical science, a hypothesis has been provable either through deep examination of physical evidence or through reproducible laboratory experimentation. Conversely, scientific theory has never been more than a conjecture, opinion, belief, assumption, or speculation about an observable but not testable finding. By failing to discern between empirical science and scientific theory, many gullible politicians, journalists, and citizens have promoted erroneous government and social agendas affecting the nation as a whole.

 

An Inconvenient Truth by Al Gore

    The lessons of Population Bomb pseudo-science went largely unlearned.  In the early twenty-first century, another body of work--this time a politically-driven motion picture--provided yet another example in which highly-selective scientific findings and error-riddled statistics were manipulated and presented in credible fashion to trigger mass hysteria.  In this instance, the theory was that of man-made global warming.  The film, An Inconvenient Truth, was a cinematic record of a lecture series by former Vice President Albert Gore, Jr.  Its message communicated a growing concern among many scientists that the Earth’s near-surface air and ocean temperatures had been steadily climbing since the 1850s, and that associated rises in sea levels and changes in weather patterns would result in devastating agricultural losses, mass migrations, various species extinctions, new viral and bacterial strains, and violent super-storms.  Although such conjecture contradicted the mid-1970s scientific consensus of an impending ice age, An Inconvenient Truth was so enthusiastically embraced that it won an Academy Award, and Gore went on to receive the Nobel Peace Prize for his efforts.  And yet, the scientific community itself was fiercely divided over the veracity of many global warming claims.

 

Is planet Earth as fragile as many think?

    The primary issue of contention was the question of human contribution to the so-called “greenhouse effect” (referring to the atmospheric trapping of sun-generated heat that bounces off the Earth’s surface). Without natural greenhouse gases, temperatures would plummet to permanent sub-freezing levels, with the most devastating thermal losses occurring at night. But while the greenhouse effect has usually been beneficial to the sustenance of life on Earth, global warming theorists asserted that industrialized human activity was producing an overabundance of carbon emissions, trapping ever-increasing amounts of heat in the atmosphere.  Indeed, while temperature records vaguely supported this notion, blind acceptance of the "man-made" aspect of the theory resulted in the implementation of numerous new laws, regulations, litigations, and tax penalties.  Those questioning the theory were derided as "global warming deniers" by university instructors, journalists, and popular entertainers.

 

    Over time, mounting empirical evidence exposed a number of scientific flaws in the man-made global warming theory. One such finding revealed that the most significant rise in global temperature actually occurred prior to 1940, proving that increases in warming happened well before the most prominent phase of global industrialization. The temperature record also provided evidence of a cooling trend between 1940 and 1965, the very time when unfiltered carbon emissions poured into the atmosphere at the highest rate. Further debunking of the man-made global warming theory came through exhaustive studies of oxygen isotopes, beryllium ions, tree rings, tiny sea and pollen fossils, stalagmites, glacier ice cores, and lake sediments. With centuries of written history available, researchers also reviewed nautical records of sea levels, census accounts of public health trends (plagues, epidemics, etc.), and archival journals and documents reporting weather occurrences in various locations.  The new findings directly contradicted the earlier man-made global warming assumptions.
 

Could fluctuations in the sun's temperature be the primary cause of climate change on Earth?

    In September 2007, the Hudson Institute, a scientific research organization, compiled peer-reviewed literature from more than 500 scientists who had concluded through empirical research that more than a dozen global warming periods identical to the present one had occurred in cyclical fashion over the last 1,500 years. Among these experts were climatologists and astronomers who also concluded that “our Modern Warming is linked strongly to variations of the sun’s irradiance.” From data gathered by NASA, the Pulkovo Astronomical Observatory in Russia, and Oxford University in England, scientists have confirmed that the sun was the primary cause of global warming, adding that temperatures on Mars rose in proportion to those on Earth. The sun, however, could not be taxed or regulated by government, and it was more profitable for certain special-interest groups to perpetuate the man-made global warming myth.

 

Former President Bill Clinton at the U.N. Climate Change Conference.

    International treaties from the United Nations were drafted to force developed nations to pay vast monetary fees for carbon emissions. Taxpayer-funded regulatory agencies were likewise created and empowered by the federal government to impose and collect fines on industries blamed for contributing to global warming.  Self-interested scientific bureaucracies have been filled with paid advisors who supported climate-change hysteria. Government research grants for scientific studies have been controlled by politicians with self-promoting agendas. Guilt-based business opportunists have profited from the selling of “carbon offsets” to excuse the use of private jets, high fuel-consumption vehicles, and electricity-draining mansions.  Thus, it has not been in the best interests of certain groups to accept empirical evidence that has overwhelmingly refuted the man-made global warming hypothesis.  In 2008, climate scientists determined that the Earth had been in a decade-long cooling trend.  Thereafter, the alarmists began phasing out the term "global warming" in exchange for the newer, more vague term, "climate change."

    Prior to his death in March of 2008, Dr. Frederick Seitz, President Emeritus of Rockefeller University and former President of the National Academy of Sciences, presented a petition of scientific declaration, rejecting the notion that humans produced greenhouse gases at levels high enough to impact global temperatures. The petition contained the signatures of more than 31-thousand scientists from across America. Nevertheless, both the Democrat-controlled Congress and the Republican Bush administration ignored the document, choosing instead to vigorously pursue questionable climate-change policies that restricted and penalized energy and technology industries involved in the domestic production of coal, oil, and natural gas.

 

A GOVERNMENT-GENERATED ENERGY CRISIS

    As a land abundantly rich in natural resources, the United States of America once served as a major energy exporter to the world. However, irrational government regulations destroyed incentives for industrial progress, forcing the nation to import oil from foreign countries and thereby adding to the gargantuan, insurmountable trade deficit.  As a consequence, fuel pricing and availability were controlled by the petroleum-exporting nations, many of which were hostile toward American interests, especially in the wake of the Iraq war.

 

Could the oil supply be more plentiful than we think?

    Beyond climate-change and environmental impact concerns, resistance to domestic oil and natural gas drilling had also been based on the long-embraced “fossil fuel” theory, a belief that such resources were created exclusively from fossilized animals and plants that had been subjected to subsurface heat and pressure for hundreds of millions of years. For generations, petroleum and gas reserves were considered “nonrenewable resources” in limited supply. Recent empirical discoveries, however, have overturned this notion. In separate studies by Cornell University, the US Geological Survey, the National Academy of Sciences in the Ukraine, Indiana University at South Bend, and Princeton University, researchers found traces of methane gas inside meteorites, petroleum molecules inside rocks at depths unsustainable by organic life, and uniquely inorganic chemical signatures in carbon and hydrogen isotopes. The findings led scientists to rethink the entire “fossil fuel” assumption, and a number of subsequent geological surveys have supported the hypothesis that, like magma, many gases and petroleum elements emerge naturally from the Earth’s core, rising through the crust into soil levels where intermixing with biological agents gives the deceptive appearance of having organic origins. The most damning evidence against the fossil fuel theory came from NASA’s Cassini-Huygens interplanetary mission, in which the space probe’s onboard gas chromatograph mass spectrometer provided incontrovertible evidence of abundant, continually-replenishing supplies of methane venting from the surface of lifeless Titan, a giant moon orbiting Saturn. Following this discovery, biogeochemists at Lund University in Sweden noted that approximately four million tons of methane has erupted from the surface of the Arctic tundra each year during winter, a time when organic materials were too frozen to decompose.

 

An offshore oil platform.

    Despite the evidence, many in the academic and political communities refused to let go of long-accepted myths about energy resources. Since the early 1960s, doomsayers have constantly prophesied the imminent total depletion of world oil supplies (usually “within the decade”).  Instead, more and more reserves have been discovered. The high cost of oil exploration had forced petroleum companies to limit geological studies to small areas where oil would most likely be found. Thus, no one has ever known the actual amount of accessible oil in the Earth, as statistics only reflected “known reserves.”  In addition to the already accessible petroleum reserves in the United States, recent discoveries, including the vast Jack Field reserve in the Gulf of Mexico and the widespread shale oil formations in the western states, confirm that there is enough energy to supply the entire nation--at its current rate of consumption--for the next four centuries, without the need for foreign imports.  (These figures are for current known reserves alone, and the new empirical findings indicate that the earth creates and secretes these materials on a continual basis.)

 

    Political resistance has stood as the single greatest obstacle to total, long-term energy independence in America.  Powerful special interest groups, such as The Sierra Club, Environmental Defense Fund, and Nature Conservancy, have spent millions of dollars promoting the argument that energy extraction damages ecosystems.   Furthermore, they have perpetuated the notion that petroleum-fueled engines significantly contribute to climate change. In reality, technological advances in petroleum drilling and shale mining have made it possible to extract these energy resources in ways that leave minimal and short-lived footprints on the environmental landscape. Likewise, innovations in fuel conservation and anti-pollution devices have continued to diminish any potential for atmospheric impact. Nevertheless, so much money has been made through restricting and regulating energy production that government officials have adamantly protected their interests at the expense of the American people.

    Beyond the federal government’s Environmental Protection Agency, thousands of state and local enterprises were established to impose taxes, fees, and fines on a wide range of activities involved in energy production, distribution, and consumption. Congress itself has been largely populated with trial lawyers who have stood to profit from energy-related lawsuits. Due to restrictive government policies and the threat of vast financial penalties, no oil refinery has been built in the United States since 1976. Additionally, the nation has been unnecessarily forced to import 62 percent of its oil from foreign countries at a cost of 600 to 700 billion dollars per year. These petroleum purchases account for the single largest transfer of American wealth overseas, compounding the already insurmountable trade deficit.
 

 

THE CORN CONUNDRUM

A fuel pump offering E85 ethanol.

    The idea of supplementing or replacing imported petroleum with alternative fuels has been promoted ever since the Arab oil embargo that created gasoline shortages in the US during the 1970s. However, it was not until the September 11, 2001 terrorist attacks that the federal government began implementing heavy-handed measures to compel oil refiners to add substantial volumes of ethanol to their gasoline blends. Touted as a renewable “biofuel,” ethanol is a type of alcohol distilled primarily from corn. It was initially believed that by adding it to gasoline, ethanol would reduce the nation’s dependence on foreign oil imports, while simultaneously providing a financial boost to farmers who grew corn. Politicians, in their usual reactionary fashion, imposed a number of mandates for the use of ethanol without truly examining its actual efficiency. They also failed to consider the unintended agricultural and economic consequences of their actions.

 

    Through taxpayer-funded subsidies for corn farmers, as well as tax incentives for oil companies that blended ethanol with gasoline, greater financial gains could be made by selling corn for fuel than for food.  Thus, food processors were forced to pay the same top-dollar price offered for the crop by energy companies. With more than 3,500 uses for corn, the costs for a vast array of products skyrocketed.
 

Corn production places a heavy demand on the water supply.

    In the Energy Policy Act of 2005, Congress had required oil refiners to blend no less than 7.5 billion gallons of biofuels into gasoline by 2012, a quota later raised to 36 billion gallons by 2022. However, only 328 gallons of ethanol could be produced by each acre of corn, and it took 140 gallons of petroleum-based fuel to plant, grow, and harvest each acre. In order to meet the 2022 ethanol quota, it was determined that more than one-third of the nation’s land surface would need to be used for growing corn. Furthermore, Cornell University scientists found that the amount of energy required to convert corn into ethanol vastly exceeded the amount of energy that could ever be produced by the final product itself. In total, it took much more than a gallon of gasoline to produce a single gallon of ethanol. Other precious resources risked rapid depletion as well. Corn crops erode the soil of nutrients at an excessively high rate (requiring crop rotation), and it takes 25 rainy seasons to replenish the water used to grow corn. From the planting of corn to its conversion into fuel, 1,700 gallons of water are needed to produce a single gallon of ethanol.
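
    The scale of these requirements can be checked with simple arithmetic. The Python sketch below uses only the per-acre and per-gallon figures quoted in the paragraph above; the variable names are illustrative, and the quoted figures themselves are not independently verified here.

    # A minimal arithmetic sketch using only the figures quoted above.
    # The constants restate the article's numbers; they are not independent data.
    ETHANOL_QUOTA_2022 = 36e9     # gallons of biofuel mandated by 2022
    YIELD_PER_ACRE = 328.0        # gallons of ethanol per acre of corn (quoted)
    FUEL_PER_ACRE = 140.0         # gallons of petroleum fuel to farm one acre (quoted)
    WATER_PER_GALLON = 1700.0     # gallons of water per gallon of ethanol (quoted)

    acres_needed = ETHANOL_QUOTA_2022 / YIELD_PER_ACRE       # roughly 110 million acres
    petroleum_inputs = acres_needed * FUEL_PER_ACRE          # roughly 15 billion gallons
    water_inputs = ETHANOL_QUOTA_2022 * WATER_PER_GALLON     # roughly 61 trillion gallons

    print(f"Acres of corn required:     {acres_needed:,.0f}")
    print(f"Petroleum inputs (gallons): {petroleum_inputs:,.0f}")
    print(f"Water inputs (gallons):     {water_inputs:,.0f}")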

 

    The government offered farmers bonuses to grow corn, while energy companies, desperate to meet federal energy mandates, paid the highest prices for the crop. To keep up with the demand and reap the maximum reward, farmers expanded their corn output, using acreage originally designated for other crops, such as wheat. Soil was not allowed to rest or replenish due to the need for additional corn harvests. Millions of gallons of water per day have been extracted from streams and underground aquifers to irrigate corn fields. Although ethanol was promoted to the public as an environmentally-friendly fuel alternative, its production has created the potential for a "rape of the earth" environmental catastrophe unlike any yet seen.

 

Consumer Reports cover - "The Ethanol Myth"

    Scientists measuring the efficiency of gasoline-ethanol blends discovered that blended fuel was expended by automobile engines at a faster rate than pure gasoline. Multiple road tests showed that a car burning nine gallons of pure gasoline traveled as far as a car burning ten gallons of a gasoline blend containing one gallon of ethanol. This was backed by a Consumer Reports investigation which concluded that one gallon of the newer E85 gasoline/ethanol mix delivered only 70 percent of the mileage delivered by a gallon of straight gasoline. Thus, fewer total emissions would have been poured into the atmosphere had cars simply burned pure gasoline. In short, there has been no benefit whatsoever to the existence of ethanol, especially in light of the fact that the United States sits upon a known four-century supply of petroleum-based fuels. Professor David Pimentel of Cornell University’s College of Agriculture and Life Sciences offered this stinging indictment: “Abusing our precious croplands to grow corn for an energy-inefficient process that yields low-grade automobile fuel amounts to unsustainable, subsidized food burning.”
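
    A quick back-of-the-envelope comparison illustrates the mileage penalty described above. The sketch assumes a hypothetical baseline of 30 miles per gallon on straight gasoline and an arbitrary 300-mile trip; only the 70 percent figure comes from the text.

    # Hypothetical mileage comparison; only the 0.70 E85 factor is from the article.
    BASELINE_MPG = 30.0     # assumed mileage on pure gasoline (illustrative)
    E85_FACTOR = 0.70       # E85 mileage as a fraction of gasoline mileage (quoted)
    TRIP_MILES = 300.0      # arbitrary trip length for comparison

    gallons_gasoline = TRIP_MILES / BASELINE_MPG                  # 10.0 gallons
    gallons_e85 = TRIP_MILES / (BASELINE_MPG * E85_FACTOR)        # about 14.3 gallons

    print(f"Gallons of straight gasoline for the trip: {gallons_gasoline:.1f}")
    print(f"Gallons of E85 for the same trip:          {gallons_e85:.1f}")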

 

    Despite recent scientific findings, President George W. Bush signed a congressional bill entitled The Energy Independence and Security Act on December 19, 2007, enacting one of the most monumentally ill-advised and suicidal energy bills in the nation’s history. Adding to prior legislation, the new bill ordered automotive fuel to be 15 percent ethanol by 2015, rising to 25 percent by 2022. The authors of the bill appeared to be ignorant of the fact that, based on current crop production and fuel consumption rates, even 100 percent of the corn grown in the United States could produce only enough ethanol for a 12 percent blend with gasoline. The bill also ordered auto makers to engineer cars to run on 85 percent ethanol by 2012.

 

The much in-demand corn crop.

    Beyond the unintended agricultural and environmental consequences of the colossally imbecilic biofuel legislation, the negative economic impact of the 2007 energy mandate was almost instantaneous. There has never been an infinite supply of any agricultural product, nor have the resources needed to produce a crop ever been limitless. This is especially true of corn, which is used in thousands of products. Corn is the primary source of livestock feed, and by hijacking thirty percent of the harvest for ethanol, prices for beef, pork, dairy, and poultry products skyrocketed. Cereals and other foods using corn byproducts also increased in cost. Scarcities in wheat and soybeans occurred when the producers of those crops switched to the more financially lucrative corn. Between April 2007 and April 2008, dairy prices rose nearly 80 percent and grain 42 percent.

 

    The United States has been the world’s largest grower and exporter of grain, the main staple of Third World countries. Thus, crop shortages and higher prices have had a profound global impact on food availability. World grain reserves plummeted from a 180-day to a 57-day supply, according to the United Nations Food and Agricultural Organization’s World Food Index and the World Food Program. Impending mass famines could therefore be linked to political opposition to petroleum drilling in the United States.


TO THE BRINK OF ECONOMIC MELTDOWN

The Bush White House. (Photo Copyright 2003 Bryan Hardesty.  All rights reserved.)

    The tax cuts and taxpayer rebates in the early years of the George W. Bush presidency had done much to temporarily restore economic prosperity after the September 11, 2001 terrorist attacks; but these advances were eventually undone by out-of-control spending and poorly-conceived monetary mandates of Congress. When Bush took office in January of 2001, the national debt was under 6-trillion dollars. By August 2007, that debt exceeded 10-trillion dollars, a 70 percent increase.  For much of that period, the Republican president had been backed by a narrow Republican majority in Congress; however, most of these party-faithful abandoned their former “fiscally conservative” principles, engaging instead in reckless deficit spending. By the time an equally spend-crazy Democrat majority was restored to Congress in 2007, the national debt surged and the value of the US dollar tumbled on the world currency market.

 

    On November 15, 2007, American financial institutions were forced to adopt a new government-mandated accounting standard that created chaos in the nation's banking sector.  Because it had been difficult to determine the actual cash value of certain investments until they were actually sold, financial traders had to estimate the value of the unsold assets they held.  The new government rule ordered them to adopt a "mark to market" accounting method, basing the value of unsold assets on the recent sale price of similar assets.  What federal regulators failed to consider was that a troubled financial entity would oftentimes dump its holdings at prices far below their true value.  Although the dumped assets may have actually been of greater worth, all institutions holding similar assets had to incorporate the "dumped" prices into their own accounting records.  Thus, financial institutions that had been declaring profits months earlier were forced to declare substantial decreases in the value of their yet-to-be-sold holdings.  With apparent "losses" on the books, these entities restricted further asset investments.
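
    The mechanism can be illustrated with a deliberately simplified sketch. The Python below is not the actual accounting standard; the portfolio, prices, and function name are hypothetical, and the point is only that a distressed sale elsewhere forces a paper loss onto holders who have sold nothing.

    # Simplified, hypothetical illustration of the mark-to-market effect described above.
    def mark_to_market(holdings, last_comparable_sale_price):
        """Re-value unsold assets at the most recent sale price of similar assets."""
        report = {}
        for asset, (quantity, book_value) in holdings.items():
            new_value = quantity * last_comparable_sale_price
            report[asset] = {
                "old_value": book_value,
                "new_value": new_value,
                "paper_loss": book_value - new_value,   # recorded without any sale
            }
        return report

    # Hypothetical portfolio: 1,000 units carried on the books at $95,000.
    portfolio = {"mortgage_security_A": (1000, 95_000.0)}

    # A distressed institution dumps similar securities at $40 per unit...
    print(mark_to_market(portfolio, last_comparable_sale_price=40.0))
    # ...and the holder must book a $55,000 "loss" despite selling nothing.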

 

Home foreclosure in the dwindling real estate market.

    One by one, the pillars of economic stability began to fall. The next crisis occurred in the same real estate market that had experienced an unprecedented boom at the mid-point of the Bush presidency. Congress, in a well-intentioned effort to increase minority home ownership, pressured lending institutions to cease credit history checks, down-payment requirements, and reviews of applicant employment income.  Millions of high-risk, low-income borrowers with poor credit ratings were thereby granted home loans through the practice of subprime lending.  Such loans involved Adjustable Rate Mortgages (ARMs), in which interest rates on house notes were adjusted in accordance with ever-changing Treasury security values, banking expenses, and common lending charges.  As long as interest rates were low, house payments under ARM structures were affordable to median-income borrowers.  The first sign of trouble was the rise in property values that naturally resulted from the increased demand for new homes. Simultaneously, cost-of-living expenses began to soar due to the unintended consequences of government energy and environmental mandates.  Then, following the implementation of "mark-to-market" financial accounting rules, banking institutions became more rigid in their lending practices.  As a result, interest rates on ARM loans increased by more than 30 percent, forcing millions of borrowers to default on their house payments. Collectively, these losses created a lending crisis and a drastic downturn in the housing market.
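
    The squeeze an ARM reset puts on a household budget can be sketched with the standard amortization formula. The loan amount and the before-and-after rates below are hypothetical assumptions chosen only to show the mechanism, not figures from the text.

    # Hedged sketch of an adjustable-rate reset on a hypothetical $200,000, 30-year loan.
    def monthly_payment(principal, annual_rate, years=30):
        """Standard fixed-payment amortization: P * r / (1 - (1 + r)**-n)."""
        r = annual_rate / 12.0      # monthly interest rate
        n = years * 12              # number of monthly payments
        return principal * r / (1.0 - (1.0 + r) ** -n)

    before = monthly_payment(200_000, 0.04)    # assumed introductory rate
    after = monthly_payment(200_000, 0.06)     # assumed rate after the ARM resets

    print(f"Payment at 4%: ${before:,.2f}")    # roughly $955
    print(f"Payment at 6%: ${after:,.2f}")     # roughly $1,199
    print(f"Payment increase: {100 * (after / before - 1):.0f}%")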

    In December of 2007, the Bush administration reached an agreement with lenders to freeze interest rates for up to five years for people who were up-to-date on their house payments. However, the crisis could not be alleviated, because the government’s own mortgage entities, Fannie Mae and Freddie Mac, were consumed by widespread corruption.
 

The Federal National Mortgage Association (FNMA), better known as Fannie Mae.

    The name Fannie Mae came from FNMA, an acronym for the Federal National Mortgage Association. Chartered by Congress in 1968, its ownership was shared between the federal government and private stockholders. This hybrid corporation offered taxpayer-funded backing for mortgage loans, thereafter packaging clusters of loans as bonds and securities for outside investors. As part of a 1999 agreement with the US Department of Housing and Urban Development, Fannie Mae committed half of its guarantees to low-income and high-risk borrowers. The associated entity, Freddie Mac, a name derived from FHLMC, referring to the Federal Home Loan Mortgage Corporation, had been created by the government in 1970 to expand the use of taxpayer money to guarantee the secondary mortgage market. The blended private-and-government structures of Fannie Mae and Freddie Mac were designed to reward investors with all the profits while saddling taxpayers with all the risks. Thus, financial institutions were emboldened to take unnecessary and reckless gambles in their lending practices.
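
    The profit-and-risk split described above can be made concrete with a deliberately simplified, hypothetical sketch: coupon income on a guaranteed mortgage pool flows to investors, while default losses are absorbed by the guarantor and, ultimately, the taxpayer. Every number and name below is illustrative, not an account of the agencies' actual bookkeeping.

    # Deliberately simplified, hypothetical sketch of the incentive asymmetry described above.
    def settle_pool_year(pool_principal, coupon_rate, default_rate, recovery_rate):
        """Split one year's outcome between investors and the guarantor."""
        investor_income = pool_principal * coupon_rate          # investors keep the coupon
        defaulted = pool_principal * default_rate               # portion of loans that fail
        guarantor_cost = defaulted * (1.0 - recovery_rate)      # guarantee absorbs the loss
        return investor_income, guarantor_cost

    # $1 billion pool, 6% coupon, 8% of loans defaulting, 50 cents on the dollar recovered.
    investors, taxpayers = settle_pool_year(1_000_000_000, 0.06, 0.08, 0.50)
    print(f"Investor income:  ${investors:,.0f}")     # $60,000,000
    print(f"Guarantee payout: ${taxpayers:,.0f}")     # $40,000,000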

The United States Capitol building.  (Photo Copyright 2003 Bryan Hardesty.  All rights reserved.)

    As early as 2003, the Bush administration had requested reforms to better regulate the mortgage industry, but was opposed by a congressional coalition headed by Representative Barney Frank, the ranking Democrat on the Financial Services Committee of the House of Representatives. After a long struggle, House Republicans eventually passed the Federal Housing Enterprise Regulatory Reform Act of 2005, only to see the bill abandoned in the Senate. A later Federal Election Commission report of special interest campaign funding from 1989 through 2008 revealed that those senators most opposed to the reforms were the top three recipients of cash contributions from Fannie Mae and Freddie Mac: Illinois Senator Barack Obama, Massachusetts Senator John Kerry, and Connecticut Senator Christopher Dodd, chairman of the Senate Banking Committee.

    By the end of 2007, President George W. Bush, having basked in the glory of record economic growth and low unemployment rates during the previous four years, saw America’s economic fortunes plummet with the simultaneous rapid rise in the energy market, the collapse of the real estate market, and new reports of tens of thousands of job losses. In January of 2008, he proposed an emergency economic stimulus bill to refund additional dollars to individual taxpayers earning less than 75-thousand dollars per year. The bill was enacted by Congress on February 7, 2008. Meanwhile, the Federal Reserve incrementally slashed interest rates while lending hundreds of billions of dollars in Treasury securities to ease the credit crisis. However, the collateral accepted in exchange for those securities consisted of unstable mortgages. Congress, in turn, passed a housing relief bill on April 10th in an effort to rescue homeowners on the brink of foreclosure. Each of these maneuvers by the federal government merely postponed the crisis.

    Under ordinary circumstances, free-market adjustments would have gradually allowed corrections to occur in the financial markets without government intervention. However, 2008 was a presidential election year, and politicians, under the scrutiny of sensationalism-hungry journalists, embarked on their ill-conceived, reactionary measures for public approval.


THE POLITICS OF SUPERFICIALITY

George W. Bush during the final months of his presidency.

    Despite his devout Christian faith and kind-hearted demeanor, President George W. Bush lacked the communication and political skills to convey his vision and inspire the nation; and unlike most other Republican presidents, he did not adhere to the fiscal conservatism that had long characterized party principles. The American people, already fatigued by slow progress in Iraq, grew evermore dismayed by soaring energy costs and the troubled real estate market.  As the time approached to elect a new president, the last thing most voters wanted was “another Bush.”  This sentiment shaped the entire tone of the election season.
 

Senators Hillary Clinton and Barack Obama: Democrat Party contenders for the 2008 presidential election.

    With the Republican image tarnished by both the real and perceived ineptitudes of the Bush administration, the 2008 political climate vastly favored Democrats.  Since it was generally assumed that whoever won that party’s endorsement could easily win the presidency, New York Senator and former First Lady Hillary Clinton vied for the Democratic nomination against the increasingly popular Illinois Senator, Barack Obama.  Senator Clinton’s peripheral involvement in her husband’s administration, combined with eight years of service in the Senate, made her the more experienced of the two candidates.  Nevertheless, the Clinton name harkened back to government of the past, and many voters yearned to move in a new direction.  Though having served an utterly unremarkable three-year term in the Senate, Barack Obama was nonetheless an energetic, articulate and charismatic politician whose electrifying keynote address at the 2004 Democratic Convention secured his place as a rising star in the party.  In external characteristics, he possessed the upbeat, inspirational qualities so lacking in George W. Bush; and with a Kenyan father, he was realistically poised to become the nation’s first non-Caucasian president.  After a long, close contest with Senator Clinton, Obama emerged as the Democrat party’s presidential candidate. His running mate for the Vice Presidency was Senator Joseph Biden, Jr. from Delaware.

 

    The Republicans, meanwhile, were saddled with the unpopular distinction of being the “party of Bush,” and as such, their most viable candidate was a man well known for breaking away from and opposing many Bush policies.  John McCain had served in Congress since 1983; first as a two-term Representative and then as a Senator from Arizona. In his younger days, he was a naval aviator in the Vietnam War, and was shot down and captured by the North Vietnamese in 1967.  As a prisoner of war until 1973, McCain was routinely tortured, leaving him with wounds that permanently limited his physical mobility. The experience, nevertheless, infused him with a unique perspective on the greatness of America.  In his 2008 Republican Convention nomination acceptance speech, McCain declared: “I fell in love with my country when I was a prisoner in someone else’s. I loved it not just for the many comforts of life here. I loved it for its decency; for its faith in the wisdom, justice and goodness of its people. I loved it because it was not just a place, but an idea, a cause worth fighting for.”

The 2008 Republican presidential ticket: Sen. John McCain (Arizona) and Gov. Sarah Palin (Alaska).

    John McCain’s pursuit of the nation’s highest political office was considered unattainable by many, due to the damage done to the Republican reputation by the president and big-spending, big-government party members in Congress. However, his chances for victory were profoundly bolstered by his running mate, Alaska Governor Sarah Palin.  As the first female and youngest person to hold the top office in her state, Governor Palin cut billions of dollars of wasteful expenditures from Alaska's state budget, and despite the absence of a sales or state income tax, state revenues doubled in 2008.  The mainstream news media, however, was unabashedly supportive of an Obama victory, and various journalists set out to damage Palin’s reputation, conducting background research more extensive than any done on Obama, while confronting her with difficult-to-answer questions they never dared ask the other candidates.
 

    In an ironic turn of events, the Republican presidential ticket was further hampered by legislation coauthored by John McCain himself.  His campaign finance reform bill, the 2002 McCain-Feingold Act, placed strict fundraising limitations on candidates accepting public money.  Barack Obama cunningly bypassed these restrictions by refusing to accept public funding for the general election.  This strategy enabled the Democrat candidate to raise four times the amount of money as his Republican counterpart.

 

    The election of 2008 was one in which symbolism trumped actual experience or prior achievement.  Whenever pollsters asked Obama supporters to identify a major accomplishment of their candidate while in the Senate, virtually none could provide an answer.  Time and again, the most commonly offered explanation for supporting Obama was: “he inspires me.”  The trend toward emotion-based public opinion was further confirmed by the results of a 2008 multiple-choice test on basic American history and civics, conducted by the Intercollegiate Studies Institute and the National Civic Literacy Board.  Out of thousands of randomly-selected American adults, an astounding 71 percent failed the exam, with an average overall score of 49 percent.  The questions receiving the fewest correct answers were those regarding elementary economic principles. Thus, at a time when the nation faced its grimmest financial prospects in decades, few voters were intellectually equipped to identify political candidates who could offer the most sensible solutions.  America had evolved into a nation of fundamental ignoramuses.
 

 

STUPIDITY-INDUCED FINANCIAL TURMOIL

    On September 7, 2008, with the national debt already surpassing the 10-trillion dollar mark, the federal government placed Fannie Mae and Freddie Mac under a conservatorship similar to bankruptcy.  The US Treasury Department guaranteed their mortgage-backed securities with taxpayer funds.

 

Treasury Secretary Henry Paulson with President Bush.

    Other banking and financial services firms, including Merrill Lynch, Bear Stearns, Lehman Brothers, and Washington Mutual, likewise fell into disarray--the direct result of "mark-to-market" accounting regulations.  The scores of incumbent senators and representatives seeking reelection in 2008 knew that an economic calamity would likely ruin their political prospects.  Treasury Secretary Henry M. Paulson, Jr. insisted that unless Congress allocated 700 billion taxpayer dollars to use at his discretion, the economies of the US and other nations would suffer significantly. Concerned that Paulson’s plan forced American taxpayers to pay the consequences for government-pressured high-risk lending practices, Republicans and conservative Democrats in the House formed a majority to defeat the bailout bill.  Within two days of the House rejection of the Paulson plan, the panic-stricken Senate took up the measure, making it more attractive by raising the amount of money insured by the FDIC for individual bank accounts.  This time, the Troubled Asset Relief Program (TARP) passed through Congress and was signed into law on October 3, 2008.

 

Barack Obama was the first black American to win the presidency.

    The hastily-conceived bailout measure did little to stop the rapid plunge in the stock market.  Nevertheless, it offered the illusion of “something being done,” so that public attention could refocus on the November election.  Whatever prior gains John McCain had made in his campaign were undone by lackluster performances in televised presidential debates.  Barack Obama won the election by a comfortable margin, becoming the first non-white person to win the nation’s highest office.  As such, his victory symbolized the death of systemic racial discrimination in the United States, bringing about a positive sense of cultural healing among the people.

    In the final two months of the George W. Bush presidency, White House economic policies grew more reminiscent of National Socialism than free-market conservatism. In essence, the federal government had nationalized the mortgage banking system, and its reach extended to other business entities. As a primary insurer of mortgage-backed securities and high-risk debts, the American International Group (AIG), the world’s largest insurance corporation, was imperiled by the mortgage crisis.  Because AIG was the primary backer of congressional retirement pensions, panicked members of Congress intervened, doling out 150-billion taxpayer dollars to AIG in November of 2008. 

 

    Beyond bailout giveaways, Federal Reserve loan pledges to the financial system surpassed the 7.4-trillion dollar mark by mid-November 2008, an amount equal to half the value of everything produced in America during the previous year (at that point representing 24-thousand dollars for each man, woman, and child in the nation). According to Congressional Budget Office figures, the Federal Reserve loan guarantee was nine times the amount spent to that point on the Iraq and Afghanistan wars combined.  And yet, this grotesquerie would be a mere pittance in contrast to the nation-destroying expenditures that were yet to come.
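
    The per-person figure quoted above can be verified with one line of arithmetic. The sketch below assumes a 2008 US population of roughly 304 million, an outside assumption not stated in the text.

    # Quick check of the per-person figure, assuming ~304 million US residents in 2008.
    PLEDGES = 7.4e12        # Federal Reserve loan pledges, mid-November 2008 (quoted)
    POPULATION = 304e6      # approximate 2008 US population (assumed)

    print(f"Pledges per person: ${PLEDGES / POPULATION:,.0f}")   # roughly $24,000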
 

A General Motors assembly plant, where the union workers really give their all.

    Through the unprecedented loans and bailouts, the United States government unwittingly implied that the taxpaying public could be milked mercilessly to help large businesses escape the consequences of bad corporate decisions. Next to fall in line for government handouts were the “Big Three” US automakers, General Motors, Ford and Chrysler.  Directors of these companies initially requested a combined total of 50-billion dollars--half of which would be used to fulfill employee benefit pledges made to the United Auto Workers Union (UAW).  Most Americans were reluctant to buy vehicles made by these Detroit, Michigan-based companies, simply because they were overpriced for their quality. The high costs were due to the excessive employee salary and benefit demands made by the UAW on behalf of autoworkers. By contrast, more financially-sound automakers, such as Honda and Toyota, employed non-union workers in their American divisions, saving roughly a thousand dollars per vehicle, compared to cars made by the Big Three.  On December 19, 2008, the outgoing Bush administration authorized a 17.4 billion dollar rescue loan package for the ailing, union-driven auto companies, retaining the option for the government to partially nationalize the automakers through the seizure of stocks.  For its part of the bailout deal, the United Auto Workers Union suspended its “jobs bank” demand, which had forced the automakers to pay laid-off employees 95 percent of their salary and benefits for years after those employees had stopped working.  Despite these measures, cash-strapped Americans could not afford to buy cars in sufficient numbers to save the beleaguered companies.

 

 

A PLACEBO PRESIDENCY

In 2008, Barack Hussein Obama II (born August 4, 1961) was elected to serve as America's 44th president.

    The nation’s 44th president, Barack Obama, was no more adept at solving the economic crisis than his predecessor.  The Bush stimulus measures of 2008, totaling approximately 1.2 trillion dollars, had failed to turn the economy around.  Instead of learning from his predecessor's mistakes, the new president compounded them many times over.  Shortly after taking the oath of office, Obama began overseeing an 800-billion dollar "stimulus" package.  By the time the bill worked its way through both houses of Congress, total spending commitments exceeded 12.8-trillion dollars, eating up almost the entire 14.2-trillion dollar value of everything produced in America during 2008.  In one of the most brazenly irresponsible acts ever committed by the United States Congress, a vote was forced on the 1,100-page stimulus bill before anyone had time to read it.  Without a single Republican vote, the American Recovery and Reinvestment Act, the largest single transfer of wealth in the nation's history, went into effect in mid-February 2009.

 

    Initially, the new president predicted a 1.75-trillion dollar budget deficit for 2009, which would be added to the 11.2-trillion dollar national debt.  His figures, however, did not include the cost of ongoing military operations in Iraq and Afghanistan, nor was any accounting made of the net value of unfunded Social Security and Medicare liabilities.  After total federal debt obligations were factored in using the US Treasury's "Generally Accepted Accounting Principles" (GAAP), it was determined that the United States had actually incurred a 65.5-trillion dollar debt, an amount that exceeded the Gross Domestic Product of the entire world.

 

    The economic stimulus package set up an artificial economy that focused exclusively on borrowing and spending instead of producing and saving.  President Obama's primary intention was to extend credit to consumers, disregarding the producers who could have benefited society in more tangible ways.  Regardless of the key recipients, credit could never be created out of nothing.  Vast savings deposits have always been needed in order for financial institutions to have something to lend. 

 

    Virtually all humans can understand irrefutable physical laws, such as gravity, and know that if a stone is thrown up in the air, it will fall to the ground.  However, few people in the early twenty-first century seemed to realize that there are also irrefutable economic laws.  Spending more money than one takes in always results in poverty.  Spending gargantuan amounts of money that cannot be repaid for generations will quickly destroy the spender.  With foreign nations refusing to lend any more money to the United States, the Federal Reserve's only remaining option was to print more paper currency.  Such an action always devalued the dollar, and was opposed by China and Japan, the largest holders of US debt.  The new president seemed unaware that his fiscal policies had one highly probable outcome: the total collapse of the American dollar.  The only exception would be for the currencies of other nations to sink at the same rate as the dollar, and indeed this occurred in early 2009, postponing the inevitable calamity.

 

    As seen in cases where phony placebo pills have convinced ailing persons that their conditions were improving, perception--even in the absence of reality--can profoundly affect people.  Despite the destructive, anti-growth policies of President Obama, many Americans felt he could do no wrong.  The mainstream news media was especially supportive of the handsome, confident, "Kennedy-esque" leader, and tended to deride those who questioned his actions. 

 

The United States of America: A Constitutional Republic no more.

    Clearly, Barack Obama did not instigate the nation's financial crisis, but his mindset was aligned with legions of prior politicians who had contributed to the gradual eradication of America’s industrial base.  After four decades of ever-increasing taxation, regulation, union demands, and litigation, American manufacturers had been forced to either close down or move operations abroad.  As a result, the nation’s people were rendered unable to generate products and services at a sufficient level to purchase foreign versions of the goods they once made domestically. Federal, state, and local governments, aided by a number of oppressive regulatory agencies, taxed businesses to the point of removing all monetary rewards for innovation; imposed fierce regulations that hindered the freedom to produce; and nurtured a lawsuit-friendly environment that destroyed all incentives to hire additional workers and offer new products to the public.

 

    The United States of America was no longer capable of earning more money than it spent.  At one time the world's leading lender, it had become the largest debtor nation.  Furthermore, the enormity of the debt extended the burden to future generations.  The constitutional republic structure of government, with all its free-enterprise protections, had been gradually replaced by a form of quasi-socialist statism.  The transformation was the result of neither foreign invasion nor violent internal coup, but rather the will of the majority of the nation’s electorate, carried out by the people they voted into office.
 

    "The surest way to overthrow an established social order is to debauch its currency."  -- Vladimir Lenin

 

 
