View Poll Results: Niall Ferguson (multiple-choice poll; 3 voters)

  • Alpha: 0 (0%)
  • Beta: 0 (0%)
  • Gamma: 1 (33.33%)
  • Delta: 0 (0%)
  • ILE: 0 (0%)
  • SEI: 1 (33.33%)
  • ESE: 0 (0%)
  • LII: 0 (0%)
  • EIE: 0 (0%)
  • LSI: 0 (0%)
  • SLE: 0 (0%)
  • IEI: 0 (0%)
  • SEE: 0 (0%)
  • ILI: 0 (0%)
  • LIE: 2 (66.67%)
  • ESI: 0 (0%)
  • LSE: 1 (33.33%)
  • EII: 0 (0%)
  • IEE: 0 (0%)
  • SLI: 0 (0%)

Thread: Niall Ferguson

  1. #1
    WE'RE ALL GOING HOME HERO

    Niall Ferguson

    http://www.newsweek.com/niall-fergus...needs-go-64419



    - Niall Ferguson (2012): ‘Remarkably, the president polls relatively strongly on national security. Yet the public mistakes his administration’s astonishingly uninhibited use of political assassination for a coherent strategy. According to the Bureau of Investigative Journalism in London, the civilian proportion of drone casualties was 16 percent last year. Ask yourself how the liberal media would have behaved if George W. Bush had used drones this way. Yet somehow it is only ever Republican secretaries of state who are accused of committing “war crimes.”


    The real crime is that the assassination program destroys potentially crucial intelligence (as well as antagonizing locals) every time a drone strikes. It symbolizes the administration’s decision to abandon counterinsurgency in favor of a narrow counterterrorism. What that means in practice is the abandonment not only of Iraq but soon of Afghanistan too. Understandably, the men and women who have served there wonder what exactly their sacrifice was for, if any notion that we are nation building has been quietly dumped. Only when both countries sink back into civil war will we realize the real price of Obama’s foreign policy.

    America under this president is a superpower in retreat, if not retirement. Small wonder 46 percent of Americans—and 63 percent of Chinese—believe that China already has replaced the U.S. as the world’s leading superpower or eventually will.

    It is a sign of just how completely Barack Obama has “lost his narrative” since getting elected that the best case he has yet made for reelection is that Mitt Romney should not be president. In his notorious “you didn’t build that” speech, Obama listed what he considers the greatest achievements of big government: the Internet, the GI Bill, the Golden Gate Bridge, the Hoover Dam, the Apollo moon landing, and even (bizarrely) the creation of the middle class. Sadly, he couldn’t mention anything comparable that his administration has achieved.

    Now Obama is going head-to-head with his nemesis: a politician who believes more in content than in form, more in reform than in rhetoric.’



    ‘His Cairo speech of June 4, 2009, was an especially clumsy bid to ingratiate himself on what proved to be the eve of a regional revolution. “I’m also proud to carry with me,” he told Egyptians, “a greeting of peace from Muslim communities in my country: Assalamu alaikum ... I’ve come here ... to seek a new beginning between the United States and Muslims around the world, one based ... upon the truth that America and Islam are not exclusive and need not be in competition.”

    Believing it was his role to repudiate neoconservatism, Obama completely missed the revolutionary wave of Middle Eastern democracy—precisely the wave the neocons had hoped to trigger with the overthrow of Saddam Hussein in Iraq. When revolution broke out—first in Iran, then in Tunisia, Egypt, Libya, and Syria—the president faced stark alternatives. He could try to catch the wave by lending his support to the youthful revolutionaries and trying to ride it in a direction advantageous to American interests. Or he could do nothing and let the forces of reaction prevail.


    In the case of Iran he did nothing, and the thugs of the Islamic Republic ruthlessly crushed the demonstrations. Ditto Syria. In Libya he was cajoled into intervening. In Egypt he tried to have it both ways, exhorting Egyptian President Hosni Mubarak to leave, then drawing back and recommending an “orderly transition.” The result was a foreign-policy debacle. Not only were Egypt’s elites appalled by what seemed to them a betrayal, but the victors—the Muslim Brotherhood—had nothing to be grateful for. America’s closest Middle Eastern allies—Israel and the Saudis—looked on in amazement.

    “This is what happens when you get caught by surprise,” an anonymous American official told The New York Times in February 2011. “We’ve had endless strategy sessions for the past two years on Mideast peace, on containing Iran. And how many of them factored in the possibility that Egypt moves from stability to turmoil? None.”’



    ‘I was a good loser four years ago. “In the grand scheme of history,” I wrote the day after Barack Obama’s election as president, “four decades is not an especially long time. Yet in that brief period America has gone from the assassination of Martin Luther King Jr. to the apotheosis of Barack Obama. You would not be human if you failed to acknowledge this as a cause for great rejoicing.”

    Despite having been—full disclosure—an adviser to John McCain, I acknowledged his opponent’s remarkable qualities: his soaring oratory, his cool, hard-to-ruffle temperament, and his near faultless campaign organization.

    Yet the question confronting the country nearly four years later is not who was the better candidate four years ago. It is whether the winner has delivered on his promises. And the sad truth is that he has not.

    In his inaugural address, Obama promised “not only to create new jobs, but to lay a new foundation for growth.” He promised to “build the roads and bridges, the electric grids, and digital lines that feed our commerce and bind us together.” He promised to “restore science to its rightful place and wield technology’s wonders to raise health care’s quality and lower its cost.” And he promised to “transform our schools and colleges and universities to meet the demands of a new age.” Unfortunately the president’s scorecard on every single one of those bold pledges is pitiful.

    In an unguarded moment earlier this year, the president commented that the private sector of the economy was “doing fine.” Certainly, the stock market is well up (by 74 percent) relative to the close on Inauguration Day 2009. But the total number of private-sector jobs is still 4.3 million below the January 2008 peak. Meanwhile, since 2008, a staggering 3.6 million Americans have been added to Social Security’s disability insurance program. This is one of many ways unemployment is being concealed.

    In his fiscal year 2010 budget—the first he presented—the president envisaged growth of 3.2 percent in 2010, 4.0 percent in 2011, 4.6 percent in 2012. The actual numbers were 2.4 percent in 2010 and 1.8 percent in 2011; few forecasters now expect it to be much above 2.3 percent this year.’


    ‘According to the 2010 budget, the debt in public hands was supposed to fall in relation to GDP from 67 percent in 2010 to less than 66 percent this year. If only. By the end of this year, according to the Congressional Budget Office (CBO), it will reach 70 percent of GDP. These figures significantly understate the debt problem, however. The ratio that matters is debt to revenue. That number has leapt upward from 165 percent in 2008 to 262 percent this year, according to figures from the International Monetary Fund. Among developed economies, only Ireland and Spain have seen a bigger deterioration.

    Not only did the initial fiscal stimulus fade after the sugar rush of 2009, but the president has done absolutely nothing to close the long-term gap between spending and revenue.

    His much-vaunted health-care reform will not prevent spending on health programs growing from more than 5 percent of GDP today to almost 10 percent in 2037. Add the projected increase in the costs of Social Security and you are looking at a total bill of 16 percent of GDP 25 years from now. That is only slightly less than the average cost of all federal programs and activities, apart from net interest payments, over the past 40 years. Under this president’s policies, the debt is on course to approach 200 percent of GDP in 2037—a mountain of debt that is bound to reduce growth even further.

    And even that figure understates the real debt burden. The most recent estimate for the difference between the net present value of federal government liabilities and the net present value of future federal revenues—what economist Larry Kotlikoff calls the true “fiscal gap”—is $222 trillion.

    The president’s supporters will, of course, say that the poor performance of the economy can’t be blamed on him. They would rather finger his predecessor, or the economists he picked to advise him, or Wall Street, or Europe—anyone but the man in the White House.

    There’s some truth in this. It was pretty hard to foresee what was going to happen to the economy in the years after 2008. Yet surely we can legitimately blame the president for the political mistakes of the past four years. After all, it’s the president’s job to run the executive branch effectively—to lead the nation. And here is where his failure has been greatest.



    On paper it looked like an economics dream team: Larry Summers, Christina Romer, and Austan Goolsbee, not to mention Peter Orszag, Tim Geithner, and Paul Volcker. The inside story, however, is that the president was wholly unable to manage the mighty brains—and egos—he had assembled to advise him.

    According to Ron Suskind’s book Confidence Men, Summers told Orszag over dinner in May 2009: “You know, Peter, we’re really home alone ... I mean it. We’re home alone. There’s no adult in charge. Clinton would never have made these mistakes [of indecisiveness on key economic issues].” On issue after issue, according to Suskind, Summers overruled the president. “You can’t just march in and make that argument and then have him make a decision,” Summers told Orszag, “because he doesn’t know what he’s deciding.” (I have heard similar things said off the record by key participants in the president’s interminable “seminar” on Afghanistan policy.)

    This problem extended beyond the White House. After the imperial presidency of the Bush era, there was something more like parliamentary government in the first two years of Obama’s administration. The president proposed; Congress disposed. It was Nancy Pelosi and her cohorts who wrote the stimulus bill and made sure it was stuffed full of political pork. And it was the Democrats in Congress—led by Christopher Dodd and Barney Frank—who devised the 2,319-page Wall Street Reform and Consumer Protection Act (Dodd-Frank, for short), a near-perfect example of excessive complexity in regulation. The act requires that regulators create 243 rules, conduct 67 studies, and issue 22 periodic reports. It eliminates one regulator and creates two new ones.’


    ‘And then there was health care. No one seriously doubts that the U.S. system needed to be reformed. But the Patient Protection and Affordable Care Act (ACA) of 2010 did nothing to address the core defects of the system: the long-run explosion of Medicare costs as the baby boomers retire, the “fee for service” model that drives health-care inflation, the link from employment to insurance that explains why so many Americans lack coverage, and the excessive costs of the liability insurance that our doctors need to protect them from our lawyers.

    Ironically, the core Obamacare concept of the “individual mandate” (requiring all Americans to buy insurance or face a fine) was something the president himself had opposed when vying with Hillary Clinton for the Democratic nomination. A much more accurate term would be “Pelosicare,” since it was she who really forced the bill through Congress.

    Pelosicare was not only a political disaster. Polls consistently showed that only a minority of the public liked the ACA, and it was the main reason why Republicans regained control of the House in 2010. It was also another fiscal snafu. The president pledged that health-care reform would not add a cent to the deficit. But the CBO and the Joint Committee on Taxation now estimate that the insurance-coverage provisions of the ACA will have a net cost of close to $1.2 trillion over the 2012–22 period.’


    ‘The failures of leadership on economic and fiscal policy over the past four years have had geopolitical consequences. The World Bank expects the U.S. to grow by just 2 percent in 2012. China will grow four times faster than that; India three times faster. By 2017, the International Monetary Fund predicts, the GDP of China will overtake that of the United States.’





    ‘ . . . one thing is clear. Ryan psychs Obama out. This has been apparent ever since the White House went on the offensive against Ryan in the spring of last year. And the reason he psychs him out is that, unlike Obama, Ryan has a plan—as opposed to a narrative—for this country.

    Mitt Romney is not the best candidate for the presidency I can imagine. But he was clearly the best of the Republican contenders for the nomination. He brings to the presidency precisely the kind of experience—both in the business world and in executive office—that Barack Obama manifestly lacked four years ago. (If only Obama had worked at Bain Capital for a few years, instead of as a community organizer in Chicago, he might understand exactly why the private sector is not “doing fine” right now.) And by picking Ryan as his running mate, Romney has given the first real sign that—unlike Obama—he is a courageous leader who will not duck the challenges America faces.

    The voters now face a stark choice. They can let Barack Obama’s rambling, solipsistic narrative continue until they find themselves living in some American version of Europe, with low growth, high unemployment, even higher debt—and real geopolitical decline.

    Or they can opt for real change: the kind of change that will end four years of economic underperformance, stop the terrifying accumulation of debt, and reestablish a secure fiscal foundation for American national security.

    I’ve said it before: it’s a choice between les États-Unis and the Republic of the Battle Hymn.’





    - from Civilization: The West and the Rest by Niall Ferguson; pp. xv-xxvii: The principal question addressed by this book increasingly seems to me the most interesting question a historian of the modern era can ask. Just why, beginning around 1500, did a few small polities on the western end of the Eurasian landmass come to dominate the rest of the world, including the more populous and in many ways more sophisticated societies of Eastern Eurasia? My subsidiary question is this: if we can come up with a good explanation for the West’s past ascendancy, can we then offer a prognosis for its future? Is this really the end of the West’s world and the advent of a new Eastern epoch? Put differently, are we witnessing the waning of an age when the greater part of humanity was more or less subordinated to the civilization that arose in Western Europe in the wake of the Renaissance and Reformation — the civilization that, propelled by the Scientific Revolution and the Enlightenment, spread across the Atlantic and as far as the Antipodes, finally reaching its apogee during the Ages of Revolution, Industry and Empire?

    The very fact that I want to pose such questions says something about the first decade of the twenty-first century. Born and raised in Scotland, educated at Glasgow Academy and Oxford University, I assumed throughout my twenties and thirties that I would spend my academic career at either Oxford or Cambridge. I first began to think of moving to the United States because an eminent benefactor of New York University’s Stern School of Business, the Wall Street veteran Henry Kaufman, had asked me why someone interested in the history of money and power did not come to where the money and power actually were. And where else could that be but downtown Manhattan? As the new millennium dawned, the New York Stock Exchange was self-evidently the hub of an immense global economic network that was American in design and largely American in ownership. The dotcom bubble was deflating, admittedly, and a nasty little recession ensured that the Democrats lost the White House just as their pledge to pay off the national debt began to sound almost plausible. But within just eight months of becoming president, George W. Bush was confronted by an event that emphatically underlined the centrality of Manhattan to the Western-dominated world. The destruction of the World Trade Center by al-Qaeda terrorists paid New York a hideous compliment. This was target number one for anyone serious about challenging Western predominance.


    The subsequent events were heady with hubris. The Taliban overthrown in Afghanistan. An ‘axis of evil’ branded ripe for ‘regime change’. Saddam Hussein ousted in Iraq. The Toxic Texan riding high in the polls, on track for re-election. The US economy bouncing back thanks to tax cuts. ‘Old Europe’—not to mention liberal America—fuming impotently. Fascinated, I found myself reading and writing more and more about empires, in particular the lessons of Britain’s for America’s; the result was Empire: How Britain Made the Modern World (2003). As I reflected on the rise, reign and probable fall of America’s empire, it became clear to me that there were three fatal deficits at the heart of American power: a manpower deficit (not enough boots on the ground in Afghanistan and Iraq), an attention deficit (not enough public enthusiasm for long-term occupation of conquered countries) and above all a financial deficit (not enough savings relative to investment and not enough taxation relative to public expenditure).


    In Colossus: The Rise and Fall of America’s Empire (2004), I warned that the United States had imperceptibly come to rely on East Asian capital to fund its unbalanced current and fiscal accounts. The decline and fall of America’s undeclared empire might therefore be due not to terrorists at the gates, nor to the rogue regimes that sponsored them, but to a financial crisis at the very heart of the empire itself. When, in late 2006, Moritz Schularick and I coined the word ‘Chimerica’ to describe what we saw as the dangerously unsustainable relationship — the word was a pun on ‘chimera’ — between parsimonious China and profligate America, we had identified one of the keys to the coming global financial crisis. For without the availability to the American consumer of both cheap Chinese labour and cheap Chinese capital, the bubble of the years 2002-7 would not have been so egregious.

    The illusion of American ‘hyper-power’ was shattered not once but twice during the presidency of George W. Bush. Nemesis came first in the backstreets of Sadr City and the fields of Helmand, which exposed not only the limits of American military might but also, more importantly, the naivety of neo-conservative visions of a democratic wave in the Greater Middle East. It struck a second time with the escalation of the subprime mortgage crisis of 2007 into the credit crunch of 2008 and finally the ‘great recession’ of 2009. After the bankruptcy of Lehman Brothers, the sham verities of the ‘Washington Consensus’ and the ‘Great Moderation’ — the central bankers’ equivalent of the ‘End of History’ — were consigned to oblivion. A second Great Depression for a time seemed terrifyingly possible. What had gone wrong? In a series of articles and lectures beginning in mid-2006 and culminating in the publication of The Ascent of Money in November 2008 — when the financial crisis was at its worst — I argued that all the major components of the international financial system had been disastrously weakened by excessive short-term indebtedness on the balance sheets of banks, grossly mispriced and literally overrated mortgage-backed securities and other structured financial products, excessively lax monetary policy on the part of the Federal Reserve, a politically engineered housing bubble and, finally, the unrestrained selling of bogus insurance policies (known as derivatives), offering fake protection against unknowable uncertainties, as opposed to quantifiable risks. The globalization of financial institutions that were of Western origin had been supposed to usher in a new era of reduced economic volatility. It took historical knowledge to foresee how an old-fashioned liquidity crisis might bring the whole shaky edifice of leveraged financial engineering crashing to the ground.

    The danger of a second Depression receded after the summer of 2009, though it did not altogether disappear. But the world had nevertheless changed. The breathtaking collapse in global trade caused by the financial crisis, as credit to finance imports and exports suddenly dried up, might have been expected to devastate the big Asian economies, reliant as they were said to be on exports to the West. Thanks to a highly effective government stimulus programme based on massive credit expansion, however, China suffered only a slow-down in growth. This was a remarkable feat that few experts had anticipated. Despite the manifest difficulties of running a continental economy of 1.3 billion people as if it were a giant Singapore, the probability remains better than even at the time of writing (December 2010) that China will continue to forge ahead with its industrial revolution and that, within the decade, it will overtake the United States in terms of gross domestic product, just as (in 1963) Japan overtook the United Kingdom.

    The West had patently enjoyed a real and sustained edge over the Rest for most of the previous 500 years. The gap between Western and Chinese incomes had begun to open up as long ago as the 1600s and had continued to widen until as recently as the late 1970s, if not later. But since then it had narrowed with astonishing speed. The financial crisis crystallized the next historical question I wanted to ask. Had that Western edge now gone? Only by working out what exactly it had consisted of could I hope to come up with an answer.




    What follows is concerned with historical methodology . . . I wrote this book because I had formed the strong impression that the people currently living were paying insufficient attention to the dead. Watching my three children grow up, I had the uneasy feeling that they were learning less history than I had learned at their age, not because they had bad teachers but because they had bad history books and even worse examinations. Watching the financial crisis unfold, I realized that they were far from alone, for it seemed as if only a handful of people in the banks and treasuries of the Western world had more than the sketchiest information about the last Depression. For roughly thirty years, young people at Western schools and universities have been given the idea of a liberal education, without the substance of historical knowledge. They have been taught isolated ‘modules’, not narratives, much less chronologies. They have been trained in the formulaic analysis of document excerpts, not in the key skill of reading widely and fast. They have been encouraged to feel empathy with imagined Roman centurions or Holocaust victims, not to write essays about why and how their predicaments arose. In The History Boys, the playwright Alan Bennett posed a ‘trilemma’: should history be taught as a mode of contrarian argumentation, a communion with past Truth and Beauty, or just ‘one fucking thing after another’? He was evidently unaware that today’s sixth-formers are offered none of the above — at best, they get a handful of ‘fucking things’ in no particular order.


    The former president of the university where I teach once confessed that, when he had been an undergraduate at the Massachusetts Institute of Technology, his mother had implored him to take at least one history course. The brilliant young economist replied cockily that he was more interested in the future than in the past. It is a preference he now knows to be illusory. There is in fact no such thing as the future, singular; only futures, plural. There are multiple interpretations of history, to be sure, none definitive—but there is only one past. And although the past is over, for two reasons it is indispensable to our understanding of what we experience today and what lies ahead of us tomorrow and thereafter. First, the current world population makes up approximately 7 per cent of all the human beings who have ever lived. The dead outnumber the living, in other words, fourteen to one, and we ignore the accumulated experience of such a huge majority of mankind at our peril. Second, the past is really our only reliable source of knowledge about the fleeting present and the multiple futures that lie before us, only one of which will actually happen. History is not just how we study the past; it is how we study time itself.
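    A quick check of the fourteen-to-one figure, taking the 7 per cent estimate at face value (an illustrative back-of-the-envelope calculation, not from Ferguson’s text):

        if the living are 7% of everyone who has ever lived, the dead are 93%, and 93 ÷ 7 ≈ 13.3, i.e. roughly fourteen to one.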

    Let us first acknowledge the subject’s limitations. Historians are not scientists. They cannot (and should not even try to) establish universal laws of social or political ‘physics’ with reliable predictive powers. Why? Because there is no possibility of repeating the single, multi-millennium experiment that constitutes the past. The sample size of human history is one. Moreover, the ‘particles’ in this one vast experiment have consciousness, which is skewed by all kinds of cognitive biases. This means that their behaviour is even harder to predict than if they were insensate, mindless, gyrating particles. Among the many quirks of the human condition is that people have evolved to learn almost instinctively from their own past experience. So their behaviour is adaptive; it changes over time. We do not wander randomly but walk in paths, and what we have encountered behind us determines the direction we choose when the paths fork — as they constantly do.


    So what can historians do? First, by mimicking social scientists and relying on quantitative data, historians can devise ‘covering laws’, in Carl Hempel’s sense of general statements about the past that appear to cover most cases (for instance, when a dictator takes power instead of a democratic leader, the chance increases that the country in question will go to war). Or — though the two approaches are not mutually exclusive — the historian can commune with the dead by imaginatively reconstructing their experiences in the way described by the great Oxford philosopher R. G. Collingwood in his 1939 Autobiography. These two modes of historical inquiry allow us to turn the surviving relics of the past into history, a body of knowledge and interpretation that retrospectively orders and illuminates the human predicament. Any serious predictive statement about the possible futures we may experience is based, implicitly or explicitly, on one or both of these historical procedures. If not, then it belongs in the same category as the horoscope in this morning’s newspaper.


    Collingwood’s ambition, forged in the disillusionment with natural science and psychology that followed the carnage of the First World War, was to take history into the modern age, leaving behind what he dismissed as ‘scissors-and-paste history’, in which writers ‘only repeat, with different arrangements and different styles of decoration, what others [have] said before them’. His thought process is itself worth reconstructing:



    a) ‘The past which an historian studies is not a dead past, but a past which in some sense is still living in the present’ in the form of traces . . . that have survived.

    b) ‘All history is the history of thought’, in the sense that a piece of historical evidence is meaningless if its intended purpose cannot be inferred.


    c) That process of inference requires an imaginative leap through time: ‘Historical knowledge is the re-enactment in the historian’s mind of the thought whose history he is studying.’


    d) But the real meaning of history comes from the juxtaposition of past and present: ‘Historical knowledge is the re-enactment of a past thought incapsulated in a context of present thoughts which, by contradicting it, confine it to a plane different from theirs.’


    e) The historian thus ‘may very well be related to the nonhistorian as the trained woodsman is to the ignorant traveller. “Nothing here but trees and grass,” thinks the traveller, and marches on. “Look,” says the woodsman, “there is a tiger in that grass.”’ In other words, Collingwood argues, history offers something ‘altogether different from [scientific] rules, namely insight’.


    f) The true function of historical insight is ‘to inform [people] about the present, in so far as the past, its ostensible subject matter, [is] incapsulated in the present and [constitutes] a part of it not at once obvious to the untrained eye’.


    g) As for our choice of subject matter for historical investigation, Collingwood makes it clear that there is nothing wrong with what his Cambridge contemporary Herbert Butterfield condemned as ‘present-mindedness’: ‘True historical problems arise out of practical problems. We study history in order to see more clearly into the situation in which we are called upon to act. Hence the plane on which, ultimately, all problems arise is the plane of “real” life: that to which they are referred for their solution is history.’



    A polymath as skilled in archaeology as he was in philosophy, a staunch opponent of appeasement and an early hater of the Daily Mail,* Collingwood has been my guide for many years, but never has he been more indispensable than in the writing of this book. For the problem of why civilizations fall is too important to be left to the purveyors of scissors-and-paste history. It is truly a practical problem of our time, and this book is intended to be a woodsman’s guide to it. For there is more than one tiger hidden in this grass.


    * Which he called ‘the first English newspaper for which the word “news” lost its old meaning of facts which a reader ought to know . . . and acquired the new meaning of facts, or fictions, which it might amuse him to read’.



    In dutifully reconstructing past thought, I have tried always to remember a simple truth about the past that the historically inexperienced are prone to forget. Most people in the past either died young or expected to die young, and those who did not were repeatedly bereft of those they loved, who did die young. Consider the case of my favourite poet, the Jacobean master John Donne, who lived to the age of fifty-nine, thirteen years older than I am as I write. A lawyer, a Member of Parliament and, after renouncing the Roman Catholic faith, an Anglican priest, Donne married for love, as a result losing his job as secretary to his bride’s uncle, Sir Thomas Egerton, the Lord Keeper of the Great Seal. [After he was briefly arrested for defying her father, she quipped: ‘John Donne — Anne Donne — Un-done.’ No wonder he loved her.] In the space of sixteen impecunious years, Anne Donne bore her husband twelve children. Three of them, Francis, Nicholas and Mary, died before they were ten. Anne herself died after giving birth to the twelfth child, which was stillborn. After his favourite daughter Lucy had died and he himself had very nearly followed her to the grave, Donne wrote his Devotions upon Emergent Occasions (1624), which contains the greatest of all exhortations to commiserate with the dead: ‘Any man’s death diminishes me, because I am involved in Mankinde; And therefore never send to know for whom the bell tolls; It tolls for thee.’ Three years later, the death of a close friend inspired him to write ‘A Nocturnal upon St Lucy’s Day, Being the Shortest Day’:


    Study me then, you who shall lovers be

    At the next world, that is, at the next spring;

    For I am every dead thing,

    In whom Love wrought new alchemy.

    For his art did express

    A quintessence even from nothingness,

    From dull privations, and lean emptiness;

    He ruin’d me, and I am re-begot

    Of absence, darkness, death — things which are not.



    Everyone should read these lines who wants to understand better the human condition in the days when life expectancy was less than half what it is today.


    The much greater power of death to cut people off in their prime not only made life seem precarious and filled it with grief. It also meant that most of the people who built the civilizations of the past were young when they made their contributions. The great Dutch-Jewish philosopher Baruch or Benedict Spinoza, who hypothesized that there is only a material universe of substance and deterministic causation, and that ‘God’ is that universe’s natural order as we dimly apprehend it and nothing more, died in 1677 at the age of forty-four, probably from the particles of glass he had inhaled doing his day-job as a lens grinder. Blaise Pascal, the pioneer of probability theory and hydrodynamics and the author of the Pensées, the greatest of all apologias for the Christian faith, lived to be just thirty-nine; he would have died even younger had the road accident that reawakened his spiritual side been fatal. Who knows what other great works these geniuses might have brought forth had they been granted the lifespans enjoyed by, for example, the great humanists Erasmus (sixty-nine) and Montaigne (fifty-nine)? Mozart, composer of the most perfect of all operas, Don Giovanni, died when he was just thirty-five. Franz Schubert, composer of the sublime String Quintet in C (D956), succumbed, probably to syphilis, at the age of just thirty-one. Prolific though they were, what else might they have composed if they had been granted the sixty-three years enjoyed by the stolid Johannes Brahms or the even more exceptional seventy-two years allowed the ponderous Anton Bruckner? The Scots poet Robert Burns, who wrote the supreme expression of egalitarianism, ‘A Man’s a Man for A’ That’, was thirty-seven when he died in 1796. What injustice, that the poet who most despised inherited status (‘The rank is but the guinea’s stamp, / The Man’s the gowd [gold] for a’ that’) should have been so much outlived by the poet who most revered it: Alfred, Lord Tennyson, who died bedecked with honours at the age of eighty-three. Palgrave’s Golden Treasury would be the better for more Burns and less Tennyson. And how different would the art galleries of the world be today if the painstaking Jan Vermeer had lived to be ninety-one and the over-prolific Pablo Picasso had died at forty-three, instead of the other way around?


    Politics, too, is an art — as much a part of our civilization as philosophy, opera, poetry or painting. But the greatest political artist in American history, Abraham Lincoln, served only one full term in the White House, falling victim to an assassin with a petty grudge just six weeks after his second inaugural address. He was fifty-six. How different would the era of Reconstruction have been had this self-made titan, born in a log cabin, the author of the majestic Gettysburg Address—which redefined the United States as ‘a nation, conceived in liberty, and dedicated to the proposition that all men are created equal’, with a ‘government of the people, by the people, for the people’ — lived as long as the polo-playing then polio-stricken grandee Franklin Delano Roosevelt, whom medical science kept alive long enough to serve just over three full terms as president before his death at sixty-three?

    Because our lives are so very different from the lives of most people in the past, not least in their probable duration, but also in our greater degree of physical comfort, we must exercise our imaginations quite vigorously to understand the men and women of the past. In his Theory of Moral Sentiments, written a century and a half before Collingwood’s memoir, the great economist and social theorist Adam Smith defined why a civilized society is not a war of all against all — because it is based on sympathy:


    As we have no immediate experience of what other men feel, we can form no idea of the manner in which they are affected, but by conceiving what we ourselves should feel in the like situation. Though our brother is on the rack, as long as we ourselves are at our ease, our senses will never inform us of what he suffers. They never did, and never can, carry us beyond our own person, and it is by the imagination only that we can form any conception of what are his sensations. Neither can that faculty help us to this any other way, than by representing to us what would be our own, if we were in his case. It is the impressions of our own senses only, not those of his, which our imaginations copy. By the imagination, we place ourselves in his situation.


    This, of course, is precisely what Collingwood says the historian should do, and it is what I want the reader to do as she encounters in these pages the resurrected thoughts of the dead. The key point of the book is to understand what made their civilization expand so spectacularly in its wealth, influence and power. But there can be no understanding without that sympathy which puts us, through an act of imagination, in their situation. That act will be all the more difficult when we come to resurrect the thoughts of the denizens of other civilizations — the ones the West subjugated or, at least, subordinated to itself. For they are equally important members of the drama’s cast. This is not a history of the West but a history of the world, in which Western dominance is the phenomenon to be explained.


    In an encyclopaedia entry he wrote in 1959, the French historian Fernand Braudel defined a civilization as:


    first of all a space, a ‘cultural area’ . . . a locus. With the locus . . . you must picture a great variety of ‘goods’, of cultural characteristics, ranging from the form of its houses, the material of which they are built, their roofing, to skills like feathering arrows, to a dialect or group of dialects, to tastes in cooking, to a particular technology, a structure of beliefs, a way of making love, and even to the compass, paper, the printing press. It is the regular grouping, the frequency with which particular characteristics recur, their ubiquity within a precise area [combined with] . . . some sort of temporal permanence . . .


    Braudel was better at delineating structures than explaining change, however. These days, it is often said that historians should tell stories; accordingly, this book offers a big story—a meta-narrative of why one civilization transcended the constraints that had bound all previous ones — and a great many smaller tales or micro-histories within it. Nevertheless the revival of the art of narrative is only part of what is needed. In addition to stories, it is also important that there be questions. ‘Why did the West come to dominate the Rest?’ is a question that demands something more than a just-so story in response. The answer needs to be analytical, it needs to be supported by evidence and it needs to be testable by means of the counterfactual question: if the crucial innovations I identify here had not existed, would the West have ruled the Rest anyway for some other reason that I have missed or under-emphasized? Or would the world have turned out quite differently, with China on top, or some other civilization? We should not delude ourselves into thinking that our historical narratives, as commonly constructed, are anything more than retro-fits. To contemporaries, as we shall see, the outcome of Western dominance did not seem the most probable of the futures they could imagine; the scenario of disastrous defeat often loomed larger in the mind of the historical actor than the happy ending vouchsafed to the modern reader. The reality of history as a lived experience is that it is much more like a chess match than a novel, much more like a football game than a play.


    It wasn’t all good. No serious writer would claim that the reign of Western civilization was unblemished. Yet there are those who would insist that there was nothing whatever good about it. This position is absurd. As is true of all great civilizations, that of the West was Janus-faced: capable of nobility yet also capable of turpitude. Perhaps a better analogy is that the West resembled the two feuding brothers in James Hogg’s Private Memoirs and Confessions of a Justified Sinner (1824) or in Robert Louis Stevenson’s Master of Ballantrae (1889). Competition and monopoly; science and superstition; freedom and slavery; curing and killing; hard work and laziness — in each case, the West was father to both the good and the bad. It was just that, as in Hogg’s or Stevenson’s novel, the better of the two brothers ultimately came out on top. We must also resist the temptation to romanticize history’s losers. The other civilizations overrun by the West’s, or more peacefully transformed by it through borrowings as much as through impositions, were not without their defects either, of which the most obvious is that they were incapable of providing their inhabitants with any sustained improvement in the material quality of their lives. One difficulty is that we cannot always reconstruct the past thoughts of these non-Western peoples, for not all of them existed in civilizations with the means of recording and preserving thought. In the end, history is primarily the study of civilizations, because without written records the historian is thrown back on spearheads and pot fragments, from which much less can be inferred.

    The French historian and statesman François Guizot said that the history of civilization is ‘the biggest of all . . . it comprises all the others’. It must transcend the multiple disciplinary boundaries erected by academics, with their compulsion to specialize, between economic, social, cultural, intellectual, political, military and international history. It must cover a great deal of time and space, because civilizations are not small or ephemeral. But a book like this cannot be an encyclopaedia. To those who will complain about what has been omitted, I can do no more than quote the idiosyncratic jazz pianist Thelonious Monk: ‘Don’t play everything (or every time); let some things go by . . . What you don’t play can be more important than what you do.’ I agree. Many notes and chords have been omitted below. But they have been left out for a reason. Does the selection reflect the biases of a middle-aged Scotsman, the archetypal beneficiary of Western predominance? Very likely. But I cherish the hope that the selection will not be disapproved of by the most ardent and eloquent defenders of Western values today, whose ethnic origins are very different from mine — from Amartya Sen to Liu Xiaobo, from Hernando de Soto to the dedicatee of this book.







    - pp. 196-198: What we must do is to transform our Empire and our people, make the empire like the countries of Europe and our people like the peoples of Europe.—Inoue Kaoru

    Will the West, which takes its great invention, democracy, more seriously than the Word of God, come out against this coup that has brought an end to democracy in Kars? . . . Or are we to conclude that democracy, freedom and human rights don’t matter, that all the West wants is for the rest of the world to imitate it like monkeys? Can the West endure any democracy achieved by enemies who in no way resemble them? — Orhan Pamuk


    THE BIRTH OF THE CONSUMER SOCIETY


    In 1909, inspired by a visit to Japan, the French-Jewish banker and philanthropist Albert Kahn* set out to create an album of colour photographs of people from every corner of the world. The aim, Kahn said, was ‘To put into effect a sort of photographic inventory of the surface of the globe as inhabited and developed by Man at the beginning of the twentieth century.’ Created with the newly invented autochrome process, the 72,000 photographs and 100 hours of film in Kahn’s ‘archives of the planet’ show a dazzling variety of costumes and fashions from more than fifty different countries: dirt-poor peasants in the Gaeltacht, dishevelled conscripts in Bulgaria, forbidding chieftains in Arabia, stark-naked warriors in Dahomey, garlanded maharajas in India, come-hither priestesses in Indo-China and strangely stolid-looking cowboys in the Wild West. [Okuefuna, Wonderful World of Albert Kahn.] In those days, to an extent that seems astonishing today, we were what we wore.

    * Kahn, a pupil of the philosopher Henri Bergson, was ruined by the Depression, bringing his grand photographic project to an end. A selection of the images can be viewed at http://www.albertkahn.co.uk/photos.html


    Today, a century later, Kahn’s project would be more or less pointless, because these days most people around the world dress in much the same way: the same jeans, the same sneakers, the same T-shirts. There are just a very few places where people hold out against the giant sartorial blending machine. One of them is rural Peru. In the mountains of the Andes, the Quechua women still wear their brightly coloured dresses and shawls and their little felt hats, pinned at jaunty angles and decorated with their tribal insignia. Except that these are not traditional Quechua clothes at all. The dresses, shawls and hats are in fact of Andalusian origin and were imposed by the Spanish Viceroy Francisco de Toledo in 1572, in the wake of Tupac Amaru’s defeat. Authentically traditional Andean female attire consisted of a tunic (the anacu), secured at the waist by a sash (the chumpi), over which was worn a mantle (the lliclla), which was fastened with a tupu pin. What Quechua women wear nowadays is a combination of these earlier garments with the clothes they were ordered to wear by their Spanish masters. The bowler hats popular among Bolivian women came later, when British workers arrived to build that country’s first railways. The current fashion among Andean men for American casual clothing is thus merely the latest chapter in a long history of sartorial Westernization.

    What is it about our clothes that other people seem unable to resist? Is dressing like us about wanting to be like us? Clearly, this is about more than just clothes. It is about embracing an entire popular culture that extends through music and movies, to say nothing of soft drinks and fast food. That popular culture carries with it a subtle message. It is about freedom — the right to dress or drink or eat as you please (even if that turns out to be like everybody else). It is about democracy — because only those consumer products that people really like get made. And, of course, it is about capitalism — because corporations have to make a profit by selling the stuff. But clothing is at the heart of the process of Westernization for one very simple reason. That great economic transformation which historians long ago named the Industrial Revolution — that quantum leap in material standards of living for a rising share of humanity — had its origins in the manufacture of textiles. It was partly a miracle of mass production brought about by a wave of technological innovation, which had its origin in the earlier Scientific Revolution. But the Industrial Revolution would not have begun in Britain and spread to the rest of the West without the simultaneous development of a dynamic consumer society, characterized by an almost infinitely elastic demand for cheap clothes. The magic of industrialization, though it was something contemporary critics generally overlooked, was that the worker was at one and the same time a consumer. The ‘wage slave’ also went shopping; the lowliest proletarian had more than one shirt, and aspired to have more than two.


    The consumer society is so all-pervasive today that it is easy to assume it has always existed. Yet in reality it is one of the more recent innovations that propelled the West ahead of the Rest. Its most striking characteristic is its seemingly irresistible appeal. Unlike modern medicine, which was often imposed by force on Western colonies, the consumer society is a killer application the rest of the world has generally yearned to download. Even those social orders explicitly intended to be anti-capitalist — most obviously the various derivatives of the doctrine of Karl Marx — have been unable to exclude it. The result is one of the greatest paradoxes of modern history: that an economic system designed to offer infinite choice to the individual has ended up homogenizing humanity.




    - pp. 227-231: The First World War . . . was a struggle between empires whose motives and methods had been honed overseas. It toppled four dynasties and shattered their empires. The American President Woodrow Wilson — the first of four Democratic holders of the office to embroil their country in a major overseas war — sought to recast the conflict as a war for national self-determination, a view that was never likely to be endorsed by the British and French empires, whose flagging war effort had been salvaged by American money and men. Czechs, Estonians, Georgians, Hungarians, Lithuanians, Latvians, Poles, Slovaks and Ukrainians were not the only ones who scented freedom; so did Arabs and Bengalis, to say nothing of the Catholic Irish. Aside from Ireland and Finland, not one of the nation-states that emerged in the wake of the war retained meaningful independence by the end of 1939 (except possibly Hungary). The Mazzinian map of Europe appeared and then vanished like a flash in the pan.



    The alternative post-war vision of Vladimir Ilyich Lenin was of a Union of Soviet Socialist Republics, potentially expanding right across Eurasia. This gained its traction from the exceptional economic circumstances of the war. Because all governments financed the fighting to some degree by issuing short-term debt and exchanging it for cash at their central banks — printing money, in short — inflation gathered momentum during the war. Because so many men were under arms, labour shortages empowered the workers on the Home Front to push for higher wages. By 1917 hundreds of thousands of workers were involved in strikes in France, Germany and Russia. First Spanish influenza, then Russian Bolshevism swept the world. As in 1848 urban order broke down, only this time the contagion spread as far as Buenos Aires and Bengal, Seattle and Shanghai. Yet the proletarian revolution failed everywhere but in the Russian Empire, which was reassembled by the Bolsheviks in the wake of a brutal civil war. No other socialist leaders were as ruthless as Lenin in adopting ‘democratic centralism’ (which was the opposite of democratic), rejecting parliamentarism and engaging in terrorism against opponents. Some of what the Bolsheviks did (the nationalization of banks, the confiscation of land) was straight out of Marx and Engels’s Manifesto. Some of what they did (‘the greatest ferocity and savagery of suppression . . . seas of blood’) owed more to Robespierre. The ‘dictatorship of the proletariat’—which in fact meant the dictatorship of the Bolshevik leadership—was Lenin’s original contribution. This was even worse than the resurrection of Bazarov, the nihilist in Ivan Turgenev’s Fathers and Sons (1862). It was what his estranged friend Fyodor Dostoevsky had warned Russia about in the epilogue of Crime and Punishment (1866) — the murderer Raskolnikov’s nightmare of a ‘terrible, unprecedented and unparalleled plague’ from Asia:


    Those infected were seized immediately and went mad. Yet people never considered themselves so clever and unhesitatingly right as these infected ones considered themselves. Never had they considered their decrees, their scientific deductions, their moral convictions and their beliefs more firmly based. Whole settlements, whole cities and nations, were infected and went mad . . . People killed each other with senseless rage . . . soldiers flung themselves upon each other, slashed and stabbed, ate and devoured each other.


    To the east there was almost no stopping the Bolshevik epidemic. To the west it could not get over the Vistula, nor south of the Caucasus, thanks to a gifted trio of political entrepreneurs who devised that synthesis of nationalism and socialism which was the true manifestation of the Zeitgeist: Józef Piłsudski in Poland, Kemal Atatürk in Turkey and Benito Mussolini in Italy. The defeat of the Red Army outside Warsaw (August 1920), the expulsion of the Anatolian Greeks (September 1922) and the fascist March on Rome (October 1922) marked the advent of a new era—and a new look.

    With the exception of Mussolini, who wore a three-piece suit with a winged collar and spats, most of those who participated in the publicity stunt that was the March on Rome were in makeshift uniforms composed of black shirts, jodhpurs and knee-high leather riding boots. The idea was that the manly, martial virtues of the Great War would now be carried over into peacetime, beginning with a smaller war fought in the streets and fields against the left. Uniformity was the order of the day — but a uniformity of dress without the tedious discipline of a real army. Even the famous March was more of a stroll, as the many press photographs make clear. It had been the Italian nationalist Giuseppe Garibaldi who had first used red-coloured shirts as the basis for a political movement. By the 1920s dyed tops were mandatory on the right; the Italian fascists opted for black while, as we have seen, the German National Socialist Sturmabteilung adopted colonial brown.

    Such movements might have dissolved into ill-tailored obscurity had it not been for the Great Depression. After the inflation of the early 1920s, the deflation of the early 1930s dealt a lethal blow to the Wilsonian vision of a Europe based on national identity and democracy. The crisis of American capitalism saw the stock market slump by 89 per cent, output drop by a third, consumer prices fall by a quarter and the unemployment rate pass a quarter. Not all European countries were so severely affected, but none was unscathed. As governments scrambled to protect their own industries with higher tariffs—the American Smoot-Hawley tariff bill raised the effective ad valorem rate on imported cotton manufactures to 46 per cent — globalization simply broke down. Between 1929 and 1932 world trade shrank by two-thirds. Most countries adopted some combination of debt default, currency depreciation, protectionist tariffs, import quotas and prohibitions, import monopolies and export premia. The day of the nationalist-socialist state had dawned, it seemed.


    This was an illusion. Though the US economy seemed to be imploding, the principal cause was the disastrous monetary policy adopted by the Federal Reserve Board, which half wrecked the banking system. [Friedman and Schwartz, Monetary History of the United States.] Innovation, the mainspring of industrial advance, did not slacken in the 1930s. New automobiles, radios and other consumer durables were proliferating. New companies were developing these products, like DuPont (nylon), Revlon (cosmetics), Procter & Gamble (Dreft soap powder), RCA (radio and television) and IBM (accounting machines); they were also evolving and disseminating a whole new style of business management. Nowhere was the creativity of capitalism more marvellous to behold than in Hollywood, home of the motion-picture industry. In 1931—when the US economy was in the grip of blind panic—the big studios released Charlie Chaplin’s City Lights, Howard Hughes’s The Front Page and the Marx Brothers’ Monkey Business. The previous decade’s experiment with the Prohibition of alcohol had been a disastrous failure, spawning a whole new economy of organized crime. But it was only more grist for the movie mills. Also in 1931, audiences flocked to see James Cagney and Edward G. Robinson in the two greatest gangster films of them all: The Public Enemy and Little Caesar. No less creative was the live, recorded and broadcast music business, once white Americans had discovered that black Americans had nearly all the best tunes. Jazz approached its zenith in the swinging sound of Duke Ellington’s big band, which rolled out hit after hit even as the automobile-production lines ground to a halt: ‘Mood Indigo’ (1930), ‘Creole Rhapsody’ (1931), ‘It Don’t Mean a Thing (If It Ain’t Got That Swing)’ (1932), ‘Sophisticated Lady’ (1933) and ‘Solitude’ (1934). The grandson of a slave, Ellington took reed and brass instruments where they had never been before, mimicking everything from spirituals to the New York subway. His band’s long residence at the Cotton Club was at the very heart of the Harlem Renaissance. And of course, as his aristocratic nickname required, Ellington was always immaculately dressed — courtesy of Anderson & Sheppard of Savile Row.


    In short, capitalism was not fatally flawed, much less dead. It was merely a victim of bad management, and the uncertainty that followed from it. The cleverest economist of the age, John Maynard Keynes, sneered at the stock exchange as a ‘casino’, comparing investors’ decisions to a newspaper beauty contest. President Franklin D. Roosevelt — elected just as the slump was reaching its trough — inveighed against ‘the unscrupulous money changers’. The real culprits were the central bankers who had first inflated a stock-exchange bubble with excessively lax monetary policy and had then proceeded to tighten (or failed adequately to loosen) after the bubble had burst. Between 1929 and 1933, nearly 15,000 US banks — two-fifths of the total — failed. As a result, the money supply was savagely reduced. With prices collapsing by a third from peak to trough, real interest rates rose above 10 per cent, crushing any indebted institution or household. Keynes summed up the negative effects of deflation:


    Modern business, being carried on largely with borrowed money, must necessarily be brought to a standstill by such a process. It will be to the interest of everyone in business to go out of business for the time being; and of everyone who is contemplating expenditure to postpone his orders so long as he can. The wise man will be he who turns his assets into cash, withdraws from the risks and the exertions of activity, and awaits in country retirement the steady appreciation promised him in the value of his cash. A probable expectation of Deflation is bad.
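    The scale of the squeeze Keynes describes is easy to check with the standard Fisher relation; this back-of-envelope sketch is my illustration, not Ferguson's or Keynes's, and the 2 per cent nominal rate in it is assumed purely for the sake of the arithmetic. Prices falling by a third over the four years 1929-33 imply annual deflation of nearly 10 per cent, so even very low nominal rates translated into double-digit real rates:

    % Fisher relation: real rate = nominal rate minus expected price change.
    \[
    r \approx i - \pi^{e}, \qquad
    \pi^{e} \approx \left(\tfrac{2}{3}\right)^{1/4} - 1 \approx -9.6\% \text{ per year,}
    \]
    % With an assumed nominal rate of 2 per cent (illustrative, not a quoted figure):
    \[
    r \approx 2\% - (-9.6\%) \approx 11.6\%.
    \]

    On those numbers, simply holding cash earned a near-double-digit real return, which is exactly the incentive to withdraw from ‘the risks and the exertions of activity’ that Keynes describes.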


    How to escape from the deflation trap? With trade slumping and capital imports frozen, Keynes’s recommendation — government spending on public works, financed by borrowing — made sense. It also helped to abandon the gold standard, whereby currencies had fixed dollar exchange rates, to let depreciation provide a boost to exports (though increasingly trade went on within regional blocs) and to allow interest rates to fall. Yet parliamentary governments that adopted only these measures achieved at best anaemic recoveries.





    - pp. 204-212: Like the French Revolution before it, the British Industrial Revolution spread across Europe. But this was a peaceful conquest. The great innovators were largely unable to protect what would now be called their intellectual property rights. With remarkable speed, the new technology was therefore copied and replicated on the continent and across the Atlantic. The first true cotton mill, Richard Arkwright’s at Cromford in Derbyshire, was built in 1771. Within seven years a copy appeared in France. It took just three years for the French to copy Watt’s 1775 steam engine. By 1784 there were German versions of both, thanks in large measure to industrial espionage. The Americans, who had the advantage of being able to grow their own cotton as well as mine their own coal, were a little slower: the first cotton mill appeared in Bass River, Massachusetts, in 1788, the first steam engine in 1803. The Belgians, Dutch and Swiss were not far behind. The pattern was similar after the first steam locomotives began pulling carriages on the Stockton and Darlington Railway in 1825, though that innovation took a mere five years to cross the Atlantic, compared with twelve years to reach Germany and twenty-two to arrive in Switzerland. As the efficiency of the technology improved, so it became economically attractive even where labour was cheaper and coal scarcer. Between 1820 and 1913 the number of spindles in the world increased four times as fast as the world’s population, but the rate of increase was twice as fast abroad as in the United Kingdom. Such were the productivity gains — and the growth of demand — that the gross output of the world cotton industry rose three times as fast as total spindleage. As a result, between 1820 and 1870 a handful of North-west European and North American countries achieved British rates of growth; indeed, Belgium and the United States grew faster.

    By the late nineteenth century, then, industrialization was in full swing in two broad bands: one stretching across the American Northeast, with towns like Lowell, Massachusetts, at its heart, and another extending from Glasgow to Warsaw and even as far as Moscow. In 1800 seven out of the world’s ten biggest cities had still been Asian, and Beijing had still exceeded London in size. By 1900, largely as a result of the Industrial Revolution, only one of the biggest was Asian; the rest were European or American.

    The spread around the world of the British-style industrial city inspired some observers but dismayed others. Among the inspired was Charles Darwin who, as he acknowledged in On the Origin of Species (1859), had been ‘well prepared to appreciate the struggle for existence’ by the experience of living through the Industrial Revolution. Much of Darwin’s account of natural selection could have applied equally well to the economic world of the mid-nineteenth-century textile business:


    All organic beings are exposed to severe competition . . . As more individuals are produced than can possibly survive, there must in every case be a struggle for existence, either one individual with another of the same species, or with the individuals of distinct species, or with the physical conditions of life. Each organic being . . . has to struggle for life . . . As natural selection acts solely by accumulating slight, successive, favourable variations, it can produce no great or sudden modification . . .


    In that light, it might make more sense for historians to talk about an Industrial Evolution, in Darwin’s sense of the word. As the economists Thorstein Veblen and Joseph Schumpeter would later remark, nineteenth-century capitalism was an authentically Darwinian system, characterized by seemingly random mutation, occasional speciation, and differential survival or, to use Schumpeter’s memorable phrase, ‘creative destruction’.

    Yet it was precisely the volatility of the more or less unregulated markets created by the Industrial Revolution that caused consternation among many contemporaries. Until the major breakthroughs in public health . . . mortality rates in industrial cities were markedly worse than in the countryside. Moreover, the advent of a new and far from regular ‘business cycle’, marked by periodic crises of industrial over-production and financial panic, generally made a stronger impression on people than the gradual acceleration of the economy’s average growth rate. Though the Industrial Revolution manifestly improved life over the long run, in the short run it seemed to make things worse. One of William Blake’s illustrations for his preface to Milton featured, among other sombre images, a dark-skinned figure holding up a blood-soaked length of cotton yarn. [The ‘dark Satanic mills’ of the text may well refer to the Albion Flour Mills, built by Boulton & Watt in London in 1786 and destroyed by fire in 1791.] For the composer Richard Wagner, London was ‘Alberich’s dream come true — Nibelheim, world dominion, activity, work, everywhere the oppressive feeling of steam and fog’. Hellish images of the British factory inspired his depiction of the dwarf’s underground realm in Das Rheingold, as well as one of the leitmotifs of the entire Ring cycle, the insistent, staccato rhythm of multiple hammers . . .


    Steeped in German literature and philosophy, the Scottish writer Thomas Carlyle was the first to identify what seemed the fatal flaw of the industrial economy: that it reduced all social relations to what he called, in his book Past and Present, ‘the cash nexus’:



    the world has been rushing on with such fiery animation to get work and ever more work done, it has had no time to think of dividing the wages; and has merely left them to be scrambled for by the Law of the Stronger, law of Supply-and-demand, law of Laissez-faire, and other idle Laws and Un-laws. We call it a Society; and go about professing openly the totalest separation, isolation. Our life is not a mutual helpfulness; but rather, cloaked under due laws-of-war, named ‘fair competition’ and so forth, it is a mutual hostility. We have profoundly forgotten everywhere that Cash-payment is not the sole relation of human beings . . . [It] is not the sole nexus of man with man, — how far from it! Deep, far deeper than Supply-and-demand, are Laws, Obligations sacred as Man’s Life itself. [Thomas Carlyle, Past and Present, Book I, chs. 1-4, Book IV, chs. 4, 8.]



    That phrase—the ‘cash nexus’—so much pleased the son of an apostate Jewish lawyer from the Rhineland that he and his co-author, the heir of a Wuppertal cotton mill-owner, purloined it for the outrageous ‘manifesto’ they published on the eve of the 1848 revolutions.

    The founders of communism, Karl Marx and Friedrich Engels, were just two of many radical critics of the industrial society, but it was their achievement to devise the first internally consistent blueprint for an alternative social order. Since this was the beginning of a schism within Western civilization that would last for nearly a century and a half, it is worth pausing to consider the origins of their theory. A mixture of Hegel’s philosophy, which represented the historical process as dialectical, and the political economy of Ricardo, which posited diminishing returns for capital and an ‘iron’ law of low wages, Marxism took Carlyle’s revulsion against the industrial economy and substituted a utopia for nostalgia.




    Marx himself was an odious individual. An unkempt scrounger and a savage polemicist, he liked to boast that his wife was ‘née Baroness von Westphalen’, but nevertheless sired an illegitimate son by their maidservant. On the sole occasion when he applied for a job (as a railway clerk) he was rejected because his handwriting was so atrocious. He sought to play the stock market but was hopeless at it. For most of his life he therefore depended on handouts from Engels, for whom socialism was a hobby, along with fox-hunting and womanizing; his day job was running one of his father’s cotton factories in Manchester (the patent product of which was known as ‘Diamond Thread’). No man in history has bitten the hand that fed him with greater gusto than Marx bit the hand of King Cotton.


    The essence of Marxism was the belief that the industrial economy was doomed to produce an intolerably unequal society divided between the bourgeoisie, the owners of capital, and a propertyless proletariat. Capitalism inexorably demanded the concentration of capital in ever fewer hands and the reduction of everyone else to wage slavery, which meant being paid only ‘that quantum of the means of subsistence which is absolutely requisite to keep the labourer in bare existence as a labourer’. In chapter 32 of the first tome of his scarcely readable Capital (1867), Marx prophesied the inevitable denouement:


    Along with the constant decrease of the number of capitalist magnates, who usurp and monopolize all the advantages of this process of transformation, the mass of misery, oppression, slavery, degradation and exploitation grows; but with this there also grows the revolt of the working class . . .

    The centralization of the means of production and the socialization of labour reach a point at which they become incompatible with their capitalist integument. This integument is burst asunder. The knell of capitalist private property sounds. The expropriators are expropriated.



    It is no accident that this passage has a Wagnerian quality, part Götterdämmerung, part Parsifal. But by the time the book was published the great composer had left the spirit of 1848 far behind. Instead it was Eugène Pottier’s song ‘The Internationale’ that became the anthem of Marxism. Set to music by Pierre De Geyter, it urged the ‘servile masses’ to put aside their religious ‘superstitions’ and national allegiances, and make war on the ‘thieves’ and their accomplices, the tyrants, generals, princes and peers.



    Before identifying why they were wrong, we need to acknowledge what Marx and his disciples were right about. Inequality did increase as a result of the Industrial Revolution. Between 1780 and 1830 output per labourer in the UK grew over 25 per cent but wages rose barely 5 per cent. The proportion of national income going to the top percentile of the population rose from 25 per cent in 1801 to 35 per cent in 1848. In Paris in 1820, around 9 per cent of the population were classified as ‘proprietors and rentiers’ (living from their investments) and owned 41 per cent of recorded wealth. By 1911 their share had risen to 52 per cent. In Prussia, the share of income going to the top 5 per cent rose from 21 per cent in 1854 to 27 per cent in 1896 and to 43 per cent in 1913. [Kaelble, Industrialization and Social Inequality.] Industrial societies, it seems clear, grew more unequal over the course of the nineteenth century. This had predictable consequences. In the Hamburg cholera epidemic of 1892, for example, the mortality rate for individuals with an income of less than 800 marks a year was thirteen times higher than that for individuals earning over 50,000 marks. [Evans, Death in Hamburg.] It was not necessary to be a Marxist to be horrified by the inequality of industrial society. The Welsh-born factory-owner Robert Owen, who coined the term ‘socialism’ in 1817, envisaged an alternative economic model based on co-operative production and utopian villages like the ones he founded at Orbiston in Scotland and New Harmony, Indiana. [Grayling, Toward the Light of Liberty: The Struggles for Rights and Freedoms That Made the Modern Western World, pp. 189-93.] Even the Irish aesthete and wit Oscar Wilde recognized the foundation of social misery on which the refined world of belles-lettres stood:


    These are the poor; and amongst them there is no grace of manner, or charm of speech, or civilization . . . From their collective force Humanity gains much in material prosperity. But it is only the material result that it gains, and the man who is poor is in himself absolutely of no importance. He is merely the infinitesimal atom of a force that, so far from regarding him, crushes him: indeed, prefers him crushed, as in that case he is far more obedient . . . Agitators are a set of interfering, meddling people, who come down to some perfectly contented class of the community, and sow the seeds of discontent amongst them. That is the reason why agitators are so absolutely necessary. Without them, in our incomplete state, there would be no advance towards civilization . . . [But] the fact is that civilization requires slaves. The Greeks were quite right there. Unless there are slaves to do the ugly, horrible, uninteresting work, culture and contemplation become almost impossible. Human slavery is wrong, insecure, and demoralizing. On mechanical slavery, on the slavery of the machine, the future of the world depends.


    Yet the revolution feared by Wilde and eagerly anticipated by Marx never materialized—at least, not where it was supposed to. The bouleversements of 1830 and 1848 were the results of short-run spikes in food prices and financial crises more than of social polarization. As agricultural productivity improved in Europe, as industrial employment increased and as the amplitude of the business cycle diminished, the risk of revolution declined. Instead of coalescing into an impoverished mass, the proletariat subdivided into ‘labour aristocracies’ with skills and a lumpenproletariat with vices. The former favoured strikes and collective bargaining over revolution and thereby secured higher real wages. The latter favoured gin. The respectable working class had their trade unions and working men’s clubs. The ruffians — ‘keelies’ in Glasgow — had the music hall and street fights.

    The prescriptions of the Communist Manifesto were in any case singularly unappealing to the industrial workers they were aimed at. Marx and Engels called for the abolition of private property; the abolition of inheritance; the centralization of credit and communications; the state ownership of all factories and instruments of production; the creation of ‘industrial armies for agriculture’; the abolition of the distinction between town and country; the abolition of the family; ‘community of women’ (wife-swapping) and the abolition of all nationalities. By contrast, mid-nineteenth-century liberals wanted constitutional government, the freedoms of speech, press and assembly, wider political representation through electoral reform, free trade and, where it was lacking, national self-determination (‘Home Rule’). In the half-century after the upheaval of 1848 they got a good many of these things — enough, at any rate, to make the desperate remedies of Marx and Engels seem de trop. In 1850 only France, Greece and Switzerland had franchises in which more than a fifth of the population got to vote. By 1900 ten European countries did, and Britain and Sweden were not far below that threshold. Broader representation led to legislation that benefited lower-income groups; free trade in Britain meant cheap bread, and cheap bread plus rising nominal wages thanks to union pressure meant a significant gain in real terms for workers. Building labourers’ day wages in London doubled in real terms between 1848 and 1913. Broader representation also led to more progressive taxation. Britain led the way in 1842 when Sir Robert Peel introduced a peacetime income tax; by 1913 the standard rate was 14 pence in the pound (6 per cent). Prior to 1842 nearly all British revenue had come from the indirect taxation of consumption, via customs and excise duties, regressive taxes which take a proportionately smaller share of your income the richer you are. By 1913 a third of revenue was coming from direct taxes on the relatively rich. In 1842 the central government had spent virtually nothing on education and the arts and sciences. In 1913 those items accounted for 10 per cent of expenditure. By then, Britain had followed Germany in introducing a state pension for the elderly.
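    The regressivity point in the paragraph above is simple arithmetic; the stylized figures below are mine, not Ferguson's. A duty of fixed money value on a staple that rich and poor buy in similar quantities takes a shrinking share of income as income rises:

    % burden(y): share of income y taken by a duty d per unit on q units consumed (notation assumed).
    \[
    \text{burden}(y) = \frac{d \cdot q}{y}, \qquad
    \frac{\pounds 2}{\pounds 50} = 4\% \text{ of a labourer's income,} \quad
    \frac{\pounds 2}{\pounds 500} = 0.4\% \text{ of a professional's.}
    \]

    That is the sense in which the pre-1842 reliance on customs and excise was regressive, and why the shift towards income tax made the system more progressive.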


    Marx and Engels were wrong on two scores, then. First, their iron law of wages was a piece of nonsense. Wealth did indeed become highly concentrated under capitalism, and it stayed that way into the second quarter of the twentieth century. But income differentials began to narrow as real wages rose and taxation became less regressive. Capitalists understood what Marx missed: that workers were also consumers. It therefore made no sense to try to grind their wages down to subsistence levels. On the contrary, as the case of the United States was making increasingly clear, there was no bigger potential market for most capitalist enterprises than their own employees. Far from condemning the masses to ‘immiseration’, the mechanization of textile production created growing employment opportunities for Western workers — albeit at the expense of Indian spinners and weavers — and the decline in the prices of cotton and other goods meant that Western workers could buy more with their weekly wages. The impact is best captured by the exploding differential between Western and non-Western wages and living standards in this period. Even within the West the gap between the industrialized vanguard and the rural laggards widened dramatically. In early seventeenth-century London, an unskilled worker’s real wages (that is, adjusted for the cost of living) were not so different from what his counterpart earned in Milan. From the 1750s until the 1850s, however, Londoners pulled far ahead. At the peak of the great divergence within Europe, London real wages were six times those in Milan. With the industrialization of Northern Italy in the second half of the nineteenth century, the gap began to close, so that by the eve of the First World War it was closer to a ratio of 3:1. German and Dutch workers also benefited from industrialization, though even in 1913 they still lagged behind their English counterparts. Chinese workers, by contrast, did no such catching up. Where wages were highest, in the big cities of Beijing and Canton, building workers received the equivalent of around 3 grams of silver per day, with little upward movement in the nineteenth and early twentieth centuries (to around 5-6 grams). There was some improvement for workers in Canton after 1900 but it was minimal; workers in Sichuan stayed dirt poor. London workers meanwhile saw their silver-equivalent wages rise from around 18 grams between 1800 and 1870 to 70 grams between 1900 and 1913. Allowing for the cost of maintaining a family, the standard of living of the average Chinese worker fell throughout the nineteenth century, most steeply during the Taiping Rebellion. True, subsistence was cheaper in China than in North-western Europe. It should also be remembered that Londoners and Berliners by that time enjoyed a far more variegated diet of bread, dairy products and meat, washed down with copious amounts of alcohol, whereas most East Asians were subsisting on milled rice and small grains. Nevertheless, it seems clear that by the second decade of the twentieth century the gap in living standards between London and Beijing was around six to one, compared with two to one in the eighteenth century. [Allen et al., ‘Wages, prices, and living standards in China, 1738–1925: in comparison with Europe, Japan, and India’.]
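    The cited Allen et al. paper makes comparisons of this kind with a 'welfare ratio': annual earnings divided by the annual cost of subsistence for a family. In schematic form (the notation here is mine, not the paper's), it shows why a roughly twelve-to-one silver-wage gap (70 grams against 5-6) shrinks to the six-to-one living-standards gap quoted above once the cheaper Chinese subsistence basket enters the denominator:

    % w = daily silver wage, N = working days per year,
    % c = annual cost of one adult subsistence basket, m = basket-equivalents per family.
    \[
    \text{welfare ratio} = \frac{w \times N}{c \times m}, \qquad
    \frac{\text{London}}{\text{Beijing}} = \frac{w_L / w_B}{c_L / c_B} \approx \frac{12}{2} = 6,
    \]
    % taking London subsistence at roughly twice the Beijing cost, consistent with
    % 'subsistence was cheaper in China' above; the factor of two is an assumption.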


    The second mistake Marx and Engels made was to underestimate the adaptive quality of the nineteenth-century state — particularly when it could legitimize itself as a nation-state.



    In his Contribution to a Critique of Hegel’s Philosophy of Right, Marx had famously called religion the ‘opium of the masses’. If so, then nationalism was the cocaine of the middle classes. On 17 March 1846 Venice’s Teatro La Fenice was the setting for the premiere of a new opera by the already celebrated Italian composer Giuseppe Verdi. Technically, Verdi had in fact been born a Frenchman: his name at birth was formally registered as ‘Joseph Fortunin François Verdi’ because the village where he was born was then under Napoleonic rule, having been annexed to France along with the rest of the Duchy of Parma and Piacenza. Venice, too, had been conquered by the French, but was handed over to Austria in 1814. The unpopularity of the Habsburg military and bureaucracy explains the rowdy enthusiasm with which the predominantly Italian audience responded to the following lines:


    Tardo per gli anni, e tremulo,
    È il regnator d’Oriente;
    Siede un imbelle giovine
    Sul trono d’Occidente;
    Tutto sarà disperso
    Quand’io mi unisca a te . . .
    Avrai tu l’universo,
    Resti l’Italia a me.


    (Aged and frail / Is the ruler of the Eastern Empire; / A young imbecile sits on the throne of the Western Empire; / All will be scattered / If you and I unite . . . / You can have the universe / But leave Italy to me.)


    Sung to Attila by the Roman envoy Ezio following the sack of Aquileia, these words were a thinly veiled appeal to nationalist sentiment. They perfectly illustrate what nationalism always had over socialism. It had style.









    http://h-net.msu.edu/cgi-bin/logbrow...IkHg&user=&pw=


    ‘Since for most of history mothers raise boys who then go off and hunt, farm, build things and fight wars rather than directly contributing much new to the psyche of the next generation, the course of evolution of the psyche has overwhelmingly been dependent upon the way mothers have treated their daughters, who become the next generation of mothers. Since early emotional relationships organize the entire range of human behavior, not all cultural traits equally affect the evolution of the psyche — those that affect the daughter's psyche represent the main narrow bottleneck through which all other cultural traits must pass. The study of the evolution of the psyche depends more on developing a maternal ecology than on studying variations in the physical environment.

    The evolution of the psyche and culture has been crucially dependent upon turning the weak bonds between mother and daughter of apes and early humans into genuine love for daughters (and sons). This means that historical societies that create optimal conditions for improving the crucial mother-daughter relationship by surrounding the mother with support and love soon begin to show psychological innovation and cultural advances in the next generations—so that history begins to move in progressive new directions. In contrast, societies that cripple the mother-daughter emotional relationship experience psychogenic arrest and even psychogenic devolution. Only in modern times have fathers, too, begun to contribute much to the evolutionary task of growing the young child's mind.


    Paralleling the term "hopeful monster" that biologists use to indicate speciating biological variations, the idea that the mother-daughter emotional relationship is the focal point of epigenetic evolution and the main source of novelty in the psyche can be called the "hopeful daughter" concept. When mothers love and support, especially their daughters, a series of generations can develop new childrearing practices that grow completely new neuronal networks, hormonal systems and behavioral traits. If hopeful daughters are instead emotionally crippled by a society, a psychogenic cul-de-sac is created, generations of mothers cannot innovate, epigenetic arrest is experienced and meaningful cultural evolution ends.

    For instance, in China before the tenth century A.D. men began to bind little girls' feet as a sexual perversion, making them into sexual fetishes, penis-substitutes which the men would suck on and masturbate against during sex play. Chinese literature reports the screaming cries of the five-year-old girl as she hobbles about the house for years to do her tasks while her feet are bound, because in order to make her foot tiny, her foot bones are broken and the flesh deteriorates. She loses several toes as they are bent under her foot, to emphasize the big toe as a female penis.


    This practice was added to the many brutal practices of what was perhaps the world's most anti-daughter culture, where over half the little girls were killed at birth without remorse and special girl-drowning pools were legion, where beating little girls until bloody was a common parental practice, and where girl rape and sex slavery were rampant. This vicious anti-daughter emotional atmosphere — extreme even for a time that was generally cruel and unfeeling towards daughters — was obviously not conducive to little girls producing innovations in childrearing when they grew up to be mothers. Therefore China—which was culturally ahead of the West in many ways at the time of the introduction of foot binding — became culturally and politically "frozen" until the twentieth century, when foot binding was stopped and boy-girl sex ratios in many areas dropped from 200/100 to near equality. The result was that whereas for much of its history China punished all novelty, during the twentieth century rapid cultural, political and economic evolution could resume. Japan, which shared much of Chinese culture but did not adopt foot binding of daughters, avoided the psychogenic arrest of China and could therefore share in the scientific and industrial revolution as it occurred in the West.


    The same kind of epigenetic arrest can be seen in the damage caused by genital mutilation of girls among circum-Mediterranean peoples that began thousands of years ago and continues today. Since "hopeful daughters" do not thrive on the chopping off of their clitorises and labias, the present cultural and political problems of those groups who still mutilate their daughters' genitals are very much a direct result of this psychogenic arrest.’




    http://primal-page.com/ps4.htm


    ‘All the other aspects of modern industrial society are equally results of the new socializing psychoclass childrearing, causing a greater increase in material prosperity in the past two centuries than in all the rest of human history. The reason for this astonishing progress is that science, technology and economic development depend more on investments in parenting than investments in equipment, since they crucially require an "exploring self" constructed from childhood. A few economists realize that the wealth of nations lies in the development of psyches more than in the investment of capital.


    Everett Hagen and Lawrence Harrison, for instance, have demonstrated that those nations furthest behind today in economic development suffer from a severe underinvestment in families and children, not in capital equipment.* The historical record is clear: early pioneers in science and technology first had to overcome their alter projections before they could discover how the world worked. As Keith Thomas puts it: "It was the abandonment of magic which made possible the upsurge of technology, not the other way round." Newton had to stop seeing falling objects "longing to return to Mother Earth" before he could posit a force of gravity.


    * Everett Hagen, The Economics of Development. Rev. Ed. Homewood: R. D. Irwin, 1975; Lawrence E. Harrison, Underdevelopment Is a State of Mind: The Latin American Case. Lanham: Madison Books, 1985, pp. 25, 29.



    Chemists had to give up "alchemical visions of womb-battles between good and evil" inside their flasks before they could observe the real causes of chemical change. Farmers had to be able to empathize with their horses in order to invent the harness collar, which moved the pressure down from their throats to their flanks so that the animals were not choked and could pull heavier loads.* Farmers also had to stop thinking of plowing as "tearing at the breast of Mother Earth" in order to invent the deep plow and change the face of European agriculture. Men had to begin to value their families in order to build wooden floors in their homes rather than leaving them clay as was the practice for millennia. Every invention had its origin in the evolution of the psyche; every exploration of nature was a dimension of the exploration of the self.


    Economic life, too, only evolved as childrearing and the psyche evolved. Tribal societies both in the past and in the present could not trust, because parents were untrustworthy, so they could not allow much wealth or surplus out of which they could create economic progress. Ownership was felt to be dangerous selfishness, envy ran rampant and ambition was feared: "The anthropologist may see people behaving with generosity, but this is the result of fear." Those who acquired too much were expected either to engage in gift-exchange and other redistributive rituals or to periodically destroy their surplus in cleansing sacrificial ceremonies. Even the invention of money came from the sacred objects used for sacrifice to deities. [William H. Desmonde, Magic, Myth and Money: The Origin of Money in Religious Ritual. New York: The Free Press of Glencoe, 1962.] "Money is condensed wealth; condensed wealth is condensed guilt…money is filthy because it remains guilt." [Norman O. Brown, Life Against Death: The Psychoanalytical Meaning of History. Middletown: Wesleyan University Press, 1959, p. 266.]


    What held back economic development for so many millennia was that early civilizations were so abusively brought up that they spent most of their energies chasing "ghosts from the nursery" — religious, political and economic domination group-fantasies — rather than joining in together to solve the real tasks of life. The appalling poverty of most people throughout history has been simply an extension of the emotional poverty of the historical family, making real cooperation in society impossible. For instance, slavery was one of the most wasteful, uneconomical systems ever invented, since denying autonomy to one’s fellow workers simply wasted both the slaves’ and the owners’ productivity and inventiveness.


    * Lynn White Jr., Medieval Technology and Social Change. Oxford: Oxford University Press, 1962, p. 57.





    http://www.youtube.com/watch?v=BNLq1Y_OFEQ

    “. . . Progressives and reactionaries are two different parts of the nation, as they go to war, or as they take excessive risks, and put themselves into depressions like the one that we just started and the one in the ’30s . . . the reactionaries just are against freedoms for people and are for siding with the powerful punishing parent.”

    [Lloyd deMause is IEI/EIE? and Stefan Molyneux is SLE-Se? or ILE??]




    http://www.scribd.com/doc/104598456/...-Lloyd-DeMause

    Foot binding was a unique Chinese sexual mutilation practice that was performed on girls of all classes. Like other fetishists, the Chinese were so afraid of the vagina as a dangerous, castrating organ that they could only feel erotic toward the woman's foot—mainly her big toe. As Cheng Kuan-ying described foot binding in the nineteenth century: "When a child is four or five, or seven or eight, parents speak harshly to it, and frighten it with their looks, and oppress it in every conceivable manner so that the bones of its feet may be broken and its flesh may putrefy." [Howard S. Levy, Chinese Footbinding: The History of a Curious Erotic Custom. London: Neville Spearman, n.d., p. 70.] The girl undergoes this extremely painful process for [anywhere] from five to ten years, crying out in pain each night as she hobbles about the house to do her tasks while holding on to the walls for support. [David and Vera Mace, Marriage: East & West. Garden City, N.Y.: Doubleday & Co., 1959, pp. 75-6; Levy, Chinese Footbinding, pp. 52, 82-88.] As the bones became broken and the flesh deteriorated, her foot became a perfect penis-substitute, often losing several toes as they were bent under her foot in order to emphasize the big toe sticking out.

    The penis-toe then became the focus of the man's perversion and of his sexual excitement during intercourse. "It formed an essential prelude to the sex act, and its manipulation excited and stimulated... The ways of grasping the foot in one's palms were both profuse and varied; ascending the heights of ecstasy, the lover transferred the foot from palm to mouth. Play included kissing, sucking, and inserting the foot in the mouth until it filled both cheeks, either nibbling at it or chewing it vigorously, and adoringly placing it against one's cheeks, chest, knees, or virile member." [Levy, Chinese Footbinding, p. 34] Thus even sex with a female could simulate homosexual intercourse for Chinese males.

    Although Chinese literature has many descriptions of the screams of girls whose feet are being crushed,* the sources are silent as to whether the little girl herself fully understood the sexual purpose of her mutilation. Since she shared the family bed with her parents and presumably observed her father playing sexually with her mother's penis-foot, it is likely that the sexual aim of her painful mutilation was apparent to her.


    * A typical example, Ibid., p. 83: "My toes were pointed, my instep bent down,/And though I cried out to Heaven and Earth,/ Mother ignored me as if she were deaf./My nights were spent in pain,/My early mornings in tears;/I spoke to Mother by my bed;/How you worry when I'm ill,/How frightened if I fall!/Now the agony from my feet has penetrated the marrow of my bones,/And I am plunged into despair, but you,/You don't care a bit about me."

  2. #2



    ‘In the case of Iran he did nothing, and the thugs of the Islamic Republic ruthlessly crushed the demonstrations. Ditto Syria. In Libya he was cajoled into intervening. In Egypt he tried to have it both ways, exhorting Egyptian President Hosni Mubarak to leave, then drawing back and recommending an “orderly transition.” The result was a foreign-policy debacle. Not only were Egypt’s elites appalled by what seemed to them a betrayal, but the victors—the Muslim Brotherhood—had nothing to be grateful for. America’s closest Middle Eastern allies—Israel and the Saudis—looked on in amazement.

    “This is what happens when you get caught by surprise,” an anonymous American official told The New York Times in February 2011. “We’ve had endless strategy sessions for the past two years on Mideast peace, on containing Iran. And how many of them factored in the possibility that Egypt moves from stability to turmoil? None.”’



    ‘I was a good loser four years ago. “In the grand scheme of history,” I wrote the day after Barack Obama’s election as president, “four decades is not an especially long time. Yet in that brief period America has gone from the assassination of Martin Luther King Jr. to the apotheosis of Barack Obama. You would not be human if you failed to acknowledge this as a cause for great rejoicing.”

    Despite having been—full disclosure—an adviser to John McCain, I acknowledged his opponent’s remarkable qualities: his soaring oratory, his cool, hard-to-ruffle temperament, and his near faultless campaign organization.

    Yet the question confronting the country nearly four years later is not who was the better candidate four years ago. It is whether the winner has delivered on his promises. And the sad truth is that he has not.

    In his inaugural address, Obama promised “not only to create new jobs, but to lay a new foundation for growth.” He promised to “build the roads and bridges, the electric grids, and digital lines that feed our commerce and bind us together.” He promised to “restore science to its rightful place and wield technology’s wonders to raise health care’s quality and lower its cost.” And he promised to “transform our schools and colleges and universities to meet the demands of a new age.” Unfortunately the president’s scorecard on every single one of those bold pledges is pitiful.

    In an unguarded moment earlier this year, the president commented that the private sector of the economy was “doing fine.” Certainly, the stock market is well up (by 74 percent) relative to the close on Inauguration Day 2009. But the total number of private-sector jobs is still 4.3 million below the January 2008 peak. Meanwhile, since 2008, a staggering 3.6 million Americans have been added to Social Security’s disability insurance program. This is one of many ways unemployment is being concealed.

    In his fiscal year 2010 budget—the first he presented—the president envisaged growth of 3.2 percent in 2010, 4.0 percent in 2011, 4.6 percent in 2012. The actual numbers were 2.4 percent in 2010 and 1.8 percent in 2011; few forecasters now expect it to be much above 2.3 percent this year.’



    ‘According to the 2010 budget, the debt in public hands was supposed to fall in relation to GDP from 67 percent in 2010 to less than 66 percent this year. If only. By the end of this year, according to the Congressional Budget Office (CBO), it will reach 70 percent of GDP. These figures significantly understate the debt problem, however. The ratio that matters is debt to revenue. That number has leapt upward from 165 percent in 2008 to 262 percent this year, according to figures from the International Monetary Fund. Among developed economies, only Ireland and Spain have seen a bigger deterioration.
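    The leap looks less mysterious once the two ratios are connected: debt-to-revenue is simply debt-to-GDP divided by the revenue share of GDP. The round numbers below are illustrative, not the IMF's exact series:

    % Identity linking the two ratios; 0.70 and 0.27 are assumed round figures.
    \[
    \frac{\text{debt}}{\text{revenue}} = \frac{\text{debt}/\text{GDP}}{\text{revenue}/\text{GDP}}, \qquad
    \frac{0.70}{0.27} \approx 2.6 = 260\%.
    \]

    A debt stock that looks moderate against national output looks far more onerous against the slice of output that government can actually tax.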

    Not only did the initial fiscal stimulus fade after the sugar rush of 2009, but the president has done absolutely nothing to close the long-term gap between spending and revenue.

    His much-vaunted health-care reform will not prevent spending on health programs growing from more than 5 percent of GDP today to almost 10 percent in 2037. Add the projected increase in the costs of Social Security and you are looking at a total bill of 16 percent of GDP 25 years from now. That is only slightly less than the average cost of all federal programs and activities, apart from net interest payments, over the past 40 years. Under this president’s policies, the debt is on course to approach 200 percent of GDP in 2037—a mountain of debt that is bound to reduce growth even further.

    And even that figure understates the real debt burden. The most recent estimate for the difference between the net present value of federal government liabilities and the net present value of future federal revenues—what economist Larry Kotlikoff calls the true “fiscal gap”—is $222 trillion.
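    Kotlikoff's fiscal gap is an infinite-horizon present-value measure; the schematic form below uses my notation, not Kotlikoff's published one:

    % D_0 = outstanding debt, G_t = projected spending, T_t = projected revenue,
    % r = discount rate (all notation assumed for illustration).
    \[
    \text{FG} = D_0 + \sum_{t=1}^{\infty} \frac{G_t - T_t}{(1+r)^{t}}
    \]

    Because the horizon is unbounded, even modest projected annual imbalances discount to an enormous stock figure, which is how such an estimate can dwarf the official debt.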

    The president’s supporters will, of course, say that the poor performance of the economy can’t be blamed on him. They would rather finger his predecessor, or the economists he picked to advise him, or Wall Street, or Europe—anyone but the man in the White House.

    There’s some truth in this. It was pretty hard to foresee what was going to happen to the economy in the years after 2008. Yet surely we can legitimately blame the president for the political mistakes of the past four years. After all, it’s the president’s job to run the executive branch effectively—to lead the nation. And here is where his failure has been greatest.

    On paper it looked like an economics dream team: Larry Summers, Christina Romer, and Austan Goolsbee, not to mention Peter Orszag, Tim Geithner, and Paul Volcker. The inside story, however, is that the president was wholly unable to manage the mighty brains—and egos—he had assembled to advise him.

    According to Ron Suskind’s book Confidence Men, Summers told Orszag over dinner in May 2009: “You know, Peter, we’re really home alone ... I mean it. We’re home alone. There’s no adult in charge. Clinton would never have made these mistakes [of indecisiveness on key economic issues].” On issue after issue, according to Suskind, Summers overruled the president. “You can’t just march in and make that argument and then have him make a decision,” Summers told Orszag, “because he doesn’t know what he’s deciding.” (I have heard similar things said off the record by key participants in the president’s interminable “seminar” on Afghanistan policy.)

    This problem extended beyond the White House. After the imperial presidency of the Bush era, there was something more like parliamentary government in the first two years of Obama’s administration. The president proposed; Congress disposed. It was Nancy Pelosi and her cohorts who wrote the stimulus bill and made sure it was stuffed full of political pork. And it was the Democrats in Congress—led by Christopher Dodd and Barney Frank—who devised the 2,319-page Wall Street Reform and Consumer Protection Act (Dodd-Frank, for short), a near-perfect example of excessive complexity in regulation. The act requires that regulators create 243 rules, conduct 67 studies, and issue 22 periodic reports. It eliminates one regulator and creates two new ones.’



    ‘And then there was health care. No one seriously doubts that the U.S. system needed to be reformed. But the Patient Protection and Affordable Care Act (ACA) of 2010 did nothing to address the core defects of the system: the long-run explosion of Medicare costs as the baby boomers retire, the “fee for service” model that drives health-care inflation, the link from employment to insurance that explains why so many Americans lack coverage, and the excessive costs of the liability insurance that our doctors need to protect them from our lawyers.

    Ironically, the core Obamacare concept of the “individual mandate” (requiring all Americans to buy insurance or face a fine) was something the president himself had opposed when vying with Hillary Clinton for the Democratic nomination. A much more accurate term would be “Pelosicare,” since it was she who really forced the bill through Congress.

    Pelosicare was not only a political disaster. Polls consistently showed that only a minority of the public liked the ACA, and it was the main reason why Republicans regained control of the House in 2010. It was also another fiscal snafu. The president pledged that health-care reform would not add a cent to the deficit. But the CBO and the Joint Committee on Taxation now estimate that the insurance-coverage provisions of the ACA will have a net cost of close to $1.2 trillion over the 2012–22 period.’



    ‘The failures of leadership on economic and fiscal policy over the past four years have had geopolitical consequences. The World Bank expects the U.S. to grow by just 2 percent in 2012. China will grow four times faster than that; India three times faster. By 2017, the International Monetary Fund predicts, the GDP of China will overtake that of the United States.’





    ‘ . . . one thing is clear. Ryan psychs Obama out. This has been apparent ever since the White House went on the offensive against Ryan in the spring of last year. And the reason he psychs him out is that, unlike Obama, Ryan has a plan—as opposed to a narrative—for this country.

    Mitt Romney is not the best candidate for the presidency I can imagine. But he was clearly the best of the Republican contenders for the nomination. He brings to the presidency precisely the kind of experience—both in the business world and in executive office—that Barack Obama manifestly lacked four years ago. (If only Obama had worked at Bain Capital for a few years, instead of as a community organizer in Chicago, he might understand exactly why the private sector is not “doing fine” right now.) And by picking Ryan as his running mate, Romney has given the first real sign that—unlike Obama—he is a courageous leader who will not duck the challenges America faces.

    The voters now face a stark choice. They can let Barack Obama’s rambling, solipsistic narrative continue until they find themselves living in some American version of Europe, with low growth, high unemployment, even higher debt—and real geopolitical decline.

    Or they can opt for real change: the kind of change that will end four years of economic underperformance, stop the terrifying accumulation of debt, and reestablish a secure fiscal foundation for American national security.

    I’ve said it before: it’s a choice between les États-Unis and the Republic of the Battle Hymn.’




    - from Civilization: The West and the Rest by Niall Ferguson; pp. xv-xxvii: The principal question addressed by this book increasingly seems to me the most interesting question a historian of the modern era can ask. Just why, beginning around 1500, did a few small polities on the western end of the Eurasian landmass come to dominate the rest of the world, including the more populous and in many ways more sophisticated societies of Eastern Eurasia? My subsidiary question is this: if we can come up with a good explanation for the West’s past ascendancy, can we then offer a prognosis for its future? Is this really the end of the West’s world and the advent of a new Eastern epoch? Put differently, are we witnessing the waning of an age when the greater part of humanity was more or less subordinated to the civilization that arose in Western Europe in the wake of the Renaissance and Reformation — the civilization that, propelled by the Scientific Revolution and the Enlightenment, spread across the Atlantic and as far as the Antipodes, finally reaching its apogee during the Ages of Revolution, Industry and Empire?

    The very fact that I want to pose such questions says something about the first decade of the twenty-first century. Born and raised in Scotland, educated at Glasgow Academy and Oxford University, I assumed throughout my twenties and thirties that I would spend my academic career at either Oxford or Cambridge. I first began to think of moving to the United States because an eminent benefactor of New York University’s Stern School of Business, the Wall Street veteran Henry Kaufman, had asked me why someone interested in the history of money and power did not come to where the money and power actually were. And where else could that be but downtown Manhattan? As the new millennium dawned, the New York Stock Exchange was self-evidently the hub of an immense global economic network that was American in design and largely American in ownership. The dotcom bubble was deflating, admittedly, and a nasty little recession ensured that the Democrats lost the White House just as their pledge to pay off the national debt began to sound almost plausible. But within just eight months of becoming president, George W. Bush was confronted by an event that emphatically underlined the centrality of Manhattan to the Western-dominated world. The destruction of the World Trade Center by al-Qaeda terrorists paid New York a hideous compliment. This was target number one for anyone serious about challenging Western predominance.


    The subsequent events were heady with hubris. The Taliban overthrown in Afghanistan. An ‘axis of evil’ branded ripe for ‘regime change’. Saddam Hussein ousted in Iraq. The Toxic Texan riding high in the polls, on track for re-election. The US economy bouncing back thanks to tax cuts. ‘Old Europe’—not to mention liberal America—fuming impotently. Fascinated, I found myself reading and writing more and more about empires, in particular the lessons of Britain’s for America’s; the result was Empire: How Britain Made the Modern World (2003). As I reflected on the rise, reign and probable fall of America’s empire, it became clear to me that there were three fatal deficits at the heart of American power: a manpower deficit (not enough boots on the ground in Afghanistan and Iraq), an attention deficit (not enough public enthusiasm for long-term occupation of conquered countries) and above all a financial deficit (not enough savings relative to investment and not enough taxation relative to public expenditure).


    In Colossus: The Rise and Fall of America’s Empire (2004), I warned that the United States had imperceptibly come to rely on East Asian capital to fund its unbalanced current and fiscal accounts. The decline and fall of America’s undeclared empire might therefore be due not to terrorists at the gates, nor to the rogue regimes that sponsored them, but to a financial crisis at the very heart of the empire itself. When, in late 2006, Moritz Schularick and I coined the word ‘Chimerica’ to describe what we saw as the dangerously unsustainable relationship — the word was a pun on ‘chimera’ — between parsimonious China and profligate America, we had identified one of the keys to the coming global financial crisis. For without the availability to the American consumer of both cheap Chinese labour and cheap Chinese capital, the bubble of the years 2002-7 would not have been so egregious.

    The illusion of American ‘hyper-power’ was shattered not once but twice during the presidency of George W. Bush. Nemesis came first in the backstreets of Sadr City and the fields of Helmand, which exposed not only the limits of American military might but also, more importantly, the naivety of neo-conservative visions of a democratic wave in the Greater Middle East. It struck a second time with the escalation of the subprime mortgage crisis of 2007 into the credit crunch of 2008 and finally the ‘great recession’ of 2009. After the bankruptcy of Lehman Brothers, the sham verities of the ‘Washington Consensus’ and the ‘Great Moderation’ — the central bankers’ equivalent of the ‘End of History’ — were consigned to oblivion. A second Great Depression for a time seemed terrifyingly possible. What had gone wrong? In a series of articles and lectures beginning in mid-2006 and culminating in the publication of The Ascent of Money in November 2008 — when the financial crisis was at its worst — I argued that all the major components of the international financial system had been disastrously weakened by excessive short-term indebtedness on the balance sheets of banks, grossly mispriced and literally overrated mortgage-backed securities and other structured financial products, excessively lax monetary policy on the part of the Federal Reserve, a politically engineered housing bubble and, finally, the unrestrained selling of bogus insurance policies (known as derivatives), offering fake protection against unknowable uncertainties, as opposed to quantifiable risks. The globalization of financial institutions that were of Western origin had been supposed to usher in a new era of reduced economic volatility. It took historical knowledge to foresee how an old-fashioned liquidity crisis might bring the whole shaky edifice of leveraged financial engineering crashing to the ground.


    The danger of a second Depression receded after the summer of 2009, though it did not

    altogether disappear. But the world had nevertheless changed. The breathtaking collapse in

    global trade caused by the financial crisis, as credit to finance imports and exports suddenly

    dried up, might have been expected to devastate the big Asian economies, reliant as they were

    said to be on exports to the West. Thanks to a highly effective government stimulus

    programme based on massive credit expansion, however, China suffered only a slow-down in

    growth. This was a remarkable feat that few experts had anticipated. Despite the manifest

    difficulties of running a continental economy of 1.3 billion people as if it were a giant Singapore,

    the probability remains better than even at the time of writing (December 2010) that China will

    continue to forge ahead with its industrial revolution and that, within the decade, it will

    overtake the United States in terms of gross domestic product, just as (in 1963) Japan overtook

    the United Kingdom.


    The West had patently enjoyed a real and sustained edge over the Rest for most of the

    previous 500 years. The gap between Western and Chinese incomes had begun to open up as

    long ago as the 1600s and had continued to widen until as recently as the late 1970s, if not

    later. But since then it had narrowed with astonishing speed. The financial crisis crystallized the

    next historical question I wanted to ask. Had that Western edge now gone? Only by working

    out what exactly it had consisted of could I hope to come up with an answer.



    What follows is concerned with historical methodology . . . I wrote this book because I had

    formed the strong impression that the people currently living were paying insufficient attention

    to the dead. Watching my three children grow up, I had the uneasy feeling that they were

    learning less history than I had learned at their age, not because they had bad teachers but

    because they had bad history books and even worse examinations. Watching the financial crisis

    unfold, I realized that they were far from alone, for it seemed as if only a handful of people in

    the banks and treasuries of the Western world had more than the sketchiest information about

    the last Depression. For roughly thirty years, young people at Western schools and universities

    have been given the idea of a liberal education, without the substance of historical knowledge.

    They have been taught isolated ‘modules’, not narratives, much less chronologies. They have

    been trained in the formulaic analysis of document excerpts, not in the key skill of reading

    widely and fast. They have been encouraged to feel empathy with imagined Roman centurions

    or Holocaust victims, not to write essays about why and how their predicaments arose. In

    The History Boys, the playwright Alan Bennett posed a ‘trilemma’: should history be

    taught as a mode of contrarian argumentation, a communion with past Truth and Beauty, or

    just ‘one fucking thing after another’? He was evidently unaware that today’s sixth-formers are

    offered none of the above — at best, they get a handful of ‘fucking things’ in no particular

    order.


    The former president of the university where I teach once confessed that, when he had been an

    undergraduate at the Massachusetts Institute of Technology, his mother had implored him to

    take at least one history course. The brilliant young economist replied cockily that he was more

    interested in the future than in the past. It is a preference he now knows to be illusory. There

    is in fact no such thing as the future, singular; only futures, plural. There are multiple

    interpretations of history, to be sure, none definitive—but there is only one past. And although

    the past is over, for two reasons it is indispensable to our understanding of what we experience

    today and what lies ahead of us tomorrow and thereafter. First, the current world population

    makes up approximately 7 per cent of all the human beings who have ever lived. The dead

    outnumber the living, in other words, fourteen to one, and we ignore the accumulated

    experience of such a huge majority of mankind at our peril. Second, the past is really our only

reliable source of knowledge about the fleeting present and the multiple futures that lie

    before us, only one of which will actually happen. History is not just how we study the past; it is

    how we study time itself.
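
    The 'fourteen to one' figure follows directly from the 7 per cent estimate; here is a minimal
    sketch of the arithmetic, using only the share quoted above:

```python
# Sketch: if the living are ~7 per cent of all humans who have ever lived
# (the estimate quoted above), the dead-to-living ratio follows directly.
living_share = 0.07
dead_share = 1.0 - living_share            # ~93 per cent
ratio = dead_share / living_share          # ~13.3, rounded up to 'fourteen to one'
print(f"dead to living: about {ratio:.1f} to 1")
```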

    Let us first acknowledge the subject’s limitations. Historians are not scientists. They cannot

    (and should not even try to) establish universal laws of social or political ‘physics’ with reliable

    predictive powers. Why? Because there is no possibility of repeating the single, multi-

    millennium experiment that constitutes the past. The sample size of human history is one.

    Moreover, the ‘particles’ in this one vast experiment have consciousness, which is skewed by all

    kinds of cognitive biases. This means that their behaviour is even harder to predict than if they

    were insensate, mindless, gyrating particles. Among the many quirks of the human condition is

    that people have evolved to learn almost instinctively from their own past experience. So their

    behaviour is adaptive; it changes over time. We do not wander randomly but walk in paths, and

    what we have encountered behind us determines the direction we choose when the paths fork

    — as they constantly do.


    So what can historians do? First, by mimicking social scientists and relying on quantitative data,

    historians can devise ‘covering laws’, in Carl Hempel’s sense of general statements about the

    past that appear to cover most cases (for instance, when a dictator takes power instead of a

    democratic leader, the chance increases that the country in question will go to war). Or —

    though the two approaches are not mutually exclusive — the historian can commune with the

    dead by imaginatively reconstructing their experiences in the way described by the great

    Oxford philosopher R. G. Collingwood in his 1939 Autobiography. These two modes of

    historical inquiry allow us to turn the surviving relics of the past into history, a body of

    knowledge and interpretation that retrospectively orders and illuminates the human

    predicament. Any serious predictive statement about the possible futures we may experience is

    based, implicitly or explicitly, on one or both of these historical procedures. If not, then it

    belongs in the same category as the horoscope in this morning’s newspaper.


    Collingwood’s ambition, forged in the disillusionment with natural science and psychology that

    followed the carnage of the First World War, was to take history into the modern age, leaving

    behind what he dismissed as ‘scissors-and-paste history’, in which writers ‘only repeat, with

    different arrangements and different styles of decoration, what others [have] said before them’.

    His thought process is itself worth reconstructing:



    a) ‘The past which an historian studies is not a dead past, but a past which in some sense is
    still living in the present’ in the form of traces . . . that have survived.

    b) ‘All history is the history of thought’, in the sense that a piece of historical evidence is
    meaningless if its intended purpose cannot be inferred.

    c) That process of inference requires an imaginative leap through time: ‘Historical
    knowledge is the re-enactment in the historian’s mind of the thought whose history he
    is studying.’

    d) But the real meaning of history comes from the juxtaposition of past and present:
    ‘Historical knowledge is the re-enactment of a past thought incapsulated in a context of
    present thoughts which, by contradicting it, confine it to a plane different from theirs.’

    e) The historian thus ‘may very well be related to the nonhistorian as the trained
    woodsman is to the ignorant traveller. “Nothing here but trees and grass,” thinks the
    traveller, and marches on. “Look,” says the woodsman, “there is a tiger in that grass.”’
    In other words, Collingwood argues, history offers something ‘altogether different from
    [scientific] rules, namely insight’.

    f) The true function of historical insight is ‘to inform [people] about the present, in so far
    as the past, its ostensible subject matter, [is] incapsulated in the present and
    [constitutes] a part of it not at once obvious to the untrained eye’.

    g) As for our choice of subject matter for historical investigation, Collingwood makes it
    clear that there is nothing wrong with what his Cambridge contemporary Herbert
    Butterfield condemned as ‘present-mindedness’: ‘True historical problems arise out of
    practical problems. We study history in order to see more clearly into the situation in
    which we are called upon to act. Hence the plane on which, ultimately, all problems
    arise is the plane of “real” life: that to which they are referred for their solution is
    history.’



    A polymath as skilled in archaeology as he was in philosophy, a staunch opponent of

    appeasement and an early hater of the Daily Mail,* Collingwood has been my guide for

    many years, but never has he been more indispensable than in the writing of this book. For the

    problem of why civilizations fall is too important to be left to the purveyors of scissors-and-

    paste history. It is truly a practical problem of our time, and this book is intended to be a

    woodsman’s guide to it. For there is more than one tiger hidden in this grass.


    * Which he called ‘the first English newspaper for which the word “news” lost its old meaning

    of facts which a reader ought to know . . . and acquired the new meaning of facts, or fictions,

    which it might amuse him to read’.



    In dutifully reconstructing past thought, I have tried always to remember a simple truth about

    the past that the historically inexperienced are prone to forget. Most people in the past either

    died young or expected to die young, and those who did not were repeatedly bereft of those

    they loved, who did die young. Consider the case of my favourite poet, the Jacobean master

    John Donne, who lived to the age of fifty-nine, thirteen years older than I am as I write. A

    lawyer, a Member of Parliament and, after renouncing the Roman Catholic faith, an Anglican

    priest, Donne married for love, as a result losing his job as secretary to his bride’s uncle, Sir

    Thomas Egerton, the Lord Keeper of the Great Seal. [After he was briefly arrested for defying

    her father, she quipped: ‘John Donne — Anne Donne — Un-done.’ No wonder he loved her.]

    In the space of sixteen impecunious years, Anne Donne bore her husband twelve children.

    Three of them, Francis, Nicholas and Mary, died before they were ten. Anne herself died after

    giving birth to the twelfth child, which was stillborn. After his favourite daughter Lucy had died

    and he himself had very nearly followed her to the grave, Donne wrote his Devotions upon

    Emergent Occasions (1624), which contains the greatest of all exhortations to commiserate

    with the dead: ‘Any man’s death diminishes me, because I am involved in

    Mankinde; And therefore never send to know for whom the bell tolls; It tolls for

    thee.’ Three years later, the death of a close friend inspired him to write ‘A Nocturnal

    upon St Lucy’s Day, Being the Shortest Day’:


    Study me then, you who shall lovers be

    At the next world, that is, at the next spring;

    For I am every dead thing,

    In whom Love wrought new alchemy.

    For his art did express

    A quintessence even from nothingness,

    From dull privations, and lean emptiness;

    He ruin’d me, and I am re-begot

    Of absence, darkness, death — things which are not.



    Everyone who wants to understand better the human condition in the days when life
    expectancy was less than half what it is today should read these lines.


    The much greater power of death to cut people off in their prime not only made life seem

    precarious and filled it with grief. It also meant that most of the people who built the

    civilizations of the past were young when they made their contributions. The great Dutch-

    Jewish philosopher Baruch or Benedict Spinoza, who hypothesized that there is only a material

    universe of substance and deterministic causation, and that ‘God’ is that universe’s natural

    order as we dimly apprehend it and nothing more, died in 1677 at the age of forty-four,

    probably from the particles of glass he had inhaled doing his day-job as a lens grinder. Blaise

    Pascal, the pioneer of probability theory and hydrodynamics and the author of the

    Pensées, the greatest of all apologias for the Christian faith, lived to be just thirty-nine; he

    would have died even younger had the road accident that reawakened his spiritual side been

    fatal. Who knows what other great works these geniuses might have brought forth had they

    been granted the lifespans enjoyed by, for example, the great humanists Erasmus (sixty-nine)

    and Montaigne (fifty-nine)? Mozart, composer of the most perfect of all operas, Don

    Giovanni, died when he was just thirty-five. Franz Schubert, composer of the sublime String

    Quintet in C (D956), succumbed, probably to syphilis, at the age of just thirty-one. Prolific

    though they were, what else might they have composed if they had been granted the sixty-three

    years enjoyed by the stolid Johannes Brahms or the even more exceptional seventy-two years

    allowed the ponderous Anton Bruckner? The Scots poet Robert Burns, who wrote the

    supreme expression of egalitarianism, ‘A Man’s a Man for A’ That’, was thirty-seven when he

    died in 1796. What injustice, that the poet who most despised inherited status (‘The rank is but

    the guinea’s stamp, / The Man’s the gowd [gold] for a’ that’) should have been so much outlived

    by the poet who most revered it: Alfred, Lord Tennyson, who died bedecked with honours at

    the age of eighty-three. Palgrave’s Golden Treasury would be the better for more Burns

    and less Tennyson. And how different would the art galleries of the world be today if the

    painstaking Jan Vermeer had lived to be ninety-one and the over-prolific Pablo Picasso had died

    at thirty-nine, instead of the other way around?


    Politics, too, is an art — as much a part of our civilization as philosophy, opera, poetry or

    painting. But the greatest political artist in American history, Abraham Lincoln, served only one

    full term in the White House, falling victim to an assassin with a petty grudge just six weeks

    after his second inaugural address. He was fifty-six. How different would the era of

    Reconstruction have been had this self-made titan, born in a log cabin, the author of the

    majestic Gettysburg Address—which redefined the United States as ‘a nation, conceived in

    liberty, and dedicated to the proposition that all men are created equal’, with a ‘government of

    the people, by the people, for the people’ — lived as long as the polo-playing then polio-

    stricken grandee Franklin Delano Roosevelt, whom medical science kept alive long enough to

    serve three full terms and the start of a fourth as president before his death at sixty-three?

    Because our lives are so very different from the lives of most people in the past, not least in

    their probable duration, but also in our greater degree of physical comfort, we must exercise

    our imaginations quite vigorously to understand the men and women of the past. In his

    Theory of Moral Sentiments, written nearly two centuries before Collingwood’s memoir, the

    great economist and social theorist Adam Smith defined why a civilized society is not a war of

    all against all — because it is based on sympathy:


    As we have no immediate experience of what other men feel, we can form no idea of the

    manner in which they are affected, but by conceiving what we ourselves should feel in the like

    situation. Though our brother is on the rack, as long as we ourselves are at our ease, our

    senses will never inform us of what he suffers. They never did, and never can, carry us beyond

    our own person, and it is by the imagination only that we can form any conception of what are

    his sensations. Neither can that faculty help us to this any other way, than by representing to us

    what would be our own, if we were in his case. It is the impressions of our own senses only,

    not those of his, which our imaginations copy. By the imagination, we place ourselves in his

    situation.



    This, of course, is precisely what Collingwood says the historian should do, and it is what I want

    the reader to do as she encounters in these pages the resurrected thoughts of the dead. The

    key point of the book is to understand what made their civilization expand so spectacularly in

    its wealth, influence and power. But there can be no understanding without that sympathy

    which puts us, through an act of imagination, in their situation. That act will be all the more

    difficult when we come to resurrect the thoughts of the denizens of other civilizations — the

    ones the West subjugated or, at least, subordinated to itself. For they are equally important

    members of the drama’s cast. This is not a history of the West but a history of the world, in

    which Western dominance is the phenomenon to be explained.


    In an encyclopaedia entry he wrote in 1959, the French historian Fernand Braudel defined a

    civilization as:


    first of all a space, a ‘cultural area’ . . . a locus. With the locus . . . you must picture a great

    variety of ‘goods’, of cultural characteristics, ranging from the form of its houses, the material of

    which they are built, their roofing, to skills like feathering arrows, to a dialect or group of

    dialects, to tastes in cooking, to a particular technology, a structure of beliefs, a way of making

    love, and even to the compass, paper, the printing press. It is the regular grouping, the

    frequency with which particular characteristics recur, their ubiquity within a precise area

    [combined with] . . . some sort of temporal permanence . . .



    Braudel was better at delineating structures than explaining change, however. These days, it is

    often said that historians should tell stories; accordingly, this book offers a big story—a meta-

    narrative of why one civilization transcended the constraints that had bound all previous ones

    — and a great many smaller tales or micro-histories within it. Nevertheless the revival of the

    art of narrative is only part of what is needed. In addition to stories, it is also important that

    there be questions. ‘Why did the West come to dominate the Rest?’ is a question that demands

    something more than a just-so story in response. The answer needs to be analytical, it needs to

    be supported by evidence and it needs to be testable by means of the counterfactual question: if

    the crucial innovations I identify here had not existed, would the West have ruled the Rest

    anyway for some other reason that I have missed or under-emphasized? Or would the world

    have turned out quite differently, with China on top, or some other civilization? We should not

    delude ourselves into thinking that our historical narratives, as commonly constructed, are

    anything more than retro-fits. To contemporaries, as we shall see, the outcome of Western

    dominance did not seem the most probable of the futures they could imagine; the scenario of

    disastrous defeat often loomed larger in the mind of the historical actor than the happy ending

    vouchsafed to the modern reader. The reality of history as a lived experience is that it is much

    more like a chess match than a novel, much more like a football game than a play.


    It wasn’t all good. No serious writer would claim that the reign of Western civilization was

    unblemished. Yet there are those who would insist that there was nothing whatever good

    about it. This position is absurd. As is true of all great civilizations, that of the West was Janus-

    faced: capable of nobility yet also capable of turpitude. Perhaps a better analogy is that the

    West resembled the two feuding brothers in James Hogg’s Private Memoirs and Confessions

    of a Justified Sinner (1824) or in Robert Louis Stevenson’s Master of Ballantrae (1889).

    Competition and monopoly; science and superstition; freedom and slavery; curing and killing;

    hard work and laziness — in each case, the West was father to both the good and the bad. It

    was just that, as in Hogg’s or Stevenson’s novel, the better of the two brothers ultimately came

    out on top. We must also resist the temptation to romanticize history’s losers. The other

    civilizations overrun by the West’s, or more peacefully transformed by it through borrowings

    as much as through impositions, were not without their defects either, of which the most

    obvious is that they were incapable of providing their inhabitants with any sustained

    improvement in the material quality of their lives. One difficulty is that we cannot always

    reconstruct the past thoughts of these non-Western peoples, for not all of them existed in

    civilizations with the means of recording and preserving thought. In the end, history is primarily

    the study of civilizations, because without written records the historian is thrown back on

    spearheads and pot fragments, from which much less can be inferred.

    The French historian and statesman Francois Guizot said that the history of civilization is ‘the

    biggest of all . . . it comprises all the others’. It must transcend the multiple disciplinary

    boundaries erected by academics, with their compulsion to specialize, between economic,

    social, cultural, intellectual, political, military and international history. It must cover a great deal

    of time and space, because civilizations are not small or ephemeral. But a book like this cannot

    be an encyclopaedia. To those who will complain about what has been omitted, I can do no

    more than quote the idiosyncratic jazz pianist Thelonious Monk: ‘Don’t play everything (or

    every time); let some things go by . . . What you don’t play can be more important than what

    you do.’ I agree. Many notes and chords have been omitted below. But they have been left out

    for a reason. Does the selection reflect the biases of a middle-aged Scotsman, the archetypal

    beneficiary of Western predominance? Very likely. But I cherish the hope that the selection will

    not be disapproved of by the most ardent and eloquent defenders of Western values today,

    whose ethnic origins are very different from mine — from Amartya Sen to Liu Xiaobo, from

    Hernando de Soto to the dedicatee of this book.






    - pp. 196-198: What we must do is to transform our Empire and our people, make the empire

    like the countries of Europe and our people like the peoples of Europe.—Inoue Kaoru


    Will the West, which takes its great invention, democracy, more seriously than the Word of

    God, come out against this coup that has brought an end to democracy in Kars? . . . Or are we

    to conclude that democracy, freedom and human rights don’t matter, that all the West wants is

    for the rest of the world to imitate it like monkeys? Can the West endure any democracy

    achieved by enemies who in no way resemble them? — Orhan Pamuk


    THE BIRTH OF THE CONSUMER SOCIETY


    In 1909, inspired by a visit to Japan, the French-Jewish banker and philanthropist Albert Kahn*

    set out to create an album of colour photographs of people from every corner of the world.

    The aim, Kahn said, was ‘To put into effect a sort of photographic inventory of the surface of

    the globe as inhabited and developed by Man at the beginning of the twentieth century.’

    Created with the newly invented autochrome process, the 72,000 photographs and 100 hours

    of film in Kahn’s ‘archives of the planet’ show a dazzling variety of costumes and fashions from

    more than fifty different countries: dirt-poor peasants in the Gaeltacht, dishevelled conscripts in

    Bulgaria, forbidding chieftains in Arabia, stark-naked warriors in Dahomey, garlanded maharajas

    in India, come-hither priestesses in Indo-China and strangely stolid-looking cowboys in the Wild

    West. [Okuefuna, Wonderful World of Albert Kahn.] In those days, to an extent that

    seems astonishing today, we were what we wore.

    * Kahn, a pupil of the philosopher Henri Bergson, was ruined by the Depression, bringing his

    grand photographic project to an end. A selection of the images can be viewed at

    http://www.albertkahn.co.uk/photos.html


    Today, a century later, Kahn’s project would be more or less pointless, because these days

    most people around the world dress in much the same way: the same jeans, the same sneakers,

    the same T-shirts. There are just a very few places where people hold out against the giant

    sartorial blending machine. One of them is rural Peru. In the mountains of the Andes, the

    Quechua women still wear their brightly coloured dresses and shawls and their little felt hats,

    pinned at jaunty angles and decorated with their tribal insignia. Except that these are not

    traditional Quechua clothes at all. The dresses, shawls and hats are in fact of Andalusian origin

    and were imposed by the Spanish Viceroy Francisco de Toledo in 1572, in the wake of Tupac

    Amaru’s defeat. Authentically traditional Andean female attire consisted of a tunic (the

    anacu), secured at the waist by a sash (the chumpi), over which was worn a mantle

    (the lliclla), which was fastened with a tupu pin. What Quechua women wear

    nowadays is a combination of these earlier garments with the clothes they were ordered to

    wear by their Spanish masters. The bowler hats popular among Bolivian women came later,

    when British workers arrived to build that country’s first railways. The current fashion among

    Andean men for American casual clothing is thus merely the latest chapter in a long history of

    sartorial Westernization.

    What is it about our clothes that other people seem unable to resist? Is dressing like us about

    wanting to be like us? Clearly, this is about more than just clothes. It is about embracing an

    entire popular culture that extends through music and movies, to say nothing of soft drinks and

    fast food. That popular culture carries with it a subtle message. It is about freedom — the right

    to dress or drink or eat as you please (even if that turns out to be like everybody else). It is

    about democracy — because only those consumer products that people really like get made.

    And, of course, it is about capitalism — because corporations have to make a profit by selling

    the stuff. But clothing is at the heart of the process of Westernization for one very simple

    reason. That great economic transformation which historians long ago named the Industrial

    Revolution — that quantum leap in material standards of living for a rising share of humanity —

    had its origins in the manufacture of textiles. It was partly a miracle of mass production brought

    about by a wave of technological innovation, which had its origin in the earlier Scientific

    Revolution. But the Industrial Revolution would not have begun in Britain and spread to the

    rest of the West without the simultaneous development of a dynamic consumer society,

    characterized by an almost infinitely elastic demand for cheap clothes. The magic of

    industrialization, though it was something contemporary critics generally overlooked, was that

    the worker was at one and the same time a consumer. The ‘wage slave’ also went shopping; the

    lowliest proletarian had more than one shirt, and aspired to have more than two.


    The consumer society is so all-pervasive today that it is easy to assume it has always existed.

    Yet in reality it is one of the more recent innovations that propelled the West ahead of the

    Rest. Its most striking characteristic is its seemingly irresistible appeal. Unlike modern medicine,

    which was often imposed by force on Western colonies, the consumer society is a killer

    application the rest of the world has generally yearned to download. Even those social orders

    explicitly intended to be anti-capitalist — most obviously the various derivatives of the doctrine

    of Karl Marx — have been unable to exclude it. The result is one of the greatest paradoxes of

    modern history: that an economic system designed to offer infinite choice to the individual has

    ended up homogenizing humanity.




    - pp. 227-231: The First World War . . . was a struggle between empires whose motives and

    methods had been honed overseas. It toppled four dynasties and shattered their empires. The

    American President Woodrow Wilson — the first of four Democratic holders of the office to

    embroil their country in a major overseas war — sought to recast the conflict as a war for

    national self-determination, a view that was never likely to be endorsed by the British and

    French empires, whose flagging war effort had been salvaged by American money and men.

    Czechs, Estonians, Georgians, Hungarians, Lithuanians, Latvians, Poles, Slovaks and Ukrainians

    were not the only ones who scented freedom; so did Arabs and Bengalis, to say nothing of the

    Catholic Irish. Aside from Ireland and Finland, not one of the nation-states that emerged

    in the wake of the war retained meaningful independence by the end of 1939 (except possibly

    Hungary). The Mazzinian map of Europe appeared and then vanished like a flash in the pan.



    The alternative post-war vision of Vladimir Ilyich Lenin was of a Union of Soviet Socialist

    Republics, potentially expanding right across Eurasia. This gained its traction from the

    exceptional economic circumstances of the war. Because all governments financed the fighting

    to some degree by issuing short-term debt and exchanging it for cash at their central banks —

    printing money, in short — inflation gathered momentum during the war. Because so many men

    were under arms, labour shortages empowered the workers on the Home Front to push for

    higher wages. By 1917 hundreds of thousands of workers were involved in strikes in France,

    Germany and Russia. First Spanish influenza, then Russian Bolshevism, swept the world. As in

    1848, urban order broke down, only this time the contagion spread as far as Buenos Aires and

    Bengal, Seattle and Shanghai. Yet the proletarian revolution failed everywhere but in the Russian

    Empire, which was reassembled by the Bolsheviks in the wake of a brutal civil war. No other

    socialist leaders were as ruthless as Lenin in adopting ‘democratic centralism’ (which was the

    opposite of democratic), rejecting parliamentarism and engaging in terrorism against

    opponents. Some of what the Bolsheviks did (the nationalization of banks, the confiscation of

    land) was straight out of Marx and Engels’s Manifesto. Some of what they did (‘the greatest

    ferocity and savagery of suppression . . . seas of blood’) owed more to Robespierre. The

    ‘dictatorship of the proletariat’—which in fact meant the dictatorship of the Bolshevik

    leadership—was Lenin’s original contribution. This was even worse than the resurrection of

    Bazarov, the nihilist in Ivan Turgenev’s Fathers and Sons (1862). It was what his estranged

    friend Fyodor Dostoevsky had warned Russia about in the epilogue of Crime and

    Punishment (1866) — the murderer Raskolnikov’s nightmare of a ‘terrible, unprecedented

    and unparalleled plague’ from Asia:


    Those infected were seized immediately and went mad. Yet people never considered

    themselves so clever and unhesitatingly right as these infected ones considered themselves.

    Never had they considered their decrees, their scientific deductions, their moral convictions

    and their beliefs more firmly based. Whole settlements, whole cities and nations, were infected

    and went mad . . . People killed each other with senseless rage . . . soldiers flung themselves

    upon each other, slashed and stabbed, ate and devoured each other.



    To the east there was almost no stopping the Bolshevik epidemic. To the west it could not get

    over the Vistula, nor south of the Caucasus, thanks to a gifted trio of political entrepreneurs

    who devised that synthesis of nationalism and socialism which was the true manifestation of the

    Zeitgeist: Jozef Pilsudski in Poland, Kemal Ataturk in Turkey and Benito Mussolini in Italy.

    The defeat of the Red Army outside Warsaw (August 1920), the expulsion of the Anatolian

    Greeks (September 1922) and the fascist March on Rome (October 1922) marked the advent

    of a new era—and a new look.

    With the exception of Mussolini, who wore a three-piece suit with a winged collar and spats,

    most of those who participated in the publicity stunt that was the March on Rome were in

    makeshift uniforms composed of black shirts, jodhpurs and knee-high leather riding boots. The

    idea was that the manly, martial virtues of the Great War would now be carried over into

    peacetime, beginning with a smaller war fought in the streets and fields against the left.

    Uniformity was the order of the day — but a uniformity of dress without the tedious discipline

    of a real army. Even the famous March was more of a stroll, as the many press photographs

    make clear. It had been the Italian nationalist Giuseppe Garibaldi who had first used red-

    coloured shirts as the basis for a political movement. By the 1920s dyed tops were mandatory

    on the right; the Italian fascists opted for black while, as we have seen, the German National

    Socialist Sturmabteilung adopted colonial brown.

    Such movements might have dissolved into ill-tailored obscurity had it not been for the Great

    Depression. After the inflation of the early 1920s, the deflation of the early 1930s dealt a lethal

    blow to the Wilsonian vision of a Europe based on national identity and democracy. The crisis

    of American capitalism saw the stock market slump by 89 per cent, output drop by a third,

    consumer prices fall by a quarter and the unemployment rate pass a quarter. Not all European

    countries were so severely affected, but none was unscathed. As governments scrambled to

    protect their own industries with higher tariffs—the American Smoot-Hawley tariff bill raised

    the effective ad valorem rate on imported cotton manufactures to 46 per cent —

    globalization simply broke down. Between 1929 and 1932 world trade shrank by two-thirds.

    Most countries adopted some combination of debt default, currency depreciation, protectionist

    tariffs, import quotas and prohibitions, import monopolies and export premia. The day had

    dawned, it seems, of the nationalist-socialist state.


    This was an illusion. Though the US economy seemed to be imploding, the principal cause was

    the disastrous monetary policy adopted by the Federal Reserve Board, which half wrecked the

    banking system. [Friedman and Schwartz, Monetary History of the United States.]

    Innovation, the mainspring of industrial advance, did not slacken in the 1930s. New

    automobiles, radios and other consumer durables were proliferating. New companies were

    developing these products, like DuPont (nylon), Revlon (cosmetics), Procter & Gamble (Dreft

    soap powder), RCA (radio and television) and IBM (accounting machines); they were also

    evolving and disseminating a whole new style of business management. Nowhere was the

    creativity of capitalism more marvellous to behold than in Hollywood, home of the motion-

    picture industry. In 1931—when the US economy was in the grip of blind panic—the big studios

    released Charlie Chaplin’s City Lights, Howard Hughes’s The Front Page and the

    Marx Brothers’ Monkey Business. The previous decade’s experiment with the Prohibition

    of alcohol had been a disastrous failure, spawning a whole new economy of organized crime.

    But it was only more grist for the movie mills. Also in 1931, audiences flocked to see James

    Cagney and Edward G. Robinson in the two greatest gangster films of them all: The Public

    Enemy and Little Caesar. No less creative was the live, recorded and broadcast music

    business, once white Americans had discovered that black Americans had nearly all the best

    tunes. Jazz approached its zenith in the swinging sound of Duke Ellington’s big band, which

    rolled out hit after hit even as the automobile-production lines ground to a halt: ‘Mood Indigo’

    (1930), ‘Creole Rhapsody’ (1931), ‘It Don’t Mean a Thing (If It Ain’t Got That Swing)’ (1932),

    ‘Sophisticated Lady’ (1933) and ‘Solitude’ (1934). The grandson of a slave, Ellington took reed

    and brass instruments where they had never been before, mimicking everything from spirituals

    to the New York subway. His band’s long residence at the Cotton Club was at the very heart

    of the Harlem Renaissance. And of course, as his aristocratic nickname required, Ellington was

    always immaculately dressed — courtesy of Anderson & Sheppard of Savile Row.


    In short, capitalism was not fatally flawed, much less dead. It was merely a victim of bad

    management, and the uncertainty that followed from it. The cleverest economist of the age,

    John Maynard Keynes, sneered at the stock exchange as a ‘casino’, comparing investors’

    decisions to a newspaper beauty contest. President Franklin D. Roosevelt — elected just as the

    Depression was ending — inveighed against ‘the unscrupulous money changers’. The real

    culprits were the central bankers who had first inflated a stock-exchange bubble with

    excessively lax monetary policy and had then proceeded to tighten (or failed adequately to

    loosen) after the bubble had burst. Between 1929 and 1933, nearly 15,000 US banks — two-

    fifths of the total — failed. As a result, the money supply was savagely reduced. With prices

    collapsing by a third from peak to trough, real interest rates rose above 10 per cent, crushing

    any indebted institution or household. Keynes summed up the negative effects of deflation:


    Modern business, being carried on largely with borrowed money, must necessarily be

    brought to a standstill by such a process. It will be to the interest of everyone in business to go

    out of business for the time being; and of everyone who is contemplating expenditure to

    postpone his orders so long as he can. The wise man will be he who turns his assets into cash,

    withdraws from the risks and the exertions of activity, and awaits in country retirement the

    steady appreciation promised him in the value of his cash. A probable expectation of Deflation

    is bad.
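
    The claim above that real interest rates rose past 10 per cent while prices fell by a third is an
    application of the Fisher relation (real rate ≈ nominal rate minus inflation). A minimal sketch of
    the arithmetic follows; the 2 per cent nominal rate is an illustrative assumption, not a figure
    from the text:

```python
# Sketch: Fisher-relation arithmetic behind Depression-era real interest rates.
# Prices fell by roughly a third between 1929 and 1933 (figures from the text);
# the 2% nominal rate is an illustrative assumption.
price_change = -1 / 3                                        # peak-to-trough fall in prices
years = 4                                                    # 1929-1933
annual_inflation = (1 + price_change) ** (1 / years) - 1     # ~ -9.6% a year
nominal_rate = 0.02
real_rate = (1 + nominal_rate) / (1 + annual_inflation) - 1  # exact Fisher relation
print(f"implied deflation: {annual_inflation:.1%} a year")
print(f"real interest rate: {real_rate:.1%}")                # comfortably above 10%
```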



    How to escape from the deflation trap? With trade slumping and capital imports frozen,

    Keynes’s recommendation — government spending on public works, financed by borrowing —

    made sense. It also helped to abandon the gold standard, whereby currencies had fixed dollar

    exchange rates, to let depreciation provide a boost to exports (though increasingly trade went

    on within regional blocs) and to allow interest rates to fall. Yet parliamentary governments that

    adopted only these measures achieved at best anaemic recoveries.




    - pp. 204-212: Like the French Revolution before it, the British Industrial Revolution spread

    across Europe. But this was a peaceful conquest. The great innovators were largely unable to

    protect what would now be called their intellectual property rights. With remarkable speed,

    the new technology was therefore copied and replicated on the continent and across the

    Atlantic. The first true cotton mill, Richard Arkwright’s at Cromford in Derbyshire, was built in

    1771. Within seven years a copy appeared in France. It took just three years for the French to

    copy Watt’s 1775 steam engine. By 1784 there were German versions of both, thanks in large

    measure to industrial espionage. The Americans, who had the advantage of being able to grow

    their own cotton as well as mine their own coal, were a little slower: the first cotton mill

    appeared in Bass River, Massachusetts, in 1788, the first steam engine in 1803. The Belgians,

    Dutch and Swiss were not far behind. The pattern was similar after the first steam locomotives

    began pulling carriages on the Stockton and Darlington Railway in 1825, though that innovation

    took a mere five years to cross the Atlantic, compared with twelve years to reach Germany

    and twenty-two to arrive in Switzerland. As the efficiency of the technology improved, so it

    became economically attractive even where labour was cheaper and coal scarcer. Between

    1820 and 1913 the number of spindles in the world increased four times as fast as the world’s

    population, but the rate of increase was twice as fast abroad as in the United Kingdom. Such

    were the productivity gains — and the growth of demand — that the gross output of the world

    cotton industry rose three times as fast as total spindleage. As a result, between 1820 and 1870

    a handful of North-west European and North American countries achieved British rates of

    growth; indeed, Belgium and the United States grew faster.
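
    Read as ratios of average growth rates, the two comparisons compound: spindles growing
    four times as fast as population, and output three times as fast as spindleage, imply cotton
    output growing roughly twelve times as fast as population. A minimal sketch of that
    multiplication; the ratios are the text’s, while the 0.7 per cent population growth rate is an
    illustrative assumption:

```python
# Sketch: compounding the growth-rate ratios quoted above for 1820-1913.
# The 4x and 3x multiples come from the text; the population growth
# rate of 0.7% a year is an illustrative assumption.
pop_growth = 0.007
spindle_growth = 4 * pop_growth      # spindles grew ~4x as fast as population
output_growth = 3 * spindle_growth   # output grew ~3x as fast as spindleage
print(f"spindles: {spindle_growth:.1%} a year; output: {output_growth:.1%} a year")
print(f"output grew ~{output_growth / pop_growth:.0f}x as fast as population")
```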

    By the late nineteenth century, then, industrialization was in full swing in two broad bands: one

    stretching across the American Northeast, with towns like Lowell, Massachusetts, at its heart,

    and another extending from Glasgow to Warsaw and even as far as Moscow. In 1800 seven out

    of the world’s ten biggest cities had still been Asian, and Beijing had still exceeded London in

    size. By 1900, largely as a result of the Industrial Revolution, only one of the ten biggest was Asian;

    the rest were European or American.

    The spread around the world of the British-style industrial city inspired some observers but

    dismayed others. Among the inspired was Charles Darwin, who, as he acknowledged in On

    the Origin of Species (1859), had been ‘well prepared to appreciate the struggle for

    existence’ by the experience of living through the Industrial Revolution. Much of Darwin’s

    account of natural selection could have applied equally well to the economic world of the mid-

    nineteenth-century textile business:


    All organic beings are exposed to severe competition . . . As more individuals are produced

    than can possibly survive, there must in every case be a struggle for existence, either one

    individual with another of the same species, or with the individuals of distinct species, or with

    the physical conditions of life. Each organic being . . . has to struggle for life . . . As natural

    selection acts solely by accumulating slight, successive, favourable variations, it can produce no

    great or sudden modification . . .



    In that sense, it might make more sense for historians to talk about an Industrial Evolution,

    in Darwin’s sense of the word. As the economists Thorstein Veblen and Joseph Schumpeter

    would later remark, nineteenth-century capitalism was an authentically Darwinian system,

    characterized by seemingly random mutation, occasional speciation, and differential survival or,

    to use Schumpeter’s memorable phrase, ‘creative destruction’.


    Yet it was precisely the volatility of the more or less unregulated markets created by the

    Industrial Revolution that caused consternation among many contemporaries. Until the major breakthroughs

    in public health . . . mortality rates in industrial cities were markedly worse than in the

    countryside. Moreover, the advent of a new and far from regular ‘business cycle’, marked by

    periodic crises of industrial over-production and financial panic, generally made a stronger

    impression on people than the gradual acceleration of the economy’s average growth rate.

    Though the Industrial Revolution manifestly improved life over the long run, in the short run it

    seemed to make things worse. One of William Blake’s illustrations for his preface to

    Milton featured, among other sombre images, a dark-skinned figure holding up a blood-

    soaked length of cotton yarn. [The ‘dark Satanic mills’ of the text may well refer to the Albion

    Flour Mills, built by Boulton & Watt in London in 1786 and destroyed by fire in 1791.] For the

    composer Richard Wagner, London was ‘Alberich’s dream come true — Nibelheim, world

    dominion, activity, work, everywhere the oppressive feeling of steam and fog’. Hellish images of

    the British factory inspired his depiction of the dwarf’s underground realm in Das

    Rheingold, as well as one of the leitmotifs of the entire Ring cycle, the insistent,

    staccato rhythm of multiple hammers . . .


    Steeped in German literature and philosophy, the Scottish writer Thomas Carlyle was the first

    to identify what seemed the fatal flaw of the industrial economy: that it reduced all social

    relations to what he called, in his essay Past and Present, ‘the cash nexus’:



    the world has been rushing on with such fiery animation to get work and ever more work

    done, it has had no time to think of dividing the wages; and has merely left them to be

    scrambled for by the Law of the Stronger, law of Supply-and-demand, law of Laissez-faire, and

    other idle Laws and Un-laws. We call it a Society; and go about professing openly the totalest

    separation, isolation. Our life is not a mutual helpfulness; but rather, cloaked under due

    laws-of-war, named ‘fair competition’ and so forth, it is a mutual hostility. We have profoundly

    forgotten everywhere that Cash-payment is not the sole relation of human beings . . .[It] is

    not the sole nexus of man with man, — how far from it! Deep, far deeper than Supply-and-

    demand, are Laws, Obligations sacred as Man’s Life itself. [Thomas Carlyle, Past and

    Present, Book I, chs. 1-4, Book IV, chs. 4, 8.]



    That phrase—the ‘cash nexus’—so much pleased the son of an apostate Jewish lawyer from the

    Rhineland that he and his co-author, the heir of a Wuppertal cotton mill-owner, purloined it for

    the outrageous ‘manifesto’ they published on the eve of the 1848 revolutions.

    The founders of communism, Karl Marx and Friedrich Engels, were just two of many radical

    critics of the industrial society, but it was their achievement to devise the first internally

    consistent blueprint for an alternative social order. Since this was the beginning of a schism

    within Western civilization that would last for nearly a century and a half, it is worth pausing to

    consider the origins of their theory. A mixture of Hegel’s philosophy, which represented the

    historical process as dialectical, and the political economy of Ricardo, which posited diminishing

    returns for capital and an ‘iron’ law of low wages, Marxism took Carlyle’s revulsion against the

    industrial economy and substituted a utopia for nostalgia.


    Marx himself was an odious individual. An unkempt scrounger and a savage polemicist, he liked

    to boast that his wife was ‘née Baroness von Westphalen’, but nevertheless sired an

    illegitimate son by their maidservant. On the sole occasion when he applied for a job (as a

    railway clerk) he was rejected because his handwriting was so atrocious. He sought to play the

    stock market but was hopeless at it. For most of his life he therefore depended on handouts

    from Engels, for whom socialism was a hobby, along with fox-hunting and womanizing; his day

    job was running one of his father’s cotton factories in Manchester (the patent product of which

    was known as ‘Diamond Thread’). No man in history has bitten the hand that fed him with

    greater gusto than Marx bit the hand of King Cotton.


    The essence of Marxism was the belief that the industrial economy was doomed to produce an

    intolerably unequal society divided between the bourgeoisie, the owners of capital, and a

    propertyless proletariat. Capitalism inexorably demanded the concentration of capital in ever

    fewer hands and the reduction of everyone else to wage slavery, which meant being paid only

    ‘that quantum of the means of subsistence which is absolutely requisite to keep the labourer in

    bare existence as a labourer’. In chapter 32 of the first tome of his scarcely readable

    Capital (1867), Marx prophesied the inevitable denouement:


    Along with the constant decrease of the number of capitalist magnates, who usurp and

    monopolize all the advantages of this process of transformation, the mass of misery,

    oppression, slavery, degradation and exploitation grows; but with this there also grows the

    revolt of the working class . . .

    The centralization of the means of production and the socialization of labour reach a point at

    which they become incompatible with their capitalist integument. This integument is burst

    asunder. The knell of capitalist private property sounds. The expropriators are expropriated.



    It is not unintentional that this passage has a Wagnerian quality, part

    Gotterdammerung, part Parsifal. But by the time the book was published the great

    composer had left the spirit of 1848 far behind. Instead it was Eugene Pottier’s song ‘The

    Internationale’ that became the anthem of Marxism. Set to music by Pierre De Geyter, it urged

    the ‘servile masses’ to put aside their religious ‘superstitions’ and national allegiances, and make

    war on the ‘thieves’ and their accomplices, the tyrants, generals, princes and peers.



    Before identifying why they were wrong, we need to acknowledge what Marx and his disciples

    were right about. Inequality did increase as a result of the Industrial Revolution. Between 1780

    and 1830 output per labourer in the UK grew over 25 per cent but wages rose barely 5 per

    cent. The proportion of national income going to the top percentile of the population rose

    from 25 per cent in 1801 to 35 per cent in 1848. In Paris in 1820, around 9 per cent of the

    population were classified as ‘proprietors and rentiers’ (living from their investments) and

    owned 41 per cent of recorded wealth. By 1911 their share had risen to 52 per cent. In Prussia,

    the share of income going to the top 5 per cent rose from 21 per cent in 1854 to 27 per cent

    in 1896 and to 43 per cent in 1913. [Kaelble, Industrialization and Social Inequality.]

    Industrial societies, it seems clear, grew more unequal over the course of the nineteenth

    century. This had predictable consequences. In the Hamburg cholera epidemic of 1892, for

    example, the mortality rate for individuals with an income of less than 800 marks a year was

    thirteen times higher than that for individuals earning over 50,000 marks. [Evans, Death in

    Hamburg.] It was not necessary to be a Marxist to be horrified by the inequality of industrial

    society. The Welsh-born factory-owner Robert Owen, who coined the term ‘socialism’ in

    1817, envisaged an alternative economic model based on co-operative production and utopian

    villages like the ones he founded at Orbiston in Scotland and New Harmony, Indiana. [Grayling,

    Toward the Light of Liberty: The Struggles for Rights and Freedoms That Made the Modern

    Western World, pp. 189-93.] Even the Irish aesthete and wit Oscar Wilde recognized the

    foundation of social misery on which the refined world of belles-lettres stood:


    These are the poor; and amongst them there is no grace of manner, or charm of speech, or

    civilization . . . From their collective force Humanity gains much in material prosperity. But it is

    only the material result that it gains, and the man who is poor is in himself absolutely of no

    importance. He is merely the infinitesimal atom of a force that, so far from regarding him,

    crushes him: indeed, prefers him crushed, as in that case he is far more obedient . . . Agitators

    are a set of interfering, meddling people, who come down to some perfectly contented class of

    the community, and sow the seeds of discontent amongst them. That is the reason why

    agitators are so absolutely necessary. Without them, in our incomplete state, there would be

    no advance towards civilization . . . [But] the fact is that civilization requires slaves. The Greeks

    were quite right there. Unless there are slaves to do the ugly, horrible, uninteresting work,

    culture and contemplation become almost impossible. Human slavery is wrong, insecure, and

    demoralizing. On mechanical slavery, on the slavery of the machine, the future of the world

    depends.



    Yet the revolution feared by Wilde and eagerly anticipated by Marx never materialized—at

    least, not where it was supposed to. The bouleversements of 1830 and 1848 were the

    results of short-run spikes in food prices and financial crises more than of social polarization. As

    agricultural productivity improved in Europe, as industrial employment increased and as the

    amplitude of the business cycle diminished, the risk of revolution declined. Instead of coalescing

    into an impoverished mass, the proletariat subdivided into ‘labour aristocracies’ with skills and a

    lumpenproletariat with vices. The former favoured strikes and collective bargaining over

    revolution and thereby secured higher real wages. The latter favoured gin. The respectable

    working class had their trade unions and working men’s clubs. The ruffians — ‘keelies’ in

    Glasgow — had the music hall and street fights.

    The prescriptions of the Communist Manifesto were in any case singularly unappealing to the industrial workers they were aimed at. Marx and Engels called for the abolition of private property; the abolition of inheritance; the centralization of credit and communications; the state ownership of all factories and instruments of production; the creation of ‘industrial armies for agriculture’; the abolition of the distinction between town and country; the abolition of the family; ‘community of women’ (wife-swapping) and the abolition of all nationalities. By contrast, mid-nineteenth-century liberals wanted constitutional government, the freedoms of speech, press and assembly, wider political representation through electoral reform, free trade and, where it was lacking, national self-determination (‘Home Rule’). In the half-century after the upheaval of 1848 they got a good many of these things — enough, at any rate, to make the desperate remedies of Marx and Engels seem de trop.

    In 1850 only France, Greece and Switzerland had franchises in which more than a fifth of the population got to vote. By 1900 ten European countries did, and Britain and Sweden were not far below that threshold. Broader representation led to legislation that benefited lower-income groups; free trade in Britain meant cheap bread, and cheap bread plus rising nominal wages thanks to union pressure meant a significant gain in real terms for workers. Building labourers’ day wages in London doubled in real terms between 1848 and 1913. Broader representation also led to more progressive taxation. Britain led the way in 1842 when Sir Robert Peel introduced a peacetime income tax; by 1913 the standard rate was 14 pence in the pound (6 per cent). Prior to 1842 nearly all British revenue had come from the indirect taxation of consumption, via customs and excise duties — regressive taxes that take a proportionately smaller amount of your income the richer you are. By 1913 a third of revenue was coming from direct taxes on the relatively rich. In 1842 the central government had spent virtually nothing on education and the arts and sciences. In 1913 those items accounted for 10 per cent of expenditure. By then, Britain had followed Germany in introducing a state pension for the elderly.
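
    (A reader’s gloss on the bracketed percentage, not part of Ferguson’s text: assuming the pre-decimal pound of 240 pence, the quoted tax rate works out as stated.)

    \[
    \frac{14\ \text{pence}}{240\ \text{pence per pound}} \approx 0.058 \approx 6\ \text{per cent}
    \]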


    Marx and Engels were wrong on two scores, then. First, their iron law of wages was a piece of nonsense. Wealth did indeed become highly concentrated under capitalism, and it stayed that way into the second quarter of the twentieth century. But income differentials began to narrow as real wages rose and taxation became less regressive. Capitalists understood what Marx missed: that workers were also consumers. It therefore made no sense to try to grind their wages down to subsistence levels. On the contrary, as the case of the United States was making increasingly clear, there was no bigger potential market for most capitalist enterprises than their own employees. Far from condemning the masses to ‘immiseration’, the mechanization of textile production created growing employment opportunities for Western workers — albeit at the expense of Indian spinners and weavers — and the decline in the prices of cotton and other goods meant that Western workers could buy more with their weekly wages.

    The impact is best captured by the exploding differential between Western and non-Western wages and living standards in this period. Even within the West the gap between the industrialized vanguard and the rural laggards widened dramatically. In early seventeenth-century London, an unskilled worker’s real wages (that is, adjusted for the cost of living) were not so different from what his counterpart earned in Milan. From the 1750s until the 1850s, however, Londoners pulled far ahead. At the peak of the great divergence within Europe, London real wages were six times those in Milan. With the industrialization of Northern Italy in the second half of the nineteenth century, the gap began to close, so that by the eve of the First World War it was closer to a ratio of 3:1. German and Dutch workers also benefited from industrialization, though even in 1913 they still lagged behind their English counterparts. Chinese workers, by contrast, did no such catching up. Where wages were highest, in the big cities of Beijing and Canton, building workers received the equivalent of around 3 grams of silver per day, with no upward movement in the nineteenth century and only a slight rise in the early twentieth (to around 5-6 grams). There was some improvement for workers in Canton after 1900 but it was minimal; workers in Sichuan stayed dirt poor. London workers meanwhile saw their silver-equivalent wages rise from around 18 grams between 1800 and 1870 to 70 grams between 1900 and 1913. Allowing for the cost of maintaining a family, the standard of living of the average Chinese worker fell throughout the nineteenth century, most steeply during the Taiping Rebellion. True, subsistence was cheaper in China than in North-western Europe. It should also be remembered that Londoners and Berliners by that time enjoyed a far more variegated diet of bread, dairy products and meat, washed down with copious amounts of alcohol, whereas most East Asians were subsisting on milled rice and small grains. Nevertheless, it seems clear that by the second decade of the twentieth century the gap in living standards between London and Beijing was around six to one, compared with two to one in the eighteenth century. [Allen et al., ‘Wages, prices, and living standards in China, 1738–1925: in comparison with Europe, Japan, and India’.]
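
    (A reader’s back-of-the-envelope check on the quoted wage figures, not part of Ferguson’s text: taking London at 70 grams of silver per day against Beijing at around 5-6 grams gives a nominal gap of roughly twelve to one,

    \[
    \frac{w_{\text{London}}}{w_{\text{Beijing}}} \approx \frac{70\ \text{g silver/day}}{5.5\ \text{g silver/day}} \approx 12.7,
    \]

    and the cheaper cost of subsistence in China, which the passage itself concedes, plausibly halves that to the six-to-one living-standards gap Allen et al. report.)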


    The second mistake Marx and Engels made was to underestimate the adaptive quality of the nineteenth-century state — particularly when it could legitimize itself as a nation-state.



    In his Contribution to a Critique of Hegel’s Philosophy of Right, Marx had famously called religion the ‘opium of the masses’. If so, then nationalism was the cocaine of the middle classes. On 17 March 1846 Venice’s Teatro La Fenice was the setting for the premiere of a new opera by the already celebrated Italian composer Giuseppe Verdi. Technically, Verdi had in fact been born a Frenchman: his name at birth was formally registered as ‘Joseph Fortunin François Verdi’ because the village where he was born was then under Napoleonic rule, having been annexed to France along with the rest of the Duchy of Parma and Piacenza. Venice, too, had been conquered by the French, but was handed over to Austria in 1814. The unpopularity of the Habsburg military and bureaucracy explains the rowdy enthusiasm with which the predominantly Italian audience responded to the following lines:


    Tardo per gli anni, e tremulo,
    È il regnator d’Oriente;
    Siede un imbelle giovine
    Sul trono d’Occidente;
    Tutto sarà disperso
    Quand’io mi unisca a te . . .
    Avrai tu l’universo,
    Resti l’Italia a me.


    (Aged and frail / Is the ruler of the Eastern Empire; / A young imbecile sits on the throne of the Western Empire; / All will be scattered / If you and I unite . . . / You can have the universe / But leave Italy to me.)



    Sung to Attila by the Roman envoy Ezio following the sack of Rome, these words were a thinly veiled appeal to nationalist sentiment. They perfectly illustrate what nationalism always had over socialism. It had style.







    http://h-net.msu.edu/cgi-bin/logbrow...IkHg&user=&pw=


    ‘Since for most of history mothers raise boys who then go off and hunt, farm, build things and fight wars rather than directly contributing much new to the psyche of the next generation, the course of evolution of the psyche has overwhelmingly been dependent upon the way mothers have treated their daughters, who become the next generation of mothers. Since early emotional relationships organize the entire range of human behavior, not all cultural traits equally affect the evolution of the psyche — those that affect the daughter's psyche represent the main narrow bottleneck through which all other cultural traits must pass. The study of the evolution of the psyche depends more on developing a maternal ecology than on studying variations in the physical environment.


    The evolution of the psyche and culture has been crucially dependent upon turning the weak bonds between mother and daughter of apes and early humans into genuine love for daughters (and sons). This means that historical societies that create optimal conditions for improving the crucial mother-daughter relationship by surrounding the mother with support and love soon begin to show psychological innovation and cultural advances in the next generations — so that history begins to move in progressive new directions. In contrast, societies that cripple the mother-daughter emotional relationship experience psychogenic arrest and even psychogenic devolution. Only in modern times have fathers, too, begun to contribute much to the evolutionary task of growing the young child's mind.


    Paralleling the term "hopeful monster" that biologists use to indicate speciating biological variations, the idea that the mother-daughter emotional relationship is the focal point of epigenetic evolution and the main source of novelty in the psyche can be called the "hopeful daughter" concept. When mothers love and support their daughters especially, a series of generations can develop new childrearing practices that grow completely new neuronal networks, hormonal systems and behavioral traits. If hopeful daughters are instead emotionally crippled by a society, a psychogenic cul-de-sac is created, generations of mothers cannot innovate, epigenetic arrest is experienced and meaningful cultural evolution ends.


    For instance, in China before the tenth century A.D. men began to bind little girls' feet as a sexual perversion, making them into sexual fetishes, penis-substitutes which the men would suck on and masturbate against during sex play. Chinese literature reports the screaming cries of the five-year-old girl as she hobbles about the house for years to do her tasks while her feet are bound, because in order to make her foot tiny, her foot bones are broken and the flesh deteriorates. She loses several toes as they are bent under her foot, to emphasize the big toe as a female penis.


    This practice was added to the many brutal practices of what was perhaps the world's most anti-daughter culture, where over half the little girls were killed at birth without remorse and special girl-drowning pools were legion, where beating little girls until bloody was a common parental practice, and where girl rape and sex slavery were rampant. This vicious anti-daughter emotional atmosphere — extreme even for a time that was generally cruel and unfeeling towards daughters — was obviously not conducive to little girls producing innovations in childrearing when they grew up to be mothers. Therefore China — which was culturally ahead of the West in many ways at the time of the introduction of foot binding — became culturally and politically "frozen" until the twentieth century, when foot binding was stopped and boy-girl sex ratios in many areas dropped from 200/100 to near equality. The result was that whereas for much of its history China punished all novelty, during the twentieth century rapid cultural, political and economic evolution could resume. Japan, which shared much of Chinese culture but did not adopt foot binding of daughters, avoided the psychogenic arrest of China and could therefore share in the scientific and industrial revolution as it occurred in the West.


    The same kind of epigenetic arrest can be seen in the damage caused by genital mutilation of girls among circum-Mediterranean peoples that began thousands of years ago and continues today. Since "hopeful daughters" do not thrive on the chopping off of their clitorises and labias, the present cultural and political problems of those groups who still mutilate their daughters' genitals are very much a direct result of this psychogenic arrest.’




    http://primal-page.com/ps4.htm


    ‘All the other aspects of modern industrial society are equally results of the new socializing psychoclass childrearing, causing a greater increase in material prosperity in the past two centuries than in all the rest of human history. The reason for this astonishing progress is that science, technology and economic development depend more on investments in parenting than on investments in equipment, since they crucially require an "exploring self" constructed from childhood. A few economists realize that the wealth of nations lies in the development of psyches more than in the investment of capital.


    Everett Hagen and Lawrence Harrison, for instance, have demonstrated that those nations furthest behind today in economic development suffer from a severe underinvestment in families and children, not in capital equipment.* The historical record is clear: early pioneers in science and technology first had to overcome their alter projections before they could discover how the world worked. As Keith Thomas puts it: "It was the abandonment of magic which made possible the upsurge of technology, not the other way round." Newton had to stop seeing falling objects "longing to return to Mother Earth" before he could posit a force of gravity.


    * Everett Hagen, The Economics of Development. Rev. ed. Homewood: R. D. Irwin, 1975; Lawrence E. Harrison, Underdevelopment Is a State of Mind: The Latin American Case. Lanham: Madison Books, 1985, pp. 25, 29.



    Chemists had to give up "alchemical visions of womb-battles between good and evil" inside their flasks before they could observe the real causes of chemical change. Farmers had to be able to empathize with their horses in order to invent the harness collar, which moved the pressure down from the throat to the flanks so that the animal was not choked and could pull heavier loads.* Farmers also had to stop thinking of plowing as "tearing at the breast of Mother Earth" in order to invent the deep plow and change the face of European agriculture. Men had to begin to value their families in order to build wooden floors in their homes rather than leaving them clay, as was the practice for millennia. Every invention had its origin in the evolution of the psyche; every exploration of nature was a dimension of the exploration of the self.


    Economic life, too, only evolved as childrearing and the psyche evolved. Tribal societies both in the past and in the present could not trust, because parents were untrustworthy, so they could not allow much wealth or surplus out of which they could create economic progress. Ownership was felt to be dangerous selfishness, envy ran rampant and ambition was feared: "The anthropologist may see people behaving with generosity, but this is the result of fear." Those who acquired too much were expected either to engage in gift-exchange and other redistributive rituals or else to periodically destroy their surplus in cleansing sacrificial ceremonies. Even the invention of money came from the sacred objects used for sacrifice to deities. [William H. Desmonde, Magic, Myth and Money: The Origin of Money in Religious Ritual. New York: The Free Press of Glencoe, 1962.] "Money is condensed wealth; condensed wealth is condensed guilt…money is filthy because it remains guilt." [Norman O. Brown, Life Against Death: The Psychoanalytical Meaning of History. Middletown: Wesleyan University Press, 1959, p. 266.]


    What held back economic development for so many millennia was that early civilizations were so abusively brought up that they spent most of their energies chasing "ghosts from the nursery" — religious, political and economic domination group-fantasies — rather than joining in together to solve the real tasks of life. The appalling poverty of most people throughout history has been simply an extension of the emotional poverty of the historical family, making real cooperation in society impossible. For instance, slavery was one of the most wasteful, uneconomical systems ever invented, since denying autonomy to one's fellow workers simply wasted both the slaves' and the owners' productivity and inventiveness.


    * Lynn White Jr., Medieval Technology and Social Change. Oxford: Oxford University Press, 1962, p. 57.





    http://www.youtube.com/watch?v=BNLq1Y_OFEQ

    ". . . Progressives and reactionaries are two different parts of the nation, as they go to war, or as they take excessive risks, and put themselves into depressions like the one that we just started and the one in the '30s . . . the reactionaries just are against freedoms for people and are for siding with the powerful punishing parent."




    http://www.scribd.com/doc/104598456/...-Lloyd-DeMause


    Foot binding was a unique Chinese sexual mutilation practice that was performed on girls of all classes. Like other fetishists, the Chinese were so afraid of the vagina as a dangerous, castrating organ that they could only feel erotic toward the woman's foot — mainly her big toe. As Cheng Kuan-ying described foot binding in the nineteenth century: "When a child is four or five, or seven or eight, parents speak harshly to it, and frighten it with their looks, and oppress it in every conceivable manner so that the bones of its feet may be broken and its flesh may putrefy." [Howard S. Levy, Chinese Footbinding: The History of a Curious Erotic Custom. London: Neville Spearman, n.d., p. 70.] The girl undergoes this extremely painful process for [anywhere] from five to ten years, crying out in pain each night as she hobbles about the house to do her tasks while holding on to the walls for support. [David and Vera Mace, Marriage: East & West. Garden City, N.Y.: Doubleday & Co., 1959, pp. 75-6; Levy, Chinese Footbinding, pp. 52, 82-88.] As the bones became broken and the flesh deteriorated, her foot became a perfect penis-substitute, often losing several toes as they were bent under her foot in order to emphasize the big toe sticking out.


    The penis-toe then became the focus of the man's perversion and of his sexual excitement during intercourse. "It formed an essential prelude to the sex act, and its manipulation excited and stimulated... The ways of grasping the foot in one's palms were both profuse and varied; ascending the heights of ecstasy, the lover transferred the foot from palm to mouth. Play included kissing, sucking, and inserting the foot in the mouth until it filled both cheeks, either nibbling at it or chewing it vigorously, and adoringly placing it against one's cheeks, chest, knees, or virile member." [Levy, Chinese Footbinding, p. 34] Thus even sex with a female could simulate homosexual intercourse for Chinese males.


    Although Chinese literature has many descriptions of the screams of girls whose feet are being crushed,* the sources are silent as to whether the little girl herself fully understood the sexual purpose of her mutilation. Since she shared the family bed with her parents and presumably observed her father playing sexually with her mother's penis-foot, it is likely that the sexual aim of her painful mutilation was apparent to her.


    * A typical example, ibid., p. 83: "My toes were pointed, my instep bent down,/And though I cried out to Heaven and Earth,/Mother ignored me as if she were deaf./My nights were spent in pain,/My early mornings in tears;/I spoke to Mother by my bed;/How you worry when I'm ill,/How frightened if I fall!/Now the agony from my feet has penetrated the marrow of my bones,/And I am plunged into despair, but you,/You don't care a bit about me."
    Last edited by HERO; 04-18-2014 at 08:34 PM.

  3. #3
    xerx's Avatar
    Join Date
    Dec 2007
    Posts
    5,472
    Mentioned
    53 Post(s)
    Tagged
    0 Thread(s)

    Default

    douchebaggy sei imo.
    It was in the reign of George III that the aforesaid personages lived and quarrelled; good or bad, handsome or ugly, rich or poor, they are all equal now.

  4. #4
    Ti centric krieger's Avatar
    Join Date
    Sep 2006
    Posts
    5,983
    Mentioned
    80 Post(s)
    Tagged
    0 Thread(s)

    Default

    kind of ESxj-ish. style over content, volume over quality.

    i think the whole squabble he had with krugman was an example of how people often pick "fights" with people not that different from themselves, because it's safer than confronting your actual enemies.

  5. #5
    Contra's Avatar
    Join Date
    Apr 2014
    TIM
    ILI-Ni
    Posts
    1,405
    Mentioned
    55 Post(s)
    Tagged
    0 Thread(s)

    Default

    I'm not reading all of that, but I've seen quite a few videos of him and I have him typed LSE or possibly LIE.

  6. #6
    xerx's Avatar
    Join Date
    Dec 2007
    Posts
    5,472
    Mentioned
    53 Post(s)
    Tagged
    0 Thread(s)

    Default

    @ exxj... seriously people?
    It was in the reign of George III that the aforesaid personages lived and quarrelled; good or bad, handsome or ugly, rich or poor, they are all equal now.

  7. #7
    Contra's Avatar
    Join Date
    Apr 2014
    TIM
    ILI-Ni
    Posts
    1,405
    Mentioned
    55 Post(s)
    Tagged
    0 Thread(s)

    Default

    yes

  8. #8
    xerx's Avatar
    Join Date
    Dec 2007
    Posts
    5,472
    Mentioned
    53 Post(s)
    Tagged
    0 Thread(s)

    Default

    keep the ridiculous typings coming y'all. it's good to confuse anyone who wanted to use socionics for nefarious purposes.
    It was in the reign of George III that the aforesaid personages lived and quarrelled; good or bad, handsome or ugly, rich or poor, they are all equal now.

  9. #9
    Jesus is the cruel sausage consentingadult's Avatar
    Join Date
    Jul 2006
    Posts
    2,784
    Mentioned
    52 Post(s)
    Tagged
    0 Thread(s)

    Default

    I watched his documentary series "Civilization: Is the West History?" Obviously, using lots of Mobilizing-Ne does not prevent one from becoming a professor at Harvard.
    Last edited by consentingadult; 04-22-2014 at 12:16 AM.
    The future of Socionics:
    Quote Originally Posted by Maritsa View Post
    Many black Americans are SEE type.
