Capitalism

M. Hodgson, Conceptualizing Capitalism: Institutions, Evolution, Future, the University of Chicago Press. Kindle edition, 2015

What capitalism is everyone knows, or thinks that he/she/they know(s). It is an economic and social system under which ownership of the means of production on the one hand and labor on the other are largely separate (commenting on this, Marx once said that the reason the bourgeois dislike prostitutes is that the latter cannot be separated from the tools of their trade). One under which the main means of production such as factories, machines, roads, communications etc. are in large part privately owned and may, operating through a somewhat chaotic system known as “markets,” be more or less freely transferred from one owner to another. One under which the factor that ties those resources together, enabling them to function, is money rather than, say, barter or faith or charisma. One in which economic initiative is given free rein and consequently takes the form of competition among owners, actual or would-be, that in time is almost certain to lead to gross inequalities between rich and poor.

One that operates, and to a large extent can only operate, within a framework of rights and duties, freedoms and prohibitions, known as law, the task of creating and administering which is the province of an overarching organization known as the state. One whose origins have deep roots in history—especially ancient Rome where private ownership, in the form of so-called chattel slavery, reached heights (or lows) to which subsequent generations had little to add. One that, originating in the Netherlands and England between about 1600 and 1800 and having resoundingly defeated communism as its most important opponent, has spread all over the world. To the point where, at present, it faces only limited competition even in countries, such as China, whose official ideology points in a different direction.

As the common saying has it, what goes up must come down. Rome, which, as I just said, in some ways represented the acme of capitalism, ended by biting the dust. As it did so, it was replaced first by the various Germanic tribes and then by feudalism; based on entirely different principles, between them they lasted for almost a millennium. Panta rhei: to anyone with the least historical consciousness, the collapse at some future time of capitalism appears inevitable. Some may even see it as desirable. Do we really want to perpetuate a system that allows a handful of temperamental tycoons to control much of a country’s wealth, as it does both in the US and (more surprisingly) in Switzerland? But what will its successor look like? Science fiction apart, to date the only really serious attempt to answer this question was provided by Lenin, Stalin and Mao. From them it passed to their successors or imitators. But that attempt, too, has bitten the dust. If not in theory—a small number of die-hard Marxists still persist—then at any rate in practice. So the question is, what comes next; and it was the hope of obtaining at least some answers to that question that first made me turn to Hodgson’s book.

As I read along, I found that the relevant material is distributed between two separate chapters: chapter 14, “The Future of Global Capitalism,” and chapter 16, “After Capitalism.” Chapter 14 is an attempt to guess what forms capitalism may yet take in various countries and the relative success of those forms: e.g. Taiwan (supposing it does not fall to China, a possibility Hodgson does not even mention) and South Korea versus India; the United States versus China; the European Union versus Russia; and so on. All this while taking into account, or trying to take into account, a vast number of relevant factors such as birthrates, labor force participation, per capita GDPs, social and cultural attitudes, government interference (including R&D and subsidies on one hand and taxes and corruption on the other), and so on. Briefly, the kind of socio-economic analysis that, coming complete with countless tables and figures, may be found in a thousand other works.

Taking up the lead, chapter 16 deals with three issues: 1. “Will the Great Global Diffusion Lead to a New Economic Hegemonism?” 2. “The Role of Law and Economic Development.” And 3. “The Persistence of Varieties of Capitalism.” Needless to say, the answers to each of these questions will play an important role in shaping the future. As with chapter 14, the author’s discussion of each of them is backed up by reams of facts and figures. One can imagine the author chewing his way through them, leaving no bone untouched. His main interest, though, comes through in the second issue; from beginning to end, he is determined to show that law and good government, far from merely forming a Marx-type “superstructure” that covers the economic “base” and justifies it, form an essential part of capitalism’s nature or are, at the very least, a prerequisite without which it could not exist. Presenting his case in some detail, he is probably more often right than wrong; what I found almost totally missing, though, was a more global discussion—one not organized country by country—of any fundamental changes capitalism may undergo.

To my mind, some of the most important early twenty-first-century questions concerning capitalism are as follows. Given how powerful, how omnipresent, and how persistent capitalism is at the present historical moment, what factors could push it off course and make humanity move in a different direction? Supposing an alternative to capitalism is found, what will it look like? Will inequality between rich and poor, individuals or countries, decrease or increase? Will the future resemble Aldous Huxley’s Brave New World? Or George Orwell’s Nineteen Eighty-Four? Or Anthony Burgess’ 1985? Or some combination of the three? Who loses? Who gains? Unfortunately Hodgson does not provide even the beginning of answers to any of these and similar questions.

Leaving his work, however well researched and rich and nuanced it may be, hanging in the air.

Quo Vadis, Israel

In my last post I tried to explain the nature and purpose of the various parties represented in Israel’s parliament (the Knesset). Consequently, a friend of mine, the award-winning painter Bob Barancik (see on him https://www.creativeshare.com/bio.php), confronted me with some questions of his own. So here are my answers—for what they are worth.

Q: Did the recent raft of insubordinations among reserve air force pilots and IDF officers permanently damage the security of the state against Iran and other hostile Arab states?

A: Possibly so. War being what it is, the most important factor in waging it is not technology, however sophisticated. It is, rather, fighting spirit, which in turn can only rest on mutual trust (as people used to say when Germany still had an army: today it’s you, tomorrow it’s me). The way some Israeli pilots, flight controllers, drone operators, ground officers and, of course, lawyers see it, that trust has been violated by their political superiors who, by seeking to drastically increase the power of the executive in particular, are weakening the judiciary and preparing a dictatorship. This, on top of demanding that the police and the military resort to draconian measures to break the resistance of the occupied Palestinian population—so draconian that, should they be implemented, they have an excellent chance of causing those who carry them out to be dragged in front of the International Criminal Court in The Hague on war crimes charges.

The problem is like cancer. The longer it persists, the worse it will become and the harder it will be to repair the damage already done.

Q: Could there realistically be a putsch orchestrated by IDF generals and/or security services to forcibly remove Netanyahu, Smotrich, Ben-Gvir from office?

A: I very much doubt it. Do not forget that the IDF, unlike most modern armed forces, is mainly made up not of professionals but of conscripts and reservists. They would be split down the middle, just like the rest of Israeli society. The outcome would be total disintegration.

Q: Could the Camp David Accords simply be ignored by Egypt, and a return made to the old hostilities?

A: Such a move almost certainly will not come all at once; it would take time and psychological preparation among the masses, as well as an extreme provocation such as an Israeli attempt to expel the Palestinian population of the West Bank. But yes, it could happen.

Q: Do the Arab countries and Iran need Israel to continue to exist as a domestic “punching bag,” or is the hatred so great that there could be a genocide of Israeli Jews à la the Mufti of Jerusalem?

A: You ask as if Arabs and Iranians were cut from the same cloth. But they are not. Among the Arabs, the masses, including the better educated, hate Israel more than their governments do. In Iran the situation is the opposite.

Incidentally, did it ever occur to you that things may also work the other way around—i.e., that, vice versa, it is some Israeli circles that are using the threat as a punching bag?

Q: Is it likely that Hezbollah aka Iran will unleash a sustained barrage of missiles that would cripple Israeli infrastructure? Or will Israel’s nuclear capacity continue to deter the mullahs in the short run?

A: Israel has never published any nuclear doctrine it may have. At the same time, the general belief is that its leaders will only resort to nukes in case the country faces complete defeat—as by having its army reduced to the point where it can no longer fight, its logistic infrastructure knocked out, and a considerable part of its territory and population overrun.

With the worst will in the world, Hezbollah does not have what it takes to achieve these aims; so it will depend on Iranian (and Syrian) support. A bombardment with Iranian and Syrian chemical weapons might indeed lead Israel first to threaten and then use its weapons of last resort.

Q: Do you see an exodus of the “best and the brightest” if Bibi and company continue to hang on to power?

A: This is already happening. Many—no one knows just how many—academics, physicians, and other kinds of highly qualified experts are leaving or looking for ways to leave. The shekel, which for several years used to be called the strongest currency on earth, is falling. Tens of thousands, including some members of my own family, are trying to obtain foreign citizenship in addition to their Israeli one. While there are no statistics, my guess would be that there are few Israeli families left that have not considered this possibility more or less seriously.

Q: We live in the postmodern world, where everything is possible and almost nothing is certain.

A: How true. But it does not make forecasting the future any easier. If anything, to the contrary.

Q: Do you believe as someone said, that “This too shall pass”?

A: I think the threat is the most serious one Israel has faced since 1973. Unless very, very great care is taken by Netanyahu, his government and his successors, civil war, not just between Jew and Arab but among the Jews themselves, is inevitable. Such a war, especially one that leads to foreign (Arab and Iranian) involvement, might very well mean finis Israel.

Family and Civilization

Carle Zimmerman, Family and Civilization, Washington, DC, ISI Books, 2008 [1947].

I had my attention drawn to this book by a friend, Larry Kummer, editor of the Fabius Maximus website. No sooner had I opened it than I realized I had a masterpiece on my hands. One, moreover, which, at a time when the average American household is smaller than ever before, half of all marriages end in divorce, and only half of all children are fortunate enough to be raised by their biological parents, seems more opportune than ever. Rather than review the book myself, I decided to post the splendid introduction to the 2008 edition, written by Allan C. Carlson. Not, of course, before first receiving his permission.

*

HAVING TAKEN A BREAK FROM planning the World Congress of Families IV, an international assembly that took place in 2007 and focused on Europe’s “demographic winter” and global family decline, I turned to consider again Carle Zimmerman’s magnum opus, Family and Civilization (1947). And there, near the end of chapter 8 in his list of sure signs of social catastrophe, I read: “Population and family congresses spring up among the lay population as frequently and as verbose as Church Councils [in earlier centuries].” It is disconcerting to find one’s work labeled, accurately I sometimes fear, as a symptom rather than as a solution to the crisis of our age. Such is the prescience and the humbling wisdom of this remarkable book.

With regard to the family, Carle Zimmerman was the most important American sociologist of the 1920s, ’30s, and ’40s. His only rival for this label would be his friend, occasional coauthor, and colleague Pitirim Sorokin. Zimmerman was born to German-American parents and grew up in a Cass County, Missouri, village. Sorokin grew up in Russia, became a peasant revolutionary and a young minister in the brief Kerensky government, and barely survived the Bolsheviks, choosing banishment in 1921 over a death sentence. They were teamed up at the University of Minnesota in 1924 to teach a seminar on rural sociology. Five years later, this collaboration resulted in the volume Principles of Rural-Urban Sociology, and a few years thereafter in the multivolume A Systematic Source Book in Rural Sociology. These books directly launched the Rural Sociological Section of the American Sociological Association and the new journal Rural Sociology.

In all this activity, Zimmerman focused on the family virtues of farm people. “Rural people have greater vital indices than urban people,” he reported. Farm people had earlier and stronger marriages, more children, fewer divorces, and “more unity and mutual attachment and engulfment of the personalit[ies]” of its members than did their urban counterparts. Zimmerman’s thought ran sharply counter to the primary thrust of American sociology in this era. The so-called Chicago School dominated American social science, led by figures such as William F. Ogburn and Joseph K. Folsom. They focused on the family’s steady loss of functions under industrialization to both governments and corporations.

As Ogburn explained, many American homes had already become “merely parking places for parents and children who spend their active hours elsewhere.” Up to this point, Zimmerman would not have disagreed. But the Chicago School went on to argue that such changes were inevitable and that the state should help complete the process. Mothers should be mobilized for full-time employment, small children should be put into collective day care, and other measures should be adopted to effect “the individualization of the members of society.”

Where the Chicago School was neo-Marxist in orientation, Zimmerman looked to a different sociological tradition. He drew heavily on the insights of the mid-nineteenth-century French social investigator Frederic Le Play. The Frenchman had used detailed case studies, rather than vast statistical constructs, to explore the “stem family” as the social structure best adapted to insure adequate fertility under modern economic conditions. Le Play had also stressed the value of noncash “home production” to a family’s life and health. Zimmerman’s book from 1935, Family and Society, represented a broad application of Le Play’s techniques to modern America. Zimmerman claimed to find the “stem family” alive and well in America’s heartland: in the Appalachian-Ozark region and among the German- and Scandinavian-Americans in the Wheat Belt.

More importantly, Le Play had held to an unapologetically normative view of the family as the necessary center of critical human experiences, an orientation readily embraced by Zimmerman. This mooring explains his frequent denunciations of American sociology in the pages of Family and Civilization. “Most of family sociology,” he asserts, “is the work of amateurs” who utterly fail to comprehend the “inner meaning of their subject.” Zimmerman mocks the Chicago School’s new definition of the family as “a group of interacting personalities.” He lashes out at Ogburn for failing to understand that “the basis of familism is the birth rate.” He denounces Folsom for labeling Le Play’s “stem” family model as “fascistic” and for giving new modifiers—such as “democratic,” “liberal,” or “humane”—to otherwise disparate civilizations to reveal deeper and universal social traits.

To guide his investigation, Zimmerman asks: “Of the total power in [a] society, how much belongs to the family? Of the total amount of control of action in [a] society, how much is left for the family?” By analyzing these levels of family autonomy, Zimmerman identifies three basic family types: (1) the trustee family, with extensive power rooted in extended family and clan; (2) the atomistic family, which has virtually no power and little field of action; and (3) the domestic family (a variant of Le Play’s “stem” family), in which a balance exists between the power of the family and that of other agencies. He traces the dynamics as civilizations, or nations, move from one type to another. Zimmerman’s central thesis is that the “domestic family” is the system found in all civilizations at their peak of creativity and progress, for it “possesses a certain amount of mobility and freedom and still keeps up the minimum amount of familism necessary for carrying on the society.”
So-called social history has exploded as a discipline since the early 1960s, stimulated at first by the French Annales school of interpretation and then by the new feminist historiography. Thousands upon thousands of detailed studies on marriage law, family consumption patterns, premarital sex, “gay culture,” and gender power relations now exist, material that Zimmerman never saw (and some of which he probably never even could have imagined). All the same, this mass of data has done little to undermine his basic argument. Zimmerman focuses on hard, albeit enduring truths. He affirms, for example, the virtue of early marriage: “Persons who do not start families when reasonably young often find that they are emotionally, physically, and psychologically unable to conceive, bear, and rear children at later ages.” The author emphasizes the intimate connection between voluntary and involuntary sterility, suggesting that they arise from a common mindset that rejects familism. He rejects the common argument that the widespread use of contraceptives would have the beneficial effect of eliminating human abortion. In actual practice, “the population which wishes to reduce its birth rate … seems to find the need for more abortions as well as more birth control.” Indeed, the primary theme of Family and Civilization is fertility. Zimmerman underscores the three functions of familism as articulated by historic Christianity: fides, proles, and sacramentum; or “fidelity, childbearing, and indissoluble unity.”

While describing at length the social value of premarital chastity, the health-giving effects of marriage, the costs of adultery, and the social devastation of divorce, Zimmerman zeros in on the birth rate. He concludes that “we [ever] more clearly abandon the role of proles or childbearing as the main stem of the family.” The very act of childbearing, he notes, “creates resistances to the breaking-up of the marriage.” In short, “the basis of familism is the birth rate. Societies that have numerous children have to have familism. Other societies (those with few children) do not have it.” This gives Zimmerman one easy measure of social success or decline: the marital fertility rate. A familistic society, he says, would average at least four children born per household.

Given current American debates, we should note that Zimmerman was also pro-immigration. In his era Anglo-Saxon populations around the globe had turned against familism, rejecting children. Familism survived in 1948 only on the borders of the Anglo-Saxon world—in “South Ireland, French Canada, and Mexico”—and in the American regions settled by 40 million non-English immigrants, mainly Celts and Germans. However, “when the doors of immigration were closed (first by war, later by law [1924], and finally by the disruption of familistic attitudes in the European sources themselves), the antifamilism of the old cultured classes … finally began to have effect.” In short, “within the same generation America became a world power and lost her fundamental familistic future.”

Rejecting the Marxist dialectic, Zimmerman asserts that the “domestic family” would not be the agent of its own decay. When trade increased or migration occurred, the domestic family could in fact grow stronger. Instead, decay came from external factors such as changes in religious or moral sentiments. The domestic family was also vulnerable to intellectual challenges by advocates for the atomistic family. Zimmerman was not optimistic in 1947 about America’s or, more broadly, Western civilization’s future. Drawing on his work from the 1920s and ’30s, he finds signs of continued family health in rural America: “Our farm and rural families are still to a large extent the domestic type”; their “birthrates are relatively higher.” All the same, he knew from the historical record that the pace of change could be rapid. Once familism had weakened among elites, “all the cultural elements take on an antifamily tinge.” He continues: “The advertisements, the radio, the movies, housing construction, leasing of apartments, jobs—everything is individualized. … [T]he advertisers depict and appeal to the fashionably small family. … In the motion pictures, the family seems to be motivated by little more than self-love. … Dining rooms are reduced in size. … Children’s toys are cheaply made; they seldom last through the interest period of one child, much less several. … The whole system is unfamilistic.”

Near the end of Family and Civilization, Zimmerman predicts that “the family of the immediate future will move further toward atomism,” that “unless some unforeseen renaissance occurs, the family system will continue headlong its present trend toward nihilism.” Indeed, he predicts that the United States, along with the other lands born of Western Christendom, would “reach the final phases of a great family crisis between now and the last of this century.” He adds: “The results will be much more drastic in the United States because, being the most extreme and inexperienced of the aggregates of Western civilization, it will take its first real ‘sickness’ most violently.”

In the short run, Zimmerman was wrong. Like every other observer writing in the mid-1940s, he failed to see the “marriage boom” and “the baby boom” already stirring in the United States (and with equal drama in a few other places, such as Australia). As early as 1949, two of his students reported that, for the first time in U.S. demographic history, “rural non-farm” (read “suburban”) women had higher fertility than in either urban or rural-farm regions. By 1960, Zimmerman concluded in his book, Successful American Families, that nothing short of a social miracle had occurred in the suburbs: “This Twentieth Century … has produced an entirely new class of people, neither rural nor urban. They live in the country but have nothing to do with agriculture. … Never before in history have a free urban and sophisticated people made a positive change in the birth rate as have our American people this generation.” By 1967, near the end of his career, Zimmerman even abandoned his agrarian ideals. The American rural community had “lost its place as a home for a folk.” Old images of “rural goodness and urban badness” were now properly forgotten. The demographic future lay with the renewed “domestic families” replicating in the suburbs.

In the long run, however, the pessimism of Family and Civilization over the family in America in the second half of the twentieth century was fully justified. Even as Zimmerman wrote the elegy for rural familism noted above, the peculiar circumstances that had forged the suburban “family miracle” were rapidly crumbling. Old foes of the “domestic family” and friends of “atomism” came storming back: feminists, sexual libertines, neo-Malthusians, the “new” Left. By the 1970s, a massive retreat from marriage was in full swing, the marital birthrate was in free fall, illegitimacy was soaring, and nonmarital cohabitation was spreading among young adults. While some of these trends moderated during the late 1990s, the statistics have all worsened again since 2000. Zimmerman was right: America is taking its first real “sickness” most violently.

Any solution to our civilization’s family crisis, he argued, must begin “in the hands of our learned classes.” This group must come to understand the possibilities of “a recreated familism.” Accordingly, it is wholly appropriate for this new edition of Family and Civilization to appear from ISI Books in 2008. Zimmerman wrote the volume at the height of his powers of observation and analysis and as a form of scholarly prophecy. The times cry out for a new generation of “learned” readers for this exceptional book. It is important, too, to remember Zimmerman’s discovery that it had proven possible in times past for a “familistic remnant” to become a “vehicular agent in the reappearance of familism.” Hope for the future, Zimmerman concludes, “lay in the making of [voluntary] familism and childbearing [once again] the primary social duties of the citizen.” With the advantage of another sixty years, we can conclude that here he spoke the most essential, and the most difficult, of truths.

*

Need I say anything more?

Just Published! Seeing into the Future

From the introduction:

“The idea of doing this book was born somewhere in mid-2017. Its parent was Homo Deus, the second of three volumes written by my former student, the famous Yuval Noah Harari. As I went along, a single thought kept entering my mind: how can he, as well as many others who have engaged on a similar endeavor, know what the future will bring? How about Ray Kurzweil, Stephen Hawking, H. G. Wells, Jules Verne? And how about Nostradamus, Hildegard of Bingen, the Roman augurs, the Greek Pythia, the Hebrew prophets, the ‘Chaldean’ astrologers? What were their underlying assumptions, what kind of reasoning did they apply, and what methods did they use? The more I thought about these questions, the more difficult they appeared. If I dared tackle them, then this was precisely because I saw them as a terrific challenge.

The role that the willingness and ability to look into the future plays in human life, both individual and collective, can hardly be exaggerated. Call it anticipation, call it vision, call it foresight, call it prediction or call it forecasting: without it, human life as we know it is utterly impossible. Goals cannot be established, nor efforts towards realizing them launched; nor the consequences of reaching, or not reaching, those goals be considered. Neither can threats and dangers be identified and either be met head on or avoided. All this is as true today as it was when we first became human. Presumably it will remain true as long as human we remain. Briefly, but for foresight and the attempt to exercise it, much – perhaps most – of what we understand as thought would be impossible. ‘Blind we walk, till the unseen flame has trapped our footsteps,’ said the chorus in Sophocles’ Antigone.

Some philosophers and scientists go further still. To them the ability to anticipate the future, meaning something that does not yet exist, and to act accordingly does not belong to us humans alone. Instead they see it as an essential, perhaps the essential, characteristic of that mysterious and hard-to-define phenomenon, “life.” After all, ours is the age of so-called posthumanism. And one key pillar of posthumanism is a renewed emphasis on our evolutionary ancestors and the things we have in common with them; this specifically includes the belief that our brains are nothing more than ‘linearly scaled-up’ versions of primate ones, which in turn are nothing more than “linearly scaled-up” versions of vertebrate ones. And so on and on, all the way back to the “protoplasmal primordial atomic globules” of Gilbert and Sullivan fame. As a result, all sorts of qualities that until recently used to be considered exclusively human are now seen as being shared, at least to some extent, by many other animals as well. So with empathy, so with altruism, so with reason. And so, surprising as it may sound, with morality and what many believe to be morality’s origin, religious feeling. Some vague form of the last-named, the greatest living expert on bonobos has been telling us, can be found among those animals.”

Want to know more? Get the book.

The Reign of Uncertainty

One of the principal clichés of our age, endlessly repeated, is that our ability to look into the future and control our fate has been growing. So much so that, in the words of Yuval Harari, we are about to transform ourselves from Homo Sapiens, originally a small, weak and vulnerable creature constantly buffeted by his surroundings, into a quasi-omnipotent Homo Deus. The main engine behind this process, we are told, is represented by fast-accumulating developments in science and technology. Those developments, in turn, are both cause and consequence of the kind of education that helped us cast off superstitions of every kind and, in the words of Immanuel Kant (1724-1804), “dare to know.” Some would go further still and argue that, if such were not the case, there might be little point in pursuing any kind of learning in the first place.

For a long time, this line of thought was closely related to belief in progress. Today it is shared both by those who are optimistic in regard to the future and by those who, like Harari, keep warning against the disastrous consequences that our very successes may bring down upon our heads. As by changing the climate, destroying the environment, running out of drinking water, covering the planet with plastic, breeding antibiotic-resistant superbugs—vide the coronavirus outbreak—and being enslaved, perhaps even exterminated, by some self-seeking supercomputer out on a roll. But is it really true that we are better at looking into the future, and consequently more able to control it, than our ancestors were? And that, as a result, the human condition has fundamentally changed? For some kind of answer, consider the following.

  1. The Demise of Determinacy

In Virgil’s words, “Felix, qui potuit rerum cognoscere causas” (happy, he who can discern the causes of things). For millennia on end, though, so deficient was our understanding of the future that almost the only way to get a handle on it was by enlisting some kind of supernatural aid. As by invoking the spirits, consulting with the gods (or God), tracing the movements of the stars, watching omens and portents of every kind, and, in quite some places, visiting or raising the dead and talking to them.

Come the seventeenth century, many of these methods were finally discarded. If not completely so, at any rate to some extent among the West’s intellectual elite. Their place was taken by the kind of mechanistic science advocated by Galileo Galilei, Isaac Newton, and others. Nor was this the end of the matter. Many nineteenth-century scientists in particular believed not just that the world is deterministic but that, such being the case, they would one day be able to predict whatever was about to take place in it. One of the best-known statements to that effect came from the polymath Pierre-Simon Laplace (1749-1827). It went as follows:

An intellect [not a demon, which was substituted later for effect] which at a certain moment would know all forces that set nature in motion, and all positions of all items of which nature is composed, if this intellect were also vast enough to submit these data to analysis, it would embrace in a single formula the movements of the greatest bodies of the universe and those of the tiniest atom; for such an intellect nothing would be uncertain and the future just like the past would be present before its eyes.

In such a world not only God but chance, randomness, probability and the unexpected would be eliminated, leaving only sheer causality to rule supreme. Other scientists, such as William Thomson (Lord Kelvin), took matters further still, claiming that science had advanced to the point where only a few minor gaps remained to be closed. No less a figure than Stephen Hawking, in his last work, Brief Answers to the Big Questions, admitted to having once believed just that. However, the very scientific progress that gave rise to this kind of optimism also ensured that it would not last for long. Just as, regardless of what number you multiply zero by, in the end zero is still what you get.

Starting with the discovery of radioactivity in 1896, it has become increasingly evident that some of nature’s most basic processes, specifically the decay of atoms and the emission of particles, are not deterministic but random. For each radioactive material, we know what percentage of atoms will decay within a given amount of time. But not whether atom A is going to break up before (or after) atom B and why. Subsequent discoveries such as quantum mechanics (Max Planck), relativity (Albert Einstein), the uncertainty principle (Werner Heisenberg), the incompleteness theorem (Kurt Gödel), and chaos theory (Edward Lorenz) all helped extend the idea of incalculability into additional fields.
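In textbook form (the standard decay law, cited here purely by way of illustration, not anything specific to the discoveries just listed), the statistics of decay run as follows:

$$N(t) = N_0\,e^{-\lambda t}, \qquad t_{1/2} = \frac{\ln 2}{\lambda}$$

Here $N_0$ is the number of atoms present at the start and $\lambda$ the material’s decay constant. The law predicts, with great accuracy, what fraction of the atoms will still be there after any given time $t$, and that half of them will be gone after one half-life $t_{1/2}$; about the fate of any individual atom it says nothing at all.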

To be specific, quantum mechanics started life as a theoretical construct that could only be applied to the world of subatomic particles, hence could be more or less ignored by everyone but a very small number of nuclear scientists. However, since then it has been climbing out of the basement, so to speak. As it did so it acquired a growing practical significance in the form of such devices as ultra-accurate clocks, superfast computers, quantum radio (a device that enables scientists to listen to the weakest signal allowed by quantum mechanics), lasers, unbreakable codes, and tremendously improved microscopes.

At the heart of relativity lies the belief that, in the entire physical universe, the only absolute is the speed of light. Taken separately, both quantum mechanics and relativity are marvels of human wisdom and ingenuity. The problem is that, since they directly contradict one another, in some ways they leave us less certain of the way the world works than we were before they were first put on paper. The uncertainty principle means that, even as we do our best to observe nature as closely as we can, we inevitably cause some of the observed things to change. It may even mean that time and space are themselves illusions, mental constructs we have created in an effort to impose order on our surroundings but having no reality outside our own minds. The incompleteness theorem put an end to the age-old dream—it goes back at least as far as Pythagoras in the sixth century BCE—of one day building an unassailable mathematical foundation on which to base our understanding of reality. Finally, chaos theory explains why, even if we assume the universe to be deterministic, predicting its future development may not be possible in a great many cases. Including, to cite but one well-known example, whether a butterfly flapping its wings in Beijing will or will not cause a hurricane in Texas.
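How quickly determinism parts company with predictability is easy to demonstrate. The sketch below (my own illustration, using the logistic map, a standard textbook example of chaos; the starting values are arbitrary) follows two trajectories of one and the same deterministic rule whose initial conditions differ by one part in a billion:

```python
# Sensitive dependence on initial conditions, illustrated with the
# logistic map x -> r * x * (1 - x). At r = 4 the map is chaotic.

def logistic_trajectory(x0, r=4.0, steps=50):
    """Iterate the logistic map from x0 and return every value visited."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.300000000)
b = logistic_trajectory(0.300000001)  # differs by one part in a billion

for step in (0, 10, 20, 30, 40, 50):
    print(f"step {step:2d}: {a[step]:.6f} vs {b[step]:.6f}")
```

Within a few dozen steps the two runs have nothing in common: a discrepancy far too small for any instrument to detect has swallowed the forecast whole, even though the rule itself is perfectly deterministic.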

  2. Tripping Over One’s Own Robe

Such, then, has been the tendency of post-1900 science: to become not more deterministic but less so. As a result, no longer do we ask the responsible person(s) to tell us what the future will bring and whether to go ahead and follow this or that course. Instead, all they can do is calculate the probability of X taking place and, by turning the equation around, the risk we take in doing (or not doing) so. However, knowledge also presents additional problems of its own. Like a robe that is too long for us, the more of it we have the greater the likelihood that it will trip us up.

First, no knowledge can be better than the instruments used to measure the parameters of which it consists. Be they size, mass, temperature, rigidity, speed, duration, or whatever. And no instrument that physicists use is, or can be, perfectly precise and perfectly accurate. Even the most recent, strontium-based, clocks are expected to be off by one second every 138 million years, a fact which, chaos theory says, can make a critical difference to our calculations. The more accurate our instruments, moreover, the more likely they are to interfere with each other. The situation in the social sciences is much worse still, given that both the numbers on which most researchers base their conclusions and the methods they use to select and manipulate those numbers are often extremely inaccurate and extremely slanted. So much so as to render any meeting between them and “the truth” more or less accidental in many cases.

Second, there is far too much knowledge for any individual to master. Modern authors, seeking to impress their readers with the speed at which knowledge expands, often leave the impression that this problem is new. In fact, however, it is as old as history. In China, the Sui-era imperial library was supposed to contain 300,000 volumes. That of the Ptolemies in Alexandria held as many as half a million. And this is to assume that knowledge was concentrated inside libraries—whereas in fact the vast majority of it was diffused in the heads of countless people, most of them illiterate, who left no record of any kind. Since then the problem has only been getting worse. Today, anyone seriously claiming to have written a book containing “all that is most wonderful in history and philosophy and the marvels of science, the wonders of animal life revealed by the glass of the optician, or the labors of the chemist” (The World of Wonders, London, 1869) would be quickly dismissed as either a featherweight or a charlatan.
Third, not only is there too much knowledge for anyone to master but in many cases it keeps developing so fast as to suggest that much of it is mere froth. Whether this development is linear and cumulative, as most people believe, or proceeds in cycles, as was suggested by Thomas Kuhn, is, in this context, immaterial. One of the latest examples I have seen is the possibility, raised by some Hungarian scientists just a few days before these words were written in November 2019, that the world is governed not by the long-established four forces—gravity, the electromagnetic, the strong and the weak—but by five (and perhaps more). Should the existence of the so-called photophobic, or light-fearing, force be confirmed, then it has the potential to blow all existing theories of the world’s behavior at the sub-atomic, hence probably not only at the sub-atomic, level to smithereens.

Fourth, we may often have a reasonably accurate idea of what the consequences of event A, or B, or C, may be. However, working out all of those consequences is much more difficult. The more so because they may (and are likely to) have consequences of their own; and so on in an expanding cascade that, in theory and sometimes in practice as well, does not have a clear end. Some of the consequences may be intended (in which case, if everything goes right, they are foreseeable), others not. Some may be beneficial, others harmful. Some may bend backwards, so to speak, turning around and impacting on C, or B, or A, which in turn has consequences, and so on until the cascade turns into an entire series of interrelated cascades. That is particularly true in the social sciences, where the very concepts of cause and consequence may be out of place and reality may be either reciprocal or circular.

Some consequences may even be perverse, meaning that they lead to the opposite of what was intended. For example, when the scientists employed on the Manhattan Project worked on a weapon to be used in war—there hardly ever was any doubt that it would be—they could not know that, to the contrary, it would render the kind of war on which their country was then engaged impossible. Both the Chernobyl and the Fukushima reactors were provided with elaborate, highly redundant, safety systems; but when the time came those systems, rather than preventing the accidents, only made them worse.

In brief, a simple, elegant “theory of everything” of the kind that, starting with Laplace, scientists have been chasing for two centuries remains out of sight. What we got instead is what we have always had: namely, a seething cauldron of hypotheses, many of them conflicting. Even when we limit ourselves to the natural sciences, where some kind of progress is undeniable, and ignore the social ones, where it is anything but, each question answered and problem resolved only seems to lead to ten additional ones. Having discovered the existence of X, inevitably we want to know where it comes from, what it is made of, how it behaves in respect to A and B and C. Not to mention what, if any, uses it can be put to.

The philosopher Karl Raimund Popper went further still. Scientific knowledge, he argued, is absolutely dependent on observations and experiments. However, since one can always add 1 to n, no number of observations and experiments can definitely confirm that a scientific theory is correct. Conversely, a single contradictory observation or experiment can provide sufficient proof that it is wrong. Science proceeds, not by adding knowledge but by first doubting that which already exists (or is thought to exist) and then falsifying it. Knowledge that cannot, at any rate in principle, be shown to be false is not scientific. From this it is a small step towards arguing that the true objective of science, indeed all it can really do, is not so much to provide definite answers to old questions as to raise new ones. It is as if we are chasing a mirage; considering our experience so far, probably we are.

  3. The Drunk at the Party

If all this were not enough, the problem of free will persists. In the words of the French anthropologist Claude Levi-Strauss, it is the drunken guest who, uninvited, breaks up the party, upsetting tables and spreading confusion. Much as scientists may claim that it is simply a delusion—even to the point of showing that our bodies order us to raise our hands as much as ten seconds before we make a conscious decision to do so—our entire social life, specifically including such domains as education and justice, continues to rest on the assumption that we do in fact have a choice. As between action and inaction; the serious and the playful; the good and the evil; the permissible and the prohibited; that for which a person deserves to be praised, and that for which he deserves to be punished. Long before King Hammurabi had the first known code of law carved in stone almost four millennia ago, a society that did not draw such distinctions could not even be conceived of.

So far, neither physicists nor computer experts nor brain scientists, working from the bottom up, have been able to close the gap between matter and spirit in such a way as to endow the former with a consciousness and a will. Economists, sociologists and psychologists, working their way from the top down, have not been able to anchor the emotions and ideas they observe (or assume) people to have in underlying physical reality. Whichever route we take, the complete understanding of everything that would be necessary for prediction to be possible is as remote as it has always been. In no field is the crisis worse than in psychology; precisely the science (if one it is) that, one day, will hopefully explain the behavior of each and every one of us at all times and under all circumstances. Its claim to scientific validity notwithstanding, only 25-50 percent of its experimental results can be replicated.

Given the inability of science to provide us with objective and reliable visions of the future, those we have, as well as the courses of action we derive from them, depend as much on us—our ever-fluid, often capricious, mindset, our ira and our studio—as they have ever done. Elation, depression, love, euphoria, envy, rage, fear, optimism, pessimism, wishful thinking, disappointment, and a host of other mental states form a true witches’ brew. Not only does that brew differ from one person to another, but its various ingredients keep interacting with each other, leading to a different mixture each time. Each and every one of them helps shape our vision today as much as they did, say, in the Rome of the Emperor Caligula; the more so because many of them are not even conscious, at any rate not continuously so. In the process they keep driving us in directions that may or may not have anything to do with whatever reality the physicists’ instruments are designed to discover and measure.

  4. The Persistence of Ignorance

To conclude, in proposing that knowledge is power Francis Bacon was undoubtedly right. It is, however, equally true that, our scientific and technological prowess notwithstanding, we today, in our tiny but incredibly complex corner of the universe, are as far from gaining complete knowledge of everything, hence from being able to look into the future and control it, as we have ever been.

Furthermore, surely no one in his right mind, looking around, would suggest that the number of glitches we all experience in everyday life has been declining. Nor is this simply a minor matter, e.g. a punctured tire that causes us to arrive late at a meeting. Some glitches, known as black swans, are so huge that they can have a catastrophic effect not just on individuals but on entire societies: as, for example, happened in 2008, when the world was struck by the worst economic crisis in eighty years, and as coronavirus is causing right now. All this reminds me of the time when, as a university professor, my young students repeatedly asked me how they could ever hope to match my knowledge of the fields we were studying. In response, I used to point to the blackboard, quite a large one, and say: “imagine this is the sum of all available knowledge. In that case, your knowledge could be represented by this tiny little square I’ve drawn here in the corner. And mine, by this slightly—but only slightly—larger one right next to it.” “My job,” I would add, “is to help you first to assimilate my square and then to transcend it.” They got the message.

There thus is every reason to believe that the role ignorance concerning the future, both individual and collective, plays in shaping human life is as great today as it has ever been. It is probably a major reason why, even in a country such as France where logic and lucidity are considered national virtues and three out of four people claim they are not superstitious, almost half touch wood, and about one third say they believe in astrology. Nor are the believers necessarily illiterate old peasants. Most young people (55 percent) say they believe in the paranormal. So do many graduates in the liberal arts and 69 percent of ecologists. As if to add insult to injury, France now has twice as many professional astrologers and fortune tellers as it does priests. Both black masses and Satan worship have been on the rise. The situation in the U.S. is hardly any different.

How did old Mark Twain (supposedly) put the matter? Prediction is difficult, especially of the future.

A History of the Future

Peter J. Bowler, A History of the Future: Prophets of Progress from H. G. Wells to Isaac Asimov, Cambridge, Cambridge University Press, 2017.

As Yuval Harari’s Homo Deus: A Brief History of Tomorrow shows, “histories” of the future are all the rage at the moment. Why that is, and what it means for conventional histories of the past, I shall not try to discuss. Where Prof. Bowler’s volume differs from the rest is that it is real history. Instead of trying to guess what the future may be like, he has produced a history of what people thought it might be like. The outcome is fascinating.

To the reviewer, the book provides so many possible starting points that it is hard to know where to begin. True, there had always been people who envisioned a better society. Most of the time, though, that society was located in the past—as with Plato and Confucius—in the afterworld—as with St. Augustine—or on some remote island (from Thomas More to about 1770). “Prophets of Progress” started making their appearance towards the end of the eighteenth century, when the industrial revolution was making itself felt and when the idea of progress itself took hold. As technical advances became more frequent and more important during the nineteenth century, their number increased. Starting at least as early as 1880, for any half-literate person not to encounter their visions was practically impossible. Even if he (or, for god’s sake, she) only got his impressions from pulp magazines, themselves an invention of the late 1920s. And even if he was a boy (rarely, a girl) who got his information from the long defunct Meccano Magazine, as I myself did.

Bowler himself proceeds not author by author, nor chronologically, but thematically. First he discusses the background of some of the authors in question. Quite a few turn out to have been scientists, engineers or technicians, a fact which in Bowler’s view gave them an advantage. Many were moved by personal interests, particularly the need to promote their own inventions. Next he takes us over one field after another; from “How We’ll Live,” through “Where We’ll Live,” “Communicating and Computing,” “Getting Around,” “Taking to the Air,” “Space,” “War,” “Energy and Environment,” all the way to “Human Nature.” Some predictions, such as the discovery of a method to counter gravity, travel at speeds greater than that of light, and tele-transportation, proved totally wrong and have still not come about, if they ever will. Others, such as air travel, TV, helicopters, and megacities—though without the people-moving conveyors many visionaries thought they could see coming—were realized rather quickly. Often it was not the technical characteristics of a new invention but its commercial possibilities, or lack of them, which determined the outcome.
Interestingly enough, two major inventions whose role very few people saw coming were radar and computers. The inability to envisage radar helps explain why, between about 1920 and 1939, fictive descriptions of future war almost always included apocalyptic visions of cities totally destroyed and even entire countries annihilated. The initial lack of attention paid to computers was probably linked to the fact that, until 1980 or so, they were only used by government and large corporations, as well as to the somewhat esoteric nature of the machines themselves. As a result, it was only after the invention of the microchip around 1980 that their full impact on daily life, both that of individuals and that of society as a whole, began to be understood.

William Blake (“dark satanic mills”) and the Luddites having gone, until 1914 the reception new inventions got was normally positive. After all, who could argue with cheaper goods, faster travel, electric trams (taking the place of the clumsy, dirty horses of old), better control of many infectious diseases, and a zillion other things that made the lives of those who could afford them better and more comfortable? Next, the wind shifted. World War I with its millions of victims having come and gone, it was hard to remain optimistic. The change is well illustrated by the difference between H. G. Wells’s A Modern Utopia (1905) and Yevgeny Zamyatin’s We (1921). The former is lightly written and not without an occasional bit of humor. It describes a world which, though it may not be to everyone’s taste, is meant to be an improvement on the existing one. The latter is a grim tale of totalitarian government and control that extends right down to the most intimate aspects of life.

From this point on most new inventions have usually met with mixed reactions. Some people looked forward to controls that would reduce the proportion of the “unfit” in society, others feared them. While the former approach seemed to have been finally buried by the Nazi atrocities, later fear of global overpopulation caused it to return. The advent of television for entertainment and education was countered by the fear lest it turn all of us into what, much later, came to be called couch potatoes. Many welcomed continuing industrialization and growing productivity, but others worried about eventual shortages of resources as well as the things pollution might be doing both to the environment and, through it, to our own health. As Bowler points out, most prophecies were based on the relentless advance of technology. However, the arguments pro and contra changed much more slowly. Indeed they displayed a clear tendency to move in cycles, repeating themselves every generation or so.

One particularly fascinating story Bowler does not follow as carefully as he might have concerns nuclear weapons. As he notes, following the discovery of radium and radiation in the 1890s more than one author started speculating about the possibility of one day “liberating” the enormous energy within the atom and using it for military purposes. So much so, indeed, that one World War II American science fiction writer had to put up with a visit from the FBI because of his stories’ uncanny resemblance to what, without his knowledge, was going on at Los Alamos. Coming on top of steadily improving “conventional” (the term, of course, is of much later vintage) weapons, this new form of energy threatened to literally destroy the world. Yet after the first “atomic” weapons were used against Hiroshima and Nagasaki in 1945 there was a tendency to belittle the danger they posed. Especially by way of radiation, which some politicians and officers declared to be a “bugaboo” hardly worth taking seriously. More twists and turns followed, culminating in Jonathan Schell’s dark 1982 volume, The Fate of the Earth. What practically all the authors Bowler discusses missed was the ability of nuclear weapons to impose what is now known as “the long peace.” An ability due, not to the efforts of well-meaning protesters, but precisely to proliferation.

But I am beginning to quibble. Based on a vast array of sources—mostly, it must be admitted, British and American ones—clear and very well written, Bowler’s book is a real eye-opener. For anyone interested in the way society and technology have interacted and, presumably, will continue to interact, it is a must.

When I Dipt into the Future

I. What I am Trying to Do

As some readers will no doubt know, the title of this post has been taken from Alfred Tennyson’s poem “Locksley Hall.” Written in 1835, and first published seven years later, it recounts the musings of a rejected suitor. Wandering about, at one point he reminisces about the happy times when he “dipt into the future/far as human eye could see/Saw the vision of the world/And all the wonder that would be.” But just how did he do so? Metaphorically speaking, what kind of “bucket” did he bring to bear?

Tennyson’s unnamed protagonist was hardly the only one who ever tried his hand at this game. To mention but a few outstanding names, when the prophets Isaiah (and Jeremiah, and Ezekiel, and all the rest) tried to foresee what the future would bring, what methodology did they use? And how about the Greek Pythia? The Roman Sibyl? Nostradamus? Jules Verne? H. G. Wells? Stephen Hawking? Ray Kurzweil? Yuval Harari?

By now, I have spent a year trying to answer these questions. In the hope, of course, of one day writing a book about them. One that will put the matter into perspective and explain, if not how good or bad the various methods are and how they may be improved, at any rate when and where they originated, how they developed, the principles on which they rested, and how they related to others of their kind. As a first step, I want to devote today’s post to providing a brief summary of some of the most important methods people have been using.

II. Some Methods of Looking into the Future Explained

1. Shamanism. Shamanism is widespread all over the world, particularly among societies made up of hunter-gatherers and horticulturalists. Tribes without rulers, as I have called them in another book. Imported into modern cities, especially those of the so-called Third World, in many places it is active even today. At the root of shamanism is the assumption that, to look into the future, it is necessary first of all to leave the “normal” world by entering into an altered state of consciousness (ASC). The methods used to do so vary enormously from one culture to another. Among the most common are music (especially drumming), dancing, prayer, solitude, fasting, long vigils, sexual abstinence (or its opposite, engaging in orgies), breathing exercises, alcoholic drinks, hallucinogenic drugs, and many others.
In each of these cases, the objective is to send the shaman on a mysterious voyage which will take him into a different country, realm, or reality. One in which the difference between present, past and future is eliminated and the last-named becomes an open book to read.

2. Prophecy. Also known as revelation, prophecy of the kind many of us are familiar with from the Old Testament in particular is little more than an institutionalized form of shamanism. The difference is that it is not the spirits but God Himself who supposedly reveals himself to the prophet and speaks through his mouth. Sometimes, as in the famous case of Jonah, He does so even against the prophet’s will. Whereas shamans were almost always illiterate, prophets tended to spend their lives in societies where either they themselves or others were able to read and write. Often the outcome was a more detailed, more cohesive idea of what the future might bring.

3. The interpretation of dreams. Like prophecy, the interpretation of dreams goes back at least as far as the Old Testament. It, too, rests on the assumption that, by entering upon an ASC, people will be enabled to see things which, in their waking state, they cannot.

As the Biblical story about Joseph shows, dreams were supposed to deliver their message not in simple form but with the aid of symbols. Lists of such symbols are known from ninth-century BCE Assyria and continue to be published today. Note, however, that interpreting the dreams and relating them to future events was the task not of the person who had them but of specialists who approached the problem in a cool, analytic manner. Before delivering their verdict, they often took the dreamer’s age, sex and personal circumstances into account.

4. The Greek oracles. Oracles were extremely popular in Greece and Rome. To use the example of Delphi, the most important one of all, it centered on the Pythia. She was a woman who, sitting on a tripod in a dark subterranean abode, came under the influence of foul gases emanating from a split in the earth. Going into a sort of trance, the Pythia let forth confused gibberish which was supposed to contain the clue to the future. Next, a special college of priests interpreted her words. Oracles, in other words, resembled the interpretation of dreams in that prediction was divided into two stages, each of which was the responsibility of a different person or persons.

5. Necromancy. The best-known case of necromancy (from the Greek nekros, dead, and manteia, divination) is the one described in the Old Testament. King Saul, wishing to learn the outcome of a battle which will take place on the next day, asks a witch to raise the spirit of the prophet Samuel from the dead. Whereupon Samuel tells Saul that, on the morrow, he and his sons too will be dead. Necromancy also occurs in Greek and Roman sources. Virgil in particular has Aeneas visit the underground abode of the dead where he is shown the future of Rome over a period of about a millennium, no less. The basic assumption underlying necromancy is that the dead, having crossed a certain threshold, know more than the living do. Even today in some cultures, procedures for raising the dead and consulting them concerning the future are commonplace.

6. Astrology. Along with shamanism, astrology is probably the oldest method for trying to look into the future. Its roots go as far back as Babylon around 3,000 BCE. That is why, in Imperial Rome, it was known as the “Chaldean” science. At the heart of astrology is the proposition, so obvious as to be self-evident, that the sun and moon (which, before Copernicus, were classified as planets) have a great and even decisive impact on life here on earth. Building on this, its students try to make that impact more specific by also taking into account the movements of the remaining planets, the fixed stars, and the relationships among all of these.

Even today, almost one third of Americans are said to believe in astrology. Whether or not that belief is justified, it does not change the fact that, unlike any of the above-mentioned methods, astrology is based not on any kind of ASC but on observation and calculation. Of the kind that is practiced, and can only be practiced, by perfectly sober people in full possession of their faculties. So mathematically rooted was astrology that it acted as the midwife of astronomy, helping the latter become the queen of the sciences. This position it retained right until the onset of the scientific revolution during the seventeenth century.

7. Divination. As Cicero in his book on the topic makes clear, neither the Greeks nor the Romans ever took an important decision without trying to divine its consequences first. Both civilizations also maintained colleges of specialized priests who were in charge of the process. The most important types of divination were observing the flight of birds on the one hand and examining the entrails of sacrificial animals on the other.
Like astrology, but unlike shamanism, prophecy, dreams, the oracles, and necromancy, divination did not depend on people becoming in any way ecstatic, mysteriously travelling from one world to another, and the like. Instead it was a “rational” art, coolly and methodically practiced by experts who had spent years studying it and perfecting it. Today the same is true for such techniques as numerology, Tarot-card reading, etc.

8. History (a). The idea that history is a linear, non-repeating process that leads in a straight line from far in the past to far into the future is a relatively recent one. In this form it only made its appearance after 1750 or so. Before that date history was considered to be either the province of “again and again” (as the historian Jacob Bronowski used to put it) or one of regularly occurring cycles (as Plato and many others held). If the former, and assuming that the same circumstances always lead to the same effects, then the resulting patterns could be used to look into the future; such a view is very evident both in Thucydides and in Machiavelli. If the latter, then in principle at any rate the future could be predicted on the basis of the point in the cycle that had been reached.

9. History (b). Both the idea that historical patterns repeat themselves and the idea that history itself moves in cycles are alive and well. Starting with the Enlightenment, though, they have been joined by two other ideas, both of which are often used for prediction. The first, which has since become easily the most common of all, was the discovery of “trends,” a term which was hardly used before 1880 or so but which has since grown into one of the buzzwords of our age. Trends made extrapolation possible. A good example is Moore’s Law, which predicted that the number of transistors on a chip (and with it, in the popular version, computing power) would double every eighteen months to two years. Used by countless people, the characteristic hallmark of this method is the oft-repeated phrase “already now.” “Already now” the situation is such and such; hence we can expect it to be even more so in the future.
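
To show how little machinery extrapolation of this kind actually requires, here is a minimal sketch in Python. The 1971 starting figure (the Intel 4004, with some 2,300 transistors) is well documented; the two-year doubling period, and above all the assumption that the trend simply continues, are exactly that, assumptions.

```python
# A minimal sketch of prediction by trend extrapolation, Moore's Law style.
# The baseline (Intel 4004, ~2,300 transistors in 1971) is well known; the
# doubling period and the trend's continuation are assumptions.

def extrapolate(baseline: float, base_year: int, target_year: int,
                doubling_years: float = 2.0) -> float:
    """Project a quantity forward, assuming it doubles every `doubling_years` years."""
    return baseline * 2 ** ((target_year - base_year) / doubling_years)

transistors_1971 = 2_300
for year in (1981, 1991, 2001, 2011):
    print(year, f"{extrapolate(transistors_1971, 1971, year):,.0f}")
```

The point is not the precision of the numbers but the logic: “already now” the quantity stands at such and such; therefore, projecting the curve forward, it will stand at so much more.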

The second method consisted of dialectics. The basic idea goes back to Heraclitus’ saying, around 500 BCE, that all things originate in agon, i.e. “strife.” In its modern form, the first to bring it to the fore was the nineteenth-century German philosopher Georg Hegel. In his hands it was applied to intellectual history above all. Next, it was taken over by Karl Marx. The latter, turning Hegel on his head, applied it to material factors. Both men believed that any historical trend must of necessity give rise to its opposite, thus making prediction possible in principle.

To retrace our steps, history (a) and (b) provide four different ways of looking into the future. Two of those, based on the idea that there is no fundamental change, are age-old; the other two, assuming that change is the very stuff of which history is made, are of more recent vintage. What all four have in common is that there is no room in them for ASC. Instead they are based, or are supposed to be based, on the sober study of recorded facts to which anyone has access.

10. Models. Modeling, like history, owes nothing to ASC. Essentially it consists of building models in order to understand how various factors that shape reality, past and hopefully future, are related and interact. The earliest, and for millennia almost the only, models were developed in order to represent the movements of the heavenly bodies. A very good example, which still survives, is the great astronomical clock of Strasbourg whose origins go back to the fourteenth century.

By definition, models are based on mathematical calculations. The more accurate the calculations, the better the model. But not all mathematical attempts to understand the world have been translated into nuts and bolts. Most remained on paper in the form of algorithms. Following the publication of Newton’s Principia Mathematica in 1687 the popularity of models of this kind increased. Applied to the physical world around us, currently they represent the most sophisticated, often almost the only, method for looking into the future we have. Some go so far as to predict developments that will take place in millions and even billions of years.

Attempts to extend mathematical modelling of the future from astronomy and physics to social life go back to the Renaissance, when the first firms specializing in insurance were created. Assisted by the establishment of statistical bureaus from about 1800 on, their use increased during the second half of the nineteenth century in particular. The introduction of computers, which made possible the processing of vast bodies of data at enormous speed, caused reliance on such models to grow exponentially. This has now been taken to the point where anyone who does not use, or pretend to use, computers for prediction is likely to be regarded as a simpleton.

III. Some Tentative Concluding Comments

To misquote Tennyson, methods for looking into the future go back as far as human eye can see. Probably there never has been, nor ever will be, a society which did not have them or did not try to devise them as best it could. Broadly speaking, such methods may be divided into two kinds. The first relies on ASC and focuses on applying a variety of techniques for entering upon those states. The second is based, or is supposed to be based, on rational, often mathematical, analysis and calculation. Some methods, such as the interpretation of dreams and oracles, separate the person who experiences an ASC from the one who explains her or his visions and utterances. By so doing they combine the two kinds.

The two basic kinds have always existed side by side. However, with the advent of the scientific revolution their relative importance changed. Previously even educated people—often enough, the best-educated people—put their trust in ASC in its various forms. Not so in the centuries since 1700 or so, when methods of this sort were pushed to the margins, so to speak.

However, there is no proof that even the most “rational” methods, used with or without the aid of computers, obtain better results than the rest. Generally speaking, the less grounded in physics the future we are trying to foresee, the more true this is. Furthermore, and presumably because visions have greater emotional appeal than equations, the greater the stress on individuals and societies, the more likely they are to revert to ASC.

The topic is enormous in size, fascinating, and very difficult. Which is why, at this point, this is all I have to say about it.

No Exit

As some readers will know from some of my previous posts, I have been interested in the future and, even more so, in the methods people of various times and places have developed in their attempts to predict it. One day, perhaps, I shall write a book about that endlessly fascinating topic. Until then, here are some preliminary reflections on it.

* Attempts at prediction are as old as humanity. As far as we can make out, Stone-Age hunters going on an expedition used to ask their shaman whether they would return alive, return loaded with quarry, and so on. We today are always looking for some device that will enable us to see where the stock exchange is heading.

* We today tend to see prophecy, astrology, divination, and similar practices as leftovers from former, less sophisticated times. However, Cicero’s brother Quintus took the opposite view: he held that only civilized societies could bring them to perfection.

* Historically, predictions have often taken poetic form. To this day, no one has been able to improve on the Old Testament in this respect. Or on good old Nostradamus (1503-66), perhaps the most famous seer who ever lived, whose quatrains (four-line poems) have been read, interpreted, and believed by immense numbers of people over four and a half centuries. But no longer. Present-day “scientific” forecasts tend to consist of prose texts illustrated with the aid of tables and graphs.

* It used to be that practically all attempts to look into the future involved some kind of divine assistance. The old Hebrew prophets claimed that God had got hold of them—on occasion, as with Jonah, even against their will—and spoke through their mouths. So did St. John. At the Oracle of Delphi, supposedly it was Apollo who gave his advice by way of the Pythia. As Nostradamus put it, without religious faith even the mathematical calculations which he and others used to cast horoscopes did not work. That, however, no longer applies. Regardless of whether it takes the form of mathematical modeling, or surveys, or “data mining,” most “serious” attempts at prediction have become strictly secular.

* Prophecy used to be closely linked with madness. The abovementioned Pythia uttered her prophecies while seated on a tripod positioned over a deep split in the ground from which emerged some kind of gas—said to be Apollo’s breath—which befuddled her. Cassandra, the daughter of King Priam of Troy who was cursed in that no one ever believed her (quite accurate) predictions, was often portrayed as incoherent and half mad. When Saul, the future King of Israel, went chasing his lost she-asses and suddenly found himself prophesying, people thought that he had gone off his rocker. That, too, no longer applies. Looking into the future, or trying to do so, is now often considered a rational, quite sober activity. One on which billions are spent and on which some of the best minds, from that of computer guru Ray Kurzweil down, are engaged. By contrast, modern psychiatrists would like nothing better than to consign those who try to predict the future on the basis of ecstasy to the loony bin. As, in fact, they not seldom are.

* Except in astrology, past attempts to look into the future seldom involved mathematics. Even as late as the early years of the twentieth century, it never occurred to the famous British science fiction writer H. G. Wells that models might have something to do with it. Basically all the tools he had were his knowledge of some recent inventions, a few rather simple trends, and his own extraordinarily fertile imagination. That is no longer true. To the contrary: the more mathematics such an attempt involves, and the fewer therefore the people who understand it, the better.

That said, there are also some things that have not changed:

* Some of the oldest methods, astrology in particular, are still in use. True, they have been pushed off center stage by supposedly better, more rational and more sophisticated methods, such as those employed by economists. As newspaper and magazine columns confirm, however, many people still take notice of them and remain interested in them.

* Many prophecies used to be rather obscure, often deliberately so. To adduce but one famous example, the Pythia told the envoys of King Croesus that, if he went to war with neighboring Persia, he would bring down a great kingdom. He believed her, took the offensive, and was defeated. The explanation? The Pythia had not said which kingdom would be destroyed.

Similarly, many of today’s forecasts are “probabilistic.” Meaning that, instead of providing yes/no answers, all they yield are estimates of the chances of this or that happening. From the point of view of those who make them, of course, such forecasts have the advantage that they are always right.

* To pursue this thought, here is a story that used to be told about a former Israeli chief of staff, General Rafael Eitan (served 1978-83). One day he was asked to approve some operation the air force was preparing. When he asked about the weather, he was told that there was a twenty percent chance of rain. “Wrong,” he said. “The correct answer is fifty percent. Either it will rain, or it won’t.” He had a point, didn’t he?
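
For what it is worth, here is a minimal simulation, in Python and with invented numbers, of what a “twenty percent chance of rain” actually claims: not that rain and no-rain are two equally likely outcomes, but that over many days carrying that forecast it should rain on roughly one day in five.

```python
# A toy simulation of a "20 percent chance of rain" forecast. Over many
# days carrying that forecast, rain should occur on about one day in five.
# All numbers here are invented for illustration.
import random

random.seed(42)
days = 10_000
rainy = sum(random.random() < 0.20 for _ in range(days))
print(f"Rained on {rainy} of {days} days ({rainy / days:.1%})")
# Prints a figure close to 20%, not 50%: two possible outcomes,
# but not two equally likely ones.
```

Which is also why such a forecast can never be caught out by any single day’s weather; only a long run of days can vindicate or embarrass the forecaster.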

* The use of computers, models and mathematics notwithstanding, to date there is not a shred of evidence that we secular, supposedly rational moderns are one bit better at looking into the future than, say, the Babylonian astrologers who exercised their craft four thousand years ago. If the book of Genesis may be believed, the seven good and seven lean years which Joseph predicted on the basis of Pharaoh’s dream did not come as a surprise, the way the crashes of 1929 and 2008 did. Or, for that matter, the boom of the Clinton years.

But suppose, someone might say, we had been able to accurately predict the future; what then?

* If it happens, it will probably form the most important “singularity” ever, far eclipsing anything those who so often play with that concept have come up with. More important, say, than the development of the artificial superintelligence which Ray Kurzweil has been trumpeting. And more important than a meeting with an extraterrestrial civilization.

* Such a world would require that all information at the predictors’ disposal be correct, accurate, and comprehensive. Right down to what is happening in each one of the hundred billion or so cells and trillions of connections (synapses) which make up the brain of each and every one of us. All causes and all effects would have to be known and perfectly understood.

* In such a world movements and impacts would still be possible, as they are e.g. in the atmosphere or in the heavens. However, those movements and those impacts would be blind, occasioned solely by natural laws. The reason is that such a world would have to do without intentionality, because intentionality is the greatest obstacle of all to certainty. But beware. No intentionality means no feelings with which to choose the objectives we aim at, nor thought about the best way of achieving them. In other words, no conscious life, either emotional or intellectual. Purely physical phenomena apart, such a world would be frozen in concrete. With no exit.

Make up your own mind, if you can, whether you would want to live in such a world.

Neither Heaven nor Hell (I)

Part I

Recently I have been devoting a lot of thought to what life in the rest of the twenty-first century might be like. No doubt that is because, like so many old folks, I find myself playing with vague ideas about vague topics. Or perhaps it is the ideas that, floating in the air, are playing games with me? Anyhow. Some authors, looking forward to global peace, the suppression of poverty, advancing medical science, moral progress (yes, there are people who believe it is actually taking place) and similar goodies believe that the future will be heaven. When I was much, much younger, writing an essay about the “ideal” future and my hope of living to see it come about, I myself took this view. Others, perhaps more numerous, keep warning us that it will be hell. As, for example, when we run out of resources, or when growing economic inequality leads to violent disturbances culminating, perhaps, in war.

So here are some thoughts on the matter, spread over this week and the two following ones. They are framed in terms of tentative answers to ten critical questions, arbitrarily selected and here presented in no particular order, like fruit in a salad. Enjoy the feast!

1. Will war be abolished? Whether war is due to the fragmented nature of human society, which never in the whole of history has been subject to a single government, or to the fact that resources are always limited and competition for them intense, or to tensions within the various war-waging polities, or to the aggression and will to power that are part of our nature, I shall not presume to judge. Probably all these factors are involved; as indeed they have been ever since the first band of nomadic hunter-gatherers, wielding clubs and stones, set out to fight its neighbor over such things as access to water, or quarry, or berries, or women, as well as things vaguely known as honor, prestige, deterrence, etc.

One and all, these factors are as active, and as urgent, today as they have always been. That is why all previous hopes and efforts to put an end to armed conflict have come to naught. In the words of the seventeenth-century English statesman and jurist Francis Bacon, there will never be a shortage of “seditions and troubles,” some of which will surely lead to politically-motivated, socially-approved, organized violence, AKA war.

2. Will we run out of resources? The fear that the point is arriving, if it has not done so already, where we humans exhaust the earth’s resources has been with us at least since the Christian writer Tertullian in the second half of the second century CE. And not without reason, or so it seemed: at about that time roughly one quarter of the population of the Roman Empire died of plague, perhaps reducing the total number from 80 to about 60 million.

Bad as it was, the crisis did not last. Over the two millennia since then the number of people living on this earth has increased about thirtyfold. No other plague, no war however destructive, has succeeded in permanently halting growth. During the same period the amount of resources extracted and/or consumed each year has grown by a factor of a thousand or more. Yet thanks to techniques such as saving, substitution, recycling and, above all, broadly-based technological progress, world-wide more people can afford to buy and consume a greater variety of resources than ever in the past. Recently the growing use of fracking for extracting shale oil has brought about a situation where even energy, which for over four decades has bedeviled the world by its ups and downs, has become available at a reasonable price and looks as if it will continue to be so; instead of peak oil, it seems that prices have peaked.
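
As a back-of-the-envelope check, and taking the round multiples quoted above at face value, here is what they imply as steady compound annual growth rates (a sketch in Python; the quoted figures and the assumption of steady growth are the only inputs):

```python
# What do "thirtyfold in two millennia" (population) and "a factor of a
# thousand" (annual resource use) imply as steady compound annual rates?
# Using rate = factor ** (1 / years) - 1; the inputs are the round
# multiples quoted in the text, taken at face value.
years = 2000
for label, factor in (("population", 30), ("resource use", 1000)):
    rate = factor ** (1 / years) - 1
    print(f"{label}: {rate:.3%} per year")
# population: ~0.170% per year; resource use: ~0.346% per year.
```

Tiny annual rates, in other words, compounding over the centuries into enormous multiples.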

In brief: Tertullian, Malthus and their countless fellow prophets of economic doom, major and minor, are wrong. Local and temporary bottlenecks have always existed; one need only think of the shortage of wood and charcoal that led to their being replaced by coal, helping usher in the Industrial Revolution in England. Such bottlenecks will, no doubt, continue to appear in the future too. Pace Al Gore and his fellow “enviros,” though, shortages so serious as to disrupt global economic life for any length of time are not in the cards. One could even argue that, given the background of continuing economic recession, many raw materials are underpriced; just look at what happened to the shares of Anglo American from 2008 on.

3. Will poverty disappear? Some people think so. Pointing to the fact that, over the last two centuries or so, the standard of living in the most advanced countries has increased about thirty-fold, they expect prosperity to spread like ripples in a pool. And it is indeed true that, except when it is deliberately manufactured as part of war, famine of the kind that used to be common even in Europe before 1700 or so has largely become a thing of the past.

That more present-day people can afford more and/or better food, hygienic facilities, clothing, warmth, housing, transportation, communication, entertainment, and many other things than ever before is obvious. No ancient treasure trove, no Ali Baba cave, could offer anything like the wares on display in any large department store. Even the Sun King himself did not enjoy many of the amenities which are now standard in any but the poorest French households.

There are, however, three problems. The first is that poverty is psychological as well as material. Of the two kinds, the former is much harder to eradicate than the latter. This is brought out by the fact that, even in Denmark, which has the lowest poverty rate of any OECD country, just over five percent of the population say that they cannot afford food.

Second, poverty and its opposite, wealth, are not absolute but relative. People do not look just at what they themselves own, earn, consume and enjoy. They are at least as interested in the same factors as they affect their neighbors, role models, and enemies.

Third, the scale along which poverty operates is not fixed but sliding. When new products appear they are almost always luxuries, at any rate in the sense that, before they did so, no one felt any need for them. As time passes, though, luxuries have a strong tendency to turn into necessities. The histories of automobiles, personal computers, and mobile telephones all illustrate this very well. Each one caused life to re-structure itself until it became absolutely indispensable. Once this happened anyone who could not afford the product in question would define himself, and be defined by others, as poor; even if his economic situation was satisfactory in other respects.

Quite a few economists go further still. They claim that inequality is growing. Also that, unless some pretty drastic measures, such as a 100 percent inheritance tax, are implemented, serious upheavals are going to upset even the richest and most advanced societies. But such a tax itself is likely to cause quite as many upheavals as it was designed to prevent. In brief: wealthy as future societies may become, there is no reason to believe that poverty will be abolished.

How is that for a starter? See you next week.