Category: History

  • The War That Wasn’t: Christianity, Science, and the Making of the Western World


    Examining the Historical Relationship between Science and Christianity

    During my early adulthood I was a zealous New Atheist, and as such I believed wholeheartedly in a message that was central to the NA movement: that Christianity had been a parasite on Western civilization, dragging humanity into the Dark Ages and smothering science until skeptics and Enlightenment thinkers finally pulled us back into the light. While studying European history in depth, though, I began to see cracks in that story. The relationship between Christianity and science wasn’t as clear-cut as New Atheists made it out to be, and in some respects was rather constructive. But the question remained in my mind, and it grew into a larger curiosity about what made the West different—one that would eventually drive my MA studies in international economics and development.

    Recently, though, something surprising happened: I saw the old narrative resurface. If you were active in New Atheism circles in the 2000s (or honestly if you were active on the internet at all; to quote Scott Alexander, “I imagine the same travelers visiting 2005, logging on to the Internet, and holy @#$! that’s a lot of atheism-related discourse”) you probably saw a chart that looks something like this:

    While many of those who were active New Atheists back in the early 2000s have mellowed out and found other ideological motivations (or disenchantments), it seems there is a new generation of zealous atheists engaging with these ideas for the first time, and in one secular group I saw the above graphic posted unironically, complete with the ridiculous attribution of a “Christian” dark ages. The resurfacing of that old certainty, combined with a provocative new scholarly article offering an economic perspective on Christianity’s role in scientific progress, prompted me to revisit the question. How exactly did Christianity interact with science throughout history? The answer is messier, and far more interesting, than the stories I once took for granted. I will tackle this in a way that is at once thematic and chronological: 1) the Dark Ages (or Early Middle Ages), 2) the High Middle Ages, and 3) the modern (post-Renaissance) world.

    I. Did Christianity Cause the Dark Ages?

      First, I want to address the easiest part of the question: did Christianity somehow cause the Dark Ages?

      I think this can be answered very briefly with an unqualified “no”, and I will go even further and say “quite the opposite”. I don’t know of any reputable modern historians who would say otherwise. Historically there have of course been literally hundreds of different causes blamed for the fall of the Western Roman Empire and the ensuing “Dark Ages”, including such obscure culprits as infertility, trade imbalances, loss of martial virtues, and wealth inequality. Yes, contemporaries in late antiquity did blame Christianity and the abandonment of traditional paganism for the fall of Rome. For example, one of the last notable pagan scholars, the Eastern Roman historian Zosimus, put it plainly: “It was because of the neglect of the traditional rites that the Roman Empire began to lose its strength and to be overwhelmed by the barbarians” (Historia Nova, Book 4.59). Most famously, (Saint) Augustine of Hippo wrote his City of God to refute such perspectives (though he wrote before Zosimus): “They say that the calamities of the Roman Empire are to be ascribed to our religion; that all such evils have come upon them since the preaching of the Gospel and the rejection of their old worship” (The City of God, Book 1, Chapter 1). Needless to say, these were not modern academic historians and were clearly making biased assertions based on the vibes of the day.

      Today the historical consensus seems to be some combination of climate change-induced plague (see Harper, Kyle, “The Fate of Rome”), mismanagement by later emperors of the resulting chaos and malaise, and most importantly the Völkerwanderung, “The wandering of the peoples,” i.e. the migration of massive numbers of people from central and western Eurasia into Europe during the 4th and 5th centuries. To summarize the contemporary consensus into one sentence: the Western Roman Empire fell because decades of plague and civil war made it too weak to repulse or assimilate the entire nations of people migrating into Western Europe, and thus had to cut deals, cede territory, and slowly “delegate itself out of existence” (Collins, “Early Medieval Europe”). In direct refutation of the claim that Christianity caused the Dark Ages, the Christianization of these Germanic and other peoples was a vital channel for the transmission of culture and values and an important step toward (in Rome’s conception) civilizing and settling them in the Mediterranean world (see https://en.wikipedia.org/wiki/Ulfilas as one example).

      As further refutation, the brightest spark of European civilization during the Western European Dark Ages (roughly 500-800) was the Eastern Roman Empire, which was unquestionably *more* thoroughly Christian than the “Darkening” West. The Eastern empire boasted a stronger institutional religious structure, with the emperor himself dictating much of theological policy, and strong-arm enforcement of official positions (e.g. in the Christological debates of late antiquity) was common: “The Byzantine emperor, always the head of the imperial hierarchy, automatically evolved into the head of this Christian hierarchy. The various bishops were subservient to him as the head of the Church, just as the governors had been (and were still) subservient to him as the head of the empire. The doctrines of the Christian religion were formulated by bishops at councils convened by the emperor and updated periodically at similar councils, with the emperor always having the final say” (Ansary, Tamim, “Destiny Disrupted”).

      The Western empire, by contrast, struggled for centuries with institutionalization and conversion, with the Catholic church wrestling not just with latent paganism and heretical syncretism among rural populations but also fighting an existential battle with Arian Christianity (a nontrinitarian form of Christianity asserting that Jesus was not an incarnation of God but merely a lesser creation of God), which was common for centuries among the ruling strata of Vandals and Goths in the early middle ages: “the Vandals were Arian Christians, and they regarded the beliefs of the Roman majority as sufficiently incorrect that they needed to be expunged… Arianism continued as the Christianity of ‘barbarian’ groups, notably Goths, Vandals and eventually Lombards, into the seventh century” (Chris Wickham, “The Inheritance of Rome”). Though I risk overextending my argument here, I will say that the Church in fact prevented the Dark Ages from being even worse: “the Church emerged as the single source of cultural coherence and unity in western Europe, the cultural medium through which people who spoke different languages and served different sovereigns could still interact or travel through one another’s realms” (Ansary).

      There is a caveat to all this, though. Christianity did seem to have a deleterious effect on the logical philosophy of the late Empire. I have been able to find at least three separate early Christian philosophers who all deliver variations on the same idea that faith should triumph over logic and reason: “The nature of the Trinity surpasses all human comprehension and speech” (Origen, First Principles, Preface, Ch. 2); “If you comprehend it, it is not God” (Augustine of Hippo, Sermon 117); and “I believe because it is absurd”, “Credo quia absurdum est” (traditionally attributed to Tertullian, De Carne Christi). But it is important to contextualize these perspectives within a general trend towards mysticism in late antiquity. Christianity was not alone: Mithraism, Manicheism, Sun worship, and other prescriptive revealed religions swirled in the ideological whirlpool of the era, and the rise of all of these can be seen as reactions to the declining state of the political economy. As we see evidenced today, material insecurity pushes people toward the comfort of religion (see e.g. Norris and Inglehart, Sacred and Secular).

      II. Did Christianity Hinder or Help the Birth of Modern Science?

      This question is somewhat more difficult to answer, and I originally had drafted much more, but decided to cut it down to prevent losing anyone in an overly academic morass. To summarize what I see as the answer: there are two necessary components to the rise of modern science, the ideological and the structural. Ideologically, the quest to understand the divine plan through exploration of the natural world was common to both the Christian and Islamic proto-scientists; but when authorities decided this ideological quest was becoming threatening, structural changes that had taken place in Christendom (in part as a result of religious motivations) but not in the Caliphate saved proto-science in the former from the fate of the latter. Thus Christianity initially helped, then became antagonistic toward emerging proto-science, but by the time the relationship turned antagonistic, those structural changes prevented the Church from effectively stamping out the proto-scientific sparks. Let’s expand a bit, but first a note about the Twelfth Century Renaissance.

      In the west, the critical window here was the 12th Century Renaissance and the resulting changes that took place in the 13th century. The 12th Century Renaissance is less well-known than “The” Renaissance of the 15th century, but arguably had more far-reaching consequences in terms of laying the foundations of Western civilization and culture. “Although some scholars prefer to trace Europe’s defining moments back to the so-called Axial Age between 800 and 300 b.c., the really defining transformative period took place during the Renaissance of the twelfth and thirteenth centuries. That is when the extraordinary fusion of Greek philosophy, Roman law, and Christian theology gave Europe a new and powerful civilizational coherence.” (Huff)

      The Twelfth Century Renaissance witnessed two unrelated trends that came together at the end of the 13th century in one seminal decade, which I will unpack in later paragraphs. The first trend is the reintroduction (via translation schools in Toledo, Constantinople and the Papal States) of most of the works of Aristotle, giving birth to a “new intellectual discipline [that] came to be known as ‘dialectic.’ In its fully developed form it proceeded from questions (quaestio), to the views pro (videtur quod) and con (sed contra) of traditional authorities on a particular subject, to the author’s own conclusion (responsio). Because they subjected the articles of faith to tight logical analysis, the exponents of the new rational methods of study became highly suspect in the eyes of church authorities” (Ozment). The second trend was the rediscovery of Roman Law which triggered an immense restructuring of legal rights and entities in the entire Western world: “An examination of the great revolutionary reconstruction of Western Europe in the twelfth and thirteenth centuries shows that it witnessed sweeping legal reforms, indeed, a revolutionary reconstruction, of all the realms and divisions of law […] It is this great legal transformation that laid the foundations for the rise and autonomous development of modern science […]” (Huff).

      We will examine how the novelties of the Twelfth Century Renaissance interacted to create the preconditions for Science as we know it, by examining the confluence of the two requirements, the ideological and the structural.

      Ideologically, what was required for the creation of science was the attempt to use existing knowledge to understand the underlying structure of the world, i.e. the codification and understanding of the scaffold of natural laws that allowed an understanding of the way the world worked. The belief in a knowable creator god seems to have given rise to this concept in the Abrahamic world. In his “history of the world through Muslim eyes,” Destiny Disrupted, Tamim Ansary encapsulates that both Christian and Muslim proto-scientists shared the same goals of understanding God through natural observation: “As in the West, where science was long called natural philosophy, they [Abbasid-era Muslim philosophers] saw no need to sort some of their speculations into a separate category and call it by a new name[…]science as such did not exist to be disentangled from religion. The philosophers were giving birth to it without quite realizing it. They thought of religion as their field of inquiry and theology as their intellectual specialty; they were on a quest to understand the ultimate nature of reality. That (they said) was what both religion and philosophy were about at the highest level. Anything they discovered about botany or optics or disease was a by-product of this core quest[…]”. 
Islamic and European civilization both shared Greek intellectual roots: “Greek logic and its various modes were adopted among the religious scholars.” (Huff) In contrast, China, along with the pre-Christian Mediterranean world, had admirable command of engineering principles and keen natural observation, as exemplified by the likes of Zhang Heng, Archimedes, or Heron of Alexandria. But while visionary voices likely existed in each, neither Classical nor Chinese civilization generally adopted an ideological outlook that asked what a collection of discrete natural and engineering phenomena, such as the flow of water, the motion of planets, or architectural physics, might all say about the fundamental structure of the world or the will of the divine. To nail the issue even more tightly shut: “traditional Chinese mathematics was not abstract because the Chinese did not see mathematics in any philosophical sense or as a means to comprehend the universe. When mathematical patterns were established, they ‘were quite in accord with the tendency towards organic thinking’ and equations always ‘retained their connection with concrete problems, so no general theory could emerge’” (Olerich 22). In the west, in contrast, the Twelfth Century Renaissance added jet fuel to the existing ideological quest to create general theories and comprehend the universe: “In a word, by importing and digesting the corpus of the “new Aristotle” and its methods of argumentation and inquiry, the intellectual elite of medieval Europe established an impersonal intellectual agenda whose ultimate purpose was to describe and explain the world in its entirety in terms of causal processes and mechanisms” (Huff 152).

      Structurally, what was required for the creation of modern science was the institutional independence of proto-universities to explore questions that ran contrary to social and religious dogmas. As historian Toby Huff explains, the new legal world created by the rediscovery of Roman legal codes was a veritable Cambrian explosion for European institutions and ideas:

      “For example, the theory of corporate existence, as understood by Roman civil law and refashioned by the Canonists and Romanists of the twelfth and thirteenth centuries, granted legal autonomy to a variety of corporate entities such as cities and towns, charitable organizations, and merchant guilds as well as professional groups represented by surgeons and physicians. Not least of all, it granted legal autonomy to universities. All of these entities were thus enabled to create their own rules and regulations and, in the case of cities and towns, to mint their own currency and establish their own courts of law. Nothing like this kind of legal autonomy existed in Islamic law or Chinese law or Hindu law of an earlier era.”  (Huff)

      To expand: up to the 12th century, the legal forms in Western Europe were almost wholly those imported from Germanic feudal law, a highly personalist structure of fealty and dependence – land, businesses, countries, and churches were the responsibility of individual lords, artisans, kings, bishops, what have you – who depended on their superiors for the mere right to exist. The idea of corporate personhood, that “corporations are people” (putting aside all of the exploitative and oligarchic connotations it has taken on in the context of the American political scene in the 21st century), was a fascinating, powerful, liberating idea in the 12th century, and one that proved critical to the rise of modern science. Quickly towns, cities, mercantile ventures, and most critically cathedral schools, seminaries, and proto-universities strove to incorporate (literally, “to make into a body”) their existence in the Roman legal mold – no longer were they merely collections of people; they argued their way into being as legal entities distinct from their members and adherents. Further, the Catholic church enriched its canon law with Roman borrowings and promoted the creation of formal legal studies, for example at the University of Bologna. Compounding with the ideological ferment after the reintroduction of Aristotle and other new texts, “European scholars began gravitating to monasteries that had libraries because the books were there[…] Learning communities formed around the monasteries and these ripened into Europe’s first universities” (Ansary), which could then, as independent corporate entities, survive political or theological pressure on or from any individual member.

      We can quite clearly examine the benefit of this arrangement by counterfactual comparison with Islamic society. Despite the fact that Islamic society also had a quest to understand the divine structure of the physical world, and thus shared the same ideological perspectives that gave rise to proto-science, the very different institutional structure of the Islamic world resulted in a very different outcome for Islamic science. As philosophers began to question fundamental religious dogmas such as the necessity of revelation or the infallibility of the Quran, “the ulama were in a good position to fight off such challenges. They controlled the laws, education of the young, social institutions such as marriage, and so on. Most importantly, they had the fealty of the masses” (Ansary). The intellectual institutions such as Islamic awqaf (plural of waqf, “pious endowment”) that did house these nascent intellectual pursuits were not legally independent but were the dependencies of individual sponsors who could apply pressure – or have pressure applied to them – and their very nature as pious endowments meant that “they had to conform to the spirit and letter of Islamic law” (Huff). Reaching higher into the political structure: in the 10th century, most of the Islamic world was united under the Abbasid Caliphate, and consequently a reactionary shift by the government could produce a persecution reaching most of the Islamic world. That is precisely what happened, for the ulama used their influence to force a change in the direction of the Caliphate. After a high tide of intellectual ferment, the subsequent persecution of the scientist-philosophers under the next Caliph “signaled the rising status of the scholars who maintained the edifice of orthodox doctrine, an edifice that eventually choked off the ability of Muslim intellectuals to pursue inquiries without any reference to revelation” (Ansary).
And just to once again contrast with the Far East, “In China, the cultural and legal setting was entirely different, though it too lacked the vital idea of legal autonomy” (Huff). Most importantly in China, the dominance of the Civil Service Examinations served as a gravity well attracting all intellectual talent to a centralized, conservative endeavor, stifling other intellectual pursuits: “This was not a system that instilled or encouraged scientific curiosity […] The official Civil Service Examination system created a structure of rewards and incentives that over time diverted almost all attention away from disinterested learning into the narrow mastery of the Confucian classics.” (Huff 164).

      Bringing this all together: in the West, the twin fuses of ideological ferment and corporate independence intertwined, and it was none other than the Catholic church that lit the spark. As noted already, the Church realized the threat posed by the rising tide of Aristotelianism and its promotion of rigorous logical examination of the Church’s teachings. Whereas earlier in the 1200s the tendency was to try to find common ground between Aristotle and Christianity, or even to use them to reinforce each other, as exemplified by Thomas Aquinas, by the latter part of the century conservative elements in the church saw Aristotelianism as an inherently hostile cancer, and in 1270 and again in 1277 they declared war, issuing (and then reinforcing) a blanket condemnation of Aristotle. Historian Steven Ozment explains that “In the place of a careful rational refutation of error, like those earlier attempted by Albert the Great and Thomas Aquinas, Bishop Tempier and Pope John XXI simply issued a blanket condemnation. The church did not challenge bad logic with good logic or meet bad reasoning with sound; it simply pronounced Anathema sit.”


      The momentousness of this decision for the course of western thought cannot be overstated, for it represented an end to the attempt to reconcile theology and philosophy, science and religion. “Theological speculation, and with it the medieval church itself, henceforth increasingly confined itself to the incontestable sphere of revelation and faith[…] rational demonstration and argument in theology became progressively unimportant to religious people, while faith and revelation held increasingly little insight into reality for secular people.” (Ozment, Steven, “The Age of Reform 1250-1550”). In short, from 1277 onward, the religious got more religious, and the rational became more rational.

      We see in action the importance of corporate independence and decentralized governance, because there were attempts to stamp out Aristotelianism: in England, there were attempts at Oxford in the late 13th century to restrict the teaching of certain Aristotelian texts. In 1282, the Franciscan Minister General Bonagratia issued statutes attempting to limit the study of “pagan” philosophy (mainly Aristotle) among Franciscan students. In the Dominican Order, after Aquinas’s death, there were some attempts by conservative members to restrict the teaching of his Aristotelian-influenced theology. The Dominican General Chapter of 1278 sent visitors to investigate teachers suspected of promoting dangerous philosophical doctrines. But these efforts failed, and universities proudly asserted their newfound legal independence: the University of Toulouse, incorporated only in 1229, declared that “those who wish to scrutinize the bosom of nature to the inmost can hear the books of Aristotle which were forbidden at Paris” (Thorndike). The University of Padua became particularly known as a center for “secular Aristotelianism” in the late 13th and 14th centuries, and maintained a strong tradition of studying Averroes’ commentaries on Aristotle even when these were controversial elsewhere (Conti, Stanford).

      But for the thinkers during and just after this time period, the intellectual whiplash stimulated new thought that truly began the rebirth of scientific thinking in the western world. Instead of blindly taking either the Church or Aristotle at face value, the idea that they could be in conflict gave rise to the idea that either or both could be wrong. Scholars such as Jean Buridan or Nicole Oresme began their studies in religious matters (the former was a cleric and the latter a bishop) before turning to “scientific” studies, but their questioning of both religious and Aristotelian dogmas led them to pierce through accepted dogmas, making unique contributions to a wide variety of fields; they are generally considered to have laid the foundations for the coming scientific and Copernican revolutions.

      III. How Have Science and Religion Interacted Post-Renaissance?

      In a recent post on MarginalRevolution, economist Tyler Cowen linked a new article that reopens this ancient quarrel, at least for the modern era. The opening statement concisely encapsulates the picture painted above: “Today’s leading historians of science have ‘debunked’ the notion that religious dogmatism and science were largely in conflict in Western history: conflict was rare and inconsequential, the relationship between religion and science was constructive overall”, and Cowen adds his commentary that “Christianity was a necessary institutional background”, as I believe the preceding section has shown. But the article by Matías Cabello picks up the story where I left off, and looks at the relationship after the Renaissance. Cabello sees the modern period as unfolding in three stages: an increasingly secular perspective from the late Middle Ages until the Reformation, then a new period of increased religious fervor during the Reformation and the Wars of Religion (16th-17th centuries), finally relenting with the dawn of the Enlightenment in the early 18th century.

      Cabello’s chronology lines up closely with my own knowledge of the topic, though I admit that after the Reformation my knowledge is inferior to that of the previous eras; I draw primarily from Carlos Eire’s monumental and well-regarded book Reformations for this period. But in general, there’s a lot of data showing that the Reformation was a much more violent, zealous, and unscientific time than the periods straddling it. A useful theory for understanding the dynamics of religion during this period is the Religious Market theory as formulated by Stark and Bainbridge (1987). In this theory, religions compete for adherents on a sort of market (or playing field, if you will): in areas of intense competition, religions must improve and hone their “products” to stay competitive against other religions, but where one religion monopolizes the market it becomes less competitive, vital, and active in the minds of adherents. This phenomenon is visible most clearly in the secularization of Scandinavian countries after Lutheranism enjoyed a near complete monopoly for 400 years, and is often employed to explain why the pluralistic US is more religious than European countries, which usually have one hegemonic church, but I would argue it was also clearly at play in the Middle Ages. By the late Middle Ages, Catholicism enjoyed complete dominance in Western Europe against all rivals, allowing cynicism, political infighting (e.g. the Western Schism, which at one point saw three popes competing for recognition over the church), and, most critically, corruption to creep into the Church’s edifice. But when the Protestant Reformation broke out (in large part for the reasons just enumerated), suddenly there were several competing “vendors” who had to knuckle down and compete with each other and with different strands within themselves, leading to increased fanaticism and internecine violence for more than a century.
There’s a lot of evidence corroborating this general trend, for example witch hunts, which despite being portrayed in popular culture as a medieval phenomenon were decidedly a Reformation-era thing, as shown in the chart below. (Indeed, many of our popular ideas of the middle ages come from Enlightenment/Whig writers looking back on the 17th century and erroneously extrapolating from there.)

      If I can contribute my own assembled quasi “data set”: a few years ago I put together a western musical history playlist featuring the most notable composers from each time period, and one thing that clearly jumped out at me, without my being aware of this historical topography, was that the music from before the Reformation was much more joyous and open (and to my ears, just more enjoyable to listen to) than the rather conservative and solemn music that came just after. To sum up, a lot of indicators tell us that the period of roughly 1500-1700 would have been a much less creative, open-minded, and probably less fun time to live in than the periods just before or after.

      Getting back to Cabello, one of the novelties of his work is its quantitative approach to what has traditionally been a very non-quantitative area of inquiry: scraping and analyzing Wikipedia articles to see how the distribution and length of articles on science-related figures shifted over time. His perspective is most concisely presented by his figure B2, reproduced here:

      To quote the author, “This article provides quantitative evidence—from the continental level down to the personal one—suggesting that religious dogmatism has been indeed detrimental to science on balance. Beginning with Europe as a whole, it shows that the religious revival associated with the Reformations coincides with scientific deceleration, while the secularization of science during the Enlightenment coincides with scientific re-acceleration. It then discusses how regional- and city-level dynamics further support a causal interpretation running from religious dogmatism to diminished science. Finally, it presents person-level statistical evidence suggesting that—throughout modern Western history, and within a given city and time period—scientists who doubted God and the scriptures have been considerably more productive than those with dogmatic beliefs.”
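      To make the flavor of this method concrete, here is a minimal sketch in Python of the bucketing idea behind such an analysis: aggregate article lengths for historical figures by birth period and compare totals across eras. Everything in it, names, years, and byte counts alike, is invented for illustration and is not drawn from Cabello’s actual data or pipeline.

```python
from collections import defaultdict

# Toy illustration of Wikipedia-based measurement in the spirit of Cabello's
# approach. The figures, birth years, and article byte counts below are
# entirely invented; his real pipeline is far more sophisticated.
SCIENTISTS = [
    ("Figure A", 1480, 40_000),
    ("Figure B", 1510, 25_000),
    ("Figure C", 1540, 30_000),
    ("Figure D", 1610, 35_000),
    ("Figure E", 1690, 80_000),
    ("Figure F", 1720, 90_000),
]

def output_by_period(records, period=50):
    """Sum article lengths into birth-year buckets of `period` years,
    a crude proxy for scientific output over time."""
    totals = defaultdict(int)
    for _name, birth_year, length in records:
        bucket = (birth_year // period) * period  # e.g. 1510 -> 1500
        totals[bucket] += length
    return dict(totals)

totals = output_by_period(SCIENTISTS)
# Figures B and C (born 1510 and 1540) share the 1500-1549 bucket,
# so totals[1500] == 55_000.
```

Plotting such totals over time (normalizing for Wikipedia’s coverage biases, which is the hard part Cabello actually grapples with) is what produces charts like his figure B2.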

      It is no coincidence, then, that the single most famous skirmish in history between science and religion, the trial and condemnation of Galileo Galilei, came squarely in the nadir of this fanatical period (1633).

      And yet even in this context, science did, of course, continue to progress, and religious beliefs often lit the way. Historian of science Thomas Kuhn produced what is likely the best analysis to date of how science progresses throughout the ages and how it is embedded in sociocultural assumptions. In his magnum opus, The Structure of Scientific Revolutions, it is clear that the paradigm shifts that create scientific revolutions do not require secularization but rather the communion of different assumptions and worldviews. Consider the different worldviews of Tycho Brahe and Johannes Kepler, the Danish and German astronomers who, as master and apprentice respectively, were looking at the same data but, with different worldviews, came to very different conclusions. Brahe believed that in a biblically and divinely structured universe the earth must be at the center, and as such he rejected the new Copernican heliocentrism. His apprentice Kepler, however, also employed religious motivation, seeing the new heliocentric model as an equally beautiful expression of divine design, and one which squared more elegantly with the existing data and mathematics. Science, thus, is not about accumulating facts, but about looking at them through different worldviews. In one of my posts a few months ago, I mentioned that religious convictions and other reasons can push scientists to bend or break the scientific method, sometimes leading to scientific breakthroughs. One of the clearest examples was the eclipse expedition of Sir Arthur Eddington, whose Quaker convictions likely made him see scientific discovery as a route towards global brotherhood and peace. In short, the scientific method is an indispensable tool for verifying discoveries and we neglect it at our peril, but we must not let it become a dogma, for the initial spark of discovery often emanates from the deeply personal, irrational, cultural, or religious ideas of individuals.

      In a recent blog post, Harvard professor of AI and Data Science Colin Plewis posted the following praise of Kuhn: “Far from the smooth and steady accumulation of knowledge, scientific advancement, as Kuhn demonstrates, often comes in fits and starts, driven by paradigm shifts that challenge and ultimately overthrow established norms. His concept of ‘normal science’ followed by ‘revolutionary science’ underscores the tension between tradition and innovation, a dynamic that resonates far beyond science itself. Kuhn’s insights have helped me see innovation as a fundamentally disruptive and courageous act, one that forces society to confront its entrenched beliefs and adapt to new ways of understanding.”

      Conclusion

Hopefully this post has made a strong case for the limited claim that Christianity is not a perennial enemy of science and civilizational progress. Perhaps it has also offered some evidence for the idea that scientific advancement benefits from the contact and communication of different worldviews, assumptions, and frameworks of belief, and that Christianity, or religious belief in general, is not necessarily harmful to this broader project. Without question there have been places and times in which dogma, oppression, and fanaticism inhibited freedom of thought and impeded the scientific project – but these can be found not only in fanatical religious periods like the Wars of Religion or the fall of the Caliphates, but also in the fires of secular fanaticism such as Lysenkoism or the Cultural Revolution, or even in the simple oppressive weight of institutional gravity, as with the imperial examination system in China.

      What can we take away from this historical investigation to inform the present and future?

Normally, we consider universities and affiliated research labs to be the wellsprings of scientific advancement in the modern West. But higher education in the United States demonstrates an increasing ideological conformity (a 2016 study found that “Democratic-to-Republican ratios are even higher than we had thought (particularly in Economics and in History), and that an awful lot of departments have zero Republicans”), and Americans are increasingly sorting themselves into likeminded bubbles, including politically homogeneous universities, preventing the confrontation with alternative worldviews that is the very stuff of free thought, creativity, and scientific progress. And since popular perception suggests this trend has only been exacerbated in the intervening years, the “courageous act” of “revolutionary science” that “challenges and overthrows existing norms” may have to come from an ideological perspective outside the secular, liberal worldview of modern American academia. It may be that overtly religious institutions like a new Catholic Polytechnic Institute, or explicitly free-speech schools like the University of Austin, no matter how retrograde or reactionary they may appear to many rationalists, are the source of the heterodox thinking necessary to effect the next scientific revolution.

      Of course, the future of science will be inextricably linked to Artificial Intelligence, and it remains to be seen exactly what kind of role AI (or AGI or ASI) will play in the future of scientific discovery, leaving me with nothing but questions: (when) will AI have a curious and creative spark? Will it have a model of the universe, preconceptions and biases that determine how it deals with anomalous data and limit what and how it is able to think? Or will it have the computational power and liberty to explore all possible hypotheses and models at once, an entire quantum universe of expanding and collapsing statistical possibilities that ebb and flow with each new data point? And if, or when, it reaches that point, will scientific discovery cease to be a human endeavor? Or will human interpreters still need to pull the printed teletype from Multivac and read it out to the masses like the Moses of a new age?

    1. The Life and Death of Honor

      The Life and Death of Honor

      Obituary of one of the oldest human values

I was recently reading the book version of Disney’s Mulan to my four-year-old son when he asked me what “honor” was. Although I usually pride myself on concocting kid-friendly analogies and simplifications, I truly struggled with his question and muttered something like “people thinking you’re a good person” before moving on. The question stuck in the back of my mind, however, and I kept mulling over how to model “honor” in a concise way. After days of struggle, I began to read, research, and think critically about the idea, and what follows is the digest of that process.

The concept of honor had been a staple of human society since the dawn of recorded history, and yet somehow in the past 300 years it has gone the way of steamboats and horse-drawn carriages. Honor, today, is at best a quaint vestige and at worst pathologized, coming up most often in the context of “honor killings” or the “honor culture” of the American South. Outside the limited scope of military service, “honor” is nearly synonymous with “machismo” or “bravado”, a puerile attachment to one’s own ego and reputation (or that of one’s kith and kin).

A comparison with a random selection of other broad but unrelated terms demonstrates that the fall of honor is not just absolute but relative – freedom, for example, was a minor concern in the 16th century but has since dwarfed honor.

Interestingly, honor was more prevalent than even “love” in the 16th century, but the opposite has held true ever since.

Wikipedia implies that honor was a proxy for morality prior to the rise of individuality and inner consciousness: in our earlier, tribal and clannish stages of moral development, others’ perceptions of our actions, and how those actions reflected on our kith and kin, mattered far more than any inherent deontological moral value – hence honor.

And yet there is a part of us that knows that this idea is missing something critical. When we read or watch stories about “honorable” characters like Ned Stark or Aragorn, we don’t think of them as quaint, macho, or motivated by superficial public perceptions of their clans and relatives. We know that when they talk about their honor, they are talking about being morally upstanding figures who do the right thing regardless of the material and reputational cost (to quote Aragorn, “I do not fear death nor pain, Éomer. Only dishonor.” – JRR Tolkien, Return of the King). We know that this is not socially-minded showmanship but rather bravery and altruism, and the reader is supposed to like Aragorn for it.

As I contemplated this and read further, it became obvious that “honor” was a catch-all term for many different qualities. It refers to personal moral righteousness and familial reputation, but also to one’s conduct in warfare, one’s fealty to one’s liege, and one’s formal recognition by society. Given the ubiquitous and multifarious uses of the term, and the fact that pre-modern peoples seem to have absolutely obsessed over it (prior to 1630 it was as important as love and vastly more important than freedom), it stands to reason that it was useful and good. So how exactly can we explain the benefits of honor and what it meant?

      The Benefits of Honor

      I came to the following categorizations of how honor worked and why it was useful in pre-modern society, from shortest to longest:

      1. A solution to the game-theory of pre-modern warfare

In the modern world there is the International Criminal Court to enforce international law and prevent war crimes, and an international press to publicize the violation of treaties and ceasefires. In the premodern world, these institutions did not exist. What prevented pre-modern peoples from regularly committing atrocities? To some extent the answer is “nothing”: rape, pillage, and general atrocity were a constant feature of premodern warfare. The ancient Roman statesman Cato the Elder (234-149 BC) coined the phrase “the war will feed itself” (bellum se ipsum alet) to explain that armies would sustain themselves by pillaging and starving occupied territories, and it is telling of the longevity of that mode of warfare that the phrase is most heavily associated with a war nearly two millennia after Cato, the Thirty Years’ War of 1618-1648 AD.

One institution that may have held these atrocities in check was the papacy and, later, the state churches, though it stands to reason that a commander willing to commit such heinous acts might not be dissuaded simply by threats of additional eschatological penalties. But one additional force holding back the worst of human nature during premodern war may indeed have been the concept of honor. A society that places a high value on honor ensures that individuals pay a high reputational cost for such actions as attacking civilians, violating treaties, or encouraging mass atrocities. This societal expectation discourages such behavior because perpetrators would lose honor – and as honor was transmitted familially, their families would be marked for generations. In a society where legacy is important, staining that legacy in perpetuity for a one-time military benefit may have made some commanders think twice.
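This trade-off can be made concrete with a toy model of my own (an illustration, not drawn from any historical source): a commander weighs a one-time gain from an atrocity against a recurring reputational penalty paid each generation that the family's honor stays stained, discounted over time.

```python
def discounted_payoff(one_time_gain: float, per_period_honor_cost: float,
                      periods: int, discount: float) -> float:
    """Net value of committing the dishonorable act: an immediate gain
    minus a reputational cost paid in every future period (e.g., each
    generation the family's honor remains stained), discounted by `discount`."""
    total_cost = sum(per_period_honor_cost * discount**t
                     for t in range(1, periods + 1))
    return one_time_gain - total_cost

# If the stain fades quickly (2 periods), the act can still pay off...
assert discounted_payoff(100, 30, periods=2, discount=0.9) > 0
# ...but a stain transmitted across many generations makes it a net loss.
assert discounted_payoff(100, 30, periods=10, discount=0.9) < 0
```

The numbers are arbitrary; the point is structural. The longer honor's memory (more periods) and the more a society values legacy (higher discount factor), the more the reputational cost swamps any one-time gain, which is exactly the deterrent the paragraph above describes.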

2. A heuristic to encourage the efficacy of pre-modern society and administration

It is difficult for us moderns to understand the extent to which the pre-modern world was personal. Ferdinand Tönnies, one of the founders of modern sociology, viewed modernity as a process of transitioning from a Gemeinschaft (community) into a Gesellschaft (society). In the former, looking out for one’s friends and family, using influence for their benefit, and helping them into positions of power is considered good and dutiful; in the latter, being dutiful means impersonally discharging the role of one’s office without regard to personal relationships, and too much favoritism is considered corruption or nepotism. Indonesia’s former president Suharto once neatly encapsulated the difference (and revealed that Indonesia was still a Gemeinschaft) with the reported quip “what you call corruption, we call family values”.

Most pre-modern societies, particularly feudal ones, had almost non-existent states, the governing apparatus being almost completely dissolved and reconstituted upon a monarchical succession. Fourteenth-century England had only a few hundred people in the direct employ of the crown, most of them tax collectors, royal estate and forest managers, and keepers of the peace. A succession meant that a new monarch, with his or her own network of dependents, friends, and trustees, would need to pick and choose councilors and appointees. How was a monarch to choose people for all of these roles? The work of state administration was built on personal reputation: if a monarch needed something done well, they needed a quick metric by which to assess the efficacy of an appointee. To that end, they could simply use someone’s reputation for honor.

Thus, to the extent that honor encompasses such qualities as honesty and forthrightness, it would encourage the enforcement of contracts and the upholding of laws. If it encodes diligence and a willingness to abide by oaths and be overt in one’s declarations of allegiance, it would help encourage stable politics and relations among the peers of the realm (an above-board network of alliances allows a rational calculus of when and whether to initiate a conflict; a clandestine alliance system gets us the tangle of World War One). If honor encompasses honesty and charity, it would entail dependability in collecting and remitting taxes and in making the investments necessary to prevent or curtail social unrest among the lower classes. And most importantly, honor was a stand-in for loyalty to one’s liege and the crown. If you’re assigning a general to lead an army or a governor to rule a wealthy province, you want to be sure that they’re not going to betray you. Honor serves both as a metric for that likelihood and, failing that, as a reputational deterrent on the future generations of a traitor.

3. A general shorthand for morality that is responsive to changing moral frameworks

Western civilization has spoken of “honor” for thousands of years, and what it means in terms of personal virtues has changed radically over that time. One of the most important ideas of Friedrich Nietzsche is the distinction between Slave Morality and Master Morality. In Nietzsche’s conception, the Master Morality of the societies of slave masters (Mesopotamians, Greeks, Romans) was later overcome by an inverted Slave Morality of those who had been their slaves, i.e. Judeo-Christian moral values.

Let us first examine how honor worked in the Master Morality of the ancient world. In this conception, what was morally good was to judiciously utilize the benefits of being at the top of the social pyramid, to become the most powerful and beautiful version of oneself, to surpass one’s rivals, to fully self-actualize. We see this fully laid out in epics such as the Iliad, wherein what is exalted is martial, mental, and interpersonal prowess. As Hector explains in the Iliad (Book VI), “I have long understood, deep in my heart, to defend the Trojans under Hector’s spear, and to win noble glory for myself and my forebears”. We see this carried over into Roman society in the pursuit of glory and the acquisition of familial “honors”, which is how the word is used when it first enters the Western lexicon. Ancient Romans, particularly in the late Republican period, were absolutely obsessed with acquiring honors, in the plural: public recognitions via titles, offices, and displays such as the all-important triumph, in which the celebrated man had to be reminded that he was mortal lest he follow the path of Bellerophon and think he had acquired divinity. By the late Republic the quest for honors had become an obsession, and their pursuit fueled the lust for power and the civil wars that ended the Roman Republic. To wit, Cicero comments in his De Officiis (On Duties), “honor is the reward of virtue, and the pursuit of honor is the very aim of every great man. It is the highest obligation to seek the recognition of those who are worthy, not for personal gain, but for the service of the state.” And as later Roman commentators looked back on that period, Livy observed, “No man is truly great unless he has acquired honor through the strength of his own actions. It is the pursuit of honor that drives men to greatness” (History of Rome, Book 2).
In other words, honor in the Master Morality framework was exogenous, not endogenous. It was about getting others to recognize your greatness.

The rise of the former slave populations with their Slave Morality truly inverted things. In the Gospels Jesus inveighs repeatedly against the pursuit of public recognition: “When you give to the needy, sound no trumpet before you, as the hypocrites do in the synagogues and in the streets, that they may be praised by others” (Matthew 6:2); “you know that the rulers of the Gentiles lord it over them, and their great ones exercise authority over them. It shall not be so among you. But whoever would be great among you must be your servant, and whoever would be first among you must be your slave” (Matthew 20:25-26); and further, “for they loved the glory that comes from man more than the glory that comes from God” (John 12:43). The message of humility was naturally taken up by those whom the existing socioeconomic order had already humbled (the slaves and urban poor) and resisted by those who had the most to lose – the practitioners of the Master Morality of public honor and glory.

Over the following three hundred years Christianity came to be the dominant religion of the Mediterranean world. What followed in the West was a literal moral revolution, the overthrow of the Masters by the Slaves, and the creation of a new order in which honor was still a goal, but the means of its attainment shifted radically. Thus by the 5th century St. Augustine echoed the same message: “Let us then approach with reverence, and seek our honor in humility, not in pride” (City of God, published 426). Through the Middle Ages we hear from Dante Alighieri that “The greatest honor is not that which comes from men, but from God. And the greatest humility is knowing that, without His grace, we are nothing” (Divine Comedy, 1321). And at the end of the Medieval period, Sir Thomas Malory repeats the same message: “He who is humble in heart, his honour shall be pure and his glory eternal; but pride is the enemy of honor and virtue” (Le Morte d’Arthur, 1470).

      The Death of Honor

As we saw from the n-grams above, honor died around 1700 in Western Europe, as “the old aristocratic code of honor was gradually replaced by a new middle-class ethic of self-discipline, hard work, and social respectability” (Lawrence Stone, The Crisis of the Aristocracy, 1965). Following the trend line, its exhumation in the 19th century was as a pastiche to reminisce about or poke fun at, not a genuine revival of the cultural value. In Sir Walter Scott’s Rob Roy, for example, the word “honour” appears 152 times in a little over 500 pages. Jane Austen used the term quite often, 256 times across her collected works, but anyone who has read Austen will know that she came to bury honor, not praise it.
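Counts like these are easy to reproduce. A minimal sketch (the file name is hypothetical; any public-domain text would do) tallies whole-word occurrences of “honour” and normalizes them per 10,000 words, roughly the same kind of relative frequency the Ngram Viewer reports.

```python
import re

def rate_per_10k(text: str, word: str) -> float:
    """Whole-word occurrences of `word` (case-insensitive) per 10,000 words."""
    tokens = re.findall(r"[A-Za-z']+", text)
    hits = sum(1 for t in tokens if t.lower() == word.lower())
    return 10_000 * hits / max(len(tokens), 1)

# e.g., on a text downloaded from Project Gutenberg:
# text = open("rob_roy.txt", encoding="utf-8").read()
# print(rate_per_10k(text, "honour"))
sample = "His honour was dear to him, and honour he would keep."
print(rate_per_10k(sample, "honour"))  # 2 hits in 11 tokens ≈ 1818.2
```

Normalizing by total word count matters: raw counts (152 in Rob Roy, 256 across Austen) are only comparable once divided by the length of the corpus they come from.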

The exact reasons for the decline of honor are difficult to pinpoint, as there were myriad processes unfolding at the time. The Enlightenment subjected many cultural values to rational scrutiny, and Enlightenment thinkers like Voltaire had no mercy for a concept they urgently wished to be rid of: “The idea of honor has caused more bloodshed than any other human folly” (Voltaire, Candide, 1759). Looking back at the typology of honor’s benefits above, we can see that many of the reasons for honor’s existence had become moot by the 18th and 19th centuries. The growth of centralized bureaucratic states allowed expanded recordkeeping and objective evaluations of merit, eliminating the need for reputational heuristics. Increased law, order, communication, and infrastructure meant greater mobility and atomization of the individual as the Gemeinschaft gave way to the Gesellschaft; familial reputation gave way to licenses, certifications, degrees, and other signals of social status. And as “honor” had been a capacious stand-in for any number of human virtues and moral qualities, it could with little difficulty be replaced by more precise terms for those qualities, e.g. “honesty” or “bravery”. Thus “honor” came to be a term of critique for those areas of the world most resistant to modernizing and Enlightenment influences, where “honor culture” and “honor killings” persist as remnants of what was once the dominant mode of human thought and moral reasoning.

      Epilogue: Another Resurrection?

Looking back at the n-grams, we can see one last uptick in the trend line beginning in the late 20th century. While no clear explanation has been put forth for this, one obvious suspect is the rise of historical fantasy. With Lord of the Rings, Dungeons and Dragons, Game of Thrones, and countless other fantasy worlds attracting millions of fans on screens, pages, and tabletops, it is little wonder that concomitant historical concepts such as “honor” should rise in popularity as well. This explanation is supported by the fact that historical-fantasy tropes such as dragons, empires, knights, and castles have seen large gains in popularity since the 1990s, while historical terms with less resonance in fantasy, such as “crusade”, “dowry”, “abbot”, and “chastity”, demonstrate no such gains.

      Nevertheless, the usage in the English lexicon of “honor” remains a small fraction of what it once was. Honor continues to interest us academically and fictionally, but there is little chance of it returning to guide our choices and moral values in the here-and-now.

    2. The Future Is Ours: A Short Dissection of Accelerationisms, Left, Right, and Center

In modern usage, the term “accelerationism” is claimed by far-right groups as a philosophy of destabilizing society to bring about a more authoritarian and conservative future. However, these soi-disant accelerationists have no monopoly on accelerationist ideas. The perspective of “accelerating” society through stages is neither new nor confined to the political right; accelerationist mindsets have been espoused by various groups aspiring to speed society toward some predicted end and effect a transformation to a more “ideal” version of society. Though accelerationism is widely conceived as radical and dangerous, the general concept of “accelerating” society toward a predefined end has a long history across the political spectrum and, through its real-world political effects, has substantially influenced the modern world. To understand where accelerationist ideas come from, it is worthwhile to investigate, in brief, their history and legacy – and their fundamental flaws.

      The Philosophical Underpinnings

The concept of society moving toward an inexorable end is not new, but neither is it universal; many ancient peoples kept time with respect to dynasties or the founding of cities, commencing cycles that were inevitably reset every time a dynasty or city fell – for a modern relic of this system, see the Japanese imperial calendar or gengō system, in which the current year is Reiwa 2, the second year of the reign of the new emperor. Ancient Romans kept time in relation to the founding of the city or by reference to the consuls in power in a particular year.[i] With the rise of monotheistic religion, however, societies began keeping time with respect to immutable events, such as the birth of Jesus or the Hijra of Mohammed – fixed dates that allowed a linear outlook on time irrespective of the city or ruling family one happened to live near. These societies also prophesied the eventual arrival of some future event, be it the end of the world or the coming of the Messiah, and even into the early modern era it was common to think that human actions could help bring it about – for example, in the 1500s, Jews began settling in the holy land, not to create a Jewish state like modern Israel, but because they “hoped to accelerate the coming of the Messiah”.[ii]

In the early 19th century, the German philosopher Georg Wilhelm Friedrich Hegel gave rise to a conception of history moving through a set of defined stages. For Hegel, this progress was most clearly visualized in the form of European civilization passing from pre-civilized barbarism, to slavery under classical societies, to theological thought during the middle ages, culminating (for him) in the humanism and Enlightenment philosophy of his own time. This furthering of civilization was in turn furthering the evolution of the Weltgeist, or World-Spirit, the collective mental and spiritual progress of humanity developing inexorably toward greater liberation.

      “[…]The world spirit, has possessed the patience to pass through these forms over a long stretch of time and to take upon itself the prodigious labor of world history, and because it could not have reached consciousness about itself in any lesser way, the individual spirit itself cannot comprehend its own substance with anything less.” – Hegel, Preface, Paragraph 29[iii]

      Left-Accelerationism

Without question the most famous application of Hegelian history was made by Karl Marx, who took the idea of historical stages and wedded it to another (and longer-lived) Hegelian philosophical invention, “dialectics” – the idea that a prevailing, dominant idea (a “thesis”) is at some point confronted with a contrary or opposite idea (the “antithesis”), and out of this conflict one idea wins but is altered in the process, producing a new idea (the “synthesis”), which, now dominant, becomes the new thesis, continuing the cycle. Marx famously applied this dialectical formula to social classes: one dominant class is the thesis, a rival class the antithesis, and the result of their inevitable conflict is a new synthesis and a new social order, which is in turn challenged by a new class. Thus society progressed from slavery to feudalism to capitalism to communism.

What does this have to do with accelerationism? Well, the first real example of accelerationism is tied to Marxist thought. Communism, according to Marx, could only come about once the economic infrastructure of capitalism was in place, for only the underclass of capitalism, the proletariat, could overthrow the oppressive bourgeoisie and institute Communism. And Marx was committed to the inevitability of the entire endeavor:

      “The advance of industry, whose involuntary promoter is the bourgeoisie, replaces the isolation of the labourers, due to competition, by the revolutionary combination, due to association. The development of Modern Industry, therefore, cuts from under its feet the very foundation on which the bourgeoisie produces and appropriates products. What the bourgeoisie therefore produces, above all, are its own grave-diggers. Its fall and the victory of the proletariat are equally inevitable.” – Marx and Engels, 1848[iv]

But to Marxists such as Lenin and the Bolsheviks in Russia during the late 19th and early 20th centuries, the ideal socialist society they longed for was decades or centuries away: according to most observers at the time, Russia was not yet even capitalist – with the ascendancy of the church, the czar, and the nobility (“Orthodoxy, Autocracy, Nationality” went the triune slogan of Russian conservatism), Russia was still trapped, economically and socially, in a kind of feudalist proto-capitalism. Thus, in the years leading up to the Bolshevik Revolution, would-be communists were deeply conflicted over the question of Marxist stages of history. They wanted Communism now, but according to Marx they would first have to usher in an era of capitalism to create the necessary foundations for their long-awaited Communist system. Should the communists, therefore, support the rise of capitalism? Many Russian socialists and communists in the early 20th century embraced the possibility that Russia might have to undergo a capitalist, liberal revolution before the infrastructure could be laid for a second, socialist revolution. In the 1920s, after the Russian Civil War had ended, the Communist Party of the Soviet Union embraced the New Economic Policy, which was (relative to the War Communism the Bolsheviks had imposed during the late 1910s) a market-based system of exports and investment aimed at bringing the USSR’s productive capacity up to par with the capitalism it sought to surpass. Mao Zedong would embrace the same kind of stepwise thinking at times – not between capitalism and communism, but between socialism and communism – in the lead-up to the infamous Great Leap Forward.[v] The idea that arose to deal with this dilemma is an early formulation of accelerationism:
If a society has to go through stages to reach a desired end-goal, then those who want that end-goal should do their best to speed up the natural processes.

Accelerationism, then, in its fundamental form, is a belief that society must be walked through some set of stages, combined with support for attempts to destabilize the current system or otherwise put in place the conditions necessary for that change to transpire organically.

In 2000, the Marxist political philosophers Michael Hardt and Antonio Negri published an unexpectedly popular book, Empire, examining the way in which American capitalism pervaded the world, but also looking (in a devil’s-advocate manner) at ways in which capitalism was setting in motion global progress toward what would come next. For example, they noted the ways in which corporations were astutely indexing and integrating the world’s resources and productive capacities into a networked global market. Socialists and communists latched onto these ideas, contending, as the Bolsheviks had done decades before, with the possibility that the best way to arrive at a global transition to socialism was actually to support the growth of these capitalist global structures:

      “The huge transnational corporations construct the fundamental connective fabric of the biopolitical world in certain important respects. […] Some claim that these corporations have merely come to occupy the place that was held by the various national colonialist and imperialist systems in earlier phases of capitalist development, from nineteenth-century European imperialism to the Fordist phase of development in the twentieth century. This is in part true, but that place itself has been substantially transformed by the new reality of capitalism. The activities of corporations are no longer defined by the imposition of abstract command and the organization of simple theft and unequal exchange. Rather, they directly structure and articulate territories and populations. They tend to make nation-states merely instruments to record the flows of the commodities, monies, and populations that they set in motion. The transnational corporations directly distribute labor power over various markets, functionally allocate resources, and organize hierarchically the various sectors of world production. The complex apparatus that selects investments and directs financial and monetary maneuvers determines the new geography of the world market, or really the new biopolitical structuring of the world. The most complete figure of this world is presented from the monetary perspective. From here we can see a horizon of values and a machine of distribution, a mechanism of accumulation and a means of circulation, a power and a language.”

      – Hardt and Negri, Empire, pp 32-33.[vi]

In other words, corporations are not merely exploitative, extractive engines serving the interests of the bourgeoisie in the global north, but organizing forces that mobilize resources (notably labor power) into a globally connected system. Thus, Hardt and Negri argue, the modern corporation may be moving people toward the proletarian organization that early Marxists sought to effect through cadres and labor unions. Echoing Hardt and Negri’s work, it is common these days in some corners of the internet to speak of “late-stage capitalism”, an overt assumption that society progresses in stages and that capitalism’s stage is on the way out, laying the foundation for a transition to socialism.[vii] These communists heed the inevitability in Marx’s work, the teleological inexorability by which classes would find their way to conflict without need of the cadre-driven insurrection embraced by the Bolsheviks and Maoists, who truly believed they could “accelerate” the stages of history rather than simply letting them unfold naturally.

      Technological Accelerationism

Another form of accelerationism that had a short-lived but influential moment in the late 20th and early 21st centuries is a nominally apolitical techno-futurist accelerationism. In this conception of futurism, which held sway just before the far-right turn in nominal accelerationism mentioned above, acceleration is viewed in a technological sense: society must invest in technological progress to speed us through this era of directionless sociopolitical uncertainty.

This accelerationism has a conservative flair (at least in the American sense): government should get out of the way and allow technology leaders to chart the path to a utopian, post-scarcity future. It is a vision of acceleration toward a known future state, strongly influenced by science fiction. “In an era where left-of-center voices increasingly paint a dark vision of the future as fraught with ecological dangers, science fiction conservatives have a near monopoly on utopian dreams of a tomorrow of abundance and technological wonders.”[viii] A prominent proponent of this conservative techno-utopian ideal was former Speaker of the House Newt Gingrich, a self-described pursuer of Star Trek-like visions of the future, who advocated a libertarian approach to scientific advancement: “If you take all the money we’ve spent at NASA since we landed on the moon and you had applied that money for incentives to the private sector, we would today probably have a permanent station on the moon, three or four permanent stations in space, a new generation of lift vehicles. And instead what we’ve had is bureaucracy after bureaucracy after bureaucracy, and failure after failure”.[ix] This same techno-libertarian futurism was on full display as late as the 2016 Republican National Convention, at which billionaire tech investor Peter Thiel declared that “today our government is broken. Our nuclear bases still use floppy disks. Our newest fighter jets can’t even fly in the rain […] Instead of going to Mars, we have invaded the Middle East […] When Donald Trump asks us to Make America Great Again, he’s not suggesting a return to the past. He’s running to lead us back to that bright future.”[x]

It was as an outgrowth of this culture – conservative, sci-fi-influenced techno-utopianism – that observers in the late 2010s characterized “accelerationism” in the following way:

      “Accelerationists argue that technology, particularly computer technology, and capitalism, particularly the most aggressive, global variety, should be massively sped up and intensified – either because this is the best way forward for humanity, or because there is no alternative. Accelerationists favour automation. They favour the further merging of the digital and the human. They often favour the deregulation of business, and drastically scaled-back government. They believe that people should stop deluding themselves that economic and technological progress can be controlled. They often believe that social and political upheaval has a value in itself. Accelerationism, therefore, goes against conservatism, traditional socialism, social democracy, environmentalism, protectionism, populism, nationalism, localism and all the other ideologies that have sought to moderate or reverse the already hugely disruptive, seemingly runaway pace of change in the modern world.”[xi]

      Right-accelerationism

Today, however, “accelerationism” is nominally more of a right-wing ideology. How did it make this transition? Communists never held a monopoly on the idea of accelerating society through its stages. In the 1920s, the German nationalist (and proto-Nazi) jurist and political theorist Carl Schmitt embraced accelerationist attitudes in his belief that modern society needed a strong authoritarian center. Given that “the sovereign power of the king has been dissolved, disembodied, and dispersed in the communication flows of civil society, and it has at the same time assumed the shape of procedures, be it for general elections or the numerous deliberations and decisions of various political bodies,” Schmitt believed that people would need to develop a new kind of sacred reverence for a new source of authority and legitimacy. He held that even supposedly liberal democracies were authoritarian at the core, and that when real and consequential decisions had to be made (e.g. to fight against terrorism or a global pandemic), the pretense of procedural democracy would always be shunted aside. More specifically, he argued that even a liberal democracy would encounter moments – crises – in which “exceptions” had to be made, and as Schmitt put it, “Sovereign is he who decides on the exception.”[xii] His answer was to call for a mythologizable and revered leader, very much like what Nazi ideology would embrace in Hitler. To that end, right-accelerationism attempts to bring about precisely such a destabilization of society in order to reach the exception, with a kind of conservative authoritarianism able to check the undesirable aspects of liberal democracy.

Ever since the chaos of the 1930s and the resulting ascension of fascism, political observers have noted the relationship between a breakdown in the normal fabric of society and popular support for authoritarianism. Economic shocks such as market collapses, for example, are often associated with increased support for heavy-handed measures to get things back on track.[xiii]

This brings us to the typical modern instantiation of accelerationism: the white-supremacist, far-right accelerationism embraced by, among others, the shooter who murdered 51 mosque attendees in New Zealand in March 2019. The terrorist attack, committed in the explicit name of “accelerationism”, has set the standard for the popularity and use of the term (see: fig. 1)[xiv]. These accelerationists believe that western liberal democracies must embrace authoritarianism to rid themselves of weak and detracting elements – namely non-white people, feminists, and other components of what they consider to be “others” and part of the cultural left. Further, they believe that such a society will naturally come about once the existing one is destabilized enough that the majority demands stronger security and policing. As such, they advocate chaos and anarchic behavior designed to shock and terrorize society into radical lockdowns and internal transformation.

      The Fundamental Error

Accelerationist ideas across all political ideologies rest on two presuppositions: first, a claim to prescience about the future trajectory of the sociopolitical order; second, a belief in the ability to bring that future about. From the Leninists who believed that a campaign of Bolshevik force could effect the necessary transition to sustainable socialism, to the New Zealand shooter who believed his actions would destabilize society enough that a critical mass would demand the revocation of liberal and multicultural values, the fundamental assumption of accelerationists is an ability to tell the future. Accelerationists of all political stripes believe that the future is inherently in line with their political goals and preconceptions, and that certain institutions of the status quo must be overcome or changed in order to arrive at that utopian end.

Indeed, many observers – even those of us who do not consider ourselves “accelerationists” of any stripe – are guilty of some form of this error. A common instantiation is the so-called “Whiggish” view of history: that “the arc of the moral universe is long, but it bends toward justice”. Though this may have been the general trajectory of the past few hundred years, extrapolating it a few centuries hence and assuming that society can move in no direction other than the maximization of justice is presumptuous. Believing that the future is inherently on one’s side, and that all one must do to bring about one’s ideal future is to clear away certain blockers in the present (e.g. removing particular injustices to accelerate the arrival of an inexorably just future), is certainly a form of accelerationist mindset, albeit a relatively dilute one.

But such an assumption is not unique to those who frame inexorable progress in sociocultural terms; those who frame progress in technological terms are equally fallible. As desirable as the post-scarcity utopias of Star Trek and related visions of the future may be, they hinge just as much on a fixed interpretation of the arc of human progress. Technological progress could allow humanity to escape the Malthusian trap and create a prosperous world free of competition, but it could just as easily lead to a world of Orwellian or Huxleyan social control.[xv]

To that end, the way to avoid the errors and assumptions of accelerationism is this: one must forget one’s idea of what the future will be like. Working toward a particular end will not necessarily bring it about, and may, by provoking opposition, produce a countervailing reaction that undoes the entirety of one’s progress. The vicissitudes of history are fierce and many, and few institutions have the capacity to see plans and goals through for more than a few decades before “today’s problems [become] the result of yesterday’s solutions”.


      [i] Day, Abby. “Sacred Time”. The International Encyclopedia of Anthropology, 1-8. 2018. doi:10.1002/9781118924396.wbiea1919 

      [ii] Abulafia, David. The Great Sea. 2012. Ebook version, Section 4, Chapter III, Paragraph 5.

[iii] Hegel, Georg Wilhelm Friedrich. The Phenomenology of Spirit. 1807.

[iv] Marx, Karl and Friedrich Engels. “Manifesto of the Communist Party”. 1848.

[v] Meisner, Maurice. Mao’s China and After: A History of the People’s Republic. Simon and Schuster, 1999.

      [vi] Hardt, Michael and Antonio Negri. Empire. Harvard University Press, 2001.

      [vii] Reddit. “/r/latestagecapitalism”. www.reddit.com/r/latestagecapitalism. Retrieved June 11, 2020. At the time of retrieval, the community had 538,889 subscribers.

      [viii] Kill Screen Staff. “How Much of a Sci-fi buff is Newt Gingrich, and what does science fiction tell us about the GOP?”. Kill Screen, February 29, 2012. https://killscreen.com/previously/articles/how-much-of-a-sci-fi-buff-is-newt-gingrich-and-what-does/. Retrieved June 2020.

      [ix] Malik, Tariq. “Newt Gingrich on Space Exploration: ‘NASA Is Standing in the Way’”. Space.com, June 14, 2011. https://www.space.com/11959-gop-presidential-debate-nasa-future-republicans.html. Accessed June 2020.

      [x] Thiel, Peter, as reported by Will Drabold. “Read Peter Thiel’s Speech at the Republican National Convention”. Time, July 21, 2016. https://time.com/4417679/republican-convention-peter-thiel-transcript/

[xi] Beckett, Andy. “Accelerationism: how a fringe philosophy predicted the future we live in”. The Guardian, May 11, 2017. https://www.theguardian.com/world/2017/may/11/accelerationism-how-a-fringe-philosophy-predicted-the-future-we-live-in

[xii] Schmitt, Carl. Political Theology: Four Chapters on the Concept of Sovereignty. Trans. George D. Schwab. MIT Press, 1985; University of Chicago Press edition, 2004, with an introduction by Tracy B. Strong. Originally published 1922; 2nd edn. 1934.

      [xiii] Haggard, Stephan and Robert Kaufman

      [xiv] Figure 1: Source: Google Trends. https://trends.google.com/trends/explore?date=2010-07-31%202020-07-31&q=accelerationism. Retrieved July 31, 2020. DOI: 10.6084/m9.figshare.12745526

[xv] A particularly insightful comparison is drawn in McMillen, Stuart. “Amusing Ourselves to Death”. Recombinantrecords.com, May 2009. McMillen has since removed the comic following copyright claims relating to Postman, Neil. Amusing Ourselves to Death. Viking Penguin, Methuen, UK, 1985.