Tag: progress

  • The War That Wasn’t: Christianity, Science, and the Making of the Western World

    Examining the History between Science and Christianity

    During my early adulthood I was a zealous New Atheist, and as such believed wholeheartedly in a message that was central to the NA movement: that Christianity had been a parasite on Western civilization, dragging humanity into the Dark Ages and smothering science until skeptics and Enlightenment thinkers finally pulled us back into the light. While studying European history in depth, though, I began to see cracks in that story. The relationship between Christianity and science wasn’t as clear-cut as New Atheists made it out to be, and in some respects was even constructive. But the question remained in my mind, and it grew into a larger curiosity about what made the West different—one that would eventually drive my MA studies in international economics and development.

    Recently, though, something surprising happened: I saw the old narrative resurface. If you were active in New Atheist circles in the 2000s (or honestly if you were active on the internet at all; to quote Scott Alexander, “I imagine the same travelers visiting 2005, logging on to the Internet, and holy @#$! that’s a lot of atheism-related discourse”) you probably saw a chart that looks something like this:

    While many of those who were active New Atheists back in the early 2000s have mellowed out and found other ideological motivations (or disenchantments), it seems there is a new generation of zealous atheists engaging with these ideas for the first time, and in one secular group I saw the above graphic posted unironically, complete with the ridiculous attribution of a “Christian” dark ages. The resurfacing of that old certainty, combined with a provocative new scholarly article offering an economic perspective on Christianity’s role in scientific progress, prompted me to revisit the question. How exactly did Christianity interact with science throughout history? The answer is messier, and far more interesting, than the stories I once took for granted. I will tackle this in a way that is at once thematic and chronological: 1) the Dark Ages (or Early Middle Ages), 2) the (High) Middle Ages, and 3) the modern (post-Renaissance) world.

    I. Did Christianity Cause the Dark Ages?

      First, I want to address the easiest part of the question: did Christianity somehow cause the Dark Ages?

      I think this can be answered very briefly with an unqualified “no”, and I will go even further and say “quite the opposite”. I don’t know of any reputable modern historians who would say otherwise. Historically there have of course been literally hundreds of different causes blamed for the fall of the Western Roman Empire and the ensuing “Dark Ages”, including such obscure culprits as infertility, trade imbalances, loss of martial virtues, and wealth inequality. Yes, contemporaries in late antiquity did blame Christianity and the abandonment of traditional paganism for the fall of Rome. For example, one of the last notable pagan scholars, the Eastern Roman historian Zosimus, put it plainly: “It was because of the neglect of the traditional rites that the Roman Empire began to lose its strength and to be overwhelmed by the barbarians” (Historia Nova, Book 4.59). Most famously, (Saint) Augustine of Hippo wrote his City of God to refute such perspectives (though he wrote before Zosimus): “They say that the calamities of the Roman Empire are to be ascribed to our religion; that all such evils have come upon them since the preaching of the Gospel and the rejection of their old worship” (The City of God, Book 1, Chapter 1). Needless to say, these were not modern academic historians and were clearly making biased assertions based on the vibes of the day.

      Today the historical consensus seems to be some combination of climate change-induced plague (see Harper, Kyle, “The Fate of Rome”), mismanagement by later emperors of the resulting chaos and malaise, and most importantly the Völkerwanderung, “The wandering of the peoples,” i.e. the migration of massive numbers of people from central and western Eurasia into Europe during the 4th and 5th centuries. To summarize the contemporary consensus into one sentence: the Western Roman Empire fell because decades of plague and civil war made it too weak to repulse or assimilate the entire nations of people migrating into Western Europe, and thus had to cut deals, cede territory, and slowly “delegate itself out of existence” (Collins, “Early Medieval Europe”). In direct refutation of the claim that Christianity caused the Dark Ages, the Christianization of these Germanic and other peoples was a vital channel for the transmission of culture and values and an important step toward (in Rome’s conception) civilizing and settling them in the Mediterranean world (see https://en.wikipedia.org/wiki/Ulfilas as one example).

      As further refutation, the brightest spark of European Civilization during the Western European Dark Ages (roughly 500-800) was the Eastern Roman Empire, which was unquestionably *more* thoroughly Christian than the “Darkening” West. The Eastern empire boasted a stronger institutional religious structure, with the emperor himself dictating much of theological policy, and strongarm enforcement of official positions (e.g. with the Christological debates of late antiquity) was common: “The Byzantine emperor, always the head of the imperial hierarchy, automatically evolved into the head of this Christian hierarchy. The various bishops were subservient to him as the head of the Church, just as the governors had been (and were still) subservient to him as the head of the empire. The doctrines of the Christian religion were formulated by bishops at councils convened by the emperor and updated periodically at similar councils, with the emperor always having the final say” (Ansary, Tamim, “Destiny Disrupted”).

      The Western empire, by contrast, struggled for centuries with institutionalization and conversion, with the Catholic church wrestling not just with latent paganism and heretical syncretism among rural populations but also with an existential battle against Arian Christianity (a nontrinitarian form of Christianity that asserts that Jesus was not an incarnation of God but merely a lesser creation of God), common for centuries among the ruling strata of Vandals and Goths in the early middle ages; “the Vandals were Arian Christians, and they regarded the beliefs of the Roman majority as sufficiently incorrect that they needed to be expunged… Arianism continued as the Christianity of ‘barbarian’ groups, notably Goths, Vandals and eventually Lombards, into the seventh century” (Chris Wickham, “The Inheritance of Rome”). Though I will risk overextending my argument here, I will say that the Church in fact prevented the Dark Ages from being even worse: “the Church emerged as the single source of cultural coherence and unity in western Europe, the cultural medium through which people who spoke different languages and served different sovereigns could still interact or travel through one another’s realms.” (Ansary).

      There is a caveat to all this, though. Christianity did seem to have a deleterious effect on the logical philosophy of the late Empire. I have been able to find at least three separate early Christian philosophers who all deliver variations on the same idea that faith should triumph over logic and reason: “The nature of the Trinity surpasses all human comprehension and speech” (Origen, First Principles, Preface, Ch. 2); “If you comprehend it, it is not God” (Augustine of Hippo, Sermo 117); and “I believe because it is absurd” (“Credo quia absurdum”), the traditional paraphrase of Tertullian’s De Carne Christi. But it is important to contextualize these perspectives within a general trend toward mysticism in late antiquity. Christianity was not alone, as Mithraism, Manichaeism, Sun worship, and other prescriptive revealed religions swirled in the ideological whirlpool of the era, and the rise of all of the above can be seen as a reaction to the declining state of the political economy: as we see evidenced today, material insecurity pushes people toward the comfort of religion (see e.g. Norris and Inglehart, Sacred and Secular).

      II. Did Christianity Hinder or Help the Birth of Modern Science?

      This question is somewhat more difficult to answer, and I originally had drafted much more, but decided to cut it down to avoid losing anyone in an overly academic morass. To summarize what I see as the answer, there were two necessary components to the rise of modern science, the ideological and the structural. Ideologically, the quest to understand the divine plan through exploration of the natural world was common to both Christian and Islamic proto-scientists, but when authorities decided this ideological quest was becoming threatening, structural changes that had taken place in Christendom (in part as a result of religious motivations) but not in the Caliphate saved proto-science in the former from the fate of the latter. Thus Christianity initially helped, then became antagonistic toward emerging proto-science, but by the time things turned antagonistic, structural changes prevented the Church from effectively stamping out the proto-scientific sparks. Let’s expand a bit, but first a note about the Twelfth Century Renaissance.

      In the west, the critical window here was the 12th Century Renaissance and the resulting changes that took place in the 13th century. The 12th Century Renaissance is less well-known than “The” Renaissance of the 15th century, but arguably had more far-reaching consequences in terms of laying the foundations of Western civilization and culture. “Although some scholars prefer to trace Europe’s defining moments back to the so-called Axial Age between 800 and 300 b.c., the really defining transformative period took place during the Renaissance of the twelfth and thirteenth centuries. That is when the extraordinary fusion of Greek philosophy, Roman law, and Christian theology gave Europe a new and powerful civilizational coherence.” (Huff)

      The Twelfth Century Renaissance witnessed two unrelated trends that came together at the end of the 13th century in one seminal decade, which I will unpack in later paragraphs. The first trend is the reintroduction (via translation schools in Toledo, Constantinople and the Papal States) of most of the works of Aristotle, giving birth to a “new intellectual discipline [that] came to be known as ‘dialectic.’ In its fully developed form it proceeded from questions (quaestio), to the views pro (videtur quod) and con (sed contra) of traditional authorities on a particular subject, to the author’s own conclusion (responsio). Because they subjected the articles of faith to tight logical analysis, the exponents of the new rational methods of study became highly suspect in the eyes of church authorities” (Ozment). The second trend was the rediscovery of Roman Law which triggered an immense restructuring of legal rights and entities in the entire Western world: “An examination of the great revolutionary reconstruction of Western Europe in the twelfth and thirteenth centuries shows that it witnessed sweeping legal reforms, indeed, a revolutionary reconstruction, of all the realms and divisions of law […] It is this great legal transformation that laid the foundations for the rise and autonomous development of modern science […]” (Huff).

      We can now see how the novelties of the Twelfth Century Renaissance interacted to create the preconditions for science as we know it, by examining the confluence of the two requirements, the ideological and the structural.

      Ideologically, what was required for the creation of science was the attempt to use existing knowledge to understand the underlying structure of the world, i.e. the codification and understanding of the scaffold of natural laws that allowed an understanding of the way the world worked. The belief in a knowable creator god seems to have given rise to this concept in the Abrahamic world. In his “history of the world through Muslim eyes,” Destiny Disrupted, Tamim Ansary encapsulates that both Christian and Muslim proto-scientists shared the same goals of understanding God through natural observation: “As in the West, where science was long called natural philosophy, they [Abbasid-era Muslim philosophers] saw no need to sort some of their speculations into a separate category and call it by a new name[…]science as such did not exist to be disentangled from religion. The philosophers were giving birth to it without quite realizing it. They thought of religion as their field of inquiry and theology as their intellectual specialty; they were on a quest to understand the ultimate nature of reality. That (they said) was what both religion and philosophy were about at the highest level. Anything they discovered about botany or optics or disease was a by-product of this core quest[…]”. 
Islamic and European civilization both shared Greek intellectual roots: “Greek logic and its various modes were adopted among the religious scholars.” (Huff) China, in contrast, along with the pre-Christian Mediterranean world, had admirable command of engineering principles and keen natural observation, as exemplified by the likes of Zhang Heng, Archimedes, or Heron of Alexandria; but while visionary voices likely existed in each, neither Classical nor Chinese civilization generally adopted an ideological outlook that asked what a collection of discrete natural and engineering phenomena, such as the flow of water, the motion of planets, or architectural physics, might all say about the fundamental structure of the world or the will of the divine. To nail the issue even more tightly shut, “traditional Chinese mathematics was not abstract because the Chinese did not see mathematics in any philosophical sense or as a means to comprehend the universe. When mathematical patterns were established, they ‘were quite in accord with the tendency towards organic thinking’ and equations always ‘retained their connection with concrete problems, so no general theory could emerge’” (Olerich 22). In the West, by contrast, the Twelfth Century Renaissance added jet fuel to the existing ideological quest to create general theories and comprehend the universe: “In a word, by importing and digesting the corpus of the ‘new Aristotle’ and its methods of argumentation and inquiry, the intellectual elite of medieval Europe established an impersonal intellectual agenda whose ultimate purpose was to describe and explain the world in its entirety in terms of causal processes and mechanisms” (Huff 152).

      Structurally, what was required for the creation of modern science was the institutional independence of proto-universities to explore questions that ran contrary to social and religious dogmas. As historian Toby Huff explains, the new legal world created by the rediscovery of Roman legal codes was a veritable Cambrian explosion for European institutions and ideas:

      “For example, the theory of corporate existence, as understood by Roman civil law and refashioned by the Canonists and Romanists of the twelfth and thirteenth centuries, granted legal autonomy to a variety of corporate entities such as cities and towns, charitable organizations, and merchant guilds as well as professional groups represented by surgeons and physicians. Not least of all, it granted legal autonomy to universities. All of these entities were thus enabled to create their own rules and regulations and, in the case of cities and towns, to mint their own currency and establish their own courts of law. Nothing like this kind of legal autonomy existed in Islamic law or Chinese law or Hindu law of an earlier era.”  (Huff)

      To expand: up to the 12th century, the legal forms in Western Europe were almost wholly those that had been imported from Germanic feudal law, a highly personalist structure of fealty and dependence – land, businesses, countries, churches were the responsibility of individual lords, artisans, kings, bishops, what have you – who depended on their superiors for the mere right to exist. The idea of corporate personhood, that “corporations are people” (putting aside all of the exploitative and oligarchic connotations it has taken on in the context of the American political scene in the 21st century), was a fascinating, powerful, liberating idea in the 12th century, and one that proved critical to the rise of modern science. Quickly towns, cities, mercantile ventures, and most critically cathedral schools, seminaries, and proto-universities strove to incorporate (literally, “to make into a body”) their existence in the Roman legal mold – no longer were they merely collections of people; they argued their way into being as legal entities distinct from their members and adherents. Further, the Catholic Church enriched its canon law with Roman borrowings and promoted the creation of formal legal studies, for example at the University of Bologna. Compounding with the ideological ferment that followed the reintroduction of Aristotle and other new texts, “European scholars began gravitating to monasteries that had libraries because the books were there[…] Learning communities formed around the monasteries and these ripened into Europe’s first universities” (Ansary), which could then, as independent corporate entities, survive political or theological pressure on or from any individual member.

      We can quite clearly examine the benefit of this arrangement by counterfactual comparison with Islamic society. Despite the fact that Islamic society also pursued the quest to understand the divine structure of the physical world, and thus shared the same ideological perspectives that gave rise to proto-science, its very different institutional structure resulted in a very different outcome for Islamic science. As philosophers began to question fundamental religious dogmas such as the necessity of revelation or the infallibility of the Quran, “the ulama were in a good position to fight off such challenges. They controlled the laws, education of the young, social institutions such as marriage, and so on. Most importantly, they had the fealty of the masses” (Ansary). The intellectual institutions such as the Islamic awqaf (plural of waqf, “pious endowment”) that did house these nascent intellectual pursuits were not legally independent but were the dependencies of individual sponsors who could apply pressure – or have pressure applied to them – and their very nature as pious endowments meant that “they had to conform to the spirit and letter of Islamic law” (Huff). Reaching higher into the political structure, in the 10th century most of the Islamic world was united under the Abbasid Caliphate, and consequently a reactionary shift by the government could produce a persecution reaching most of the Islamic world. That is precisely what happened, for the ulama used their influence to force a change in the Caliphate’s direction. After a high tide of intellectual ferment, the subsequent persecution of the scientist-philosophers under the next Caliph “signaled the rising status of the scholars who maintained the edifice of orthodox doctrine, an edifice that eventually choked off the ability of Muslim intellectuals to pursue inquiries without any reference to revelation” (Ansary).
And just to once again contrast with the Far East, “In China, the cultural and legal setting was entirely different, though it too lacked the vital idea of legal autonomy” (Huff). Most importantly in China, the dominance of the Civil Service Examinations served as a gravity well attracting all intellectual talent to a centralized, conservative endeavor, stifling other intellectual pursuits: “This was not a system that instilled or encouraged scientific curiosity […] The official Civil Service Examination system created a structure of rewards and incentives that over time diverted almost all attention away from disinterested learning into the narrow mastery of the Confucian classics.” (Huff 164).

      Bringing this all together: in the West, the twin fuses of ideological ferment and corporate independence intertwined, and who else would light the spark but the Catholic Church? As noted already, the Church realized the threat posed by the rising tide of Aristotelianism and its promotion of rigorous logical examination of the Church’s teachings. Whereas earlier in the 1200s the tendency was to try to find common ground between Aristotle and Christianity, or even to use them to reinforce each other as exemplified by Thomas Aquinas, by the latter part of the century conservative elements in the church saw Aristotelianism as an inherently hostile cancer, and in 1270 and again in 1277 they declared war, issuing (and then reinforcing) a blanket condemnation of Aristotle. Historian Steven Ozment explains that “In the place of a careful rational refutation of error, like those earlier attempted by Albert the Great and Thomas Aquinas, Bishop Tempier and Pope John XXI simply issued a blanket condemnation. The church did not challenge bad logic with good logic or meet bad reasoning with sound; it simply pronounced Anathema sit.”


      The momentousness of this decision for the course of western thought cannot be overstated, for it represented an end to the attempt to reconcile theology and philosophy, science and religion. “Theological speculation, and with it the medieval church itself, henceforth increasingly confined itself to the incontestable sphere of revelation and faith[…] rational demonstration and argument in theology became progressively unimportant to religious people, while faith and revelation held increasingly little insight into reality for secular people.” (Ozment, Steven “The Age of Reform 1250-1550”). In short, from 1277 onward, the religious got more religious, and the rational became more rational.

      We see here the importance of corporate independence and decentralized governance in action, because there were indeed attempts to stamp out Aristotelianism: in England, there were attempts at Oxford in the late 13th century to restrict the teaching of certain Aristotelian texts. In 1282, the Franciscan Minister General Bonagratia issued statutes attempting to limit the study of “pagan” philosophy (mainly Aristotle) among Franciscan students. In the Dominican Order, after Aquinas’s death, there were some attempts by conservative members to restrict the teaching of his Aristotelian-influenced theology. The Dominican General Chapter of 1278 sent visitors to investigate teachers suspected of promoting dangerous philosophical doctrines. But these efforts failed, and universities proudly asserted their newfound legal independence: the University of Toulouse, which had incorporated only in 1229, declared that “those who wish to scrutinize the bosom of nature to the inmost can hear the books of Aristotle which were forbidden at Paris” (Thorndike). The University of Padua became particularly known as a center for “secular Aristotelianism” in the late 13th and 14th centuries, and maintained a strong tradition of studying Averroes’ commentaries on Aristotle even when these were controversial elsewhere (Conti, Stanford).

      But for the thinkers during and just after this time period, the intellectual whiplash stimulated new thought that truly began the rebirth of scientific thinking in the Western world. Instead of blindly taking either the Church or Aristotle at face value, the idea that they could be in conflict gave rise to the idea that either or both could be wrong. Scholars such as Jean Buridan or Nicole Oresme began their studies in religious matters (the former was a cleric and the latter a bishop) before turning to “scientific” studies, but their questioning of both religious and Aristotelian doctrine led them to pierce through accepted dogmas, making unique contributions to a wide variety of fields; they are generally considered to have laid the foundations for the coming scientific and Copernican revolutions.

      III. How Have Science and Religion Interacted Post-Renaissance?

      In a recent post on MarginalRevolution, economist Tyler Cowen linked a new article which broaches this ancient quarrel anew, at least for the modern era. The opening statement concisely encapsulates the picture painted above: “Today’s leading historians of science have ‘debunked’ the notion that religious dogmatism and science were largely in conflict in Western history: conflict was rare and inconsequential, the relationship between religion and science was constructive overall”, and Cowen adds his commentary that “Christianity was a necessary institutional background”, as I believe the preceding section has shown. But the article, by Matías Cabello, picks up the story where I left off and looks at the relationship after the Renaissance. Cabello sees the modern period as unfolding in three stages: an increasingly secular perspective from the late Middle Ages until the Reformation, then a new period of increased religious fervor during the Reformation and the Wars of Religion (16th-17th centuries), finally relenting with the dawn of the Enlightenment in the early 18th century.

      Cabello’s chronology lines up closely with my own knowledge of the topic, though I admit that after the Reformation my knowledge is inferior to that of the previous eras; I draw primarily from Carlos Eire’s monumental and well-regarded book Reformations for this period. But in general, there’s a lot of data showing that the Reformation was a much more violent, zealous, and unscientific time than the periods straddling it. A useful theory for understanding the dynamics of religion during this period is the Religious Market theory as formulated by Stark and Bainbridge (1987). In this theory, religions compete for adherents on a sort of market (or playing field, if you will): in areas of intense competition, religions must improve and hone their “products” to stay competitive against other religions, but where one religion monopolizes the market it becomes less competitive, vital, and active in the minds of adherents. This phenomenon is visible most clearly in the secularization of the Scandinavian countries, where Lutheranism enjoyed a near complete monopoly for 400 years, and is often employed to explain why the pluralistic US is more religious than European countries with one hegemonic church, but I would argue it was also clearly at play in the Middle Ages. By the late Middle Ages, Catholicism enjoyed complete dominance in Western Europe against all rivals, allowing cynicism, political infighting (e.g. the Western Schism, which at one point saw three popes competing for recognition over the church), and, most critically, corruption to creep into the Church’s edifice. But when the Protestant Reformation broke out (in large part for the reasons just enumerated), suddenly there were several competing “vendors” who had to knuckle down and compete with each other and with different strands within themselves, leading to increased fanaticism and internecine violence for more than a century.
There’s a lot of evidence corroborating this general trend, for example witch hunts, which despite being portrayed in popular culture as a medieval phenomenon were decidedly a Reformation-era affair, as shown in the chart below (indeed, many of our popular ideas of the middle ages come from Enlightenment/Whig writers looking back on the 17th century and erroneously extrapolating from there).

      If I can contribute my own assembled quasi “data set”: a few years ago I put together a Western musical history playlist featuring the most notable composers from each time period, and one thing that clearly jumped out at me, without my being aware of this historical topography, was that the music before the Reformation was much more joyous and open (and to my ears, just more enjoyable to listen to) than the rather conservative and solemn music that came just after. To sum up, a lot of indicators tell us that the period of roughly 1500-1700 would have been a much less creative, open-minded, and probably less fun time to live in than the periods just before or after.

      Getting back to Cabello, one of the novelties of his work is its quantitative approach to what has traditionally been a very non-quantitative area of inquiry: scraping and analyzing Wikipedia articles to see how the distribution and length of articles on science-related figures shifted over time. His perspective is most concisely presented by his figure B2, reproduced here:

      To quote the author, “This article provides quantitative evidence—from the continental level down to the personal one—suggesting that religious dogmatism has been indeed detrimental to science on balance. Beginning with Europe as a whole, it shows that the religious revival associated with the Reformations coincides with scientific deceleration, while the secularization of science during the Enlightenment coincides with scientific re-acceleration. It then discusses how regional- and city-level dynamics further support a causal interpretation running from religious dogmatism to diminished science. Finally, it presents person-level statistical evidence suggesting that—throughout modern Western history, and within a given city and time period—scientists who doubted God and the scriptures have been considerably more productive than those with dogmatic beliefs.”
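      To give a feel for the kind of aggregation behind such figures, here is a minimal sketch of the general approach: group article lengths of scientific figures by birth century and compare averages as a crude proxy for scientific activity over time. The names are real, but the lengths and the grouping logic are my own invented illustration, not Cabello’s actual data or pipeline.

```python
from collections import defaultdict

# Hypothetical records: (name, birth_year, article_length_chars).
# Cabello scrapes real Wikipedia data; these values are invented.
figures = [
    ("Jean Buridan", 1301, 42_000),
    ("Nicole Oresme", 1320, 55_000),
    ("Tycho Brahe", 1546, 88_000),
    ("Johannes Kepler", 1571, 120_000),
    ("Galileo Galilei", 1564, 150_000),
    ("Isaac Newton", 1643, 190_000),
]

def mean_length_by_century(records):
    """Average article length per birth century: a crude proxy for
    how much scientific 'notability' each era produced."""
    buckets = defaultdict(list)
    for _name, year, length in records:
        century = (year - 1) // 100 + 1   # e.g. 1301 -> 14th century
        buckets[century].append(length)
    return {c: sum(v) / len(v) for c, v in sorted(buckets.items())}

print(mean_length_by_century(figures))
```

The real analysis of course requires far more care (disambiguating articles, normalizing by total Wikipedia coverage of each era, and so on); this only illustrates the shape of the exercise.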

      It is no coincidence, then, that the single most famous skirmish in history between science and religion, the trial and condemnation of Galileo Galilei, came squarely in the nadir of this fanatical period (1633).

      And yet even in this context, science did, of course, continue to progress, and religious beliefs often lit the way. Historian of science Thomas Kuhn produced what is likely the best analysis to date of how science progresses through the ages and how it is embedded in sociocultural assumptions. In his magnum opus, The Structure of Scientific Revolutions, it is clear that the paradigm shifts that create scientific revolutions do not require secularization but rather the communion of different assumptions and worldviews. Consider the Danish astronomer Tycho Brahe and his German assistant Johannes Kepler, who as master and apprentice were looking at the same data but, with different worldviews, came to very different conclusions. Brahe believed that in a biblically and divinely structured universe the earth must be at the center, and as such rejected the new Copernican heliocentrism. His apprentice Kepler, however, also employed religious motivation, seeing the new heliocentric model as an equally beautiful expression of divine design, and one which squared more elegantly with the existing data and mathematics. Science, thus, is not about accumulating facts, but about looking at them through different worldviews. In one of my posts a few months ago, I mentioned that religious convictions and other motivations can push scientists to bend or break the scientific method, sometimes leading to scientific breakthroughs. One of the clearest examples was the scientific expedition of Sir Arthur Eddington, whose Quaker convictions likely made him see scientific discovery as a route toward global brotherhood and peace. In short, the scientific method is an indispensable tool for verifying discoveries and we neglect it at our peril, but we must not let it become a dogma, for the initial spark of discovery often emanates from the deeply personal, irrational, cultural, or religious ideas of individuals.

      In a recent blog post, Harvard professor of AI and Data Science Colin Plewis posted the following praise of Kuhn: “Far from the smooth and steady accumulation of knowledge, scientific advancement, as Kuhn demonstrates, often comes in fits and starts, driven by paradigm shifts that challenge and ultimately overthrow established norms. His concept of ‘normal science’ followed by ‘revolutionary science’ underscores the tension between tradition and innovation, a dynamic that resonates far beyond science itself. Kuhn’s insights have helped me see innovation as a fundamentally disruptive and courageous act, one that forces society to confront its entrenched beliefs and adapt to new ways of understanding.”

      Conclusion

      Hopefully this post has made a strong argument for the limited claim that Christianity is not a perennial enemy of science and civilizational progress. Perhaps it has also given some evidence for the idea that scientific advancement benefits from the contact and communication of different worldviews, assumptions, and frameworks of belief, and that Christianity, or religious belief in general, is not necessarily harmful to this broader project. Without question there are places and times in which dogma, oppression, and fanaticism inhibit freedom of thought and impede the scientific project – but these can be found not only in fanatically religious periods like the Wars of Religion or the fall of the Caliphates, but also in the fires of secular fanaticism such as Lysenkoism and the Cultural Revolution, or even under the simple oppressive weight of institutional gravity, as with the imperial examination system in China.

      What can we take away from this historical investigation to inform the present and future?

      Normally, we consider universities and affiliated research labs to be the wellsprings of scientific advancement in the modern West. But higher education in the United States demonstrates increasing ideological conformity: a 2016 study found that “Democratic-to-Republican ratios are even higher than we had thought (particularly in Economics and in History), and that an awful lot of departments have zero Republicans”. Americans are increasingly sorting themselves into likeminded bubbles, including politically homogeneous universities, preventing the confrontation with alternative worldviews that is the very stuff of free thought, creativity, and scientific progress. And since popular perception suggests that this trend has only been exacerbated in the intervening years, the “courageous act” of “revolutionary science” that “challenges and overthrows existing norms” may have to come from an ideological perspective outside the secular, liberal worldview of modern American academia. It may be that overtly religious institutions like a new Catholic Polytechnic Institute, or explicitly free-speech schools like the University of Austin, no matter how retrograde or reactionary they may appear to many rationalists, are the home of the heterodox thinking necessary to effect the next scientific revolution.

      Of course, the future of science will be inextricably linked to Artificial Intelligence, and it remains to be seen exactly what kind of role AI (or AGI or ASI) will play in the future of scientific discovery, leaving me with nothing but questions: (when) will AI have a curious and creative spark? Will it have a model of the universe, preconceptions and biases that determine how it deals with anomalous data and limit what and how it is able to think? Or will it have the computational power and liberty to explore all possible hypotheses and models at once, an entire quantum universe of expanding and collapsing statistical possibilities that ebb and flow with each new data point? And if, or when, it reaches that point, will scientific discovery cease to be a human endeavor? Or will human interpreters still need to pull the printed teletype from Multivac and read it out to the masses like the Moses of a new age?

    1. The Irrationality of GMO Opposition

      The way in which the issue of GMOs is framed drastically influences opinions on the subject. If we take GMOs broadly to mean their common implementation in the context of corporatized, chemical-heavy, monocultured agribusiness, few people are strongly in favor of such systems. However, if we take GMOs narrowly to mean the simple fact of inserting beneficial genetic traits into crops – making them, for example, more drought-tolerant, more nutritious, or resistant to certain pests – strong opposition plummets.

      Let us address these two definitions in turn. As regards the broad definition, there are few aspects of GMO cultivation in this context that differ from international agribusiness in general; the same criticisms levelled at the one are levelled at the other. The exploitation of farmers in the developing world, the destruction of natural habitats, the use of pesticides, and the creation of environmentally monotonous monocultures are problems common to any form of corporate agriculture, be it GMO or not. It may thus behoove us to decouple the two concepts: consider for a moment GMOs implemented in small-scale, ecologically harmonious, potentially organic conditions – all the problems critics cite regarding corporatized agriculture removed from the picture, with only the GMO technology itself remaining. How far, then, does opposition fall? I am not sure how easy it would be to ascertain this, given the need to walk poll participants through a bit of a thought experiment, but the question remains.

      Even so, there remains a core of critics who are starkly resistant to the technology itself. These critics primarily see the manipulation of nature via genetic modification as an inherent wrong – nature bequeaths to us a certain set of genetic variation, and it is our station to work within that framework. To these critics it should be pointed out that the entire history of human agriculture has involved genetic manipulation – ten thousand years of selecting desirable traits in crops and animals has left many of them starkly different from their wild ancestors. Genetic modification has always taken place and always will. Another line of criticism, though, concerns the dangers of genetic manipulation: inserting pesticidal genes into food crops might inadvertently harm humans or the rest of the ecosystem, and the results might not be apparent for years. Better, then, to maintain precaution by avoiding such technologies altogether.

      This argument is not entirely incorrect, but it is still not an argument against GMOs. The fault in the reasoning is that conventional plant breeding is susceptible to exactly the same problem. For a benign example, consider modern storebought tomatoes. Throughout the 20th century, grocers noticed that bright red tomatoes sold better than mottled or off-color varieties. Grocers passed this on to farmers and plant breeders, who responded by selecting for brighter and redder tomatoes, leaving us today with blue-ribbon tomatoes – at least on the surface. But beneath the exterior a genetic trap was being sprung: in tomatoes, the genes for color lie coincidentally close on the DNA strand to those for sweetness and flavor. Decades of careful breeding and selection for bright red tomatoes thus left us with storebought tomatoes bleached of flavor and sugar. Anyone who has ever tasted a home-grown tomato can attest that it has very little in common with the flavorless storebought varieties beyond an ostensible name and exterior. We manipulated plant genetics through conventional breeding, and we had no idea what we were doing. The result is a culinary disaster.

      But as I mentioned, that was the benign example. Other such mishaps are not so harmless. Consider the Lenape potato. In the mid-twentieth century, potato breeders sought the perfect spud for making chips – white, flavorful, and capable of frying up with a delightful crispy crunch. But as with the tomato above, this treasure trove of conventional breeding turned out to be a Pandora's box of unwanted genetic consequences: the resulting potato proved unexpectedly high in toxic glycoalkaloids, leaving several potato chip enthusiasts with life-threatening illness.

      To this end, GMOs actually provide a solution rather than a deeper problem. Breeding plants and animals is always a tricky proposition – in addition to the myriad genetic shuffles that take place between the DNA of two parents, there is the randomness of mutation, the stuff of natural selection and evolution. Without sequencing the genome of each individual seed, it is impossible to know exactly what the final product will be. With genetic modification, by contrast, it is possible to know exactly what genetic changes have taken place; indeed, that is precisely the point. If we want to insert a gene for vitamin K production from an obscure berry into a common rice plant, we don't have to guess at what that gene will do: the plant's cells read it as exactly that. Due to the nature of genetic variation and mutation, however, it is entirely possible for conventional breeding to produce a rice plant carrying the very same gene as the berry – or for a mutation to yield a gene that fills the rice plant with cyanide. Either could happen without breeders, farmers, grocers, or consumers ever knowing – until it was too late.


      The Spain Study

      That said, I would like to draw attention to a recent study of GMO cultivation in Spain. The study finds that GMO cultivation has brought greater economic benefit, lower pesticide use, lower land use, and ultimately lower environmental impact than conventional agriculture. On the whole, GMO agriculture in Spain is simply better along several dimensions. It remains true that many of these plantations are monocrop, that they are not organic, and that they displace traditional small-scale agriculture. But what is the right comparison to make? I propose that the comparison be drawn not with organic farming, next to which GMO cultivation may leave much to be desired, but with conventional large-scale agriculture – and on the dimensions above, GMOs offer an improvement. For those who would still say it is better to be safe than sorry, I would ask: is keeping pesticide use and land use higher than necessary really “playing it safe”? The precautionary principle works both ways: sometimes caution means keeping a new technology out; sometimes caution means letting a new technology in. Not innocent until proven guilty, but an improvement until proven a detraction.

      Opposition to GMOs will remain widespread. The idea that humans are haphazardly tinkering with nature and creating frankenfoods has a great deal of cachet. The idea that we should make do with what we have and not risk unforeseen harm is a powerful one as well. I do not ask anyone to abandon caution, or to abandon reverence for a natural way of doing things. All I ask is that these principles be taken to their natural conclusions: could caution not lead us, sometimes, to accept GMOs? And with a complete reverence for natural ways of doing things, what else would we need to abandon?

    2. Free Trade – the American Past and Future

      Aside from human decency, several minorities, and general faith in the American political system, one of the lesser-lamented victims of the post-2016 turn in American politics was free trade. Strangely, free trade went from being one of the few things most American politicians agreed upon in 2012 to falling by the wayside as a cornerstone of American economic policy. Before we so lightly abandon it, let us consider what it has done for us, and what it can still do.

      In the wake of the Second World War, the United States stood alone as a colossus of industry. Western Europe and the Japanese Empire, the most industrialized places outside the Americas, as well as secondary centers like the Soviet Union and the Middle East, had been devastated by years – in some places nearly a decade – of total war. In 1945 the United States possessed nearly half of the world's industrial capacity (estimates vary, but most put it above 40%): the entire rest of the world combined could barely exceed the industrial output of the United States alone.

      During this period it made complete sense for the United States to pursue policies of free trade, and to get others to do the same. But let us dig a little into the why of the thing. To some the idea is intuitively obvious; to others, free trade means hemorrhaging jobs overseas, impoverishing workers, and undercutting the power of unions. In the postwar years, though, the United States, with its unrivaled industrial capacity, was doing two things that benefitted from free trade: (1) exporting manufactured goods like cars, radios, and the nascently popular television, and (2) importing the raw materials – wood, agricultural products, and metals – to be turned into those goods. For a country in this situation, it was beneficial to knock on the doors of trade partners and propose an exchange: their exporters could sell tariff-free into the US, and in return US exporters could sell tariff-free into their markets. For many countries in Latin America, East Asia, and the Middle East, major exporters of raw materials and importers of finished goods, these deals made intuitive sense as well. There was little need for coercion, strong-arming, or massive expense of diplomatic capital to see most of the world embrace a regime of free trade and open markets – especially considering that a major aggravator of the Great Depression, and thus of the ensuing Second World War, was the imposition of punitive, beggar-thy-neighbor trade barriers and the resulting collapse of global commerce (by 1933 global trade had collapsed to a third of its 1929 value).

      Now back in the United States: surely this embrace of free trade in the immediate postwar era meant that foresters, coal miners, and farmers were outcompeted by cheap imports, their communities impoverished while CEOs got fat off the profits – as many allege is happening today with deals like NAFTA, right? Quite the contrary. There are many contributing factors to why this didn't turn out poorly (and in fact turned out so well), and intense political debates and entire academic careers surround the relative importance of each, but among the most important (in no particular order) are these:

      • American (and indeed global) industry in this period was still highly labor intensive and much of it did not require many special skills; those unemployed miners, farmers, etc. did not have very much trouble finding work, preventing mass layoffs and unemployment
      • The 1944 GI Bill provided funding for returning military personnel who had served in WWII to get training and education, allowing immense mobility into the growing and expanding higher-skilled sectors of the American economy. Between 1944 and 1956, nearly 10 million veterans received these education and training provisions (considering the US population in 1950 was only about 150 million, this is an enormous swath of the American workforce receiving post-secondary education assistance).
      • The top marginal tax rate for much of the 1950s was nearly 90%. Economic inequality remained very low and social mobility very high. The increased government revenues resulting from these tax rates funded the infamous Military-Industrial Complex as well as such far-reaching investments as the Interstate Highway System, which circulated massive amounts of money throughout the economy.

      As a result of these and other factors, American companies enjoyed massive demand for their increasingly advanced manufactured goods as the economies of Western Europe and East Asia rebuilt, and sold easily thanks to lowered or absent tariffs and trade barriers. Their access to cheap, tariff-free raw materials from Latin America and elsewhere kept them profitable, allowing them to grow and soak up enormous amounts of excess labor in the US labor market, preventing mass unemployment and feeding a growing middle class and the halcyon prosperity of the 50s and early 60s. High marginal tax rates and heavy investment in social programs, education, and infrastructure ensured unparalleled levels of socioeconomic mobility. Meanwhile, many developing countries, a number of them still under the yoke of European empire, were either struggling with fundamental problems of internal development and wartime devastation (e.g. Korea) or finding their own path to prosperity through more efficient and productive exploitation and export of agricultural goods and natural resources – the route Argentina followed in the late 1800s, when its per-capita income briefly surpassed that of France, Sweden, or Italy. Eager for access to US markets in exchange for lowered tariffs on imported manufactured goods, much of the world climbed aboard the trade bandwagon.

      In this new regime of free trade, between 1945 and 1970 the volume and value of global trade skyrocketed, and American prosperity along with it.

      The current state of world affairs could not be more different. Countries like China and Mexico are not content to sell agricultural products and ores and buy American cars and televisions; they produce televisions, cars, and durable goods of their own, often more cheaply and innovatively than those produced in the US. What began in the late 60s, as Volkswagens and later Hondas started outcompeting GM and Ford in the US market, has become the new global status quo – the United States holds no monopoly on manufactured goods, electronics, or post-industrial services.

      Politicians like Bernie Sanders and Donald Trump are in one respect right when they assert that the United States cannot maintain prosperity by doing exactly what it has been doing with regard to free trade, and that continuing down that path will lead to the end of the American working class and the continued hemorrhaging of jobs and wealth to developing countries. If one defines the working class as the class that can make a decent living from relatively unskilled labor, particularly in manufacturing, those are precisely the jobs most likely to relocate to areas with cheaper labor – to the Chinas and Mexicos of the world.

      However, politicians like Sanders and Trump are completely wrong in asserting that the response to these lost jobs and massive trade deficits should be protectionism and a retreat from neoliberalism and free trade. An advanced post-industrial economy like the United States has no business trying to compete with developing and industrializing economies in unskilled manufacturing; the Chinese manufacturer of bicycles or computer monitors will always outcompete the American manufacturer in such products, because the Chinese factory worker requires only the wages that sustain a Chinese factory worker's standard of living. Unless the United States is willing to sacrifice all the advances in comfort and standards of living of the last 50 years, we will never compete with China in basic manufacturing. Instead we should be competing with Japan and Germany in advanced electronics and engineering, energizing our infotech sectors on the West Coast and biochemical clusters in the Northeast, and investing in clean and renewable energy in the Midwest – not protecting the dying industries of the Rust Belt.
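      The economics underlying the last two paragraphs is Ricardo's theory of comparative advantage, and a few lines of arithmetic make it concrete. The sketch below uses invented productivity figures – the goods and numbers are hypothetical, chosen purely to illustrate the mechanism – to show that even when one economy is absolutely more productive at everything, each side still gains by specializing where its opportunity cost is lowest:

```python
# Comparative advantage in miniature. Productivity figures are in
# units of output per worker-year and are invented for illustration.
output = {
    "USA":   {"bicycles": 100, "jet_engines": 20},
    "China": {"bicycles": 80,  "jet_engines": 4},
}

def opportunity_cost(country, good, other_good):
    """Units of `other_good` forgone to produce one unit of `good`."""
    return output[country][other_good] / output[country][good]

for country in output:
    oc = opportunity_cost(country, "jet_engines", "bicycles")
    print(f"{country}: one jet engine costs {oc:.1f} forgone bicycles")

# The US is absolutely more productive in BOTH goods here, yet a jet
# engine "costs" the US only 5 bicycles while it costs China 20.
# Total output is therefore maximized when the US specializes in jet
# engines and trades for Chinese bicycles, and vice versa.
```

      With these numbers a jet engine costs the US five forgone bicycles but costs China twenty, so world output rises when each economy specializes and trades – which is the sense in which an advanced economy "has no business" competing in basic manufacturing even if it could.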

      The only obstacle is that the US will have to rebuild much of the institutional foundation of 1950s prosperity to accomplish this transition. We should be investing in trade and technical schools and public universities to ensure that formerly unskilled manufacturing workers have the skills and talents necessary for advanced manufacturing, as the GI Bill did in the 40s and 50s. We must invest in new infrastructure: as a 21st-century analog to the Interstate Highway System, the US must build nationwide broadband and smart grids for all, giving underprivileged students access to the socioeducational benefits of resources like Wikipedia, Khan Academy, and YouTube, and allowing Midwest energy production in wind, natural gas, biofuel, and solar to power the major coastal cities. We must strive for the future, not the past; we must compete with countries whose standard of living we envy, not those whose standards we surpassed long ago. A brighter future is possible. But we must fight for it.