What is an Outside Scholar

NOTE: The following is a first draft of a chapter which I wrote for a book I am currently working on. I decided to cut the chapter as being outside the scope of the book, but it seemed a shame to waste it, so I decided to post it here, particularly since parts of it relate to various of my Great Books posts.


So far in this book I have used the term outside scholar fairly casually to refer to Victor Sharrow and those like him. Before proceeding, I think it is time to expand on what I mean when I use this label. First, though, I think it is appropriate that we review a bit of history about scholarship in general.

A scholar is a person who creates knowledge by a process called research and transmits it to others, usually through writing. In contemporary usage the term often connotes a profession. In the truest and most historic sense, however, scholarship is a vocation; scholars are driven by intellectual curiosity, love of knowledge, and a desire to create a permanent legacy for other scholars who will come later. A person with the true scholarly vocation will usually find a way to pursue their interests regardless of what formal profession they follow to make a living. In fact, the idea of a professional scholar who is paid for their studies is largely an invention of the modern age.

In our western tradition this conception of scholarship has its roots, like much else in our society, in the Golden Age of classical Greece, when literate men began to research science, philosophy, and history and record their conclusions on scrolls which they allowed other scholars to borrow and copy, birthing the concept of scholarly publication. Typical of these men were the historians Herodotus, Thucydides, and Xenophon. The first was a merchant, while the other two were career military men, but all three were fascinated by recent history and the causes and effects of war. After collecting and comparing oral histories and visiting some of the locations where important events had occurred, they wrote books which not only chronicled history but also analyzed it. Their works are still read and studied today1.

Thucydides, at least, was fully cognizant of his drive to leave a permanent intellectual legacy, writing:

“It will be enough for me…if these words of mine are judged useful by those who want to understand clearly the events which happened in the past and which (human nature being what it is) will, at some time or other and in much the same ways, be repeated in the future. My work is not a piece of writing designed to meet the taste of an immediate public, but was done to last forever.”

Thucydides knew Herodotus personally and was influenced by his book. Xenophon would have been acquainted with the work of both and seems to have written the Hellenica as a direct sequel to Thucydides’ work. Nevertheless, it never occurred to these men, or to their contemporaries whose work is now lost, to think of themselves as a community or school of historical scholars. They merely shared a common interest.

It was the philosophers of Greece who originated the idea of an academy. The original Academy was a grove of trees outside Athens where teachers met with their students. The academy became an actual institution when Plato joined with other local philosophers to create a school, holding classes in his home or the nearby gymnasium. Aristotle, the son of the Macedonian royal physician, studied there for many years before eventually founding his own academy in Athens, the Lyceum. Philosophers in Greece had always supplemented their incomes by teaching the sons of the local aristocracy. Formal academies were a way of persuading students to come to them, rather than wandering the country in search of pupils. Academies on the pattern of Plato’s came and went in the Hellenic world until the very end of the ancient period.

From the first century onward, Christianity began to dominate the intellectual life of the West, gradually replacing the more secular philosophy of classical antiquity. By the time the Western Roman Empire collapsed, most learning was concentrated in the Church. Literacy rates dropped throughout Europe, and the secular members of the upper classes found they were too busy fighting for survival to devote time to scholarship. The Eastern Empire survived and was spared the worst effects of the Dark Ages, but the Byzantine mind was increasingly inclined towards mysticism and away from rational scholarship.
In 529 the emperor Justinian ordered the closure of the last incarnation of the Athenian Academy, an event which some historians consider to be the official end of the ancient era and the beginning of the medieval period.

For nearly 1000 years, the church, particularly the monasteries, had a virtual monopoly on scholarship. Nearly everyone who learned to read and write was taught by clerics, and most of the books that still survived were the property of the church. The first universities were an outgrowth of earlier monastic schools and existed mainly to train priests and church officials2. Even the rare lay scholars who did not accept ordination pursued their studies within the church organization, or not at all.

All of this began to change around the end of the 15th century. The invention of the printing press and the availability of paper drastically lowered the cost of books. Rising economic prosperity allowed more lay people among the upper class and the emerging bourgeoisie the luxury of an extended education. Since the 13th century, classical works which had long been lost in Europe but had survived in the Islamic world had been making their way back into the libraries of the West. Now they could be purchased and read by the laity. A new kind of intellectual began to emerge throughout Europe to help build the modern age.

These Renaissance men had more in common with the scholars of classical Athens than with the monks of the Middle Ages. Typical of them was Niccolò Machiavelli, a Florentine politician. After ending up on the wrong side of a coup, he found himself unemployed and was forced to retire to the countryside. His best known work, The Prince, was an unsuccessful attempt to showcase his knowledge of political science and recent history in the hope that a powerful noble would notice and offer him a position. Permanently shut out of politics, he consoled himself by reading the classics and writing a scholarly commentary on the works of Livy. Machiavelli might be the first successful outside scholar of the modern age. In fact, at least some historians feel that the publication of his works marks the start of the modern age3. Machiavelli was a layman and out of favor with the establishment. His major works were not published until after his death and were officially banned by the Church. Even today The Prince, while widely read, remains controversial. Despite this, Machiavelli’s eventual influence on western thought is incontestable.

Perhaps the greatest of all the Enlightenment outsiders, though, was Spinoza. Born in 1632 to a family of Portuguese Jews who had fled to Amsterdam to escape the Inquisition, he showed a scholarly turn of mind and was initially expected to become a rabbi. His curiosity soon drove him beyond the Torah, Talmud, and orthodox Judaica into the Cabala and other esoteric studies. Then, after taking Latin lessons from a gentile freethinker, he proceeded to devour every philosophical text he could find, from Aristotle to Descartes. By then the young philosopher was beginning to harbor theories that made the elders of the synagogue extremely nervous.

Intellectual life among the Dutch Jews of the 17th century was closely circumscribed. Holland was one of the only places in Europe that was not closed to them in that period, and they remained only at the sufferance of their Protestant Christian hosts. Driven by the dual imperatives of maintaining their cultural unity and avoiding offense to the Christians, they focused their studies on the Torah and avoided dangerous speculation. Young Spinoza, who had begun saying things like “Angels are probably only hallucinations” and “The Bible uses figurative language and isn’t meant to be taken literally,” was not just a destabilizing influence but all too likely to bring down the wrath of the Christian majority on the Jewish community.

At the age of 24 Spinoza was given a choice: he could either accept an annuity of 1,000 florins in return for keeping his unorthodox theories to himself, or he could be excommunicated from the Jewish faith. He chose excommunication. Europe had recently concluded a series of brutal wars of religion between Catholics and Protestants which raged intermittently for 126 years. Religious affiliation was still the single most important factor in the personal identity of most people, and not to belong to an organized religion was unthinkable. Yet Spinoza never converted to another faith. Changing his first name from Baruch to Benedict, he moved into an attic apartment and spent the rest of his life writing books on philosophy while supporting himself by grinding lenses. Later, when his reputation began to grow, he turned down financial support from Louis XIV of France and even a prestigious university professorship on the grounds that accepting money from the government would irrevocably compromise his freedom to philosophize.

Of his five works (one unfinished), only two could be safely published during his life: a commentary on the philosophy of Descartes and the Theologico-Political Treatise, which was immediately placed on the Index of banned books and had to be sold with a false cover and only the author’s initials on the title page. Among the inflammatory ideas contained in the book is the notion that the Bible is written in figurative language, and that the key to understanding it is to study the historical, biographical, and cultural context in which its authors lived:

The universal rule, then, in interpreting Scripture is to accept nothing as an authoritative Scriptural statement which we do not perceive very clearly when we examine it in the light of its history.

… such a history should relate the environment of all the prophetic books extant; that is, the life, the conduct, and the studies of the author of each book, who he was, what was the occasion, and the epoch of his writing, whom did he write for, and in what language. Further, it should inquire into the fate of each book: how it was first received, into whose hands it fell, how many different versions there were of it, by whose advice was it received into the Bible, and, lastly, how all the books now universally accepted as sacred, were united into a single whole.

All such information should, as I have said, be contained in the ‘history’ of Scripture. For, in order to know what statements are set forth as laws, and what as moral precepts, it is important to be acquainted with the life, the conduct, and the pursuits of their author: moreover, it becomes easier to explain a man’s writings in proportion as we have more intimate knowledge of his genius and temperament.

Further, that we may not confound precepts which are eternal with those which served only a temporary purpose, or were only meant for a few, we should know what was the occasion, the time, the age, in which each book was written, and to what nation it was addressed. Lastly, we should have knowledge on the other points I have mentioned, in order to be sure, in addition to the authenticity of the work, that it has not been tampered with by sacrilegious hands, or whether errors can have crept in, and, if so, whether they have been corrected by men sufficiently skilled and worthy of credence. All these things should be known, that we may not be led away by blind impulse to accept whatever is thrust on our notice, instead of only that which is sure and indisputable.

Today, this viewpoint is at the core of all but the most fundamentalist Judeo-Christian Bible study, but it was revolutionary in 1670. In fact, the Theologico-Political Treatise is barely studied or quoted today, except by historians, because most of its arguments are now taken for granted in mainstream western thought.

Spinoza’s greatest work is his Ethics, which solidified his reputation, along with Descartes and Leibniz, as one of the three great rationalist philosophers. It would be hard to exaggerate the extent of Spinoza’s influence on the subsequent course of modern philosophy. His impact on Judaism, once his people were ready to reclaim him, was equally pervasive. He has been called “the first modern secular Jew” and credited with originating many of the core ideas of Reform Judaism.

Even as Machiavelli, Spinoza, and numerous other freethinkers were revolutionizing Western thought from outside any organized intellectual establishment, new forces were making themselves felt throughout Western Civilization4. Universities, which had first appeared in the medieval period, multiplied through the modern period, first in Europe and then in the New World5. Meanwhile scholars and learned professionals, seeing the value of communication and collaboration, began to organize themselves into societies. Typical of these was the Royal Society, founded in 1660, of which Henry Oldenburg, one of Spinoza’s closest friends, was the first secretary. The often-overlapping influence of the universities and societies on the growth of knowledge was overwhelmingly positive. However, as time went on a divide began to appear between the “elite” scholars who attended and taught at universities and/or belonged to scholarly societies and the “amateur” scholars who did not. A new Academy was forming which had the power to grant or withhold approval and legitimacy for scholarly efforts.

The implicit narrative became that outside scholars were undisciplined and underprivileged. By the end of the Enlightenment, efforts were being made to bring the most brilliant of them into the fold, and many accepted joyfully. Spinoza was exceptional in turning down a university position when it was offered. More typical was Samuel Johnson, that brilliant titan of English letters, who was given an honorary doctorate and referred to as “Dr Johnson” by academics forevermore. Benjamin Franklin, a self-educated man who spent his early career as the archetypal outside scholar, happily accepted his own honorary doctorate and membership in the Royal Society in later life, glorying in his hard-won academic legitimacy.

As time went on, it became harder even for exceptional outsiders to gain admission to the ivory tower of academia. The Academy had emerged as a new international priesthood, with a hold over scholarship almost as strong as the church had enjoyed in the previous age. Only those who had served their novitiate and professed appropriately orthodox dogmas could be ordained.

Rise of the Modern University

While universities first appeared in the Middle Ages and can, at least in theory, be placed in the tradition of higher education which began with the Athenian Academy, most of the traits which we associate with the modern university first appeared in the 19th century. It was in this period that two major schools of thought emerged which still shape thinking about the role of the university. One of these viewpoints was articulated by Cardinal John Henry Newman in a series of lectures given in Dublin in the 1850s. Newman’s view was shaped by his own experiences at Oxford which, like the other “ancient universities” of the British Isles, was then in the process of transitioning from training aristocrats to providing a liberal education for the new class of skilled bourgeoisie. He argued that the primary role of a university was to provide a generalized education. Research was a less important mission than teaching. Indeed, research could be more efficiently conducted outside the university:

The view taken of a University in these Discourses is the following:—That it is a place of teaching universal knowledge. This implies that its object is, on the one hand, intellectual, not moral; and, on the other, that it is the diffusion and extension of knowledge rather than the advancement. If its object were scientific and philosophical discovery, I do not see why a University should have students; if religious training, I do not see how it can be the seat of literature and science. … …there are other institutions far more suited to act as instruments of stimulating philosophical inquiry, and extending the boundaries of our knowledge, than a University. Such, for instance, are the literary and scientific “Academies,”… … To discover and to teach are distinct functions; they are also distinct gifts, and are not commonly found united in the same person. He, too, who spends his day in dispensing his existing knowledge to all comers is unlikely to have either leisure or energy to acquire new.

The Newman model of the university’s mission was highly influential in the United Kingdom and, to a lesser extent, in American liberal arts colleges.

Meanwhile, in Germany, another model was emerging, based on the University of Berlin, founded by Wilhelm von Humboldt in 1810. In the Humboldt-type university, teaching and research were inseparable. The university was a sort of knowledge factory. Students learned by being involved, albeit at a very low level, in the critical investigation of truth. The overall prestige of a university was based on the quality of the research it generated. The Humboldt model became wildly popular on the continent because Humboldt-type research systems were seen as a major factor in Germany’s economic growth. When the US began building its state university system with the passage of the Morrill Acts in 1862 and 1890, the Humboldt model was taken as the template for the ideal public university.

Until World War II most new universities in Europe and the Americas were based on the Humboldt paradigm. After the war, however, the pressure to provide mass education to all citizens, combined with population pressure from the baby boom and the passage of the GI Bill in the US, which allowed returning soldiers to finance higher education, created demand for a third type of university. Neither Newman nor Humboldt type schools were physically capable of absorbing the influx of new students, which pushed student-to-faculty ratios to an historic high. Nor were the new, primarily first-generation, students particularly interested in gaining a generalized liberal education or engaging in research. They came to school to learn technical skills and gain specialized diplomas which would increase their incomes. In response to this demand, the second half of the twentieth century saw a wave of new polytechnic schools, vocational schools that reinvented themselves as “technical universities”, and, finally, for-profit “universities”. At these new schools basic research, if conducted at all, was a distinctly secondary pursuit. The need for faculty at these institutions paved the way for a type of second-class academic whose primary job was lecturing to students who would never themselves become scholars.

Older universities, forced to compete with the new technical schools for funding, faculty, and students, began to adopt some of their traits. Student-to-faculty ratios rose, universities began doing more applied research, and an increasing number of specialized professional degree programs appeared in catalogs. Many older universities added professional schools, which allowed them to attract talented students who might otherwise go to a technical university, while charging them tuition at a much higher rate than that for “research” graduate degrees. In 1908 Harvard began offering a new graduate degree, the Master of Business Administration (MBA), which was essentially a vocational diploma for corporate executives. Other major research universities rapidly followed. Today the MBA is the most awarded graduate degree worldwide. Some MBA students are involved with research and a few go on to PhD programs, but the degree is not seen as preparation for a research career. In most business schools that offer PhD programs, MBA and PhD candidates are admitted based on different criteria and are almost completely segregated from each other throughout their studies. An MBA, even a talented researcher, has almost no chance of landing a tenure-track academic job after graduation. There are around 800,000 of them graduating every year, and every one of them who chooses to do research is, by definition, an outside scholar6.

As a result of these decades of competitive convergence, the typical state university of today has a case of institutional schizophrenia. One side of the split personality is a Humboldtian research university in which research teams, led by tenured professors assisted by a chosen few students, spend their time competing for grant money and cranking out papers. The other side is a career school in which lecturers and graduate teaching assistants cater to legions of undergraduates’ and professional students’ need for diplomas which will allow them to take their places among the ranks of the bourgeoisie.

The same period in which the university attained its final form has seen the stratification of the scholarly community into four rigid castes, with relatively little mobility between them. The two upper castes make up the Academy, while the two lower castes are outsiders. At the top are the professional researchers. Most often they are tenured professors at a research university, or hold an analogous position at a public or private research facility. This caste not only has little trouble getting its research published and accepted but, because its members control the peer review process, conference agendas, and PhD committees, is able to give or withhold the stamp of legitimacy to scholars of the lower castes. Below them are the lecturers, scholars who have either failed to reach the upper caste or whose main interest is education. Their main function is undergraduate and professional education, but if they can somehow find the time and money for research they can often get it published. Below them are the professionals who hold specialized doctoral or master’s degrees in law, business, medicine, engineering, education, or other fields. They are generally able to publish applied research in their own field, usually under the auspices of a professional association, but are discouraged from pure or theoretical research. At the lowest level are the autodidacts. These scholars, no matter what their level of interest, ability, and knowledge, have not managed to obtain the graduate degree which is the minimum requirement for scholarly legitimacy. In general, they have no access to journals, conferences, or “respectable” academic presses, and are totally ignored by the academy. The avenues open to them to communicate their work (“popular” nonfiction, Internet blogs, and predatory for-profit journals) have little reach even among their own caste.

One of the most universal traits of all four castes is specialization. Despite a certain amount of lip service to multidisciplinary or interdisciplinary scholarship, 21st century scholars tend to confine their work to incredibly narrow disciplines. The typical modern scholar is thus defined by their place in a rigid system which labels and circumscribes them according to type of (or lack of) institution, rank, and specialty. There is no place in such a system for a Benjamin Franklin, a Francis Bacon, or even an Aristotle or Spinoza.

Historian John Lukacs explains this phenomenon as part of a process of bureaucratization which has continued in all aspects of Western Civilization throughout the modern age, reaching new heights in the twentieth century: “In this increasingly bureaucratized world, little more than the possession of various diplomas mattered. Since admission to certain schools–rather than the consequently almost automatic acquisition of degrees–depended on increasingly competitive examinations, the word ’meritocracy’ was coined…In reality the term ’meritocracy’ was misleading. As in so many other spheres of life the rules that governed the practices and functions of schools and universities were bureaucratic rather than meritocratic.” Securing admission to a program and earning a degree is only the first step for someone seeking an academic career. In the US it takes around ten years for the average PhD student to earn their degree, counting from the receipt of their bachelor’s. Once they take the examinations and submit the copious paperwork to gain admission to a program, they are presented with a list of required courses, further exams, and residency requirements for the degree. The only requirement designed purely to test the student’s skill as a writer and researcher is the dissertation, and even here following the correct format and submitting the appropriate paperwork often becomes nearly as important as the actual scholarship. In many fields, particularly the physical sciences, the PhD program is not even seen as adequate preparation for independent research, and students are expected to spend further years in one or more “post-doc” research appointments to gain experience.

Newly minted PhDs are next subjected to yet another “meritocratic” sorting process. The lucky and well-connected are placed in tenure-track positions as assistant professors. The second tier secure positions as lecturers, second-class faculty who have no prospect of tenure and are expected to teach heavy course loads to free up the professors for research. The rest, an increasing percentage of the total, eke out a living as part-time adjunct instructors, often commuting to three or more schools in a week in order to earn a living income. These “gypsies”, as they are referred to by their more fortunate colleagues, live in hope that a full-time position will materialize, but the odds are stacked against them. It is hardly surprising that so many PhD students either fail to complete their degree or, having obtained it, give up and leave academia forever. Some of them have no choice: a gap in employment of more than a few months, or too much time spent as an adjunct, is often seen as a black mark on an academic’s career, permanently excluding them from consideration for full-time positions7.

As for the lucky few, the small percentage of scholars who make it onto the tenure track, they are privileged to spend the next six or seven years working sixty-hour weeks while they accumulate the requisite ticket punches for promotion. If all goes well they gain tenure around year seven, finally achieving full membership in the academy. If something goes wrong, or the university simply decides that it doesn’t need any more associate professors at the moment, they are thanked and excused and leave to start over from the beginning.

An assistant professor working towards tenure has no incentive to take risks. A large volume of acceptable publications is always less risky than a few brilliant ones. Research that is too controversial, or that steps on the toes of a member of the tenure committee, can easily wreck a career. Some of them tell themselves that they will play it safe until they get tenure, then work on the projects that they really want to do. A few follow through on this, but it is hard to radically change the direction of one’s research after seven years of escalating commitment. Many, after spending two decades of their research career playing it safe, have no idea how to take risks even if they want to.

Everything in the career path of an academic selects for risk-averse individuals who know how to play the system. Successful professors have all the character traits of a career bureaucrat. Worse, by the time they achieve tenure they have been thoroughly socialized to look down on any scholar who has not managed to survive the same process. At the same time, they have spent years acquiring narrowly specialized knowledge, working mostly with people in the same discipline, and being warned by their mentors not to have opinions or do work outside their field8.

American research universities are incredibly good at their main function, which is rigorous, deep research in narrowly defined areas. They focus on training the kind of scholars they need for this mission. Unfortunately, these specialized professors are much less effective at some of the other functions which have traditionally been associated with scholars. Teaching, particularly at the undergraduate level, is generally fobbed off on lecturers and graduate students. Practical applications, particularly those involving interdisciplinary knowledge, tend to be the province of corporate R&D organizations, where researchers are expected to pursue projects that will make a profit for the company and where findings are shared with competitors only when it is in the company’s interest. The task of advising policymakers is carried out by staff intellectuals at government agencies, which are, more or less by definition, even more bureaucratic and conservative than the universities.

But what of those scholars who follow the more traditional model, more like the great thinkers of the ancient world and the Enlightenment? What about those who left the academy after earning a graduate degree, whether PhD, master’s, or professional, but still have an interest in doing real scholarly research, creating knowledge, or affecting public policy? What about autodidacts who never had a formal education at all but, after a lifetime of reading, are now ready to write serious nonfiction works? Is it even possible for these outside scholars to make a contribution in the modern era?

So far in this book, I have deliberately avoided writing any autobiographical details because I felt it would distract from the purpose of the work. Now, however, in the interests of full disclosure, I must mention that I too am one of these outsiders, and the answers to these questions affect me personally. I attended professional school at a major research university, earning an MBA. While there I did original research and completed a thesis which was later published as my first book. Several professors strongly urged me to continue on and finish a PhD. Upon examining what would actually be required, and the personal and family sacrifices that I would need to make, I decided that it wasn’t worth it. I am still doing primary research in my specialty, but I am finding every aspect of it more difficult now that I am no longer affiliated with an institution: it is much harder to obtain grant funding, I have trouble getting the journal and database access I need, and I no longer have a departmental fund to pay my way to conferences. When I go to publish in journals I find that the burden of proving my credibility is on me; without the name of an institution under my byline, the assumption is that I don’t have the qualifications to publish. I am far from the only one in this situation, though. Later, I will talk about some of the changes which are making life easier for us.


  1. Read together, Books V-IX of Herodotus’ Histories, Thucydides’ History of the Peloponnesian War, and Xenophon’s Hellenica form a continuous trilogy of the history of Greece and her neighbors from just before the Greco-Persian Wars up to the aftermath of the Peloponnesian War, a period of approximately 136 years.
  2. Note the modern similarity between academic regalia and monastic habits.
  3. Allan Bloom argues that Machiavelli was the philosopher who began the Enlightenment. According to Bloom, it was Machiavelli who first suggested that the philosophers of western civilization, who had formerly been dependent on the patronage of the aristocracy, should “change camps” and espouse democracy, reason, and the theory of rights–some of the most characteristic concepts of the modern age–as these would create a society that offered them greater protection and scope for their talents.
  4. My discussion has necessarily been limited in scope to the history of Western Civilization. Other societies have their own scholarly traditions and institutions, some of which predate Western civilization itself. Likewise, they have had their own outside scholars who toiled outside the scholarly establishment and gained legitimacy and influence only late in life or even centuries after their deaths. Confucius is but one example. As the modern age continued, however, the ruling and intellectual classes of the East were increasingly educated by the Academy of the West. By the 20th century the Academy was completely international, and organized on the Western Model. See Eberhard.
  5. Even the destruction and upheavals of the Wars of Religion did little to slow the spread of universities. In fact, some of the most famous universities were founded as gambits in the struggle between Protestants and Catholics. For example, Trinity College in Dublin was established on the orders of Elizabeth I to educate the sons of her Protestant subjects in Ireland without subjecting them to the corruptive influences of Catholicism.
  6. During orientation on my first day of business school I raised my hand and asked an associate dean about research opportunities for MBA students. He laughed and said “If you want to do research, what are you doing in the MBA program? You should have applied as a PhD.”
  7. For purposes of discussion I have focused on the career path of scholars at a research university. Many PhDs also work for government agencies or for-profit research organizations which have their own bureaucratic hurdles.
  8. At American universities and schools in other countries that are based on the American model, the basic unit of organization is the department, which consists of all of the university’s specialists in a particular discipline. At English universities, on the other hand, the basic unit is the college, which will typically include one professor from each discipline. English professors, and European academics in general, also tend to be more involved with teaching and administration than their American colleagues. See Eagleton for a delightful overview of some of the differences.

Bibliography

Anderson, Robert. “The ‘Idea of a University’ today.” History & Policy (2010). http://www.historyandpolicy.org/hp/research/papers/policy-paper-98.html.

Bloom, Allan David. The Closing of the American Mind: How Higher Education Has Failed Democracy and Impoverished the Souls of Today’s Students. New York: Simon & Schuster, 1987.

Copulsky, Jerome E. “The Last Prophet: Spinoza and the Political Theology of Moses Hess.” University of Chicago Divinity School, 2008. https://divinity.uchicago.edu/sites/default/files/imce/pdfs/webforum/032008/copulsky_last_prophet.pdf.

Durant, Will. The Story of Philosophy: The Lives and Opinions of the World’s Greatest Philosophers. Kindle Ed. Aristeus, 2014.

Eagleton, Terry. Across the Pond: An Englishman’s View of America. New York: W. W. Norton, 2013.

Eberhard, Wolfram. A History of China. 3rd ed. [org. pub. 1969]. Project Gutenberg, 2006. http://www.gutenberg.org/ebooks/17695.

Herodotus. The Persian War. Translated by William Shepherd. Cambridge; New York: Cambridge University Press, 1982.

Hoffer, Thomas B., and Vincent Welch. Time to Degree of U.S. Research Doctorate Recipients. National Science Foundation Directorate for Social, Behavioral, and Economic Sciences, March 2006. http://www.nsf.gov/statistics/infbrief/nsf06312/.

Lukacs, John. At the End of an Age. New Haven: Yale University Press, 2002.

Machiavelli, Niccolò. The Prince. Translated by George Bull. London; New York: Penguin Books, 2003.

Newman, John Henry. The Idea of a University Defined and Illustrated In Nine Discourses Delivered to the Catholics of Dublin. Project Gutenberg, 2008. http://www.gutenberg.org/ebooks/24526.

Newman, John Henry. The University: Its Rise and Progress. Edited by Kevin A. Straight. Montrose, CA: Creative Minority Productions, 2015.

O’Brien, Keith. “The Ronin Institute for wayward academics: a bold new idea to solve the PhD crisis.” Boston Globe (May 27, 2012). https://www.bostonglobe.com/ideas/2012/05/26/new-idea-for-unemployed-academics/UUZOGe1KNWvUXDl7Yae1IL/story.html.

Spinoza, Benedictus de. The Ethics of Spinoza: The Road to Inner Freedom. Secaucus, N.J.: Citadel Press, 1976.

Spinoza, Benedictus de. Theologico-Political Treatise. Translated by R.H.M. Elwes. Project Gutenberg, 1997. http://www.gutenberg.org/ebooks/990.

Spinoza, Benedictus de, and Joseph Ratner. “The Life of Spinoza.” in The philosophy of Spinoza, [org. pub. 1926]. Project Gutenberg, 2010. https://www.gutenberg.org/ebooks/31205.

Thucydides. History of the Peloponnesian War. Translated by Rex Warner. Harmondsworth, Middlesex: Penguin, 1954.

Xenophon. Hellenica. Translated by Henry Graham Dakyns. Champaign, Ill.: Project Gutenberg, 2008. http://www.gutenberg.org/ebooks/1174.

 

Clouds of Aristophanes

Aristophanes’ play The Clouds is fascinating in a number of ways, not least because it contains one of the earliest literary mentions of Socrates.  Socrates, or at least the complex of ideas that Socrates came to represent, would become one of the most important figures in the Western tradition and the well-spring of one of the two most important strands of Western philosophy (the other of which would begin with Aristotle).  At the time of The Clouds, however, Socrates was just starting to become a salient figure–a well known local character, but not yet the famous philosopher who would be immortalized by Plato and others.

Aristophanes picked Socrates to be his caricature of a “modern” teacher at least partially because Socrates’ famously homely appearance would lend itself to a hilarious and recognizable mask.  When the Socrates character first came on stage in the original performance, the actual Socrates stood up so the crowd could admire the resemblance.  Shortly before this period Socrates seems to have spent considerable time talking to sophists and other pre-Socratic philosophers, prior to fully developing his own philosophy, so this portrayal as a sophist is not completely unwarranted.  On the other hand, the main criticism that Aristophanes levels against the sophistic school, that they are willing to argue both sides of an issue and are more concerned with the argument itself than the truth, is decidedly not applicable to Socrates’ mature philosophical methods, as portrayed by Plato.  Plato’s Socrates is only interested in understanding universal truths, and seeks them not through argument but by admitting his own ignorance and asking questions.  We must keep in mind, though, that The Clouds was written decades before Plato’s dialogues.

Plato’s Socrates rejects Aristophanes’ caricature in the Apology:

I will begin at the beginning, and ask what is the accusation which has given rise to the slander of me, and in fact has encouraged Meletus to prefer this charge against me. Well, what do the slanderers say? They shall be my prosecutors, and I will sum up their words in an affidavit: ‘Socrates is an evil-doer, and a curious person, who searches into things under the earth and in heaven, and he makes the worse appear the better cause; and he teaches the aforesaid doctrines to others.’ Such is the nature of the accusation: it is just what you have yourselves seen in the comedy of Aristophanes, who has introduced a man whom he calls Socrates, going about and saying that he walks in air, and talking a deal of nonsense concerning matters of which I do not pretend to know either much or little—not that I mean to speak disparagingly of any one who is a student of natural philosophy. I should be very sorry if Meletus could bring so grave a charge against me. But the simple truth is, O Athenians, that I have nothing to do with physical speculations.

We should remember, though, that the framing of this statement might represent a revisionist attempt on the part of Plato.  The Clouds was a popular play and many copies were made.  Plato might have been concerned that the play was tarnishing the memory of his teacher, and gone out of his way to refute the impression.

Antique Bust of Socrates, Paulus Pontius, 1638 [public domain via Rijksmuseum]


The basic plot of the play is that Strepsiades, whose son Phidippides has racked up huge debts in his name, goes to the “Think Shop”, a sort of school of sophistry run by Socrates.  His goal is to learn rhetoric so well that he can argue his way out of paying his creditors.  After finding that he is too old to follow Socrates’ logical acrobatics, he decides to send Phidippides in his stead.  Phidippides learns so well that he is later able to publicly beat his father and justify it so convincingly that no one can argue with him.

The Clouds, of course, is a story about the conflict between old and new systems of education.  The old system, represented by Strepsiades, emphasized military training and memorizing traditional poetry, preparing a young citizen to be a successful hoplite citizen-soldier.  The new system of the sophists was also practical, since it emphasized rhetoric and public speaking to make the student successful in lawsuits or the assembly.  To Aristophanes, who thought that his fellow Athenians were far too litigious, and who was at heart a social conservative, the new system would have provided a rich field for ridicule, even apart from the fact that generational conflict was a classic subject for comedy.  As is often the case with the deeply intellectual comedy of Aristophanes, however, there were deeper philosophical issues in play.

“What is the best form of education?” is one of the perennial philosophical questions.  We will meet it again repeatedly in the Great Books.  On a more meta level, the Great Books movement in general represents one side of a modern debate about education.  At the risk of oversimplification, Great Books proponents believe in a more traditional form of education based on the core literature and concepts of Western Civilization, as opposed to the newer “progressive” or “democratic” systems of education which emphasize relativism, openness, and inclusion of minority viewpoints.  The Great Books approach is based primarily on that used in ancient universities from the high medieval through early Victorian periods, as adapted by such Victorian reformers as John Henry Newman.  Its primary modern champions were Mortimer Adler and his associates.  More recently writers such as Allan Bloom, John Lukacs, and Donald Kagan, though they shy away from associating themselves with the Adler clique, have argued for a similar approach.  The progressive/democratic approach was first articulated in the works of John Dewey, reached its full realization during the culture wars of the 1960s, and is taught as dogma in nearly every Education graduate program today.

In the later Hellenistic world, particularly among the elite of the Roman Empire, the dominant educational philosophy that emerged was essentially a synthesis of the old gymnasium education, sophism, and post-Socratic philosophy.  This gives me hope that our own civilization may yet learn to balance the ideals of the Great Books movement with those of Dewey and his disciples.

Book Review: At The End of An Age

Lukacs, At The End of An Age, cover picture

At the End of an Age is a small book, and John Lukacs’ elegant yet simple prose could easily lull you into thinking it is an easy read.  It doesn’t take many pages, though, to realize that every paragraph in this book (or rather, book-length essay) is laden with complex ideas and meaning.  I found myself rereading whole pages to make sure I understood, and I suspect that I would need to read the whole book two or three times to pick up on all of his points.  That being said, the book is worth it.

As I mentioned in my previous post, the ostensible thesis of the book is that the modern age, which Lukacs calls the “bourgeois age”, is nearing its end.  He offers cogent arguments and examples in support and, in general, makes a strong case.  As it happens, I agree with him; I wrote something very similar on this blog a couple of weeks ago, before I had ever read Lukacs.  I think that anyone with some level of historical awareness can see that our civilization is gearing up for a drastic change.  Other historians I have read would have spent the entire book (or 12, in the case of Toynbee) expanding on their particular theory.  Lukacs, having laid out his arguments, then moves up to a higher, more meta-historical level.  Lukacs is interested not just in how history works, but in the epistemology and metaphysics of history and its relationship to the other sciences.  These are deep waters indeed.  Only Lukacs’ strong voice and skill as a writer keep the reader from sinking.  Since I lack his mastery, I will not attempt to explain his points here, but will merely mention a couple of his main themes.

Lukacs believes that in history, as in quantum physics, the phenomenon is ultimately inseparable from the observer.  The historian does not just record history but, in the act of writing it, actually influences and creates it.  This means that true objectivity is impossible for the historian, and that a purely deterministic conception of history is as obsolete as deterministic physics was after Heisenberg.  This matches up with comments I have occasionally made about history as a narrative.  History is based on fact but, ultimately, is a literary discipline.  The historian doesn’t just tell the story; he creates it.

Another major theme in the book is the role of the human mind in creating history.  Lukacs asserts that “the inclinations of men’s minds” and their beliefs are more important than their competence or any material factor.  “Mind” in this sense means consciousness or soul, separate from brain and body.  Lukacs believes in the power of the mind to influence reality and manifest different potentialities.  Comparative metaphysics is far from my specialty.  However, this sounds very similar to the writings of various New Thought philosophers, particularly Ernest Holmes and his Science of Mind disciples.  I wonder to what extent the young John Lukacs was influenced by these metaphysical systems.  Regardless, the takeaway is that if a historian wants to understand a person or group he needs to go beyond studying their situation and strive to understand their minds.

Overall, I found many ideas in this book which I could agree with, or at least try on for size.  There were a few arguments, however, with which I did take minor issue.  In an early section of the book, as part of an overview of various ways the social structures of the current age are breaking down, he discusses the trend towards women’s equality in the workplace and announces that,

Women thought (or, rather, convinced themselves) that they were reacting against the age-old and often senseless categories and assertions  of male authority; yet their dissatisfaction often arose not because of the oppressive strength but because of the weakness of males.  The rising tide of divorces and abortions, the acceptance of sexual liberties, including pre-marital (and sometimes post-marital) habits of frequent copulation and other forms of cohabitation, the increasing numbers of unmarried women and single mothers, the dropping birth rate–thus the decline of the so-called “nuclear” family–were, especially after 1955, grave symptoms suggesting vast social changes.  They included the perhaps seldom wholly conscious, but more and more evident, tendency of many young women to desire any kind of male companionship, even of a strong and brutal kind, if need be at the cost of their self-respect. (pp. 23-24)

He offers no support for this complex, arguable, and potentially inflammatory claim.  This is not the sort of paragraph you just casually slip into a book without offering evidence to back it up.  This is the sort of thing which would have caused me, when I was still a teaching assistant grading papers, to circle the whole paragraph with red pen and write “BURDEN OF PROOF” in the margin.

Lukacs is also universally deprecatory of post-modernism in all of its forms, seeing it as a basically vague and degenerate direction for scholarship and culture.  That is a legitimate, if somewhat reactionary stance.  However, Lukacs, who escaped communist Hungary as a young man, is also blatantly anti-Marxist.  Since, as a historian, Lukacs could not help but be aware of the many contributions that Marxism has made to post-modern analysis and art, I have to question whether he might not be biased on the whole subject of post-modernism.

Finally, Lukacs is dismissive of any value in mathematics for the study of history.  As a “quant”, I feel compelled to respond.  As evidence, he cites his own non-deterministic, non-objectivist view of history as well as Gödel’s incompleteness theorems, which say that 1) any consistent formal system rich enough to express arithmetic contains true statements which cannot be proven within the system, and 2) no such system can prove its own consistency.  Personally, I have been fascinated by Gödel’s theorems since I first studied them in an Abstract Algebra class that I took as a college junior.  As an illustration of the general idea, consider Euclid’s geometrical system, as set down in the Elements.  Euclid begins with the definition “A point is that which has no part.”  The entire system rests on this foundation, yet there is no way to justify it using only Euclidean geometry.  You would need to introduce propositions from topology and/or calculus, which are themselves systems containing propositions that cannot be proven without introducing even more powerful systems of mathematics.

Kurt Gödel in 1925 [public domain via Wikimedia]


And yet, geometry works quite well enough for most purposes, as do topology and calculus.  Granted, the incompleteness theorems seem to imply that a grand unified theory of history, in the sense of a closed-form solution (plug all the variables into the equation, predict what will happen next), is impossible.  But applied math and statistics are about approximations, empirical formulas, noisy data, and models that work “well enough”, with a quantifiable margin of error.  The incredible advances over the past fifty years in fields like data mining, complexity theory, machine learning, and signal processing have paved the way for a useful discipline of mathematical history, probably within our own lifetimes.  Such a system will only be one more tool for the historian to use, and the results must not be allowed to dominate the historical narrative itself.  But to dismiss all mathematical history out of hand because it will not be an internally provable system seems like a major error.  Even in a non-deterministic universe, mathematical modeling can still provide startling and useful insights.
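To make the contrast concrete, here is a minimal sketch, in Python, of the “well enough” style of modeling I have in mind.  The data are entirely invented for illustration (a made-up fifty-year series generated as a trend plus random noise, standing in for a real archival series); the point is only that an empirical fit comes packaged with a quantifiable margin of error, here the residual standard error.

```python
# A toy illustration of empirical modeling: fit a line to noisy
# "historical" data and quantify how rough the approximation is,
# rather than seeking an exact, closed-form law.
import math
import random

random.seed(42)

# Hypothetical data: some quantity tracked over 50 years, generated
# here as a trend plus Gaussian noise. In real work this would come
# from archives, not a random number generator.
years = list(range(50))
true_slope, true_intercept = 0.8, 10.0
data = [true_intercept + true_slope * t + random.gauss(0, 2.0) for t in years]

# Ordinary least-squares fit (closed-form for a single predictor).
n = len(years)
mean_x = sum(years) / n
mean_y = sum(data) / n
sxx = sum((x - mean_x) ** 2 for x in years)
sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, data))
slope = sxy / sxx
intercept = mean_y - slope * mean_x

# Residual standard error: the quantifiable "margin of error"
# that tells us how well the empirical model actually works.
residuals = [y - (intercept + slope * x) for x, y in zip(years, data)]
rse = math.sqrt(sum(r * r for r in residuals) / (n - 2))

print(f"fitted slope {slope:.2f}, intercept {intercept:.1f}, "
      f"residual std. error {rse:.2f}")
```

No claim to a law of history here: the fitted line is an approximation, and the residual standard error states exactly how rough an approximation it is, which is all that applied modeling ever promises.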

Despite these minor qualms, I truly enjoyed this book and would recommend it.  Overall, in fact, it is the kind of book I would like to write myself some day.  I will absolutely be reading (and probably reviewing) more of Lukacs’ works in the future.

John Lukacs on Intellectual Bureaucratization

I am currently reading historian John Lukacs’ At the End of an Age.  I only discovered Lukacs’ work fairly recently, after reading a review of one of his other books on David Withun’s blog.  He is an insightful and very readable author whose platform happens to match my own in several interesting ways; I’m sure I will be mentioning him again in the future.

John Lukacs speaking at Eastern University in 2009 (copyright by Eastern University.  fair use justification:  this is a low resolution still from a video, used for educational purposes.  No public domain substitute is available.  There is no foreseeable financial impact on Eastern University.)


I will probably do a full review of the book as soon as I finish it.  At the moment, however, I would like to respond to one of the ideas which he discusses multiple times in the first half of the book: intellectual bureaucratization.  The main thesis of At the End of an Age is that the modern era of history, which began around the end of the fifteenth century, is now drawing to a close.  One of the main attributes that Lukacs points out as differentiating the modern era from previous eras is the massive growth of bureaucracy in every area of human existence.  This trend is even evident–in fact especially evident–in the pursuit of knowledge.  Lukacs points to the increasing tendencies towards specialization, the need for credentials, and the drive to place intellectuals within some sort of larger organization.  He points out that the words “writer”, “scholar”, “philosopher”, and “intellectual” were once essentially synonyms but have now come to mean very different things.  The word “scientist”, meaning a philosopher who cultivates scientific knowledge, did not even appear in print until 1840.  Now “scientist” usually implies a practitioner of the natural sciences who has little or no connection with philosophy.

In many ways, this parallels an argument in Allan Bloom’s The Closing of the American Mind, which I reviewed back in December.  Like Bloom, Lukacs infers a connection between the growth of democracy and academic specialization.  Bloom, however, argued that the preeminence of democracy was itself the result of the concerted efforts of philosophers from Machiavelli on to ensure their own comfort and survival.  Lukacs, so far at least, has identified no such causal relationship.

Like Bloom, Lukacs points out that education is increasingly concerned with credentials such as specialized college degrees which are designed to fit students into particular pigeonholes in society and assign them a label (e.g. “physicist”, “business analyst”, “journalist”) that equates what they are with what they do.

Thinking on my own career, I realize that I have often chafed against this phenomenon of intellectual bureaucratization.  My “official” academic specialty is operations research, the field which is concerned with using certain techniques of applied mathematics to find optimal solutions to problems in management and engineering.  I have succumbed to societal pressure and acquired certain credentials in this area, such as my MBA degree with a concentration in operations.  In my final terms in graduate school I was strongly urged by my adviser and others to continue to a PhD program so I could become a “real” operations researcher.

But I never wanted to label myself that way.  I don’t think of myself as an operations researcher.  I am an intellectual.  Operations research is a set of useful tools which I use to understand the world and create knowledge; it isn’t what I am.  Nor do I think of myself as a scientist, a philosopher, a scholar, or even a writer, though I do think that each of these is an important facet of the intellectual life.  Even the label of “intellectual” is limiting.  If I am an intellectual, does that mean that I’m less of a worker, or an artist, or a homemaker, or any of a dozen other roles which I fill?

Society, though, at least in this age, is very uncomfortable with anyone who doesn’t wear a label.  If someone at a cocktail party asks me “What do you do?” they become flustered when I don’t have an easy answer.

At the beginning of the modern era a PhD, or “doctor of philosophy” degree, was a general degree, because philosophy was the discipline that included all of the others.  A PhD was someone who had achieved a breadth and depth of knowledge in all areas of philosophy sufficient to provide a liberal education to students.  Nowadays, though, PhDs are incredibly specialized.  The professors and PhD students I knew at school were entirely focused on publishing in the “hot” areas of their own disciplines, to the extent that they would refuse to consider or comment on questions in other fields.  Many operations PhDs will not even answer a student’s question about economics or finance, even though these are closely related disciplines.  Many of them teach the same two or three classes year after year and become offended when asked to take on a course which is outside their own research interests.  I enjoy both research and teaching, but was horrified at the idea of doing either at that level of specialization.  Lukacs is right: an intellectual who is willing to be labeled and limited in such a way has become a bureaucrat, a cog in a machine which is supposed to create knowledge, but mostly just produces citations and degrees.

For a few months now I have been reading my way through the Great Books and publishing my responses to them on this blog.  One of the things all of the authors of the Great Books have in common is that none of them allowed themselves to be cogs in a machine or limited themselves to considering one narrow area of study.  As a writer, there is only a slim chance that I will myself produce the next great book.  On the other hand, if I devoted the rest of my life to being a professor in some narrow area of operations research, there would be no chance I would write such a work at all.

Perhaps things will be different in the next era of history, and people will be able to just be people, without labels that fit them into a bureaucracy.  Perhaps Lukacs is right and that next era is coming soon.  I rather hope so.