Feeling jet-lagged after a great weekend trip to London. On the flight back, riding business class, I finally finished the last of the three essays in Nietzsche’s “On the Genealogy of Morals.” It started slow, but the last half was an excellent argument about how Christian morality has so embedded itself in the West that even being an “atheist” is in some way just another stage in a long ascetic tradition – in this case, denying oneself the possibility of God’s existence.
There was also a passage about the strong versus the weak that resonated with me because of its arguments about herd mentality and meetings. I have always felt that meeting culture – “touching base,” having “a quick chat,” scattering 30- and 60-minute calendar blocks across the week for matters that merit five minutes at most – was one of the most regrettable aspects of the U.S. workplace. It’s like the corporate equivalent of church. So imagine my delight at this segment:
“[I]t should not be overlooked: the strong are as naturally inclined to strive to be apart as the weak are to strive to be together, when the former unite, this takes place only with a view to an aggressive collective action and collective satisfaction of their will to power, with much resistance from the individual consciences; the latter, on the contrary, gather together with pleasure at this very gathering, – their instinct is just as satisfied in doing this as the instinct of the born ‘masters’ is basically irritated and unsettled by organization.”
Meetings and gatherings of any kind, especially ones that involve, say, at least three people, are usually a waste of time for individuals who do their best work on their own. Being a cynic, I have often thought that the purpose of most corporate meetings is exhaustion – bringing people together in an ‘all-hands’ environment in which attention spans are tested and things are agreed to when no one is paying full attention. The opinions of people who don’t feel comfortable in the superficial environment of meetings – where the ‘best’ argument doesn’t always win and is often overcome by the best-sounding one – are also crowded out.
I will write more about the last part of “On the Genealogy of Morals” later since it is really a treasure-trove of useful contrarian arguments against 21st century attitudes toward work. For now, though, I’ll note that Nietzsche talks about how the appeal of religion and the act of congregation – which these days has shifted in the U.S. from churches to workplaces – is the product of poor physical well-being (which needs some kind of relief) and wanting to be someone else. I can agree with that.
Body and Soul
The word “soul” has too much baggage in English. First encounters with it usually come in the context of religious instruction, maybe at a parochial school, or in some vague discussions about a certain, indefinable verve in music or performance. Similarly, there’s the entire genre of ‘soul’ music, which has very specific stylistic and even racial associations.
When I was in elementary school, I remember flipping through a science textbook with diagrams of the body and looking in vain for the physical location of the soul. It had to be large, I thought, so why was there nothing big and yellow (not sure why I thought it would be that color, but who knows what makes a 7-year-old think what he does) labeled “soul”?
Over the ensuing years, I felt stupid for having expected to find it on a diagram (‘souls don’t exist, silly’), but in retrospect I think my feelings were due to the religious connotations of ‘soul’ in English. In Greek, “soul” is not bound up with the notion of something contained in the body that escapes it upon death and is “judged” by an egotistical maniac. This is clear from its etymology; psyche, the Greek word for ‘soul,’ has meanings that suggest “breath” and “life,” even “refrigeration,” kinda (katapsuxis).
The Mass-Production of Souls?
After finishing Aristotle’s “On Interpretation,” which I talked about yesterday, I moved on to “On the Soul,” in which the author seems as fascinated with the soul’s relationship with the body – irrespective of religious design – as I was as a kid. He even takes some of the Presocratics to task for not making any specifications of “the bodily conditions required for it”:
“[T]hey do not try to determine anything about the body which is to contain it, as if it were possible, as in the Pythagorean myths, that any soul could be clothed upon with any body – an absurd view, for each body seems to have a form and shape of its own. It is as absurd as to say that the art of carpentry could embody itself in flutes; each art must use its tools, each soul its body.”
I like this observation since it argues against a sort of “mass-production of souls.” I have always been bothered by the idea of a god, any god, making humankind in an absurd assembly line in the sky, putting a standard-issue heart, brain, and soul into each one. I need to do more research here, but this strikes me as simultaneously an old notion (the prerequisite to the egalitarianism of ashes-to-ashes/dust-to-dust) and a new one (informed by the industrialization of the 19th century).
Totalitarianism and the Soul
One of the fundamental problems with monotheism is that it is so dictatorial. “God” is a male deity whose authority can’t be challenged and whose every absurd whim (the Abraham/Isaac story is the best example) must be obeyed.
Being in charge of a dark factory of mass production seems right up this god’s alley, as he amasses the masses that must be at his beck and call. The other overarching implication of the one-size-fits-all soul is that we are all limited to a laughably narrow range of options: do this and go to heaven, do that and go to hell.
Aristotle spends much of Book I of “On the Soul” reviewing Presocratic notions of the soul and whether it was fire, air, or water (the fourth element, earth, was apparently never put forth as a serious candidate for the essence of the soul, as Aristotle notes). This Presocratic notion of the soul somehow coming up from the elements, rather than down from some dictatorial headmaster, is refreshing and liberating – one I wish I had learned in the first grade instead of “where” my soul was destined to go.
Nouns and Greek texts
Looking back at elementary school, the earliest thing I remember learning was what a noun is. “A person, place, or thing” – that seems to cover all the bases. It’s the type of knowledge that quickly becomes second nature, only coming to mind in cases like interpreting a sentence that contains a gerund – an English noun that looks like a verb (e.g., “the happening is up ahead”).
Sixteen years after I learned what a noun was, I started reading Aristotle in Greek. Although Aristotle exerts tremendous influence on all of Western civilization – in every field from biology (which he started with his examinations of specimens brought to him by Alexander the Great) to theater criticism – I have never loved his ideas or stylistic flourishes as much as those of his teacher, Plato.
Some of his Greek texts seemed rough to me, requiring a lot of insertion of English words in the translation, whereas Plato’s writing was full of plays on words and syntactical arrangements that made it enjoyable in ways that English couldn’t reproduce. When translating, I felt like sometimes English was an upgrade for Aristotle, while it never was for Plato.
Nouns and sounds: Nounds?
I began reading Aristotle’s “On Interpretation” today, my first real brush with his work since 2007, when I was working with the “Nicomachean Ethics.” It won’t take me too long to finish, which is exciting after having recently read almost nothing but long philosophical tracts and novels.
Early on, Aristotle, like an elementary school teacher, sets the ground rules by defining what he means by a noun. He says:
“By a noun we mean a sound significant by convention, which has no reference to time, and of which no part is significant apart from the rest.”
I don’t have the Greek text with me (I’ll try to find an image of it later) but isn’t it strange that a noun is defined as a sound? Obviously, nouns are also written, soundlessly, on paper and word processors, but, as Aristotle notes, “written words are the symbols of spoken words.” It all comes back to speech.
Sounds and good and bad writing
This makes sense when you start to think about bad writing, more so than good writing. So much bad writing and so many bad ideas emerge because they have no predecessors in speech and would sound close to nonsense if spoken aloud. I’m thinking of all that business writing about “full-service solutions providers.” Jason Fried tore into it several years ago for Inc.:
“One of my favorite phrases in the business world is full-service solutions provider. A quick search on Google finds at least 47,000 companies using that one. That’s full-service generic. There’s more. Cost effective end-to-end solutions brings you about 95,000 results. Provider of value-added services nets you more than 600,000 matches. Exactly which services are sold as not adding value?”
All of these phrases sound horrible in conversation – even the people who write them wouldn’t utter them aloud in relaxed company. It’s like there’s nothing there; encountering the word “solutions” in text makes me instantly skip like 2 or 3 lines ahead to see if things get better. There may as well be no nouns on the page.
Aristotle is helpful here, too, in a strange way:
“[N]othing is by nature a noun or name – it is only so when it becomes a symbol; inarticulate sounds, such as those which brutes produce, are significant, yet none of these constitutes a noun.”
It’s a weird image that comes to mind for me here, as I equate brutes raving inarticulately with business writers ranting about best-of-breed management structures in ghostwritten columns or ‘touching base’ in their emails. What counts as “inarticulate,” though? A liberal interpretation, I suspect, could capture so much that is bad and nebulous about writing, particularly writing about technology.
Some terms, like “the Internet,” have become so vast as to be meaningless without first trying to figure out what they’re not – what is the Internet not, when it comes to technology? As I noted a few posts ago, the term has come to bind together software, hardware, networks, and many other disparate technologies under a single homogeneous label.
If it’s not everything, then it’s trying to become so by incorporating every device possible, through the “Internet of Things.” Sensors, “analytics,” and, yep, value-added services all pile into conversations about this term. All I know is that trying to write about “the Internet of Things” makes me sound like an inarticulate brute.
One time, while I was riding around Chicago with my dad, a car up ahead of us made a strange move, switching lanes and squeezing in front of a long line of cars. The car directly behind it (not ours) honked loudly and protractedly at this perceived offense. My dad quipped: “What’s the point? The offense is already past.”
These small acts of retribution, these mini punishments, though we take them for granted, require a strange and almost unnatural mindset. Cue Nietzsche (I’m just now finishing up his On the Genealogy of Morals; I have some other thoughts here and here):
“Throughout most of human history, punishment has not been meted out because the miscreant was held responsible for his act, therefore it was not assumed that the guilty party alone should be punished: – but rather, as parents still punish their children, it was out of some anger over some wrong that had been suffered, directed at the perpetrator, – but this anger was held in check and modified by the idea that every injury has its equivalent which can be paid in compensation, if only through the pain of the person who injures. And where did this primeval, deeply-rooted and perhaps now ineradicable idea gain its power, this idea of an equivalence between injury and pain? … [I]n the contractual relationship between creditor and debtor, which is as old as the very conception of a ‘legal subject’ and itself refers back to the basic forms of buying, selling, bartering, trade, and traffic.”
Criminals, for the most part, haven’t been punished because of the notion that they were free to have acted otherwise (i.e., not murdered, not stolen). Instead, they have been punished because some equivalence was drawn between the initial injury and the subsequent pain that the punisher could inflict – it’s all very transactional, and, really, the initial act – and all of its details of who did it and why – is already secondary to the drive to immediately get even via pain.
Looking back, that car horn episode was very much in the Nietzschean mold, and it revealed how profoundly weird the injury/pain calculus is. The injury (“offense,” as my dad said), if considered in the abstract, cannot be undone; that car cannot be flung out of its lane with precision so that only the perpetrator is punished. The injury either passes or is addressed through the infliction of pain, in this case by the noise of the horn (insignificant) and, I suppose, by the hope (on the part of the honker) that the horn will bring shame to the errant car’s driver.
Learn to tame the mammoth, though, and the latter “pain” bounces off. On that note, it feels like a lot of the punishment mechanisms in society, in addition to seeking to inflict existential damage (bankruptcy, starvation, extermination), also have this element of social shaming to them. Being unemployed, for instance, can be as painful for the social shame it carries as for the day-to-day panic of surviving if and when the money runs out.
There is a perceived injury – “not contributing to society,” which I don’t think is right, but it’s a common enough mindset – and the response is the infliction of pain, rather than something that would actually undo the injury, like…giving the person a job? I need to think more about this weird logic of getting even, maybe after I finish the final essay in this Nietzsche work.
One of my plans for 2015 is to read more books. Last year, a series of challenges – family moving to the country from the Philippines, my spouse switching jobs, moving out of our apartment of three years – pressed on me, and the only books I ended up reading were a few Orhan Pamuk novels, William Boyd’s Any Human Heart, Will Self’s Umbrella, and a handful of others like H.G. Wells’ The First Men in the Moon. As with albums, I feel that infrequent consumption – however common given myriad distractions – is a missed opportunity to keep learning.
So I kicked off the year by finishing David Foster Wallace’s The Broom of the System, which I thought was uneven but of enormous sentimental value (more so as I moved toward the end, with its mentions of Chicago). But its Wittgensteinian style nudged me to return to a subject I devoted plenty of time to in college but have neglected for the past five years. I moved on to Nietzsche’s On the Genealogy of Morals, a 19th-century philosophy tract that I downloaded as a PDF and read entirely on my iPhone 6 Plus.
‘Words, words, words’
We live in a time in which the humanities are under siege. Subjects such as English, art, and music are perceived by the American education system – and the corporations that reinforce it – as, at best, recreational options for the so inclined and, at worst, catastrophic distractions from an imaginary shortage of STEM graduates. It’s important, though, to realize that today’s colleges are by and large not even run by intellectuals, so anyone interested in the humanities shouldn’t feel bad when nagged about studying something as “irrelevant” as centuries-old literature or philosophy (the classic question “so what are you going to do with that?” says more about the asker’s limited imagination than about the respondent’s wisdom in life choices).
So: why study any of this, or why even read, as I have committed myself to doing more of in 2015? Why major in English? Why do anything that isn’t knee-deep in business speak or algorithms?
Start with words. Studying language with any seriousness is humbling. The student realizes how the meanings and societal controls imposed by seemingly simple arrangements of words on a page literally change our lives. Think about these examples:
- Only two of the New Testament evangelists (Luke and Matthew) assert that Jesus was born of a virgin. For this belief, they lean upon Isaiah 7:14, but the author of Isaiah uses the Hebrew word ‘alma,’ which means ‘young woman,’ not necessarily ‘virgin.’ Imagine the millennia of sexual anxiety and control – abstinence education, obsession with monogamy, etc. – tied up in this issue of the Hebrew-to-Greek translation of a single word modifying Jesus’ mother.
- America is the only industrialized, developed country without a healthcare system committed to universal coverage. Why? Partly because of a delusional belief in the self-made man, but also because of the idea that “universal coverage” is the same as “single-payer,” which isn’t the case. Moreover, the phrase “single-payer” is ominous and productive of meanings and images – someone footing the bill for millions of “undeserving” fellow humans – that would not necessarily play out in practice.
- In the Genealogy of Morals, Nietzsche points out that across multiple languages, the words equivalent to “good” and “bad” have associations with the aristocratic and plebeian classes, respectively. The Latin “malus” (bad), like the Greek “melas” (“black”; you know the latter from “melancholy,” literally “black bile”), is bound up with the idea of the dark-skinned men who predated the later settlers of the Italian peninsula. “Bonus,” in contrast, is related to “warrior” and associated with the chivalrous nobility. In other cultures, “bad” takes a backseat to “evil” as the foil to “good,” allowing speakers to shed the baggage of “bad” and instead institute a moral system that does not depend on class.
It’s as if words are really all we have. In addition to the above trio, just think of how the creators of words like “good” and “evil,” and of their opposition to each other (as Nietzsche notes), can with linguistic mechanisms alone simplify the entire field of interpretation and enable everything from church groups denouncing the “evil” of Satan to heads of state declaring that it is all a matter of “us vs. them.”
Ultimately, knowing how words are deployed to serve political and sociological ends is enormously liberating, at least for me as an informal (I’m no longer in school) student of the field. There’s a certain vitality in knowing that nothing has to be the way it is – that everything is the product of decisions, of what people have decided to utter aloud and put in writing.