
Thursday, September 19, 2024

Evil

Evil (pronounced ee-vuhl)

(1) Morally wrong or bad; immoral; wicked; morally corrupt.

(2) Harmful; injurious (now rare).

(3) Marked or accompanied by misfortune (now rare; mostly historic).

(4) Having harmful qualities; not good; worthless or deleterious (obsolete).

Pre 900: From the Middle English evel, ivel & uvel (evil), from the Old English yfel (bad, vicious, ill, wicked), from the Proto-Germanic ubilaz.  Related were the Saterland Frisian eeuwel, the Dutch euvel, the Low German övel & the German übel; it was cognate with the Gothic ubils, the Old High German ubil, the Middle Dutch evel and the Irish variation abdal (excessive).  The root has long been thought to be the primitive Indo-European hupélos (diminutive of hwep) (treat badly), which also produced the Hittite huwappi (to mistreat, harass) and huwappa (evil, badness), but an alternative view is a descent from upélos (evil (literally "going over or beyond (acceptable limits)")), from the primitive Indo-European upo, up & eup (down, up, over).  Evil is a noun & adjective (some do treat it as a verb), evilness is a noun and evilly is an adverb; the noun plural is evils.

Evil (the word) arrived early in English and endured.  In Old English and all the early Teutonic languages except the Scandinavian, it quickly became the most comprehensive adjectival expression of disapproval, dislike or disparagement.  Evil was the word the Anglo-Saxons used to convey some sense of the bad, cruel, unskillful, defective, harm, crime, misfortune or disease.  The meaning with which we’re most familiar, "extreme moral wickedness", had existed since Old English but did not assume predominance until the eighteenth century.  The Latin phrase oculus malus was known in Old English as eage yfel and survives in Modern English as “evil eye”.  Evilchild is attested as an English surname from the thirteenth century and Australian-born Air Chief Marshal Sir Douglas Evill (1892-1971) was head of the Royal Air Force (RAF) delegation to Washington during World War II (1939-1945).  Despite its utility, there’s probably no word in English with as many near-synonyms, none of which is actually synonymous.  Consider: destructive, hateful, vile, malicious, vicious, heinous, ugly, bad, nefarious, villainous, corrupt, malefic, malevolent, hideous, wicked, harm, pain, catastrophe, calamity, ill, sinful, iniquitous, depraved, base, iniquity & unrighteousness; all tend in the direction of evil yet none quite matches its darkness, although malefic probably comes close.

Hannah Arendt and the banality of evil

The word evil served English unambiguously and well for centuries and most, secular and spiritual, knew that some people are just evil.  It was in the later twentieth century, with the sudden proliferation of psychologists, interior decorators, sociologists, criminologists, social workers and basket weavers, that an industry developed exploring alternative explanations and causations for what had long been encapsulated in the word evil.  The output was uneven but among the best remembered, certainly for its most evocative phrase, was the work of German-American philosopher and political theorist Hannah Arendt (1906–1975).  Arendt’s concern, given the scale of the Holocaust, was: "Can one do evil without being evil?"

Whether the leading Nazis were unusually (or even uniquely) evil or merely individuals who, through a combination of circumstances, came to do awful things is a question which has for decades interested psychiatrists, political scientists and historians.  Arendt attended the 1961 trial of Adolf Eichmann (1906-1962), the bureaucrat responsible for the transportation of millions of Jews and others to the death camps built to allow the Nazis to commit the industrial-scale mass murder of the final solution.  Arendt thought Eichmann ordinary and bland, “neither perverted nor sadistic” but instead “terrifyingly normal”, acting only as a diligent civil servant interested in career advancement, his evil deeds done apparently without ever an evil thought in his mind.  Her work was published as Eichmann in Jerusalem: A Report on the Banality of Evil (1963).  The work attracted controversy and perhaps that memorable phrase didn’t help; it captured the popular imagination and even academic critics seemed seduced.  Arendt’s point, inter alia, was that nothing in Eichmann’s life or character suggested a murderer: had it not been for the Nazis and the notion of normality they constructed, he’d likely never have killed anyone.  The view has its flaws in that there’s much documentation from the era to prove many Nazis, including Eichmann, knew what they were doing was a monstrous crime, so a discussion of whether Eichmann was immoral or amoral, and whether one implies evil while the other does not, seems, after Auschwitz, a sterile argument.

Evil is where it’s found.

Hannah Arendt's relationship with Martin Heidegger (1889–1976) began when she was a nineteen-year-old student of philosophy and he her professor, married and aged thirty-six.  Still influential for his contributions to phenomenology and existentialism, he will forever be controversial because of his brief flirtation with the Nazis, joining the party and taking an academic appointment under Nazi favor.  He resigned from the post within a year and distanced himself from the party but, despite expressing regrets in private, never publicly repented.  His affair with the Jewish Arendt is perhaps unremarkable because it pre-dated the Third Reich but what has always attracted interest is that their friendship lasted the rest of their lives, documented in their own words in a collection of their correspondence (Letters: 1925-1975, Hannah Arendt & Martin Heidegger (2003), Ursula Ludz (editor), Andrew Shields (translator)).  Cited sometimes as proof that feelings can transcend politics (as if ever there was doubt), the half-century of letters tracks the course of a relationship which began as one of lovers and evolved first into friendship and then into intellectual congress.  For those who wish to explore contradiction and complexity in human affairs, it's a scintillating read.  Arendt died in 1975, Heidegger surviving her by some six months.

New York Post, November 1999.

In 1999, Rupert Murdoch’s (b 1931) tabloid the New York Post ran one of its on-line polls, providing a list of the usual suspects and asking readers to rank them from evil to most evil, so as to determine “The 25 most evil people of the last millennium”.  The poll received 19,184 responses which revealed some “recency bias” (a cognitive bias that favors recent events over historic ones) in that some US mass-murderers were rated worse than some with more blood on their hands, but most commented on was the stellar performance of the two “write-ins”: Bill Clinton (b 1946; US president 1993-2001) & crooked Hillary Clinton (b 1947; US secretary of state 2009-2013), the POTUS coming second and the FLOTUS an impressive sixth, Mr Murdoch’s loyal readers rating both more evil than Saddam Hussein (1937–2006; president of Iraq 1979-2003), Vlad the Impaler (Vlad Dracula or Prince Vlad III of Wallachia (circa 1430-circa 1477); thrice Voivode of Wallachia 1448-circa 1477) or Ivan the Terrible (Ivan IV Vasilyevich (1530–1584); Grand Prince of Moscow and all Russia 1533-1584 & Tsar of all Russia 1547-1584).

Lindsay Lohan and her lawyer in court, Los Angeles, December 2011.

While fun and presumably an indication of something, on-line polls should not be compared with the opinion polls run by reputable universities or polling organizations, their attraction for editors looking for click-bait being that they’re essentially free and provide a result, sometimes within a day, unlike conventional polls which can cost thousands or even millions depending on the sample size and duration of the research.  The central problem with on-line polls is that responders are self-selected rather than coming from a cohort determined by statistical methods of the type developed in the wake of the disastrously inaccurate results of a poll “predicting” national voting intentions in the 1936 presidential election.  The 1936 catchment had been skewed towards the upper-income quartile by being restricted to those who answered domestic telephone connections, the devices then rarely installed in lower-income households.  A similar phenomenon of bias is evident in the divergent on-line responses to the familiar question: “Who won the presidential debate?”, the results revealing more about the demographic profiles of the audiences of CBS, MSNBC, CNN, ABC & FoxNews than about the dynamics on stage.
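
The effect is easy to demonstrate with a toy simulation (a sketch only, with invented figures, using Python's standard library): if supporters of one side are merely more motivated to respond, a self-selected poll overstates their share no matter how many responses arrive, while a probability sample of the same size lands near the true figure.

import random

random.seed(1)
population = [1] * 40_000 + [0] * 60_000                       # true support: 40%
# self-selected poll: supporters are three times as likely to bother responding
responses = [v for v in population if random.random() < (0.03 if v else 0.01)]
online_estimate = sum(responses) / len(responses)
# probability sample of the same size drawn at random from the whole population
sample = random.sample(population, len(responses))
random_estimate = sum(sample) / len(sample)
print(f"true 40%, self-selected poll {online_estimate:.0%}, random sample {random_estimate:.0%}")
# typically prints something like: true 40%, self-selected poll 67%, random sample 40%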

Especially among academics in the social sciences, there are many who object to the frequent, almost casual, use of “evil”, applied to figures as diverse as serial killers and those who use the “wrong” pronoun.  Rightly or not, academics can find “complexity” in what appears simple to most and don’t like “evil” because of the simple moral absolutism it implies, the suggestion certain actions or individuals are inherently or objectively wrong.  Academics call this “an over-simplification of complex ethical situations” and they prefer the nuances of moral relativism, which holds that moral judgments can depend on cultural, situational, or personal contexts.  The structuralist-behaviorists (a field still more inhabited than a first glance may suggest) avoid the word because it so lends itself to being a “label”, the argument being that labeling individuals as “evil” can be both dehumanizing and something which reinforces a behavioral predilection, thereby justifying punitive responses rather than attempts at rehabilitation.  Politically, it’s argued, the “evil” label permits authorities to ignore or even deny allegedly causative factors of behavior such as poverty, mental illness, discrimination or prior trauma.

There are also the associative traditions of the word, the linkages to religion and the supernatural an important part of the West’s cultural and literary inheritance but not one universally treated as “intellectually respectable”.  Nihilists of course usually ignore the notion of evil and to the post-modernists it was just another of those “lazy” words which ascribed values of right & wrong they held to be wholly subjective, evil being as context-dependent as anything else.  Interestingly, in the language of the polarized world of US politics, while the notional “right” (conservatives, MAGA, some of what’s left of the Republican Party) tends to label the notional “left” (liberals, progressives, the radical factions of the Democratic Party) as evil, the left seems to depict their enemies (they’re no longer “opponents”) less as “evil” and more as “stupid”.

The POTUS & the Pope: Francis & Donald Trump (aka the lesser of two evils), the Vatican, May 2017.

Between the pontificates of Pius XI (1857–1939; pope 1922-1939) and Francis (b 1936; pope since 2013), all that seems to have changed in the Holy See’s world view is that civilization has moved from being threatened by communism, homosexuality and Freemasonry to being menaced by Islam, homosexuality and Freemasonry.  It therefore piqued the interest of journalists accompanying the pope on his recent 12-day journey across Southeast Asia when they were told by a Vatican press secretary that His Holiness would, during the scheduled press conference, discuss the upcoming US presidential election: duly, the scribes assembled in their places on the papal plane.  The pope didn’t explicitly tell people for whom they should vote, nor even make his preference obvious as Taylor Swift (b 1989) would in her endorsement mobilizing the childless cat lady vote, but he did speak in an oracular way, critiquing both Kamala Harris (b 1964; US vice president since 2021) and Donald Trump (b 1946; US president 2017-2021) as “against life”, urging Catholic voters to choose the “lesser of two evils”.  That would have been a good prelude had he gone further but there he stopped: “One must choose the lesser of two evils.  Who is the lesser of two evils?  That lady or that gentleman?  I don’t know.”

Socks (1989-2009; FCOTUS (First Cat of the United States 1993-2001)) was Chelsea Clinton's (b 1980; FDOTUS (First Daughter of the United States)) cat.  Cartoon by Pat Oliphant, 1996.

The lesser of two evils: Australian-born US political cartoonist Pat Oliphant’s (b 1935) take on the campaign tactics of Bill Clinton (b 1946; US president 1993-2001), who was the Democratic Party nominee in the 1996 US presidential election against Republican Bob Dole (1923–2021).  President Clinton won by a wide margin which would have been more handsome still had there not been a third-party candidate.  Oliphant’s cartoons are now held in the collection of the Library of Congress.  It’s not unusual for the task presented to voters in US presidential elections to be reduced to finding “the lesser of two evils”.  In 1964, when the Democrats nominated Lyndon Johnson (LBJ, 1908–1973; US president 1963-1969) to run against the Republicans' Barry Goldwater (1909–1998), the conclusion of many was it was either “a crook or a kook”.  On the day, the lesser of the two evils proved to be crooked old Lyndon who won in a landslide over crazy old Barry.

Francis has some history in criticizing Mr Trump’s handling of immigration but the tone of his language has tended to suggest he’s more disturbed by politicians who support the provision of abortion services, although he did make clear he sees both issues in stark moral terms: “To send migrants away, to leave them wherever you want, to leave them… it’s something terrible, there is evil there.  To send away a child from the womb of the mother is an assassination, because there is life.  We must speak about these things clearly.”  Francis has in the past labelled abortion a “plague” and a “crime” akin to “mafia” behavior, although he did resist suggestions the US bishops should deny Holy Communion to “pro-choice” politicians (which would have included Joe Biden (b 1942; US president 2021-2025)), conscious no doubt that accusations of being an “agent of foreign interference” in the US electoral process would be of no benefit.  Despite that, he didn’t seek to prevent the bishops calling abortion “our preeminent priority” in Forming Consciences for Faithful Citizenship, the 2024 edition of their quadrennial document on voting.  Some 20% of the US electorate describe themselves as Catholics, their vote in 2020 splitting 52/47% Biden/Trump, but that was during the Roe v Wade (1973) era when abortion wasn’t quite the issue it's since become; a majority of the faithful believe it should be available, with only around 10% absolutist right-to-lifers.  Analysts concluded Francis regards Mr Trump as less evil than Ms Harris and will be pleased if his flock votes accordingly; while he refrained from being explicit, he did conclude: “Not voting is ugly.  It is not good.  You must vote.”

Wednesday, June 12, 2024

Reduction

Reduction (pronounced ri-duhk-shuhn)

(1) The act of reducing or the state of being reduced.

(2) The amount by which something is reduced or diminished.

(3) The form (result) produced by reducing, such as a copy on a smaller scale.

(4) In cell biology, as meiosis, especially the first meiotic cell division in which the chromosome number is reduced by half.

(5) In chemistry, the process or result of reducing (a reaction in which electrons are gained and valence is reduced; often by the removal of oxygen or the addition of hydrogen).

(6) In film production when using physical film stock (celluloid and such), the process of making a print of a narrower gauge from a print of a wider gauge (historically from 35 to 16 mm).

(7) In music, a simplified form, typically an arrangement for a smaller number of parts, such as an orchestral score arranged for a solo instrument.

(8) In computability theory, a transformation of one problem into another problem, such as mapping reduction or polynomial reduction.

(9) In philosophy (notably in phenomenology), a process intended to reveal the objects of consciousness as pure phenomena.

(10) In metalworking, the ratio of a material's change in thickness compared to its thickness prior to forging and/or rolling.

(11) In engineering, (usually as “reduction gear”), a means of energy transmission in which the original speed is reduced to whatever is suitable for the intended application.

(12) In surgery, a procedure to restore a fracture or dislocation to the correct alignment, usually with a closed approach but sometimes with an open approach.

(13) In mathematics, the process of converting a fraction into its decimal form or the rewriting of an expression into a simpler form (see the sketch after this list).

(14) In cooking, the process of rapidly boiling a sauce to concentrate it.

(15) During the colonial period, a village or settlement of Indians in South America established and governed by Spanish Jesuit missionaries.
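
To make the mathematical sense in (13) concrete, a minimal Python illustration (the numbers are arbitrary, chosen only for the example): a fraction is reduced by dividing numerator and denominator through by their greatest common divisor, and the decimal form follows by ordinary division.

from fractions import Fraction
from math import gcd

print(gcd(84, 126))               # 42 -- the common factor to divide through by
print(Fraction(84, 126))          # 2/3 -- Fraction reduces to lowest terms automatically
print(float(Fraction(84, 126)))   # 0.6666666666666666 -- the decimal form of the reduced fraction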

1475–1485: From the Middle English reduccion, from the Middle French reduction, from the Latin reductiōnem & reductiōn- (stem of reductiō (a “bringing back”)), the construct being reduct(us) (past participle of redūcere (to lead back)) + -iōn- (the noun suffix).  The construct in English was thus reduc(e) + -ion.  Reduce was from the Middle English reducen, from the Old French reduire, from the Latin redūcō (reduce), the construct being re- (back) + dūcō (lead).  The –ion suffix was from the Middle English -ioun, from the Old French -ion, from the Latin -iō (genitive -iōnis).  It was appended to a perfect passive participle to form a noun of action or process, or the result of an action or process.  Reduction, reductivism, reductionistic & reductionism are nouns, reductionist is a noun & adjective, reductional & reductive are adjectives; the noun plural is reductions.  Forms like anti-reduction, non-reduction, over-reduction, pre-reduction, post-reduction, pro-reduction & self-reduction have been created as required.

Actor Ariel Winter (b 1998), before (left) and after (right) breast reduction (reduction mammaplasty).  It has never satisfactorily been explained why this procedure is lawful in most jurisdictions.

In philosophy & science, reductionism is an approach used to explain complex phenomena by reducing them to their simpler, more fundamental components.  It posits that understanding the parts of a system and their interactions can provide a complete explanation of the system as a whole, an approach which is functional and valuable in some cases and to varying degrees inadequate in others.  The three generally recognized classes of reductionism are (1) Ontological Reductionism, the idea that reality is composed of a small number of basic entities or substances, best illustrated in biology where life processes are explained by reducing things to the molecular level; (2) Methodological Reductionism, an approach which advocates studying systems by breaking them into their constituent parts, much used in psychology where it might involve studying human behavior by examining neurological processes; and (3) Theory Reductionism, which involves explaining a theory or phenomenon in one field by the principles of another, more fundamental field, as when chemistry is reduced to physics or chemical properties are explained by the operation of quantum mechanics.  Reduction has been an invaluable component in many of the advances achieved in science in the last two-hundred-odd years and some of the process and mechanics of reductionism have actually been made possible by some of those advances.  The criticism of an over-reliance on reductionism in certain fields is that its very utility can lead to the importance of higher-level structures and interactions being overlooked; there is much which can’t fully be explained by the individual parts or even their interaction.  The diametric opposite of reductionism is holism, which emphasizes the importance of whole systems and the properties that emerge from the interactions between parts.  In philosophy, reductionism is the position which holds a system of any level of complexity is nothing but the sum of its parts and an account of it can thus be reduced to accounts of individual constituents.  It’s very much a theoretical model to be used as appropriate rather than an absolutist doctrine but it does hold that phenomena can be explained completely in terms of relations between other more fundamental phenomena: epiphenomena.  A reductionist is either (1) an advocate of reductionism or (2) one who practices reductionism.

Reductionism: Lindsay Lohan during her "thin phase".

The adjective reductive has a special meaning in Scots law pertaining to the reduction of a decree or other legal device (ie something rescissory in its effect); dating from the sixteenth century, it’s now rarely invoked.  In the sense of “causing the physical reduction or diminution of something” it’s been in use since the seventeenth century in fields including chemistry, metallurgy, biology & economics, always to convey the idea of reducing a substance, object or some abstract quantum to a lesser, simplified or less elaborated form.  At that time, it came to be used also to mean “that which can be derived from, or referred back to, something else” and although archaic by the early 1800s, its existence in historic texts can be misleading.  It wasn’t until after World War II (1939-1945) that reductive emerged as a derogatory term, used to suggest an argument, issue or explanation has been “reduced” to a level of such simplicity that so much has been lost as to rob things of meaning.  The phrase “reductio ad absurdum” (reduction to the absurd) is an un-adapted borrowing from the Latin reductiō ad absurdum and began in mathematics and logic, where it remains a useful tool in deriving proofs.  In wider use, it has come to describe a method of disproving a statement by assuming the statement is true and, with that assumption, arriving at a blatant contradiction; the synonyms are apagoge & “proof by contradiction”.
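
As a worked illustration of the method (the classic textbook proof that the square root of two is irrational, not an example drawn from the text above), the argument assumes the negation of what is to be proved and derives a contradiction:

\text{Assume } \sqrt{2} = \tfrac{p}{q} \text{ with } p, q \in \mathbb{Z} \text{ and } \gcd(p, q) = 1
\Rightarrow \; p^{2} = 2q^{2} \;\Rightarrow\; p \text{ is even, say } p = 2k
\Rightarrow \; 4k^{2} = 2q^{2} \;\Rightarrow\; q^{2} = 2k^{2} \;\Rightarrow\; q \text{ is even}
\text{Both } p \text{ and } q \text{ even contradicts } \gcd(p, q) = 1, \text{ hence } \sqrt{2} \text{ is irrational.}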

Single-family houses (D-Zug) built in 1922 on the principle of architectural reductionism by Heinrich Tessenow in collaboration with Austrian architect Franz Schuster (1892–1972), Moritzburger Weg 19-39 (the former Pillnitzer Weg), Gartenstadt Hellerau, Dresden, Germany.

As a noun, a reductivist is one who advocates or adheres to the principles of reductionism or reductivism.  In art & architecture (and some aspects of engineering) this can be synonymous with the label “a minimalist” (one who practices minimalism).  As an adjective, reductivist (the comparative “more reductivist”, the superlative “most reductivist”) means (1) tending to reduce to a minimum or to simplify in an extreme way and (2) belonging to the reductivism movement in art or music.  The notion of “extreme simplification” (a reduction to a minimum; the use of the fewest essentials) has always appealed to some and appalled others attracted to intricacy and complexity.  The German architect Professor Heinrich Tessenow (1876-1950) summed it up in the phrase for which he’s remembered more than for his buildings: “The simplest form is not always the best, but the best is always simple”, one of those epigrams which may not reveal a universal truth but is probably a useful thing with which to remind students of this and that, lest they be seduced by the process and lose sight of the goal.  Tessenow was expanding on the principle of Occam's Razor (the reductionist philosophic position attributed to English Franciscan friar & theologian William of Ockham (circa 1288–1347)), written usually as Entia non sunt multiplicanda praeter necessitatem (literally "Entities must not be multiplied beyond necessity"), which translates best as “the simplest solution is usually the best”.

Reductio in extrema

1960 Lotus Elite Series 1 (left) and, at the Le Mans 24 Hour endurance classic, June 1959 (right), Lotus Elite #41 leading Ferrari 250TR #14.  The Ferrari retired (DNF) after overheating, the Elite finishing eighth overall and winning the 1.5 litre GT class.

Weighing a mere 500-odd kg (1100 lb), the early versions of the exquisite Lotus Elite (1957-1963) enchanted most who drove them but the extent of the reductionism compromised the structural integrity and things sometimes broke when used under everyday conditions, which of course include potholed roads.  Introduced late in 1961, the Series 2 Elite greatly improved this but some residual fragility was inherent to the design.  On the smooth surfaces of racing circuits however, it enjoyed an illustrious career, notable especially for success in long-distance events at the Nürburgring and Le Mans.  The combination of light weight and advanced aerodynamics meant the surprisingly powerful engine (a robust unit which began life powering the water pumps of fire engines!) delivered outstanding performance, frugal fuel consumption and low tyre wear.  As well as claiming five class trophies in the Le Mans 24 hour race, the Elite twice won the mysterious Indice de performance (an index of thermal efficiency), a curious piece of mathematics actually designed to ensure that, regardless of other results, a French car would always win something.

Colin Chapman (1928–1982), who in 1952 founded Lotus Cars, applied reductionism even to the Tessenow mantra in his design philosophy: “Simplify, then add lightness.”  Whether at the drawing board, on the factory floor or on the racetrack, Chapman seldom deviated from his rule and while it lent his cars sparkling performance and delightful characteristics, more than one of the early models displayed an infamous fragility.  Chapman died of a heart attack, which was a good career move given the likely legal consequences of his involvement with John DeLorean (1925–2005) and the curious financial arrangements made with OPM (other people's money) during the strange episode which was the tale of the DMC DeLorean gullwing coupé.

1929 Mercedes-Benz SSKL blueprint (recreation, left) and the SSKL “streamliner”, AVUS, Berlin, May 1932 (right).

The Mercedes-Benz SSKL was one of the last of the road cars which could win top-line grand prix races.  An evolution of the earlier S, SS and SSK, the SSKL (Super Sports Kurz (short) Leicht (light)) was notable for the extensive drilling of its chassis frame to the point where it was compared to Swiss cheese, reducing weight with no loss of strength.  The SSK had enjoyed success in competition but even in its heyday was in some ways antiquated and, although powerful, was very heavy, thus the expedient of the chassis-drilling intended to make it competitive for another season.  Lighter (which didn't solve but at least to a degree ameliorated the high tyre wear) and easier to handle than the SSK (although the higher speed brought its own problems, notably in braking), the SSKL enjoyed a long Indian summer and even on tighter circuits where its bulk meant it could be out-manoeuvred, sometimes it still prevailed by virtue of sheer power.  By 1932 however the engine’s potential had been reached and there was no more metal which could be removed without dangerously compromising safety.  The solution was an early exercise in aerodynamics (“streamlining” the then fashionable term), an aluminium skin prepared for the 1932 race held on Berlin’s AVUS (Automobil-Versuchs und Übungsstraße (automobile traffic and practice road)).  The reduction in air-resistance permitted the thing to touch 255 km/h (158 mph), some 20 km/h (12 mph) more than a standard SSKL, an increase the engineers calculated would otherwise have demanded another 120 horsepower.  The extra speed was most useful at the unique AVUS, which comprised two straights (each almost six miles (ten kilometres) in length) linked by two hairpin curves, one a dramatic banked turn.  The SSKL was the last of the breed, the factory’s subsequent Grand Prix machines all being specialized racing cars.

Reduction gears: Known casually as "speed reducers", reduction gears are widely used in just about every type of motor and many other mechanical devices.  What they do is allow the energy of a rotating shaft to be transferred to another shaft running at a reduced speed, achieved usually by the use of gears (cogs) of different diameters.
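
A minimal sketch of the arithmetic involved (the function and figures are illustrative only, not drawn from any particular gearbox): when a small driving cog turns a larger driven cog, shaft speed falls, and torque rises, in proportion to the ratio of their teeth (or diameters).

def reduction(input_rpm, input_torque_nm, driving_teeth, driven_teeth, efficiency=1.0):
    """Return (output_rpm, output_torque_nm) for a single-stage reduction gear."""
    ratio = driven_teeth / driving_teeth                    # e.g. 60/20 gives a 3:1 reduction
    output_rpm = input_rpm / ratio                          # the output shaft turns more slowly
    output_torque = input_torque_nm * ratio * efficiency    # torque is multiplied, less any losses
    return output_rpm, output_torque

print(reduction(3000, 10, driving_teeth=20, driven_teeth=60))   # (1000.0, 30.0): a third the speed, three times the torque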

In chemistry, a reduction is the process or result of reducing (a reaction in which electrons are gained and valence is reduced, often by the removal of oxygen or the addition of hydrogen); as an example, if an iron atom (valence +3) gains an electron, the valence decreases to +2.  Linguistically, it’s obviously counterintuitive to imagine a “reduced” atom is one which gains rather than loses electrons but the term in this context dates from the early days of modern chemistry, when reduction (and its counterpart, “oxidation”) were coined to describe reactions in which one substance lost an oxygen atom and the other substance gained it.  In a reaction such as that between two molecules of hydrogen (2H2) and one of oxygen (O2) combining to produce two molecules of water (2H2O), the hydrogen atoms have gained oxygen atoms and were said to have become “oxidized”, while the oxygen atoms, by attaching themselves to the hydrogens, were said to have been “reduced”.  Chemically however, in the process of gaining an oxygen atom, the hydrogen atoms have had to give up their electrons and share them with the oxygen atoms, while the oxygen atoms have gained electrons, thus the seeming paradox that the “reduced” oxygen has in fact gained something, namely electrons.
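
Written in the modern electron-based notation, the two examples mentioned above look like this (a standard textbook rendering, added here only as illustration):

\mathrm{Fe^{3+} + e^{-} \rightarrow Fe^{2+}} \quad \text{(reduction: an electron is gained, valence falls from +3 to +2)}
\mathrm{2H_{2} + O_{2} \rightarrow 2H_{2}O} \quad \text{(hydrogen oxidised, } 0 \rightarrow +1\text{; oxygen reduced, } 0 \rightarrow -2\text{)}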

Saturday, November 4, 2023

Phenomenology

Phenomenology (pronounced fi-nom-uh-nol-uh-jee)

(1) The study of phenomena.

(2) In philosophy, the system of German philosopher Edmund Husserl (1859–1938) stressing the description of phenomena; the study of structures of consciousness as experienced from the first-person point of view; developed later as existential phenomenology, in the work of Husserl's student, the one-time Nazi, Martin Heidegger (1889–1976).

(3) In the philosophy of science, the science of phenomena as opposed to the science of being.

(4) In architecture, a school of design based on the experience of building materials and their sensory properties.

(5) In archaeology, a set of theories based upon understanding cultural landscapes from a sensory perspective.

(6) In physics, a branch which deals with the application of theory to experiments.

(7) In empirical psychology, the study of subjective experiences or the experience itself.

(8) In the study of comparative religions, a field of research concerning the experiential aspect of religion in terms consistent with the orientation of the worshippers.

1764: A compound word, the construct being phenomen(on) (from the Late Latin phaenomenon (appearance), from the Ancient Greek φαινόμενον (phainómenon) (thing appearing to view), the neuter present passive participle of φαίνω (phaínō) (I show)) + -logy.  In English the -logy suffix originates with loanwords from the Greek, usually via Latin and French, where the suffix (-λογία) is an integral part of the word loaned.  Within English, the suffix became productive, forming names of sciences or departments of study, and original compositions with no link to Greek or Latin forms were common by the late eighteenth century.  Phenomenology & phenomenologist are nouns, phenomenological is an adjective and phenomenologically is an adverb; the noun plural is phenomenologies.

The road to Hegel

Portrait of the philosopher as a young man: Georg Wilhelm Friedrich Hegel (1770-1831).

Phenomenology was a philosophical system created with the intention of being free of presupposition, the notion being that objects and events should be observed and described from the position of the observer(s), a process supposedly free from claims about any objective reality.  Anything not immediately conscious was to be excluded and, rather than deductive or empirical methods, there was a reliance on the information gathered by the senses; all scientific or metaphysical knowledge or belief was discarded.  Phenomenology is not an essentially theoretical exercise like idealism, which claimed the only thing truly to exist is the mind.  Phenomenology instead takes the position that all that can be known is subjective reality, thus the pointlessness of an attempt to seek out some objective reality.  The focus is on the subjective.  That didn’t mean existentialism and phenomenology were the same.  Phenomenology was a toolbox of processes with which to view metaphysics and knowledge; existentialism, ultimately, was about generating the normative ethics to make a worthwhile life.  Phenomenology’s core method was the investigation and description of phenomena as consciously experienced, devoid of any theoretical framework and, to whatever extent was possible, undertaken without preconceptions and presuppositions.

Lindsay Lohan capturing Hegel's phenomenology of spirit in a T-shirt.

The word, though without quite its modern meaning, seems first to have been used in 1764 by the Swiss philosopher and mathematician Johann Heinrich Lambert (1728–1777) in his work Neues Organon, a treatise on logic, the title (New Organon) a nod to The Organon (Ὄργανον in the Ancient Greek, meaning “instrument, tool, organ"), the collection of Aristotle's (384-322 BC) six works on logic assembled (circa 40 BC) by a group of disciples known as the Peripatetics.  In the reductionist spirit of logic, Lambert applied the word to his particular exploration of the systemic use of knowledge to differentiate truth from falsehood but it wasn’t until the publication in 1807 of Phänomenologie des Geistes (Phenomenology of Spirit) by the German philosopher GWF Hegel that lineal paths to twentieth-century phenomenology can be traced.  Hegel was impenetrable even by the standards of German philosophers so the discursive output of the new phenomenologists of the new century seems hardly surprising.  As many students discovered, one can find one's way to Hegel but it's hard to find one's way back.

Sunday, July 23, 2023

Zeitgeist

Zeitgeist (pronounced tsahyt-gahyst)

A German noun, the spirit of the time; general trend of thought or feeling characteristic of a particular period of time, historically especially as reflected in literature and philosophy although now also used to reference popular culture.

1835: From the German Zeit + Geist, literally "time spirit (or ghost)", a calque of the Latin genius saeculī, and best translated as “spirit of the age”.  It’s not commonly pluralized but the plural of Geist (ghost; spirit) is Geister, thus in English the irregular noun zeitgeister (also often in the plural), sometimes used of those who write of the fads in contemporary culture.  Zeitgeist is a noun and zeitgeisty, zeitgeistier & zeitgeistiest are adjectives; the (rare) plural is zeitgeists.  Hopefully, zeitgeistesque never becomes a thing.

Spirit of the age

A concept from eighteenth & nineteenth century German philosophy, best translated as "spirit of the age", it refers to the invisible forces dominating or defining the characteristics of a given epoch in history.  Although now most associated with the German philosopher Georg Wilhelm Friedrich Hegel (1770-1831), especially in contrast with the Hegelian concepts of volksgeist (national spirit) and weltgeist (world-spirit), the coinage predates Hegel, appearing in the work of Johann Gottfried Herder (1744–1803) and Johann Wolfgang von Goethe ("Goethe"; 1749–1832), and the idea is found too in Voltaire (François-Marie Arouet; 1694–1778) and later in Herbert Spencer (1820–1903).  Hegel, in Phenomenology of the Spirit (1807), did write of the idea but preferred the phrase Geist der Zeiten (spirit of the times) over the compound Zeitgeist.  Most sources acknowledge the first documented use as being in the writings of Herder.

Portrait of GWF Hegel (circa 1839), steel engraving by Lazarus Gottlieb Sichling (1812–1863) after an aquarel (watercolor) lithograph (1828) by Julius Ludwig Sebbers (circa 1785-1893).

Until recently, zeitgeist tended to be used retrospectively, in the manner of geochronology where an epoch has a known end date before labels are applied.  Of late, with its (perhaps over-enthusiastic) adoption by those who write of pop-culture, it’s come to be attached to just about anything, however fleeting.  Some criticize this but that's probably intellectual snobbery; in a sense Hegel et al were also writing of a type of popular culture.  Appropriately then, zeitgeist should now be considered an English word, having been thoroughly assimilated, and not capitalized unless used in a way which references its continental origins (of Hegel, Voltaire etc), in which case it remains German and as a noun picks up an initial capital.  Because it remains both a German and an English word, it can be spoken as written or translated, depending on the effect desired, and in this it's a variant of the conventions which guide the way written text is handled in oral speech.  Where a word or phrase, however familiar in English, remains foreign, it should, when spoken, be rendered in translation: the written text “Hillary Clinton is, inter alia, crooked” is spoken as “Hillary Clinton is, among other things, crooked”.  Where a foreign word or phrase has been assimilated into English it is treated as native, so the written text “Hillary Clinton’s statement was the usual mix of lies, half-truths, evasions etc” is spoken as “Hillary Clinton’s statement was the usual mix of lies, half-truths, evasions etcetera.”  Note the usual shortened form (etc) has traditionally always been followed by a full-stop but there is a welcome revisionist movement which argues it too has become an English word (as etcetera is an anglicized form of the Latin et cetera) and thus no longer needs to be treated as a truncation.

Of the zeitgeist, early in the third millennium: Paramount Pictures promotional poster for Mean Girls (2004).

Where a foreign word or phrase, however familiar in English, depends for technical or other reasons on the original form to convey its meaning, it should be spoken as written.  Words of this class are often legal Latin such as obiter dictum (from the Latin and literally "something said in passing and not critical to what's being discussed", in law describing a judge's expression of opinion not essential to the verdict and thus not binding as a precedent) and habeas corpus (from the Latin habeas corpus ad subjiciendum (literally "You (shall) have the body to be subjected to (examination)"), now a mechanism to challenge the lawfulness of a detention (ie the detainee must be brought before a court)).  Status quo (from the Latin status (state) (sometimes used in the ablative statū) + quō (in which), the ablative of quī (which)) is well-known and widely used as a kind of verbal shorthand to avoid clumsy English constructions, yet the Status Quo is an Ottoman-era firman (from the Ottoman Turkish فرمان‎ (ferman), from the Persian فرمان‎ (farmân) (command, order, decree)) which defines certain unchanging understandings among religious communities with respect to nine shared religious sites in Jerusalem and Bethlehem, and to translate this to anything else would rob it of the meaning which relies on its historic context.  So, words evolve to be defined as assimilated into English or not according to "rules" which are a bit vague but in use there's probably a consensus that things like "obiter dictum" and "habeas corpus" remain ways of expressing something with a foreign phrase because they're still used in their original (legal) context, whereas "status quo" has become an English phrase because use is so diverse and distant from its origins.

Friday, March 17, 2023

Anxiety

Anxiety (pronounced ang-zahy-i-tee)

(1) Fear, foreboding, worry, disquiet, distress, uneasiness or tension caused by apprehension of possible future misfortune, danger etc, often to a degree that normal physical and psychological functioning is disrupted (can occur without an identifiable cause in which case the patient may be diagnosed with an anxiety disorder).

(2) Earnest but tense desire; eagerness; an uneasy or distressing desire for someone or something.

(3) In psychiatry, a state of intense apprehension or worry often accompanied by physical symptoms such as shaking, intense feelings in the gut etc, common in mental illness or after a distressing experience; a generalised state of apprehension and psychic tension occurring in some forms of mental disorder.

1515–1525: From the Middle English anxumnesse (apprehension caused by danger, misfortune, or error, uneasiness of mind respecting some uncertainty, a restless dread of some evil), from the Old English angsumnes, from the Latin anxietatem (nominative anxietas) (anguish, anxiety, solicitude), a noun of quality from anxius (uneasy, anxious, solicitous, distressed, troubled in mind), from angō (to distress, trouble), akin to the Ancient Greek ἄγχω (ánkhō) (to choke).  The construct of the Latin anxietās was anxi(us) (anxious) + -etās, a variant of -itās used if appearing before a vowel.  The -itas suffix was from the Proto-Italic -itāts & -otāts (-tās added to i-stems or o-stems, later used freely) and ultimately from the primitive Indo-European -teh₂ts.  Synonyms include foreboding, uneasiness, perplexity, disquietude, disquiet, trouble, apprehension, restlessness & distress and it’s become a popular modifier (range anxiety, climate anxiety, separation anxiety, performance anxiety etc).  Anxiety is a noun; the noun plural is anxieties.

Xanax tablets.

Xanax is the brand name for the drug alprazolam which is a benzodiazepine.  It is a prescription medication primarily used to treat anxiety disorders, panic disorders and (more controversially) depression.  A fast & short-acting benzodiazepine, Xanax works by enhancing the activity of a neurotransmitter called gamma-aminobutyric acid (GABA), which helps to reduce anxiety and promote relaxation.  Xanax is regarded as effective for treating anxiety and related disorders when used as prescribed but can be habit-forming, leading to dependence and addiction.  Lindsay Lohan released (or "dropped" in the fashionable parlance) the track Xanax in 2019.  With a contribution from Finnish pop star Alma (Alma-Sofia Miettinen; b 1996), the accompanying music video was said to be “a compilation of vignettes of life”, Xanax reported as being inspired by Ms Lohan’s “personal life, including an ex-boyfriend and toxic friends”.  Structurally, Xanax was quoted as being based around "an interpolation of" Better Off Alone, by Dutch Eurodance-pop collective Alice Deejay, slowed to a Xanax-appropriate tempo.

Generalized anxiety disorder (GAD) and panic disorder (PD) were formalized when the third edition of the American Psychiatric Association's (APA) Diagnostic and Statistical Manual of Mental Disorders (DSM-III) was released in 1980.  Among clinicians GAD had for some years been a noted thread in the literature; what DSM-III did was map GAD onto the usual pattern of diagnostic criteria.  In practice, because of the high degree of co-morbidity with other disorders, the utility of GAD as defined was soon a regular topic of discussion at conferences and the DSM’s editors responded, the parameters of GAD being refined in subsequent releases between 1987-1994, when GAD’s diagnostic criteria emerged in their recognizably modern form (set out below).

By the time the terminology for mental disorders began in the nineteenth century to be codified, the word anxiety had for hundreds of years been used in English to describe feelings of disquiet or apprehension and in the seventeenth century there was even a school of thought that it was a pathological condition.  It was thus unsurprising that “anxiety” was so often an element in psychiatry’s early diagnostic descriptors such as “pantophobia” and “anxiety neurosis”, terms which designated paroxysmal manifestations (panic attacks) as well as “interparoxysmal phenomenology” (the apprehensive mental state).  The notion of “generalized anxiety”, although not then in itself a diagnosis, was also one of the symptoms of many conditions including the vaguely defined neurasthenia, which was probably understood by many clinicians as something similar to what would later be formalized as GAD.  As a distinct diagnostic category however, it wasn’t until DSM-III was released in 1980 that GAD appeared, anxiety neurosis being split into (1) panic disorder and (2) GAD.  When the change was made, the editors noted it was a response to comments from clinicians, something emphasised when DSM-III was revised in 1987 (DSM-III-R), in effect to acknowledge there was a class of patient naturally anxious (who might once have been called neurotic or pantophobic) quite distinct from those for whom a source of anxiety could be deduced.  Thus, the cognitive aspect of anxiety became the critical criterion but within the profession some scepticism about the validity of GAD as a distinct diagnostic category emerged, the most common concern being the difficulty in determining clear boundaries between GAD, other anxiety-spectrum disorders and certain manifestations of depression.

The modern label aside, GAD has a long lineage and elements of the diagnosis found in case histories written by doctors over the centuries would have seemed familiar to those working in the early nineteenth century, tales of concern or apprehension about the vicissitudes of life being a common thing.  As psychiatry in those years began to coalesce as a speciality and papers were increasingly published, it was clear the behaviour of those suffering chronic anxiety could culminate in paroxysmal attacks, thus it was that GAD and panic attacks came to be so associated.  In English, the term panophobia (sometimes as pantaphobia, pantophobia or panphobia) dates from 1871, the word from the Late Latin pantŏphŏbŏs, from the Ancient Greek παντοφόβος (all-fearing (literally “anxiety about everything”)).  It appears in the surviving works of medieval physicians and it seems clear there were plenty of “pantophobic patients” who allegedly were afraid of everything, nor was the condition a product of the Dark Ages: Aristotle (384-322 BC), in the seventh book of his Nicomachean Ethics (350 BC), wrote that there were men “…by nature apt to fear everything, even the squeak of a mouse”.

The first edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM-I, 1952) comprised what seems now a modest 130 pages.  The latest edition (DSM-5-TR, 2022) has 991 pages.  The growth is said to be the result of advances in science and a measure of the increasing comprehensiveness of the manual, not an indication that madness in the Western world is increasing.  The editors of the DSM would never use the word "madness" but for non-clinicians it's a handy term which can be applied to those beyond some point on the spectrum of instability.

Between Aristotle and the publication of the first edition of the DSM in 1952, physicians (and others) pondered, treated and discussed the nature of anxiety, and theories of its origin and recommendations for treatment came and went.  The DSM (retrospectively labelled DSM-I) was by later standards a remarkably slim document but unsurprisingly anxiety was included, discussed in the chapter called “Psychoneurotic Disorders”, the orthodoxy of the time being that anxiety was a kind of trigger perceived by the conscious part of the personality and produced by a threat from within, and that how the patient dealt with this threat determined the reaction displayed.  There was in the profession a structural determinism to this approach, the concept of defined “reaction patterns” at the time one of the benchmarks in US psychiatry.  When DSM-II was released in 1968, the category “anxiety reaction” was diagnosed when the anxiety was diffuse and neither restricted to specific situations or objects (ie the phobic reactions) nor controlled by any specific psychological defense mechanism, as was the case in dissociative, conversion or obsessive-compulsive reactions.  Anxiety reaction was characterized by anxious expectation and differentiated from normal apprehensiveness or fear.  Significantly, in DSM-II the reactions were re-named “neuroses” and it was held that anxiety was the chief characteristic of the neuroses, something which could be felt directly or controlled unconsciously through various symptoms.  This had the effect that the diagnostic category “anxiety neurosis” encompassed what would later be expressed as panic attacks and GAD.

A: Excessive anxiety and worry (apprehensive expectation), occurring more days than not for at least 6 months, about a number of events or activities (such as work or matters relating to educational institutions).

B: The patient finds it difficult to control the worry.

C: The anxiety and worry are associated with three (or more) of the following six symptoms:

(1) Restlessness or feeling keyed up or on edge.

(2) Being easily fatigued.

(3) Difficulty concentrating or mind going blank.

(4) Irritability.

(5) Muscle tension.

(6) Sleep disturbance (difficulty falling or staying asleep, or restless, unsatisfying sleep).
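
Expressed as a sketch in Python, purely to show how the criteria quoted above combine (the symptom names are paraphrased and this is in no sense a diagnostic instrument):

# A toy rendering of criteria A-C as listed above, not a clinical tool.
SYMPTOMS = {"restlessness", "easily fatigued", "difficulty concentrating",
            "irritability", "muscle tension", "sleep disturbance"}

def meets_listed_criteria(months_of_worry, worry_hard_to_control, symptoms_present):
    a = months_of_worry >= 6                        # A: excessive worry, more days than not, for at least 6 months
    b = worry_hard_to_control                       # B: the worry is difficult to control
    c = len(SYMPTOMS & set(symptoms_present)) >= 3  # C: three (or more) of the six listed symptoms
    return a and b and c

print(meets_listed_criteria(8, True, ["irritability", "muscle tension", "sleep disturbance"]))   # True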

The key change was that the criteria for GAD required fewer symptoms.  Whereas with DSM-IV-TR (2000) individuals needed to exhibit at least three physical and three cognitive symptoms for a diagnosis of GAD, under DSM-5 (2013) only one of each was required, so not only was the accuracy and consistency of diagnosis (by definition) improved, the obvious practical effect was better to differentiate GAD from other anxiety disorders and (importantly) from the usual worries and concerns endemic to the human condition.  The final significant aspect of the evolution was that by the time of DSM-5, GAD had become effectively an exclusionary diagnosis in that it cannot be diagnosed if the anxiety is better explained by another anxiety disorder, nor can it be diagnosed if caused directly by stressors or trauma.

Leonard Bernstein's Symphony No 2 (The Age of Anxiety) was inspired by WH Auden's long poem of the same name.

WH Auden's (1907-1973) The Age of Anxiety: A Baroque Eclogue (1944) divided critics, said by some to be "his best work to date" and by others to be "dull and an obvious failure", some of whom rubbed in the critical salt by adding that Leonard Bernstein's (1918-1990) Symphony No 2 (1948-1949), inspired by the poem, was the finer piece of art.  It was better received in the US where it was written, winning the Pulitzer prize, but whether or not influenced by the reaction, Auden would never again complete an epic-length work.  Like HG Wells' (1866-1946) Mind at the End of its Tether (1945), it was very much a work of the unhappy time in which Auden found himself and in some ways picked up from his lament September 1, 1939 (a poem he later renounced).  As a poem, The Age of Anxiety is a delight for structuralists, its six sections (prologue, life-story, dream-quest, dirge, masque & epilogue, emulated by the six movements in Bernstein's symphony (each movement sub-divided)) able to be deconstructed even mathematically, but the most common complaint is that although his four protagonists (three men and a woman) are very different people and all from a world of vernacular American English, their thoughts on the human condition and their own are expressed as if each had once gone up to Oxford to take a degree in English, as Auden in his youth had done.  Such voices in poems are not unusual but the critics go further in claiming that anyone new to the work, were the characters' names to be concealed, could not possibly guess which of the four is talking.  While it becomes clear the abstractions he maps upon his four represent thought, intuition, sensation & feeling, helpful as a device through which his world-view can be discussed, as flesh & blood characters they are vague indeed.  Still, literature should perhaps be enjoyed for what it is rather than what it's not; one doesn't need to find plausible what Philip Roth (1933-2018) thinks might be the thoughts of a woman to find pleasure in the text and it's the same with Auden's The Age of Anxiety.  Those interested in poetry as art will read such cleverness with relish, ticking the boxes on the path to technical ecstasy.  Those who want to feel something should stick to Sylvia Plath (1932-1963).

We would rather be ruined than changed
We would rather die in our dread
Than climb the cross of the moment
And let our illusions die.

WH Auden in The Age of Anxiety (1944).