
Saturday, May 11, 2024

Bleak

Bleak (pronounced bleek)

(1) Bare, desolate, and often windswept.

(2) Cold and piercing; raw.

(3) Without hope or encouragement; depressing; dreary.

(4) A number of species of fish, the best known of which is probably the European freshwater species Alburnus alburnus, having scales with a silvery pigment used in the production of artificial pearls.

(5) Pale (obsolete).

1300-1350: From the Middle English bleke (also bleche, source of the Modern English bleach, and bleike, due to Old Norse) and the earlier Middle English blak & blac (pale, wan), from the Old English blǣc, blǣċ & blāc (bleak, pale, pallid, wan, livid; bright, shining, glittering, flashing) and the Old Norse bleikr (pale, whitish), from the Proto-Germanic blaikaz (pale, shining).  It was cognate with the Old Norse bleikja & bleikr (white), the Old High German bleih, the Dutch bleek (pale, wan, pallid), the Low German blek (pale), the German bleich (pale, wan, sallow), the Danish bleg (pale), the Swedish blek (pale, pallid), the Norwegian bleik (pale), the Faroese bleikur (pale) and the Icelandic bleikur (pale, pink).  Akin to bleach, the primitive Indo-European root was bhel- (to shine, flash, burn; also “shining white”).  Bleak is a noun & adjective, bleakness a noun, bleakish, bleaker & bleakest are adjectives and bleakly is an adverb; the noun plural (of fish) is bleaks or bleak (the latter especially of a large number).

The original English sense (pale, wan etc) is long obsolete; the modern meaning “bare, windswept” emerged in the 1530s, the figurative sense of “cheerless” first noted circa 1719.  The same Germanic root produced the Middle English blake (pale; blāc in the Old English), but this fell from use, probably from confusion with blæc (black); the surviving surname “Blake” is a demonstration of this, its roots traced variously to both “one of pale complexion” and “one of dark complexion”.  Bleak has survived, not in the “pale” sense, but meaning only “bare, barren”.  Common related words and synonyms include desolate, austere, dreary, chilly, cold, grim, lonely, harsh, somber, sad, dark, gloomy, dismal, bare, blank, burned, cleared, desert, deserted & exposed.

Jarndyce v Jarndyce, Court of Chancery

Although now usually read as a conventional novel, Bleak House was published by Charles Dickens (1812-1870) in twenty serialized episodes between March 1852 and September 1853, a technique popular at the time.  A recounting of Jarndyce v Jarndyce, a long-running (fictional) case in the Court of Chancery, it is a critical satire of the English legal system.

In the novel, the matter of Jarndyce v Jarndyce ran in the Court of Chancery for generations, ending not in resolution but closing only when it was found that the legal costs charged over the years had absorbed all the money in the estate.  The legal profession was critical of Bleak House, claiming it was much-exaggerated and hardly typical of the cases heard in Chancery, but Dickens’ depiction was not wholly fictional, there being a number of cases which had dragged on for decades, ending, like Jarndyce v Jarndyce, only because legal costs had consumed all the funds which were the source of the original action; two decades-old Chancery cases were mentioned in the author's preface as his inspiration.  Nor was Dickens alone in his criticism of this judicial lethargy; many, including some within the profession, had long advocated reform and the author's interest was also personal.  For one thing, he'd worked as a legal clerk and had successfully brought before Chancery an action for breach of copyright but, despite winning the case, had been forced to pay costs because the other party declared bankruptcy.

A bleak visage.

The Court of Chancery (or court of equity) was one half of the English civil justice system, running in parallel with the common-law courts.  Chancery had evolved into a recognizable form in the fourteenth century as a court concerned more with justice and fairness than the rigid and precise rules under which the common-law courts operated, its most famous innovation being the law of trusts, which exists to this day.  However, Chancery’s increasing remit begat its own bureaucratic inertia and as early as the sixteenth century the court was criticized for its leisurely pace, long backlogs and high costs; despite sporadic attempts at reform, especially during the early nineteenth century, the problems persisted.  The core of the problem was that every delay in the process meant another fee was charged and those fees were paid to the court's officials, thus providing an economic incentive for them to find reasons for delays.

Lindsay Lohan and her lawyer in court, Los Angeles, December 2011.

A number of enquiries concluded the difficulties (obvious to all except those benefiting from the system) were structural; the solution was dissolution.  In 1873 and 1875 the Supreme Court of Judicature Acts dissolved the Court of Chancery and created a unified High Court of Justice, with Chancery becoming one of three divisions of the High Court, thus preserving equity as a parallel stream of law while resolving many of the administrative impediments to judicial efficiency.  The reform didn't cure the whole judicial system; even in the twentieth century, the House of Lords once took nearly eighteen years to hand down a decision.  The judicial indolence surprised few, Sir Patrick Dean (1909–1994; UK Ambassador to the US 1965-1969) of the Foreign Office once noting business in the Lords was often "conducted at a leisurely pace".

Charles Dickens' former "holiday cottage", overlooking Viking Bay in the Kent town of Broadstairs, came to be called Bleak House after the novel was published.  Built in 1801, it's said to be where the author wrote his eighth novel, David Copperfield (1849-1850), the manuscript for which reached the publisher with the informative title The Personal History, Adventures, Experience and Observation of David Copperfield the Younger of Blunderstone Rookery (Which He Never Meant to Publish on Any Account).

Sunday, May 21, 2023

Wiglomeration

Wiglomeration (pronounced wig-glom-uh-rey-shuhn)

Needlessly or pointlessly complicated, time-consuming legal wrangling (listed by most sources as “always derogatory” but it’s presumed within the profession it’s sometimes an expression of admiration).

1852: The construct was wig + (agg)lomeration.  Wiglomeration is a noun, the noun plural is wiglomerations.  Although some must have been tempted, there seems no evidence anyone has ever created derived forms such as wiglomerative, wiglomerating, wiglomerator etc.

Wig (a head of real or synthetic hair worn (1) to disguise baldness, (2) for cultural or religious reasons, (3) for fashion, (4) by actors better to resemble the characters they are portraying or (5) in some legal systems by advocates or judges during court proceedings) was a shortened form of periwig, from the Middle French perruque, which was probably borrowed from the western Lombard perrucca & parrucca, which are of uncertain origin, the Oxford English Dictionary (OED) suggesting there may be some relationship with the Latin pilus (hair) but, noting the phonetic variations, pondering that it could instead be related to parrocchetto (parakeet), the reference being to the bird’s feathers.  Linguistically, the process might have been similar to the phonetic changes of the intervocalic “L” into “R” of the Italian parlare and Sicilian parrari.  Among fishermen, a wig was also “an old seal”, although that use is now rare.  The meaning “to reprimand” is thought related to the slang term “bigwig” (that dating from the seventeenth century fashion in England of wearing big (and in the era increasingly bigger) wigs, a trend which peaked early in the 1700s) because of the association with aristocrats, nobles, lawyers and judges, the size and grandeur of one’s powdered wig a status symbol used to convey a perception of wealth and social standing.  Fashions however change and during the eighteenth century use declined; while among a few wigs lingered into the early 1800s, the French Revolution (1789) really was their death knell just about everywhere except courtrooms.

Interestingly, academic sources insist the construct was wig + (agg)lomeration rather than the more obvious wig + (g)lomeration, this based on an analysis of the unpublished notes of the author who coined the word.  Glomerate (to gather or wind into a spherical form or mass; to collect certain objects) was from the Latin glomeratus, past participle of glomerāre (to wind or add into a ball; to glomerate).  Agglomerate (the act or process of collecting in a mass; a heaping together; the state of being collected in a mass; a mass; cluster) was from the Latin agglomerātus, past participle of agglomerāre, the construct being ad- (to) + -glomerāre, from glomus (genitive glomeris) (a ball; a ball of yarn), related to the Latin globus (a ball), of uncertain origin.

Wigs galore: Court of Chancery, Lincoln's Inn Hall (1808-1810), a book illustration created by Rudolph Ackermann, WH Pyne, William Combe, Augustus Pugin & Thomas Rowlandson, British Library collection.

Wiglomeration was coined by Charles Dickens (1812–1870) for a bit of a rant by Mr Jarndyce in the serialized novel Bleak House (1852-1853), which told the tale of the fictional probate case Jarndyce vs Jarndyce (spoken as “Jarndyce and Jarndyce” in the conventions of English legal language) which, over the decades it unfolded in the Court of Chancery, absorbed in legal fees all of the vast estate which the proceedings were initiated to distribute to the rightful beneficiaries.  The legal establishment at the time of publication criticized the depiction as “an exaggeration” but while it wasn’t typical, nor was it without basis because cases lasting over a decade were known and one famously ended (with the subject estate exhausted in legal costs) only in 1915, after running for 117 years.  Even well into the twentieth century, judicial sluggishness was not unknown: the House of Lords once took almost 19 years to hand down a decision.  In his youth as a court reporter Dickens had witnessed much wiglomeration.

Bleak House Chapter 8 (Covering a Multitude of Sins):

“He must have a profession; he must make some choice for himself. There will be a world more wiglomeration about it, I suppose, but it must be done.”

“More what, guardian?” said I.

“More wiglomeration,” said he. “It’s the only name I know for the thing. He is a ward in Chancery, my dear. Kenge and Carboy will have something to say about it; Master Somebody—a sort of ridiculous sexton, digging graves for the merits of causes in a back room at the end of Quality Court, Chancery Lane—will have something to say about it; counsel will have something to say about it; the Chancellor will have something to say about it; the satellites will have something to say about it; they will all have to be handsomely feed, all round, about it; the whole thing will be vastly ceremonious, wordy, unsatisfactory, and expensive, and I call it, in general, wiglomeration. How mankind ever came to be afflicted with wiglomeration, or for whose sins these young people ever fell into a pit of it, I don’t know; so it is.”

Lindsay Lohan in blonde bob wig, appearing on Late Night with Jimmy Fallon, New York, November 2012.

The word does not of necessity imply complex or intricate legal reasoning or argument, although that can be part of things.  In the jargon, the trick to successful wiglomeration is to use the court’s processes to prolong proceedings (barristers are usually paid for each day’s appearance), either by causing delays or by requiring the other side to respond to matters raised which may be so arcane as to be irrelevant, even if that’s not immediately obvious.  Obviously, the more time-consuming (and thus more lucrative) these maneuvers prove the better, and even if cases don’t literally become interminable, to some they must seem so.  There is also the possibility wiglomeration can fulfill a strategic purpose: if one party has access to effectively unlimited legal resources (ie money) while the other party is financially constrained, sufficient wiglomeration (which manifests as another day’s fees to be paid) can compel the poorer party either to end proceedings or settle on terms less favorable than might have been achieved had the case been brought to judgment.  The most egregious examples of the practice can be classified as an “abuse of process” but judges are sometimes reluctant to intervene because (1) the tactics being used are usually technically correct and (2) intervention might be seen as denying a party their rights.  The problem is the system, but a wholly equitable solution is not immediately obvious.

Central Criminal Court (Old Bailey), 1840.

The tradition of barristers wearing wigs in English courts began in the seventeenth century when powdered wigs were a fashionable upper class accessory.  Culturally, lawyers tend to identify upwards so the adoption would not have been seen as “aping their betters” but just a natural alignment of style.  The courtroom style persisted even after wigs had elsewhere fallen from fashion and wigs are still worn in many jurisdictions with traditions inherited from England.  The rationale offered is (1) the wig & gown have by virtue of long use become a symbol of formality and professionalism which lends dignity to proceedings and (2) the garb helps create a sense of anonymity and impartiality, presenting the officers of the court as representatives of the law rather than individuals with personal biases or prejudices, once a matter of some significance at a time when, for historic and structural reasons, there were perceptions of a lack of impartiality in the legal system.  They’re now not always a feature of proceedings but in most systems where they’ve been retained, barristers seem still to want to cling to the tradition, although in recent years there’s been a tendency for judges to avoid them where possible and some more recently convened courts have reserved them only for ceremonial occasions and the odd photo opportunity.  Some courts (notably the UK’s recently established Supreme Court) have made it possible for cases to be conducted without anybody being be-wigged or gowned although, in a sign of the times, vegan wigs are now available as an alternative to the traditional horsehair.

The opinion the younger Dickens formed of the ways of lawyers has been shared by many.  Adolf Hitler’s (1889-1945; German head of government 1933-1945 & head of state 1934-1945) movement in its early days had much need of the services of lawyers and their efforts saved many Nazis from the consequences of their actions, but Hitler showed little gratitude to the profession, declaring more than once “I will not give up until every German realizes that it is shameful to be a lawyer.”  Hitler’s own lawyer was Hans Frank (1900–1946) who in 1939 was appointed Governor General of occupied Poland where his rule was corrupt and brutal even by the Nazis' standards of awfulness and few have ever doubted he deserved the death sentence handed down by the International Military Tribunal (IMT) at Nuremberg (1945-1946).  Even in 1946 Frank was still describing Hitler as “…that great man” and regretted his one “…conspicuous failing…” was his mistrust of both the law and lawyers.  What Frank wanted was an authoritarian state but one under the rule of law; he was appalled not by the mass murder which would come to be called genocide but by its not being authorized by a duly appointed judge.  At Nuremberg he claimed to have undergone a number of religious experiences and was received into the Roman Catholic Church, apparently anxious either to atone for his sins or avoid an eternity of torture in Hell.  Of his death sentence he remarked “I deserved it and I expected it.” and of Hitler’s “thousand year Reich” he observed “…a thousand years will pass and still this guilt of Germany will not have been erased.”

There’s a popular view William Shakespeare (1564–1616) shared the general disapprobation of the profession because one of his most quoted phrases is “The first thing we do, let’s kill all the lawyers.”  However, the context is rarely discussed and quite what the bard was intending to convey is open to interpretation.  The words were given to the character Dick the Butcher and spoken in Act IV, Scene II of Henry VI, Part II (circa 1591).

JACK CADE: I am able to endure much.

DICK [aside]: No question of that; for I have seen him whipp’d three market-days together.

JACK CADE: I fear neither sword nor fire.

SMITH [aside]: He need not fear the sword; for his coat is of proof.

DICK [aside]: But methinks he should stand in fear of fire, being burnt i’ th’ hand for stealing of sheep.

JACK CADE: Be brave, then; for your captain is brave, and vows reformation. There shall be in England seven half-penny loaves sold for a penny: the three-hoop’d pot shall have ten hoops; and I will make it felony to drink small beer: all the realm shall be in common; and in Cheapside shall my palfrey go to grass: and when I am king,– as king I will be,–

ALL. God save your majesty!

JACK CADE: I thank you, good people:– there shall be no money; all shall eat and drink on my score; and I will apparel them all in one livery, that they may agree like brothers, and worship me their lord.

DICK: The first thing we do, let’s kill all the lawyers.

Dick is a villain and the henchman of Jack Cade, who is leading a rebellion against King Henry; their view is that if they kill all who can read and write and burn all books, they’ll find the population easier to rule.  Knowing that, the more generous interpretation is that civilization depends for its fairness and tranquillity on the protection afforded by law and administered by lawyers, Shakespeare representing the rule of law as society’s most fundamental defense against those hungry for power at any price.  Lawyers of course support this version of Shakespeare’s intent, Justice John Paul Stevens (1920–2019; associate justice of the US Supreme Court 1975-2010) even discussing it in a dissenting opinion (Professional Real Estate Investors Inc vs Columbia Pictures Industries Inc (1993)) when he noted “As a careful reading of that text will reveal, Shakespeare insightfully realized that disposing of lawyers is a step in the direction of a totalitarian form of government.”  However, as many a neo-Marxist would point out: “He would say that, wouldn’t he.”  If one’s world view is a construct in which the law and lawyers are agents acting in the interests only of the ruling class (the 1% in the popular imagination), then Dick the Butcher and Cade the labourer, in seeking to overthrow an unfair, oppressive system, are victims whose only hope of escaping their roles as slaves of the nobility is to revolt, a part of which will be the killing of the lawyers because, as the profession offers its skills only to those who can pay, those with no money have no choice.

Saturday, January 15, 2022

Mizzle & Drizzle

Mizzle (pronounced miz-uhl)

(1) To rain in fine drops; a form of precipitation between mist and drizzle.

(2) In (almost exclusively British) slang, to decamp; to disappear or suddenly leave (now rare).

(3) In (almost exclusively British) slang, to induce a muddled or confused state of mind.

1475–1485: From the late Middle English missellen & missill (to drizzle), cognate with the Dutch dialectal form mizzelen (to drizzle), the Low German miseln & mussel (to mizzle), the Dutch miezelen (to drizzle, rain gently) and akin to the Middle Dutch misel (mist, dew).  The slang use in both senses dates from the mid-eighteenth century.  It’s of obscure origin, possibly a frequentative related to the base of mist or related to the Middle Low German mes (urine), the Middle Dutch mes & mis (urine), both from the Old Saxon mehs (urine), from the Proto-Germanic mihstuz, mihstaz & mihsk- (urine), from mīganą (to urinate), from the primitive Indo-European meigh & omeigh (to urinate).  There’s also some relationship with the English micturate (to urinate), the Old Frisian mese (urine), the Low German miegen (to urinate), the Dutch mijgen (to urinate) and the Danish mige (to urinate).  Mizzle and mizzler are nouns, the verb forms (used with or without object) are mizzled & mizzling; mizzly is the adjective.

Now often thought a portmanteau word (the construct being mi(st) + (dr)izzle), mizzle & drizzle have wholly separate etymologies and, historically, mizzle was a synonym of drizzle.  As verbs the difference between drizzle and mizzle is that drizzle is (ambitransitive) “to rain lightly; to shed slowly in minute drops or particles” while mizzle is “to rain in very fine drops”.  As nouns the difference is that drizzle is light rain while mizzle is misty rain or drizzle, thus the sense in the etymologically wrong portmanteau turns out to be English as it is used: mizzle is precipitation somewhere between mist and drizzle.  What mizzle and drizzle have in common is that unlike fog droplets, both fall to the ground.

The strange use in (mostly) British slang to mean “abscond, scram, flee” is an example of a dialectal form which spread, although use has declined to the point where it’s now rare.  The other slang sense (to muddle or confuse) was probably an imperfect echoic, a misreading of “misled”, the past tense & past participle of “mislead”.  Charles Dickens (1812–1870) liked words which, given how profligate he was in their use, was good.  In Bleak House (1852-1853), a cautionary tale of the woes to be had were one's matters to end up in the list of the Court of Chancery, mentioned to the Lord High Chancellor are Messrs Chizzle, Mizzle, Drizzle and otherwise.

Drizzle (pronounced driz-uhl)

(1) To rain gently and steadily in fine drops; to sprinkle (in meteorology, defined as precipitation consisting of numerous minute droplets of water less than 0.02 inch (0.5 millimeter) in diameter).

(2) To let something fall in fine drops or particles; to sprinkle.

(3) To pour in a fine stream.

1535–1545: From the Old English drēosan (to fall), of obscure origin but possibly a formation from dryseling or a dissimilated variant of the Middle English drysning (a falling of dew), from the Old English drysnan (to extinguish), akin to the Old English drēosan (to fall; to decline (cognate with the Modern English drowse)) and cognate with the Old Saxon driosan, the Gothic driusan, the dialectal Swedish drösla and the Norwegian drjōsa.  Drizzle & drizzler are nouns, the verb forms (used with or without object) are drizzled & drizzling; drizzly is the adjective.  A honey dipper is a tool with a grooved head, used to collect viscous liquids such as honey or syrup so they may be drizzled over toast, cereal or other food.

Honey being drizzled on almond-butter toast.

Shakespeare in act 3, scene 5 of Romeo and Juliet (1597) used the word in the sense familiar in the sixteenth century:

When the sun sets the air doth drizzle dew,

But for the sunset of my brother’s son

It rains downright.

How now? A conduit, girl? What, still in tears,

Rain stopped play during the last session on the first day of the pink-ball cricket match in Hobart on 14 January 2022.  The fifth and final test of the 2021-2022 Ashes series and the first Ashes test played in Hobart, the umpires' curious decision deprived the crowd of the chance to watch the last thirty-odd overs.  The stoppage was prompted by a brief, light drizzle which nobody except the umpires seemed to think could be called rain, and the sight of the solitary umbrella opened in the ground being the one held by an umpire attracted a few derisive comments.  There was a sudden spike in traffic to the Bureau of Meteorology’s website as people looked at the rain radar seeking some indication of when play might resume, but the radar showed almost no cloud and virtually no indication of rain within a 128 km (80 mile) radius.  The next day, the bureau reported the rain gauges at weather stations in the Hobart CBD and airport registered a total of 0.0 mm of rain on that evening.

The laws of cricket actually don’t prohibit the game being played when it’s raining, provided it is not dangerous or unreasonable, Law 3.8 including the clause:  “If conditions during a rain stoppage improve and the rain is reduced to drizzle, the umpires must consider if they would have suspended play in the first place under similar conditions. If both on-field umpires agree that the current drizzle would not have caused a stoppage, then play shall resume immediately.”

It was certainly unusual and many test matches have resumed in drizzle or mizzle heavier than what was seen that Friday night.  The consensus was the umpires might have been concerned about the effect of a wet outfield on the pink ball, a construction relatively new to cricket which attempts to emulate the behavior of the traditional red ball while remaining easily visible under the artificial lighting used for day-night matches.  It seems the pink ball is more affected by moisture than the traditional red or the white ball used in limited-overs competitions, tending to swell.

Mizzle & Drizzle protection: Lindsay Lohan in New York City, August 2013.

Friday, June 14, 2024

Lapidify

Lapidify (pronounced luh-pid-uh-fahy)

(1) To convert into stone or stony material; to petrify.

(2) To transform a material into something stony.

(3) Figuratively, to cause to become permanent; to solidify.

1620s: From the French lapidifier, from the Medieval Latin lapidificāre, the construct being the Latin lapis (stone) + -ify.  The origin of the Latin lapis is uncertain but there may be a link with the Ancient Greek λέπας (lépas) (bare rock, crag), which was from either the primitive Indo-European lep- (to peel) or a Mediterranean substrate language, most etymologists tending to favor the latter.  The -ify suffix was from the Middle English -ifien, from the Old French -ifier, from the Latin -ificare, from -ficus, from faciō (“make” or “do”).  It was used to produce verbs meaning “to make”; the alternative form was -fy.  The literal synonym in geology is petrify but also used (in various contexts) are set, harden, clarify, solidify, calcify, mineralize & fossilize.  Lapidify, lapidifies, lapidifying & lapidified are verbs, lapidification is a noun and lapidific & lapidifical are adjectives; the noun plural is lapidifications.

Medusa

In Greek mythology, Medusa (from the Ancient Greek Μέδουσα (Médousa), from μέδω (médō) (rule over)) was the youngest of the three Gorgon sisters and, among them, the sole mortal.  In the popular imagination it seems to be believed that only the gaze of Medusa had the power to turn men to stone but her sisters Stheno & Euryale also possessed the gift.  The three were the daughters of Phorcys & Ceto who lived in the far west; the heads of the girls were entwined with writhing snakes and their necks protected with the scales of dragons, while they had huge, boar-like tusks, hands of bronze and golden wings.  That alone would have made dating a challenge but anyone who had the misfortune to encounter them was turned instantly to stone.  Only Poseidon (god of the sea and one of the Olympians, the son of Cronus & Rhea) didn’t fear their glance because he had coupled with Medusa and fathered a child (in some tales the ghastly Cyclops Polyphemus, which wasn’t encouraging, but the other Cyclopes were about as disagreeable).

Bust of Medusa in marble (1636) by Gianlorenzo Bernini (1598-1680), Museos Capitolinos. Palazzo dei Conservatori, Rome, Italy (left) and Lindsay Lohan in Medusa mode, Confessions of a Teenage Drama Queen (2004) (right).

Born in great secrecy, Perseus was the son of Zeus & Danae but one day Danae’s father Acrisius heard the baby’s cry and, enraged that Zeus had seduced his daughter, had mother & child sealed in a wooden chest and cast into the sea; it washed up on the shores of the island of Seriphos, the pair rescued by the fisherman Dictys, brother of the ruling tyrant Polydectes.  When Perseus grew, he found himself one day at one of Polydectes' banquets and when the guests were asked what gift they would offer their host, all except Perseus suggested horses.  He instead offered to bring to the table the severed head of Medusa.  It’s not clear if this was intended as a serious suggestion (wine may have been involved) but the tyrant insisted, saying that otherwise he would take Danae by force.  Embarking on this unpromising quest, Perseus was helped by Hermes & Athena who took him to the Graeae; they showed him the way to the nymphs who lent him winged sandals, a kibisis (the backpack of the gods) and the helmet of Hades, which rendered the wearer invisible.  Hermes armed him with the harpe, a sickle made of adamant.

Thus equipped, Perseus and Athena began the hunt for the Gorgons.  Of the three sisters, only Medusa was mortal so the project of decapitation had at least some theoretical prospect of success.  The far west was a bleak and uninviting place to which few travelled and they had little trouble in finding their lair, outside which they lay in wait until the family slept.  After midnight, when Medusa had fallen into a deep slumber, Perseus rose into the air on the nymphs’ winged sandals, and, while Athena held a shield of polished bronze over Medusa so it acted as a mirror, protecting them from her gaze, Perseus wielded his harpe, in one stroke striking head from shoulders.  Instantly, from the bloodied neck sprang Pegasus the winged horse and Chrysaor the giant.  Perseus stashed the severed head in the kibisis and quickly took flight for home, pursued by a vengeful Stheno & Euryale but, concealed by the helmet’s cloak of invisibility, he evaded them.  Arriving in Seriphos, he became enraged after discovering Polydectes had attempted to rape Danae, who had been compelled to seek refuge at the altars of the gods.  Perseus took Medusa’s head from the backpack and held the visage before Polydectes, lapidifying him in an instant, and declared his rescuer Dictys the island’s new ruler.  The invaluable accessories he returned to the nymphs while Athena set the head of Medusa in the middle of her shield, meaning she now possessed the power of lapidification.

Tuesday, January 11, 2022

Modernism

Modernism (pronounced mod-er-niz-uhm)

(1) A modern usage or characteristic; modern character, tendencies, or values; adherence to or sympathy with what is modern.

(2) In theology, a movement in Roman Catholic thought that sought to interpret the teachings of the Church in the light of philosophic and scientific conceptions prevalent in the late nineteenth and early twentieth centuries.  The movement was condemned in 1907 by Pope Pius X (always with initial capital).

(3) In theology, the liberal theological tendency in Protestantism which emerged in the twentieth century and became the intellectual core of the moderate wings in many denominations, most obviously displayed in the interplay between the liberal and evangelical factions at recent Lambeth Conferences.

(4) A deliberate philosophical and practical estrangement or divergence from the techniques and practices of the past in the arts and literature, occurring from late in the nineteenth century and (a little arbitrarily) thought to have concluded in the post-war years.  Taking the form of any of various innovative movements and styles, it was an aesthetic descriptor used mostly in architecture, literature and art to label work thought modern in character, quality of thought, expression or technique (sometimes with initial capital).

1730-1740: The construct is modern + -ism.  Modern was from the Middle French moderne, from the Late Latin modernus (modern), from the Classical Latin modo (just now; in a (certain) manner), from the adverb modo (to the measure), originally ablative of modus (manner; measure) and hence, by measure, "just now".  Root was the primitive Indo-European med- (take appropriate measures).  The adjective modern entered English circa 1500 in the sense of "now existing", the meaning "of or pertaining to present or recent times" attested from the 1580s.  The verb modernize (give a modern character or appearance to, cause to conform to modern ideas, adapt to modern persons) has existed since the 1680s, thought probably a borrowing from the French moderniser rather than a native coining.

In the narrow, technical sense, modern began in English with a broad meaning, simply describing that which was neither ancient nor medieval, but it quickly gained niches, Shakespeare using it to indicate something "every-day, ordinary, commonplace", and the meaning "not antiquated or obsolete, in harmony with present ways" was formalized by the early nineteenth century.  Formally, the scientific linguistic division of historical languages into old, middle, and modern began only in the nineteenth century; the first university department of modern languages (ie those still living (unlike Latin and Ancient Greek) and of some literary or historical importance) was created in 1821, although the use of Modern English can be traced from at least circa 1607 when jurist John Cowell (1554–1611) published The Interpreter, a dictionary of law which, inter alia, examined and explained the words of Old and Middle English.  The extended form modern-day was noted from 1872; modern dance is attested by 1912; modern conveniences (increasingly electrically-powered labour-saving devices) dates from 1926; modern jazz was used in 1954 and the slang abbreviation mod (tidy, sophisticated teen-ager (and one contrasted usually with a disreputable rocker or some other delinquent)) dates from a surprisingly early 1960.

There have been references to modern art since 1807 but it was then simply a chronological acknowledgement, in the sense of something being recent rather than a recognizable style or school; that sense, “work representing a departure from or repudiation of traditional or accepted styles”, is not attested until 1895, the idea by 1897 extended to the individual, a modern person being one thought “thoroughly up to date”.  Although it would take decades to assume its (now) modern meaning, as an adjective, post-modern appears first to have been used in 1919 and by the late 1940s had become common in the language of architects (presumably the first among the modernists to feel no longer modern), use extending to the arts in the 1960s, infecting politics, literature and just about every form of criticism.

The –ism suffix is from the Ancient Greek ισμός (ismós) & -isma noun suffixes, often directly, sometimes through the Latin –ismus & -isma (from where English picked up ize) and sometimes through the French –isme or the German –ismus, all ultimately from the Ancient Greek (where it tended more specifically to express a finished act or thing done).  It appeared in loanwords from Greek, where it was used to form abstract nouns of action, state, condition or doctrine from verbs and on this model, was used as a productive suffix in the formation of nouns denoting action or practice, state or condition, principles, doctrines, a usage or characteristic, devotion or adherence (criticism; barbarism; Darwinism; despotism; plagiarism; realism; witticism etc).

Tatra T77, Czechoslovakia 1934.

Confusion is sometimes attached to the words “modernity” and “modernism”.  The noun modernity, dating from the 1620s, meant simply the "quality or state of being modern", an elaborated way of saying “recent”, the sense "something that is modern" noted since 1733.  As a word to describe specifically the epoch which began in the early twentieth century, it was in use before the First World War.  The sense of that modernity was the particular combination of circumstances and forces which had coalesced to produce in Europe and North America a society which had in technical and other ways progressed with greater rapidity than any in history: (1) A long period of general peace, (2) a level of prosperity, which although uneven and often erratic, was unprecedented, (3) an array of technical innovations (electricity, radio, flight, telegraphs, internal combustion engines etc), (4) increasing literacy and a proliferation of mass-media, (5) an on-rush of new and transformative theories in psychology, philosophy & science (Freud, Darwin, Einstein et al), and (6) a relaxation of historically censorious attitudes which allowed innovations in art and literature to flourish.

Woman with a Hat (1905) by Henri Matisse (1869-1954).

Modernism, a word Dr Johnson (1709-1784) claimed was invented by the novelist Jonathan Swift (1667–1745) to suggest any "deviation from the ancient and classical manner", had by 1830 come to refer generally to "modern ways and styles" but since the mid-1920s, it’s been used exclusively to describe the movements in art, music, architecture and literature which depart in some radical way from classical and other traditional antecedents.  By 1925, the noun modernist was applied to a producer or follower of the work of the movement.  In the 1580s it had meant "one who admires or prefers the modern"; something similar but without the political and other connotations modernity imposed.

Monument House of the Bulgarian Communist Party (Buzludzha Monument), Buzludzha Peak, Bulgaria (1974-1981).  Now a ruin.

Modernism, even considered separately within the genres where the label was adopted, had no coherence of style, intent or result, its significance lying in what it was not, in that it deliberately aimed to depart from classical and traditional forms.  Although it’s contested still, much of what began as modernist art is attributed to the modernists’ growing alienation from the strictures and conventions which had characterized their formative years during the Victorian age.  It defined an epoch and, somewhere in the post-war period, became part of the history of art and architecture, and for those for whom modernism is now just another product, such as art dealers, there’s really no distinction between modernist and postmodernist phases, both grouped under the rubric of Modern Art.

Portrait of Wilhelm Uhde (1910) by Pablo Picasso (1881-1973), analytical cubism from his cubist period.

Modernist literature was influenced by the bleak surroundings created by urbanization and industrialization but it was the blast of the Great War which left writers grasping for a response to a much-changed world.  The roots of modernist literature can be found in pre-war works but the war undermined faith in the foundations of Western society and culture, which early in the century had been viewed with an optimism it’s hard now to convey.  An inescapable element in modernism was a rejection of beauty, as if humanity had proved itself somehow unworthy of loveliness, and modernist literature reflected disillusionment and betrayal.  TS Eliot’s (1888-1965) epic-length poem The Waste Land (1922) is emblematic, reflecting the search for redemption in a spiritually empty landscape; technically, its discordant images and deliberate obscurity typify modernism and preview the techniques developed in post-modernism which demand the reader be involved in interpreting and deconstructing the text.

Marilyn Monroe (1926-1962) reading Ulysses, 1955.

Modernist fiction however, in content or technique, wasn’t monolithic and even to label it a genre requires that word to adopt a nuance of meaning; the works could explore the life of one, of a group or of all humanity, and ranged from the nihilistic to the utopian.  The great landmark was James Joyce’s (1882-1941) Ulysses (1922), a work of sometimes impenetrable density which adopted the even-then familiar device of the stream of consciousness to document the events of one day in the lives of three Dubliners, the idea being the abandonment of any structural order, conventional sentence construction sacrificed in an attempt to capture the fragmentary nature of human thought.  Not all readers were seduced by the approach but Ulysses picked up a cult following which endures to this day, some of the adherents compelled even to praise Joyce’s Finnegans Wake (1939), a work in which a few find rare genius while others conclude it's merely modernism pursued to its inevitably unintelligible conclusion.  Reading Joyce can be hard work although Anthony Burgess (1917–1993), a fair critic of modernism, literary and otherwise, claimed to find a laugh on every page of Finnegans Wake.  For most, it's no fun at all.

Paysage coloré aux oiseaux aquatiques (Colored Landscape with Aquatic Birds, circa 1907) by Jean Metzinger (1883-1956).

In painting, the roots really lie in mannerism although elements can be traced through thousands of years of what is, strictly speaking, non-representational art, exemplified in many aspects by the sometimes (and not deliberately) almost abstract Italian primitives created between the late eleventh and mid-thirteenth centuries.  However, historians regard the first of the modernists as Édouard Manet (1832-1883) who, from his early thirties, began to distort perspective and devalue representation, using techniques of brushstroke to emphasize the very nature of the flat canvas on which he worked, something artists had for centuries crafted to disguise.  In his wake came the rush of -isms that would characterize modernism: Impressionism, Post-Impressionism, Cubism, Constructivism, Futurism & Expressionism, an array of visions which would see the techniques and physical construction of the works vie for attention (at least among fellow artists and critics) with what was once thought the art itself.

TWA Flight Center (1959-1962) by Eero Saarinen and Associates, John F Kennedy International Airport, New York.

In architecture, while not literally technologically deterministic, advances in the available technologies made possible the abandonment of earlier necessities.  However, while the existence of steel frames and increasingly complex shapes rendered in reinforced concrete may have allowed what came to be known as the “new international style”, neither precluded the ongoing use of decoration and embellishment, the discarding of which was a hallmark of modernist architecture.  It was the choice of the architects to draw simple geometric shapes with unadorned facades, a starkness associated especially with the mid-to-late twentieth century steel and glass skyscrapers.  Architects are less keen to be associated with some other high-rise buildings of the era, especially the grim housing projects which blighted so much of the urban redevelopment of the post-war years; in many of these there was none of the awful grandeur of the best neo-brutalism.

Tuesday, January 18, 2022

Phlogiston

Phlogiston (pronounced floh-jis-ton or floh-jis-tuhn)

In chemistry, a hypothetical colorless, odorless, weightless substance once believed to be the combustible part of all flammable substances and given off as flame during burning; sometimes styled poetically as the “fiery principle”.

1610-1620: From the New Latin phlogiston, from the Ancient Greek φλογιστόν (phlogistón), neuter of φλογιστός (phlogistós) (burnt up, inflammable), from φλογίζω (phlogízō) (to set fire to), from φλόξ (phlóx) (flame).  The most familiar Greek forms were phlogizein (to set alight) and phlegein (to burn).  Root was the primitive Indo-European bhel (to shine, flash, burn (also “shining white”)).  Bhel proved most productive, used especially when forming words for bright colors and was part of beluga; Beltane; black; blancmange; blanch; blank; blanket; blaze (as in "bright flame, fire"); bleach; bleak; blemish; blench; blende; blend; blind; blindfold; blitzkrieg; blond; blue; blush; conflagration; deflagration; effulgence; effulgent; flagrant; flambe; flambeau; flamboyant; flame; flamingo; flammable; Flavian; Flavius; fulgent; fulminate; inflame; inflammable; phlegm; phlegmatic; phlogiston; phlox; purblind; refulgent & riboflavin.  As well as the Ancient Greek phlegein (to burn), the word was apparently related to the Sanskrit bhrajate (shines), the Latin flamma (flame), fulmen (lightning), fulgere (to shine, flash) & flagrare (to burn, blaze, glow), the Old Church Slavonic belu (white) and the Lithuanian balnas (pale).  The related forms were phlogistic, phlogisticating, phlogistication & phlogisticated and the scientific necessity of the age also demanded the creation of the verb dephlogisticate (deprive of phlogiston), thus also dephlogisticated, dephlogisticating & dephlogistication.

Alchemy & Chemistry

Just as the surgeons emerged from the barber’s shop, the chemists were once alchemists.  Chemistry began as alchemy, once a respectable branch of learning concerned, inter alia, with the study and purification of materials; the dubious reputation alchemy now suffers is because of the fixation in popular culture on its work in developing chrysopoeia, the chemical process of transmuting “base metals” such as lead into "noble metals", especially gold.  That particular notion of molecular re-arrangement proved a cul-de-sac but some of the laboratory techniques and experimental models developed in medieval alchemy remain in use today.

One pioneer of modern chemistry was German chemist & physician Georg Stahl (1660–1734) who devoted much attention to the fundamental nature of combustion: What happens when stuff burns?  Developing an idea first proposed in 1667 by German physician & alchemist Joachim Becher (1635–1682), in 1702, Stahl proposed that all inflammable objects contained a material substance he called “phlogiston”, from the Greek word meaning “to set on fire”.  When something burned, it liberated its content of phlogiston into the air and Stahl believed it to be chemically inert.  Stahl’s phlogiston theory would dominate scientific thinking for a century.

Phlogiston theory for a while survived even the odd inconvenient truth.  When experiments revealed that burning (oxidizing) a piece of metal resulted in it weighing more rather than less (contrary to phlogiston theory which suggested it would be lighter by the weight of the evacuated phlogiston), the inconsistency was resolved by postulating that phlogiston was either (1) an immaterial principle rather than a material substance (2), phlogiston had a negative weight or (3), phlogiston was lighter than air.  So much did the theory become scientific orthodoxy that when chemists isolated hydrogen, it was celebrated as pure phlogiston.
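The inconvenient arithmetic is easy to sketch.  Using modern molar masses (reference values, not figures from the original sources), the complete combustion of magnesium to magnesium oxide shows the mass gain which phlogiston theory could not accommodate without the contortions described above:

```python
# Why burning metal gains weight: the oxygen taken from the air is
# incorporated into the oxide, so the product is heavier than the
# metal, contrary to phlogiston theory, which predicted a loss of
# mass as phlogiston escaped.  Molar masses (g/mol) are modern
# reference values, an assumption not drawn from the original text.
MOLAR_MASS = {"Mg": 24.305, "O": 15.999}

def oxidation_mass(metal_grams: float) -> float:
    """Mass of MgO produced when metal_grams of magnesium burns
    completely via 2 Mg + O2 -> 2 MgO (one O atom per Mg atom)."""
    moles_mg = metal_grams / MOLAR_MASS["Mg"]
    return metal_grams + moles_mg * MOLAR_MASS["O"]

before = 10.0                       # grams of magnesium burned
after = oxidation_mass(before)
print(f"{before:.1f} g Mg -> {after:.2f} g MgO")  # heavier, not lighter
assert after > before               # the observation that doomed phlogiston
```

Ten grams of magnesium yields roughly 16.58 g of oxide; the extra mass is exactly the oxygen bound from the air, the observation Lavoisier's framework explained without auxiliary hypotheses.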

The execution of Lavoisier, woodcut by unknown artist.  Antoine-Laurent de Lavoisier (1743-1794) was a nobleman and a tax collector, neither quality likely much to appeal to the mob which prevailed after the French Revolution.  In 1794 he and twenty-seven other tax-farmers were executed by guillotine in Paris at the Place de la Révolution (now the Place de la Concorde).  A fellow scientist at the time lamented: "It took them only an instant to cut off that head, and one hundred years might not be sufficient to produce another like it.”

It would be decades before those with doubts, and there were a few, systemized their objections into an alternative theory.  In 1775, French chemist Antoine-Laurent de Lavoisier delivered a paper he called Memoir on the nature of the principle which combines with metals during their calcination [oxidation] and which increases their weight to a meeting of the French Royal Academy of Sciences; it was subsequently published in 1778.  Lavoisier named the combustible part of air principe oxigine (acidifying principle) from the Ancient Greek, the construct being ὀξύς (oxús) (sharp) + γένος (génos) (birth), referring to his erroneous belief that oxygen was a vital component of all acids, hence his choice of “acid producing”.  The French adopted the variant principe oxygène and in English it became oxygen.  The fraction of air that does not support combustion he called azote (no life), from the Ancient Greek, the construct being ἀ- (a-) (without) + ζωή (zōḗ) (life), the idea being the substance was incapable of sustaining life.  Azote is now called “nitrogen”, from the French nitrogène, the construct being nitro- (from the Ancient Greek νίτρον (nítron) (sodium carbonate)) + the French gène (producing).  From this paper, which eventually laid to rest phlogiston theory, emerged the foundations for the understanding of chemical reactions as combinations of elements which form new materials; the birth of modern chemistry.  Lavoisier’s model was convincingly elegant but there were those in the scientific establishment with reputations vested in phlogiston theory and some would prove recalcitrant.  Even when the existence of oxygen and nitrogen had become widely accepted, some remained so inculcated they felt compelled to integrate the old with the new, oxygen and nitrogen a filter through which to view phlogiston; a construction of reality which in the post-Trumpian world would be called “alternative facts”.

Most famous was the eminent English chemist Joseph Priestley (1733-1804) who, even after personally identifying oxygen and well after Lavoisier's paper had persuaded nearly all others, insisted oxygen was but “dephlogisticated air” and in his 1796 paper Considerations on the doctrine of phlogiston and the decomposition of water, he labeled Lavoisier's devotees as “Antiphlogistians”, objecting to the idea of some “theory so new” and based on “so very narrow and precarious a foundation” suddenly overturning “the best established chemistry”.  Two centuries later, a similarly doomed rearguard action would be fought by the “steady-staters” against the big-bang theory explaining the origins of the universe.  Until his dying day, Priestley never accepted the invalidation of phlogiston theory but the increasingly complicated modifications he, and a dwindling few others, bolted-on to make it conform with the undeniable implications of Lavoisier’s model were unconvincing and by the turn of the nineteenth century, phlogiston’s days were over.