Wednesday, February 21, 2024

Waterfall

Waterfall (pronounced waw-ter-fawl or wot-er-fawl)

(1) A steep fall or flow of water in a watercourse from a height, as over a precipice; a cascade of falling water where there is a vertical or almost vertical step in a river.

(2) A hair-style using long, loose “waves”.

(3) As “waterfall development”, “waterfall management” and “the waterfall model”, descriptions of product research & development (R&D) (especially in tech) including sequential stages, from conception and design through testing and implementation, hopefully to result in a final delivered product.

(4) Figuratively, any waterfall-like outpouring of liquid, smoke etc.

(5) In slang (originally US but now widespread), the action of drinking from a vessel without touching it with the lips (a sanitary precaution with shared vessels).

(6) In the smoking of weed, a particular design of bong.
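The sequential logic of sense (3) can be sketched in a few lines of code; the stage names below are illustrative assumptions, not drawn from any particular methodology text:

```python
# A minimal sketch of the "waterfall model" (sense 3): each stage must
# be completed before the next begins and there is no iterating back.
WATERFALL_STAGES = [
    "conception",
    "design",
    "implementation",
    "testing",
    "delivery",
]

def run_waterfall(stages):
    """Work through each stage strictly in order, returning the completed sequence."""
    completed = []
    for stage in stages:
        # In the waterfall model a stage is entered only once its
        # predecessor has been signed off.
        completed.append(stage)
    return completed

print(run_waterfall(WATERFALL_STAGES))
```

The point of the sketch is the strict ordering: unlike iterative models, nothing loops back to an earlier stage.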

Pre 1000: From the Middle English waterfal & waterfalle, from the Old English wæterġefeall (waterfall) and cognate with the Old Norse vatnfall, the West Frisian wetterfal (waterfall), the Dutch waterval (waterfall), the German Wasserfall (waterfall) and the Swedish vattenfall (waterfall).  The colloquial uses to describe (1) a necktie, (2) a cravat, (3) a chignon (in hair-styling, a low bun or knot positioned at or close to the nape of the neck) or (4) a beard are now effectively extinct.  Waterfall’s synonyms in general use (though hydrologists are more precise) include cascade, cataract, sault (old Canadian slang more often used of river rapids) and the clipping falls.  Waterfall is a noun, verb & adjective and waterfalling & waterfalled are verbs; the noun plural is waterfalls.

The construct was water + fall and the Modern English spelling appears to have been a re-formation from around the turn of the sixteenth century.  The noun “water” was from the Old English wæter (water), from the Proto-West Germanic watar, from the Proto-Germanic watōr (water), from the primitive Indo-European wódr̥ (water).  The verb “water” was from the Middle English wateren, from the Old English wæterian, from the Proto-Germanic watrōną & watrijaną, from the Proto-Germanic watōr (water), from the primitive Indo-European wódr̥ (water).  The noun “fall” was from the Middle English fal, fall & falle, from the Old English feall & ġefeall (a falling, fall) and the Old English fealle (trap, snare), from the Proto-Germanic fallą & fallaz (a fall, trap).  It was cognate with the Dutch val, the German Fall (fall) & Falle (trap, snare), the Danish fald, the Swedish fall and the Icelandic fall.  The verb “fall” was from the Middle English fallen, from the Old English feallan (to fall, fail, decay, die, attack), from the Proto-West Germanic fallan (to fall), from the Proto-Germanic fallaną (to fall).  It was cognate with the West Frisian falle (to fall), the Low German fallen (to fall), the Dutch vallen (to fall), the German fallen (to fall), the Danish falde (to fall), the Norwegian Bokmål falle (to fall), the Norwegian Nynorsk falla (to fall), the Icelandic falla (to fall), the Albanian fal (forgive, pray, salute, greet) and the Lithuanian pùlti (to attack, rush).

Two views of Niagara Falls:  Between June-November 1969 (left), a temporary dam was built to stem the usual flow so geological studies could be conducted to ascertain the condition of the rocks and assess the extent of erosion.  After rectification work was carried out, the temporary structure was dynamited, an event promoted as a tourist attraction.  In 1885 (right), the falls underwent one of their occasional freezes.  Usually, these are what hydrologists call "partial freezes" (of late there have been a few: 2014, 2017 & 2019), the only (almost) "total freeze" having been recorded in 1848, although that was induced by the accumulation of ice on Lake Erie which caused a "natural dam" to form, stopping the flow of water to the Niagara River.  It was this rather than a "total freeze" of the falls which caused the phenomenon.

Lindsay Lohan with waterfall, Guanacaste Gold Coast, Costa Rica, January 2016.

For most of us, we know a waterfall when we see one: it’s a point in a waterway (usually a river) where the water falls over a steep drop that is close to literally vertical.  However, among hydrologists, there’s no agreed definition about the margins, such as when something ceases to be rapids and becomes a waterfall, some insisting that what lay-people casually call “waterfalls” are really “cataracts” or “cascades”.  To most of us there to admire the view, it’s a tiresome technical squabble among specialists but among themselves they seem happy for the debate to continue and some have even suggested precise metrics which can be mapped onto any formation.

Wasserfall (Waterfall), the embryonic SAM

Wasserfall (project Waterfall) was an early SAM (surface to air missile) developed by the Nazi armaments industry.  Although never used, it was highly influential in the post-war years.  In his memoirs (Inside the Third Reich (1969)), Albert Speer (1905–1981; Nazi court architect 1934-1942; Nazi minister of armaments and war production 1942-1945) discussed both the weapons systems with which he as minister was usually in some way connected and the political in-fighting and inter-service rivalries which hampered their development.  Although his writings are not wholly reliable (there was much he chose not to say on his contribution to anti-Jewish measures and his knowledge of the Holocaust), on industrial and technical matters historians regard his account as substantially accurate (if incomplete).  Interestingly, after reading in Spandau prison a smuggled copy of the memoir (Ten Years and Twenty Days (1958)) of Karl Dönitz (1891–1980; as Grand Admiral head of the German Navy 1943-1945, German head of state 1945), who had been a fellow prisoner for the first decade of Speer’s twenty-year sentence, without any sense of irony he remarked in his (extensively edited) prison journal (Spandau: The Secret Diaries (1975)):

Where he discusses military operations and the questions of armaments, the book is interesting and probably also reliable.  His political attitude, on the other hand, his relationship to Hitler, his childish faith in National Socialism – all that he either wraps in silence or spins a veil of sailor’s yarns.  This is the book of a man without insight.

Speer re-invented himself by wrapping in veils of silence anything too unpleasant to admit and spun plenty of veils so appealing that for decades there were many who, for various reasons, draped them over his past.  He wasn’t a man without insight but compared with Dönitz, he had much more guilt to conceal and thus more need of selective silence & spin.

Speer regarded the regime’s failure to devote the necessary resources to the Wasserfall project as one of Adolf Hitler's (1889-1945; Führer (leader) and German head of government 1933-1945 & head of state 1934-1945) many strategic blunders which, by 1943, had made defeat inevitable.  Having delayed development of the revolutionary Messerschmitt Me 262 jet fighter (deployed at mass scale, it would have been a devastating weapon against the Allied bomber fleets then laying waste to German cities and industry), Hitler took the decision to afford the highest priority to the A4 (better known as the V2) rocket to retaliate against English cities; psychologically, Hitler always wanted to be on the offensive and would later appal the experts by demanding the Me 262 be re-designed as a fast, light bomber.  As a delivery system the V2 was a decade ahead of its time and there was then no defense against the thing but it was a hugely expensive and resource-intensive way to deliver an explosive load of under a tonne.  As Speer noted, even if it became possible to produce and fire the projected 900 a month, that would mean a daily bomb-load of some 24 tonnes falling on England, and that at a time when the Allied bomber groups were on average dropping some 3000 tonnes a day on German targets.  Hitler wasn’t wrong in predicting the use of the V2 against civilian targets would have an effect well beyond the measure of the tonnage delivered and the historians who claimed the disruption to the Allied war effort caused by the V1 (an early cruise missile) & V2 was “negligible” were simply wrong but, to have been an effective strategic weapon, at least hundreds of V2s a day would need to have found their targets.

Captured blueprints and photographs from the Wasserfall project's development. 

Speer admitted he “not only went along with this decision on Hitler's part but also supported it. That was probably one of my most serious mistakes.  We would have done much better to focus our efforts on manufacturing a ground-to-air defensive rocket.  It had already been developed in 1942, under the code name Wasserfall (Waterfall), to such a point that mass production would soon have been possible, had we utilized the talents of those technicians and scientists busy with [the V2] under Wernher von Braun (1912–1977).”

He added that von Braun’s team was employed to develop weapons “for the army, whereas air defense was a matter for the air force.  Given the conflict of interests and the fierce ambitions of the army and the air force, the army would never have allowed its rival to take over the installations it had built up…”  The difference in resource allocation was stark: more than ten times the number of technical staff worked on the V2 as on Waterfall and other anti-aircraft rocket projects (such as the small Taifun (Typhoon)).  The attraction of the anti-aircraft rockets was obvious as Speer noted: “Waterfall was capable of carrying approximately six hundred and sixty pounds of explosives along a directional beam up to an altitude of fifty thousand feet and hit enemy bombers with great accuracy.  It was not affected by day or night, by clouds, cold, or fog. Since we were later able to turn out nine hundred of the offensive big rockets monthly, we could surely have produced several thousand of these smaller and less expensive rockets per month. To this day I think that this rocket, in conjunction with the jet fighters, would have beaten back the Western Allies' air offensive against our industry from the spring of 1944 on.  Instead, gigantic effort and expense went into developing and manufacturing long-range rockets which proved to be, when they were at last ready for use in the autumn of 1944, an almost total failure [a comment which, combined with Allied propaganda and disinformation, influenced for decades many post-war historians].  Our most expensive project was also our most foolish one. Those rockets, which were our pride and for a time my favorite armaments project, proved to be nothing but a mistaken investment. On top of that, they were one of the reasons we lost the defensive war in the air.”

Whether a mass-produced Waterfall would have been an effective weapon against the mass-bomber formations has divided analysts.  While the technology to produce a reliable directional mechanism had been mastered, what Germany never possessed was a proximity fuse which would have enabled the explosive charge to be triggered when a bomber was within range; instead the devices relied on impact or pre-set detonators.  Presumably, had other projects been suspended and the resources re-directed to Waterfall, mass production might have been possible and, even if only partially successful, to disrupt a bombing offensive it was necessary only to inflict an ongoing 5-10% loss rate to make the campaign unsustainable.  Given the inevitable counter-measures, even that would likely have proved challenging but economic reality meant Waterfall probably did offer a more attractive path than the spectacular V2 and, given the success in related fields, it was not impossible that had priority been granted, proximity fuses and other technical improvements might rapidly have appeared.  As it was, Waterfall (like Typhoon, the Me 262, the V2 and an extraordinary range of other intriguing projects) was the subject of a post-war race between the Russians, the Americans and the British, all anxious to gather up the plans, prototypes and personnel of what were clearly the next generation of weapons.  As a proof-of-concept exercise Waterfall was convincing and within years SAMs were a vital component of defensive systems in most militaries.

The waterfall motif: Grill on the 1975 Imperial LeBaron Crown Coupe (left) and the Liebian International Building in China (right).

In design, "waterfall" can be a motif such as used for the grill on the 1975 Imperial LeBaron Crown Coupe.  It can also be literal and architects have many times integrated water-flows as an external design element but at 108 metres (354 feet) high, the one on the façade of the Liebian International Building in south-west China is easily the world’s tallest.  An eye-catching sight, the waterfall isn't run all that often (which must disappoint influencers who turn up with cameras ready) because it’s said to cost some 900 yuan (US$125) per hour just to pump the water to the top and, with the downturn in the property market, the building's revenues have fallen short of expectation.  When completed and displayed in 2016, the waterfall attracted some criticism on environmental grounds, water shortages far from unknown in China although the builders (Ludi Industry Group) pointed out the signature feature uses storm-water runoff, rainwater and groundwater, all stored in vast underground tanks.  It may for a while be the last example of exuberance to show up among China's skyscrapers, Xi Jinping (b 1953; general secretary of the Chinese Communist Party (CCP) and paramount leader of the People's Republic of China (PRC) since 2013) in 2014 calling for an end to what he called "weird architecture".  Mr Xi thinks buildings should be "suitable, economic, green and pleasing to the eye" rather than "oversized, xenocentric & weird".  Those skilled at reading between the CCP's lines decided the president had called the architects "formalists".  They would have taken note.

On TikTok, a small but active community of those who find waterfalls mesmerizing post video clips.

Tuesday, February 20, 2024

Nefandous

Nefandous (pronounced nef-and-us or nef-fandus)

(1) Not to be spoken of (archaic).

(2) Unspeakable, appalling; abominable, shocking to reasonable senses (rare).

1630s: From the Latin nefandus (unmentionable, impious, heinous), the construct being ne- (the negative particle: “not”) + fandus (to be spoken), gerundive of fārī (to speak), from the primitive Indo-European root bha (to speak, tell, say).  Nefandous is an adjective.  Although not obviously a word needing an intensifier, the comparative is “more nefandous” and the superlative “most nefandous”.

Google's ngrams trace the use of words but because of the way the data is harvested, the numbers represented by the ngrams are not of necessity accurate but, over decades, probably are broadly indicative.  While the numbers do bounce around a bit, it would seem that in British English (lower chart), use of "nefandous" was not infrequent in the nineteenth century while the most recent spike was during the 1930s; while politically and financially a troubled decade, any suggestion of a causal link with use would be speculative.  In US English (upper chart) use appears also to have declined after the nineteenth century, the most recent spike in the use of "nefandous" coinciding with the 2016 presidential campaign; again, to suggest any link with Donald Trump (b 1946; US president 2017-2021) or crooked Hillary Clinton (b 1947; US secretary of state 2009-2013) would be speculative.  With the 2024 election threatening to be a re-run of 2020 (something quite a few seem to think variously unspeakable, unthinkable or unmentionable), there may be another revival of the word.

The extinct nineteenth century formations were the noun nefandousness and the adverb nefandously; as an expression of character, nefandousness briefly found uses but the adverb was just silly.  Both seem to have followed the example of nefariousness & nefariously which is etymologically distant although in meaning there’s some overlap, those labelled nefandous often associated with things nefarious (sinful, villainous, criminal, or wicked).  Dating from the late sixteenth century, nefarious was from the Latin nefārius (execrable, abominable), from nefās (that which is contrary to divine law, an impious deed, a sin, crime), the construct being ne- (the negative particle: “not”) + fās (the dictates of religion, divine law), related to the Latin forms meaning “I speak, I say” (thus the link with nefandous) and cognate with the Ancient Greek φημί (phēmí) (I say).

Unspeakable, unthinkable, unmentionable

Although the word "nefarious" is now rare, the idea is often expressed in the term "unspeakable", used to describe anything from crimes against fashion to mass murderers.  There was also the use use of "unmentionable" as a euphemism for a lady's underwear (usually in the plural as "her (or my) unmentionables") and although sometimes cited as an example of prudery in Victorian England, the evidence of use at the time suggests it was often something jocular or ironic.  However, there was also the notion of "unspeakable" a piece of literal positive law.  In Asia Minor (near present-day Selcuk, Türkiye), in a sacred grove not far from the city of Ephesus, stood the Great Temple of Artemis (also known as the Temple of Diana), one of the Seven Wonders of the Ancient World. During the evening of 21 July, 356 BC, Herostratus (also called Erostratus) of Ephesus saturated the timber and fabric furnishings of the temple with gallons of oil and when all was thoroughly soaked, he set fires in many places, inside and out.  Within minutes, as he had planned, the fire was uncontrollable and the temple doomed.  Coincidently, on the day the temple was razed, Alexander the Great (356-323 DC) was born.

St. Paul Preaching in Ephesus Before the Temple of Artemis (1885), by Adolf Pirsch (1858-1929).

Herostratus was apparently a wholly undistinguished and previously obscure citizen, different from others only in his desire to be famous and the lengths to which he was prepared to go to achieve that fame.  As shocked Ephesians rushed to the fire, Herostratus met them and proudly proclaimed his deed, telling them his name would for all eternity be remembered as the man who burned down the Great Temple of Artemis and razed one of the wonders of the world.  Herostratus was, as he expected, executed for his arson.  In an attempt to deny him the fame he craved, the Ephesians passed the damnatio memoriae law, making it a capital crime ever to speak of him or his deed.  However, it proved impossible to suppress the truth about such an event; the historian Theopompus (circa 380–circa 315 BC) relates the story in his Philippica and it later appears in the works of the historian Strabo (circa 64 BC–circa 24 AD).  His name thus became a metonym for someone who commits a criminal act in order to become noted.  Subsequent attempts to erase names from history by declaring them unspeakable (tried on a grand scale by comrade Stalin (1878-1953; Soviet leader 1924-1953) and the Kim dynasty in the DPRK (North Korea)) seem always to fail.

It's unfortunate history didn't unfold so that Android and iOS were available in 356 BC, so Herostratus could have played Lindsay Lohan's The Price of Fame instead of turning to arson.  The game was said to be "a parody on celebrity culture and paparazzi" and enabled players to become world-famous celebrities by creating an avatar which could "purchase outfits, accessories, toys and even pets".  Played well, he could have entered a virtual herostratisphere and the temple might stand today.  As Ms Lohan would understand, the tale of Herostratus reminds all that for everything one does, there's a price to be paid.

Like many of the tales from antiquity, the story of destruction by arson is doubted.  Various conjectures have been offered, some of which doubt the technical possibility of what Herostratus is said to have done, some claiming it was a kind of inside job by the temple’s priests who had their own reasons for wanting a new building, and there is even a reference in the writings of Aristotle which offers a lightning strike as the catalyst for the conflagration.  However, whatever did or didn’t happen in 356 BC, the word herostratic, to describe one who seeks fame at any cost, has endured, the attempt to make his name unspeakable as doomed as the temple.

Monday, February 19, 2024

Asymptote

Asymptote (pronounced as-im-toht)

(1) In mathematics, a straight line which a curve approaches arbitrarily closely as it extends to infinity; the limit of the curve; its tangent “at an imaginary representation of infinity”.

(2) By extension, figuratively, that which comes near to but never meets something else (used in philosophy, politics, conflict resolution etc).

1650–1660: From the Greek asýmptōtos (not falling together).  The Ancient Greek ἀσύμπτωτη (asúmptōtē) was the feminine form used by Apollonius of Perga (Ἀπολλώνιος ὁ Περγαῖος; Apollonius Pergaeus (circa 240-190 BC)), an astronomer whose most noted contributions to mathematics were his equations exploring quadratic curves.  The construct of the Ancient Greek adjective ἀσύμπτωτος (asúmptōtos) (not falling together) was ἀ- (a-) (not) + σύμπτωτος (sýmptōtos) (falling together), the construct being σύν (syn-) (together) + πτωτός (ptōtós) (falling; fallen, inclined to fall), the construct of the latter being ptō- (a variant stem of píptein (to fall), from the primitive Indo-European root pet- (to rush; to fly)) + -tos (the verbid suffix).  The adjective asymptotic (having the characteristics of an asymptote) dates from the 1670s.  Asymptote is a noun & verb, asymptotia & asymptoter are nouns, asymptotic & asymptotical are adjectives, asymptoted & asymptoting are verbs and asymptotically is an adverb; the noun plural is asymptotes.

Lines, curves & infinity

The noun asymptote describes a straight line continually approaching but never meeting a curve, even if extended to infinity.  This means that although the distance between line and curve may tend towards zero, it can never reach that point, something hard to visualize but explained by the notion of the curve only ever being able to move half the distance required to achieve intersection.  At some point such a thing becomes impossible usefully to represent graphically and to define the asymptotic exactly using integer mathematics would be unmanageable, thus the use of the infinity symbol (∞).

Horizontal (left), vertical (centre) and oblique asymptotes (right).

There are (1) horizontal asymptotes (as x goes to infinity (in either direction, ie also negative (-) infinity), the curve approaches b, which has a constant value), (2) vertical asymptotes (as x (from either direction) approaches c (which has a constant value), the curve proceeds towards infinity (or -infinity)) and (3) oblique asymptotes (as x proceeds towards infinity (or -infinity), the curve tends towards a line y=mx+b (m is not 0 as that would be a horizontal asymptote)).
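The three cases can be illustrated numerically; as a sketch (the example functions below are our own choices, not anything canonical), evaluate simple rational functions at ever larger, or ever closer, values of x:

```python
# Numerical illustration of the three asymptote types.

def horizontal_example(x):
    # f(x) = 3 + 1/x has the horizontal asymptote y = 3 as x -> infinity.
    return 3 + 1 / x

def vertical_example(x):
    # f(x) = 1/(x - 2) has a vertical asymptote at x = 2.
    return 1 / (x - 2)

def oblique_example(x):
    # f(x) = (x**2 + 1)/x = x + 1/x tends towards the line y = x (m=1, b=0).
    return (x**2 + 1) / x

# Horizontal: the gap between f(x) and 3 shrinks towards zero.
for x in (10, 1000, 100000):
    print(x, abs(horizontal_example(x) - 3))

# Vertical: as x approaches 2, |f(x)| grows without bound.
for x in (2.1, 2.001, 2.00001):
    print(x, abs(vertical_example(x)))

# Oblique: the gap between f(x) and the line y = x shrinks towards zero.
for x in (10, 1000, 100000):
    print(x, abs(oblique_example(x) - x))
```

In each case the gap tends towards zero (or the value grows without bound) but, in exact arithmetic, never actually reaches the limit.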

The logarithmic spiral and the asymptote.

Although usually depicted on a flat plane, a curve may intersect the asymptote an infinite number of times.  A spiral in which the angle between the tangent and the radius vector is constant is a logarithmic spiral (hence the more popular names “equiangular spiral” or “growth spiral”, the latter favored by laissez faire economists).  The shape appears often in the natural environment in objects and phenomena as otherwise dissimilar as sea-shells, hurricanes and galaxies near (in cosmic terms) and far.  This diagram was posted on X (formerly known as Twitter) by Dr Cliff Pickover (@pickover) who writes the most elegant explanations which help draw the eye to the often otherwise hidden beauty of mathematics.

Zeno of Elea (Ζήνων ὁ Ἐλεάτης (circa 490–430 BC)) was a Greek philosopher of the Eleatic school, an ever-shifting aggregation of pre-Socratic thinkers based in the lands around the old colony of Ἐλέα (Elea, in the present day southern Italian region of Campania, then called Magna Graecia).  Among his surviving thoughts were nine musings (now called Zeno's paradoxes) on the nature of reality, the details of which survived only in the writings of others, which has led to some speculation that perhaps not all came originally from the quill of Zeno.  Although most of the paradoxes revolve around the notion movement is illusory (and thus effortlessly & instantly resolved by every student in their first Philosophy 101 lecture), they are all less about physics than language and mathematics, the most intriguing of them one of the underlying structures of the argument about whether “now” does or can exist, the “ultras” of one faction asserting “now cannot exist”, the other that “only now can exist”.  In that spirit, there’s much to suggest Zeno was aware of the absurdity of many of “his” paradoxes and created them as (1) tools of intellectual training for his students and (2) devices to illustrate how ridiculous can be the result if abstraction is pursued far beyond the possibilities of reality (ie not all arguments pursued to their “logical conclusion” produce a “logical” result).  One of Zeno’s paradoxes contains an explanation of why a curve might never reach a straight line, even if that line stretches to infinity: if the curve can at any time move closer to the line only by half the distance required to intersect, then the curve can only ever tend towards the line.  The two will never touch.
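The halving argument can be sketched numerically; as an illustration (the function below is ours, not Zeno's), the gap remaining after each step is half of what it was before:

```python
# Zeno's halving: each step closes half of the remaining distance,
# so after n steps the gap is (1/2)**n -- ever smaller, never zero.
def remaining_gap(initial_distance, steps):
    gap = initial_distance
    for _ in range(steps):
        gap /= 2  # move half of what is left
    return gap

for n in (1, 10, 50):
    print(n, remaining_gap(1.0, n))
```

In exact arithmetic the gap is always strictly positive, even though the total distance covered converges towards the full distance; that is the asymptotic picture in miniature.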

Christian von Wolff (circa 1740), mezzotint by Johann Jacob Haid (1704-1767).

The German philosopher Baron Christian von Wolff (1679-1754) was an author whose writings cover an extraordinary range in formal philosophy, metaphysics, ethics and mathematics and, were it not for the way in which Immanuel Kant’s (1724-1804) work has tended to be an intellectual steamroller flattening the history of German Enlightenment rationality, he would probably now be better remembered beyond the profession.

What most historians agree is that the paradoxes were written to provide some framework supporting Parmenides' (Parmenides of Elea (Παρμενίδης ὁ Ἐλεάτης (circa 515-circa 450 BC)) was a teacher of the younger Zeno) doctrine of monism (that all that exists is one and cannot be changed, separable only descriptively for purposes of explanation).  The word “monism” was coined by Christian von Wolff and first used in English in 1862; it was from the New Latin monismus, from the Ancient Greek μόνος (mónos) (alone).  Spending years contemplating things like monism may be one of the reasons why so many German philosophers went mad.  So the doctrine of monism is one of the oneness and unity of reality, despite the appearance of what seems a most diverse universe.  That “one-thingism” (one of philosophy’s great contributions to language) attracted political thinkers along the spectrum but most appealed to those who hold there must be a single source of political authority, expressed frequently as the need for the church to be subordinate to the state or vice versa, although the differences may be less apparent than defined: the systems imposed by the ayatollahs in the Islamic Republic of Iran and the Chinese Communist Party (CCP) in the People’s Republic of China are structurally more similar than divergent.  Winston Churchill (1874-1965; UK prime-minister 1940-1945 & 1951-1955) once observed that while to political scientists fascism & communism seemed polar opposites, to many living under either the difference may have been something like comparing the North & South Poles, one frozen wilderness much the same as any other.  Arctic geographers would quibble over the details of that but his point was well-understood.

Lindsay Lohan and her lawyer in court, Los Angeles, December 2011.

Because of its self-contained, internal beauty, monism has long attracted political philosophers with axes to grind.  According to Sir Isaiah Berlin (1909-1997), “value monism” holds there are discoverable, axiomatic ethical principles from which all ethical knowledge may be derived, that ethical reasoning is algorithmic and mechanical, and that it seeks permanent, “final solutions” (no historical baggage in the phrase) to all ethical conflicts.  Berlin had his agenda and that was to warn monism tends to support political despotism, rejecting Immanuel Kant’s (1724–1804) argument that “asymptotic monism” is not merely compatible with liberty and liberal toleration but actually a prerequisite for these values.  Although the phrase “Kant’s asymptotic monism” appears often, it was never in his writings and is an encapsulation used by later philosophers to describe positions identifiably Kantesque.  His own philosophy has often been called “a form of transcendental idealism” which holds that the mind plays an active role in shaping our experience of the world, an individual’s experience of things not a direct reflection of what is but a construct shaped by the categories and concepts the mind imposes on experience.  Implicit in Kant is that there is certainly one, ultimate, objective reality but experience of reality is limited and shaped by one’s cognitive capacities: because one’s experience of reality is always incomplete and imperfect, it can only ever approach a complete understanding of reality.  One’s cognitive capacities might improve but can only ever tend toward and never attain perfection.  Reality is the asymptote, one’s cognitive capacity the curve.

Sunday, February 18, 2024

Plangent

Plangent (pronounced plan-juhnt)

(1) Resounding loudly with an expressively plaintive sound (associated especially with the chiming of bells).

(2) Any loud, reverberating sound (now rare and probably obsolete).

(3) Mournful music (regardless of volume).

(4) By extension, in literature and poetry, text which is plaintive, mournful, a lament etc (now used loosely).

(5) By extension, in casual use, a state of mind somewhat short of melancholy.

(6) Beating, dashing, as in the action of breaking waves (obsolete except (rarely) as a literary or poetic device).

1822: From the Latin plangent- (stem of plangēns), the present participle of plangere (to beat (in sorrow more than anger)) and third-person plural future active indicative of plangō (I beat (my breast); I lament), from the primitive Indo-European root plak- (to strike).  The origin of the idea was in the “breast-beating”, a demonstrable form of grief noted by anthropologists in cultures far removed from European contact, so apparently something which evolved independently and possibly was inherited from a more distant ancestor species.  Plangent is an adjective, plangency is a noun and plangently is an adverb; the noun plural is plangencies.

Plangent was adopted in English to mean “a loud sound which echoes and is suggestive of a quality of mournfulness”.  It was originally most associated with the bells sounded during funerals or memorial ceremonies.  By the mid-late nineteenth century additional layers of meaning had been absorbed, notably (1) sorrowful or somber music and (2) prose or poetic verse evocative of such feelings.  So it was linguistic mission creep rather than a meaning shift that saw “plangent” become a word to use of sad songs and maudlin poetry.  In the technical sense, the original meaning still resonates; the haunting peal of a church bell can be called plangent and a poem which, as text on the page, may seem emotionless can be rendered startlingly plangent if spoken in a certain tone and with a feeling for the pause.  In the jargon of some military bands, “the plangent” remains the instruction for the use of percussion to produce the slow, continuous and atonal beat used for funeral marches or somber commemorative ceremonies and this recalls the original use in English: “beating with a loud sound”, from the Latin plangere (to strike or beat), the idea in antiquity an allusion to the “beating of the breast” associated with grief.  From this developed the general sense of “lament” which has survived and flourished.  The adjectival sense of anything “loud and resounding” is probably obsolete.

Lindsay Lohan and her lawyer in court, Los Angeles, December 2011.

Suffering ranging from mild displeasure to dark despair being clearly an inescapable part of the human condition, the synonyms of plangent are legion, the choice dictated by the precise nuance one wishes to capture, the forms including: aching, agonized, anguished, bemoaning, bewailing, bitter, deploring, doleful, dolorous, funereal, grieving, heartbroken, lamentable, longing, lugubrious, mournful, plaintive, regretful, rueful, sorrowful, sorry, wailing, weeping & woeful.  Take your pick.

Long Distance II by Tony Harrison (b 1937)

 Though my mother was already two years dead
Dad kept her slippers warming by the gas,
put hot water bottles her side of the bed
and still went to renew her transport pass.
 
You couldn't just drop in.  You had to phone.
He'd put you off an hour to give him time
to clear away her things and look alone
as though his still raw love were such a crime.
 
He couldn't risk my blight of disbelief
though sure that very soon he'd hear her key
scrape in the rusted lock and end his grief.
He knew she'd just popped out to get the tea.
 
I believe life ends with death, and that is all.
You haven't both gone shopping; just the same,
in my new black leather phone book there's your name
and the disconnected number I still call.

Shortly before he died, the poet Stephen Spender (1909–1995) wrote that Tony Harrison’s series of elegies for his parents “...was the sort of poetry for which I've been waiting my whole life.”

Saturday, February 17, 2024

Algorithm

Algorithm (pronounced al-guh-rith-um)

(1) A set of rules for solving a problem in a finite number of steps.

(2) In computing, a finite set of unambiguous instructions performed in a prescribed sequence to achieve a goal, especially a mathematical rule or procedure used to compute a desired result.

(3) In mathematics and formal logic, a recursive procedure whereby an infinite sequence of terms can be generated.

1690s: From the Middle English algorisme & augrym, from the Anglo-Norman algorisme & augrim, from the French algorithme, re-fashioned (under mistaken connection with the Greek αριθμός (arithmos) (number)) from the Old French algorisme (the Arabic numeral system) from the Medieval Latin algorismus, a (not untypical) mangled transliteration of the Arabic الخَوَارِزْمِيّ (al-Khwārizmiyy), the nisba (the part of an Arabic name consisting of a derivational adjective) of the ninth century Persian mathematician Muḥammad ibn Mūsā al-Khwārizmī, a toponymic name meaning “person from Chorasmia” (native of Khwarazm (modern Khiva in Uzbekistan)).  It was Muḥammad ibn Mūsā al-Khwārizmī’s works which introduced to the West some sophisticated mathematics (including algebra).  The earlier form in Middle English was the thirteenth century algorism from the Old French; in English it was first used in about 1230 and then by the English poet Geoffrey Chaucer (circa 1344-1400) in 1391.  English adopted the French term, but it wasn't until the late nineteenth century that algorithm began to assume its modern sense.  Before that, by 1799, the adjective algorithmic (the construct being algorithm + -ic) was in use and the first use in reference to symbolic rules or language dates from 1881.  The suffix -ic was from the Middle English -ik, from the Old French -ique, from the Latin -icus, from the primitive Indo-European -kos & -os, formed with the i-stem suffix -i- and the adjectival suffix -kos & -os.  The form existed also in the Ancient Greek as -ικός (-ikós), in Sanskrit as -इक (-ika) and the Old Church Slavonic as -ъкъ (-ŭkŭ); it is a doublet of -y.  In European languages, adding -kos to noun stems carried the meaning "characteristic of, like, typical, pertaining to" while on adjectival stems it acted emphatically; in English it's always been used to form adjectives from nouns with the meaning “of or pertaining to”.
A precise technical use exists in physical chemistry where it's used to denote certain chemical compounds in which a specified chemical element has a higher oxidation number than in the equivalent compound whose name ends in the suffix -ous (eg sulphuric acid (H₂SO₄) has more oxygen atoms per molecule than sulphurous acid (H₂SO₃)).  The noun algorism, from the Old French algorisme, was an early alternative form of algorithm; algorismic was a related form.  The meaning broadened to any method of computation and from the mid twentieth century became especially associated with computer programming, to the point where, in general use, this link is often thought exclusive.  The spelling algorism has been obsolete since the 1920s.  Algorithm, algorithmist, algorithmizability, algorithmocracy, algorithmization & algorithmics are nouns, algorithmize is a verb, algorithmic & algorithmizable are adjectives and algorithmically is an adverb; the noun plural is algorithms.

Babylonian and later algorithms

An early Babylonian algorithm in clay.

Although there is evidence multiplication algorithms existed in Egypt (circa 1700-2000 BC), a handful of Babylonian clay tablets dating from circa 1800-1600 BC are the oldest yet found and thus record the world's first known algorithms.  The calculations described on the tablets are not solutions to specific individual problems but a collection of general procedures for solving whole classes of problems.  Translators consider them best understood as an early form of instruction manual.  When translated, one tablet was found to include the still familiar “This is the procedure”, a phrase which is the essence of every algorithm.  There must have been many such tablets but the survival rate of anything from forty centuries ago not regarded as valuable is low.

So associated with computer code has the word "algorithm" become that it's likely a goodly number of those hearing it assume this was its origin and that every instance of its use relates to software.  The use in this context, while frequent, is not exclusive, though the general perception might be that it is.  It remains technically correct that almost any set of procedural instructions can be dubbed an algorithm but, given the pattern of use since the mid-twentieth century, to do so would likely mislead or confuse many who might assume they were being asked to write the source code for software.  Of course, the sudden arrival of mass-market generative AI (artificial intelligence) has meant anyone can, in conversational (though hopefully unambiguous) text, ask their tame AI bot to produce an algorithm in the syntax of the desired coding language.  That is passing an algorithm (using the structures of one language) to a machine which interprets the text and converts it to another structured language, something programmers have for decades been doing for their clients.

A much-distributed general purpose algorithm (really more of a flow-chart) which seems so universal it can be used by mechanics, programmers, lawyers, physicians, plumbers, carpet layers, concreting contractors and just about anyone whose profession is object or task-oriented.   

The AI bots have proved especially adept at such tasks.  While a question such as: "What were the immediate implications for Spain of the formation of the Holy Alliance?" produces varied results from generative AI which seem to range from the workmanlike to the inventive, when asked to produce computer code the results seem usually to be in accord with a literal interpretation of the request.  That shouldn't be unexpected; a discussion of early nineteenth century politics in the Iberian Peninsula is by its nature going to be discursive while the response to a request for code to locate instances of split infinitives in a text file is likely to vary little between AI models.  Computer languages of course impose a structure where syntax needs exactly to conform to defined parameters (even the most basic of the breed, such as that PC/MS-DOS used for batch files, was intolerant of a single missing or mis-placed character) whereas something like the instructions to make a cup of tea (which is an algorithm even if not commonly thought of as one) can vary greatly in form even though the steps and end results can be the same.
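To give a sense of why such requests produce fairly uniform results, here is a minimal sketch of a split-infinitive finder; the pattern (to + a word ending in -ly + another word) is a naive assumption of the author of this sketch, not the output of any particular AI model, and it will both over- and under-match:

```python
import re

# Naive split-infinitive pattern: "to <word ending in -ly> <word>".
# A deliberate simplification: it wrongly flags phrases such as
# "to fly south" and misses adverbs which do not end in -ly.
PATTERN = re.compile(r"\bto\s+(\w+ly)\s+(\w+)", re.IGNORECASE)

def find_split_infinitives(text):
    """Return (adverb, verb) pairs which look like split infinitives."""
    return PATTERN.findall(text)

print(find_split_infinitives("She resolved to boldly go and to act quickly."))
```

Any two competent models (or programmers) asked for this will converge on something very like the regular expression above, whereas no two essays on the Holy Alliance will match.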

An example of a "how to make a cup of tea" algorithm.  This is written for a human and thus contains many assumptions of knowledge; one written for a humanoid robot would be much longer and include steps such as "turn cold tap clockwise" and "open refrigerator door".
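The human-level recipe can also be rendered as code; the step list below is a hypothetical sketch of such an algorithm, not a transcription of the one pictured:

```python
# A hypothetical "make a cup of tea" algorithm at a human level of detail;
# a version for a humanoid robot would decompose each step much further
# (e.g. "turn cold tap clockwise", "open refrigerator door").
TEA_STEPS = [
    "Fill the kettle with water",
    "Boil the kettle",
    "Put a tea bag in the cup",
    "Pour the boiling water into the cup",
    "Let the tea steep for three minutes",
    "Remove the tea bag",
    "Add milk to taste",
]

def make_tea(steps):
    """Narrate each step in the prescribed sequence."""
    for number, step in enumerate(steps, start=1):
        print(f"Step {number}: {step}")

make_tea(TEA_STEPS)
```

The wording of the steps can vary greatly between versions while the sequence and the cup of tea at the end remain the same, which is the point.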

The so-called “rise of the algorithm” is something which has attracted much comment since social media gained critical mass; prior to that, algorithms had been used increasingly in all sorts of places but it was the particular intimacy social media engenders which meant awareness increased and perceptions changed.  The new popularity of the word encouraged the coining of derived forms, some of which were originally (at least to some degree) humorous but beneath the jocularity, many discovered the odd truth.  An algorithmocracy describes a “rule by algorithms”, a critique in political science which discusses the implications of political decisions being made by algorithms, something which in theory would make representative and responsible government not so much obsolete as unnecessary.  Elements of this have been identified in the machinery of government, such as the “Robodebt” scandal in Australia in which one or more algorithms were used to raise and pursue what were alleged to be debts incurred by recipients of government transfer payments.  Despite those in charge of the scheme and the relevant cabinet ministers being informed the algorithm was flawed and that there had been suicides among those wrongly accused, the politicians did nothing to intervene until forced by various legal actions.  While defending Robodebt, the politicians found it very handy essentially to disavow connection with the processes, which were attributed to the algorithm.

The feeds generated by Instagram, Facebook, X (formerly known as Twitter) and such are also sometimes described as algorithmocracies in that it’s the algorithm which determines what content is directed to which user.  Activists have raised concerns about the way the social media algorithms operate, creating “feedback loops” whereby feeds become increasingly narrow and one-sided in focus, acting only to reinforce opinions rather than inform.  In fairness, that wasn’t the purpose of the design, which was simply to keep the user engaged, thereby allowing the platform to harvest more of the product (the user’s attention) it sells to its customers (the advertisers).  Everything else is an unintended consequence and an industry joke was that “algorithm” was the word tech company CEOs used when they didn’t wish to admit the truth.  A general awareness of that now exists but the filter bubbles won’t be going away; what the phenomenon did produce were the words algorithmophobe (someone unhappy or resentful about the impact of algorithms in their life) and algorithmophile (which technically should mean “a devotee or admirer of algorithms” but is usually applied in the sense of “someone indifferent to or uninterested in the operations of algorithms”), the latter represented by the great mass of consumers digitally bludgeoned into a state of acquiescent insensibility.

Some of the products are fighting back: The Algorithm: How AI Decides Who Gets Hired, Monitored, Promoted, and Fired and Why We Need to Fight Back Now (2024) by Hilke Schellmann, pp 336, Hachette Books (ISBN-13: 978-1805260981).

Among nerds, there are also fine distinctions.  There is the subalgorithm (“sub-algorithm” seems not a thing), a (potentially stand-alone) algorithm within a larger one, a concept familiar in many programming languages as the “sub-routine”, although distinct from a remote procedure call (RPC), which is a subroutine executed in a different address space.  The polyalgorithm (again, hyphens just not cool) is a set of two or more algorithms (or subalgorithms) with the instructions for choosing between them in some way integrated.  A very nerdy dispute does exist within mathematics and computer science about whether an algorithm, at the definitional level, really does need to be restricted to a finite number of steps.  The argument can eventually extend to the very possibility of infinity (or types of infinity, according to some) so it really is the preserve of nerds.  In real-world application, a program is an algorithm only if (even eventually) it stops; it need not have a middle but must have a beginning and an end.
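As a sketch of the distinction, Euclid's gcd algorithm can serve as a subalgorithm inside a larger fraction-reducing algorithm (the function names here are illustrative, not drawn from any library):

```python
def gcd(a, b):
    # Euclid's algorithm: a stand-alone algorithm in its own right.
    while b:
        a, b = b, a % b
    return a

def reduce_fraction(numerator, denominator):
    # The larger algorithm invokes gcd as a subalgorithm (a subroutine
    # in the same address space, unlike a remote procedure call).
    divisor = gcd(numerator, denominator)
    return numerator // divisor, denominator // divisor
```

Both halt on any valid input, so each qualifies as an algorithm by the real-world definition: a beginning, an end and (even eventually) a stop.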

There is also the mysterious pseudoalgorithm, something less suspicious than it may first appear.  Pseudoalgorithms exist usually for didactic purposes and will usually interpolate (sometimes large) fragments of a real algorithm but it may be in a syntax which is not specific to a particular (or any) programming language, the purpose being illustrative and explanatory.  Intended to be read by humans rather than a machine, all a pseudoalgorithm has to achieve is clarity in imparting information, the algorithmic component there only to illustrate something conceptual rather than be literally executable.  The pseudoalgorithm model is common in universities and textbooks and can be simplified because millions of years of evolution mean humans can do their own error correction on the fly.
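A typical textbook pseudoalgorithm might look like this sketch of binary search: readable by a human, bound to no language's syntax.

```
procedure BINARY-SEARCH(A, target)
    low ← 1
    high ← length(A)
    while low ≤ high do
        mid ← floor((low + high) / 2)
        if A[mid] = target then return mid
        else if A[mid] < target then low ← mid + 1
        else high ← mid − 1
    return NOT-FOUND
```

A human reader supplies the missing details (one-based indexing, integer arithmetic) without complaint; a compiler would not.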

Of the algorithmic

The Netflix algorithm in action: Lindsay Lohan (with body-double) during filming of Irish Wish (2024).  The car is a Triumph TR4 (1961-1967), one of the early versions with a live rear axle, a detail probably of no significance in the plot-line.

The adjective algorithmic has also emerged as an encapsulated criticism, applied to everything from restaurant menus and coffee shop décor to choices of typefaces and background music.  An entire ecosystem (Instagram et al) has been suggested as the reason for this multi-culture standardization in which a certain “look, sound or feel” becomes “commoditised by acclamation” as the “standard model” of whatever is being discussed.  That critique has by some been dismissed as something reflective of the exclusivity of the pattern of consumption by those who form theories about what seem not very important matters; it’s just they only go to the best coffee shops in the nicest parts of town.  In popular culture though, the effect of the algorithmic is widespread, entrenched and well-understood and already the AI bots are using algorithms to write music which will be popular, needing (for now) only human performers.  Some algorithms have become well-known, such as the “Netflix algorithm” which presumably doesn’t exist as a conventional algorithm might but is understood as the set of conventions, plotlines, casts and themes which producers know will have the greatest appeal to the platform.  The idea is nothing new; for decades hopeful authors who sent manuscripts to Mills & Boon would receive one of the more gentle rejection slips, telling them their work was very good but “not a Mills & Boon book”.  To help, the letter would include a brochure which was essentially a “how to write a Mills & Boon book” guide and it included a summary of the acceptable plot lines, of which there were at one point reputedly some two dozen.  The “Netflix algorithm” was referenced when Falling for Christmas, the first fruits of Lindsay Lohan’s three-film deal with the platform, was released in 2022.
It was an example of a blending of several genres (redemption, Christmas movie, happy ending etc) and the upcoming second film (Irish Wish) is of the “…always a bridesmaid, never a bride — unless, of course, your best friend gets engaged to the love of your life, you make a spontaneous wish for true love, and then magically wake up as the bride-to-be.” school; plenty of familiar elements there so it’ll be interesting to see if the algorithm was well-tuned.

Math of the elliptic curve: the Cox–Zucker machine can help.

Some algorithms have become famous and others can be said even to have attained a degree of infamy, notably those used by the search engines, social media platforms and such, the Google and TikTok algorithms much debated by those concerned by their consequences.  There is though an algorithm remembered as a footnote in the history of linguistic oddities and that is the Cox–Zucker machine, published in 1979 by Dr David Cox (b 1948) and Dr Steven Zucker (1949–2019).  The Cox–Zucker machine (which may be called the CZM in polite company) is used in arithmetic geometry and provides a solution to one of the many arcane questions which only those in the field understand but the title of the paper in which it first appeared (Intersection numbers of sections of elliptic surfaces) gives something of a hint.  Apparently it wasn’t formally dubbed the Cox–Zucker machine until 1984 but, impressed by the phonetic possibilities, the pair had been planning joint publication of something as long ago as 1970 and undergraduate humor can’t be blamed because they met as graduate students at Princeton University.  The convention in academic publishing is for authors’ surnames to appear in alphabetical order and the temptation proved irresistible.