Sunday, May 5, 2024

Facade

Facade (pronounced fuh-sahd or fa-sahd)

(1) In architecture, the front of a building, especially an imposing or decorative one; any side of a building facing a public way or space and finished accordingly.

(2) By extension, the face or front (most visible side) of any other thing.

(3) Figuratively, a superficial appearance or illusion of something; a deceptive or insincere outward appearance; a front.

(4) In computing (object-oriented programming), a structural design pattern that provides an object that is a simplified interface to a larger body of code, such as a class library.
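For those who like to see the computing sense illustrated, below is a minimal Python sketch of the facade pattern; the subsystem classes and method names are invented purely for illustration, the point being that one simple facade call hides the fiddly sequence of subsystem calls a client would otherwise need to know:

```python
# A minimal sketch of the facade pattern; the subsystem classes
# (Amplifier, Projector, Screen) and their methods are hypothetical.
class Amplifier:
    def on(self): print("amplifier on")
    def set_volume(self, level): print(f"volume set to {level}")

class Projector:
    def on(self): print("projector on")
    def widescreen(self): print("aspect ratio set to 16:9")

class Screen:
    def lower(self): print("screen lowered")

class HomeTheatreFacade:
    """The facade: one simple interface in front of a larger body of code."""
    def __init__(self):
        self.amp, self.projector, self.screen = Amplifier(), Projector(), Screen()

    def watch_movie(self):
        # The client needn't know the correct order of operations;
        # the facade encapsulates it.
        self.screen.lower()
        self.projector.on()
        self.projector.widescreen()
        self.amp.on()
        self.amp.set_volume(5)

HomeTheatreFacade().watch_movie()  # one call in place of five
```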

1650–1660: From the sixteenth century French façade, from the Italian facciata (the front of a building), from the Upper Italian faciada & facciata, derivations of faccia (front; face), from the Vulgar Latin facia, from the Classical Latin facies (face).  The French façade appears still to be an accepted alternative spelling and one popular with architects.  The figurative use dates from 1845.  Facade, facadectomy and facadism are nouns and facadal is an adjective; the noun plural is facades.

Neoclassicism in eighteenth century architecture

Neoclassicism ("new classicism", from the Latin classicus (highest; best)) refers to an eighteenth and early-nineteenth century movement which took inspiration from Greece between the eighth and fourth centuries BC and Rome between the fifth and first.  The revival was expressed in art, literature, philosophy, history, music and, most memorably, architecture.  Neoclassicism could not exactly replicate the works of antiquity.  Because only fragments remained, either as ruins or (perhaps not entirely reliable) depictions of what was built by the earlier civilizations, the world imagined was a construct, an idea of what once was.  This meant neoclassicism often tended to adopt the few, most dramatic motifs which survived either in the representational record or as crumbling remains, a movement, at least initially, driven more by idealism than reality.  It was another example of the West's enduring reverence for the ancient world and its supposed sublimity of form, neoclassicism serving what was perhaps the myth of classical perfection.

Facade of Chiswick House (1729), London, a villa built and designed by Richard Boyle (third Earl of Burlington (1694-1753)) in the Neo-Palladian style (a European school which followed the work of the Venetian architect Andrea Palladio (1508–1580)).

Classical revivalism actually began during the Renaissance, some four centuries earlier, renaissance (rebirth) translating historically as the rebirth in Europe of the spirit of Ancient Greece and Rome.  However, eighteenth century Europe saw both the Age of Enlightenment, which transformed philosophy, politics, society & culture, and the new science of archaeological excavation, which led to astonishing discoveries such as the largely intact Roman towns of Pompeii and Herculaneum, buried by a volcanic eruption in 79 AD.  From these ruins, it was found the societies of antiquity were technologically more sophisticated and their art more extraordinary than once thought; the discoveries sparked renewed interest and, as part of the research effort, a less idealized view of the past evolved.

A McMansion which features at least one element an architect would describe as a facade and (possibly) a sub-facade.  However, being a McMansion, it may appear on the plan as a "Demi-Façade".

With the rise of Romanticism in the nineteenth century, neoclassicism fell from fashion but it didn't die and elements continue to be interpolated into modern architecture, although the borrowing of aspects usually rendered on a large scale and applying them to relatively small-scale suburban houses is unfortunate, though many critics think it no worse than the mash-up of motifs, neoclassical and otherwise, which appear on McMansions.  The noun facadism is a technical term from architecture and structural engineering which describes (1) the practice of retaining only the facade of an old building when redeveloping a site (something often required by heritage legislation) and (2) in building construction, a technique where the facade is designed to be constructed independently of the rest of the building (a modular approach which not only can mean a cheaper build but also one which permits a much easier "updating" or "refreshing" of the appearance, conceptually (though not structurally) analogous with a surgical facelift).  The noun facadectomy probably began as humorous in-house slang among architects and builders but is now accepted as a standard word, something doubtlessly assisted by the frequency with which facadectomies are latterly performed.  On the model of appendectomy (the surgical procedure for the removal of the vermiform appendix) etc, it describes either (1) the removal (and sometimes replacement) of a building's facade or (2) the retention of a building's facade behind which a new structure is built after the original building is demolished.

Lindsay Lohan in front of the Louvre, Paris, March, 2015.

Architects point out that regardless of the way dictionaries tend to describe the things, a facade is not simply "the front or outward facing surface of a structure" and that to be defined as such it must in some way be separate from the structure, in that were it to be removed, the building would still stand.  That might deprive the building of its aesthetic appeal and even let in the wind and rain but it would remain standing.  In that, the architects' precision is closely aligned also with the way the word is used figuratively of people or institutions.  Depending on how one deconstructs, the Louvre has a facade, a multi-piece facade or a number of facades whereas the Louvre Pyramid has none at all; it is a unitary structure.  Like the Eiffel Tower, which in the years immediately after its erection was much derided (especially by Parisians), the pyramid was much criticized but opinion seems to be softening; either the disapproving are dying off or they're coming at least to tolerate it.

Saturday, May 4, 2024

ACL

ACL (pronounced eh-see-elle)

The abbreviation for anterior cruciate ligament, one of a pair of cruciate ligaments in the human knee.

1887 (in Italian); early 20th century (in English): The construct was anterior + cruciate + ligament. Anterior was from the Latin anterior (that is before, foremost).  Cruciate was from the Latin cruciatus, the perfect passive participle of cruciō, from crux (cross).  Ligament was from the Middle English ligament, from the Latin ligāmentum, from ligō (tie, bind).  The vital but unexciting body part sounds much better if spoken in other European languages including Portuguese (ligamento cruzado anterior), Spanish (ligamento cruzado anterior), Catalan (lligament encreuat anterior), French (ligament croisé antérieur) and especially Italian (legamento crociato anteriore).  Anterior cruciate ligament is a noun; the noun plural is anterior cruciate ligaments.

In the world of acronyms and abbreviations, there are literally dozens of other ACLs, including: the American Classical League, which promotes the study of Antiquity and the classics; the Association for Computational Linguistics, a professional organization for those working on natural language processing; the Australian Christian Lobby, a right-wing Christian pressure group which disapproves of the last three centuries-odd; the Access Control List, an element in computer security; ACL2, a modular software package noted for its theorem prover; as code ACL, the Akar-Bale language, an extinct Great Andamanese language (ISO (International Standard) 639-3); allowable combat load, in military aviation the inventory of weapons systems for which an airframe is rated; and the wonderful anthroponotic cutaneous leishmaniasis, a form of Old World cutaneous leishmaniasis, usually with a prolonged incubation period and confined to urban areas.
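Of those, the computer security sense is the easiest to illustrate; below is a minimal Python sketch (the resources, users and permissions are invented for illustration) of an access control list as a mapping of resources to the operations each user is permitted:

```python
# A minimal sketch of an access control list (ACL) in the computer
# security sense; resources, users and permissions are hypothetical.
acl = {
    "/etc/passwd": {"root": {"read", "write"}, "alice": {"read"}},
    "/home/alice": {"alice": {"read", "write"}},
}

def is_allowed(user: str, resource: str, operation: str) -> bool:
    """Grant access only if the ACL lists the operation for that user."""
    return operation in acl.get(resource, {}).get(user, set())

print(is_allowed("alice", "/etc/passwd", "read"))   # True
print(is_allowed("alice", "/etc/passwd", "write"))  # False (deny by default)
```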

The long and painful history of the anterior cruciate ligament

Ligaments of the right knee.

Descriptions of the anterior cruciate ligament (ACL) appear in some surviving medical texts from Antiquity, the earliest known reference thought to be in the drawings of the physician Galen (Claudius Galenus or Aelius Galenus; 129-216) although he made no mention of injuries associated with this body part, the aspect for which it's now best known, although there is evidence of corrective surgery being undertaken in Ancient Egypt.  Presumably, during the many centuries when falling from horses was far from uncommon, such injuries were frequent but because neither surgical correction nor sophisticated rehabilitation regimes had evolved, victims had to suffer or perhaps retire from more rigorous pursuits.  The Irish surgeon Robert Adams (1791-1875) in 1837 noted a clinical case of an ACL tear but in an age when treatments rightly were conservative because the risk of death from any surgical intervention was high, Dr Adams' report was purely observational.  The literature was augmented in 1850 by the Scottish GP (family doctor) James Stark (1811-1890) who published two cases of cruciate tears, describing the different manifestations of knee instability in patients with damaged ACLs, but the first record of ACL repair was an operation performed in 1895 by the English surgeon Sir Arthur Mayo-Robson (1853-1933).  The early approach was the use of primary open sutures but while this produced good initial results, deterioration was rapid.  No substantive improvements in method were reported so the suturing approach was abandoned and the profession turned to reconstruction.

Lindsay Lohan's knees.

The Russian-born surgeon Ivan Grekov (1867-1934) is credited with having in 1914 been the first to adopt the use of autologous tissue (cells or tissues obtained from the same individual) for ACL rupture reconstruction, the technique also documented by the English professor of orthopaedic surgery Ernest Hey Groves (1872-1944), who performed a number of procedures between 1917-1920.  The Hey Groves approach is strikingly modern and essentially the technique used today but the efficacy clearly wasn't understood because what followed in the subsequent decades was what the historians describe as "…a period of startling ingenuity which created an amazing variety of different surgical procedures often based more on surgical fashion and the absence of a satisfactory alternative than any indication that continued refinements were leading to improved results.  It is hence not surprising that real inventors were forgotten, good ideas discarded and untried surgical methods adopted with uncritical enthusiasm only to be set aside without further explanation."  That to some extent may explain why ACL reconstructions became rare and it wasn't until the 1970s, when the implications of broadcasting allowed professional sport to become a multi-billion dollar industry and sports medicine became a mainstream medical discipline, that the operation became common; it was certainly a common injury.  Still, innovation continued: there was experimentation with xenografts (a tissue graft taken from a species different from that of the recipient) & allografts (a tissue graft between genetically different individuals of the same species) before the autologous approach prevailed.  Even synthetic graft materials enjoyed some popularity in the 1980s and 1990s, apparently because in laboratory testing artificial ligaments appeared to be more durable and better able to withstand stresses and strains; real-world experience proved otherwise.

Torn ACL: Exactly what it says.

The increasing participation of women in elite-level (often professional) sports such as the various football codes and basketball has in recent years seen a striking rise in ACL injuries.  While the reported volume of incidents is still less than those suffered in gymnastics, long the most common source, it's in these team sports where the rate of increase has been greatest.  Although the male & female knee look much the same, physiological differences exist and, given there are differences between almost every human cell which is in some way specifically male or female, that shouldn't be surprising.  Anatomists note certain structural divergences such as those in the alignment of the leg & pelvis and the muscular protection of the knee joint, added to which the hormone estrogen is known to influence all ligaments, but probably of greater consequence are the variations in neuromuscular behavior which human movement studies have documented.  Essentially, these focus on the different positions of the knee and the upper body (compared to the typical male) and a striking predilection when landing to apportion most weight to one rather than both feet.  Theories have been offered to account for this but the most obvious consequence is that the forces generated by landing are less absorbed by the foot and lower leg muscles (analogous with the "crumple zones" in modern automobiles), meaning a higher proportion of the stress impacts upon the ACL of the "landing knee".  Added to this, because the typical female tends to land with the upper body tilted to the side of the "landing knee", this imposes a greater rotational force on the ACL at the same time a vertical impact is being absorbed.  This crucial aspect of behavior is known as the "ankle-dominant strategy".

Some novel research however emerged in 2024 and it may be a candidate for one of the ten Ig Nobel Prizes, awarded annually by the Annals of Improbable Research (AIR) to acknowledge "unusual or trivial achievements in scientific research".  What the study hypothesized was there might be a link between sports bras and ACL injuries.  Impressionistically, the connection is not immediately obvious and what the researchers found was not, as might be imagined, simply a product of weight distribution and the effective "multiplier effect" of mass in movement the further it is from the pivot point, illustrated by the recommendations provided for placing weight in trailers being towed.  The physics of both are presumably vaguely similar but the interplay of factors relating to women's ACL injuries seems to be more complex.

Lindsay Lohan in "low-impact" sports bra.

It transpires the multiplier effect of the upper-body mass wasn’t the issue.  What the international team of experts in biomechanics and sports medicine did was study 35 female recreational athletes, finding that the more supportive were the sports bras (the so-called “high-impact” designs), the greater the decrease in the common risk factors associated with ACL injuries, the “knee flexion angles” reducing, meaning the knee didn't have to bend as much on landing.  Additionally, there was a reduction in “dynamic knee valgus”, the knee moving inwards from the foot, something of mechanical significance because females tend to have more inward collapsing knees (increased dynamic knee valgus) during landing activities.  Dynamically, what the study revealed was that when there was no or only minimal breast support, the greater was the tendency to adopt the “ankle-dominant strategy” which had the effect of transferring the stress to the knee and thus the ACL.

By contrast, when wearing a high-impact sports bra, females used a more "hip-dominant" strategy which puts less strain on the ACL.  The mechanics of the "hip-dominant" approach is that the trunk moves less, making pelvic control easier, the "…movement patterns at the trunk, pelvis and lower extremities… all connected."  The study was published in the Journal of Applied Biomechanics and the study cohort of 35 included women with bra cup sizes between B & D, the findings suggesting the larger the cup size, the higher the risk of traumatic knee injury although, perhaps counter-intuitively, the researchers weren't prepared to "…definitively say breast size drives injury risk…" because (1) there was a small number of participants in the study and (2) there are "…so many differences in movement patterns from person to person."  In the spirit of good research, one reviewer noted the study "…scratches the surface…" of an area that needs further investigation.

Human movement studies have a long history but the bulk of the research has been on men and the presence of breasts is the most obvious difference in the bio-mechanics of movement and something which might yet have implications not yet understood.  The physics of it is breasts move up and down and side-to-side during exercise and in a sixty minute session of running, they can bounce in a figure-eight pattern some 10,000 times.  Traditionally, for the sports bra manufacturers the focus in advertizing has been on comfort but there are also performance effects which at the elite level can be vital because the difference between success and failure can be measured in thousandths of a second and fractions of an inch.  The appropriate bra can actually reduce oxygen consumption when running which translates into running economy (the distance traveled per volume of oxygen consumed) and the oxygen can be better used by the brain and muscles; also, if breast movement is minimized, the strides become longer, another aspect of economy.  Those matters were known but the apparent explanation of the wrong choice of sports bra being a factor in the higher incidence of ACL injuries in women is something new.

Friday, May 3, 2024

Sommelier

Sommelier (pronounced suhm-uhl-yey or saw-muh-lyey (French))

A waiter, in a club, restaurant or similar establishment, who is in charge of wines; sometimes known as the wine steward.

1889: Details of the etymology are contested at the margins.  All agree it's a dissimilated form of the Middle French sommerier (a butler), from the thirteenth century sommier (a military officer who had charge of provisions, a position which evolved into an aspect of the modern role of quartermaster (hence the expression used to describe staff in these roles as being "on the 'Q' side")).  One version traces this from the twelfth century somme (pack), from the Vulgar Latin salma, a corruption of the Late Latin sagma (a pack-saddle (and later "the pack on the saddle")).  The alternative suggestion is it was from the Old Provençal saumalier (pack-animal driver), again from the Late Latin sagma, the origin of which was the Ancient Greek ságma (covering, pack saddle).  Polish, Portuguese, Spanish & Swedish all use an unadapted borrowing of the French sommelier.  Sommelier is a noun & verb and sommeliering & sommeliered are verbs; the noun plural is sommeliers.

Fifty-odd years of the Court of Master Sommeliers

Although they call themselves "cork dorks", at the most elite level a sommelier can belong to a most exclusive club.  The Court of Master Sommeliers was established in 1977, formalizing the layers of qualification which began in 1969 in London with the first Master Sommelier examination, conducted now by the court's various chapters around the world and, globally, they're a rare few.  While over 600 people have been to space and there are rumored to be some 4000 members of the Secret Society of the Les Clefs d'Or, there are currently only 262 Master Sommeliers in the world.

In training; practice makes perfect.

The Certified Sommelier Examination (CSE) exists as a three-part concept: (1) a focus on a candidate's ability to demonstrate proficiency in deductive tasting, (2) the technical aspects of wine production & distribution and (3) the practical skills and techniques of salesmanship required for those working as sommeliers in restaurants and other establishments.  It's thus a vocational qualification for those who wish to pursue a career in hospitality, either in the specialized field of beverage services or as a prelude to moving into management.  Like the Secret Society of the Les Clefs d'Or, upon graduation as a Certified Sommelier, a candidate becomes entitled to a certificate, title and a lapel pin (in the singular and worn to the left, rather than the pair of crossed keys issued by the Les Clefs d'Or).

It's a structured process.  As a prerequisite, candidates must have completed the classes and passed the Introductory Sommelier Examination (ISE) although it's no longer required of students that the CSE be completed within three years of the ISE, candidates now encouraged to proceed to the next level when "best prepared".  The court suggests a minimum of three years industry experience is desirable because the content of both the ISE & CSE is predicated on the assumption those sitting will have this background and it further advises it's best to work in the field for at least twelve months between the two.  The CSE is a one-day examination in three parts and the minimum passing grade is 60% in each (all within the one sitting):

(1) A tasting examination using the court's Deductive Tasting Method (DTM), during which candidates must with a high degree of accuracy & clarity describe and identify four wines (two white and two red).  The format of this is a written four-section essay which must be completed within 45 minutes and the DTM exists in a structured format which candidates must learn prior to the exam.

The Court of Master Sommelier's Deductive Tasting Method.

(2) A theory examination which is designed to test candidates' knowledge and understanding of wine, beverage, and the sommelier trade.  The test consists of multiple choice, short answer, some non-abstract math (ie the sort of arithmetic relevant to the profession) and matching questions. Candidates must complete the 45-question examination within 38 minutes.

Lindsay Lohan demonstrating early sommelier skills in The Parent Trap (1998).  She decided to focus on acting, pursuing wine-tasting only as a hobby.

(3) A Service Examination: The service examination is a practical-level experience, conducted in a setting which emulates a "real" restaurant environment.  It's designed to allow candidates to demonstrate salesmanship, knowledge and appropriate conversational skills, all while performing the tableside tasks associated with the job.  The test typically includes opening still or sparkling wines in the correct manner and is not limited purely to what's in the bottle, students expected to be able to recommend cocktails, spirits or other drinks and discuss the interplay of food and wine; what goes best with what.  As befits a practical exam, candidates must dress and deport themselves exactly as they would if employed as a restaurant sommelier.

Wednesday, May 1, 2024

Privity

Privity (pronounced priv-i-tee)

(1) Private or secret knowledge.

(2) Participation in the knowledge of something private or secret, especially as implying concurrence or consent.

(3) Privacy or secrecy (obsolete).

(4) In medieval theology, a divine mystery; something known only to God, or revealed only in the Holy Scriptures (obsolete).

(5) The genitals (archaic, and only in the plural).

(6) In law, a relationship between parties seen as being a result of their mutual interest or participation in a given transaction, usually in contract.

(7) The fact of being privy to something; knowledge, compliance (now rare).

1175–1225: From the Anglo-Norman priveté & privitee and the Middle English privete & private, from the Old French priveté, privité & priveté (privacy; a secret, private matter), the construct being privé (from the Late Latin privus (set apart, belonging to oneself)) + -té (from the Middle French -té, from the Old French -té, from the Latin -itātem or -tātem, accusative singular of -tās, ultimately from the primitive Indo-European -tehts; the suffix was used to form nouns, often denoting a quality or a property).  The ultimate source was the Classical Latin privātus (perfect passive participle of prīvō (I bereave, deprive; I free, release)).  Privity is a noun; the noun plural is privities.

Between the twelfth & sixteenth centuries a privity was "a divine mystery; something known only to God, or revealed only in the Holy Scriptures" and by the late 1200s this meaning had leaked into a general sense of "privacy; secrecy", used between the fourteenth & seventeenth centuries to refer to "a private matter, a secret".  The use to describe the genitals (presumably influenced in some way by "private parts" or "the private") as "the privities" is attested from the late fourteenth century and didn't wholly fade from use until the early nineteenth, although use had by then long declined to a northern English, Irish & Scottish regionalism.  The word was used from the 1520s as a technical term in the laws regulating feudal land tenure and other fields of law picked it up in the general sense of "a relationship between parties seen as being a result of their mutual interest or participation in a given transaction"; it was in contract law this would assume its important meaning as "privity of contract", describing the special status of the parties to a contract (as legally defined), something which would for centuries be of critical importance and which remains in use today.  Less precise was the sixteenth century sense of "the fact of being privy to something; knowledge, compliance" and while there are better ways of saying it, such use is not yet extinct.

Privity of contract, Donoghue v Stevenson and the snail.

The classic case (drummed for almost a century into law students) in the demolition of the sense of the absolute in privity of contract was Donoghue v Stevenson ([1932] A.C. 562, [1932] UKHL 100, 1932 S.C. (H.L.) 31, 1932 S.L.T. 317, [1932] W.N. 139), finally decided before the House of Lords.  It was the case which more than any other established the foundation of the doctrine of product liability, refined the concept of negligence (transforming tort law) and remains a core part of the framework for the principles of "duty of care", which it substantially expanded.

The extraordinary case began with events which transpired in the modest settings of the Wellmeadow Café in Paisley, Scotland, Mrs Donoghue's friend on 26 August 1928 buying her a ginger beer, served in a bottle made from a dark, opaque glass.  After she'd consumed about half, the remainder was poured into a tumbler at which point the partially decomposed remains of a snail floated out, inducing an alleged shock and severe gastro-enteritis.  Because Mrs Donoghue was not a party to the contractual purchase of the ginger beer, she was unable to claim through breach of warranty of a contract: she was not party to any contract because, at law, she received the drink as a gift.  Accordingly, she issued proceedings against Stevenson (the manufacturer) and, after some four years in the lower courts, the matter ended up before the House of Lords, then the UK's highest appellate court.

All were aware it was an important case.  The lower courts, bound by precedent, had been compelled to find the absence of privity of contract doomed the suit but the issue of product liability in the modern era of consumers interacting usually not directly with the producer of goods but their agents or retailers had for some time been discussed as an area of law in which reform was required.  What the Law Lords had to decide was whether the manufacturer owed Mrs Donoghue a duty of care in the absence of contractual relations contrary to established case law.  The important point was not if she was owed compensation for damages suffered but if a cause of action existed.

Previously, as a general principle, manufacturers owed no duty of care to consumers except if (1) the product was inherently dangerous and no warning of this state was provided or (2) the manufacturer was aware that the product was dangerous because of a defect and this had been concealed from the consumer.  The Lords found for Mrs Donoghue although in a cautious judgement which could be read as offering little scope for others except in the specific matter of ginger beer in opaque bottles containing the decomposed remains of a dead snail when sold to a Scottish widow.  However, the mood for reform was in the legal air and the judgment established (1) negligence is distinct and separate in tort, (2) there need not be privity of contract for a duty of care to be established and (3) manufacturers owe a duty to the consumers who they intend to use their products.

In the leading judgment, Lord Atkin (James Richard Atkin, 1867–1944; lord of appeal in ordinary 1928-1944) wrote, inter alia, what was at that time the widest definition of the "neighbour principle": "The rule that you are to love your neighbour becomes in law, you must not injure your neighbour; and the lawyer's question, Who is my neighbour? receives a restricted reply.  You must take reasonable care to avoid acts or omissions which you can reasonably foresee would be likely to injure your neighbour.  Who, then, in law is my neighbour? The answer seems to be – persons who are so closely and directly affected by my act that I ought reasonably to have them in contemplation as being so affected when I am directing my mind to the acts or omissions which are called in question."  On this basis, if no other, the Lords held Mrs Donoghue's action had succeeded and she had a cause of action in law, the culmination of a growing appreciation by the courts that the law needed to evolve to reflect the patterns of modern commerce.  Some years before Donoghue v Stevenson had been decided, another judge had observed "it would appear to be reasonable and equitable to hold that, in the circumstances and apart altogether from contract, there exists a relationship of duty as between the maker and the consumer".

Once, if someone bought two bottles of ginger beer and gave one to a friend, were both to be injured by decomposing snails within, only the consumer who handed over the cash could have recovered damages because they alone enjoyed a privity of contract.  Since Donoghue v Stevenson, both can in court seek remedy in tort on the basis of “product liability”, a manufacturer’s duty of care held to extend to all consumers of their products.

Being the common law, what was effectively a new doctrine (and one, as the term "neighbour principle" suggests, rooted in Christian morality) was also a general principle and thus a foundation on which the building blocks of subsequent judgments would sit; it could not be treated, in the words of Lord Reid (James Scott Cumberland Reid, 1890–1975; lord of appeal in ordinary 1948-1975): "as if it were a statutory definition. It will require qualification in new circumstances."  The courts in the years after 1932 had ample opportunity to refine things and this included the development of the modern tests in tort for the "foreseeability of damage" and "proximity", to which was later appended the surprisingly recent "fairness", something which came to be regarded as within the rubric of public policy, all able to work in conjunction and, as one judge noted, the distinctions between them were "somewhat porous but they are probably none the worse for that".  From Donoghue v Stevenson has evolved the modern notion of product liability and it would now to many seem strange there was in living memory a time when a manufacturer could escape liability for selling defective goods simply on the basis the injured party wasn't the purchaser.  One curious quirk of Donoghue v Stevenson remains that the facts were never tested so it will never be known if the most important character in the case (the decomposing snail) ever existed.

Tuesday, April 30, 2024

Cybernetic

Cybernetic (pronounced sahy-ber-net-ik)

(1) Of or relating to cybernetics (the theoretical study of communication and control processes in biological, mechanical, and electronic systems, especially the comparison of these processes in biological and artificial systems).

(2) Of or relating to computers and the Internet (largely archaic (ie "so 1990s")).

1948 (in English): From the Ancient Greek κυβερνητικός (kubernētikós) (good at steering, a good pilot (of a vessel)), from κυβερνητική τέχνη (kubernētikḗ tékhnē) (the pilot's art), from κυβερνισμός (kubernismós) or κυβέρνησις (kubérnēsis) (steering, pilotage, guiding), from κυβερνάω (kubernáō) (to steer, to drive, to guide, to act as a pilot (and the ultimate source of the Modern English "govern")).  Cybernetic & cybernetical are adjectives, cybernetics, cyberneticist & cybernetician are nouns and cybernetically is an adverb; the noun cybernetics is sometimes used as a plural but functions usually as a singular (used with a singular verb).

Although it's undocumented, etymologists suspect the first known instance of use in English in 1948 may have been based on the 1830s French cybernétique (the art of governing); that was in a paper by US mathematician and philosopher Norbert Wiener (1894-1964) who was influenced by the cognate term "governor" (the name of an early control device analyzed by Scottish physicist James Clerk Maxwell (1831–1879)), familiar in mechanical devices as a means of limiting (ie "governing") a machine's speed (either to a preferred rate or a determined maximum).  That was obviously somewhat different from the source in the original Greek kubernētēs (steersman), from kubernan (to steer, control), but the idea in both was linked by the notion of "control".  The French word cybernétique had been suggested by French physicist and mathematician André-Marie Ampère (1775-1836) (one of the founders of the science of electromagnetism and after whom is named the SI (International System of Units) unit of measurement of electric current, the ampere (amp)) to describe the then non-existent study of the control of governments; it never caught on.  From cybernetics came the now ubiquitous back-formation cyber which has coined, and continues to coin, words, sometimes with some intellectual connection to the original, sometimes not: cybercafé, cybercurrency, cybergirlfriend, cybermania, cybertopia, cyberculture, cyberhack, cybermob, cybernate, cybernation, cyberpet, cyberphobia, cyberpunk, cybersecurity, cybersex, cyberspace, cyberfashion, cybergoth, cyberemo, cyberdelic etc.

Feedback

MIT Professor Norbert Wiener was an American mathematician and philosopher and one of the early thinkers developing the theory that the behaviour of all intelligent species was the result of feedback mechanisms that perhaps could be simulated by machines.  Now best remembered for the word cybernetics, his work remains among the foundations of artificial intelligence (AI).

The feedback loop at its most simple.

Cybernetics was an outgrowth of control theory, at the time something of a backwater in applied mathematics relevant to the control of physical processes and systems.  Although control theory had connections with classical studies in mathematics such as the calculus of variations and differential equations, it became a recognised field only in the late 1950s when the newly available power of big machine computers and databases was applied to problems in economics and engineering.  The results indicated the matters being studied manifested as variants of problems in differential equations and in the calculus of variations.  As the computer models improved, it was recognised the theoretical and industrial problems all had the same mathematical structure and control theory emerged.  The technological determinism induced by computing wasn't new; the embryonic field had greatly been advanced by the machines of the eighteenth and nineteenth centuries.

Cybernetics can be represented as a simple model which is of most use when applied to complex systems.  Essentially, it’s a model in which a monitor compares what is happening with what should be happening, this feedback passed to a controller which accordingly adjusts the system’s behavior.  Wiener defined cybernetics as “the science of control and communications in the animal and machine”, something quite audacious at the time, aligning as it did the working of machines with animal and human physiology, particularly the intricacies of the nervous system and the implication the controller was the human brain and the monitor, vision from the eyes.  While the inherently mechanistic nature of the theory attracted critics, the utility was demonstrated by some success in the work of constructing artificial limbs that could be connected to signals from the brain.  The early theories underpinned much of the early work in artificial intelligence (AI).
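The monitor-controller loop lends itself to a few lines of code; below is a minimal Python sketch of a proportional feedback controller in the spirit of Wiener's definition (the thermostat setpoint, starting temperature, gain and heat-loss figures are all invented for illustration):

```python
# A minimal sketch of the cybernetic feedback loop: the monitor compares
# what is happening with what should be happening and the controller
# adjusts the system's behavior accordingly.  All numbers are invented.
setpoint = 21.0     # what should be happening (target, degrees C)
temperature = 15.0  # what is happening (measured state)
gain = 0.5          # how aggressively the controller corrects

for step in range(10):
    error = setpoint - temperature   # monitor: compare "is" with "ought"
    heating = gain * error           # controller: compute the adjustment
    temperature += heating - 0.1     # system responds (0.1 = ambient heat loss)
    print(f"step {step}: temperature {temperature:.2f}")
```

Each pass through the loop is one cycle of feedback: the larger the error, the stronger the correction, the system settling close to (though, with this simple proportional scheme, never exactly at) the setpoint.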

Of cyberpunks and cybergoths

A cyberpunk Lindsay Lohan sipping martinis with Johnny Depp and a silver alien by AiJunkie.

The youth subcultures “cyberpunk” and “cybergoth” had common threads in the visual imagery of science fiction (SF) but differ in matters of fashion and political linkages.  Academic studies have suggested elements of cyberpunk can be traced to the dystopian Central & Eastern European fiction of the 1920s which arose in reaction to the industrial and mechanized nature of World War I (1914-1918) but in its recognizably modern form it emerged as a literary genre in the 1980s, characterized by darkness, the effect heightened by the use of stark colors in futuristic, dystopian settings, the cultural theme being the mix of low-life with high-tech.  Although often there was much representation of violence and flashy weaponry, the consistent motifs were advanced technology, artificial intelligence and hacking, the message the evil of corporations and corrupt politicians exploiting technology to control society for their own purposes of profit and power.  Aesthetically, cyberpunk emphasized dark, gritty, urban environments where the dominant visual elements tended to be beyond the human scale, neon colors, strobe lighting and skyscrapers all tending to overwhelm people who often existed in an atmosphere of atonal, repetitive sound.

Cybergoth girls: The lasting legacy of the cybergoth's contribution to the goth aesthetic was designer colors, quite a change to the black & purple uniform.  Debate continues about whether they can be blamed for fluffy leg-warmers.

The cybergoth thing, the name dating apparently from 1988, was less political, focusing mostly on the look although a lifestyle (real and imagined) somewhat removed from mainstream society was implied.  It emerged in the late 1990s as a subculture within the goth scene and was much influenced by the fashions popularized by cyberpunk and the video content associated with industrial music although, unlike cyberpunk, there was never the overt connection with cybernetic themes.  Very much in a symbiotic relationship with Japanese youth culture, the cybergoth aesthetic built on the black & purple base of the classic goths with bright neon colors, industrial materials and a mix of the futuristic and the industrial in the array of accessories, which included props such as LED lights, goggles, gas masks and synthetic hair extensions.  Unlike the cyberpunks, who insisted usually on leather, the cybergoths embraced latex and plastics such as PVC (polyvinyl chloride), not to imitate the natural product but as an item in itself, while the hairstyles and makeup could be extravagantly elaborate.  Platform boots and clothing often adorned with spikes, studs and chains were common but tattoos, piercings and other body modifications were not an integral component although many who adopted those things also opted to include cybergoth elements.

Although there was much visual overlap between the two, cyberpunk should be thought of as a dystopian literary and cinematic genre with an emphasis on high-tech while cybergoth was a goth subculture tied to certain variations in look and consumption of pop culture, notably the idea of the "industrial dance" which was an out-growth of the "gravers" (gothic ravers) movement, named as goths became a critical mass in the clubs built on industrial music.  While interest in cyberpunk remains strong, strengthened by the adaptability of generative AI to the creation of work in the area, the historic moment of cyberpunk as a force in pop culture has passed, the fate of many subcultures which have suffered the curse of popularity although history does suggest periodic revivals will happen and elements of the look will anyway endure.

Monday, April 29, 2024

Palliate

Palliate (pronounced pal-ee-yet)

(1) To relieve or lessen (pain, disease etc) without curing or removing; to mitigate; to alleviate.

(2) To attempt to mitigate or conceal the gravity of (conduct (especially as of offenses)) by excuses, reasons, apologies etc; to extenuate.

(3) To cause an offence to seem less serious; some act of concealment.

1490s: From the Late Latin palliāre (to cover up), from palliātus (cloaked, covered), in Late Latin the past participle of palliāre, from the Latin pallium (cloak).  Palliate is a verb & adjective, palliation, palliator & pallium are nouns, palliative is a noun & adjective, unpalliated is an adjective, palliated & palliating are verbs and palliatively is an adverb; the common noun plural is palliatives.

Palliate is one of those words in English which has become mostly overwhelmed by the associative meaning of a derived form.  Palliative medicine (or palliative care) is a branch of medicine which focuses on the terminally ill (usually those with months, at the most, to live), providing pain relief and attempting to allow the dying to enjoy the best possible quality of life.  The alternative industry is that of voluntary euthanasia (the so-called right-to-die movement) which is now permitted and regulated by legislation in many jurisdictions.  Palliative medicine gained the name from the idea of the use of "palliatives", drugs which provide pain relief for those for whom there is no possibility of a cure.  In that sense, the treatment regime "cloaks rather than cures" and expectations are limited to concealment of the consequences of the condition.  Although such practices (along with euthanasia, voluntary and not) had been part of medical practice for centuries, it was in the 1960s it came to be recognized as a discipline and a structural part of (or adjunct to, depending on the jurisdiction) the hospital industry, and there are both academic courses in the subject and peer-reviewed journals such as the European Association for Palliative Care's (EAPC) Palliative Medicine, published since 1987.  Although On Death and Dying (1969) by Swiss-American psychiatrist Elisabeth Kübler-Ross (1926–2004) is sometimes cited as the intellectual impetus for the emergence, it happened really because of the mid-century advances in hygiene, nutrition, pharmaceuticals & surgical techniques and the extension of medical services in the welfare states, which extended life-spans but not necessarily wellness, thus the increasing population of those terminally ill and in need of care.  The ability to prolong the life (sometimes for decades) of someone in a debilitated condition, combined with the social changes which had seen the decline in numbers of extended family living arrangements, meant a substantially public-funded industry needed to evolve.

Cloaked for the occasion: Lindsay Lohan in appropriate Grim Reaper mode, fulfilling a court-mandated community service order at LA County Morgue, October 2011.

That has meant the word has faded from some of its historic uses.  In law, it used to be part of the language of courtrooms, defense counsel attempting to palliate the conduct of their client in the hope the judge or jury would view the alleged act less harshly and deliver a verdict less severe.  That sense came into use in seventeenth century England and in courtrooms it described attempts to cover or disguise the seriousness of an offence by reasons (fanciful & not), excuses (plausible & not) or apologies (sincere & not).  In legal use, palliate has been replaced by mitigation (a plea assembling reasons why conduct should be regarded more favourably than it may appear and thus rewarded with a lesser sentence), from the Middle French mitigation, from the Latin mītigātiō, from mītigātus (softened, pacified).  The companion term is exculpation which etymologically and legally is unrelated both to palliate & mitigate.  Exculpate was from the Medieval Latin exculpātus, the perfect passive participle of exculpō, from the Latin ex culpa, the construct being ex- (out, from) + culpa (fault; blame (and familiar in Modern English as "culpability")).  Whereas a plea of palliation or in mitigation is entered in the context of asking certain matters be considered so a guilty party may receive a lesser punishment, a successful exculpation exonerates the accused.  The lawyers in the 1630s picked up and adapted palliate's earlier meaning.  In the fifteenth century, true to the Latin origin derived from "a cloak", it was used to mean "to relieve the symptoms of; to ameliorate", the sense (concealing the symptoms) to which palliative medicine would in the 1960s return.  This use was extended by the mid-1500s to become a general way to "conceal, hide or disguise" and was used widely in fields such as tailoring, architecture, landscaping, interior decorating and anywhere else where techniques of illusion were valued.

Many of the artistic depictions of scenes from Antiquity are probably at least misleading (no epoch has ever been so idealized) but one aspect of the fashions seems usually faithfully to have reflected what really was: the garb of the physicians, philosophers and teachers which was a woollen cloak, draped over the left shoulder and wrapped around the body; the Romans called it a pallium and it was the stage garment also of the hetaerae (plural of hetaera (in Ancient Greece, a high-price escort of some beauty & culture who entertained upper-class men with company, conversation and other services; they're sometimes referred to as courtesans but this can be misleading and a more accurate modern comparison is probably with the business model of the “sugar-babe”)).

Appreciative audience: Phryne revealed before the Areopagus (1861), oil on canvas by Jean-Léon Gérôme (1824-1904), Hamburger Kunsthalle, Hamburg.

The painting depicts Phryne (circa 371-320 BC), a legendarily beautiful hetaera of Ancient Greece, on trial before the Areopagus (from the Ancient Greek Ἄρειος Πάγος (Áreios Págos (literally "Rock of Ares"))) which during some periods in classical times functioned as the final appellate court (in both civil & criminal matters) in Athens.  As a deliberative body, the Areopagus (it picked up the name from the location where the sittings were conducted) may also at times have been a legislative (or at least an advisory) assembly, something like a senate.  The comparison with the UK's House of Lords is sometimes made because of the dual function as both a legislative body and a final court of appeal but the history of the role of the Areopagus in law-making is sketchy and as a judicial organ it seems also to have sat as a whole, never restricting (as the Lords eventually did) the judicial hearings to committees of those with appropriate legal experience.

Defended (and by dubious legend not very well) by the speech-writer Hypereides (circa 390–322 BC), she was arraigned before the Areopagus on a charge of Asebeia (a criminal indictment alleging impiety, something like blasphemy towards the divine objects, perhaps an occupational risk in her profession, and the charge appears to have been brought by a jilted and vengeful ex) and the most told tale of the trial is that acquittal was secured when she bared her breasts to those assembled to judge.  Depending on which imaginative medieval scribe was writing, either her counsel pulled the pallium from her body or she disrobed herself although all agree the unusual legal tactic was resorted to because the defence was going not well.  In the famous legal critique of the Roman writer Marcus Fabius Quintilianus (circa 35-circa 100), the verdict was secured "non Hyperidis actione... sed conspectus corporis" (not by Hypereides' pleading, but by the sight of her body) and as a gesture it wasn't unknown in Athenian culture.  Although the trial and acquittal (by a majority vote) are uncontested history, whether the "boobs offered in mitigation" ever happened is at least suspect but if true, it's not surprising the venerable gentlemen judging her were impressed because she also modelled her nude form for the sculptor Praxiteles, who based his Aphrodite of Knidos on those sessions.  In the late eighteenth century, something of a Phryne cult formed among European artists although what is history and what myth in the stories of her illustrious career is mostly uncertain, though there's no doubt she'd often have worn a pallium.

Containing bilberry, witch hazel, mangosteen, sage, rosemary, calendula, rose flower, sea buckthorn, lemon grass, grapefruit, nettle & Iceland moss, Life Roots' Palliate Cream is advertized as an agent to (1) moisturize, (2) reduce inflammation & (3) protect against dryness.  This would suggest the product is thought something which genuinely improves the state of the skin, rather than just "papering over the cracks" (as some skin-care products unashamedly are).  The phrase "to paper over the cracks" is a particular sense of palliation meaning "to use a temporary expedient; to create the semblance of order or agreement; temporarily to conceal problems".  The phrase (in English translation) is attributed to the Prussian statesman Otto von Bismarck (1815-1898; Chancellor of the German Empire 1871-1890) who used the equivalent German expression in a letter dated 14 August 1865 during the negotiations of the Convention of Gastein (1865), a treaty between Austria and Prussia which temporarily postponed the onset of the Austro-Prussian War (1866) and can thus be thought a prelude to the wars and the subsequent system of intricately interlocked treaties which would be the framework of the Bismarckian form of Reichism: "We are working eagerly to preserve the peace and to cover the cracks in the building."  Under Bismarck, the stresses inherent in the structure were contained but in the hands of his less able successors the forces were unleashed and consumed the continent, ending the rule of four dynastic empires.  Still, "papering over the cracks" remains often the way politics is done, usually the way coalitions are formed and of late a new flavor of the technique has emerged: Benjamin Netanyahu (b 1949; Israeli prime minister 1996-1999, 2009-2021 and since 2022) doesn't care if people see the cracks through the paper.