
Tuesday, January 9, 2024

Compunction

Compunction (pronounced kuhm-puhngk-shuhn)

(1) A feeling of uneasiness or anxiety of the conscience caused by regret for doing wrong or causing pain; contrition; remorse; sorrow.

(2) Any uneasiness or hesitation about the rightness of an action.

1350–1400: From the Middle English compunccion, from the Old French compunction (from which in the twelfth century Modern French gained compunction), from the Late Latin compunctionem (a pricking) & compūnctiōn- (stem of the Ecclesiastical Latin compunctiō) (remorse; a stinging or pricking (of one’s guilty conscience)), the construct being the Classical Latin compūnct(us) (past participle of compungere (to sting; severely to prick), the construct of which was (com- (used as an intensive prefix) + pungere (to prick; to puncture) (from a suffixed form of the primitive Indo-European root peuk- (to prick)) + -iōn- (stem of –iō and a suffix forming nouns, used especially on past participle stems).  The origin of the meaning in Latin (transferred from the element pungere (to prick; to puncture)) was the idea of “a pricking of one’s guilty conscience” which could induce some feeling of regret although, like many injuries caused by pin-pricks, recovery was often rapid.  The adjective compunctious (causing compunction, pricking the conscience) dates from the late sixteenth century.  Compunction & compunctiousness are nouns, compunctious & compunctionless are adjectives and compunctiously is an adverb; the noun plural is compunctions.

The Ecclesiastical Latin compunctiō (and compunction in other forms) appears frequently in the texts of the early Church, used in a figurative sense originally to convey a more intense sense of “contrition” or “remorse” than that familiar in modern use.  Contrition and remorse were of course a thing vital for the Church to foster, indeed to demand of the congregation.  The very structure of Christianity was built upon the idea that all were born in a state of guilt because the very act of conception depended upon an original sin and this was what made Jesus unique: the virgin birth meant Christ was born without sin although centuries of theological squabbles would ensue as the debate swirled about his nature as (1) man, (2) the son of God and (3) God.  That was too abstract for most which was fine with the priests who preferred to focus on the guilt of their flock and their own importance as the intermediaries between God and sinner, there to arrange forgiveness, something which turned out to be a commodity and commodities are there to be sold.  Forgiveness was really the first futures market and compunction was one of the currencies although gold and other mediums of exchange would also figure.

Sorry (Regretful or apologetic for one's actions) was from the Middle English sory, from the Old English sāriġ (feeling or expressing grief, sorry, grieved, sorrowful, sad, mournful, bitter), from the Proto-West Germanic sairag, from the Proto-Germanic sairagaz (sad), from the primitive Indo-European seh₂yro (hard, rough, painful).  It was cognate with the Scots sairie (sad, grieved), the Saterland Frisian seerich (sore, inflamed), the West Frisian searich (sad, sorry), the Low German serig (sick, scabby), the German dialectal sehrig (sore, sad, painful) and the Swedish sårig.  Remarkably, despite the similarities in spelling and meaning, “sorry” is etymologically unrelated to “sorrow”.  Sorrow (a state of woe; unhappiness) was from the Middle English sorow, sorwe, sorghe & sorȝe, from the Old English sorg & sorh (care, anxiety, sorrow, grief), from the Proto-West Germanic sorgu, from the Proto-Germanic surgō (which may be compared with the West Frisian soarch, the Dutch zorg, the German Sorge, and the Danish, Swedish and Norwegian sorg), from the primitive Indo-European swergh (watch over, worry; be ill, suffer) (which may be compared with the Old Irish serg (sickness), the Tocharian B sark (sickness), the Lithuanian sirgti (be sick) and the Sanskrit सूर्क्षति (sū́rkati) (worry)).

Johnny Depp & Amber Heard saying sorry in Australia and Johnny Depp deconstructing sorry in London.

Sorry indicates (1) one is regretful or apologetic for one’s thoughts or actions but it can also mean (2) one is grieved or saddened (especially by the loss of something or someone), (3) someone or something is in a sad or regrettable state or (4) someone or something is hopelessly inadequate for their intended role or purpose.  Such is human nature that expressions of sorry in the sense of an apology are among the more common exchanges and one suspects something like the 80/20 rule applies: 80% of apologies are offered by (or extracted from) 20% of the population.  So frequent are they that an art has evolved to produce phrases by which an apology can be delivered in which sorry is somehow said without actually saying sorry.  This is the compunction one feels when one is not feeling compunctious and a classic example was provided when the once (perhaps then happily) married actors Johnny Depp (b 1963) & Amber Heard (b 1986) were in 2015 caught bringing two pet dogs into Australia in violation of the country’s strict biosecurity laws.  Ms Heard pleaded guilty to falsifying quarantine documents, stating in mitigation her mistake was induced by “sleep deprivation”.  No conviction was recorded (the maximum sentence available being ten years in jail) and she was placed on an Aus$1,000 one-month good behavior bond, the couple ordered to make a “public apology” and that they did, a short video provided, the script unexceptional but the performances something like a Monty Python sketch.  However, whatever the brief performance lacked in sincerity, as free advertising for the biosecurity regime, it was invaluable.  Mr Depp later returned to the subject when promoting a film in London.

The synonyms for “sorry” (as in an apology) include regret, apologize, compunctious, contrite, penitent, regretful, remorseful & repentant (which is more a subsequent act).  Practiced in the art of the “non-apologetic” apology are politicians (some of whom have honed it to the point where it’s more a science) who have a number of ways of nuancing things.  Sometimes the excuse is that simply to say “sorry” might in subsequent legal proceedings be construed as an admission of liability, thus exposing the exchequer and there was some basis for that concept which has prompted some jurisdictions explicitly to write into legislation that in traffic accidents and such, simply to say “sorry” cannot be construed as such an admission.  That of course has had no apparent effect on the behaviour of politicians.  Even when there is no possibility of exposing the state to some sort of claim, politicians are still averse to anything like the word “sorry” because it’s seen as a “loss of face” and a victory for one’s opponents.

There are exceptions.  Some politicians, especially during periods of high popularity, worked out that, such was the novelty, saying sorry could work quite well, especially if delivered in a manner which seemed sincere (and the right subject, in the right hands, can learn such tricks) although some who found it worked did overdo it, the repetition making it clear it was just another cynical tactic.  An example was Peter Beattie (b 1952; Premier of Queensland 1998-2007) who found the electorate responded well to a leader saying sorry but such was the low quality of the government he headed that there was often something for which to apologize and having set the precedent, he felt compelled to carry on until the sheer repetitive volume of the compunctiousness began merely to draw attention to all the incompetence.

Lindsay Lohan and her lawyer in court, Los Angeles, December 2011.

The other exception is the set-piece event.  This is where a politician apologizes on behalf of someone else (a previous government, hopefully the opposition or something as vague as the nation in some dim, distant past) while making it clear it’s nothing to do with them personally.  There has been a spate of these in recent decades, many apologizing for egregiously appalling acts by white men against ethnic minorities, indigenous populations, the disabled or other powerless groups.  Again, some of the apologies have been in the form of “personally sorry it happened”, thereby ticking the box without costing anything; people like an indigenous population apparently deserving words but not compensation.  For the rest of us, ranging from the genuinely sincere to the cynically opportunistic nihilistic psychopaths, the most obvious tool is the adverb: to say “I am so sorry” can be more effective than “I’m sorry” provided the tone of voice, inflections and the non-verbal cues are all in accord.  Sorry is recommended by many because it can be made to sound sincere with an ease that’s challenging with compunctious, contrite, penitent, regretful, and remorseful, the longer words ideal for one politician “apologizing” to another in a form which is linguistically correct while being quite contemptuous.

Saturday, October 14, 2023

Bubble

Bubble (pronounced buhb-uhl)

(1) A spherical globule of gas (or vacuum) contained in a liquid or solid.

(2) Anything that lacks firmness, substance, or permanence; an illusion or delusion.

(3) An inflated speculation, especially if fraudulent.

(4) The act or sound of bubbling.

(5) A spherical or nearly spherical canopy or shelter; dome.

(6) To form, produce, or release bubbles; effervesce.

(7) To flow or spout with a gurgling noise; gurgle.

(8) To speak, move, issue forth, or exist in a lively, sparkling manner; exude cheer.

(9) To seethe or stir, as with excitement; to boil.

(10) To cheat; deceive; swindle (archaic).

(11) To cry (archaic Scots).

(12) A type of skirt.

(13) In infection control management, a system of physical isolation in which un-infected sub-sets of the population are protected by restricting their exposure to others.

1350-1400: From the Middle English noun bobel which may have been from the Middle Dutch bubbel & bobbel and/or the Low German bubbel (bubble) and Middle Low German verb bubbele, all thought to be of echoic origin.  The related forms include the Swedish bubbla (bubble), the Danish boble (bubble) and the Dutch bobbel.  The use to describe markets, inflated in value by speculation wildly beyond any relationship to their intrinsic value, dates from the South Sea Bubble which began circa 1711 and collapsed in 1720.  In response to the collapse, parliament passed The Bubble Act (1720), which required anyone seeking to float a joint-stock company to first secure a royal charter.  Interestingly, the act was supported by the South Sea Company before its failure.  Ever since cryptocurrencies emerged, many have been describing them as a bubble which will burst and while that has happened with particular coins (the exchange collapses are something different), the industry thus far has continued with only the occasional period of deflation.  Bubble & bubbling are nouns & verbs, bubbler is a noun, bubbled is a verb, bubbly is a noun & adjective, bubbleless & bubblelike are adjectives and bubblingly is an adverb; the noun plural is bubbles.

An artificial tulip in elisa mauve.

However, although the South Sea affair was the first use of “bubble” to describe such a market condition, it wasn’t the first instance of a bubble which is usually regarded as the Dutch tulpenmanie (tulip mania) which bounced during the 1630s, contract prices for some bulbs of the recently introduced and wildly fashionable tulip reaching extraordinarily high levels, the values accelerating from 1634 until a sudden collapse in 1637.  Apparently just a thing explained by a classic supply and demand curve, the tulip bubble burst with the first big harvest which demonstrated the bulbs and flowers were really quite common.  In history, there would have been many previous bubbles but it wasn’t until the economies and financial systems of early-modern Europe were operating that the technical conditions existed for them to manifest in the form and to the extent we now understand.  Interestingly, for something often regarded as the proto-speculative asset bubble and a landmark in economic history, twentieth-century revisionist historians have suggested it was more a behavioral phenomenon than anything with any great influence on the operation of financial markets or the real economy, the “economic golden age” of the Dutch Republic apparently continuing unaffected for almost a century after the bottom fell out of the tulip market.  The figurative uses have been created or emerged as required, the first reference to anything wanting firmness, substance, or permanence is from the 1590s.  The soap-bubble dates from 1800, bubble-shell is from 1847, bubble-gum was introduced in 1935 and bubble-bath appears first to have been sold in 1937.  The slang noun variation “bubbly” was first noted in 1920, an invention of US English.

The word "bubble" spiked shortly after the start of the Covid-19 pandemic.  Over time, use has expanded to encompass large-scale operations like touring sporting teams and even the geographical spaces used for the 2022 Beijing Winter Olympics but the original meaning was more modest: small groups based on close friends, an extended family or co-workers.  These small bubbles weren't supposed to be too elastic and operated in conjunction with other limits imposed in various jurisdictions; a bubble might consist of a dozen people but a local authority might limit gatherings to ten in the one physical space so two could miss out, depending on the details in the local rules.  Bubble thus began as an an unofficial term used to describe the cluster of people outside the household with whom one felt comfortable in an age of pandemic.

Tulips

Bubbles were however a means of risk-reduction, not a form of quarantine.  The risks in a bubble still exist, most obviously because some may belong to more than one bubble, contact thus having a multiplier effect, the greater the number of interactions, the greater the odds of infection.  Staying home and limiting physical contact with others remained preferable, the next best thing to an actual quarantine.  The more rigorously administered bubbles used for events like the Olympics are essentially exercises in perimeter control, a defined "clean" area, entry into which is restricted to those tested and found uninfected.  At the scale of something like an Olympic games, it's a massive undertaking to secure the edges but, given sufficient resource allocation, it can be done although it's probably misleading to speak of such an operation as a "bubble".  Done with the static-spaces of Olympic venues, they're really quarantine-zones.  Bubble more correctly describes touring sporting teams which move as isolated bubbles often through unregulated space.

The Bubble Skirt

A type of short skirt with a balloon style silhouette, the bubble dress (more accurately described as a bubble skirt because that’s the bit to which the description applies) is characterized by a voluminous skirt with the hem folded back on itself to create a “bubble” effect at the hemline.  Within the industry, it was initially called a tulip skirt, apparently because of a vague resemblance to the flower but the public preferred bubble.  It shouldn’t be confused with the modern tulip skirt and the tulip-bubble thing is just a linguistic coincidence, there’s no link with the Dutch tulipmania of the 1630s.  Stylistically, the bubble design is a borrowing from the nineteenth century bouffant gown which featured a silhouette made of a wide, full skirt resembling a hoop skirt, sometimes with a hoop or petticoat support underneath the skirt.  While bouffant gowns could be tea (mid-calf) or floor length, bubble skirts truncate the look, hemlines tending to be well above the knee.  Perhaps with a little more geometric accuracy, the design is known also as the “puffball” and, in an allusion to oriental imagery, the “harem” skirt.  Fashion designer Christian Lacroix became fond of the look and a variation included in his debut collection was dubbed “le pouf” but, in English, the idea of the “poof skirt” never caught on.

Lindsay Lohan in Catherine Malandrino silk pintuck dress with bubble skirt, LG Scarlet HDTV Launch Party, Pacific Design Center, Los Angeles, April 2008.

It must have been a memorable silhouette in the still austere post-war world, a sheath dress made voluminous with layers of organza or tulle, the result a cocoon-like dress with which Pierre Cardin and Hubert de Givenchy experimented in 1954 and 1958, respectively. A year later, Yves Saint Laurent for Dior added the combination of a dropped waist dress and bubble skirt; post-modernism had arrived.  For dressmakers, bubble fashion presented a structural challenge and mass-production became economically feasible only because of advances in material engineering, newly available plastics able to be molded in a way that made possible the unique inner construction and iconic drape of the fabric.  For that effect to work, bubble skirts must be made with a soft, pliable fabric and the catwalk originals were constructed from silk, as are many of the high end articles available today but mass-market copies are usually rendered from cotton, polyester knits, satin or taffeta.

The bubble in the 1950s by Pierre Cardin (left), Givenchy (centre) & Dior (right).

The bubble skirt was never a staple of the industry, in the sense that it could be missing from annual or seasonal ranges, sometimes for a decade or more, and sales were never high, hardly surprising given it was not often a flattering look for women above a certain age, probably about seven or eight.  Deconstructing the style hints at why: a hemline which loops around and comes back up, created sometimes by including a tighter bottom half with the bulk of additional material above, it formed a shape not dissimilar to a pillow midway through losing its stuffing.  For that reason, models caution the look is best when combined with a sleek, fitted top to emphasize the slimness of the waistline, cinched if necessary with a belt or some sort of delineating tie.  The bubble needs to be the feature piece too, avoiding details or accessories which might otherwise distract; if one is wearing a partially un-stuffed pillow, the point needs to be made it’s being done on purpose.

The bubble is adaptable although just because something can be done doesn’t mean it should be done.  The bubble skirt has however received the Paris Hilton imprimatur so there’s that.

On the catwalks however, again seemingly every decade or so, the bubble returns, the industry relying on the short attention span of consumers of pop culture inducing a collective amnesia which allows many resuscitations in tailoring to seem vaguely original.  Still, if ever a good case could be made for a take on a whimsical 1950s creation to re-appear, it was the staging of the first shows of the 2020-2021 post-pandemic world and the houses responded, Louis Vuitton, Erdem, Simone Rocha and JW Anderson all with billowy offerings, even seen was an improbably exuberant flourish of volume from Burberry.  What appeared on the post-Covid catwalk seemed less disciplined than the post-war originals, the precise constraints of intricately stitched tulle forsaken to permit a little more swish and flow, a romantic rather than decadent look.  The reception was generally polite but for those who hoped for a different interpretation, history suggests the bubble will be back in a dozen-odd years.

Friday, September 15, 2023

Plague

Plague (pronounced pleyg)

(1) An infectious, epidemic disease caused by a bacterium, Yersinia pestis (transmitted to man by the bite of the rat flea (Xenopsylla cheopis)), characterized by fever, chills, and prostration.

(2) In casual use, any epidemic disease that causes high mortality; pestilence.

(3) Any widespread affliction, calamity, or evil, especially one regarded as divine retribution.

(4) Any cause of trouble, annoyance, or vexation; torment; to pester.

(5) As in “… a plague upon…”, to curse another, wishing any evil upon them.  The variation “a plague upon both your houses” suggests an unwillingness to take sides, an implication one thinks both parties are in the wrong. 

1350-1400: From the Middle English plage, a borrowing from the Old French plage, from the Latin plāga (blow, wound (and pestilence in Late Latin)), from plangō or plangere (to strike), the ultimate root being the Ancient Greek plēgē (a stroke).  It was cognate with the Middle Dutch plāghe (from the Dutch plaag) & plāghen (from the Dutch plagen), the Middle Low German plāge, the Middle High German plāge & pflāge (from the German plage) & plāgen (from the German plagen), the Swedish plåga, the French plaie and the Occitan plaga.  Plague exists as verb and noun, plaguer being the other noun, plaguing & plagued the verbs.  Other derived forms (plaguesome & plaguy) exist but are rarely seen except in historic writing.  For the actual disease there’s no true synonym but many words tend to be used interchangeably in any context: invasion, scourge, contagion, pandemic, epidemic, curse, infection, outbreak, influenza, infestation, blight, calamity, pest, cancer, bedevil, afflict, beleaguer, bother, haunt, torment.

The famous phrase "A plague o' both your houses" is from William Shakespeare's (1564–1616) Romeo and Juliet (1597).  When Mercutio says a "plague o' both your houses", he is damning both the Montagues and Capulets, asking fate to visit some awful retribution upon the families because he blames both for his imminent death.  In modern use, it's used to suggest an unwillingness to take sides, the implication being one thinks both parties are in the wrong:

Mercutio. Help me into some house, Benvolio,
Or I shall faint. A plague o' both your houses!
They have made worms' meat of me: I have it,
And soundly too: your houses!

Romeo and Juliet, Act III, Scene 1

Plagues and the Plague

Masked-up: Lindsay Lohan avoiding plague.

Plague is an infectious disease caused by the bacterium Yersinia pestis and exists in three forms: Bubonic plague, Septicemic plague & Pneumonic plague, the first two usually contracted by the handling of an infected animal or the bite of a flea, the last by contact between people via infectious droplets in the air.  Typically, several hundred cases are reported annually, mostly in India, the Congo, Madagascar & Peru and cases have been reported in the US but historically, outbreaks were large-scale events lasting months or years, the best known of which include the fourteenth century Black Death, estimated to have killed some fifty million and the Great Plague of London which, in 1665-1666, caused the death of one in five of the city's population.  COVID-19 was thus a plague but not the plague.  A common noun, plague is written with an initial capital only at the beginning of a sentence, or (as in the Great Plague of London) when it has become a thing.  Notable epidemics have included:

The Black Death (1346-1353)

Death Toll: 75 – 200 million; Cause: Bubonic Plague

The Plague ravaged Europe, Africa, and Asia, with a death toll of 75-200 million, killing up to half the population of some European countries.  Thought to have originated in Asia, Plague was most likely spread by fleas living on the rats of merchant ships and in some countries, populations didn’t recover until the nineteenth century.  Although now unknown in most parts of the world, outbreaks still happen in various places.

Plague of Justinian (541-542)

Death Toll: 25 million; Cause: Bubonic Plague

Thought to have killed perhaps half the population of Europe, the Plague of Justinian afflicted the Byzantine Empire and Mediterranean port cities.  The first verified and well-documented incident of the Bubonic Plague, it reduced the population of the Eastern Mediterranean by a quarter and devastated Constantinople, where, at the height of the pandemic, 5,000 a day were dying.

Antonine Plague (165 AD)

Death Toll: 5 million; Cause: Unknown

Also known as the Plague of Galen, the Antonine Plague affected Asia Minor (the modern Republic of Türkiye), Egypt, Greece, and Italy and is thought to have been either Smallpox or Measles, though the true cause is unknown. The disease was brought to Rome by soldiers returning from Mesopotamia.  The pandemic significantly weakened the Roman army.

London and the plagues of Plague

A London Bill of Mortality, 1665.

During the sixteenth & seventeenth centuries when "bubonic plague was abroad", the authorities compiled "Bills of Mortality" listing the causes of death recorded that week.  It's now believed the statistics are not wholly reliable (Plague numbers, like the global toll from Covid-19, believed greatly to have been understated) but the startling ratio of deaths attributed to Plague compared with other causes is indicative of the deadly nature of the epidemic.  In one week 3,880 residents of London were reported as having succumbed to Plague, dwarfing the number recorded as dying by other causes including Old Age (54), Consumption (Tuberculosis) (174), Small Pox (10), Fright (1), Grief (1), Spotted Fever and the Purples (190), Griping in the Guts (74), Lethargy (1), Rising of the Lights (19) and Wind (1).  Like the Covid-19 statistics, there was likely some overlap in the numbers but the disparity remains striking.

After the Black Death, London's major plague epidemics occurred in 1563, 1593, 1625 and 1665 and although the last is best-known (associated as it was with the Great Fire of 1666), it's believed it was during the 1563 event the city suffered the greatest proportional mortality with between a quarter and a third of the population dying; losses have been estimated to be as high as 18,000 and in some weeks the toll exceeded 1,000.  From there, the disease spread around the nation the following year, the fleas which were the primary vector of transmission having hibernated through what was a comparatively mild winter.  Echoing the political and military effects of epidemics noted since Antiquity, it was at this time England was compelled to give up its last French possession, Le Havre, which was being held as a hostage for Calais.  Plague broke out in the occupying garrison and few troops escaped infection so the town had to be surrendered.

There were small, manageable outbreaks in 1603 & 1610-1611 but the epidemic of 1625 was severe and associated with a notable internal migration as those with the means to leave London did so, the reduction in the number of magistrates & doctors noted as inducing the predictable social consequences although as time passed, it was clear the disease was becoming less virulent and the mortality rate had fallen, something now attributed at least partially to the so-called "harvesting effect".  After 1666, the Plague didn't vanish and there were periodic outbreaks but the lessons had been well-learned and the efficiency of communications and the still embryonic public-health infrastructure operated well, even if little progress had been made in actual medical techniques.  The Hull (an East Yorkshire port city) Plague of 1699 was contained with little spread and when an outbreak of fever was reported in Marseilles in 1720, stricter quarantine measures were imposed in English ports which successfully prevented any great spread.  Throughout the eighteenth & nineteenth centuries (as late as 1896-1897) there were occasional isolated cases and small outbreaks of plague in various parts of England but none ever remotely approached the scale of the 1665-1666 epidemic.

Werner Herzog's Nosferatu (1979) 

Werner Herzog's (b 1942) 1979 remake of Friedrich Wilhelm Murnau's (1888–1931) masterpiece of Weimar expressionism (Nosferatu (1922)) takes place mostly in a small German city afflicted suddenly by Plague, Herzog rendering something chilling and darkly austere, despite the stylistic flourishes.  The 1979 film delivered the definitive screen Dracula and was a piece to enjoy when living in the social isolation of the Covid era.

Scene from Werner Herzog's Nosferatu (1979)

Monday, July 17, 2023

Afforce

Afforce (pronounced af-fors)

(1) To strengthen or reinforce a deliberative body, such as a jury or tribunal, by the addition of other or specially skilled members.

(2) To force; compel; violate (obsolete).

(3) Reflexively, to exert one's self; endeavour; attempt (obsolete).

1400s: From the Middle English (in the sense “to force”), from the Old French aforcer, from the Latin exfortiāre, from fortis (strong), from the Proto-Italic forktis, from the primitive Indo-European baergh (to rise, high, hill).  The a- prefix as used here is rare and is in English no longer productive.  It was related to the Latin ad- (to; at) and was used to show or emphasize a state, condition, or manner and was common in Old & Middle English, some of the constructs still used poetically (apace, afire, aboil, a-bling) and some where the specific, technical meaning has endured (asunder, astern).  The Oxford English Dictionary (OED) noted the descent of many of these forms to the archaic, suggesting it was part of the organic evolution of the language, these “…prefixes were at length confusedly lumped together in idea, and the resultant a- looked upon as vaguely intensive, rhetorical, euphonic or even archaic and wholly otiose.”  The double-ff is a written tribute to the spoken, afforce formed with an oral prefix; the noun counterpart of this was æf-.  Afforce, afforcing & afforced are verbs, afforcement is a noun; the noun plural is afforcements.

Afforce thus emerged just as a way of emphasizing the notion of force or indicating the act transpiring.  Geoffrey Chaucer (circa 1343-1400) in The Man of Law's Tale (1387), the fifth of the Canterbury Tales, uses afforce in that sense:  Than whan thys wycked Thelous by harde manasses and hys grete strengh the had wyll to afforce her, than she restreynyd hys gret foly by thys reason, ffor cause that her Chylde Moryce the whyche was of the age of.

That strict arbiter of English use, Sir Ernest Gowers (1880-1966), noted approvingly in his second edition (1965) of Henry Fowler's (1858–1933) Modern English Usage (1926) that the OED as early as 1888 ruled afforce was for all purposes obsolete save "to reinforce or strengthen a deliberative body by the addition of new members, as a jury by skilled assessors or persons acquainted with the facts".  Sir Ernest seemed also pleased the OED had sought to drive a stake through afforce's linguistic heart by not including an entry in the concise (COD) edition of the OED, adding that he regarded any revival as but a flashy "pride of knowledge", a most "un-amiable characteristic", the display of which "sedulously should be avoided".  Sir Ernest had spoken, Henry Fowler would have concurred and in any sense afforce remains vanishingly rare.

Manchester Assize Courts 1934.  Damaged by Luftwaffe raids in 1940-1941, it was demolished in 1957.  Perhaps surprisingly, given some of the ghastly stuff built in post-war years, the replacement Crown Court building has some nice touches and not unpleasing lines.

It was the operation of jury trials in English law which saw the meaning beginning to shift although the legal use did encapsulate both senses.  At common law, the practice to “afforce the assize” was a method for a court to secure a verdict where the jury disagreed.  This was achieved by adding other jurors to the panel until twelve could be found who were unanimous in their opinion, thus the senses (1) afforcement being forcing a jury to verdict and (2) afforcement being the addition of members to the jury.  The word has endured (if rarely used) in this technical sense and not become merely a synonym of augment, somewhat unusual in English where words tend to be co-opted for just about any use which seems to fit and it may be that when courts ceased to afforce juries, the word became stranded in its special, historic sense, a process probably assisted by the fading of the practice of adding the a- prefix.

Vested with both civil and criminal jurisdiction, the Courts of Assize sat between 1293 and 1972 in the counties of England and Wales.  The afforcement of the assize was an ancient practice in trials by jury and involved adding other jurors to the panel in cases where the jurors differed among themselves and couldn’t agree in one finding (sententiam).  In those instances, at the discretion of the judges, either the jury could be afforced or the existing body could be compelled to unanimity by directing the sheriff to lock them up without food or drink until they did agree.  The latter does sound an extreme measure; even when medieval conclaves of cardinals proved unable to organise the numbers to elect a new pope, when their eminences were locked-up, they were at least given bread and water.

However it was done, afforcement or starvation, the objective was to get to the point where there were twelve who could agree on a verdict.  However, as legal theorists at the time observed, this really created a second trial and eventually afforcement was abandoned, both justice and its administration thought better served by an insistence on unanimity (probably an inheritance from canon law and a common thing on the continent where the unanimity of a consultative or deliberative body was deemed indispensable).  Also refined was the practice of confining jurors without meat and drink; now they’re fed and watered and, if after long enough some prove still recalcitrant, the jury is discharged and a new trial may be ordered.  Some jurisdictions have found this too inefficient and have introduced majority verdicts so only ten or eleven of the twelve need to be convinced a defendant is guilty as sin which, as any prosecutor will tell you, they all are. 

Chief Justice Charles Evans Hughes (1862–1948; Chief Justice of the US 1930-1941) administering the oath of office to FDR at the start of his second term, 20 January 1937.

There have too been attempts to afforce the bench.  Franklin Delano Roosevelt (1882–1945; US President 1933-1945), not best pleased at repeatedly having parts of his New Deal legislation declared unconstitutional by the US Supreme Court, in 1937 created the Judicial Procedures Reform Bill which sought to add sympathetic judges to the bench, his argument being that, with the constitution not mandating that there must be nine judges on the bench, it was a matter for congress to determine the number.  He was apparently serious but may also have had in mind the threat in 1911 by the UK’s Liberal Party government to appoint to the House of Lords as many peers as would be necessary to ensure the upper house could no longer block their legislation.  That worked, the peers backing down and allowing the government’s reforms to pass into law, the feeling always that they were less appalled by creeping socialism than by the thought of the House of Lords being flooded with “jumped-up grocers”.  It may also have worked in the US, the "court-packing plan" ultimately not required.  Some months after FDR’s landslide victory in the 1936 presidential election, Justice Owen Roberts (1875–1955; US Supreme Court judge 1930-1945) switched his vote, creating a pro-New Deal majority, an act remembered in judicial history as "the switch in time that saved nine".

The US Supreme Court in session, 1932.  The photo is by Erich Salomon (1886-1944) and is one of two known images of the court in session.  Dr Salomon died in Auschwitz.

The idea of “packing the court” has been revived before but in 2021, congressional Democrats introduced a bill for an act which would expand the Supreme Court bench from nine to thirteen, essentially for the same reasons which attracted FDR in 1937.  Unlike then however, the Democrat control of both houses was marginal and there was no chance of success and even had there been an unexpectedly good result in the 2022 mid-term elections, nothing would have overcome the resistance of conservative Democrats in the senate.  With the Republican-appointed judges (reactionary medievalists or black-letter law judges depending on one’s view) likely to be in place for decades, the 2021 bill is more a shot across the judicial bow and the interplay between electoral outcomes and public opinion, of which the judges are well aware, will bubble and perhaps boil in the years ahead.

Lindsay Lohan on the panel of The Masked Singer (2019).

The Masked Singer Australia is a TV singing competition, the local franchise of a format which began in South Korea as the King of Mask Singer.  The premise is that elaborately costumed masked celebrities sing a song and a panel has to guess their identity.  In 2019, the producers afforced the judging panel with the appointment of Lindsay Lohan and the experiment seems to have been a success despite Ms Lohan having little or no idea who the local celebrities were, masked or otherwise.  That may have been part of the charm of her performance and it seemed to gel with viewers, the second series in 2020, in which Ms Lohan wasn’t able to participate because of COVID-19 quarantine restrictions, seeing a sharp decline in viewer numbers, the opening episode down 37% from 1.2 million to 733k.  Overall, the season average in the five mainland capital cities dropped to 816k from 928k, a year-on-year drop of 12%.  In October 2021, Warner Brothers TV announced a third series had been commissioned for broadcast in 2022 but Lindsay Lohan didn't again afforce the panel, depriving audiences of the chance to watch her try to guess the names of people she's never heard of.  #BringBackLindsay is expected to trend.

Sunday, January 1, 2023

Quarantine

Quarantine (pronounced kwawr-uhn-teen or kwor-uhn-teen)

(1) In historic English common law, the period of 40 days during which a widow was entitled to remain in her deceased husband's home while any dower was collected and returned.

(2) A strict isolation imposed to prevent the spread of disease and (by extension), any rigorous measure of isolation, regardless of the reason.

(3) A period, originally 40 days (the historic understanding of the maximum known incubation period of disease), of detention or isolation imposed upon ships, persons, animals, or plants on arrival at a port or place, when suspected of carrying some infectious or contagious disease; a record system kept by port health authorities in order to monitor and prevent the spread of contagious diseases.  The origin was in measures taken in 1448 in Venice's lazaret to avoid renewed outbreaks of the bubonic plague.

(4) In historic French law, a 40-day period imposed by the king upon warring nobles during which they were forbidden from exacting revenge or to continue warfare.

(5) A place where such isolation is enforced (a lazaret).

(6) In international relations, a blockade of trade, suspension of diplomatic relations, or other action whereby one country seeks to isolate another.

(7) In computing, a place where files suspected of harboring a computer virus or other harmful code are stored in a way preventing infection of other files or machines; the process of such an isolation (a brief sketch follows this list).

(8) To withhold a portion of a welfare payment from a person or group of people (Australia).

(9) To quarantine someone or something.
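
As a side note on sense (7): a minimal sketch in Python of how such a file quarantine might work, assuming nothing about any particular anti-virus product (the directory name, the file name and the decision that a file is "suspect" are all illustrative assumptions):

import shutil
import stat
from pathlib import Path

QUARANTINE_DIR = Path("quarantine")  # hypothetical holding area, kept apart from normal use

def quarantine_file(suspect: Path) -> Path:
    """Move a suspect file into the quarantine directory and render it inert."""
    QUARANTINE_DIR.mkdir(exist_ok=True)
    destination = QUARANTINE_DIR / (suspect.name + ".quarantined")
    shutil.move(str(suspect), destination)  # the file no longer sits among healthy files
    destination.chmod(stat.S_IRUSR)         # owner read-only: it cannot be executed or altered
    return destination

# Hypothetical usage: quarantine_file(Path("suspicious_attachment.exe"))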

1600–1610: From the Middle English quarentine (period a ship suspected of carrying contagious disease is kept in isolation), from the Norman quarenteine, from the French quarenteine, from the Italian quarantina, a variant of quarantena, originally from the upper Italian (Venetian) dialect as quaranta giorni (space of forty days, group of forty), from quaranta (forty) from the Medieval Latin quarentīna (period of forty days; Lent), from the Classical Latin quadrāgintā (four tens, forty) and related to quattuor (four), from the primitive Indo-European root kwetwer (four).  The difference between quarantine and isolation is one of context; while people might for many reasons be isolated, quarantine is a public health measure to deal with those exposed to or at risk of having been infected by a communicable disease, the duration of the quarantine being sufficient to ensure any risk of spreading the infection has passed.  The name is from the Venetian policy (first enforced as the 30 day edict trentino in 1377) of keeping ships from plague-stricken countries waiting off its port for forty days to ensure no latent cases remained aboard.  The extended sense of "any period of forced isolation" dates from the 1670s.  A doublet of carene and quadragene.

In the context of the Ancien Régime (pre-revolutionary France), it was a calque of the French quarantaine, following the edicts of Louis IX (and formalized by the quarantaine du Roi (1704) of Louis XIV which was a mechanism of quieting squabbling nobles).  Quarantine was introduced to international relations as a euphemism for "blockade" in 1937 because the Roosevelt administration was (1) conscious of public reaction to the effects on civilians of the Royal Navy’s blockade of Imperial Germany during World War I (1914-1918) and (2) mindful of legal advice that a “blockade” of a non-belligerent was, under international law, probably an act of war.  The use was revived by the Kennedy administration during the Cuban Missile Crisis (October 1962).  The verb meaning "put under quarantine" dates from 1804 and came quickly to be used in any sense, including figuratively (to isolate, as by authority).  Predating the use in public health, in early sixteenth century English common law, the quarentine was the period of 40 days during which a widow was entitled to remain in her dead husband's home while any dower was collected and returned.  The alternative spellings quarentine, quarantin, quaranteen, quarantain, quarantaine, quarrentine, quarantene, quarentene, quarentyne, querentyne are all obsolete except in historic references.  While not of necessity entirely synonymous, detention, sequester, separation, seclusion, segregation, sequestration, lazaretto, segregate, confine, separate, seclude, insulate, restrict, detach & cordon are at least vaguely similar.  Quarantine is a noun & verb, quarantiner is a noun, quarantinable is an adjective and quarantined & quarantining are verbs & adjectives.

In scripture, the number 40 often occurs although Biblical scholars, always anxious to dismiss musings from numerologists, new age practitioners and crystal-wearing basket weavers, reject the notion it has any special meaning beyond the idea of a “period of trial or struggle”, memorably expressed in the phrase “forty days and forty nights”.  In the Old Testament, when God destroyed the earth in the Great Flood, he delivered rain for 40 days and 40 nights (Genesis 7:12).  After killing the Egyptian, Moses fled to Midian where he spent 40 years in the desert tending flocks (Acts 7:30) and subsequently he stood on Mount Sinai for 40 days and 40 nights (Exodus 24:18) and then interceded on Israel’s behalf for 40 days and 40 nights (Deuteronomy 9:18, 25).  In Deuteronomy 25:3, the maximum number of lashes a man could receive as punishment for a crime was set at 40.  The Israelite spies took 40 days to spy out Canaan (Numbers 13:25), the Israelites wandered for 40 years (Deuteronomy 8:2-5) and before Samson’s deliverance, Israel served the Philistines for 40 years (Judges 13:1).  Goliath taunted Saul’s army for 40 days before David arrived to slay him (1 Samuel 17:16) and when Elijah fled from Jezebel, he traveled 40 days and 40 nights to Mt. Horeb (1 Kings 19:8).  The number 40 also appears in the prophecies of Ezekiel (4:6; 29:11-13) and Jonah (3:4).  In the New Testament, the quarentyne was the desert in which Christ fasted and was tempted for 40 days and 40 nights (Matthew 4:2) and there were 40 days between Jesus’ resurrection and ascension (Acts 1:3).  Presumably, this influenced Western medicine because it was long (and still by some) recommended that women should for 40 days rest after childbirth.

Plague, the Venetians and Quarantino

The Plague of Justinian arrived in the Byzantine capital of Constantinople in 541, brought from recently conquered Egypt across the Mediterranean by plague-ridden fleas in the fur of rats on ships bringing loot from the war.  From the imperial capital it spread across Europe, Asia, North Africa and Arabia, killing an estimated thirty to fifty million, perhaps a quarter of the inhabitants of the eastern Mediterranean.  Plague never really went away, localized outbreaks happening periodically until it returned as a pandemic some eight-hundred years later; the Black Death, which hit Europe in 1347, claimed perhaps as many as two-hundred million in just four years and demographically, Europe would not for centuries recover from the Black Death.

There was at the time little scientific understanding of contagion but it became clear it was related to proximity so officials in the Venetian-controlled port city of Ragusa (now Dubrovnik in Croatia) resolved to keep newly arrived sailors in isolation until it was apparent they were healthy.  Initially, the sailors were confined to their ships for thirty days, formalized in a 1377 Venetian law as a trentino (thirty days), which radically reduced the transmission rate and by 1448, the Venetians had increased the forced isolation to forty days (quarantine), which, given bubonic plague’s thirty-seven day cycle from infection to death, was an example of a practical scientific experiment.  The word soon entered Middle English as quarantine (already in use in common law as a measure of certain rights accruing to a widow), the origin of the modern word and practice of quarantine.  The English had many opportunities to practice quarentine.  In the three hundred-odd years between 1348 and 1665, London suffered some forty outbreaks, the major epidemics recurring about once a generation (every twenty-odd years), the significance of this pattern something which modern epidemiologists would later understand.  Quarentine laws were introduced in the early sixteenth century and proved effective, reducing the historic medieval death-rates to about twenty percent.

Eggs à la Lohan

In self-imposed quarantine in March 2020, Lindsay Lohan was apparently inspired by a widely shared motivational poem by Kitty O’Meara (on the internet dubbed the "poet laureate of the pandemic") which included the fragment:

And the people stayed home.  And read books, and listened, and rested, and exercised, and made art, and played games, and learned new ways of being, and were still.  And listened more deeply.  Some meditated, some prayed, some danced.  Some met their shadows.  And the people began to think differently.

One of Lindsay Lohan's recommendations for a time of quarantine was to take the time to cook, posting a photograph of Eggs à la Lohan, a tasty looking omelet.  The poem also contained the words:

And the people healed.  And, in the absence of people living in ignorant, dangerous, mindless, and heartless ways, the earth began to heal.  And when the danger passed, and the people joined together again, they grieved their losses, and made new choices, and dreamed new images, and created new ways to live and heal the earth fully, as they had been healed.

Unfortunately, viewed from early 2023, it would seem Ms O'Meara's hopes quarantine might have left us kinder, gentler and more thoughtful may not have been realized.  It may be Mr Putin didn’t read poem and just ate omelet.

Monday, April 25, 2022

Isolation

Isolation (pronounced ahy-suh-ley-shuhn)

(1) An act or instance of isolating; the state of being isolated.

(2) In medicine, the complete separation from others of a person suffering from contagious or infectious disease; quarantine.

(3) In diplomacy, the separation, as a deliberate choice by government, of a nation from other nations by nonparticipation in or withdrawal from international relations and institutions.

(4) In psychoanalysis, a process whereby an idea or memory is divested of its emotional component.

(5) In social psychology, the failure of an individual to maintain contact with others or genuine communication where interaction with others persists.

(6) In linguistics and other fields, to consider matters without regard to context.

(7) In chemistry, obtaining an element from one of its compounds, or a compound from a mixture.

(8) In computing, a database property that determines when and how changes made in one transaction are visible to other concurrent transactions (a brief sketch follows this list).
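
For sense (8), a minimal sketch using Python's standard sqlite3 module (the database file name is an illustrative assumption) shows what the isolation property means in practice: a change made inside one connection's open transaction remains invisible to a second connection until the first commits.

import os
import sqlite3

DB = "isolation_demo.db"  # illustrative file name
if os.path.exists(DB):
    os.remove(DB)  # start clean so the counts below hold

writer = sqlite3.connect(DB)
reader = sqlite3.connect(DB)

writer.execute("CREATE TABLE words (term TEXT)")
writer.commit()

writer.execute("INSERT INTO words VALUES ('quarantine')")  # transaction left open, not committed

# The reader is isolated from the writer's uncommitted change.
print(reader.execute("SELECT COUNT(*) FROM words").fetchall())  # [(0,)]

writer.commit()  # only now does the change become visible to other connections

print(reader.execute("SELECT COUNT(*) FROM words").fetchall())  # [(1,)]

writer.close()
reader.close()
os.remove(DB)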

1830s: A compound word, isolate + -ion.  A modern English borrowing from the French isolé (placed on an island (thus away from other people)).  Isolé was from the Italian isolato, past participle of isolare, the root of which was the Latin insulātus & insulātes (made into an island), from insula (island).  From circa 1740, English at first used the French isolé (rendered as isole) which appeared also as isole'd in the 1750s, isolate the verb emerging in the 1830s; isolated the past participle.  Isolation is now the most familiar form; the suffix –ion is from the Latin -iō (genitive -iōnis), appended to a perfect passive participle to form a noun of action.  Words with similar meanings, often varying by context, include solitude, desolation, confinement, segregation, remoteness, privacy, quarantine, sequestration, aloofness, detachment, withdrawal, exile, aloneness, concealment, retreat, hiding, reclusion, monkhood, and seclusion.

Isolation, Social Phobia and Social Anxiety Disorder

As long ago as 400 BC, Greek physician Hippocrates (circa 460–c370 BC) noted there were people who sought social isolation, describing them as those who "love darkness as life" adding, in a hint at later understandings of mental illness, they tended also to "think every man observes them."  Such folk doubtless pre-dated antiquity, being always part of organized societies but it wasn’t until the late nineteenth century when psychiatry emerged as a distinct field that the particular human condition came to be known as social phobia or social neurosis, then thought of as a descriptor of extremely shy patients who sought isolation by choice.

Desolate: an emo in isolation.

Despite the increasing medicalization of the spectrum of the human condition, it wasn’t until 1968, in the second edition of the American Psychiatric Association's (APA) Diagnostic and Statistical Manual of Mental Disorders (DSM-II), that social phobia was described as a specific fear of social situations or an excessive fear of being observed or scrutinized by others but at this point the definition of social phobia was very narrow.  With the release in 1980 of the DSM-III, social phobia was included as an official psychiatric diagnosis although it restricted the criteria, noting those who sought social isolation did so because of a fear of “performance situations” and did not include fears of less formal encounters such as casual conversations.  Those with such broad fears were instead to be diagnosed with “avoidant personality disorder” which, for technical reasons defined within the DSM-III, could not be co-diagnosed with social phobia, an attitude reflecting the editors’ view that phobias and neuroses needed specifically to be codified rather than acknowledging there existed in some a “general anxiety” disorder.  This neglect was addressed in the 1987 revision to the DSM-III (DSM-III-R) which changed the diagnostic criteria, making it possible to diagnose social phobia and avoidant personality disorder in the same patient.  In this revision, the term "generalized social phobia" was introduced.  DSM-IV was published in 1994 and the term “social anxiety disorder” (SAD) replaced social phobia, this reflecting how broad and generalized fears are in the condition although the diagnostic criteria differed only slightly from those in the DSM-III-R.  The DSM-IV position remains essentially current; the modifications in the DSM-5 (2013) not substantively changing the diagnosis, altering little more than the wording of the time frame although the emphasis on recognizing whether the experience of anxiety is unreasonable or excessive was shifted from patient to clinician.

For some, COVID-19 isolation was a business opportunity.

Generalized anxiety disorder (GAD) and panic disorder (PD) were formalized when DSM-III was released in 1980 although among clinicians, GAD had for some years been a noted thread in the literature but what was done in DSM-III was to map GAD onto the usual pattern of diagnostic criteria.  In practice, because of the high degree of co-morbidity with other disorders, the utility of GAD as defined was soon a regular topic of discussion at conferences and the DSM’s editors responded, the parameters of GAD refined in subsequent releases between 1987 and 1994 when GAD’s diagnostic criteria emerged in recognizably modern form.  By the time the terminology for mental disorders began in the nineteenth century to be codified, the word anxiety had for hundreds of years been used in English to describe feelings of disquiet or apprehension and in the seventeenth century there was even a school of thought it was a pathological condition.  It was thus unsurprising that “anxiety” was so often an element in psychiatry’s early diagnostic descriptors such as “pantophobia” and “anxiety neurosis”, terms which designated paroxysmal manifestations (panic attacks) as well as “interparoxysmal phenomenology” (the apprehensive mental state).  The notion of “generalized anxiety”, although not then in itself a diagnosis, was also one of the symptoms of many conditions including the vaguely defined neurasthenia which was probably understood by many clinicians as something similar to what would later be formalized as GAD.  As a distinct diagnostic category however, it wasn’t until the DSM-III was released in 1980 that GAD appeared, anxiety neurosis split into (1) panic disorder and (2) GAD.  When the change was made, the editors noted it was a response to comments from clinicians, something emphasised when DSM-III was in 1987 revised (DSM-III-R), in effect to acknowledge there was a class of patient naturally anxious (who might once have been called neurotic or pantophobic) quite distinct from those for whom a source of anxiety could be deduced.  Thus, the cognitive aspect of anxiety became the critical criterion but within the profession, some scepticism about the validity of GAD as a distinct diagnostic category emerged, the most common concern being the difficulty in determining clear boundaries between GAD, other anxiety-spectrum disorders and certain manifestations of depression.

The modern label aside, GAD has a really long lineage and elements of the diagnosis found in case histories written by doctors over the centuries would have seemed familiar to those working in the early nineteenth century, tales of concern or apprehension about the vicissitudes of life a common thing.  As psychiatry in those years began to coalesce as a speciality and papers increasingly published, it was clear the behaviour of those suffering chronic anxiety could culminate in paroxysmal attacks, thus it was that GAD and panic attacks came to be so associated.  In English, the term panophobia (sometimes as pantaphobia, pantophobia or panphobia) dates from 1871, the word from the Late Latin pantŏphŏbŏs, from the Ancient Greek παντοφόβος (all-fearing (literally “anxiety about everything”)).  It appears in the surviving works of medieval physicians and it seems clear there were plenty of “pantophobic patients” who allegedly were afraid of everything and it was not a product of the Dark Ages, Aristotle (384-322 BC) in the seventh book of his Nicomachean Ethics (350 BC) writing there were men “…by nature apt to fear everything, even the squeak of a mouse”.

The first edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM-I (1952)) comprised what seems now a modest 130 pages.  The latest edition (DSM-5-TR (2022)) has 991 pages.  The growth is said to be the result of advances in science and a measure of the increasing comprehensiveness of the manual, not an indication that madness in the Western world is increasing.  The editors of the DSM would never use the word "madness" but for non-clinicians it's a handy term which can be applied to those beyond some point on the spectrum of instability.

Between Aristotle and the publication of the first edition of the DSM in 1952, physicians (and others) pondered, treated and discussed the nature of anxiety and theories of its origin and recommendations for treatment came and went.  The DSM (retrospectively labelled DSM-I) was by later standards a remarkably slim document but unsurprisingly, anxiety was included and discussed in the chapter called “Psychoneurotic Disorders”, the orthodoxy of the time that anxiety was a kind of trigger perceived by the conscious part of the personality and produced by a threat from within; how the patient handled this determined their reaction(s).  There was in the profession a structural determinism to this approach, the concept of defined “reaction patterns” at the time one of the benchmarks in US psychiatry.  When DSM-II was released in 1968, the category “anxiety reaction” was diagnosed when the anxiety was diffuse and neither restricted to specific situations or objects (ie the phobic reactions) nor controlled by any specific psychological defense mechanism as was the case in dissociative, conversion or obsessive-compulsive reactions.  Anxiety reaction was characterized by anxious expectation and differentiated from normal apprehensiveness or fear.  Significantly, in DSM-II the reactions were re-named as “neuroses” and it was held anxiety was the chief characteristic of “neuroses”, something which could be felt or controlled unconsciously by various symptoms.  This had the effect that the diagnostic category “anxiety neurosis” encompassed what would later be expressed as panic attacks and GAD.

The diagnostic criteria for GAD, as framed in the DSM, include:

A: Excessive anxiety and worry (apprehensive expectation), occurring more days than not for at least 6 months, about a number of events or activities (such as work or matters relating to educational institutions).

B: The patient finds it difficult to control the worry.

C: The anxiety and worry are associated with three (or more) of the following six symptoms:

(1) Restlessness or feeling keyed up or on edge.

(2) Being easily fatigued.

(3) Difficulty concentrating or mind going blank.

(4) Irritability.

(5) Muscle tension.

(6) Sleep disturbance (difficulty falling or staying asleep, or restless, unsatisfying sleep).

The key change really was that the criteria for GAD required fewer symptoms.  Whereas with the DSM-IV-TR (2000) individuals needed to exhibit at least three physical and three cognitive symptoms for a diagnosis of GAD, under DSM-5 (2013), only one of each was required so not only was the accuracy and consistency of diagnosis (by definition) improved, the obvious practical effect was better to differentiate GAD from other anxiety disorders and (importantly) the usual worries and concerns endemic to the human condition.  The final significant aspect of the evolution was that by the time of DSM-5, GAD had become effectively an exclusionary diagnosis in that it cannot be diagnosed if the anxiety is better explained by other anxiety disorders, nor can GAD be caused directly by stressors or trauma.