Thursday, August 17, 2023

Stunt

Stunt (pronounced stuhnt)

(1) To stop, slow down, or hinder the growth or development of; to dwarf; to arrest the development of.

(2) In botanical pathology, a disease of plants, characterized by a dwarfing or stunting of the plant.

(3) A performance displaying a person's skill or dexterity, as in athletics; feat.

(4) Any remarkable feat performed chiefly to attract attention.

1575-1585: From the dialectal stunt (stubborn, dwarfed), from the Middle English stont & stunt (short, brief), from the Old English stunt (stupid, foolish, simple (as in stuntspræc "foolish talk")), from the Proto-Germanic stuntaz (short, compact, stupid, dull).  It was cognate with the Middle High German stunz (short, blunt, stumpy) from the Proto-Germanic stuntaz (short, truncated), and the Old Norse stuttr (short in stature, dwarfed).  It was related also to the Old English styntan (to make dull, stupefy, become dull, repress).

The origin of the noun use of stunt is obscure although all agree it’s of US origin circa 1878 and some sources suggest it was originally college sports slang, though without evidence of youthful coinage.  Links have been suggested to the Middle Low German stunt (a shoulder grip with which you throw someone on their back), a variant of the colloquial stump (dare, challenge) (1871), the German Stunde (literally "hour") and the Middle English stunt (foolish; stupid) but no documentary evidence exists.  The noun in this sense certainly caught on, applied particularly to aerobatic displays by aircraft, and gained a new life when YouTube and its imitators provided a platform.  Stunt historically was a verb and the familiar noun a later form; the earlier noun was stuntedness, the adverb is stuntingly and the adjectives are stunty & stunted.

Lindsay Lohan with body double in Irish Wish (left) and in Falling for Christmas (centre) in which for the skiing scenes she used a stunt double (right).

In film & television production, the terms "stunt double" & "body double" are sometimes used interchangeably but by convention they describe different roles.  The classic stunt double is engaged to perform those parts of the script which call upon an actor to do something especially physically demanding which typically requires special skills and may involve some risk; there has been an injury toll among stunt doubles and deaths are not unknown.  The term body double is usually used of those engaged (1) to appear in scenes in which an actor wishes not to appear (such as those involving nudity) or (2) to permit something to be filmed which would otherwise defy the laws of nature (such as an actor having a conversation with themselves).  Advances in technology mean the laws of nature now are little obstacle to the impossible being depicted but many actors still have "no-nudity" clauses in contracts, although the profession is now much concerned the combination of digital editing and artificial intelligence (AI) will soon render all this obsolete.  Actually, at the technical level, flesh & blood actors might soon be (or already are) obsolete but their hope is audiences will continue to demand real people playing the parts.  Time will tell.  In her recent Netflix projects, Lindsay Lohan used a body double in Irish Wish (now slated for release in early 2024) but in Falling for Christmas (2022) needed a stunt double for the skiing scenes, the role taken by Rian Zetzer (b 1996), a Salt Lake City-based former competitive mogul skier and sponsored free-skier.

The Cunning Stunts (1977-1982)

Feminist theatre, although with identifiable roots in the Weimar Republic (Germany: 1918-1933), came to be recognized, theorized, and practiced during the 1970s in the wake of second-wave feminism.  Although it encompassed diverse theatrical work, it’s always been most associated with the overtly political, a movement motivated by the recognition of and resistance to women’s marginalization within social and cultural systems that reinforce male privilege and dominance.  In this it acted out a resistance to mainstream, male-dominated theatre culture and revived long-neglected works and performances by women from the dramatic texts of Hrotsvitha (circa 935–973), plays by Restoration playwrights such as Aphra Behn (1640–1689), Mary Pix (1666–1709) & Susanna Centlivre (circa 1669-1723) and dramas by the Edwardian activists most interested in suffrage, Elizabeth Baker (1876–1962), Cicely Mary Hamilton (1872–1952), Elizabeth Robins (1862–1952), & Katherine Githa Sowerby (1876–1970).

What emerged from the second wave came largely to be defined by three types of feminism: bourgeois/liberal, radical/cultural & socialist/materialist.  Critics treated the three in a hierarchical construct of respectability, bourgeois/liberal feminism treated as politically the weakest given it neither endorsed radical feminism’s desire to overthrow patriarchy in favor of women’s social, cultural and sexual empowerment, nor advocated the radical transformation of society’s economic, political and social structures as socialist/materialist feminism did.  Each dynamic had its aesthetic counterpart: bourgeois/liberal feminism remained attached to conventional realistic forms, but sought a greater role for women within the confines of traditional dramatic writing; radical/cultural feminism, heavily influenced by French theorists, explored a women’s language; socialist/materialist feminism found its aesthetic in the Brechtian legacies of presentational forms, techniques and performance registers.

In this milieu, the debut in London in 1977 of the feminist performance collective Cunning Stunts was unexpected.  Neither overtly nor even identifiably political, they were something of a reaction to feminist theatre itself, the members noting feminist “alternative theatre” had become elitist and they wanted a more accessible and spontaneous performer’s platform rather than a writer’s or director’s theatre, one which not only displayed the absurdity of male behavior but presented women being funny, flouting the prevailing glamorous image of women as entertainers.  The shows were musical, visual, highly energetic and existed mostly to offer fun rather than any political or cultural critique although later productions, such as Opera, said to use their “…versions of archetypal symbols and mythological characters drawn from astrology, matriarchal societies… to express the experiences of living as wimin (sic) in a male strangulated world” did suggest other agendas remained of interest.

Suffering the internal conflicts perhaps endemic to collectives, the Cunning Stunts dissolved in 1982, having seemingly worked their concept dry.  In the UK, much alternative theatre didn’t survive the 1980s, the Thatcher government dismantling many of the often left-wing local authorities which had provided a substantial proportion of the funding.

Wednesday, August 16, 2023

Squiffy

Squiffy (pronounced skwiff-e)

An informal term describing someone somewhere on the spectrum of drunkenness, now used mostly of mild yet obvious intoxication.

Late 1800s: Based on the surname Asquith and coined because of the habit of Herbert Henry Asquith (1852–1928; UK prime-minister 1908-1916) of appearing in the House of Commons visibly affected by alcohol.  From this he earned (and richly deserved) the sobriquet “Squiffy".  Squiffy, squiffier, squiffiest & squiffed are adjectives, squiffiness & Squiffite are nouns and squiffily is an adverb; historically, the most common noun plural was Squiffites.  The concocted noun squiffinessness is wholly jocular and sometimes appropriate. 

HH Asquith was brought up in a provincial household in the puritan tradition where alcohol was rarely served but, after a second marriage in which he took a socialite wife and began to move in the circles of London’s glittering society, his fondness grew for fine wines and spirits.  These tastes he took with him when he entered parliament in 1885 and his assumption of the premiership two decades later did little to diminish his thirst.

Henry & Venetia.

Nor did it seem to affect his vitality.  In his early sixties he became quite besotted with Venetia Stanley (1887-1948), the twenty-five year old best friend of his daughter, and between 1912-1915 he would spend much time in cabinet writing her love letters.  One would have thought a British prime-minister might have much else on his mind during these years, but “old squiffy” seemed to fit it all in, Andrew Bonar Law (1858–1923; UK prime-minister 1922-1923) admitting at the time that Asquith “when drunk could make a better speech than any of us sober”.  Sometimes though, even for him, it proved too much.  After one very long lunch left him more than usually squiffy, he fell asleep in the house, unable to be roused to speak in support of his bill dealing with the Church in Wales, leaving its carriage to the postmaster-general Herbert Samuel (1870–1963) and attorney-general Sir Rufus Isaacs (1860–1935).  It prompted Arthur Balfour (1848–1930; UK prime-minister 1902-1905) to assure the good people of the Welsh Church that all would be well because the matters were in the hands of “…one drunken Christian and two sober Jews.”  The laws of the nation have since sometimes been in less capable hands. 

Anti-squiffiness device: Lindsay Lohan wearing one of AMS's (Alcohol Monitoring Systems) SCRAMs (Secure Continuous Remote Alcohol Monitoring).

He led the government for eight-odd years, his first cabinet in 1908 probably the most lustrous of the century and his fall from office probably little to do with alcohol, his character simply not suited to lead a government during wartime.  In subsequent years, he retained a following that became a faction of the Liberal Party and which would be a notable factor in British politics; they were called the Squiffites, a formation easier on the tongue than Asquithite.  English has a rich vocabulary of synonyms for drunk including buzzed, inebriated, laced, lit, maggoted, muddled, pissed, plastered, potted, sloshed, shit-faced, squiffy, stewed, tanked, tipsy, totaled, wasted, boozed, groggy, juiced, liquored, tight, under the influence & under-the-table; not all are used in every country and some overlap with descriptions of the effects of other drugs but it’s an impressively long list.  One interesting aspect of the use of squiffy is that it tends to be used with a modifier: the practice being to say “a bit squiffy” or “a little bit squiffy” and it seems now more applied to women.

There may on 4 August 2021 have been some sort of equipment malfunction somewhere in the apparatus used to record and broadcast parliamentary questions from the Australian House of Representatives because many viewers concluded the deputy prime-minister was a bit squiffy.  Question time is held at 2pm (just after lunch).  One constituent wrote to the speaker’s office to enquire and received an assurance from a staff member that it’s not possible for a member to appear in the house while squiffy.  The staff member’s prompt response was helpful.

The Hon Barnaby Joyce MP (b 1967; thrice deputy prime-minister of Australia, 2016-date (the gaps due to "local difficulties")), House of Representatives, Canberra, Australia, 4 August 2021.  For observers of Mr Joyce who may be searching for the right word, when one is obviously affected by squiffiness, one may be said to be squiffed or squiffy; the comparative being squiffier and the superlative squiffiest.

Tuesday, August 15, 2023

Acnestis

Acnestis (pronounced ak-nees-tis)

(1) In zoology, the area of skin on the part of the back between the shoulder blades and the loins which an animal cannot reach to scratch.

(2) By extension, in humans, much the same thing.

1700s: From the Late Latin acnestis, from the Koine Greek ἄκνηστις (áknēstis) (spine), from κνῆστις (knêstis) (spine, cheese-grater).  There are theories it may have been a construct of ἀ- (a-) + κναίειν (knaíein) (grate, scrape, scratch) (only attested in compounds) or from an incorrect segmentation of κατὰ κνῆστιν (katà knêstin) (on the spine) (based on translations of Homer’s Odyssey), ultimately from the primitive Indo-European kneh-.  Acnestis is a noun; the noun plural is acnestises.

#Freckles: The acnestis area on Lindsay Lohan’s back.

Based on the earliest known texts in which the word appears, it was defined as: “That part of the spine of the back, which reaches from the metaphrenon, [then used to describe the areas between the shoulder blades] and the loins” and use was limited only to “those quadrupeds unable to reach it to scratch”.  The word has been used figuratively of political problems which are persistent & troubling yet we seem to lack the means to solve; they remain thus intractable: “In what has to be the longest post-election season in living memory, the last five months have felt like an acnestis upon our collective soul; like that little patch of skin on our backs that we just can't reach to scratch ourselves.  It's irritating.  It's annoying.  It's left us reaching and spinning around in circles.”  (A Wish List to Soothe Our Collective Itch, New Straits Times, Kuala Lumpur, Malaysia, 5 August 2008).

It’s a linguistic curiosity that few dictionaries bother to list acnestis yet “back scratchers” seem to have been part of the domestic inventory of humans for about as long as the reliable archaeological record extends.  The conscientious Oxford English Dictionary (OED) includes an entry while noting it is “rare in genuine use” and that’s presumably always been the case even among zoologists.  It’s another of those words which has gained a (sort-of) niche in the internet age as lists of strange, obscure or unusual words have proliferated.  However, if acnestis never became a fashionable word, the ongoing popularity of back-scratchers (whether designed for the purpose or improvised from whatever fell to hand) confirms the condition remains endemic and in one episode of the television cartoon series The Simpsons, Mr Burns (evil owner of the nuclear power-plant) lamented that because of an act of embezzlement by Homer Simpson (who needed the money for a proprietary baldness cure), he couldn’t afford to buy “the ivory back-scratcher” he desired.

A back-scratcher of nielloed steel and silver with gold inlay, dating from circa 1601-1625, from the Mughal dynasty which ruled the Mughal Empire (circa 1526-1857).

It’s thought to have been crafted in Bidar, India using a method called “bidri”, a metal-working technique unique to India in which objects were fabricated from an alloy (95% zinc; 5% copper), colored a rich matt black and inlaid with silver.  The name Bidri is from the Deccani city of Bidar where the process is thought to have originated.  The back-scratcher has jeweled mounts while the be-ringed hand and the Makara (from the Sanskrit मकर and Romanized as Makara, a legendary sea-creature in Hindu mythology which, in Hindu astrology, is equivalent to the Zodiac sign Capricorn) head unscrews to reveal sharp blades.  During the nineteenth century, it was exhibited at the Society of Antiquaries and at the Archaeological Society and the unusual nature of the design induced lively debates about its function.  There was speculation it may have been a pointer used with holy manuscripts but most have concluded it was a back-scratcher for some rich or eminent person.

It’s now on permanent exhibition at the British Museum and the institution provides curatorial notes: “The object was dis-assembled (each terminal unscrewed to reveal a short flat pointed tool (dragon head terminal) and a longer chamfered blade (hand terminal)).  The steel tool and blade were cleaned with acetone.  The object overall was cleaned using cotton wool swabs dipped in Silvo.  Cotton wool swabs dampened in White Spirit and then acetone were used to remove any traces of Silvo and to complete cleaning and degreasing of the surface.  Some areas of firestain remain”.  The materials used in the periodic conservation cleaning are thus Silvo (a silver polish), acetone (propan-2-one; dimethyl ketone) and white spirit (a petroleum distillate).

Bear solves acnestis issue.

Monday, August 14, 2023

Obsolete & Obsolescent

Obsolete (pronounced ob-suh-leet)

(1) No longer in general use; fallen into disuse; that is no longer practiced or used, out of date, gone out of use, of a discarded type; outmoded.

(2) Of a linguistic form, no longer in use, especially if out of use for at least the past century.

(3) Effaced by wearing down or away (rare).

(4) In biology, imperfectly developed or rudimentary in comparison with the corresponding character in other individuals, as of a different sex or of a related species; of parts or organs, vestigial; rudimentary.

(5) To make obsolete by replacing with something newer or better; to antiquate (rare).

1570–1580: From the Latin obsolētus (grown old; worn out), past participle of obsolēscere (to fall into disuse, be forgotten about, become tarnished), the construct assumed to be ob- (opposite to) (from the Latin ob- (facing), a combining prefix found in verbs of Latin origin) + sol(ēre) (to be used to; to be accustomed to) + -ēscere (–esce) (the inchoative suffix, a form of -ēscō (I become), used to form verbs from nouns following the pattern of verbs derived from Latin verbs ending in –ēscō).  Obsoletely is an adverb, obsoleteness is a noun and the verbs (used with object) are obsoleted & obsoleting.  Although it does exist, except when it’s essential to convey a technical distinction, the noun obsoleteness is hardly ever used, obsolescence standing as the noun form for both obsolete and obsolescent.  The verb obsolesce (fall into disuse, grow obsolete) dates from 1801 and is as rare now as it was then.

Although not always exactly synonymous, in general use, archaic and obsolete are often used interchangeably.  However, dictionaries maintain a distinction: words (and meanings) not in widespread use since English began to assume its recognizably modern form in the mid-1700s, are labeled “obsolete”.  Words and meanings which, while from Modern English, have long fallen from use are labeled “archaic” and those now seen only very infrequently (and then in often in specialized, technical applications), are labeled “rare”.

Obsolescent (pronounced ob-suh-les-uhnt)

(1) Becoming obsolete; passing out of use (as a word or meaning).

(2) Becoming outdated or outmoded, as applied to machinery, weapons systems, electronics, legislation etc.

(3) In biology, gradually disappearing or imperfectly developed, as vestigial organs.

1745–1755: From the Latin obsolēscentum, from obsolēscēns, present participle of obsolēscere (to fall into disuse).  Obsolescently is an adverb and obsolescence is a noun.  Because things that are obsolescent are becoming obsolete, the sometimes heard phrase “becoming obsolescent” is redundant.  The sense "state or process of gradually falling into disuse; becoming obsolete" entered general use in 1809 and although most associated with critiques by certain economists in the 1950s, the phrase “planned obsolescence” was coined in 1932, the 1950s use a revival.

Things that are obsolete are those no longer in general use because (1) they have been replaced or (2) the activity for which they were designed is no longer undertaken.  Things considered obsolescent are still to some extent in use but, for whatever combination of reasons, are fading from general use and tending towards becoming obsolete.  For example, the Windows XP operating system (released in 2001) is not obsolete because some still use it, but it is obsolescent because, presumably, it will in the years ahead fall from use.
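The obsolete/obsolescent distinction has a direct analogue in software deprecation: a function marked deprecated is obsolescent (it still works, but is on the way out), while one removed entirely is obsolete.  A minimal Python sketch using the standard warnings module (the function names old_api and new_api are invented for illustration):

```python
import warnings

def new_api():
    """The current replacement; the obsolescent call forwards here."""
    return "result"

def old_api():
    """Obsolescent: still functional, but flagged for eventual removal."""
    warnings.warn(
        "old_api() is deprecated; use new_api() instead",
        DeprecationWarning,
        stacklevel=2,
    )
    return new_api()

# The obsolescent call still works...
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    value = old_api()

# ...but it announces its coming obsolescence.
assert value == "result"
assert issubclass(caught[0].category, DeprecationWarning)
```

Only when a later release deletes old_api() altogether does the obsolescent become obsolete: callers then fail outright rather than being warned.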

Ex-Royal Air Force (RAF) Hawker Hunter in Air Force of Zimbabwe (AFZ) livery; between 1963-2002 twenty-six Hunters were at different times operated by the AFZ.  Declared obsolete as an interceptor by the RAF in 1963, some Hunters were re-deployed to tactical reconnaissance, ground-attack and close air support roles before being retired from front-line service in 1970.  Some were retained as trainers while many were sold to foreign air forces including India, Pakistan and Rhodesia (Zimbabwe since 1980).

Despite the apparent simplicity of the definition, in use, obsolescent is highly nuanced and much influenced by context.  It’s long been a favorite word in senior military circles; although notorious hoarders, generals and admirals are usually anxious to label equipment as obsolescent if there’s a whiff of hope the money might be forthcoming to replace it with something new.  One often unexplored aspect of the international arms trade is that of used equipment, often declared obsolescent by the military in one state and purchased by that of another, a transaction often useful to both parties.  The threat profile against which a military prepares varies between nations and equipment which genuinely has been rendered obsolescent for one country may be a valuable addition to the matériel of others and go on to enjoy an operational life of decades.  Well into the twenty-first century, WWII & Cold War-era aircraft, warships, tanks and other weapon-systems declared obsolescent and on-sold (and in some cases given as foreign aid or specific military support) by big-budget militaries remain a prominent part of the inventories of many smaller nations.  That’s one context; another hinges on the specific tasking of matériel: an aircraft declared obsolescent as a bomber could long fulfil a valuable role as a transport or a tug.

In software, obsolescence is so vague a concept the conventional definition really isn’t helpful.  Many software users suffer severe cases of versionitis (a syndrome in which they suffer a sometimes visceral reaction to using anything but the latest version of something) so obsolescence to them seems an almost constant curse.  The condition tends gradually to diminish in severity and in many cases the symptoms actually invert: after sufficient ghastly experiences with new versions, versionitis begins instead to manifest as a morbid fear of ever upgrading anything.  Around the planet, obsolescent and obsolete software has for decades proliferated and there’s little doubt this will continue, the Y2K bug (which prompted much rectification work on the ancient code riddling the world of the mainframes and other places) unlikely to be the last great panic; another is said to be next due in 2029.  The manufacturers too have layers to their declarations of the obsolete.  In 2001, Microsoft advised all legacy versions of MS-DOS (the brutish and now forty-year-old file-loader) were obsolete but, with a change of release number, still offers what's functionally the same MS-DOS for anyone needing a small operating system with minimal demands on memory size & CPU specification, mostly those who use embedded controllers, a real attraction being the ability easily to address just about any compatible hardware, a convenience more modern OSs have long restricted.  DOS does still have attractions for many, the long-ago derided 640 kb actually a generous memory space for many of the internal processes of machines and it's an operating system with no known bugs.  
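The two-digit-year problem the Y2K remediators chased is easy to show in a few lines.  Many fixes did not widen the year field but instead interpreted two-digit years through a "pivot window", which merely postpones the ambiguity to the pivot year; a minimal sketch (the pivot value of 30 here is illustrative, not that of any particular system):

```python
PIVOT = 30  # illustrative: two-digit years 00-29 read as 20xx, 30-99 as 19xx

def expand_year(yy: int) -> int:
    """Expand a two-digit year using a sliding 'pivot window' fix."""
    return (2000 if yy < PIVOT else 1900) + yy

# The window resolves the original Y2K ambiguity...
assert expand_year(5) == 2005
assert expand_year(85) == 1985
# ...but only defers the bug: a date meant as 2030 is misread as 1930.
assert expand_year(30) == 1930
```

The design choice is a classic planned-obsolescence trade-off in miniature: the windowed fix was cheap and quick in 1999, at the cost of guaranteeing the code becomes defective again when the calendar reaches the pivot.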

XTree’s original default color scheme; things were different in the 1980s.

Also, obsolescent, obsolete or not, sometimes the old ways are the best.  In 1985, Underware Systems (later the now defunct Executive Systems (EIS)) released a product called XTree, the first commercially available software which provided users with a visual depiction of the file system, arranged using a root-branch tree metaphor.  Within that display, it was possible to do most file-handling such as copying, moving, re-naming, deleting and so on.  Version 1.0 was issued as a single, 35 kb executable file, supplied usually on a 5.25" floppy diskette and although it didn’t do anything which couldn’t (eventually) be achieved using just DOS, XTree made it easy and fast; reviewers, never the most easily impressed bunch, were effusive in their praise.  Millions agreed and bought the product, which went through a number of upgrades until by 1993 XTreeGold 3.0 had grown to a feature-packed three megabytes but, and it was a crucial part of the charm, the user interface didn’t change and anyone migrating from v1 to v3 could carry on as before, using or ignoring the new functions as they chose.
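The root-branch tree metaphor XTree pioneered is simple to sketch; a minimal recursive renderer over a nested-dictionary directory structure (the directory names below are invented) might look like:

```python
def render_tree(name, children, depth=0):
    """Return XTree-style indented lines for a nested dict of directories."""
    # Root gets a trailing backslash (DOS style); branches get a +--- stem.
    line = name + "\\" if depth == 0 else ("  " * (depth - 1)) + "+---" + name
    lines = [line]
    for child, grandchildren in sorted(children.items()):
        lines.extend(render_tree(child, grandchildren, depth + 1))
    return lines

# A small invented directory structure, rendered root-first:
for line in render_tree("C:", {"DOS": {}, "UTILS": {"XTREE": {}}}):
    print(line)
```

The real program of course read the directory structure from disk and layered file listings and keyboard commands on top, but the display at its heart is just this depth-first walk.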

However, with the release in 1990 of Microsoft’s Windows 3.0, the universe shifted and while it was still an unstable environment, it was obvious things would improve and EIS, now called the XTree Company, devoted huge resources to producing a Windows version of their eponymous product, making the crucial decision that when adopting the Windows-style graphical user interface (GUI), the XTree keyboard shortcuts would be abandoned.  This meant the user interface was something that looked not greatly different to the Windows in-built file manager and bore no resemblance to the even then quirky but marvelously lucid one which had served so well.  XTree for Windows was a critical and financial disaster and in 1993 the company was sold to rival Central Point Software, themselves soon to have their own problems, swallowed a year later by Symantec which, in a series of strategic acquisitions, soon assumed an almost hegemonic control of the market for Windows utilities.  Elements of XTree were interpolated into other Symantec products but as a separate line, it was allowed to die.  In 1998, Symantec officially deleted the product but the announcement was barely noted by the millions of users who continued to use the text-based XTree which ran happily under newer versions of Windows although, being a real-mode program and thus living in a small memory space, as disks grew and file counts rose, walls were sometimes hit, some work-arounds possible but kludgy.  The attraction of the unique XTree was however undiminished and an independent developer built ZTree, using the classic interface but coded to run on both IBM’s OS/2 and the later flavors of Windows.  Without the constraints of the old real-mode memory architecture, ZTree could handle long file and directory names, megalomaniacs now able to log an unlimited number of disks and files, all while using the same, lightning-fast interface.  The idea spread to UNIX where ytree, XTC, linuXtree and (most notably) UnixTree were made available.

ZTree, for those who can remember how things used to be done.

ZTree remains a brute-force favorite for many techs.  Most don’t often need to do those tasks at which it excels but, when those big-scale needs arise, as a file handler, ZTree still can do what nothing else can.  It’ll also do what’s now small-scale stuff; anyone still running XTree 1.0 under MS-DOS 2.11 on their 8088 could walk to some multi-core 64-bit monster with 64 GB RAM running Windows 11 and happily use ZTree.  ZTree is one of the industry’s longest-running user interfaces.

The Centennial Light, Livermore-Pleasanton Fire Department, Livermore, California.  Illuminated almost continuously since 1901, it’s said to be the world's longest-lasting light bulb.  The light bulb business became associated with the idea of planned obsolescence after the revelation of the existence of a cartel of manufacturers which had conspired to more than halve the service life of bulbs in order to stimulate sales.

As early as 1924, executives in US industry had been discussing the idea of integrating planned obsolescence into their systems of production and distribution although it was then referred to with other phrases.  The idea essentially was that in the industrial age, modern mercantile capitalism was so efficient in its ability to produce goods that it would tend to over-produce, beyond the ability to stimulate demand.  The result would be a glut, a collapse in prices and a recession or depression which affected the whole society, a contributing factor to what even then was known as the boom & bust economy.  One approach was that of the planned economy whereby government would regulate production and maintain employment and wages at the levels required to maintain some degree of equilibrium between supply and demand but such socialistic notions were anathematic to industrialists.  Their preference was to reduce the lifespan of goods to the point which matched the productive capacity and product-cycles of industry, thereby ensuring a constant churn.  Then, as now, there were those for and against, the salesmen delighted, the engineers appalled.

The actual phrase seems first to have been used in the pamphlet Ending the Depression Through Planned Obsolescence, published in 1932 by US real estate broker (and confessed Freemason) Bernard London (b circa 1873) but it wasn’t popularized until the 1950s.  Then, it began as a casual description of the techniques used in advertising to stimulate demand and thus without the negative connotations which would attach when it became part of the critique of materialism, consumerism and the consequential environmental destruction.  There had been earlier ideas about the need for a hyper-consumptive culture to service a system designed inherently to increase production and thus create endless economic growth: one post-war industrialist noted the way to “avoid gluts was to create a nation of gluttons” and exporting this model underlies the early US enthusiasm for globalism.  As some of the implications of that became apparent, globalization clearly not the Americanization promised, enthusiasm became more restrained.

Betamax and VHS: from dominant to obsolescent to obsolete; the DVD may follow.

Although the trend began in the United States in the late 1950s, it was in the 1970s that the churn rate in consumer electronics began to accelerate, something accounted for partly by the reducing costs as mass-production in the Far East ramped up but also the increasing rapidity with which technologies came and went.  The classic example of the era was the so-called videotape format war which began in the mid 1970s after the Betamax (usually clipped to Beta) and Video Home System (VHS) formats were introduced within a year of each other.  Both were systems by which analog recordings of video and audio content could be distributed on magnetic tape loaded into players in a cassette (the players, regardless of format, soon known universally as video cassette recorders (VCRs)).  The nerds soon pronounced Betamax the superior format because of the superior quality of its playback and commercial operators agreed, the format quickly adopted as the default standard in television studios.  Consumers however came to prefer VHS because, on most of the screens on which most played their tapes, the difference between the two was marginal, while the VHS format permitted longer recording times (an important thing in the era) and the hardware was soon available at sometimes half the cost of Betamax units.

It was essentially the same story which unfolded a generation later in the bus and operating systems wars; the early advantages of OS/2 over Windows and Micro Channel Architecture (MCA) over ISA/EISA were both real and understood but few were prepared to pay the steep additional cost for advantages which seemed so slight and which at the same time brought problems of their own.  Quite when Betamax became obsolescent varied between markets but except for a handful of specialists, by the late 1980s it was obsolete and the flow of new content had almost evaporated.  VHS prevailed but its dominance was short-lived, the Digital Versatile Disc (DVD), released in 1997, within half a decade the preferred format throughout the Western world although in some other markets, the thriving secondary market suggests even today the use of VCRs is not uncommon.  DVD sales though peaked in 2006 and have since dropped by some 80%, their market-share cannibalized not by the newer Blu-ray format (which never achieved critical mass) but by the various methods (downloads & streaming) which meant many users were able wholly to abandon removable media.  Despite that, the industry seems still to think the DVD has a niche and it may for some time resist obsolescence because demand still exists for content on a physical object at a level it remains profitable to service.  Opinions differ about the long-term.  History suggests that as the “DVD generation” dies off, the format will fade away as those used to entirely weightless content available any time, in any place won’t want the hassle but, as the unexpected revival of vinyl records as a lucrative niche proved, obsolete technology can have its own charm, which is why a small industry now exists to retro-fit manual gearboxes into modern Ferraris, replacing technically superior automatic transmissions.

Sunday, August 13, 2023

Idiot

Idiot (pronounced id-ee-uht)

(1) In informal use (1) a foolish or senseless person (derogatory) or (2) an affectionate expression of disapprobation or disagreement.

(2) In medicine & psychology, a person of the lowest order in the former classification of intellectual disability (a mental age of less than three years old and an IQ (intelligence quotient) lower than 25); no longer in technical use; considered now generally offensive unless used affectionately.

1250–1300: From the Middle English idiote & ydiote, from the twelfth century Old French idiote (later idiot) (uneducated or ignorant person), from the Late Latin idiōta (an ignorant person), from the Ancient Greek ἰδιώτης (idiōtēs) (private person, layman, person lacking skill or expertise; an ignoramus (as opposed to a writer, soldier or skilled workman)), the construct being idiō- (a lengthened variant of idio-, perhaps by analogy with stratiōtēs (professional soldier) derived from stratiá (army)) + -tēs (the agent noun suffix).  The Ancient Greek ἴδιος (ídios) meant "one's own, pertaining to oneself, private" and was a doublet of idiota.  Dialectal variations in English and Irish included eejit, idjit & idget.  The plural is idiots.  English offers a rich array of alternatives to idiot: fool, moron, nitwit, twit, blockhead, bonehead, cretin, dimwit, dork, dumbbell, dunce, ignoramus, imbecile, muttonhead, nincompoop, ninny, pinhead, simpleton, clodpoll, jerk, half-wit, dolt & numskull.

Use of the word "idiot" in headlines can hurt feelings.

The original meaning was “a person so mentally deficient as to be incapable of ordinary reasoning” but in Middle English this later, in the fourteenth century, extended to “a simple man, uneducated person, layman”.  A meaning shift had also happened in Latin, the classical form meaning “an ordinary person, layman; outsider” whereas in the Late Latin it conveyed the sense of “an uneducated or ignorant person”.  This mirrored what happened with the Greek idiotes which meant literally “a private person” (ie a layman, someone uninvolved in public affairs) but came to be applied patronizingly to suggest someone “ignorant and uneducated”.  In the plural, the Greek word could mean “one's own countrymen”.  In medieval English common law, the formalized distinction was between an idiot (one who had been without reasoning or understanding from birth) and a lunatic (one who became that way later in life), and the difference could be important in determining responsibility and punishment for crimes committed.  The idiot savant first appeared in medical literature in 1870; idiot box was first used to describe television in 1959 and, given that broadcasting had begun in the 1930s, it’s surprising it took that long to work that out; idiot light, describing the dashboard warning lights in cars, is attested from 1961, a reference to drivers so lacking in mechanical sympathy as not to notice indications of problems or bother to scan gauges.

The adjective idiotic dates from 1713, following the Classical Latin idioticus and the Ancient Greek idiotikos; idiotical is from the 1640s; the noun idiocy (state of being an idiot) is from the 1520s, from idiot on the model of prophecy etc, and the early alternatives included idiotacy (1580s) & idiotry (1590s).  Until well into the twentieth century, blithering was one of the more popular adjectives applied to idiot, the form dating from 1880, the present-participle adjective from the verb blither (to talk nonsense).  A handy adaptation of idiot is the in-joke among IT staff who sometimes classify problems reported by users as ID10T errors.

Comrade Lenin agitprop.

The term useful idiot is from political science and is so associated with Lenin (Vladimir Ilyich Ulyanov; 1870–1924; first leader of Soviet Russia 1917-1922 & USSR 1922-1924) that it's often attributed to him but there's no evidence he ever spoke or wrote the words.  It became popular during the Cold War to describe pro-communist intellectuals and apologists in the West, the (probably retrospective) association with Lenin probably because, had the useful idiots actually assisted in achieving a communist revolution there, their usefulness outlived, he'd likely have had at least some of them shot as "trouble-makers".  Although it took many Western intellectuals decades to recant their support for the Soviet Union (some never quite managed), the watershed was probably Comrade Khrushchev's (1894–1971; Soviet leader 1953-1964) so-called "Secret Speech" (On the Cult of Personality and Its Consequences) to the 20th Congress of the Communist Party of the Soviet Union on 25 February 1956 in which he provided a detailed critique of the rule of comrade Stalin (1878-1953; Soviet leader 1924-1953), especially the bloody purges of the late 1930s.  Some however had already refused to deny what had become obvious to all but avid denialists: The God that Failed, a collection of six essays published in 1949 by poet Stephen Spender (1909-1995) et al, in which the writers laid bare their sense of betrayal and disillusionment with communism because of the totalitarian state built by Stalin, in so many ways just another form of fascism.

Idiot, Imbecile & Moron

Idiot, imbecile, and moron were in the early twentieth century used in a psychological classification system, each one assigned to a specific range of abilities.

Idiots: Those so defective that the mental development never exceeds that of a normal child of about two years.

Imbeciles: Those whose development is higher than that of an idiot, but whose intelligence does not exceed that of a normal child of about seven years.

Morons: Those whose mental development is above that of an imbecile, but does not exceed that of a normal child of about twelve years.

Of these three words moron is the newest, created only in the early twentieth century, coined specifically for the purpose of medical diagnosis.  Moron is from the Ancient Greek mōros (foolish, stupid), the root shared with the rare morosoph (a learned fool).  Imbecile dates from the sixteenth century, an adjective meaning "weak, feeble", from the Classical Latin imbecillus (weak, weak-minded) and not until the early nineteenth century did it begin to be used as a noun.  Moron actually replaced “feeble-minded” and “simpleton” (introduced in 1846) but neither was ever standardized in the medical lexicon.  The clinical use is now obsolete but the generalized use of all three is well established as terms of opprobrium for someone who acts in some dopey way or says something stupid but, the convention now is they can be applied only to someone not cognitively impaired, an inversion of their original purpose when part of the system of classification.

In the early 1900s, as the profession of psychiatry became more accepted within medicine, the system of classification became increasingly scientific: Idiots were those with IQs between 0–25, imbeciles between 26–50 and morons between 51–70.  The interest in the then fashionable field of eugenics saw further refinements with a teleological flavor: the concepts "moral insanity", "moral idiocy" and "moral imbecility" were used by the emerging field of eugenic criminology, which held crime could be reduced by preventing "feeble-minded" people from reproducing, and the US Supreme Court used the terminology in its judgment in the forced-sterilization case Buck v Bell (274 U.S. 200 (1927)).

The later introduction of retard, retarded & retardation was a genuine attempt to de-stigmatize those once labeled idiots, imbeciles & morons.  The process was the same as the invented word moron replacing “simpleton” and “feeble-minded” (from the Latin flebilis (to be lamented)).  Retarded was from the Latin retardare (to make slow, delay, keep back, or hinder); it was first used in relation to developmental delay in 1895 and was introduced as an alternative to idiot, moron, and imbecile because at the time it wasn’t derogatory, being a familiar technical term from engineering and mathematics, but the associative connection meant that by the 1960s it had become an insult.  As "retarded" and the related clinical terms from psychiatry moved along the euphemism treadmill they gradually assumed their derogatory connotations.  It seems to be an organic process in language: an original term, neutral in meaning, enters public use and, because of the thing with which it’s associated, becomes pejorative, a process noted also with words which become racial slurs.  It’s a very particular process: “Chinaman” is thought pejorative while “Englishman” is not; “Aussie” is a term of endearment whereas “Paki” is a slur, although that too is circumstantial, the commercial television station Channel 9 (Australia) using “The Pakis” in its promotional material for the coverage of the 1983-1984 cricket season.  It wouldn’t now be used.

So, as sympathy emerged for various sensitivities, the search for connotatively neutral replacements settled on variations of “intellectual disability”, the new sub-categories being profound, severe, moderate and mild levels.  The World Health Organisation (WHO) in 1968 published (in an out-of-sequence amendment to the ICD-8 (International Statistical Classification of Diseases and Related Health Problems)) a classification of intellectual disability (ID), based on what it called “relative degrees of cognitive functioning”:

Profound ID:          IQ below 20-25

Severe ID:             IQ 20-25 to 35-40

Moderate ID:         IQ 35-40 to 50-55

Mild ID:                 IQ 50-55 to 70
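Purely as an illustration, the WHO bands above can be expressed as a simple lookup.  Note the published boundaries were overlapping ranges (eg "20-25 to 35-40") to allow for measurement error; collapsing them to single cut-offs, as the sketch below does, is an assumption made only for simplicity:

```python
# Illustrative sketch only: maps an IQ score to the 1968 WHO (ICD-8)
# category of intellectual disability (ID).  The published boundaries
# were bands (eg "20-25 to 35-40"), allowing for measurement error;
# here they are collapsed to single cut-offs purely for simplicity.
def who_1968_id_category(iq: int) -> str:
    if iq < 20:
        return "Profound ID"
    if iq < 35:
        return "Severe ID"
    if iq < 50:
        return "Moderate ID"
    if iq <= 70:
        return "Mild ID"
    return "Not classified as ID"

print(who_1968_id_category(15))   # Profound ID
print(who_1968_id_category(60))   # Mild ID
```

In practice no clinician would classify on a raw score alone, which is precisely the shift the DSM-IV and later editions made.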

The alignment with the old system was idiot=profound, imbecile=moderate/severe and moron or feeble-minded=mild but, by the time the fourth edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM-IV) and ICD-10 were published in 1994, the profession was moving away from the use of raw IQ scores to something more nuanced, the DSM noting the importance of assessing “dysfunction or impairment” in at least two areas including “communication, self-care, home living, social/interpersonal skills, use of community resources, self-direction, functional academic skills, work, leisure & health and safety”.  The ICD noted “mental retardation is a condition of arrested or incomplete development of the mind, which is especially characterized by impairment of skills manifested during the developmental period, contributing to the overall level of intelligence: cognitive, language, motor and social abilities”.  However, the IQ baselines remained and the DSM-5 refined the model further, noting an intellectual disability should be defined by:

(1) Current intellectual deficits of two or more standard deviations below the population mean, which generally translates into performance in the lowest 3% of a person’s age and cultural group, or an IQ of 70 or below.

(2) Concurrent deficits in at least two domains of adaptive functioning of two or more standard deviations, which generally translates into performance in the lowest 3% of a person’s age and cultural group, or standard scores of 70 or below.

Both assessments need to be measured with an individualized, standardized, culturally appropriate, psychometrically sound instrument and need to assess (1) conceptual skills (communication, language, time, money & academic), (2) social skills (interpersonal skills, social responsibility, recreation & friendships) and (3) practical skills (daily living skills, work & travel).  US legislation in 2010 required the terms "mental retardation" and "mentally retarded" be removed from federal records and replaced with "intellectual disability" and "individual with an intellectual disability", a change reflected in the DSM-5 (2013).
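The arithmetic behind “two standard deviations below the population mean” translating to “an IQ of 70 or below” can be made explicit; the sketch below assumes (as is conventional for modern IQ scales, though not stated in the text above) a mean of 100 and a standard deviation of 15:

```python
# Why "two standard deviations below the mean" corresponds to an IQ
# of about 70: modern IQ tests are conventionally normed to a mean of
# 100 with a standard deviation of 15 (an assumption of this sketch).
from statistics import NormalDist

MEAN, SD = 100, 15
threshold = MEAN - 2 * SD            # 100 - 2*15 = 70

# Fraction of a normal population falling at or below that score,
# roughly the "lowest 3%" the DSM-5 wording refers to.
share = NormalDist(MEAN, SD).cdf(threshold)

print(threshold)              # 70
print(round(share * 100, 1))  # 2.3 (per cent)
```

The exact figure for two standard deviations is about 2.3%, which is why the DSM-5 hedges with “generally translates into performance in the lowest 3%”.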

Saturday, August 12, 2023

Mandarin

Mandarin (pronounced man-duh-rin)

(1) In Imperial China, a member of any of the nine ranks of public officials, each distinguished by a particular kind of button worn on the cap.

(2) By extension, an influential or senior government official or bureaucrat.

(3) In informal (derogatory) use, a pedantic or elitist bureaucrat.

(4) By extension, a member of an elite or powerful group or class, as in intellectual or cultural milieus (usually, though not necessarily, paid officials of institutions; the use tends to be derogatory).  The word is sometimes applied to any authority thought deliberately superior or complex; esoteric, highbrow or obscurantist.

(5) As “Standard Mandarin”, an official language of China and Taiwan, and one of four official languages in Singapore; Putonghua, Guoyu or Huayu (initial capital letter).

(6) A northern Chinese dialect, especially as spoken in and around Beijing (initial capital letter).

(7) A small, spiny citrus tree, Citrus reticulata, native to China, bearing lance-shaped leaves and flattish, orange-yellow to deep-orange loose-skinned fruit, some varieties of which are called tangerines; a small citrus tree (Citrus nobilis), cultivated for its edible fruit; the fruit of such tree, resembling small tangerines.

(8) In botany, any of several plants belonging to the genus Disporum or Streptopus, of the lily family, as S. roseus (rose mandarin) or D. lanuginosum (yellow mandarin), having drooping flowers and red berries.

(9) Of or relating to a mandarin or mandarins.

(10) In ornithology, an ellipsis of mandarin duck (Aix galericulata).

(11) Elegantly refined, as in dress, language or taste.

(12) A color in the orange spectrum.

(13) In ichthyology, as mandarin fish, the term applied to a number of brightly-colored species.

1580–1590: From the Portuguese mandarim & mandarij (or the older Dutch mandorijn), an alteration (by association with mandar (to order)) of the Austronesian Malay menteri & manteri, from the Hindi mantrī and the Sanskrit मन्त्रिन् (mantrin) (minister, councillor), from मन्त्र (mantra) (counsel, maxim, mantra) + -इन् (-in) (an agent suffix).  In Chinese folk etymology, the word originates from Mandarin 滿大人/满大人 (Mǎndàrén) (literally “Manchu important man”).  Mantra was ultimately from the primitive Indo-European root men- (to think) and the evolution of mandarin (in the sense of Chinese civil administration) was influenced in Portuguese by mandar (to command, order).  It was used generically of the several grades of Chinese officials who had entered the civil service (usually by way of the competitive exam); the Chinese equivalent was kwan (public servant) and by the early twentieth century it came to be used of “an important person”, though often in a resentful manner rather than in the sense of “a celebrity”.  The use to describe the small fruit was first noted in 1771 and was from the French mandarine, feminine of mandarin, based on the association with the color often used for the robes worn by mandarins in the Chinese civil service.  Mandarin, mandarinship, mandarinism & mandarinate are nouns, mandarinal is an adjective; the noun plural is mandarins.

Lindsay Lohan in mandarin collar in The Parent Trap (1998).  It wouldn't now be done because of fears of being cancelled for cultural appropriation.

In fashion, the mandarin collar (a short unfolded stand-up collar on a shirt or jacket) was a style adopted by Western fashion houses and said to be reminiscent of (though sometimes with exaggerated dimensions) the style depicted in the clothing of mandarins in Imperial China.  The mandarin gown (technically a cheongsam, a word which was actually from the Cantonese 長衫/长衫 (coeng saam) (long robe)) was (1) a tight-fitting and usually brightly colored and elaborately patterned formal woman's dress, split at the thigh (known also as a qipao) & (2) a plain-colored, tight-fitting dress with a short split at the thigh, worn as a school uniform by schoolgirls in Hong Kong.  Some dictionaries and food guides include “Mandarin cuisine” as a descriptor of the food associated with the area around Beijing but there’s little evidence of such use and the main distinction in the West seems to be between Beijing cuisine and Cantonese cuisine from the south.  However, “Mandarin” remains a most popular element in the names of Chinese restaurants in the West.

Lindsay Lohan mixing a Red Bull & mandarin juice while attending an event with former special friend Samantha Ronson, Mandarin Oriental Hotel, London, February 2012.

The use to describe the standard language of the nation was a calque of the Chinese 官話/官话 (Guānhuà) (spoken language of the mandarins), an extension from mandarin (bureaucrat of the Chinese Empire) to the language used by the imperial court and sometimes by imperial officials elsewhere; from this, it was in the twentieth century adopted as a synonym for “Modern Standard Chinese” although academics and translators note the ambiguity which developed after the use was extended in the early seventeenth century to a number of northern dialects of Chinese, to the extent they consider Mandarin a branch of the Chinese languages consisting of many dialects; Guanhua or Beifanghua.  Standard Mandarin (the language of the elites, media and education) and Mandarin Chinese (the group of Northern Chinese dialects together with Standard Mandarin) are not wholly interchangeable and within China are described differently.

Mandarin duck.

There are some forks of Mandarin Chinese which, but for a few words and phrases, are unintelligible to speakers of Standard Mandarin and the whole set of Mandarin languages is part of the broader group of languages and dialects (or topolects) spoken in China.  The evolution of Mandarin to become both the imperial lingua franca and the official “court language” of the Ming and Qing dynasties was in part a pragmatic solution to the mutual unintelligibility of the varieties of spoken Chinese which had emerged over centuries.  It became prevalent during the late Ming (1368-1644) and early Qing (1636-1912) eras and, because of the centralization of Imperial administration, the particular court dialect spoken in Beijing became dominant by the mid-nineteenth century, substantially forming what, by the time of the formation of the People’s Republic of China (PRC) under the Chinese Communist Party (CCP) in 1949, had become “Standard Chinese”.