Monday, August 14, 2023

Puffery

Puffery (pronounced puhf-uh-ree)

(1) Undue or exaggerated praise; inflated laudation; publicity, advertising claims, acclaim etc that are exaggerated (the basis of the “puff piece”).

(2) In common law jurisdictions (often as “mere puffery”), certain claims or assertions which, even if literally untrue or misleading, are not actionable.

(3) An act of puffing (rare except in humor).

1730–1735: The construct was puff (in the sense of “to praise with exaggeration”) + -ery.  The noun puff was from the early thirteenth century Middle English puf & puffe, from the Old English pyf (a short, quick blast of wind, act of puffing) which was imitative and cognate with the Middle Low German puf & pof.  It was derived from the verb which was from the Middle English puffen, from the Old English pyffan & puffian (to breathe out, blow with the mouth) and similar forms in other European languages included the Dutch puffen, the German Low German puffen, the German puffen, the Danish puffe and the Swedish puffa.  The sense of “to blow with quick, intermittent blasts” was common by the mid-fourteenth century while the meaning “pant, breathe hard and fast” emerged some decades later.  It was used of the “fluffy light pastry” from the late fourteenth century while the “small pad of a downy or flossy texture for applying powder to skin or hair” was first so described in the 1650s.

The meaning “to fill, inflate, or expand with breath or air” dates from the 1530s while the intransitive sense (in reference to small swellings & round protuberances) was noted by 1725.  The transitive figurative sense of “exalt” was known by the 1530s which shifted somewhat by the early eighteenth century into the meaning “praise with self-interest, give undue or servile praise to”, the idea by mid century focused on the figurative sense of “empty or vain boast”, this sense soon extended to mean “flattery & inflated praise”.  The derogatory use of poof for “an effeminate man; a male homosexual” was noted from the 1850s and is presumably from puff (possibly in the sense of “powder puff”, an allusion to the stereotype of their “excessive concern with maintaining a delicate appearance”) and the extended form “poofter” was early twentieth century Australian slang, an unusual linguistic departure for a dialect which tended either to clip or add a trailing “e”, “y” or “o” sound to words.  The correct spelling for the furniture piece (a low cushioned seat with no back; a padded foot-stool) was pouf, from the French pouf & pouff (again of imitative origin) but, presumably because of confusion caused by the pronunciation, the spellings puff & poof sometimes are used.  The suffix -ery was from the Middle English -erie, from the Anglo-Norman and Old French -erie, a suffix forming abstract nouns.  The suffix first occurs in loan words from the Old French into the Middle English, but became productive in English by the sixteenth century, sometimes as a proper combination of -er with “y” (as in bakery or brewery) but also as a single suffix (such as slavery or machinery).  Puffery is a noun; the noun plural is pufferies.

Mere puffery

In law, the concept of “mere puffery” was created to provide a buffer between the “meaningless” sales pitch and the deceptive or misleading claims which amount to a misrepresentation.  A misrepresentation may be actionable; “mere puffery” is not.  Puffery is used to describe a claim that (1) a “reasonable person” would not take seriously or (2) is so vague or subjective that it can be neither proved nor disproved.  Those two definitions operate in conjunction because even if an assertion can be disproved, if it would be absurd for the “reasonable person” to claim they believed it, it will be held to be “mere puffery”.

Doubling down: Disappointed at losing the case based on their £100 offer, to restore public confidence, they offered £200. 

In contract law, the term “puffery” comes from one of the most celebrated cases in English jurisprudence: Carlill v Carbolic Smoke Ball Company ([1892] EWCA Civ 1) before the Court of Appeal.  During the deadly influenza pandemic in the northern winter of 1889-1890, the Carbolic Smoke Ball Company advertised it would pay £100 (equivalent to some £14,000 in 2023) to anyone who became ill with influenza after using their smoke ball in accordance with the instructions enclosed with the product.  Mrs Carlill was concerned enough by the flu to buy a ball which, following the instructions, she used thrice daily for some weeks but nevertheless caught the flu.  Unable to persuade the company to pay her £100, Mrs Carlill brought an action in court, claiming a contract existed, which the company denied.  At first instance, despite being represented by a future prime minister, the Carbolic Smoke Ball Company lost, a verdict upheld unanimously by the Court of Appeal.  It was a landmark in the development of contract law, refining the long-established principles of (1) offer, (2) acceptance, (3) certainty of terms and (4) consideration although it would be decades before the implications would begin comprehensively to be realized in legislation.  Not only did Mrs Carlill secure her £100 but she survived the pandemic, living to the age of ninety-six.  On 10 March 1942, she died after catching influenza.

So, Mrs Carlill, having used the smoke ball three times a day for almost two months before she developed influenza, sued for breach of contract and the court held the offer made in the advertisement was not “mere puff” but constituted a valid offer of contract; the Smoke Ball Company’s offer was thus a misrepresentation because, in the particular circumstances detailed, a “reasonable person” would be likely to believe that they would receive £100 and thus, relying on the claim, be persuaded to purchase the product.  However, all the circumstances must be considered on a case-by-case basis and an individual’s simple reliance on a claim they sincerely believe to be true is not sufficient for something to be held a misrepresentation.

In the famous Red Bull lawsuit in 2013, the court noted the company’s advertising slogan “Red Bull gives you wings” was “mere puffery” in that no reasonable person would believe ingesting even many cans of the stuff would mean they would “grow wings and fly”, but the lawsuit claimed that implicit in the slogan was the allegedly deceptive and fraudulent suggestion that the drink was a “superior source of energy”, something not backed up by scientific evidence.  Heard in the US District Court for the Southern District of New York, the class action was lodged by someone who had been drinking Red Bull for a decade-odd.  His claim was not that he expected feathers to sprout but that the idea drinking Red Bull would increase performance and concentration (as advertised in the company's television, on-line and other marketing campaigns) was “deceptive and fraudulent and is therefore actionable”.  The scientific basis for the action was research which found energy drinks gained their “boost” through caffeine alone, not guarana or any other ingredient, adding that although there was no academic support for the claim Red Bull provides “any more benefit to a consumer than a cup of coffee, the Red Bull defendants persistently and pervasively market their product as a superior source of ‘energy’ worthy of a premium price over a cup of coffee or other sources of caffeine.”  Red Bull, while denying any wrongdoing or liability and maintaining its “marketing and labeling have always been truthful and accurate”, settled the lawsuit “to avoid the cost and distraction of litigation”.  As part of the settlement, anyone resident in the US who claimed to have purchased a can of Red Bull at some time after 1 January 2002 was eligible to receive either a US$10 reimbursement or two free Red Bull products with a retail value of approximately US$15, a webpage being created to enable those affected to lodge their claims.
To avoid any similar claims, the company “voluntarily updated its marketing materials and product labeling".

Advertising is often a mix of puffery and specific claims which can be actionable, depending on the circumstances, either in damages or restitution.

So every case is decided on its merits.  A case before the Federal Court in Australia in 2017 held that a false assertion an app had “the most property listings in Sydney” was a misrepresentation because uncontested evidence proved otherwise although the court noted that were the app to claim it was “the best” app of its kind, that would be mere puffery because, in that context, the phrase “the best” means nothing in particular; it’s not something which can be reduced to a metric or precisely defined.  More intriguing for those who like to speculate when grey turns black or white was the Pepsi Points Case which was in many ways similar to Carlill v Carbolic Smoke Ball Company.  PepsiCo’s advertising included a point system which customers could use to redeem prizes and one campaign had offered a military jet fighter (then invoiced by the manufacturers at US$23 million odd) in exchange for 7 million "Pepsi Points" (then worth US$700,000).  Mailing a US$700,000 cheque to PepsiCo, a customer asked to collect his jet.  The court held the offer was “mere puffery” on the basis of (1) aspects of the campaign which clearly indicated “its jocular nature”, (2) that no reasonable person would believe a US$23 million jet could be obtained by exchanging US$700,000 and (3) that it was anyway impossible for the company to deliver a military fighter jet in operable condition to a civilian customer.  It was an interesting case because it might have been decided differently if the object had been closer in value to the points mentioned and been something there was no legal impediment to supplying (such as a US$1 million car).  Were it a US$143 million car (there is one), the promotion would presumably still be judged puffery but at some point, it must be that the relative values would be close enough for the “reasonable person” test to apply.  That however is something impossible to reduce to an equation and each case will be decided on its merits.
Just to be sure, PepsiCo bumped up by several orders of magnitude the points required to start one’s own air force and added some text to make it clear the whole thing was just a joke.

In the matter of Tyrrell’s Crinkly Crisps.  Often packaging & advertising will contain a number of claims, some of which will be mere puffery (even if they’re blatantly untrue and easily disproved) while others need to be verifiable:

2 Pack: Not puffery; every pack must contain two packets.  There have been instances when customers have complained they’ve received more than was advertised and paid for but it’s rare.  Usually, such things are treated as “windfalls”.

Vegan: Not puffery; the contents must be vegan (as defined in the regulation of whatever jurisdiction in which they’re sold).

Triple Cooked: Probably puffery because it’s doubtful the term has any legal definition although were it possible to prove the production process is essentially the same as for any other crisp (chip), it might be actionable.  Because “triple” does have a defined value, were it proved the goods were cooked only twice as long as the practice of other manufacturers, that would presumably compel a change of text to “Double Cooked”.

More Crunch: Probably puffery because the measure of such things is so subjective and there is a point at which to increase crunchiness becomes self-defeating because other desired qualities will be lost.

Crinkly Crisps: Not puffery; the crisps must to some extent be crinkly although it might be fun to have a judge explore the margins and tell us how slight a corrugation can be while still being called “crinkly”.

No Artificial Nasties: Not puffery; these packets probably contain artificial ingredients because they’re almost impossible to avoid in the industrial production of food.  What constitutes a “nasty” is however a thing of quantity as well as quality; something millions consume every day harmlessly (even beneficially) can be a toxic “nasty” in large quantities so what’s included in the packet will be safe as supplied.  If potential “nasties” are found to exist in a quantity above a certain point, it’s actionable.

Gluten Free: Not puffery; unless there is an allowable quantity (ie trace amounts) permitted by regulation, there must be no gluten.

Sea Salt & Vinegar: Not puffery; sea salt is a particular type of salt so it must be used and there must be evidence of the use of vinegar.

165 g Net: Not puffery; each pack must contain 165 g of edible content +/- the small % of production line variation a court would deem acceptable.

Content guide (fat, energy et al): Not puffery; again, what’s claimed must be a reliable indication of the contents, within whatever small variation is acceptable.

Photograph with giant crisp: Puffery and an example of how the “reasonable person” test works in conjunction with an objective test of truth.  The packs do not contain crisps as large as is represented in the image (indeed, such would be too big even to fit in the pack) and no reasonable person would believe this is what they’re buying.

Obsolete & Obsolescent

Obsolete (pronounced ob-suh-leet)

(1) No longer in general use; fallen into disuse; that is no longer practiced or used, out of date, gone out of use, of a discarded type; outmoded.

(2) Of a linguistic form, no longer in use, especially if out of use for at least the past century.

(3) Effaced by wearing down or away (rare).

(4) In biology, imperfectly developed or rudimentary in comparison with the corresponding character in other individuals, as of a different sex or of a related species; of parts or organs, vestigial; rudimentary.

(5) To make obsolete by replacing with something newer or better; to antiquate (rare).

1570–1580: From the Latin obsolētus (grown old; worn out), past participle of obsolēscere (to fall into disuse, be forgotten about, become tarnished), the construct assumed to be ob- (opposite to) (from the Latin ob- (facing), a combining prefix found in verbs of Latin origin) + sol(ēre) (to be used to; to be accustomed to) + -ēscere (–esce) (the inchoative suffix, a form of -ēscō (I become)).  It was used to form verbs from nouns, following the pattern of verbs derived from Latin verbs ending in –ēscō.  Obsoletely is an adverb, obsoleteness is a noun and the verbs (used with object) are obsoleted & obsoleting.  Although it does exist, except when it’s essential to convey a technical distinction, the noun obsoleteness is hardly ever used, obsolescence standing as the noun form for both obsolete and obsolescent.  The verb obsolesce (fall into disuse, grow obsolete) dates from 1801 and is as rare now as it was then.

Although not always exactly synonymous, in general use, archaic and obsolete are often used interchangeably.  However, dictionaries maintain a distinction: words (and meanings) not in widespread use since English began to assume its recognizably modern form in the mid-1700s are labeled “obsolete”.  Words and meanings which, while from Modern English, have long fallen from use are labeled “archaic” and those now seen only very infrequently (and then often in specialized, technical applications) are labeled “rare”.

Obsolescent (pronounced ob-suh-les-uhnt)

(1) Becoming obsolete; passing out of use (as a word or meaning).

(2) Becoming outdated or outmoded, as applied to machinery, weapons systems, electronics, legislation etc.

(3) In biology, gradually disappearing or imperfectly developed, as vestigial organs.

1745–1755: From the Latin obsolēscentum, from obsolēscēns, present participle of obsolēscere (to fall into disuse); the third-person plural future active indicative of obsolēscō (degrade, soil, sully, stain, defile).  Obsolescently is an adverb and obsolescence a noun.  Because things that are obsolescent are becoming obsolete, the sometimes heard phrase “becoming obsolescent” is redundant.  The sense “state or process of gradually falling into disuse; becoming obsolete” entered general use in 1809 and although most associated with critiques by certain economists in the 1950s, the phrase “planned obsolescence” was coined in 1932, the 1950s use a revival.

Things that are obsolete are those no longer in general use because (1) they have been replaced or (2) the activity for which they were designed is no longer undertaken.  Things that are obsolescent are still to some extent in use but, for whatever combination of reasons, are fading from general use and tending towards becoming obsolete.  For example, the Windows XP operating system (released in 2001) is not obsolete because some still use it, but it is obsolescent because, presumably, it will in the years ahead fall from use.

Ex-Royal Air Force (RAF) Hawker Hunter in Air Force of Zimbabwe (AFZ) livery; between 1963 and 2002, twenty-six Hunters were at different times operated by the AFZ.  Declared obsolete as an interceptor by the RAF in 1963, some Hunters were re-deployed to tactical reconnaissance, ground-attack and close air support roles before being retired from front-line service in 1970.  Some were retained as trainers while many were sold to foreign air forces including those of India, Pakistan and Rhodesia (Zimbabwe since 1980).

Despite the apparent simplicity of the definition, in use, obsolescent is highly nuanced and much influenced by context.  It’s long been a favorite word in senior military circles; although notorious hoarders, generals and admirals are usually anxious to label equipment as obsolescent if there’s a whiff of hope the money might be forthcoming to replace it with something new.  One often unexplored aspect of the international arms trade is that of used equipment, often declared obsolescent by the military in one state and purchased by that of another, a transaction often useful to both parties.  The threat profile against which a military prepares varies between nations and equipment which genuinely has been rendered obsolescent for one country may be a valuable addition to the matériel of others and go on to enjoy an operational life of decades.  Well into the twenty-first century, WWII & Cold War-era aircraft, warships, tanks and other weapon-systems declared obsolescent and on-sold (and in some cases given as foreign aid or specific military support) by big-budget militaries remain a prominent part of the inventories of many smaller nations.  That’s one context, another hinges on the specific-tasking of materiel; an aircraft declared obsolescent as a bomber could long fulfil a valuable role as a transport or tug.

In software, obsolescence is so vague a concept the conventional definition really isn’t helpful.  Many software users suffer severe cases of versionitis (a syndrome in which they suffer a sometimes visceral reaction to using anything but the latest version of something) so obsolescence to them seems an almost constant curse.  The condition tends gradually to diminish in severity and in many cases the symptoms actually invert: after sufficient ghastly experiences with new versions, versionitis begins instead to manifest as a morbid fear of ever upgrading anything.  Around the planet, obsolescent and obsolete software has for decades proliferated and there’s little doubt this will continue, the Y2K bug which prompted much rectification work on the ancient code riddling the world of the main-frames and other places unlikely to be the last great panic (one is said to be next due in 2029).  The manufacturers too have layers to their declaration of the obsolete.  In 2001, Microsoft advised all legacy versions of MS-DOS (the brutish and now forty-year-old file-loader) were obsolete but, with a change of release number, still offers what's functionally the same MS-DOS for anyone needing a small operating system with minimal demands on memory size & CPU specification, mostly those who use embedded controllers, a real attraction being the ability easily to address just about any compatible hardware, a convenience more modern OSs have long restricted.  DOS does still have attractions for many, the long-ago derided 640 kb actually a generous memory space for many of the internal processes of machines and it's an operating system with no known bugs.

XTree’s original default color scheme; things were different in the 1980s.

Also, obsolescent, obsolete or not, sometimes the old ways are the best.  In 1985, Underware Systems (later the now defunct Executive Systems (EIS)) released a product called XTree, the first commercially available software which provided users a visual depiction of the file system, arranged using a root-branch tree metaphor.  Within that display, it was possible to do most file-handling such as copying, moving, re-naming, deleting and so on.  Version 1.0 was issued as a single, 35 kb executable file, supplied usually on a 5.25" floppy diskette and although it didn’t do anything which couldn’t (eventually) be achieved using just DOS, XTree made it easy and fast; reviewers, never the most easily impressed bunch, were effusive in their praise.  Millions agreed and bought the product which went through a number of upgrades until by 1993, XTreeGold 3.0 had grown to a feature-packed three megabytes but, and it was a crucial part of the charm, the user interface didn’t change and anyone migrating from v1 to v3 could carry on as before, using or ignoring the new functions as they chose.

However, with the release in 1990 of Microsoft’s Windows 3.0, the universe shifted and while it was still an unstable environment, it was obvious things would improve and EIS, now called the XTree Company, devoted huge resources to producing a Windows version of their eponymous product, making the crucial decision that when adopting the Windows-style graphical user interface (GUI), the XTree keyboard shortcuts would be abandoned.  This meant the user interface was something that looked not greatly different to the Windows in-built file manager and bore no resemblance to the even then quirky but marvelously lucid one which had served so well.  XTree for Windows was a critical and financial disaster and in 1993 the company was sold to rival Central Point Software, themselves soon to have their own problems, swallowed a year later by Symantec which, in a series of strategic acquisitions, soon assumed an almost hegemonic control of the market for Windows utilities.  Elements of XTree were interpolated into other Symantec products but as a separate line, it was allowed to die.  In 1998, Symantec officially deleted the product but the announcement was barely noted by the millions of users who continued to use the text-based XTree which ran happily under newer versions of Windows although, being a real-mode program and thus living in a small memory space, as disks grew and file counts rose, walls were sometimes hit, some work-arounds possible but kludgy.  The attraction of the unique XTree was however undiminished and an independent developer built ZTree, using the classic interface but coded to run on both IBM’s OS/2 and the later flavors of Windows.  Without the constraints of the old real-mode memory architecture, ZTree could handle long file and directory names, megalomaniacs now able to log an unlimited number of disks and files, all while using the same, lightning-fast interface.
The idea spread to UNIX where ytree, XTC, linuXtree and (most notably) UnixTree were made available.

ZTree, for those who can remember how things used to be done.

ZTree remains a brute-force favorite for many techs.  Most don’t often need to do those tasks at which it excels but, when those big-scale needs arise, as a file handler, ZTree still can do what nothing else can.  It’ll also do what’s now small-scale stuff; anyone still running XTree 1.0 under MS-DOS 2.11 on their 8088 could walk to some multi-core 64-bit monster with 64 GB RAM running Windows 11 and happily use ZTree.  ZTree is one of the industry’s longest-running user interfaces.

The Centennial Light, Livermore-Pleasanton Fire Department, Livermore, California.  Illuminated almost continuously since 1901, it’s said to be the world's longest-lasting light bulb.  The light bulb business became associated with the idea of planned obsolescence after the revelation of the existence of a cartel of manufacturers which had conspired to more than halve the service life of bulbs in order to stimulate sales.

As early as 1924, executives in US industry had been discussing the idea of integrating planned obsolescence into their systems of production and distribution although it was then referred to with other phrases.  The idea essentially was that in the industrial age, modern mercantile capitalism was so efficient in its ability to produce goods that it would tend to over-produce, beyond the ability to stimulate demand.  The result would be a glut, a collapse in prices and a recession or depression which affected the whole society, a contributing factor to what even then was known as the boom & bust economy.  One approach was that of the planned economy whereby government would regulate production and maintain employment and wages at the levels required to maintain some degree of equilibrium between supply and demand but such socialistic notions were anathematic to industrialists.  Their preference was to reduce the lifespan of goods to the point which matched the productive capacity and product-cycles of industry, thereby ensuring a constant churn.  Then, as now, there were those for and against, the salesmen delighted, the engineers appalled.

The actual phrase seems first to have been used in the pamphlet Ending the Depression Through Planned Obsolescence, published in 1932 by US real estate broker (and confessed Freemason) Bernard London (b circa 1873) but it wasn’t popularized until the 1950s.  Then, it began as a casual description of the techniques used in advertising to stimulate demand and thus without the negative connotations which would attach when it became part of the critique of materialism, consumerism and the consequential environmental destruction.  There had been earlier ideas about the need for a hyper-consumptive culture to service a system designed inherently to increase production and thus create endless economic growth: one post-war industrialist noted the way to “avoid gluts was to create a nation of gluttons” and exporting this model underlies the early US enthusiasm for globalism.  As some of the implications of that became apparent, globalization clearly not the Americanization promised, enthusiasm became more restrained.

Betamax and VHS: from dominant to obsolescent to obsolete; the DVD may follow.

Although the trend began in the United States in the late 1950s, it was in the 1970s that the churn rate in consumer electronics began to accelerate, something accounted for partly by the reducing costs as mass-production in the Far East ramped up but also the increasing rapidity with which technologies came and went.  The classic example of the era was the so-called videotape format war which began in the mid 1970s after the Betamax (usually clipped to Beta) and Video Home System (VHS) formats were introduced within a year of each other.  Both were systems by which analog recordings of video and audio content could be distributed on magnetic tapes loaded into players with a cassette (the players, regardless of format, soon known universally as video cassette recorders (VCRs)).  The nerds soon pronounced Betamax the superior format because of the superior quality of playback and commercial operators agreed, Betamax quickly adopted as the default standard in television studios.  Consumers however came to prefer VHS because, on most of the screens on which most played their tapes, the difference between the two was marginal and the VHS format permitted longer recording times (an important thing in the era) and the hardware was soon available at sometimes half the cost of Betamax units.

It was essentially the same story which unfolded a generation later in the bus and operating systems wars; the early advantages of OS/2 over Windows and of Micro Channel Architecture (MCA) over ISA/EISA were both real and understood but few were prepared to pay the steep additional cost for advantages which seemed so slight and at the same time brought problems of their own.  Quite when Betamax became obsolescent varied between markets but except for a handful of specialists, by the late 1980s it was obsolete and the flow of new content had almost evaporated.  VHS prevailed but its dominance was short-lived, the Digital Versatile Disc (DVD) released in 1997 which within half a decade was the preferred format throughout the Western world although in some other markets, the thriving secondary market suggests even today the use of VCRs is not uncommon.  DVD sales though peaked in 2006 and have since dropped by some 80%, their market-share cannibalized not by the newer Blu-Ray format (which never achieved critical mass) but by the various methods (downloads & streaming) which meant many users were able wholly to abandon removable media.  Despite that, the industry seems still to think the DVD has a niche and it may for some time resist obsolescence because demand still exists for content on a physical object at a level it remains profitable to service.  Opinions differ about the long-term.  History suggests that as the “DVD generation” dies off, the format will fade away as those used to entirely weightless content available any time, in any place won’t want the hassle but, as the unexpected revival of vinyl records as a lucrative niche proved, obsolete technology can have its own charm which is why a small industry now exists to retro-fit manual gearboxes into modern Ferraris, replacing technically superior automatic transmissions.

Sunday, August 13, 2023

Coupe

Coupe (pronounced koop or koo-pey (the latter used even if spelled without the “é”)).

(1) A closed, two-door car, sometimes on a shorter wheelbase than the four-door version on which they’re based.

(2) A four-door car with a lower or more elongated, sloping roofline than the model on which it’s based.

(3) An ice cream or sherbet mixed or topped with fruit, liqueur, whipped cream etc.

(4) A glass container for serving such a dessert, usually having a stem and a wide, deep bowl (similar in shape but usually larger than a champagne coupe).

(5) As champagne coupe, a shallow, broad-bowled, saucer-shaped stemmed glass, also often used for cocktails because of its greater stability than many cocktail glasses.

(6) A short, four-wheeled, horse-drawn, closed carriage, usually with a single seat for two passengers and an outside seat for the driver.

(7) The end compartment in a European diligence or railroad car with seats on one side only.

(8) In commercial logging, an area of a forest or plantation where harvesting of wood is planned or has taken place.

(9) In military use, as coupe gorge (a borrowing from the French, literally “cut-throat”), any position affording such advantage to an attacking formation that the troops occupying it must either surrender or be “cut to pieces”.

(10) In various sports, a cup awarded as a prize.

(11) A hairstyle (always pronounced coop) which typically features shorter sides and back with longer hair on top.

1825–1835: From the French coupé (low, short, four-wheeled, close carriage without the front seat, carrying two inside, with an outside seat for the driver (also “front compartment of a stage coach”)), a shortened form of carrosse coupé (a cut-off or shortened version of the Berlin (from Berliner) coach, modified to remove the back seat), the past participle of couper (to cut off; to cut in half), the verbal derivative of coup (blow; stroke); a doublet of cup, hive and keeve, thus the link with goblets, cups & glasses.  It was first applied to two-door automobiles with enclosed coachwork by 1897 while the Coupe de ville (or Coupé de ville) dates from 1931, describing originally a car with an open driver's position and an enclosed passenger compartment.

The earlier senses (wicker basket, tub, cask) date from 1375–1425, from the Middle English, from the Anglo-French coupe & cope, from the Old French coupe, from the Medieval Latin cōpa (cask), from the Latin cūpa (cask, tub, barrel), the ultimate source of the modern “cup” (both drink vessels and bras).  The Middle English cǒupe was from the Old Saxon kûpa & côpa, from the Old High German chôfa & chuofa, again from the Medieval Latin cōpa, from the Latin cūpa.  It described variously a large wicker basket; a dosser, a pannier; a basket, pen or enclosure for birds (a coop); a cart or sled equipped with a wicker basket for carrying manure etc; a barrel or cask for holding liquids.  The obvious descendent is the modern coop (chickens etc).  Coupe is a noun; the noun plural is coupes.

Marie Antoinette and the unrelated champagne coupe.

The “coupe” hair-style (always pronounced coop) is one which typically features shorter sides and back with longer hair on top, the modern interpretations making a distinct contrast between the shorter and longer sections, the aim being the creation of sharp lines or acute angles.  The longer hair atop can be styled in various ways (slicked back, textured, or even the messy look of a JBF).  Historically, the coupe hairstyle was associated with men's cuts but of late it’s become popular with women, attracted by the versatility, low maintenance and the adaptability to suit different face shapes, hair types and variegated coloring.  Because, outside the profession, there’s no obvious link between “coupe” and hair-styles, the term “undercut” is often used instead.  Unfortunately, despite the often-repeated story, there seems little to support the claim the wide-mouthed, shallow-bowled champagne coupe was modelled on one of Marie Antoinette's (1755–1793; Queen Consort of France 1774-1792) breasts.

Harold Wilson (1916–1995; UK prime minister 1964-1970 & 1974-1976) outside 10 Downing Street with his official car, a Rover 3.5 saloon.

In automobiles, by the 1960s, the English-speaking world had (more or less) agreed a coupe was a two-door car with a fixed roof and, if based on a sedan, in some way (a shorter wheelbase or a rakish roof-line) designed to put a premium on style over utility.  There were hold-outs among a few UK manufacturers who insisted there were fixed head coupes (FHC) and drop head coupes (DHC), the latter described by most others as convertibles or cabriolets, but mostly the term had come to be well-understood.  It was thus a surprise when Rover in 1962 displayed a “four-door coupe”, essentially their 3 Litre sedan with a lower roof-line and a few “sporty” touches such as a tachometer and a full set of gauges.  Powered by a 3.0 litre (183 cubic inch) straight-six, the 3 Litre had been available as a four-door sedan since 1958 and had found a niche in that part of the upper middle-class market which valued smoothness and respectability over the speed and flashiness offered by the rakish Jaguars but, heavy and under-powered by comparison, even its admirers remarked on the lethargy of the thing while noting it was fast enough to over-tax the four-wheel drum brakes.  The engine did however set standards of smoothness which only the Rolls-Royce straight-sixes and the best of the various straight-eights could match but, by 1959, both breeds were all but extinct, so the Rover, with its by then archaic arrangement of overhead inlet and side-mounted exhaust valves, had at least one unmatched virtue to offer.

Rover-BRM Gas-Turbine, Le Mans, 1965.

Although obviously influenced by the then-stylish 1955 Chryslers, its conservative lines appealed to a market segment where such a thing was a virtue and reflected Rover’s image, although the company’s history included some genuine adventurism: its experimental turbine-engined cars of the early post-war years produced high performance, something made more startling by their being mounted in bodies using the same styling cues as the upright 3 Litre.  The company however discovered that, whatever the many advantages, turbines suffered the same problems that would doom Chrysler’s turbine project, notably their thirst (although turbines do have a wide tolerance of fuel types) and the high manufacturing costs imposed by the precision required, something hinted at by the Chrysler’s tachometer reading to 46,000 rpm while the temperature gauge was graduated to 1,700°F (930°C).  While such machinery was manageable on warships or passenger jets, selling it to general consumers would have been too great a risk for any corporation and neither ever appeared in the showrooms, although Chrysler’s research continued until 1979 and Rover co-developed a turbine race car which proved its speed and durability in several outings in the Le Mans 24 hour endurance classic.

Chop top: The Rover 3.5 Coupé (P5B).

For the public however, Rover upgraded the 3 Litre in a way which was less imaginative but highly successful, purchasing from General Motors (GM) the rights to the 3.5 litre (215 cubic inch), all-aluminum V8 which Buick, Oldsmobile & Pontiac had all used in their new generation of “compact” cars between 1960-1963.  For a variety of reasons, GM abandoned the project (to their later regret) and Rover embarked on their own development project, modifying the V8 to suit local conditions and the availability of components.  Remarkably, it would remain in production until 2006, used by several manufacturers as well as a legion of private ventures in capacities up to 5.0 litres (305 cubic inch), although megalomaniacs discovered that by using a mix-and-match of off-the-shelf parts, a displacement of 5.2 litres (318 cubic inch) was possible.  Lighter and more powerful than the long-serving straight-six, the V8 transformed the 3 Litre although Rover, with typical English understatement, limited themselves to changing the name to “3.5 Litre”, avoiding any potential confusion when the V8 was offered in the smaller 2000 by calling that car the “3500”.

Although the factory never released one, privately some 3.5 Coupés have been converted to two-doors and there are even some cabriolets (ie drop head coupes).

Although the new engine couldn’t match the smoothness of the old, the effortless performance it imparted added to the refinement and fortunately, by the time the V8 was installed, disc brakes had been fitted; transformed by the additional power, it became an establishment favorite, used by prime ministers and Queen Elizabeth II even long after it had been discontinued.  Even by the time the V8 version was released in 1967, it was in many ways a relic but it managed to offer such a combination of virtues that its appeal for years transcended its vintage aspects.  When the last was produced in 1973, few denied it was outdated and had for some time been obsolescent, but even many critics would admit it remained a satisfying drive.  One intriguing part of the tale was why, defying the conventions of the time, the low-roof variation of the four-door was called a coupé (and Rover did use l'accent aigu (the acute accent: “é”) to ensure the “traditional pronunciation” was imposed, although the Americans and others sensibly abandoned the practice).  The rakish lines, including more steeply sloped front and rear glass, were much admired although the original vision had been more ambitious still, the intention being a four-door hardtop with no central pillar.  Strangely, although the Americans and Germans managed this satisfactorily, a solution eluded Rover, which had to content itself with a thinner B-pillar.

One way or another, windows have troubled the English: (1) the “window tax” imposed on houses during the eighteenth & nineteenth centuries was a constant irritant to many; (2) the squircle (in algebraic geometry "a closed quartic curve having properties intermediate between those of a square and a circle") windows used in the early de Havilland Comets were found to be a contributing factor in the catastrophic structural airframe failures which doomed the thing and are the reason why oval windows are used to this day (mathematicians pointing out the Comet’s original apertures were not “quartic” as some claim on the basis of them being “a square with rounded corners”, the nerds noting “quartic” means “an algebraic equation or function of the fourth degree or a curve describing such an equation or function”); and (3) even by the mid 1970s, Jaguar couldn’t quite get right the sealing on the frameless windows used on the lovely “two-door” versions (1975-1978) of the Jaguar & Daimler XJ saloons (which the factory insisted were NOT coupés, presumably to differentiate them from the long-serving (1975-1995) but considerably less lovely XJ-S (later XJS)).

Lindsay Lohan with Porsche Panamera 4S four-door coupe (the factory doesn't use the designation but most others seem to), Los Angeles, 2012.

The etymology of coupe traces to couper (to cut off), but the original use in the context of horse-drawn coaches referred to the platform being shortened, not lowered, and others have also been inventive, Cadillac for decades offering the Coupe De Ville (also rendered Coupe DeVille), usually built on exactly the same platform as the Sedan De Ville.  So Rover probably felt entitled to cut where they preferred; in their case it was the roof and, in the early twenty-first century, the four-door coupe became a thing, the debut in 2004 of the Mercedes-Benz CLS influencing others including BMW, Porsche, Volkswagen and Audi.  Whether the moment for the style has passed will be indicated by whether the current model, the last of which will be produced in August 2023, is replaced.

Idiot

Idiot (pronounced id-ee-uht)

(1) In informal use, (a) a foolish or senseless person (derogatory) or (b) an affectionate expression of disapprobation or disagreement.

(2) In medicine & psychology, a person of the lowest order in the former classification of intellectual disability (a mental age of less than three years and an IQ (intelligence quotient) lower than 25); no longer in technical use and now considered generally offensive unless used affectionately.

1250–1300: From the Middle English idiote & ydiote, from the twelfth century Old French idiote (later idiot) (uneducated or ignorant person), from the Late Latin idiōta (an ignorant person), from the Ancient Greek ἰδιώτης (idiṓtēs) (private person, layman, person lacking skill or expertise; an ignoramus (as opposed to a writer, soldier or skilled workman)), the construct being idiō- (a lengthened variant of idio-, perhaps by analogy with stratiōtēs (professional soldier), derived from stratiá (army)) + -tēs (the agent noun suffix).  The Ancient Greek ἴδιος (ídios) meant "one's own, pertaining to oneself, private" and idiōtēs was a doublet of idiota.  Dialectical variations in English and Irish included eejit, idjit & idget.  The plural is idiots.  English offers a rich array of alternatives to idiot: fool, moron, nitwit, twit, blockhead, bonehead, cretin, dimwit, dork, dumbbell, dunce, ignoramus, imbecile, muttonhead, nincompoop, ninny, pinhead, simpleton, clodpoll, jerk, half-wit, dolt & numskull.

Use of the word "idiot" in headlines can hurt feelings.

The original meaning was “a person so mentally deficient as to be incapable of ordinary reasoning” but in Middle English this later in the fourteenth century extended to “a simple man, uneducated person, layman”.  A meaning shift had also happened in Latin, the classical form meaning “an ordinary person, layman; outsider” whereas in the Late Latin it conveyed the sense of “an uneducated or ignorant person”.  This mirrored what happened with the Greek idiōtēs which meant literally “a private person” (ie a layman, someone uninvolved in public affairs) but came to be applied patronizingly to suggest someone “ignorant and uneducated”.  In the plural, the Greek word could mean “one's own countrymen”.  In medieval English common law, the formalized distinction was between an idiot (one who has been without reasoning or understanding from birth) and a lunatic (who became that way later in life), and the difference could be important in determining the responsibility and punishment for crimes committed.  The idiot savant first appeared in medical literature in 1870; idiot box was first used to describe television in 1959 and, given that broadcasting had begun in the 1930s, it’s surprising it took that long to work that out; idiot light, describing the dashboard warning lights in cars, is attested from 1961, a reference to drivers so lacking in mechanical sympathy as not to notice indications of problems or bother to scan gauges.

The adjective idiotic dates from 1713, following the Classical Latin idioticus and the Ancient Greek idiotikos; idiotical is from the 1640s; the noun idiocy (state of being an idiot) is from the 1520s, from idiot on the model of prophecy etc, and the early alternatives included idiotacy (1580s) and idiotry (1590s).  Until well into the twentieth century, blithering was one of the more popular adjectives applied to idiot, the form dating from 1880, the present-participle adjective from the verb blither (to talk nonsense).  A handy adaptation of idiot is the in-joke among IT staff who sometimes classify problems reported by users as ID10T errors.

Comrade Lenin agitprop.

The term useful idiot is from political science and is so associated with Vladimir Lenin (Vladimir Ilyich Ulyanov (1870–1924; first leader of Soviet Russia 1917-1922 & USSR 1922-1924)) that it's often attributed to him, but there's no evidence he ever spoke or wrote the words.  It became popular during the Cold War to describe pro-communist intellectuals and apologists in the West, the (probably retrospective) association with Lenin perhaps because, had the useful idiots actually assisted in achieving a communist revolution there, their usefulness outlived, he'd likely have had them all shot.  Although it took many Western intellectuals decades to recant their support for the Soviet Union (some never quite managed), the watershed was probably Comrade Khrushchev's (1894–1971; Soviet leader 1953-1964) so-called “Secret Speech” (On the Cult of Personality and Its Consequences) to the 20th Congress of the Communist Party of the Soviet Union on 25 February 1956, in which he provided a detailed critique of the rule of comrade Stalin (1878-1953; Soviet leader 1924-1953), especially the bloody purges of the late 1930s.  Some however had already ceased denying what had become obvious to all but avid denialists: The God that Failed, a collection of six essays published in 1949 by poet Stephen Spender (1909-1995) et al, in which the writers laid bare their sense of betrayal and disillusionment with communism because of the totalitarian state built by Stalin, which was in so many ways just another form of fascism.

Idiot, Imbecile & Moron

Idiot, imbecile, and moron were in the early twentieth century used in a psychological classification system, each one assigned to a specific range of abilities.

Idiots: Those so defective that their mental development never exceeds that of a normal child of about two years.

Imbeciles: Those whose development is higher than that of an idiot, but whose intelligence does not exceed that of a normal child of about seven years.

Morons: Those whose mental development is above that of an imbecile, but does not exceed that of a normal child of about twelve years.

Of these three words moron is the newest, created only in the early twentieth century, coined specifically for the purpose of medical diagnosis.  Moron is from the Ancient Greek mōros (foolish, stupid), the root shared with the rare morosoph (a learned fool).  Imbecile dates from the sixteenth century, an adjective meaning "weak, feeble", from the Classical Latin imbecillus (weak, weak-minded), and not until the early nineteenth century did it begin to be used as a noun.  Moron actually replaced “feeble-minded” and “simpleton” (introduced in 1846) but neither was ever standardised in the medical lexicon.  The clinical use is now obsolete but the generalized use of all three is well established as terms of opprobrium for someone who acts in some dopey way or says something stupid; the convention is now they can be applied only to someone not cognitively impaired, an inversion of their original purpose as part of the system of classification.

In the early 1900s, as the profession of psychiatry became more accepted within medicine, the system of classification became increasingly scientific: idiots were those with IQs between 0-25, imbeciles between 26-50 and morons between 51-70.  The interest in the then fashionable field of eugenics saw further refinements with a teleological flavor: the concepts "moral insanity", "moral idiocy" and "moral imbecility" were used by the emerging field of eugenic criminology, which held crime could be reduced by preventing "feeble-minded" people from reproducing, and the US Supreme Court used the terminology in the judgment of the forced-sterilization case Buck v Bell (274 U.S. 200 (1927)).

The later introduction of retard, retarded & retardation was a genuine attempt to de-stigmatize those once labeled idiots, imbeciles & morons.  The process was the same as the invented word moron replacing “simpleton” and “feeble-minded” (the latter ultimately from the Latin flebilis (to be lamented)).  Retarded was from the Latin retardare (to make slow, delay, keep back, or hinder) and was first used in relation to developmental delay in 1895; it was introduced as an alternative to idiot, moron, and imbecile because at the time it wasn’t derogatory, being a familiar technical term from engineering and mathematics, but the associative connection meant that by the 1960s it had become an insult.  As "retarded" and the related clinical terms from psychiatry travelled the euphemism treadmill, they gradually assumed their derogatory connotations.  It seems to be an organic process in language: an original term, neutral in meaning, enters public use and, because of the thing with which it’s associated, becomes pejorative, a process noted also with words which become racial slurs.  It’s a very particular process: “Chinaman” is thought pejorative while “Englishman” is not; “Aussie” is a term of endearment whereas “Paki” is a slur, although that too is circumstantial, commercial television station Channel 9 (Australia) using “The Pakis” in their promotional material for the coverage of the 1983-1984 cricket season.  It wouldn’t now be used.

So, as sympathy emerged for various sensitivities, the search for connotatively neutral replacements settled on variations of “intellectual disability”, the new sub-categories being profound, severe, moderate and mild levels.  The World Health Organisation (WHO) in 1968 published (in an out-of-sequence amendment to the ICD-8 (International Statistical Classification of Diseases and Related Health Problems)) a classification of intellectual disability (ID), based on what they called “relative degrees of cognitive functioning”:

Profound ID:          IQ below 20-25

Severe ID:             IQ 20-25 to 35-40

Moderate ID:         IQ 35-40 to 50-55

Mild ID:                 IQ 50-55 to 70
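Expressed as a sketch, the 1968 bands translate into a simple lookup; this is a hypothetical illustration (nothing the WHO ever published as code) and, because the published band boundaries overlap (e.g. "20-25 to 35-40"), the single cut-offs used here (25, 40, 55, 70) take the upper figure of each range and are an assumption made only for the example:

```python
# Hypothetical helper mapping an IQ score to the WHO's 1968
# intellectual-disability (ID) bands described above.  The cut-offs
# 25, 40, 55 and 70 are assumed from the upper edge of each of the
# WHO's overlapping ranges; they are illustrative only.
def who_1968_band(iq: int) -> str:
    if iq < 25:
        return "Profound ID"
    elif iq < 40:
        return "Severe ID"
    elif iq < 55:
        return "Moderate ID"
    elif iq <= 70:
        return "Mild ID"
    return "Not classified as ID"
```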

The alignment with the old system was idiot=profound, imbecile=moderate/severe and moron or feeble-minded=mild but, by the time the fourth edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM-IV) and the ICD-10 were published in 1994, the profession was moving away from the use of raw IQ scores to something more nuanced, the DSM noting the importance of assessing “dysfunction or impairment” in at least two areas including “communication, self-care, home living, social/interpersonal skills, use of community resources, self-direction, functional academic skills, work, leisure & health and safety”.  The ICD noted “mental retardation is a condition of arrested or incomplete development of the mind, which is especially characterized by impairment of skills manifested during the developmental period, contributing to the overall level of intelligence: cognitive, language, motor and social abilities”.  However, the IQ baselines remained and the DSM-5 refined the model further, noting an intellectual disability should be defined by:

(1) Current intellectual deficits of two or more standard deviations below the population mean, which generally translates into performance in the lowest 3% of a person’s age and cultural group, or an IQ of 70 or below.

(2) Concurrent deficits in at least two domains of adaptive functioning of at least two or more standard deviations, which generally translates into performance in the lowest 3% of a person’s age and cultural group, or standard scores of 70 or below.

Both assessments need to be measured with an individualized, standardized, culturally appropriate, psychometrically sound measure and need to assess (1) conceptual skills (communication, language, time, money & academic), (2) social skills (interpersonal skills, social responsibility, recreation & friendships) and (3) practical skills (daily living skills, work & travel).  US legislation in 2010 required the terms "mental retardation" and "mentally retarded" be removed from federal records and replaced with "intellectual disability" and "individual with an intellectual disability", a change reflected in the DSM-5 (2013).

Saturday, August 12, 2023

Mandarin

Mandarin (pronounced man-duh-rin)

(1) In Imperial China, a member of any of the nine ranks of public officials, each distinguished by a particular kind of button worn on the cap.

(2) By extension, an influential or senior government official or bureaucrat.

(3) In informal (derogatory) use, a pedantic or elitist bureaucrat.

(4) By extension, a member of an elite or powerful group or class, as in intellectual or cultural milieus (usually but not necessarily paid officials of institutions and it tends to be derogatory).  The word is sometimes applied to any authority thought deliberately superior or complex; esoteric, highbrow or obscurantist.

(5) As “Standard Mandarin”, an official language of China and Taiwan, and one of four official languages in Singapore; Putonghua, Guoyu or Huayu (initial capital letter).

(6) A northern Chinese dialect, especially as spoken in and around Beijing (initial capital letter).

(7) A small, spiny citrus tree, Citrus reticulata, native to China, bearing lance-shaped leaves and flattish, orange-yellow to deep-orange loose-skinned fruit, some varieties of which are called tangerines; a small citrus tree (Citrus nobilis), cultivated for its edible fruit; the fruit of such tree, resembling small tangerines.

(8) In botany, any of several plants belonging to the genus Disporum or Streptopus, of the lily family, as S. roseus (rose mandarin) or D. lanuginosum (yellow mandarin), having drooping flowers and red berries.

(9) Of or relating to a mandarin or mandarins.

(10) In ornithology, an ellipsis of mandarin duck (Aix galericulata).

(11) Elegantly refined, as in dress, language or taste.

(12) A color in the orange spectrum.

(13) In ichthyology, as mandarin fish, the term applied to a number of brightly-colored species.

1580–1590: From the Portuguese mandarim & mandarij (or the older Dutch mandorijn), an alteration (by association with mandar (to order)) of the Austronesian Malay menteri & manteri, from the Hindi mantrī and the Sanskrit मन्त्रिन् (mantrin) (minister, councillor), from मन्त्र (mantra) (counsel, maxim, mantra) + -इन् (-in) (an agent suffix).  In Chinese folk etymology, the word originates from Mandarin 滿大人/满大人 (Mǎndàrén) (literally “Manchu important man”).  Mantra was ultimately from the primitive Indo-European root men- (to think) and the evolution of mandarin (in the sense of Chinese civil administration) was influenced in Portuguese by mandar (to command, order).  It was used generically of the several grades of Chinese officials who had entered the civil service (usually by way of the competitive exam); the Chinese equivalent was kwan (public servant) and by the early twentieth century it came to be used of “an important person”, though often in a resentful manner rather than the sense of “a celebrity”.  The use to describe the small fruit was first noted in 1771 and was from the French mandarine, feminine of mandarin, based on the association with the color often used for the robes worn by mandarins in the Chinese civil service.  Mandarin, mandarinship, mandarinism & mandarinate are nouns, mandarinal is an adjective; the noun plural is mandarins.

Lindsay Lohan in mandarin collar The Parent Trap (1998).  It wouldn't now be done because of fears of being cancelled for cultural appropriation.

In fashion, the mandarin collar (a short unfolded stand-up collar on a shirt or jacket) was a style adopted by Western fashion houses and said to be reminiscent of (though sometimes with exaggerated dimensions) the style depicted in the clothing of mandarins in Imperial China.  The mandarin gown (technically a cheongsam, which was actually from the Cantonese 長衫/长衫 (coeng saam) (long robe)) was (1) a tight-fitting and usually brightly colored and elaborately patterned formal woman's dress, split at the thigh (known also as a qipao) & (2) a plain colored, tight-fitting dress with a short split at the thigh, worn as a school uniform by schoolgirls in Hong Kong.  Some dictionaries and food guides include “Mandarin cuisine” as a descriptor of the food associated with the area around Beijing but there’s little evidence of use and the main distinction in the West seems to be between Beijing cuisine and Cantonese cuisine from the south.  However, “Mandarin” is a most popular element in the names of Chinese restaurants in the West.

Lindsay Lohan mixing a Red Bull & mandarin juice while attending an event with former special friend Samantha Ronson, Mandarin Oriental Hotel, London, February 2012.

The use to describe the standard language of the nation was a calque of the Chinese 官話/官话 (Guānhuà) (spoken language of the mandarins), an extension from mandarin (bureaucrat of the Chinese Empire) to the language used by the imperial court and sometimes by imperial officials elsewhere; from this, it was in the twentieth century adopted as a synonym for “Modern Standard Chinese”, although academics and translators note the ambiguity which developed after the use was extended in the early seventeenth century to a number of northern dialects of Chinese, to the extent they consider Mandarin a branch of the Chinese languages consisting of many dialects; Guanhua or Beifanghua.  Standard Mandarin (the language of the elites, media and education) and Mandarin Chinese (the group of Northern Chinese dialects together with Standard Mandarin) are not wholly interchangeable and within China are described differently.

Mandarin duck.

There are some forks of Mandarin Chinese which, but for a few words and phrases, are unintelligible to speakers of Standard Mandarin, and the whole set of Mandarin languages forms part of the broader group of languages and dialects (or topolects) spoken across China.  The evolution of Mandarin to become both the imperial lingua franca and the official “court language” of the Ming and Qing dynasties was in part a pragmatic solution to the mutual unintelligibility of the varieties of spoken Chinese which had emerged over centuries.  It became prevalent during the late Ming (1368-1644) and early Qing (1636-1912) eras and, because of the centralization of Imperial administration, the particular court dialect spoken in Beijing became dominant by the mid-nineteenth century, substantially forming what, by the time of the formation of the People’s Republic of China (PRC) under the Chinese Communist Party (CCP) in 1949, had become “Standard Chinese”.