Tuesday, July 12, 2022

Googly

Googly (pronounced goo-glee)

(1) In cricket, a bowled ball that swerves in one direction and breaks in the other; an off-break bowled with a leg break action.  The delivery was once also known as the bosie or bosie-ball and is now commonly referred to as the wrong'un.

(2) Something protruding; exophthalmic (rare).

(3) A slang term for bulging eyes (archaic).

Circa 1880s: Apparently an invention of modern English but the origin is uncertain.  It may have been from the mid-nineteenth century use of goggle (to stare at admiringly or amorously), although google was during the same era used to describe the Adam's apple, presumably derived from the sense of eyes thought similarly protruding, either literally or figuratively, a meaning alluded to by a popular hero in a contemporary comic strip being named “Goo” (suggesting “sentimental, amorous”).  Googly (and a number of variant spellings) does appear around the turn of the twentieth century to have been in common use as an adjective applied to the eyes.  Googly is a noun & adjective; the noun plural is googlies.

Bernard Bosanquet sending down a googly.

In cricket, a googly is a type of delivery bowled by a right-arm leg spin bowler.  It differs from the classic leg spin delivery in that the ball is propelled from the bowler's hand in a way that, upon hitting the pitch, it deviates in a direction opposite to that the batter expects (ie towards the leg rather than the off stump).  Usually now called the wrong'un, it was once also called the bosie, the eponym referring to English cricketer Bernard Bosanquet (1877-1936) who is believed to have invented the action.  That the googly is Bosanquet’s creation is probably true in the sense that he was the one who practiced the delivery, learning to control and disguise it so it could be deployed when most useful.  However, cricket being what it is, it’s certain that prior to Bosanquet, many bowlers would occasionally (and perhaps unintentionally) have bowled deliveries that behaved as googlies but, being something difficult to perfect, few would have been able to produce it on demand.  What Bosanquet, uniquely at the time, did was add it to his stock repertoire which inspired other bowlers to practice.

The “googly problem” also exists in “twistor theory”, one of the many esoteric branches of theoretical physics understood only by a chosen few.  In twistor theory, the “googly problem” is nerd shorthand for what’s properly known as “the gravitational problem”, an allusion to certain observed behavior differing from that which would be predicted by the mysterious twistor theory, rather the same experience suffered by the batter in cricket who finds his leg stump disturbed by a ball he had every reasonable expectation would harmlessly go through to the keeper down the off side.  As one might expect of a work which involves a "googly problem", the author was an English mathematician, the Nobel laureate in physics, Sir Roger Penrose (b 1931).  It's presumed one of his pleasures has been explaining the googly reference to physicists from places not well acquainted with the charms of cricket.

Bosanquet appears to have perfected his googly between 1897 and 1901 and, as a noun, the word has been attached to it since shortly afterwards, documented in publications in England, Australia and New Zealand from circa 1903.  However, that was just the point at which a certain delivery, so distinctive as to demand an identifier, came to be regarded as the definitive googly, a word which had long been used to describe cricket balls which behaved erratically off the pitch, a use perhaps based on the long use of “google-eyed” to refer to a surprised expression (ie those of the bamboozled batter).  Some etymologists have pondered whether the construct might thus be the phonetic goo + guile but few have ever supported this.  Googly was by the late-nineteenth century a well-known word used adjectively to describe spin-bowling which proved effective but there’s no suggestion it implied a particular type of delivery.  Googly and the odd variant seem to have been a way of referring to balls which relied on a slow delivery and spin through the air to turn off the pitch as opposed to those bowled fast or at medium pace, using the seam on the ball to achieve any movement.  All the evidence suggests the word was used to convey some unusual behavior.

Match report, day three of the second test of the 1891-1892 Ashes series, published in the Australian Star (Sydney), 1 February 1892.

Here, in the one match report are deliveries described both as being googly (ie used as an adjective) and the googler (a noun) but there’s nothing here or anywhere else to suggest either is anything more specific than a reference to beguiling slow bowling.  Everything suggests the existence of both the noun and adjective was deliberate rather than some sloppy antipodean sub-editing.  Whatever the nuances of Briggs' bowling however, it must have been effective because in the same match he took what was then only the third hat-trick (a bowler taking wickets with three successive balls in the one innings) in Test cricket.  There has been the suggestion the adjectival use (apparently an Australian invention) as "googly ones" was an allusion to the idea of how a cricket ball might behave if egg-shaped, this based on the then widely-used local slang "googie" used for eggs.  Googie was from the Irish and Scottish Gaelic gugaí, gogaí & gogaidh (a nursery word for an egg).  Although wholly speculative, the notion has received support but more popular is the idea it was based on the use of googly in the manner of "googly-eyed", slang for eyes which bulged or were in some way irregular.

Match report of a club game, the Australian Star (Sydney), 1 February 1892.

The report of the test match in 1892 isn’t the first known reference to the googly, the word appearing (again adjectively) in The Leader (Melbourne) on 19 December 1885 in a match report of a club game although, despite noting the bowler’s success in taking two wickets, the writer seemed less than impressed with the bowling.  Although googly is now less common ("wrong'un" now often preferred), it is at least gender-neutral and therefore unlikely to fall foul of the language police and be banned; such was the fate of batsman, fieldsman and all the positional variations of third man (though "silly point" and other "sillies" are still with us).  Nor is there any hint of the ethnic insensitivity which doomed the “chinaman” (a left-arm unorthodox spin), now dumped in the bin of words linked with colonial oppression.

Monday, July 11, 2022

Bottomage

Bottomage (pronounced bot-uhm-ree)

In the marine insurance division of Admiralty law, a contract, of the nature of a mortgage, by which the owner of a ship borrows money to make a voyage, pledging the title of the ship as security.

1615-1625: From Middle English as an addition to Admiralty law, modelled on the Dutch bodemerij, equivalent to bodem (bottom; hull of a ship) + -erij (–ry).  Bottom is from the Middle English botme & bottom (ground, soil, foundation, lowest or deepest part of anything), from the Old English botm & bodan (bottom, foundation; ground, abyss), from the Proto-Germanic buthm, butmaz & budmaz, from the primitive Indo-European bhudhno (bottom).  It was cognate with the Old Frisian boden (soil), the Old Norse botn, the Dutch bodem, the Old High German & German Boden (ground, earth, soil), the Icelandic botn and the Danish bund and was related also to the Irish bonn (sole (of foot)), the Ancient Greek πυθμήν (puthmḗn) (bottom of a cup or jar), the Sanskrit बुध्न (budhna) (bottom), the Avestan buna, the Persian بن‎ (bon) (bottom) and the Latin fundus (bottom, piece of land, farm), from which, via French, English gained “fund”.  The suffix -age was from the Middle English -age, from the Old French -age, from the Latin -āticum.  Cognates include the French -age, the Italian -aggio, the Portuguese -agem, the Spanish -aje & the Romanian -aj.  It was used to form nouns (1) with the sense of collection or appurtenance, (2) indicating a process, action, or a result, (3) of a state or relationship, (4) indicating a place, (5) indicating a charge, toll, or fee, (6) indicating a rate & (7) of a unit of measure.  Bottomage is a noun; the noun plural is bottomages.

The sense of bottom as “posterior of a person (the sitting part)” is from 1794; the verb sense “to reach the bottom of” is from 1808 and the expression “bottom dollar” (the last dollar one has) is from 1857.  The meaning "fundamental character or essence" is from the 1570s and the variation “to get to the bottom of some matter” is from 1773; “bottoms up” as the call to finish one's drink is from 1875 while to do or feel something from “the bottom of (one's) heart” is from the 1540s.  The bottom-feeder, originally a technical term in the classification of fishes, dates from 1866, the figurative sense ("one of the lowest status or rank" or an "opportunist who seeks quick profit usually at the expense of others or from their misfortune") noted from 1919.  Bottomage also sometimes appears in Australia as an alternative spelling of "bottom-age" (used in age-based sporting competitions to list the oldest age permitted to participate).

On the bottom.

Bottomage (sometimes referred to as bottomry) is a financing arrangement in maritime law whereby the owner or master of a ship borrows money “upon the bottom (or keel) of it” with an agreement to forfeit the ship itself to the creditor if the loan and interest are not paid at the time nominated, after the ship's safe return.  The contracts tended to be executed when a ship in a foreign port needed emergency repairs and it wasn’t possible to arrange funds in other ways.  The device is now rare because developments in maritime law discounted the bottomage bond's priority as against other liens and improvements in communications made international money transfers more efficient; hardly used since the nineteenth century, it is now of only historic interest.

It was an unusual, hybrid form of financing and one almost wholly peculiar to the pre-modern sea-trade.  It wasn’t a conventional loan because the lender accepted part of the risk, ships sinking not infrequently.  Nor was it insurance because there was nothing which explicitly secured the risk to the merchant's goods.  Bottomage can be thought of as a type of futures contract in that the lender has purchased an option on the venture's final profit.  The risk being greater, a bottomage bond giving no remedy to the lender against the owners of the ship or cargo personally, rates were always much higher than the historic trading average of around 12%.
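The arithmetic behind those higher rates can be sketched in a few lines: because the lender recovered nothing if the ship was lost, the bottomage rate had to carry a premium over the conventional rate just to produce the same expected return.  In the sketch below the 10% chance of loss is an invented figure for illustration; only the 12% conventional rate is drawn from the historic average mentioned above.

```python
# A sketch of why bottomage rates exceeded conventional loan rates: if
# the ship sinks the lender loses everything, so the bond must carry a
# risk premium.  The 10% probability of loss is an assumption for
# illustration; the 12% conventional rate is the historic figure.

def expected_return(principal, rate, p_loss):
    """Expected profit to the lender: principal plus interest repaid
    if the ship arrives safely, nothing recovered if it is lost."""
    return (1 - p_loss) * principal * (1 + rate) - principal

def breakeven_rate(p_loss, safe_rate):
    """Bottomage rate at which the lender's expected profit equals
    that of a conventional loan made at safe_rate."""
    return (1 + safe_rate) / (1 - p_loss) - 1

rate = breakeven_rate(p_loss=0.10, safe_rate=0.12)
print(f"break-even bottomage rate: {rate:.1%}")  # about 24.4%
```

On these assumptions, even a lender indifferent to risk would need roughly twice the conventional rate, consistent with the much higher rates the bonds in fact carried.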

Doctors' Commons (1808), the High Court of Admiralty in session, designed and etched by Thomas Rowlandson (1757–1827) & Auguste Charles Pugin (1768–1832), aquatint by John Bluck (1791–1832), Lambeth Palace Library collection, London.

The Admiralty Court in England dates from the mid-fourteenth century and its creation appears linked to the victory of Edward III’s (1312–1377; King of England 1327-1377) fleet in the battle of Sluys in 1340, one of the opening engagements of the Hundred Years' War (1337-1453).  A specialist tribunal which appears to have been charged with keeping peace at sea and dealing with piracy, as the High Court of Admiralty it developed its own distinct procedures and practices and was attended by a specialist group of solicitors (called proctors), its advocates educated in civil rather than common law; those trained only in common law were not permitted to appear.

Bottomage re-imagined, Lindsay Lohan at the beach.

The advocates of the Admiralty Court were all Doctors of Law and were variously described as belonging to the College of Advocates, the College of Civilians or the Society of Doctors' Commons and specialized in ecclesiastical and civil law.  They were admitted to practice by the Dean of Arches who served the Archbishop of Canterbury and, practicing from Doctors’ Commons, a cluster of buildings on Knightrider Street between St Paul’s cathedral and the north bank of the Thames, they were most concerned with Admiralty and Church law although the advocates also verified and stored documents such as wills and marriage and divorce certificates.  The Doctors’ Commons was unusual in that while it resembled a modern Inn of Court in that it housed a library, a dining hall and rooms from which lawyers practiced, it also contained a court-room where the Admiralty Judge sat.  The arrangement persisted until the reforms of the Victorian Judicature Acts (1873-1875), the College of Advocates abolished in 1865 and the High Court of Admiralty transferred in 1875 to become part of the unified High Court, although the tradition of a specialist Admiralty Judge and a specialist Admiralty Bar continues to this day.  In the US, one unique quirk of admiralty courts seemed to one lawyer to offer a possibility, the argument being that a judgement should be set aside because the flag hanging in the courtroom didn't have the traditional fringe and thus the court was not properly constituted.  This the judge rejected and no attempt was made to seek leave to appeal.

The symbol of the Admiralty Court is the Admiralty Oar, traditionally displayed in court when a trial is in progress.

After the passage of the Judicature Acts, Admiralty jurisdiction moved to the newly created division of Divorce, Probate and Admiralty, referred to within the profession as the 3W (wives, wills & wrecks) and this lasted until the 1970 Administration of Justice Act which shifted divorce to the Family Division and probate to Chancery.  The Admiralty Court became part of the Queen’s Bench Division and claims are now dealt with by one of its two judges: the Admiralty Judge and the Admiralty Registrar, the arrest and release of ships handled by the Admiralty Marshal.

Sunday, July 10, 2022

Lede

Lede (pronounced leed)

(1) In hot-print typesetting, a deliberate misspelling of “lead”, created to avoid the possibility of “lead” being used as text rather than being understood as an instruction.

(2) In journalism, a short summary serving as an introduction to a news story, article or other copy.

(3) In journalism, the main and thus the news item thought most important.

(4) A man; a person (and many related meanings, all obsolete).

Mid twentieth century:  An altered spelling of lead (in the sense used in journalism: “short introductory summary”), used in the printing trades to distinguish it from the homograph lead (the “thin strip of type metal for increasing the space between lines of type”, which was made of lead, pronounced led (rhyming with “dead”)); use in this context has always been rare outside the US.  The historic plural form (Middle English) was lede but in modern use in journalism it’s always ledes.

The historic use has been obsolete since the late sixteenth century.  The Middle English lede & leode (variously: man; human being, person; lord, prince; God; sir; group, kind; race; a people, nation; human race; land, real property, the subjects of a lord or sovereign; persons collectively (and other forms)) were connected to three closely related words (1) the Old English lēod (man; chief, leader; (poetic) prince; a people, people group; nation), (2) the Old English lēoda (man; person; native of a country (and related to lēod)) and (3) the Old English lēode (men; people; the people of a country (which was originally the plural of lēod)).  Lēod was derived from the Proto-West Germanic liud & liudi, from the Proto-Germanic liudiz (man; person; men; people), from the primitive Indo-European hléwdis (man, people), from hlewd- (to grow; people).  The bafflingly large array of meanings of lede in the Middle English is perhaps best encapsulated in the phrase “in all lede” (all the world).  It was related too to the Northumbrian dialectical form lioda (men, people) and was cognate with the German Leute (nation, people) and the Old High German liut (person, people), from the primitive Indo-European root leudh- (people), the source also of the Old Church Slavonic ljudu and the Lithuanian liaudis (nation, people).  Care must be taken if using lede in Sweden.  In Swedish, lede was from the nominal use (masculine inflection) of the adjective led (evil), used in the synonym den lede frestaren (the evil tempter), understood as “the evil one, the loathsome or disgusting one; the devil, Satan”.

In modern use, lede was a deliberate misspelling of lead, one of a number of words created so they could be used in the instructions given to printers while avoiding the risk they might appear in the text being set.  Lede could thus be used to indicate which paragraphs constitute the lede while avoiding any confusion with the word “lead” which might be part of the text to appear in an article.  The other created forms were “dek” (subhead or sub-heading (modified from “deck”)), hed (headline (modified from head)) and kum (used as “lede to kum” or “lede 2 kum” (a placeholder for “lead to come”)).  The technical terms were of some significance when the hot-typesetting process was used for printing.

It’s not clear when lede came into use in the printing trades, the dictionaries which maintain a listing are not consistent and the origin is suggested to date from anywhere between 1950-1976.  The difficulty in determining the date of origin is that the documents on which the words (lede, dek, hed, kum) appeared were ephemeral, created to be used on the day and discarded.  Lede moved from the print factory floor to the newsroom to become part of the jargon of journalism, used to describe the main idea in the first few lines of a story, a device to entice a reader to persist (the idea of click-bait nothing new).  In much of the English-speaking world, the word in this context is spelled “lead” but lede remains common in the US.  In either form, it’s entered the jargon, the phrase "to bury the lede" meaning “unwittingly to neglect to emphasize the most important part of the story” (about an editor’s most damning critique) and “if it bleeds it ledes” means a dramatic story, preferably with explicit images, will be on the front page or be the first item of a news bulletin.

Beyond journalism, lede is sometimes used.  An auction site might for example have a “lede photograph”, the one thought the most tempting and thus likely to attract the biggest audience (ie more click-bait).  In general use, it’s probably not helpful and likely only to confuse.  Henry Fowler (1858-1933) would doubtless have included its general use among what in his Dictionary of Modern English Usage (1926) he derided as a “pride of knowledge”: “...a very un-amiable characteristic and the display of it should sedulously be avoided” and to illustrate his point he said such vanities included “didacticism, French words, Gallicisms, irrelevant allusions, literary critics’ words, novelty hunting, popularized technicalities, quotations, superiority & word patronage.”  One suspects that outside of the print-shop, Henry Fowler would have had little tolerance for lede.

Others agree, suggesting the old sense from Middle English is now used only as a literary or poetic device and it’s otherwise the sort of conscious archaism of which Henry Fowler disapproved, a thing used only to impress others; beyond that, the modern form should never travel beyond wherever it makes sense as a form of jargon.  Still, there were word nerds who thought it might have a future.  The US author William Safire (1929–2009) seemed to find hope in its prospects and even coined “misledeing” which might be tempting for editors searching for novel words with which to berate neophyte writers.

Saturday, July 9, 2022

Inflation

Inflation (pronounced in-fley-shuhn)

(1) In economics, a persistent, substantial rise in the general level of prices, often related to an increase in the money supply, resulting in the loss of value of currency.

(2) Of or pertaining to the act of inflating or the state of being inflated.

(3) In clinical medicine, the act of distending an organ or body part with a fluid or gas.

(4) In the study of the metrics of educational standards, an undue improvement in academic grades, unjustified by or unrelated to merit.

(5) In theoretical cosmology, an extremely rapid expansion in the size of the universe, said to have happened almost immediately after the big bang.

1300-1350: From the Middle English inflacioun & inflacion, from the Old French inflation (swelling), from the Latin inflationem (nominative īnflātiō) (expansion; a puffing up, a blowing into; flatulence), noun of action from the past participle stem of inflare (blow into, puff up) and thus related to īnflātus, the perfect passive participle of īnflō (blow into, expand).  The construct of the figurative sense (inspire, encourage) was in- (into) (from the primitive Indo-European root en (in)) + flare (to blow) (from the primitive Indo-European root bhle- (to blow)).  The meaning "action of inflating with air or gas" dates from circa 1600 while the monetary sense of "a sustained increase in prices" replaced the original meaning (an increase in the amount of money in circulation), first recorded in US use in 1838.  The derived noun hyperinflation dates from 1925 when it was first used to describe the period of high inflation in Weimar Germany; earlier, surgeons had used the word when describing certain aspects of lung diseases.  The adjective inflationary was first used in 1916 as a historic reference to the factors which caused a rapid or sustained increase in prices.

The early meaning related to flatulence, the sense of a “swelling caused by a gathering of ‘wind’ in the body”, before being adopted as a technical term by clinicians treating lung conditions.  The figurative use as in "outbursts of pride" was drawn directly from the Latin inflationem, nominative inflatio, a noun of action from the past participle stem of inflare (blow into; puff up).  The now most common use beyond the tyre business, that of economists to describe statistically significant movement in prices, is derived from an earlier adoption by state treasuries to measure the volume of money in circulation, first recorded in 1838 in the US; the money supply is now counted with a number of definitions (M1, M3 etc).  The first papers in cosmological inflation theory were published circa 1980 by the American theoretical physicist Alan Guth (b 1947).
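Reduced to arithmetic, the economists' measure is simply the percentage change in a price index between two periods.  A minimal sketch, with the index values invented purely for illustration:

```python
# Period-on-period inflation computed as the percentage change in a
# price index.  The index values below are invented for illustration.

def inflation_rate(index_then, index_now):
    """Inflation between two readings of a price index, as a fraction."""
    return index_now / index_then - 1

cpi = {2020: 100.0, 2021: 104.7, 2022: 113.8}  # hypothetical CPI series
for year in (2021, 2022):
    print(f"{year}: {inflation_rate(cpi[year - 1], cpi[year]):.1%}")
```

The definitions of the basket behind the index (and of the money supply, M1, M3 etc) are where the real argument lies; the division itself is the easy part.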

Cosmic Inflation

Cosmic inflation is a theory of exponential expansion of space in the early universe.  This inflationary period is speculated to have begun an indescribably short time after the start of the big bang and to have been about as brief.  Even now, space continues to expand, but at less rapid rates so the big bang is not just a past event but, after fourteen billion-odd years, still happening.
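The scale of that exponential expansion is conventionally expressed in “e-folds”, the number of times the scale factor of the universe grew by a factor of e; inflationary models typically require some sixty or more.  A sketch of the arithmetic (the figure of 60 is the conventional minimum cited in the literature, not a measurement):

```python
import math

# During inflation the scale factor grows exponentially, a(t) = a0 * exp(H*t),
# so N "e-folds" multiply distances by exp(N).  Sixty e-folds is the
# conventional minimum quoted for inflationary models.

def expansion_factor(n_efolds):
    """Factor by which distances grow after n_efolds e-folds."""
    return math.exp(n_efolds)

print(f"60 e-folds -> expansion by a factor of about {expansion_factor(60):.2e}")
```

Sixty e-folds multiplies distances by roughly 10^26, which is what lets a region smaller than a proton end up larger than the observable universe in a tiny fraction of a second.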

Definitely not to scale.

One implication of the scale of the expansion of space is the speculation that things, some of which may have been matter, may have travelled faster than the speed of light, suggesting the speed of light came into existence neither prior to nor at the start of the big bang but after, possibly within a fraction of a second although, within the discipline, other models have been built.  The breaking of the Einsteinian speed limit may suggest conditions in that first fraction of a second of existence were so extreme the laws of physics may not merely have been different but may not have existed or have been even possible.  If that's true, it may be nonsensical to describe them as laws.  Matter, energy and time also may have come into existence later than the start of the big bang.

The theory has produced derivatives.  One notion is, even though it’s possible always to imagine an equation which can express any duration, time may not be divisible beyond a certain point; another that there can never exist a present, only a past or a future.  Perhaps most weird is the idea the (often labeled chaotic but actually unknown) conditions of the very early big bang could have progressed instead to expanding space but without matter, energy or time.  Among nihilists, there’s discussion about whether such a universe could be said to contain nothing, although an even more interesting question is whether a genuine state (non-state?) of nothing is possible even in theory.

Price Inflation

In economics, inflation in the West is suddenly of interest because the rate has spiked.  The memories are bad because the inflation associated with the 1970s & 1980s was finally suppressed by central banks and some political realists good at managing expectations combining to engineer recessions and the consequent unemployment.  After that, in advanced economies, as inflation faded from memory to history, there tended to be more academic interest in the possibility deflation might emerge as a problem.  As the Bank of Japan discovered, high inflation was a nasty thing but experience and the textbooks at least provided case-studies of how it could be tamed whereas deflation, once established and remaining subject to the conditions which led to its existence, could verge on the insoluble.

In most of the West however, deflationary pressures tended to be sectoral components of the whole, the re-invented means of production and distribution in the Far East exporting unprecedented efficiencies to the West, the falling prices serving only to stimulate demand because they happened in isolation of other forces.  However, the neo-liberal model which began to prevail after the political and economic construct of the post-World War II settlement began to unravel was based on a contradictory implementation of supply-side economics: Restricting the money supply while simultaneously driving up asset prices.  That was always going to have consequences (and there were a few), one of which was the GFC (global financial crisis (2008-circa 2011)) which happened essentially because the rich had run out of customers with the capacity to service loans and had begun lending money to those who were never going to be able to pay it back.  Such lending has always happened but at scale, it can threaten entire financial infrastructures.  Whether that was actually the case in 2008 remains a thing of debate but such was the uncertainty at the time (much based on a widespread unwillingness of many to reveal their true positions) that everyone’s worst-case scenarios became their default assumption and the dynamics which have always driven markets in crisis (fear and stampede) spread.

What was clear in the wake of the failure of Lehman Brothers (1847-2008) was that much money had simply ceased to exist, a phenomenon discussed by a most interested Karl Marx (1818-1883) in Das Kapital (1867–1883) and while losses were widespread, of particular significance were those suffered by the rich because it was these which were restored (and more) by what came to be called quantitative easing (QE), actually a number of mechanisms but essentially an increase in the money supply.  The textbooks had always mentioned the inflationary consequences of this but that had been based on the assumption that the supply would spread wide.  The reason the central bankers had little fear of inducing inflation (as measured by the equations which have been honed carefully since the 1970s so as not to frighten the horses) was that the money created was given almost exclusively to the rich, a device under which not only were the GFC losses made good but the QE system (by popular demand) was maintained, the wealth of the rich increasing extraordinarily.  It proved trickle-down economics did work (at least as intended, a trickle being a measure of a very small flow), the inequalities of wealth in society now existing to an extent not seen in more than a century.

Salvator Mundi (circa 1500) by Leonardo da Vinci.  Claimed to be the artist's last known painting, it sold at auction in 2017 for US$450.3 million, still a record and more than double that achieved by the next most expensive, Picasso’s Les femmes d’Alger (Version ‘O’), which made US$179.4 million in 2015.

Post-GFC inflation did happen but it was sectorally specific, mansions and vintage Ferraris which once changed hands for a few million suddenly selling for tens of millions and a Leonardo of not entirely certain provenance managed not far from half a billion.  The generalized inflationary effect in the broad economy was subdued because (1) the share of the money supply held by the non-rich had been subject only to modest increases and (2) the pre-existing deflationary pressures which had for so long been helpful continued to operate.  By contrast, what governments were compelled (for their own survival) to do as the measures taken during the COVID-19 pandemic so affected economic activity had the effect of increasing the money supply in the hands of those not rich and combined with (1) low interest rates which set the cost of money at close to zero, (2) pandemic-induced stresses in labour markets and supply and distribution chains and (3) the effects of Russia’s invasion of Ukraine created what is now called a “perfect storm”.  The inflation rate was already trending up even before the invasion but it has proved an accelerant.  In these circumstances, all that can be predicted is that the text-book reaction of central banks (raising interest rates) will be (1) a probably unavoidable over-reaction to deal with those factors which can be influenced by monetary policy and (2) will not affect the geopolitical factors which are vectors through which inflation is being exported to the West.  Central banks really have no choice other than to use the tools at their disposal and see what happens but the problem remains that while those tools are effective (if brutish) devices for dealing with demand-inflation, their utility in handling supply-inflation is limited.

First world problem: It’s now hard to find a Ferrari 250 GTO for less than US$70 million.

Friday, July 8, 2022

Scrunchie

Scrunchie (pronounced skruhn-chee)

An elastic band covered with gathered fabric, used to fasten the hair, as in a ponytail (also spelled scrunchy).

1987: The construct was scrunch + -ie.  Scrunch has existed since the early nineteenth century, either an intensive form of crunch, ultimately derived from the onomatopoeia of a crumpling sound, or a blend of squeeze + crunch.  The suffix -ie was a variant spelling of -ee, -ey & -y and was used to form diminutive or affectionate forms of nouns or names.  It was used also (sometimes in a derogatory sense) to form colloquial nouns signifying the person associated with the suffixed noun or verb (eg bike: bikie, surf: surfie, hood: hoodie etc).  Scrunchie is now used almost exclusively to describe the hair accessory.  Historically, the older form (scrunchy) was used for every other purpose but since the emergence of the new spelling there’s now some overlap.  Never fond of adopting an English coining, in French, the term is élastique à cheveux (hair elastic).  The alternative spelling is scrunchy (used both in error and in commerce).  Scrunchie is a noun; the noun plural is scrunchies.  The adjectives scrunchier & scrunchiest are respectively the comparative & superlative forms of scrunchy; as applied to scrunchies, they would be used to describe a scrunchie's relative degree of scrunchiness (a non-standard noun).

Mean Girls (2004) themed scrunchies are available on-line.

It's not unlikely that in times past, women took elastic bands (or something with similar properties) and added a decorative covering to create their own stylized hair-ties but as a defined product, the scrunchie appears to have first been offered commercially in 1963-1964, although the design was not patented until 1987 when night club singer Rommy Hunt Revson (1944–2022) named the hair accessory the Scunci (after her pet toy poodle), the name scrunchie an organic evolution because the fabric "scrunched up".  They were very popular in the 1990s although some factions in the fashion industry disparaged them, especially offended (some people do take this stuff seriously) if seeing them worn on the wrist or ankle.  The scrunchie is a classic case-study in the way such products wax and wane in popularity, something wholly unrelated to their functionality or movements in price; while the elasticity of scrunchies has remained constant, the elasticity of demand is determined not by supply or price but by perceptions of fashionability.  When such a low-cost item becomes suddenly fashionable, influenced by appearances in pop-culture, use extends to a lower-income demographic which devalues the appeal and the fashion critics declare the thing "a fashion faux pas" and use declines among all but those oblivious of or indifferent to such rulings.  In the way of such things however, products often don't go away but lurk on the edges of respectability, the comebacks variously ironic, part of a retro trend or something engineered by industry, the tactics now including the use of the millions of influencers available for hire.

The Dinner Scrunchie

A more recent evolution is the enlarged version called the Dinner Scrunchie, so named, the brand suggests, because it's "elevated enough to wear to dinner".  They're available from MyKitsch in black and designer colors and, covered with semi-sheer chiffon, they're eight inches (circa 200mm) in diameter, about the size of a small dinner plate.  Breakfast and lunch scrunchies seem not to be a thing but those gaps in the market are catered to by the Brunch Scrunchie which, while larger than most, is smaller and cheaper than the dinner version; it appears to be about the size of a bread & butter plate.

Rita Ora (b 1990) in fluoro scrunchie, New York City, May 2014.

The most obvious novelty of the bigger scrunchies is of course the large size and because that translates into greater surface area, in the minds of many, thoughts turn immediately to advertising space.  There are possibilities but because of the inherent scrunchiness, they're really not suitable for text displays except perhaps for something simple like X (formerly known as Twitter) although Elon Musk (b 1971) probably thinks whatever else X may require, it's not brand awareness or recognition.  Where a message can be conveyed with something as simple as a color combination (such as the LGBTQQIAAOP's rainbow flag), the scrunchie can be a good billboard.

Thursday, July 7, 2022

Emerge

Emerge (pronounced ih-murj)

(1) To come forth into view or notice, as from concealment or obscurity

(2) To rise or come forth from or as if from water or other liquid.

(3) To come up or arise, as a question or difficulty.

(4) To come into existence; develop.

(5) To rise, as from an inferior or unfortunate state or condition.

1560s: From the Middle French émerger, from the Latin ēmergere (bring forth, bring to light; intransitively "arise out or up, come forth, come up, come out, rise"), an assimilated form of the construct ē- (a variant of the prefix ex- (out, forth)) + mergere (to dive, dip, sink).  The notion was of rising from a liquid by virtue of buoyancy.  The verb re-emerge (also as reemerge) (to emerge again or anew) dates from 1775 and other forms evolved: re-emerged (1778), re-emerging (1788) & re-emergence (1801).  The noun emergence dates from the 1640s, initially in the sense of "an unforeseen occurrence", from the French émergence, from the Middle French émerger, from the Latin ēmergere; the modern meaning (process of coming forth) emerged in 1704, the meanings co-existing until, by circa 1735, the new had entirely replaced the old.  The noun emersion (reappearance, act of emerging) dates from the 1630s; it was a noun of action from the past-participle stem of the Latin ēmergere and originally was used of eclipses and occultations.  The adjective emergent was actually quite old, noted from the late fourteenth century in the sense of "rising from what surrounds it, coming into view", from the Latin emergentem (nominative emergens), present participle of ēmergere.  The present participle is emerging and the past participle emerged.

Emerge, emanate and issue all mean to come forth but differ in nuance.  Emerge is used of coming forth from a place shut off from view, or from concealment, or the like, into sight and notice, in the sense that the sun emerges from behind the clouds.  Emanate is used of intangible things, such as light or ideas, spreading from a source; both gossip and rumors can emanate from sources reliable or otherwise.  Issue is often used of a number of persons, a mass of matter, or a volume of sound or the like, coming forth through any outlet or outlets; smoke is said to issue from a chimney, something which, if it be white smoke from the Sistine Chapel, is associated with the emergence of a new Roman Catholic Pope.

Emerging from the magic circle

With Boris Johnson’s curious, slow-motion resignation as prime-minister of the UK, attention begins to turn to the way the Conservative and Unionist (Tory) Party elects its next leader to become the latest custodian of one of the world’s nuclear arsenals.  These days, it takes longer than once it did for the Tories to do this for it’s now an exhaustive process consisting of (1) a multi-round contest in which all members of parliament (MPs) vote for whichever of their colleagues have been nominated, the unfortunate chap (and these days some Tory chaps are women) receiving the fewest votes dropping out so the next round may proceed until the field is whittled down to two, at which point (2) that pair is submitted to the party membership at large to make its choice.  The winner will assume the party leadership and, with the Tories enjoying in the House of Commons the solid majority Mr Johnson secured at the last election, will go to the palace to kiss hands and be invited to become Her Majesty’s Prime-Minister of the United Kingdom & First Lord of the Treasury.

This democratic (and leisurely) process is relatively new to the Tory party, its leaders elected by a formal vote only since 1965 and even then, until 2001, it was only MPs who voted.  Prior to that, Tory leaders were said to “emerge” from what was known as a “magic circle” and although never as mysterious as some suggested, it was an opaque process, conducted by party grandees.  The classic example was in 1957 when the choice was between Harold MacMillan (1894-1986; UK prime-minister 1957-1963) and Rab Butler (1902-1982).  To his office in the House of Lords, the lisping (fifth) Lord Salisbury (1893-1972) summoned those he thought good chaps (women at this point hadn’t yet become chaps) and asked “Hawold or Wab?”  Hawold prevailed.

The change in process in 1965 came about at the insistence of Sir Alec Douglas-Home (1903-1995 and the fourteenth Earl of Home before disclaiming his peerage in 1963 to become prime-minister (1963-1964)).  Since 1957, the country had changed and there was much criticism of the murky manner by which Sir Alec had become party leader and there was a clamour, even within the party, both to modernize and appear more transparently democratic.  From this point, unleashed were the forces which would in 1975 see Margaret Thatcher (1925-2013; UK prime-minister 1979-1990) elected leader but the first beneficiary of the wind of change was Edward "Ted" Heath (1916-2005; UK prime-minister 1970-1974), a grammar school boy who replaced a fourteenth earl.  Notably, to appear more modern, Heath in 1965 didn't repair (as he had with MacMillan when he emerged in 1957) to the Turf Club for a celebratory meal of oysters, game pie and champagne which “…might have made people think a reactionary regime had been installed”.  It can be hard now to understand quite what a change Heath's accession in 1965 flagged; the Tory Party previously had had leaders from the middle class but never the lower middle class.  The significance of what emerged in 1965 was less the new leader than a changed Tory Party in a changed country.

So Tory leaders no longer emerge from a magic circle but a remarkable number are emerging to offer themselves as candidates, so clearly it’s still thought a desirable job although not what it was.  MacMillan once marveled that his predecessor, William Ewart Gladstone (1809–1898; four times UK prime-minister 1868-1894), prime-minister at a time when the British Empire spanned the globe, managed every year to spend four months at his country house, an arrangement one suspects Mr Johnson would have found most tolerable.  Ideologically, the indications are there will be little to choose between those on offer, the extent of the variation probably something like that once described by Georges Clemenceau (1841–1929; Prime Minister of France 1906-1909 & 1917-1920) as the difference between "a politician who would murder their own mother and one prepared to murder only someone else's mother".

Mr Johnson will no doubt reflect on his time in Downing Street and perhaps conclude a few things might have been done differently but, after all, he has been prime-minister, one of the few to make it to the top of the greasy pole.  When the office beckoned Lord Melbourne (1779-1848; UK prime-minister 1834 & 1835-1841), he was disinclined to accept, fearing it would be “…a damned bore” but his secretary persuaded him, saying “…no Greek or Roman ever held the office and if it lasts but three months it’ll still be worthwhile to have been Prime Minister of England”.  Having apparently appointed himself Prime-Minister Emeritus, as he sits in No 10, plotting and scheming ways to remain, Mr Johnson can at least remember and be glad.

For a brief, shining moment (2019-2021), the world had three leaders with nuclear weapons and outstanding haircuts.

Alexander Boris de Pfeffel Johnson (b 1964; Prime Minister of the United Kingdom 2019-2022 (probably)) (left).

Donald John Trump (b 1946; President of the United States 2017-2021) (centre).

Kim Jong-un (b circa 1983; Supreme Leader of DPRK (Democratic People’s Republic of Korea (North Korea)) since 2011) (right).

Wednesday, July 6, 2022

Furlong

Furlong (pronounced fur-lawng or fur-long)

A unit of distance, equal to 220 yards (201.168 m) or ⅛ mile (0.201 km).

Pre 900: From the Middle English furlong, from the Old English furlang (length of a furrow), the construct being furh (furrow) + lang (long).  The Middle English forms of furrow were forwe & furgh; related were the Old Frisian furch, the Old High German fur(u)h and the later German Furche.  In Latin, porca was the ridge between furrows.  For over a thousand years, the accepted abbreviation has been fur.

Now used exclusively (though usually informally) in horse racing, the furlong had the same ad-hoc origin as much of the old English units of weights and measures (codified in 1925 as the Imperial System which replaced the Winchester Standards (1588-1825) and in official use in the British Empire from 1926).  The furlong (220 yards, 201.168 m) was based on the length of an average plowed furrow (hence “furrow-long”, or furlong) in the old English common-field system (sometimes called the open-field system, the fields typically of 10 acres).  Each furrow ran the length of a 40 × 4-rod acre (660 feet).

The standardization of linear units such as the yard, foot, and inch was undertaken by governments between 1266-1303, taking what were the historically acknowledged sizes of rods, furlongs, and acres as fixed and simply defining them as the newly standardized units.  Thus, the furlong, often measured as 625 northern (German) feet, became 660 standard English feet, and the mile, always 8 furlongs, became 5,280 feet.  The medieval origins of what became the Imperial System of weights and measures were often based on the capacity of man and beast, reflecting the priorities of the time, a furlong the distance a team of oxen could plough without resting and an acre the amount of land tillable by one man behind one ox in one day.
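The unit relationships described above (a rod of 16.5 feet, a furrow of 40 rods, eight furlongs to the mile) reduce to a few lines of arithmetic; the snippet below is merely an illustrative check using the modern international foot of exactly 0.3048 m, not anything drawn from a historical source:

```python
# Imperial length relationships, built up from the rod as described above.
FEET_PER_ROD = 16.5            # a rod (also called a perch) is 16.5 feet
RODS_PER_FURLONG = 40          # a furrow ran the 40-rod length of the acre
FURLONGS_PER_MILE = 8          # the mile has always been 8 furlongs

FEET_PER_FURLONG = FEET_PER_ROD * RODS_PER_FURLONG      # 660 feet
YARDS_PER_FURLONG = FEET_PER_FURLONG / 3                # 220 yards
FEET_PER_MILE = FEET_PER_FURLONG * FURLONGS_PER_MILE    # 5,280 feet
METRES_PER_FURLONG = FEET_PER_FURLONG * 0.3048          # 201.168 m

print(f"1 furlong = {FEET_PER_FURLONG:.0f} ft = {YARDS_PER_FURLONG:.0f} yd "
      f"= {METRES_PER_FURLONG:.3f} m; 1 mile = {FEET_PER_MILE:.0f} ft")
```

Each figure it prints matches those quoted in the text, which is why the furlong converts to the oddly precise-looking 201.168 m.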

Grand National Course, Aintree Racecourse, Liverpool

Furlong was used from the ninth century to translate the Latin stadium (625 feet), one-eighth of a Roman mile, thus the English word came to be understood as "one-eighth of an English mile".  This meant the English mile was different in length from the Roman one but this had surprisingly few practical consequences because the mile was so rarely used in legal documents.  By comparison, furlong became one of the standard measures in property titles and land deed records.  For that reason it was the mile and not the furlong which was redefined during the reign of Elizabeth I (1533–1603; Queen of England and Ireland 1558-1603).

Although the UK has substantially abandoned the use of imperial measures, they remain substantially in use in the US, Myanmar (Burma) and Liberia although industrial use in the US has long switched to metric.  Having "got Brexit done", Prime-Minister Boris Johnson (b 1964, UK prime-minister since 2019), freed from the oppression of bureaucrats in Brussels, recently permitted nostalgic grocers again to price their products in pounds and ounces alongside kilograms and grams, a human right long denied by EU diktat.  Other hold-outs include international aviation, which is a mix: aircraft are now all made with metric measurements (although in casual use, wingspans etc are often spoken of in imperial) yet fuel loads are measured exclusively in kilograms and altitude always in feet (and all radio communication is in English, reflecting the influence of the US in the 1940s when the regulations were drafted).  That may also explain why computer monitors are still measured in inches, as are the wheels of cars (despite the occasional efforts of the French to nudge this niche to things metric).

Tuesday, July 5, 2022

Cacophony

Cacophony (pronounced kuh-kof-uh-nee)

(1) A harsh discordance of sound; dissonance.

(2) The use of inharmonious or dissonant speech sounds in language.

(3) In music, the frequent use of discords of a harshness and relationship difficult to understand.

1650-1660: From the sixteenth century French cacophonie (harsh or unpleasant sound), from the New Latin cacophonia, from the Ancient Greek κακοφωνία (kakophōnía) (ill or harsh sounding), from kakophonos (harsh sounding), the construct being κακός (kakós) (bad, evil) + φωνή (phōnḗ) (voice, sound), from the primitive Indo-European root bha- (to speak, tell, say).  The source of kakós was the primitive Indo-European root kakka- (to defecate), something which may have been in mind in that notable year 1789, when cacophony was first used to mean "discordant sounds in music".  Cacophony is a noun, cacophonic & cacophonous are adjectives, cacophonously an adverb and the noun plural is cacophonies.  The most natural antonym is harmony although musicologists tend to prefer the more precise euphony.

Noise as music

There were twentieth century composers, all of whom thought themselves both experimental and somewhere in the classical tradition, who wrote stuff which to most audiences sounded more cacophonous than melodic.  Some, like Béla Bartók (1881–1945) and Arnold Schönberg (1874–1951) had a varied output but for untrained listeners seeking music for pleasure rather than originality or the shock of something new, the work of Karlheinz Stockhausen (1928–2007), Charles Ives (1874–1954) and Philip Glass (b 1937) was no fun.  Some critics however, trained or otherwise, claim to enjoy many of these cul-de-sacs of the avant-garde and find genius among the noise.  Comrade Stalin (1878-1953; Soviet leader 1924-1953) would have called them formalists.

In popular (sic) culture, Lou Reed (1942–2013) in 1975 released Metal Machine Music, either (1) to fulfil a contractual obligation to a record label with which he no longer wished to be associated, (2) to annoy critics and others he didn’t like, (3) as a piece of electronic music, (4) to win a bet with Andy Warhol (1928-1987) or (5) because he was suffering an amphetamine induced psychosis.  Known as MMM or 3M by fans and categorized by most as noise, drone, industrial or minimal, it’s about two hours of modulated feedback running at different speeds.  Despite the derision attracted at the time, it’s since built a cult following and has been performed live, one approving reviewer suggesting MMM is best understood as “…electricity falling in love with itself”.  The original release was a double album on twelve-inch vinyl and featured a “locked groove” on the final track meaning the noise endlessly would loop, theoretically forever.  This trick couldn’t be done on the new medium of the CD but there were no complaints the omission detracted from the experience.