Monday, October 25, 2021

Myriad

Myriad (pronounced mir-ee-uhd)

(1) Originally, ten thousand (10,000) (archaic).

(2) A very great, innumerable or indefinitely great number of something; having innumerable phases, aspects, variations, etc.

1545-1555: From the French myriade, from the Late Latin mȳriadem (accusative of mȳrias (genitive mȳriadis)), from the Ancient Greek μυριάς (muriás) (genitive μυριάδος (muriádos)) (the number 10,000), from μυρίος (muríos) (plural μυρίοι (muríoi)) (numberless, innumerable, countless, infinite; boundless).  In Ancient Greek, a myriad was “the biggest number able to be expressed in one word”.  The ultimate origin is unknown but there may be a link with the primitive Indo-European meue-, the source also of the Hittite muri- (cluster of grapes), the Latin muto (penis) and the Middle Irish moth (penis).  The cardinal (ten thousand) is myriad, the ordinal (ten-thousandth) is myriadth, the multiplier (ten-thousandfold) is myriadfold and the collective is myriad.  Myriad is a noun and adjective, myriadisation is a noun, myriadth is an adjective, myriadfold is an adjective & adverb and myriadly is an adverb; the noun plural is myriads.

In a hangover from the medieval habit, in the sixteenth century myriad initially was used in accordance with the Greek etymological meaning (ten thousand (10,000)) but as early as the late 1500s it was used to refer to “a countless number or multitude of whatever was being discussed” and thus assumed the meaning “lots of; a really big number of”.  From at least the mid-eighteenth century, when modifying a plural noun, it predominantly meant “great in number; innumerable, multitudinous” and that has long been the default meaning; references to the exact numeric origin (10,000) exist only to list the earlier sense as archaic or as a footnote explaining the use in some historic text.  The most pure of the style guides (their editors used to fighting losing battles) still note that when used as an adjective the word myriad requires neither an article before nor a preposition after.  The result of that strictness is that a phrase like “a myriad of stars”, where “myriad” acts as part of a nominal (or noun) group, is said to be tautological but so much has the pattern of use evolved that most probably would find the alternative, though elegant, somehow lacking.

The still rare noun myriadisation (crowdfunding) was a creature of social media, the idea being to attract funding in small increments from many (maybe 10,000 or more) and the concept was one of the (many) reasons Barack Obama's (b 1961; US president 2009-2017) campaign in 2008 to secure the Democratic Party nomination for that year's US presidential election was better funded than that of crooked Hillary Clinton (b 1947; US secretary of state 2009-2013).  One topical variant was permyriad, the construct being per- + myriad.  The per- prefix was borrowed from the Latin per-, from the Proto-Italic peri- and related to per (through).  As a word-forming element, it's now rare except in science where (1) it's used to form nouns & adjectives denoting the maximum proportion of one element in a compound and (2) it was added to the name of an element in a polyatomic ion to denote the number of atoms of that element (usually four).  Historically, in verbs it denoted the senses (3) “through”, (4) “thoroughly” and (5) “to destruction” while (6) in adjectives and adverbs it denoted the sense of “extremely”.  Permyriad means “one out of every ten thousand” (ie one percent of one percent), a concept which was discussed in the aftermath of the Global Financial Crisis (GFC 2008-2011) when the idea spread of “the one percent” as the section of the population which held a disproportionate and socially destructive share of the world's wealth and property.  The point was made that the extent of the distortion was better illustrated were the math done with the “one percent of the one percent” so the definition of permyriad was in the news although the word never staged much of a revival.
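The arithmetic can be sketched in a few lines of Python (the permyriad function and the example figures here are purely illustrative, not from any standard library):

```python
# Express a proportion in permyriads (parts per ten thousand).
# Illustrative helper -- not a standard-library function.
def permyriad(part: float, whole: float) -> float:
    """Return part/whole scaled to parts per ten thousand."""
    return 10_000 * part / whole

# One permyriad of exactly ten thousand is one.
assert permyriad(1, 10_000) == 1.0

# A permyriad is one percent of one percent (0.01%), allowing for
# the rounding inherent in binary floating point.
assert abs(1 / 10_000 - 0.01 * 0.01) < 1e-12

# 50 people in a million is half a permyriad.
print(permyriad(50, 1_000_000))  # 0.5
```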

Myriad is often the word of choice when writers are searching for a fancy way to say “lots” and it's overused.  It no longer is used to convey “10,000” or even a number close to that but it should evoke thoughts of a big number.  Synonyms for myriad in its modern sense include countless, endless, infinite & innumerable and (even in Los Angeles) rehab options may not be quite that plentiful.  A better fancy choice would have been plethora, the synonyms for which include excess, abundance, glut, surfeit, superfluity & slew.  It's true plethora & myriad are often used interchangeably but they really are subtly different.  Plethora (the plural plethorae or plethoras) was from the Late Latin plēthōra, from the Ancient Greek πληθώρη (plēthṓrē) (fullness, satiety), the construct being πλήθω (plḗthō) (to be full) + -η (-ē) (the nominal suffix).  It means “an excessive amount or number; an abundance”.  In use, plethora is usually followed by “of” except for the technical use in clinical medicine where it describes “an excess of blood in the skin, especially in the face and especially chronically”.

It’s probably not true that during Antiquity the Athenians never spoke of matters where values higher than 10,000 were discussed but they appear never to have created a single-word descriptor of anything bigger.  The structural functionalists among linguistic anthropologists would find that unremarkable because in the society of the age, the need was so rare.  Doubtless, there would have been Greeks who speculated on the number of grains of sand on the sea-shore or the stars in the night sky and their astronomers even attempted to estimate the distances to stars but these wouldn’t have been things often heard in everyday conversation and such calculations were expressed in equations.  The need for words came later and advances in the sciences including cosmology, particle physics and virology meant millions, billions, trillions and later multipliers became genuinely useful.  The standardization however didn’t happen until well into the twentieth century.  Until then, in British English a billion was a million millions (1,000,000,000,000), something then really of use only to cosmologists whereas in US use it was calculated as a thousand millions (1,000,000,000) and thus a word with some utility in public finance (although in Weimar Germany’s (1918-1933) period of hyper-inflation (1923) the UK’s definition could have been used of the Papiermarks).

50 trillion (50 Billionen, 5×10¹³) mark note, Weimar Germany, 1923.

Since then of course, because a billion dollars isn’t what it was, we now routinely hear of trillions (often in the form of public debt) while billionaires are the new eligible bachelors and divorcees.  For those dealing with things like atoms, neutrinos and such, there is a point (probably anything beyond a billion) where writing out all those zeros becomes either tedious or impractical so the words are useful.  For mathematicians, the numbers are expressed using exponents: in the expression xⁿ, x is the base and n the exponent; n is the power to which x is raised, thus the common expression "to the power of", so 10²=100 and 10³=1,000.  Of course, numbers being infinite, even this convention can in theory become unmanageable, hence the attraction of something like 10 raised to the power of a googol to represent a googolplex, a googol being 10¹⁰⁰.  If the need arises (say the discovery that the universe is much bigger than thought or Elon Musk gets really rich) words may make values easier for humans to follow than numerals.
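For the curious, the arbitrary-precision integers of a language like Python make such values easy to play with (the variable names below are just for illustration):

```python
# A googol is 10 to the power of 100: a 1 followed by 100 zeros.
googol = 10 ** 100
assert len(str(googol)) == 101          # one "1" plus 100 zeros
assert str(googol) == "1" + "0" * 100

# A googolplex (10 to the power of a googol) has a googol zeros,
# far too many digits ever to write out in full.

# The old British billion versus the US billion:
us_billion = 10 ** 9        # a thousand millions
old_uk_billion = 10 ** 12   # a million millions
assert old_uk_billion == 1_000 * us_billion
```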

Sunday, October 24, 2021

Dot

Dot (pronounced dot)

(1) A small, roundish mark made with or as if with a pen.

(2) A minute or small spot on a surface; speck.

(3) Anything relatively small or speck-like.

(4) A small specimen, section, amount, or portion; a small portion or specimen (the use meaning “a lump or clod” long obsolete).

(5) In grammar, a punctuation mark used to indicate the end of a sentence or an abbreviated part of a word; a full stop; a period.

(6) In the Latin script, a point used as a diacritical mark above or below various letters, as in Ȧ, Ạ & Ċ.

(7) In computing, a separator in internet addresses etc and in file names a separation device (historically a marker between the file name and file type when only one dot per name was permitted in early file systems, the best known of which was the 8.3 convention used by the various iterations of CP/M & DOS (command.com, image.tif, config.sys etc)).

(8) In music, a point placed after a note or rest, to indicate that the duration of the note or rest is to be increased one half. A double dot further increases the duration by one half the value of the single dot; a point placed under or over a note to indicate that it is to be played staccato.

(9) In telegraphy. a signal of shorter duration than a dash, used in groups along with groups of dashes (-) and spaces to represent letters, as in Morse code.

(10) In printing, an individual element in a halftone reproduction.

(11) In printing, the mark that appears above the main stem of the letters i, j.

(12) In the sport of cricket, as “dot ball” a delivery not scored from.

(13) In the slang of ballistics, as “dotty”, (1) buckshot, the projectile from a shotgun or (2) the weapon itself.

(14) A female given name, a clipped form of Dorothea or Dorothy.

(15) A contraction in many jurisdictions for Department of Transportation (or Transport).

(16) In mathematics and logic, a symbol (·) indicating multiplication or logical conjunction; an indicator of the dot product of vectors: X · Y.

(17) In mathematics, the decimal point (.), used for separating the fractional part of a decimal number from the whole part.

(18) In computing and printing, as dot matrix, a reference to the method of assembling shapes by the use of dots (of various shapes) in a given space.  In casual (and commercial) use it described impact printers which used hammers with a dot-shaped tip to strike a ribbon which impacted the paper (or other surface) to produce representations of shapes which could include text.  Technically, laser printers also use a dot-matrix in shape formation but the use to describe impact printers caught on and became generic.  The term “dots per inch” (DPI) is a measure of image intensity and a literal measure of the number of dots in an area.  Historically, impact printers were sold on the basis of the number of pins (hammers; typically 9, 18 or 24) in the print head which was indicative of the quality of print although some software could enhance the effect.

(19) In civil law, a woman's dowry.

(20) In video gaming, the abbreviation for “damage over time”, an attack that results in light or moderate damage when it is dealt, but that wounds or weakens the receiving character, who continues to lose health in small increments for a specified period of time, or until healed by a spell or some potion picked up.

(21) To mark with or as if with a dot or dots; to make a dot-like shape.

(22) To stud or diversify with or as if with dots (often in the form “…dotting the landscape…” etc).

(23) To form or cover with dots (such as “the dotted line”).

(24) In colloquial use, to punch someone.

(25) In cooking, to sprinkle with dabs of butter, chocolate etc.
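The dot-matrix principle mentioned in sense (18) can be sketched in a few lines: a character is just a grid of dots, each either struck or not (the 5×7 bitmap for “A” below is a typical but illustrative pattern, not taken from any particular printer):

```python
# A minimal sketch of the dot-matrix idea: a glyph is formed by
# firing dots (here "#") in a fixed grid, much as an impact print
# head's pins strike the ribbon. Each row is a 5-bit pattern.
A_5x7 = [
    0b01110,
    0b10001,
    0b10001,
    0b11111,
    0b10001,
    0b10001,
    0b10001,
]

def render(glyph):
    """Render a list of 5-bit row patterns as text, one '#' per dot."""
    return "\n".join(
        "".join("#" if row & (1 << (4 - col)) else " " for col in range(5))
        for row in glyph
    )

print(render(A_5x7))
```

More pins in the head simply means more rows in the grid, which is why the 24-pin printers of the era produced noticeably crisper text than the 9-pin models.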

Pre 1000: It may have been related to the Old English dott (head of a boil) although there’s no evidence of such use in Middle English.  Dottle & dit were both derivatives of the Old English dyttan (to stop up (and again, probably from dott)) and were cognate with the Old High German tutta (nipple), the Norwegian dott and the Dutch dott (lump).  Unfortunately there seems no link between dit and the modern slang zit (pimple), a creation of US English unknown until the 1960s.  The Middle English dot & dotte were from the Old English dott in the de-elaborated sense of “a dot, a point on a surface”, from the Proto-West Germanic dott, from the Proto-Germanic duttaz (wisp) and were cognate with the Saterland Frisian Dot & Dotte (a clump), the Dutch dot (lump, knot, clod), the Low German Dutte (a plug) and the Swedish dott (a little heap, bunch, clump).  The use in civil law jurisdictions where dot was a reference to “a woman's dowry” dates from the early 1820s and was from the French, from the Latin dōtem, accusative of dōs (dowry) and related to dōtāre (to endow) and dāre (to give).  For technical or descriptive reasons dot is a modifier or modified as required including centered dot, centred dot, middle dot, polka dot, chroma dot, day dot, dot-com, dot-comer (or dot-commer), dot release and dots per inch (DPI).  The synonyms can (depending on context) include dab, droplet, fleck, speck, pepper, sprinkle, stud, atom, circle, grain, iota, jot, mite, mote, particle, period, pinpoint, point, spot and fragment.  Dot & dotting are nouns & verbs, dotter is a noun, dotlike & dotal are adjectives, dotted is an adjective & verb and dotty is a noun & adjective; the noun plural is dots.

Although in existence for centuries, and revived with the modern meaning (mark) in the early sixteenth century, the word appears not to have been in common use until the eighteenth and in music, the use to mean “point indicating a note is to be lengthened by half” appears by at least 1806.  The use in the Morse code first used on telegraphs dates from 1838 and the phrase “on the dot” (punctual) is documented since 1909 as a reference to the (sometimes imagined) dots on a clock’s dial face.  In computing, “dot-matrix” (printing and screen display) seems first to have been used in 1975 although the processes referenced had by then been in use for decades.  The term “dotted line” is documented since the 1690s.  The verb dot (mark with a dot or dots) developed from the noun and emerged in the mid eighteenth century.  The adjective dotty as early as the fourteenth century meant “someone silly” and was from “dotty poll” (dotty head), the first element from the earlier verb dote.  By 1812 it meant also literally “full of dots” while the use to describe shotguns, their loads and the pattern made on a target was from the early twentieth century.  The word microdot was adopted in 1971 to describe “tiny capsules of lysergic acid diethylamide” (LSD or “acid”); in the early post-war years (most sources cite 1946) it was used in the espionage community to describe an extremely reduced photograph able to be disguised as a period dot on a typewritten manuscript.

Lindsay Lohan in polka-dots, enjoying a frozen hot chocolate, Serendipity 3 restaurant, New York, 7 January 2019.

The polka-dot (a pattern consisting of dots of uniform size and arrangement, especially on fabric) dates from 1844 and was from the French polka, from the German Polka, probably from the Czech polka (the dance, literally “Polish woman” (Polish Polka), the feminine form of Polak (a Pole)).  The word might instead be a variant of the Czech půlka (half (půl the truncated version of půlka used in special cases (eg telling the time à la the English “half four”))), a reference to the half-steps of Bohemian peasant dances.  It may even be influenced by or an actual merger of both.  The dance first came into vogue in 1835 in Prague, reaching London in the spring of 1842; Johann Strauss (the younger) wrote many polkas.  Polka was a verb by 1846 as (briefly) was polk; notoriously polka-dot is sometimes mispronounced as poke-a-dot.

In idiomatic use, to “dot one's i's and cross one's t's” is to be meticulous in seeking precision; an attention to even the smallest detail.  To be “on the dot” is to be exactly correct or to have arrived exactly at the time specified.  The idea of “joining the dots” or “connecting the dots” is to make connections between various pieces of data to produce useful information.  In software, the process is literal in that it refers to the program “learning” how accurately to fill in the missing pieces of information between the data points generated or captured.  “The year dot” is an informal expression which means “as long ago as can be remembered”.  To “sign on the dotted line” is to add one’s signature in the execution of a document (although there may be no actual dotted line on which to sign).

Dots, floating points, the decimal point and the Floating Point Unit (FPU) 

When handling numbers, decimal points (the dot) are of great significance.  In cosmology a tiny difference in values beyond the dot can mean the difference between hitting one’s target and missing by thousands of miles and in finance the placement can dictate the difference between ending up rich or poor.  Vital then, although not all were much bothered: when Lord Randolph Churchill (1849–1895) was Chancellor of the Exchequer (1886), he found the decimal point “tiresome”, telling the Treasury officials “those damned dots” were not his concern and according to the mandarins he was inclined to “round up to the nearest thousand or million as the case may be”.  His son (Winston Churchill (1875-1965; UK prime-minister 1940-1945 & 1951-1955) when Chancellor (1924-1929)) paid greater attention to the dots but his term at 11 Downing Street, although longer, remains less well-regarded.

In some (big, small or complex) mathematical computations performed on computers, the placement of the dot is vital.  What are called “floating-point operations” are accomplished using a representation of real numbers which can’t be handled in the usual way; real numbers, decimals & fractions can all be defined or approximated using floating-point representation, a numerical value being represented by (1) a sign, (2) a significand and (3) an exponent.  The sign indicates whether the number is positive or negative, the significand is a representation of the fractional part of the number and the exponent determines the number’s scale.  In computing, the attraction of floating-point representation is that a range of values can be represented with a relatively small number of bits and although the capability of computers has massively increased, so have the ambitions of those performing big, small or complex number calculations, so the utility remains important.  At the margins however (very big & very small), the finite precision of traditional computers will inevitably result in “rounding errors” so there can be some degree of uncertainty, something compounded by there being even an “uncertainty about the uncertainty”.  Floating-point calculations therefore solve many problems and create others, the core problem being there will be instances where the problems are not apparent.  Opinion seems divided on whether quantum computing will mean the uncertainty will vanish (at least with the very big if not the very small).
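The sign/significand/exponent idea and the rounding errors can both be seen from Python, whose math.frexp decomposes a float into a base-2 significand and exponent (the example value is arbitrary):

```python
import math

# math.frexp splits x into m and e with x == m * 2**e and 0.5 <= |m| < 1;
# the sign travels with the significand m.
x = -6.25
m, e = math.frexp(x)
assert (m, e) == (-0.78125, 3)   # -0.78125 * 2**3 == -6.25
assert m * 2 ** e == x           # -6.25 is exactly representable

# Finite precision means most decimal fractions are only approximated,
# hence the classic rounding error:
assert 0.1 + 0.2 != 0.3
assert math.isclose(0.1 + 0.2, 0.3)   # comparisons need a tolerance
```

The last two lines are the whole problem in miniature: the error is real but tiny, so whether it matters depends entirely on what is being computed.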

In computer hardware, few pieces have so consistently been the source of problems as floating-point units (FPUs), the so-called “math co-processors”.  Co-processors were an inherent part of the world of the mainframes but came to be thought of as something exotic in personal computers (PC) because there was such a focus on the central processing unit (CPU) (8086, 68020, i486 et al) and some co-processors (notably graphical processing units (GPU)) have assumed a cult-like following.  The evolution of the FPU is interesting in that as manufacturing techniques improved they often came to be integrated into the CPU but when the PC era began the two were again separate, Intel’s early 808x & 8018x complemented by the optional 8087 FPU, the model replicated by the 80286 & 80287 pairing, the latter continuing as the only available FPU for almost two years after the introduction of the 80386 (later renamed i386DX in an attempt to differentiate genuine “Intel Inside” silicon from the competition which had taken advantage of the difficulties in trade-marking numbers).  The delay was due to the increasing complexity of FPU designs and flaws were found in the early 387s.

Intel i487SX & i486SX.

The management of those problems was handled well by Intel but with the release of the i487SX in 1991 they scored an own goal.  First displayed in 1989, the i486DX had been not only a considerable advance but included an integrated FPU (also with some soon-corrected flaws).  That was good but to grab some of the market share from those making fast 80386DX clones, Intel introduced the i486SX, marketed as a lower-cost chip which was said to be an i486 with a reduced clock speed and without the FPU.  For many users that made sense because anyone doing mostly word processing or other non-number-intensive tasks really had little use for an FPU but then Intel introduced the i487SX, an FPU which, in the traditional way, plugged into a socket on the system-board (as even then motherboards were coming to be called) à la a 287 or 387.  However, it transpired the i487SX was functionally almost identical to an i486DX, the only difference being that when plugged-in, it checked to ensure the original i486SX was still on-board, the reason being Intel wanted to ensure no market for used i486SXs (then selling new for hundreds of dollars) emerged.  To achieve this trick, the socket for the i487SX had an additional pin and it was the presence of this which told the system-board to disable the i486SX.  The i487SX was not a success and Intel suffered what was coming to be called “reputational damage”.

Dual socket system-board with installed i486SX, the vacant socket able to handle either the i486DX or the i487SX.

The i487SX affair was however a soon-forgotten minor blip in Intel’s upward path.  In 1994, Intel released the first of the Pentium CPUs, all of which were sold with an integrated FPU, establishing what would become Intel’s standard architectural model.  Like the early implementations of the 387 & 487, there were flaws and upon becoming aware of the problem, Intel initiated a rectification programme.  They did not however issue a recall or offer replacements to anyone who had already purchased a flawed Pentium and, after pressure was exerted, undertook to offer replacements only to those users who could establish their pattern of use indicated they would actually be in some way affected.  Because of the nature of the bug, that meant “relatively few”.  The angst however didn’t subside and a comparison was made with a defect in a car which would manifest only if speeds in excess of 125 mph (200 km/h) were sustained for prolonged periods.  Although in that case only “relatively few” might suffer the fault, nobody doubted the manufacturer would be compelled to rectify all examples sold and such was the extent of the reputational damage that Intel was compelled to offer what amounted to a “no questions asked” replacement.  The corporation’s handling of the matter has since often been used as a case study in academic institutions by those studying law, marketing, public relations and such.

Saturday, October 23, 2021

Mini

Mini (pronounced min-ee)

(1) A skirt or dress with a hemline well above the knee, popular since the 1960s.

(2) A small car, built by Austin, Morris, associated companies and successor corporations between 1959-2000.  Later reprised by BMW in a retro-interpretation.

(3) As minicomputer, a generalized (historic) descriptor for a multi-user computer system smaller than a mainframe; the colloquial term mini was rendered meaningless by technological change (briefly, personal computers (PC) were known as micros).

(4) A term for anything of a small, reduced, or miniature size.

Early 1900s: A shortened form of miniature, ultimately from the Latin minium (red lead; vermilion), a development influenced by the similarity to minimum and minus.  In English, miniature was borrowed from the late sixteenth century Italian miniatura (manuscript illumination), from miniare (rubricate; to illuminate), from the Latin miniō (to color red), from minium (red lead).  Although uncertain, the source of minium is thought to be Iberian; the vivid shade of vermilion was used to mark particular words in manuscripts.  Despite the almost universal consensus that mini is a creation of the twentieth century, there is a suggested link in the 1890s connected with Yiddish and Hebrew.

As a prefix, mini- is a word-forming element meaning "miniature, minor", again abstracted from miniature, with the sense presumed to have been influenced by minimum.  The vogue for mini- as a prefix in English word creation dates from the early 1960s, the prime influences thought to be (1) the small British car, (2) the dresses & skirts with high hem-lines and (3) developments in the hardware of electronic components which permitted smaller versions of products to be created as low-cost consumer goods, although there had been earlier use, a minicam (a miniature camera) advertised as early as 1937.  The mini-skirt (skirt with a hem-line well above the knee) dates from 1965 and the first mini-series (a television series of short duration and on a single theme) was labelled such in 1971; since then, mini- has been prefixed to just about everything possible.  To Bridget Jones (of Bridget Jones's Diary (1996), a novel by Helen Fielding (b 1958)), a mini-break was a very short holiday; in previous use in lawn tennis it referred to a point won against the server in a tiebreak.

Jean Shrimpton and the mini-skirt

Jean Shrimpton, Flemington Racecourse, Melbourne, 1965.

The Victorian Racing Club (VRC) had in 1962 added Fashions on the Field to Melbourne’s Spring Racing Carnival at Flemington and for three years, women showed up with the usual hats and accessories, including gloves and stockings, then de rigueur for ladies of the Melbourne establishment.  Then on the VRC’s Derby Day in 1965, English model Jean Shrimpton (b 1942) wore a white mini, its hem a daring four inches (100 mm) above the knee.  It caused a stir.

The moment has since been described as the pivotal moment for the introduction of the mini to an international audience, which is probably overstating things, but for Melbourne it was certainly quite a moment.  Anthropologists have documented evidence of the mini in a variety of cultures over the last 4,000-odd years so, except perhaps in Melbourne circa 1965, it was nothing new but that didn’t stop the fashion industry having a squabble about who “invented” the mini.  French designer André Courrèges (1923-2016) explicitly claimed the honor, accusing his London rival to the claim, Mary Quant (b 1930), of merely “commercializing it”.  Courrèges had shown minis at shows in both 1964 and 1965 and his sketches date from 1961.  Quant’s designs are even earlier but given the anthropologists’ findings, it seems a sterile argument.

Minimalism: Lindsay Lohan and the possibilities of the mini.

The Mini

1962 Riley Elf.

The British Motor Corporation (BMC) first released their Mini in 1959, the Morris version called the Mini Minor (a link to the larger Minor, a model then in production) while the companion Austin was the Seven, a re-use of the name of a tiny car of the inter-war years.  The Mini name however caught on and the Seven was re-named Mini early in 1962 although the up-market (and, with modifications to the body, slightly more than merely badge-engineered) versions by Riley and Wolseley were never called Mini, instead adopting names either from or hinting at their more independent past: the Elf and Hornet respectively.  The Mini name was in 1969 separated from Austin and Morris, marketed as a stand-alone marque until 1980 when the Austin name was again appended, an arrangement which lasted until 1988 when finally it reverted to Mini although some were badged as Rovers for export markets.  The Mini remained in production until 2000, long before then antiquated but still out-lasting the Metro, its intended successor.

1969 Austin Maxi 1500.

The allure of the Mini name obviously impressed BMC.  By 1969, BMC had, along with a few others, been absorbed into the Leyland conglomerate and the first release of the merged entity was in the same linguistic tradition: The Maxi.  A harbinger of what was to come, the Maxi encapsulated all that would go wrong within Leyland during the 1970s; a good idea, full of advanced features, poorly developed, badly built, unattractive and with an inadequate service network.  The design was so clever that to this day the space utilization has rarely been matched and had it been a Renault or a Citroën, the ungainly appearance and underpowered engine might have been forgiven because of the functionality but the poor quality control, lack of refinement and clunky aspects of some of the drivetrain meant success was only ever modest.  Like much of what Leyland did, the Maxi should have been a great success but even car thieves avoided the thing; for much of its life it was reported as the UK's least stolen vehicle.          

1979 Vanden Plas Mini (a possibly "outlaw" project by Leyland's outpost in South Africa).

Curiously, given the fondness of BMC (and subsequently Leyland) for badge-engineering, there was never an MG version of the Mini (although a couple of interpretations were privately built), the competition potential instead explored by a joint-venture with the Formula One constructor Cooper, the name still used for some versions of the current BMW Mini.  Nor was there a luxury version finished by the coachbuilders Vanden Plas which, with the addition of much timber veneer and leather to mundane vehicles, provided the parent corporations with highly profitable status-symbols with which to delight the middle-class, although there was one "outlaw".  Between August 1978-September 1979, Leyland's South African operation (Leykor) offered a Vanden Plas Mini.  It used the 1,098 cm³ A-Series engine, a four-speed manual transmission and drum brakes all round.  Available only in a metallic bronze with gold basket-weave side-graphics (shades of brown seemed to stalk the 1970s), standard equipment included a folding sunroof, matt-black grille with chrome surround, tinted glass, twin chrome door mirrors, a chrome exhaust tip, mud-flaps and a dipping rear-view mirror.  The interior appointments weren't up to the standard of the English VDPs but there was cashmere pure wool upholstery, a walnut veneer dashboard with twin glove boxes, a leather-bound steering wheel and mahogany cut-pile carpets.  Apparently, the project was shut down when London got to hear about it.  In the home market, third-party suppliers of veneer and leather such as Radford found a market among those who appreciated the Mini's compact practicality but found its stark functionalism just too austere.

The Twini

Mini Coopers (1275 S) through the cutting, Mount Panorama, Bathurst, Australia, 1966.

In that year's Gallaher 500, Mini Coopers finished first to ninth.  It was the last occasion on which anything with a naturally-aspirated four-cylinder engine would win the annual endurance classic, an event which has since been won on all but a handful of occasions by V8-powered cars (memorably a V12 Jaguar XJS triumphed in 1985 when Conrod Straight was still at its full length), a statistic distorted somewhat by the rule change in 1995 which stipulated only V8s were allowed to run.

Although it seemed improbable when the Mini was released in 1959 as a small, utilitarian economy car, the performance potential proved extraordinary; in rallies and on race tracks it was a first-rate competitor for over a decade, remaining popular in many forms of competition to this day.  The joint venture with the Formula One constructor Cooper provided the basis for most of the success but by far the most intriguing possibility for more speed was the model which was never developed beyond the prototype stage: the twin-engined Twini.

Prototype twin-engined Moke while undergoing snow testing, 1962.

It wasn’t actually a novel approach.  BMC, inspired apparently by English racing driver Paul Emery (1916–1993) who in 1961 had built a twin-engined Mini, used the Mini’s underpinnings to create an all-purpose cross-country vehicle, the Moke, equipped with a second engine and coupled controls which, officially, was an “engineering exercise” but had actually been built to interest the Ministry of Defence in the idea of a cheap, all-wheel-drive utility vehicle, so light and compact it could be carried by small transport aircraft and serviced anywhere in the world.  The army did test the Moke and were impressed by its capabilities and the flexibility the design offered but ultimately rejected the concept because the lack of ground-clearance limited the terrain to which it could be deployed.  Based on the low-slung Mini, that was one thing which couldn’t easily be rectified.  Instead, using just a single engine in a front-wheel-drive (FWD) configuration, the Moke was re-purposed as a civilian model, staying in production between 1964-1989 and offered in various markets.  Such is the interest in the design that several companies have resumed production, including in electric form, and it remains available today.

Cutaway drawing of Cooper’s Twini.

John Cooper (1923-2000), aware of previous twin-engined racing cars, had tested the prototype military Moke and immediately understood the potential the layout offered for the Mini (ground clearance not a matter of concern on race tracks) and within six weeks the Cooper factory had constructed a prototype.  To provide the desired characteristics, the rear engine was larger and more powerful, the combination, in a car weighing less than 1600 lb (725 kg), delivering a power-to-weight ratio similar to a contemporary Ferrari Berlinetta and, to complete the drive-train, two separate gearboxes with matched ratios were fitted.  Typically of Cooper, it was a well thought-out design.  The lines for the brake and clutch hydraulics and those of the main electrical feed to the battery were run along the reinforcing member below the right-hand door while on the left side were the oil and water leads, the fuel supply line to both engines fed from a central tank.  The electrical harness was ducted through the roof section and there was a central throttle link, control of the rear carburetors being taken from the accelerator, via the front engine linkage, back through the centre of the car.  It sounded intricate but the distances were short and everything worked.

Twini replica.

John Cooper immediately began testing the Twini, evaluating its potential for competition and, as was done with race cars in those happy days, that testing was on public roads where it proved to be fast, surprisingly easy to handle and well-balanced.  Unfortunately, de-bugging wasn't complete and during one night session the rear engine seized, resulting in a rollover; Cooper was seriously injured and the car destroyed.  Both BMC and Cooper abandoned the project because the standard Mini-Coopers were proving highly successful and, to qualify for any sanctioned competition, at least one hundred Twinis would have to have been built; neither organization could devote the necessary resources to development or production, especially because no research had been done to work out whether a market existed for such a thing were it sold at a price which guaranteed at least that it would break even.

Twini built by Downton Engineering.  Driven by Sir John Whitmore (1937–2017) & Paul Frère (1917–2008) in the 1963 Targa Florio, it finished 27th overall and 5th in class.

The concept however did intrigue others interested in entering events which accepted one-offs, having no homologation rules stipulating minimum production volumes.  Downton Engineering built one and contested the 1963 Targa Florio where it proved fast but fragile, plagued by an overheating rear engine and the bugbear of previous twin-engined racing cars: excessive tire wear.  It finished 27th (and last) but it did finish, unlike some of the more illustrious thoroughbreds which fell by the wayside.  Interestingly, the Downton engineers chose to use a pair of the 998 cm3 (61 cubic inch) versions of the BMC A-Series engine, a regular production iteration and thus in the under-square (long stroke) configuration typical of almost all the A-Series.  The long stroke tradition in British engines was a hangover from the time when the road-taxation system was based on the cylinder bore, a method which had simplicity and ease of administration to commend it but little else, generations of British engines distinguished by their dreary, slow-revving characteristics.  The long stroke design did however provide good torque over a wide engine-speed range and on a road course like the Targa Florio, run over a mountainous Sicilian circuit, the ample torque spread would have appealed more to drivers than ultimate top-end power.  For that reason, although examples of the oversquare 1071 cm3 (65 cubic inch) versions were available, it was newly developed and still an uncertain quantity and never considered for installation.  The 1071 was used in the Mini Cooper S only during 1963-1964 (with a companion 970 cm3 (59 cubic inch) version created for use in events with a 1000 cm3 capacity limit) and the pair are a footnote in A-Series history as the only over-square versions released for sale.

Twin-engined BMW Mini (Binni?).

In the era, it’s thought around six Twinis were built (and there have been a few since) but the concept proved irresistible and twin-engined versions of the "new" Mini (built since 2000 by BMW) have been made.  It was fitting that the idea was replicated because what was striking in 2000 when BMW first displayed their Mini was that its lines were actually closer to some of the original conceptual sketches from the 1950s than was the BMC Mini on its debut.  BMW, like others, of course now routinely adds electric motors to fossil-fuel powered cars so in that sense twin (indeed, sometimes multi-) engined cars are now common but the use of more than one piston engine remains rare.  Except for the very specialized place which is the drag-strip, the only successful examples have been off-road or commercial vehicles and, as John Cooper and a host of others came to understand, while the advantages were there to be had, there were easier, more practical ways in which they could be gained; so inherent were the drawbacks that the problems proved insoluble.

Friday, October 22, 2021

Loafer

Loafer (pronounced loh-fer)

(1) A person who loafs about; a lazy idler; a lay-about.

(2) A name for a moccasin-like, laceless, slip-on shoe, worn by both men and women.

(3) In some south-western US dialects, a wolf, especially a grey or timber wolf (often in the compound form “loafer wolf”).

1830: The construct was loaf + -er.  Loaf was from the Middle English lof & laf, from the Old English hlāf (bread, loaf of bread), from the Proto-West Germanic hlaib, from the Proto-Germanic hlaibaz (bread, loaf), of uncertain origin but which may be related to the Old English hlifian (to stand out prominently, tower up). It was cognate with the Scots laif (loaf), the German Laib (loaf), the Swedish lev (loaf), the Russian хлеб (xleb) (bread, loaf) and the Polish chleb (bread).  It was used to mean (1) a block of bread after baking, (2) any solid block of food, such as meat or sugar, (3) a solid block of soap, from which a standard bar (or cake) of soap is cut or (4) in cellular automata, a particular still life configuration with seven living cells.  The origin of “use your loaf” meaning “think about it” in Cockney rhyming slang was as a shortened form of “loaf of bread” (ie “use your head”).  The –er suffix was from the Middle English –er & -ere, from the Old English -ere, from the Proto-Germanic -ārijaz, thought most likely to have been borrowed from the Latin –ārius where, as a suffix, it was used to form adjectives from nouns or numerals.  In English, the –er suffix, when added to a verb, created an agent noun: the person or thing that does the action indicated by the root verb.  The use in English was reinforced by the synonymous but unrelated Old French –or & -eor (the Anglo-Norman variant -our), from the Latin -ātor & -tor, from the primitive Indo-European -tōr.  When appended to a noun, it created the noun denoting an occupation or describing the person whose occupation is the noun.  Loafer & loafing are nouns & verbs, loafed, loafering & loafered are verbs and loaferish is an adjective; the noun plural is loafers.

The use to describe “a lazy idler” was first documented in 1830 as an Americanism which may have been short for landloafer (vagabond), similar (though not necessarily related) to the obsolete nineteenth century German Landläufer (vagabond) or the Dutch landloper.  Etymologists suggest landloafer may have been a partial loan-translation of the German Landläufer as “land loper” (and may be compared with the dialectal German loofen (to run) and the English landlouper) but this has little support and most regard a more likely connection being the Middle English love, loove, loffinge & looffinge (a remnant, the rest, that which remains or lingers), from the Old English lāf (remainder, residue, what is left), which was akin to the Scots lave (the rest, remainder) and the Old English lǣfan (to let remain, leave behind).  One amusing coincidence was that in Old English hlaf-aeta (household servant) translated literally as “loaf-eater” (ie, one who eats the bread of his master, suggesting the Anglo-Saxons might still have felt the etymological sense of their lord & master as the “loaf-guard”).  The expression "one mustn't despair because one slice has been cut from the loaf" describes a pragmatic reaction to learning one's unmarried daughter has been de-flowered and is said to be of Yiddish origin but no source has ever been cited.  In modern idiomatic use, the derived phrases "a slice off a cut loaf is never missed" and "you never miss a slice from a cut loaf" refer to having enjoyed sexual intercourse with someone who is not a virgin, the idea being that once the end of a loaf (the crust) has been removed, it's not immediately obvious how many slices have been cut.

The loafer is a style, a slip-on shoe which is essentially a slipper designed as an all-weather shoe for outdoor use.  They’re available in a wide range of styles from many manufacturers and this image shows just a few of the dozens recently offered by Gucci.  In the old Soviet Union (the USSR; 1922-1991), there were usually two (when available): one for men and one for women, both (sometimes) available in black or brown.

The verb loaf was first documented in 1835 in US English, apparently a back-formation from the earlier loafer, and loafed & loafing soon emerged.  The noun in the sense of “an act of loafing” was in use by 1855.  What constitutes loafing is very much subjective; a student underachieving in Latin might be thought a loafer by a professor of classics but the “hard working, much published” don who in his whole career never lifted anything much heavier than a book would probably be dismissed as “a loafer” by the laborer digging the trench beneath his study.  A “tavern loafer” was one who spent his hours drinking in bars while a “street loafer” was a synonym for a “delinquent who hung about on street corners”.  Loafer as a description of footwear dates from 1937 and it was used of lace-less, slip-on shoes worn on less formal occasions (essentially slippers designed for outdoor use), a popular early version of which was the “penny loafer”, so named because it featured an ornamental slotted leather band across the upper where a coin was often mounted.  The use in some south-western dialects of “loafer” or “loafer wolf” to describe a grey or timber wolf is based on the American Spanish lobo (wolf), reinterpreted as or conflated with loafer (idler).

Rowan Williams (b 1950; Archbishop of Canterbury 2002-2012) admiring Benedict XVI’s (1927–2022; pope 2005-2013, pope emeritus 2013-2022) red loafers, Lambeth Palace, September 2010.

When in 2013 Benedict XVI announced he was resigning the papacy, there was much discussion of what might be the doctrinal or political implications but a few fashionistas also bid farewell to the best-dressed pontiff for probably a century and the one Esquire magazine had named “accessorizer of the year”.  In recent memory, the world had become accustomed to the white-robed John Paul II (1920–2005; pope 1978-2005) who would don colorful garments for ceremonial occasions but never wore them with great élan and eschewed the more elaborate, perhaps influenced by Paul VI (1897-1978; pope 1963-1978), whose reign was marked by a gradual sartorial simplification; he was the last pope to wear the triple tiara which had since the early Middle Ages been a symbol of papal authority.  Briefly it sat on his head on the day of his coronation before, in an “act of humility”, it was placed on the altar where, symbolically, it has since remained.

The pope and the archbishop discuss the practicalities of cobbling.

Benedict’s pontificate however was eight stylish years, the immaculately tailored white caped cassock (the simar) his core piece, of such monochromatic simplicity that it drew attention to the many adornments and accessories he used, which included billowing scarlet satin chasubles trimmed with crimson velvet and delicate gold piping and others woven in emerald-green watered silk with a pattern of golden stars.  Much admired also was the mozzetta, a waist-length cape, and the camauro, a red velvet cap with a white fur border which people around the world compared with the usual dress of Santa Claus, X (then known as Twitter) quickly fleshing out the history of the Coca-Cola Corporation’s role in creating the “uniform”, although there was some exaggeration: the Santa suit and hat were familiar by at least the 1870s, although Coca-Cola’s use of them in advertising did seem to drive out all colors except red.  On popes however, the red velvet and white fur trim had been around for centuries though it fell from fashion after the Second Vatican Council (Vatican II; 1962-1965) and was thus a novelty when Benedict revived the style.

The pope farewells the archbishop.

Not all (including some cardinals) appreciated the papal bling but what attracted most attention were his bright red loafers, a style of shoe popes have been depicted wearing since Roman times, and the Holy See was forced to issue a statement denying they were hand-crafted by the high-end Italian fashion house Prada.  In its press release, the Vatican’s Press Office reminded the world that the red symbolizes martyrdom and the Passion of Christ, the shoes there to signify the pope following in the footsteps of Christ.  Rather than a fashion house, the papal loafers were the work of two Italian artisan cobblers, Adriano Stefanelli and Antonio Arellano; Signor Stefanelli’s connection with the Vatican began when he offered to make shoes for John Paul II after noticing his obvious discomfort during a television broadcast.  Signor Arellano had a longer link with Benedict’s feet, having been his cobbler when, as Joseph Ratzinger, he was the cardinal heading the Inquisition (now called the Dicastery for the Doctrine of the Faith (DDF)) and as soon as Benedict’s surprise elevation was announced, he went immediately to his last and made a pair of red loafers for him (he’s an Italian size 42 (a UK 8 & a US 9)).  Upon his resignation, as pope emeritus, he retired the red loafers in favor of three pairs (two burgundy, one brown) which were a gift from a Mexican cobbler, Armando Martin Dueñas.  Pope Francis (b 1936; pope since 2013) has reverted to the austere ways of Vatican II and wears black shoes.

Channeling Benedict: Lindsay Lohan in red loafers, September 2016.  Although unconfirmed, it's believed these were not a papal gift.