Sunday, October 24, 2021

Dot

Dot (pronounced dot)

(1) A small, roundish mark made with or as if with a pen.

(2) A minute or small spot on a surface; speck.

(3) Anything relatively small or speck-like.

(4) A small specimen, section, amount, or portion (the use meaning “a lump or clod” long obsolete).

(5) In grammar, a punctuation mark used to indicate the end of a sentence or an abbreviated part of a word; a full stop; a period.

(6) In the Latin script, a point used as a diacritical mark above or below various letters, as in Ȧ, Ạ, Ċ.

(7) In computing, a separator in internet addresses etc and in file names a separation device (historically a marker between the filename and file type when only one dot per name was permitted in early file systems, the best known of which was the 8.3 convention used by the various iterations of CP/M & DOS (command.com, image.tif, config.sys etc)).

(8) In music, a point placed after a note or rest, to indicate that the duration of the note or rest is to be increased one half. A double dot further increases the duration by one half the value of the single dot; a point placed under or over a note to indicate that it is to be played staccato.

(9) In telegraphy, a signal of shorter duration than a dash, used in groups along with dashes (-) and spaces to represent letters, as in Morse code.

(10) In printing, an individual element in a halftone reproduction.

(11) In printing, the mark that appears above the main stem of the letters i, j.

(12) In the sport of cricket, as “dot ball”, a delivery from which no run is scored.

(13) In the slang of ballistics as “dotty”, (1) buckshot, the projectile from a shotgun or (2) the weapon itself.

(14) A female given name, a clipped form of Dorothea or Dorothy.

(15) A contraction in many jurisdictions for Department of Transportation (or Transport).

(16) In mathematics and logic, a symbol (·) indicating multiplication or logical conjunction; an indicator of the dot product of vectors: X · Y (a short worked example appears after these definitions).

(17) In mathematics, the decimal point (.), used for separating the fractional part of a decimal number from the whole part.

(18) In computing and printing, as dot matrix, a reference to the method of assembling shapes by the use of dots (of various shapes) in a given space.  In casual (and commercial) use it described impact printers, which used hammers with dot-shaped tips to strike a ribbon against the paper (or other surface) to produce representations of shapes which could include text.  Technically, laser printers also use a dot matrix in shape formation but the use to describe impact printers caught on and became generic.  The term “dots per inch” (DPI) is a measure of image resolution, a literal measure of the number of dots in an area.  Historically, impact printers were sold on the basis of the number of pins (hammers; typically 9, 18 or 24) in the print head, which was indicative of the quality of print although some software could enhance the effect.

(19) In civil law, a woman's dowry.

(20) In video gaming, the abbreviation for “damage over time”: an attack that deals light or moderate damage when inflicted but wounds or weakens the receiving character, who continues to lose health in small increments for a specified period of time, or until healed by a spell or a potion.

(21) To mark with or as if with a dot or dots; to make a dot-like shape.

(22) To stud or diversify with or as if with dots (often in the form “…dotting the landscape…” etc).

(23) To form or cover with dots (such as “the dotted line”).

(24) In colloquial use, to punch someone.

(25) In cooking, to sprinkle with dabs of butter, chocolate etc.
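
Touching on the mathematical senses in (16) & (17), below is a minimal worked sketch in Python (standard library only; the function name dot is this post's own, not from any library) computing a dot product and splitting a number at its decimal point.

```python
# A dot product, per sense (16): X · Y is the sum of pairwise products.
def dot(x: list[float], y: list[float]) -> float:
    if len(x) != len(y):
        raise ValueError("vectors must be the same length")
    return sum(a * b for a, b in zip(x, y))

print(dot([1.0, 2.0, 3.0], [4.0, 5.0, 6.0]))  # 32.0 = 1*4 + 2*5 + 3*6

# The decimal point, per sense (17): the dot separates the whole part
# of a number from the fractional part.
whole, _, fraction = "3.14159".partition(".")
print(whole, fraction)  # prints: 3 14159
```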

Pre 1000: It may have been related to the Old English dott (head of a boil) although there’s no evidence of such use in Middle English.  Dottle & dit were both derivatives of the Old English dyttan (to stop up (and again, probably from dott)) and were cognate with the Old High German tutta (nipple), the Norwegian dott and the Dutch dott (lump).  Unfortunately there seems no link between dit and the modern slang zit (pimple), a creation of US English unknown until the 1960s.  The Middle English dot & dotte were from the Old English dott in the de-elaborated sense of “a dot, a point on a surface”, from the Proto-West Germanic dott, from the Proto-Germanic duttaz (wisp) and were cognate with the Saterland Frisian Dot & Dotte (a clump), the Dutch dot (lump, knot, clod), the Low German Dutte (a plug) and the Swedish dott (a little heap, bunch, clump).  The use in civil law of dot as a reference to “a woman's dowry” dates from the early 1820s and was from the French, from the Latin dōtem, accusative of dōs (dowry), related to dōtāre (to endow) and dāre (to give).  For technical or descriptive reasons dot is a modifier or modified as required, forms including centered dot, centred dot, middle dot, polka dot, chroma dot, day dot, dot-com, dot-comer (or dot-commer), dot release and dots per inch (DPI).  The synonyms can (depending on context) include dab, droplet, fleck, speck, pepper, sprinkle, stud, atom, circle, grain, iota, jot, mite, mote, particle, period, pinpoint, point, spot and fragment.  Dot & dotting are nouns & verbs, dotter is a noun, dotlike & dotal are adjectives, dotted is an adjective & verb and dotty is a noun & adjective; the noun plural is dots.

Although in existence for centuries, and revived with the modern meaning (mark) in the early sixteenth century, the word appears not to have been in common use until the eighteenth and in music, the use to mean “point indicating a note is to be lengthened by half” appears by at least 1806.  The use in the Morse code used first on telegraphs dates from 1838 and the phrase “on the dot” (punctual) is documented since 1909 as a reference to the (sometimes imagined) dots on a clock’s dial face.  In computing, “dot-matrix” (printing and screen display) seems first to have been used in 1975 although the processes referenced had by then been in use for decades.  The term “dotted line” is documented since the 1690s.  The verb dot (mark with a dot or dots) developed from the noun and emerged in the mid eighteenth century.  The adjective dotty as early as the fourteenth century meant “someone silly” and was from “dotty poll” (dotty head), the first element from the earlier verb dote.  By 1812 it meant also literally “full of dots” while the use to describe shotguns, their loads and the pattern made on a target was from the early twentieth century.  The word microdot was adopted in 1971 to describe “tiny capsules of lysergic acid diethylamide” (LSD or “acid”); in the early post-war years (most sources cite 1946) it was used in the espionage community to describe an extremely reduced photograph able to be disguised as a period dot on a typewritten manuscript.

Lindsay Lohan in polka-dots, enjoying a frozen hot chocolate, Serendipity 3 restaurant, New York, 7 January 2019.

The polka-dot (a pattern consisting of dots of uniform size and arrangement, especially on fabric) dates from 1844 and was from the French polka, from the German Polka, probably from the Czech polka (the dance, literally “Polish woman” (Polish Polka), feminine form of Polak (a Pole)).  The word might instead be a variant of the Czech půlka (half (půl the truncated version of půlka used in special cases (eg telling the time à la the English “half four”))), a reference to the half-steps of Bohemian peasant dances.  It may even have been influenced by, or be an actual merger of, both.  The dance first came into vogue in 1835 in Prague, reaching London in the spring of 1842; Johann Strauss (the younger) wrote many polkas.  Polka was a verb by 1846 as (briefly) was polk; notoriously polka-dot is sometimes mispronounced as poke-a-dot.

In idiomatic use, to “dot one's i's and cross one's t's” is to be meticulous in seeking precision; an attention to even the smallest detail.  To be “on the dot” is to be exactly correct or to have arrived exactly at the time specified.  The idea of “joining the dots” or “connecting the dots” is to make connections between various pieces of data to produce useful information.  In software, the process is literal in that it refers to the program “learning” how accurately to fill in the missing pieces of information between the data points generated or captured (a sketch of the idea appears below).  “The year dot” is an informal expression which means “as long ago as can be remembered”.  To “sign on the dotted line” is to add one’s signature in the execution of a document (although there may be no actual dotted line on which to sign).
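
As a concrete (and purely illustrative) rendering of “joining the dots” in the software sense, the sketch below uses simple linear interpolation to estimate the missing values between captured data points; the function name and the sample points are invented for the example.

```python
# Linear interpolation: estimate y at x by "joining the dots" between
# the two measured points which bracket x.
def join_the_dots(points: list[tuple[float, float]], x: float) -> float:
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        if x0 <= x <= x1:
            return y0 + (y1 - y0) * (x - x0) / (x1 - x0)
    raise ValueError("x lies outside the measured points")

samples = [(0.0, 0.0), (1.0, 2.0), (4.0, 8.0)]  # (x, y) pairs, sorted by x
print(join_the_dots(samples, 2.5))  # 5.0, on the line between (1, 2) and (4, 8)
```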

Dots, floating points, the decimal point and the Floating Point Unit (FPU) 

When handling numbers, decimal points (the dot) are of great significance.  In cosmology a tiny difference in values beyond the dot can mean the difference between hitting one’s target and missing by thousands of miles and in finance the placement can dictate the difference between ending up rich or poor.  Vital then, although not all were much bothered: when Lord Randolph Churchill (1849–1895) was Chancellor of the Exchequer (1886), he found the decimal point “tiresome”, telling the Treasury officials “those damned dots” were not his concern and according to the mandarins he was inclined to “round up to the nearest thousand or million as the case may be”.  His son (Winston Churchill (1875-1965; UK prime-minister 1940-1945 & 1951-1955)) when Chancellor (1924-1929) paid greater attention to the dots but his term at 11 Downing Street, although longer, remains less well-regarded.

In some (big, small or complex) mathematical computations performed on computers, the placement of the dot is vital.  What are called “floating-point operations” are accomplished using a representation of real numbers which can’t be handled in the usual way; real numbers, decimals & fractions can be defined or approximated using floating-point representation, a numerical value represented by (1) a sign, (2) a significand and (3) an exponent.  The sign indicates whether the number is positive or negative, the significand is a representation of the fractional part of the number and the exponent determines the number’s scale.  In computing, the attraction of floating-point representation is that a range of values can be represented with a relatively small number of bits and although the capability of computers has massively increased, so have the ambitions of those performing big, small or complex number calculations, so the utility remains important.  At the margins however (very big & very small), the finite precision of traditional computers will inevitably result in “rounding errors” so there can be some degree of uncertainty, something compounded by there being even an “uncertainty about the uncertainty”.  Floating-point calculations therefore solve many problems and create others, the core problem being there will be instances where the problems are not apparent.  Opinion seems divided on whether quantum computing will mean the uncertainty will vanish (at least with the very big if not the very small).
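
To make the three parts concrete, here is a minimal sketch (standard-library Python; the decompose function is this post's own illustration, not any particular hardware format) which splits a number into sign, significand and exponent, then demonstrates the rounding error mentioned above.

```python
import math

# Split x into (sign, significand, exponent) so that
# x == sign * significand * 2**exponent, with 0.5 <= significand < 1
# for any nonzero x.
def decompose(x: float) -> tuple[int, float, int]:
    sign = -1 if math.copysign(1.0, x) < 0 else 1
    significand, exponent = math.frexp(abs(x))
    return sign, significand, exponent

print(decompose(-6.5))  # (-1, 0.8125, 3) because 6.5 == 0.8125 * 2**3

# The "rounding errors": 0.1 and 0.2 have no exact binary representation,
# so their sum differs from 0.3 by a tiny finite-precision residue.
print(0.1 + 0.2 == 0.3)      # False
print(abs(0.1 + 0.2 - 0.3))  # about 5.6e-17
```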

In computer hardware, few pieces have so consistently been the source of problems as floating-point units (FPUs), the so-called “math co-processors”.  Co-processors were an inherent part of the world of the mainframes but came to be thought of as something exotic in personal computers (PCs) because there was such a focus on the central processing unit (CPU) (8086, 68020, i486 et al) and some co-processors (notably graphical processing units (GPUs)) have assumed a cult-like following.  The evolution of the FPU is interesting in that as manufacturing techniques improved they were often integrated into the CPU architecture, before being separated again when the PC era began, Intel’s early 808x & 8018x CPUs complemented by the optional 8087 FPU, the model replicated by the 80286 & 80287 pairing, the latter continuing as the only available FPU for almost two years after the introduction of the 80386 (later renamed i386DX in an attempt to differentiate genuine “Intel Inside” silicon from the competition which had taken advantage of the difficulties in trade-marking numbers).  The delay was due to the increasing complexity of FPU designs and flaws were found in the early 387s.

Intel i487SX & i486SX.

Those problems were well-managed by Intel but with the release of the i487SX in 1991 they kicked an own goal.  First displayed in 1989, the i486DX had been not only a considerable advance but included an integrated FPU (also with some soon-corrected flaws).  That was good but to grab some of the market share from those making fast 80386DX clones, Intel introduced the i486SX, marketed as a lower-cost chip which was said to be an i486 with a reduced clock speed and without the FPU.  For many users that made sense because anyone doing mostly word processing or other non-number intensive tasks really had little use for the FPU but then Intel introduced the i487SX, an FPU which, in the traditional way, plugged into a socket on the system-board (by then coming to be called the motherboard) à la a 287 or 387.  However, it transpired the i487SX was functionally almost identical to an i486DX, the only difference being that when plugged-in, it checked to ensure the original i486SX was still on-board, the reason being Intel wanted to ensure no market for used i486SXs (then selling new for hundreds of dollars) emerged.  To achieve this trick, the socket for the i487SX had an additional pin and it was the presence of this which told the system board to disable the i486SX.  The i487SX was not a success and Intel suffered what was coming to be called “reputational damage”.

Dual socket system-board with installed i486SX, the vacant socket able to handle either the i486DX or the i487SX.

The i487SX affair was however a soon forgotten minor blip in Intel’s upward path.  In 1994, Intel released the first of the Pentium CPUs, all of which were sold with an integrated FPU, establishing what would become Intel’s standard architectural model.  Like the early implementations of the 387 & 487, there were flaws and upon becoming aware of the problem, Intel initiated a rectification programme.  They did not however issue a recall or offer replacements to anyone who had already purchased a flawed Pentium and, after pressure was exerted, undertook to offer replacements only to those users who could establish their pattern of use indicated they would actually be in some way affected.  Because of the nature of the bug, that meant “relatively few”.  The angst however didn’t subside and a comparison was made with a defect in a car which would manifest only if speeds in excess of 125 mph (200 km/h) were sustained for prolonged periods.  Although in that case only “relatively few” might suffer the fault, nobody doubted the manufacturer would be compelled to rectify all examples sold and such was the extent of the reputational damage that Intel was compelled to offer what amounted to a “no questions asked” replacement offer.  The corporation’s handling of the matter has since often been used as a case study in academic institutions by those studying law, marketing, public relations and such.

Saturday, October 23, 2021

Mini

Mini (pronounced min-ee)

(1) A skirt or dress with a hemline well above the knee, popular since the 1960s.

(2) A small car, built by Austin, Morris, associated companies and successor corporations between 1959-2000.  Later reprised by BMW in a retro-interpretation.

(3) As minicomputer, a generalized (historic) descriptor for a multi-user computer system smaller than a mainframe; the colloquial term mini was rendered meaningless by technological change (briefly, personal computers (PCs) were known as micros).

(4) A term for anything of a small, reduced, or miniature size.

Early 1900s: A shortened form of miniature, ultimately from the Latin minium (red lead; vermilion), a development influenced by the similarity to minimum and minus.  In English, miniature was borrowed from the late sixteenth century Italian miniatura (manuscript illumination), from miniare (rubricate; to illuminate), from the Latin miniō (to color red), from minium (red lead).  Although uncertain, the source of minium is thought to be Iberian; the vivid shade of vermilion was used to mark particular words in manuscripts.  Despite the almost universal consensus mini is a creation of the twentieth century, there is a suggested link in the 1890s connected with Yiddish and Hebrew.

As a prefix, mini- is a word-forming element meaning “miniature, minor”, again abstracted from miniature, with the sense presumed to have been influenced by minimum.  The vogue for mini- as a prefix in English word creation dates from the early 1960s, the prime influences thought to be (1) the small British car, (2) the dresses & skirts with high hemlines and (3) developments in the hardware of electronic components which permitted smaller versions of products to be created as low-cost consumer products, although there had been earlier use, a minicam (a miniature camera) advertised as early as 1937.  The mini-skirt (skirt with a hem-line well above the knee) dates from 1965 and the first mini-series (a television series of short duration and on a single theme) was labelled as such in 1971; since then, mini- has been prefixed to just about everything possible.  To Bridget Jones (from Bridget Jones's Diary (1996), a novel by Helen Fielding (b 1958)), a mini-break was a very short holiday; in previous use in lawn tennis it referred to a point won against the server in a tiebreak.

Jean Shrimpton and the mini-skirt

Jean Shrimpton, Flemington Racecourse, Melbourne, 1965.

The Victorian Racing Club (VRC) had in 1962 added Fashions on the Field to Melbourne’s Spring Racing Carnival at Flemington and for three years, women showed up with the usual hats and accessories, including gloves and stockings, then de rigueur for ladies of the Melbourne establishment.  Then on the VRC’s Derby Day in 1965, English model Jean Shrimpton (b 1942) wore a white mini, its hem a daring four inches (100 mm) above the knee.  It caused a stir.

The moment has since been described as the pivotal moment for the introduction of the mini to an international audience, which is probably overstating things, but for Melbourne it was certainly quite a moment.  Anthropologists have documented evidence of the mini in a variety of cultures over the last 4000-odd years so, except perhaps in Melbourne, circa 1965, it was nothing new but that didn’t stop the fashion industry having a squabble about who “invented” the mini.  French designer André Courrèges (1923-2016) explicitly claimed the honor, accusing his London rival for the claim, Mary Quant (b 1930), of merely “commercializing it”.  Courrèges had shown minis at shows in both 1964 and 1965 and his sketches date from 1961.  Quant’s designs are even earlier but given the anthropologists’ findings, it seems a sterile argument.

Minimalism: Lindsay Lohan and the possibilities of the mini.

The Mini

1962 Riley Elf.

The British Motor Corporation (BMC) first released their Mini in 1959, the Morris version called the Mini Minor (a link to the larger Minor, a model then in production) while the companion Austin was the Seven, a re-use of the name of a tiny car of the inter-war years.  The Mini name however caught on and the Seven was re-named Mini early in 1962 although the up-market (and, with modifications to the body, slightly more than merely badge-engineered) versions by Riley and Wolseley were never called Mini, instead adopting names either from or hinting at their more independent past: the Elf and Hornet respectively.  The Mini name was in 1969 separated from Austin and Morris, marketed as a stand-alone marque until 1980 when the Austin name was again appended, an arrangement which lasted until 1988 when finally it reverted to Mini although some were badged as Rovers for export markets.  The Mini remained in production until 2000, by then long antiquated but still out-lasting the Metro, its intended successor.

1969 Austin Maxi 1500.

The allure of the Mini name obviously impressed BMC.  By 1969, BMC had, along with a few others, been absorbed into the Leyland conglomerate and the first release of the merged entity was in the same linguistic tradition: The Maxi.  A harbinger of what was to come, the Maxi encapsulated all that would go wrong within Leyland during the 1970s; a good idea, full of advanced features, poorly developed, badly built, unattractive and with an inadequate service network.  The design was so clever that to this day the space utilization has rarely been matched and had it been a Renault or a Citroën, the ungainly appearance and underpowered engine might have been forgiven because of the functionality but the poor quality control, lack of refinement and clunky aspects of some of the drivetrain meant success was only ever modest.  Like much of what Leyland did, the Maxi should have been a great success but even car thieves avoided the thing; for much of its life it was reported as the UK's least stolen vehicle.          

1979 Vanden Plas Mini (a possibly "outlaw" project by Leyland's outpost in South Africa).

Curiously, given the fondness of BMC (and subsequently Leyland) for badge-engineering, there was never an MG version of the Mini (although a couple of interpretations were privately built), the competition potential instead explored by a joint-venture with the Formula One constructor Cooper, the name still used for some versions of the current BMW Mini.  Nor was there a luxury version finished by coachbuilders Vanden Plas which, with the addition of much timber veneer and leather to mundane vehicles, provided the parent corporations with highly profitable status-symbols with which to delight the middle-class, although there was one “outlaw”.  Between August 1978-September 1979, Leyland's South African operation (Leykor) offered a Vanden Plas Mini.  It used the 1098 cm3 A-Series engine, a four-speed manual transmission and drum brakes all round.  Available only in a metallic bronze with gold basket-weave side-graphics (shades of brown seemed to stalk the 1970s), standard equipment included a folding sunroof, matt-black grille with chrome surround, tinted glass, twin chrome door mirrors, a chrome exhaust tip, mud-flaps and a dipping rear-view mirror.  The interior appointments weren't up to the standard of the English VDPs but there was cashmere pure wool upholstery, a walnut veneer dashboard with twin glove boxes, a leather-bound steering wheel and mahogany cut-pile carpets.  Apparently, the project was shut down when London got to hear about it.  In the home market, third-party coachbuilders such as Radford, with their veneer and leather, found a market among those who appreciated the Mini's compact practicality but found its stark functionalism just too austere.

The Twini

Mini Coopers (1275 S) through the cutting, Mount Panorama, Bathurst, Australia, 1966.

In that year's Gallaher 500, Mini Coopers finished first to ninth.  It was the last occasion on which anything with a naturally-aspirated four-cylinder engine would win the annual endurance classic, an event which has since been won on all but a handful of occasions by V8-powered cars (memorably a V12 Jaguar XJS triumphed in 1985 when Conrod Straight was still at its full length), a statistic distorted somewhat by the rule change in 1995 which stipulated only V8s were allowed to run.

Although it seemed improbable when the Mini was released in 1959 as a small, utilitarian economy car, the performance potential proved extraordinary; in rallies and on race tracks it was a first-rate competitor for over a decade, remaining popular in many forms of competition to this day.  The joint venture with the Formula One constructor Cooper provided the basis for most of the success but by far the most intriguing possibility for more speed was the model which was never developed beyond the prototype stage: the twin-engined Twini.

Prototype twin-engined Moke while undergoing snow testing, 1962.

It wasn’t actually a novel approach.  BMC, inspired apparently by English racing driver Paul Emery (1916–1993) who in 1961 had built a twin-engined Mini, used the Mini’s underpinnings to create an all-purpose cross-country vehicle, the Moke, equipped with a second engine and coupled controls which, officially, was “an engineering exercise” but had actually been built to interest the Ministry of Defence in the idea of a cheap, all-wheel drive utility vehicle, so light and compact it could be carried by small transport aircraft and serviced anywhere in the world.  The army did test the Moke and were impressed by its capabilities and the flexibility the design offered but ultimately rejected the concept because the lack of ground-clearance limited the terrain to which it could be deployed.  Based on the low-slung Mini, that was one thing which couldn’t easily be rectified.  Instead, using just a single engine in a front-wheel-drive (FWD) configuration, the Moke was re-purposed as a civilian model, staying in production between 1964-1989 and offered in various markets.  Such is the interest in the design that several companies have resumed production, including in electric form, and it remains available today.

Cutaway drawing of Cooper’s Twini.

John Cooper (1923-2000), aware of previous twin-engined racing cars,  had tested the prototype military Moke and immediately understood the potential the layout offered for the Mini (ground clearance not a matter of concern on race tracks) and within six weeks the Cooper factory had constructed a prototype.  To provide the desired characteristics, the rear engine was larger and more powerful, the combination, in a car weighing less than 1600 lb (725 kg), delivering a power-to-weight ratio similar to a contemporary Ferrari Berlinetta and to complete the drive-train, two separate gearboxes with matched ratios were fitted.  Typically Cooper, it was a well thought-out design.  The lines for the brake and clutch hydraulics and those of the main electrical feed to the battery were run along the right-hand reinforcing member below the right-hand door while on the left side were the oil and water leads, the fuel supply line to both engines fed from a central tank.  The electrical harness was ducted through the roof section and there was a central throttle link, control of the rear carburetors being taken from the accelerator, via the front engine linkage, back through the centre of the car.  It sounded intricate but the distances were short and everything worked.

Twini replica.

John Cooper immediately began testing the Twini, evaluating its potential for competition and, as was done with race cars in those happy days, that testing was on public roads where it proved to be fast, surprisingly easy to handle and well-balanced.  Unfortunately, de-bugging wasn't complete and during one night session, the rear engine seized, resulting in a rollover; Cooper was seriously injured and the car destroyed.  Both BMC and Cooper abandoned the project because the standard Mini-Coopers were proving highly successful and to qualify for any sanctioned competition, at least one hundred Twinis would have to have been built and neither organization could devote the necessary resources for development or production, especially because no research had been done to work out whether a market existed for such a thing, were it sold at a price which guaranteed at least it would break even.

Twini built by Downton Engineering.  Driven by Sir John Whitmore (1937–2017) & Paul Frère (1917–2008) in the 1963 Targa Florio, it finished 27th and 5th in class.

The concept however did intrigue others interested in entering events which accepted one-offs with no homologation rules stipulating minimum production volumes.  Downton Engineering built one and contested the 1963 Targa Florio where it proved fast but fragile, plagued by an overheating rear-engine and the bugbear of previous twin-engined racing cars: excessive tire wear.  It finished 27th (and last) but it did finish, unlike some of the more illustrious thoroughbreds which fell by the wayside.  Interestingly, the Downton engineers chose to use a pair of the 998 cm3 (61 cubic inch) versions of the BMC A-Series engine, a regular production iteration and thus in the under-square (long stroke) configuration typical of almost all the A-Series.  The long-stroke tradition in British engines was a hangover from the time when the road-taxation system was based on the cylinder bore, a method which had simplicity and ease of administration to commend it but little else, generations of British engines distinguished by their dreary, slow-revving characteristics.  The long-stroke design did however provide good torque over a wide engine-speed range and on a road course like the Targa Florio, run over a mountainous Sicilian circuit, the ample torque spread would have appealed more to drivers than ultimate top-end power.  For that reason, although examples of the oversquare 1071 cm3 (65 cubic inch) version were available, it was newly developed and still an uncertain quantity, so was never considered for installation.  The 1071 was used in the Mini Cooper S only during 1963-1964 (with a companion 970 cm3 (59 cubic inch) version created for use in events with a 1000 cm3 capacity limit) and the pair are a footnote in A-Series history as the only over-square versions released for sale.

Twin-engined BMW Mini (Binni?).

In the era, it’s thought around six Twinis were built (and there have been a few since) but the concept proved irresistible and twin-engined versions of the “new” Mini (built since 2000 by BMW) have been made.  It was fitting the idea was replicated because what was striking in 2000 when BMW first displayed their Mini was that its lines were actually closer to some of the original conceptual sketches from the 1950s than was the BMC Mini on its debut.  BMW, like others, now of course routinely add electric motors to fossil-fuel powered cars so in that sense twin (indeed, sometimes multi-) engined cars are now common but to use more than one piston engine remains rare.  Except for the very specialized place which is the drag-strip, the only successful examples have been off-road or commercial vehicles and as John Cooper and a host of others came to understand, while the advantages were there to be had, there were easier, more practical ways in which they could be gained.  Unfortunately, so inherent were the drawbacks that the problems proved insoluble.

Friday, October 22, 2021

Loafer

Loafer (pronounced loh-fer)

(1) A person who loafs about; a lazy idler; a lay-about.

(2) A name for a moccasin-like, laceless, slip-on shoe, worn by both men and women.

(3) In some south-western US dialects, a wolf, especially a grey or timber wolf (often in the compound form “loafer wolf”).

1830: The construct was loaf + -er.  Loaf was from the Middle English lof & laf, from the Old English hlāf (bread, loaf of bread), from the Proto-West Germanic hlaib, from the Proto-Germanic hlaibaz (bread, loaf), of uncertain origin but which may be related to the Old English hlifian (to stand out prominently, tower up).  It was cognate with the Scots laif (loaf), the German Laib (loaf), the Swedish lev (loaf), the Russian хлеб (xleb) (bread, loaf) and the Polish chleb (bread).  It was used to mean (1) a block of bread after baking, (2) any solid block of food, such as meat or sugar, (3) a solid block of soap, from which a standard bar (or cake) of soap is cut or (4) in cellular automata, a particular still life configuration with seven living cells.  The origin of “use your loaf” meaning “think about it” in Cockney rhyming slang was as a shortened form of “loaf of bread” (ie “use your head”).  The –er suffix was from the Middle English –er & -ere, from the Old English -ere, from the Proto-Germanic -ārijaz, thought most likely to have been borrowed from the Latin –ārius where, as a suffix, it was used to form adjectives from nouns or numerals.  In English, the –er suffix, when added to a verb, created an agent noun: the person or thing doing the action indicated by the root verb.  The use in English was reinforced by the synonymous but unrelated Old French –or & -eor (the Anglo-Norman variant -our), from the Latin -ātor & -tor, from the primitive Indo-European -tōr.  When appended to a noun, it created the noun denoting an occupation or describing the person whose occupation is the noun.  Loafer & loafing are nouns & verbs, loafed, loafering & loafered are verbs and loaferish is an adjective; the noun plural is loafers.

The use to describe “a lazy idler” was first documented in 1830 as an Americanism which may have been short for landloafer (vagabond), similar (though not necessarily related) to the obsolete nineteenth century German Landläufer (vagabond) or the Dutch landloper.  Etymologists suggest landloafer may have been a partial loan-translation of the German Landläufer as “land loper” (and may be compared with the dialectal German loofen (to run) and the English landlouper) but this has little support and most regard a more likely connection being the Middle English love, loove, loffinge & looffinge (a remnant, the rest, that which remains or lingers), from the Old English lāf (remainder, residue, what is left), which was akin to the Scots lave (the rest, remainder) and the Old English lǣfan (to let remain, leave behind).  One amusing coincidence is that the Old English hlaf-aeta (household servant) translated literally as “loaf-eater” (ie, one who eats the bread of his master), suggesting the Anglo-Saxons might still have felt the etymological sense of their lord & master as the “loaf-guard”.  The expression “one mustn't despair because one slice has been cut from the loaf” describes a pragmatic reaction to learning one's unmarried daughter has been de-flowered and is said to be of Yiddish origin but no source has ever been cited.  In modern idiomatic use, the derived phrases “a slice off a cut loaf is never missed” and “you never miss a slice from a cut loaf” refer to having enjoyed sexual intercourse with someone who is not a virgin, the idea being that once the end of a loaf (the crust) has been removed, it's not immediately obvious how many slices have been cut.

The loafer is a style, a slip-on shoe which is essentially a slipper designed as an all-weather shoe for outdoor use.  They’re available in a wide range of styles from many manufacturers and this image shows just a few of the dozens recently offered by Gucci.  In the old Soviet Union (the USSR; 1922-1991), there were usually two styles (when available): one for men and one for women, both (sometimes) available in black or brown.

The verb loaf was first documented in 1835 in US English, apparently a back-formation from the earlier loafer, and loafed & loafing soon emerged.  The noun in the sense of “an act of loafing” was in use by 1855.  What constitutes loafing is very much subjective; a student underachieving in Latin might be thought a loafer by a professor of classics but the “hard working, much published” don who in his whole career never lifted anything much heavier than a book would probably be dismissed as “a loafer” by the laborer digging the trench beneath his study.  A “tavern loafer” was one who spent his hours drinking in bars while a “street loafer” was a synonym for a “delinquent who hung about on street corners”.  Loafer as a description of footwear dates from 1937 and it was used of lace-less, slip-on shoes worn on less formal occasions (essentially slippers designed for outdoor use), a popular early version of which was the “penny loafer”, so named because it featured an ornamental slotted leather band across the upper where a coin was often mounted.  The use in some south-western dialects of “loafer” or “loafer wolf” to describe a grey or timber wolf is based on the American Spanish lobo (wolf), reinterpreted as or conflated with loafer (idler).

Rowan Williams (b 1950; Archbishop of Canterbury 2002-2012) admiring Benedict XVI’s (1927–2022; pope 2005-2013, pope emeritus 2013-2022) red loafers, Lambeth Palace, September 2010.

When in 2013 Benedict announced he was resigning the papacy, there was much discussion of what might be the doctrinal or political implications but a few fashionistas also bid farewell to the best-dressed pontiff for probably a century and the one Esquire magazine had named “accessorizer of the year”.  In recent memory, the world had become accustomed to the white-robed John Paul II (1920–2005; pope 1978-2005) who would don colorful garments for ceremonial occasions but never wore them with great élan and eschewed the more elaborate vestments, perhaps influenced by Paul VI (1897-1978; pope 1963-1978) whose reign was marked by a gradual sartorial simplification; he was the last pope to wear the triple tiara which had since the early Middle Ages been a symbol of papal authority, and briefly it sat on his head on the day of his coronation before, in an “act of humility”, it was placed on the altar where, symbolically, it has since remained.

The pope and the archbishop discuss the practicalities of cobbling.

Benedict’s pontificate however was eight stylish years, the immaculately tailored white caped cassock (the simar) his core piece, of such monochromatic simplicity that it drew attention to the many adornments and accessories he used, which included billowing scarlet satin chasubles trimmed with crimson velvet and delicate gold piping and others woven in emerald-green watered silk with a pattern of golden stars.  Much admired also were the mozzetta, a waist-length cape, and the camauro, a red velvet cap with a white fur border which around the world people compared with the usual dress of Santa Claus, X (then known as twitter) quickly fleshing out the history of the Coca-Cola Corporation’s role in creating the “uniform”, although there was some exaggeration, the Santa suit and hat familiar by at least the 1870s although Coca-Cola’s use in advertising did seem to drive out all colors except red.  On popes however, the red velvet and white fur trim had been around for centuries though it fell from fashion after the Second Vatican Council (Vatican II; 1962-1965) and was thus a novelty when Benedict revived the style.

The pope farewells the archbishop.

Not all (including some cardinals) appreciated the papal bling but what attracted most attention were his bright red loafers, a style of shoe popes have been depicted wearing since Roman times, and the Holy See was forced to issue a statement denying they were hand-crafted by the high-end Italian fashion house Prada.  In its press release, the Vatican’s Press Office reminded the world the red symbolizes martyrdom and the Passion of Christ, the shoes there to signify the pope following in the footsteps of Christ.  Rather than a fashion house, the papal loafers were the work of two Italian artisan cobblers: Adriano Stefanelli and Antonio Arellano.  Signor Stefanelli’s connections with the Vatican began when he offered to make shoes for John Paul II after noticing his obvious discomfort during a television broadcast while Signor Arellano had a longer link with Benedict’s feet, having been his cobbler when, as Joseph Ratzinger, he was the cardinal heading the Inquisition (now called the Dicastery for the Doctrine of the Faith (DDF)) and as soon as Benedict’s surprise elevation was announced, he went immediately to his last and made a pair of red loafers for him (he’s an Italian size 42 (a UK 8 & a US 9)).  Upon his resignation, as pope emeritus, he retired the red loafers in favor of three pairs (two burgundy, one brown) which were a gift from a Mexican cobbler: Armando Martin Dueñas.  Pope Francis (b 1936; pope since 2013) has reverted to the austere ways of Vatican II and wears black shoes.

Channeling Benedict: Lindsay Lohan in red loafers, September 2016.  Although unconfirmed, it's believed these were not a papal gift.

Thursday, October 21, 2021

Acrimony

Acrimony (pronounced ak-ruh-moh-nee)

Sharpness, harshness, or bitterness of nature, speech, disposition, animosity, spitefulness or asperity; a state of or expression of enmity, hatred or loathing.

1535-1540: From the Middle French acrimonie (quality of being sharp or pungent in taste) or directly from the Latin acrimonia (sharpness, pungency of taste), figuratively “acrimony, severity, energy”, an abstract noun from acer (feminine acris) (sharp), from the primitive Indo-European root ak- (be sharp, rise (out) to a point, pierce) + -monia or –mony (the suffix of action, state, condition).  Figurative extension to personal sharpness, bitterness and hatred was well established by 1610 and has long been the dominant meaning; the application to describe even a dislike of someone “irritating in manner” was noted from 1775.  The adjectival form acrimonious dates from circa 1610, from the French acrimonieux, from the Medieval Latin acrimoniosus and, again, is now usually figurative of dispositions, the use referencing taste or smell now entirely obsolete.

In the West, democratic politics had, sometime in the nineteenth century, evolved into the form today familiar: a contest between parties or aggregations sometimes described otherwise but which behave like parties.  There’s much variation, there are systems which usually have two parties and some which tend towards more and there are those with electoral mechanisms which cause distortions compared with the results the parties actually achieve but the general model is that of a contest between parties.  What that generates can be fun to watch but what’s more amusing is the contest within parties in which there’s more fear, hatred, loathing and acrimony than anything transacted between one party and another.  Sometimes these hatreds arise out of some ideological difference and sometimes it’s just a visceral personal hatred between people who detest each other.

Paris Hilton (left) and Lindsay Lohan (right), September 2004.

In December 2021, Paris Hilton revealed she and Lindsay Lohan had ended their acrimony of a decade-odd, because they’re “not in high school” and the renewal of the entente cordiale seems to have been initiated the previous month when Ms Lohan announced her engagement.  In her podcast This Is Paris, Ms Hilton observed “I know we’ve had our differences in the past, but I just wanted to say congratulations to her and that I am genuinely very happy for her”, reflecting on the changes in their lives and that of Britney Spears, the three who had been dubbed “The Holy Trinity” after Rupert Murdoch’s (b 1931) New York Post in 2006 published the infamous “Bimbo Summit” front page with the three seated in Ms Hilton’s Mercedes-Benz SLR McLaren (C199; 2003-2010).  Perhaps feeling nostalgic, she added “It just makes me so happy to see, you know, 15 years later, and just so much has happened in the past two weeks… I got married, Britney got her freedom back and engaged, and then Lindsay just got engaged. So I love just seeing how different our lives are now and just how much we’ve all grown up and just having love in our lives.”  She concluded with: “And I think that love is the most important thing in life, it’s something that really just changes you and makes you grow, and when you find that special person that is your other half and is your best friend and you can trust…that’s just an amazing feeling.”  Both recently became mothers and exchanged best wishes.

Acrimony in action

At prayer together: Former Australian prime ministers Kevin Rudd and Julia Gillard, Parliament House, Canberra ACT, Australia.

More than one political leader has observed the secret to a successful relationship between a leader and deputy was to make sure neither wanted the other’s job and the Australian Labor Party's (ALP) Kevin Rudd (b 1957; Australian prime-minister 2007-2010 & 2013) should have listened, though listening was not a quality for which he was noted.  Winning a convincing victory and supported by a groundswell of goodwill not seen in a generation, in December 2007 he became Prime Minister of Australia with Julia Gillard (b 1961; Australian prime minister 2010-2013) as deputy.  Things went well for a while, then they went bad and Gillard staged a coup.  After the hatchet men had counted the numbers, late one mid-winter night in June 2010, the deed was done and with Rudd politically defenestrated, Gillard was installed as Australia’s first female prime minister.

Beyond the beltway, the coup wasn’t well received, the voters appearing to take the view that while Rudd might have turned out to be a dud, it was their right to plunge the electoral dagger through the heart, not have the job done by the faceless men stabbing him in the back.  The support which had gained Rudd a healthy majority in 2007 declined and the 2010 election produced the first hung parliament since 1940; Gillard was forced to form a minority government which relied on the support of a Green Party MP and three soft-drink salesmen, all with their own price to be paid.  Perhaps surprisingly, the parliament actually ran quite well and was legislatively productive, a thing good or bad depending on one’s view of what was passed but there certainly wasn’t the deadlock some had predicted.

Julia Gillard and Kevin Rudd.

A slender majority of the politicians may have been pleased but the electorate remained unconvinced and however bad Rudd’s standing had been in 2010, by 2013 Gillard’s was worse and, after a splutter or two, in June 2013, the long months of instability came to a head and the hatchet men (this time aligned with Rudd) again assembled to do their dirty day's work.  Rudd took his vengeance, regaining the prime-ministership, and Gillard retired from politics though the victory proved pyrrhic, Rudd defeated in the general election three months into his second coming, the spin being that his accession to the leadership saved his party from what would have been a much worse defeat under Gillard.  Pyrrhic it may have been but he got his revenge so there’ll always be that.

In 2013, Rudd had lost to Tony Abbott (b 1957; Australian prime-minister 2013-2015) who gained quite a good swing, picking up a healthy majority which conventional political wisdom suggested should have guaranteed the Liberal-National coalition at least two terms in office.  Unfortunately, it wasn’t until two years into his term that some in the Liberal Party worked out there’d been a filing error and Mr Abbott thought he’d joined the Democratic Labor Party (the DLP, the Catholic Church’s political wing in Australia) and genuinely believed he was leading a DLP government.  Malcolm Turnbull (b 1954; Australian prime-minister 2015-2018), who in 2009 had been usurped as opposition leader by Mr Abbott’s hatchet men, sniffed blood and assembled his henchmen, handing out the axes for what proved to be one of the longer slow-motion coups.  It culminated in a party-room vote in September 2015 when Mr Turnbull took his revenge, assuming the party leadership and becoming prime-minister.

This time, things went really well, Turnbull’s victory greeted with genuine optimism exceeding even that which had swept Rudd to power in 2007 but it didn’t last and Turnbull missed his historic moment to go to the polls.  Like Sir John Gorton (1911-2002; Australian prime-minister 1968-1971) in 1968 and Gordon Brown (b 1951; UK prime-minister 2007-2010) in 2007, he didn’t seize the moment and do what Anthony Eden (1897-1977; UK prime-minister 1955-1957) did in 1955: realise this is as good as it's going to get and that to delay will only make things worse.  Things got worse and after the 2016 election, although the government was returned, Turnbull's majority was greatly reduced.  To some extent, the electoral reversal could be accounted for by Turnbull using the campaign slogan “Continuity with Change”, ridiculed almost immediately because it turned out to have been used in a US television comedy as an example of the sort of cynical, meaningless slogans around which election campaigns are now built.  The 3WS (three word slogan) is actually a good idea in the social media age but whereas Mr Abbott was a master at using them in the propaganda technique perfected by the Nazis (simple messages endlessly repeated) and made “Stop the Boats”, “Big Fat Tax” et al potent electoral weapons, Mr Turnbull proved not so adept.  His successors wouldn't make the same mistake of over-estimating the voters' hunger for intelligent discussion.

God expresses his disapproval: Lightning strike in Wentworth on the day of the Wentworth by-election, 20 October 2018.

That couldn’t last either and it didn’t.  Turnbull had problems during his premiership, some of his own making but most not, and in August 2018, the hatchet men of the other faction staged one of the more interesting coups and certainly one of the more convoluted, the challenger, whatever his original intention might have been, acting as a stalking horse, unsuccessful in his challenge but triggering a demand for a second vote which Turnbull, reading the tea leaves, declined to contest.  Scott Morrison (b 1968; Australian prime-minister 2018-2022), perhaps an even too convincing Vicar of Bray, then won the leadership against two opponents from the right and left, both improbable as prime minister in their own ways.  Turnbull resigned from Parliament, triggering a most unwelcome by-election in which the Liberal Party lost the seat of Wentworth to an independent, thereby losing its absolute majority on the floor of the house.  The dish of vengeance had been served hot and eaten cold.

Morrison went on to secure an unexpected victory in the 2019 election.  There’s been a bit of commentary about that surprise result but, given the circumstances, it can’t be denied it was a personal triumph and for the first time in almost a decade, the country had a prime minister who could govern with the knowledge none of his colleagues were (obviously) plotting against him.  His term, like all administrations, had its ups and downs but unlike many, he ultimately didn't gain a benefit from the COVID-19 pandemic and was defeated in the 2022 election.  Because the acrimonies between some of the leading figures in the new ALP government were well-known, political junkies were looking forward to a new round of back-stabbing and shark-feeding but thus far, the tensions have remained (mostly) well-hidden from public view.

Wednesday, October 20, 2021

Puffer

Puffer (pronounced puhf-er)

(1) A person or thing that puffs.

(2) Any of various fishes of the family Tetraodontidae, noted for the defense mechanism of inflating (puffing up) its body with water or air until it resembles a globe, the spines in the skin becoming erect; several species contain the potent nerve poison tetrodotoxin.  Also called the blowfish or globefish.

(3) In contract law, the casual term for someone who produces “mere puff” or “mere puffery”, the term for the type of exaggerated commercial claim tolerated by law.

(4) In cellular automaton modelling (a branch of mathematics and computer science), a finite pattern that leaves a trail of debris.

(5) In auctioneering, one employed by the owner or seller of goods sold at auction to bid up the price; a by-bidder (now rare, the term “shill bidders” or “shills” more common).

(6) In marine zoology, the common (or harbour) porpoise.

(7) A kier used in dyeing.

(8) In glassblowing, a soffietta (a usually swan-necked metal tube, attached to a conical nozzle).

(9) Early post-war slang for one who takes drugs by smoking and inhaling.

(10) In mountaineering (and latterly in fashion), an insulated, often highly stylized puffy jacket or coat, stuffed with various forms of insulation.

(11) As Clyde puffer, a type of cargo ship used in the Clyde estuary and off the west coast of Scotland.

(12) In electronics and electrical engineering, a type of circuit breaker.

(13) A manually operated medical device used for delivering medicines into the lungs.

(14) As puffer machine, a security device used to detect explosives and illegal drugs at airports and other sensitive facilities.

(15) In automotive engineering, a slang term for forced induction (supercharger & turbocharger), always less common than “blower”.

1620–1630: The construct was puff + -er.  Puff is from the Middle English puff & puf, from the Old English pyf (a blast of wind, puff).  It was cognate with the Middle Low German puf & pof.  The –er suffix is from the Middle English –er & -ere, from the Old English -ere, from the Proto-Germanic -ārijaz, thought usually to have been borrowed from the Latin –ārius and reinforced by the synonymous but unrelated Old French –or & -eor (the Anglo-Norman variant was -our), from the Latin -(ā)tor, from the primitive Indo-European -tōr.  Added to verbs, it forms an agent noun (typically a person or thing that does the action indicated by the root verb).  The original form from the 1620s was as an agent noun from the verb puff, the earliest reference to those who puffed on tobacco, soon extended to steamboats and steam engines generally when they appeared.  The sense of “one who praises or extols with exaggerated commendation” is from 1736 which, as “mere puff” or “mere puffery”, in 1892 entered the rules of contract law in Carlill v Carbolic Smoke Ball Company ([1892] 2 QB 484 (QBD)) as part of the construction limiting the definition of misrepresentation.  The remarkable fish which inflates itself in defense was first noted in 1814, the meanings relating to machinery being adopted as the industrial revolution progressed although the more virile “blower” was always preferred as a reference to supercharging, puffer more appropriate for the hand-held inhalers used by those suffering a variety of respiratory conditions.

Puffer Jackets and beyond

Calf-length puffer coats.

The first down jacket, a lightweight, waterproof and warm coat for use in cold places or at altitude and known originally as an eiderdown coat, appears to be the one designed by Australian chemist George Finch (1888-1970) for the 1922 Everest expedition but a more recognizable ancestor was the Skyliner, created by American Eddie Bauer (1899-1986) in 1936, his inspiration being the experience of nearly losing his life to hypothermia on a mid-winter fishing trip.  Using trapped air warmed by the body as a diver’s wet suit uses water, Bauer designed for warmth and protection but he also created a visual style, one copied in 1939 by Anglo-American fashion designer Charles James (1906-1978) for his pneumatic jacket, the Michelin Man-like motif defining the classic puffer look to this day.

Lindsay Lohan in puffer vest with Ugg boots, Salt Lake City, Utah, 2013 (left) and in puffer jacket, New York City, 2018 (right).

It was in the late 1940s that the puffer began to enjoy acceptance as a fashion item, marketed as evening wear, and it was sold in this niche in a variety of materials until the 1970s when a new generation of synthetic fibres offered designers more possibilities, including the opportunity to create garments with the then novel characteristic of being simultaneously bulky and lightweight yet able to retain sculptured, stylized shapes.  These attributes enabled puffer jackets to be made for the women’s market, some using a layering technique to create the effect, and these were instantly popular.  Although initially in mostly dark or subdued colors, by the 1980s, vibrant colors had emerged as a trend, especially in Italy and England.  By the twenty-first century, although available across a wide price spectrum, the puffer as a style cut across class barriers although those selling the more expensive did deploy their usual tricks to offer their buyers class identifiers, some discreet, some not.

The puffer started life as a jacket and it took a long time to grow but by the 2000s, calf-length puffers had appeared as a retail item after attracting comment, not always favorable, on the catwalks.  Although not selling in the volumes of the jackets, the costs of lengthening can’t have been high because ankle and even floor-length puffers followed.  Down there it might have stopped but, in their fall 2018 collection released during Milan Fashion Week, Italian fashion house Moncler, noted for their skiwear, showed puffer evening gowns, the result of a collaborative venture with Valentino’s designers.  Available in designer colors as well as glossy black, the line was offered as a limited edition, which was probably one of the industry’s less necessary announcements given the very nature of the things would tend anyway to limit sales.  The ploy though did seem to work; even at US$2,700 for the long dress and a bargain US$3,565 for the cocoon-like winter cape, demand was said to exceed supply so, even if not often worn, puffer gowns may be a genuine collector’s item.

A Dalek.

It wasn’t clear what might have been the inspiration for the conical lines although the ubiquity of the shape in industrial equipment was noted.  It seemed variously an homage to the burka, a sculptural installation of sleeping bags or the stair-challenged Daleks, the evil alien hybrids of the BBC's Dr Who TV series.  It also picked up existing motifs from fashion design, appearing even as a playful hybrid of the mullet dress and a cloak.

A monolith somewhere may also have been a reference point but the puffer gown was not stylistically monolithic.  Although to describe the collection as mix-n-match might be misleading, as well as designer colors, some of the pieces technically were jackets, there were sleeves, long and short, and though most hems went to the floor, the mullet offered variety, especially for those drawn to color combinations.  Most daring, at least in this context, were the sleeveless, some critics suggesting this worked best with gowns cinched at the middle.


By the time of the commercial release early in 2019, solid colors weren’t the only offering, the range reflecting the influence of Ethiopian patterns although, in a nod to the realities of life, only puffer jackets were made available for purchase.  Tantalizingly (or ominously, depending on one’s view), Moncler indicated the work was part of what they called their “genius series”, the brand intending in the future to collaborate with other designers as well as creating a series of Moncler events in different cities, the stated aim to “showcase the artistic genius found in every city”.  The venture was pursued but in subsequent collections, many found the quality of genius perhaps too subtly executed for anyone but fellow designers and magazine editors to applaud.  The shock of the new has become harder to achieve.