Wednesday, October 25, 2023

Physiognomy

Physiognomy (pronounced fiz-ee-og-nuh-mee or fiz-ee-on-uh-mee)

(1) The face or countenance, especially when considered as an index to the character.

(2) The art of determining character or personal characteristics from the form or features of the body, especially of the face (also called anthroposcopy).

(3) The outward appearance of anything, taken as offering some insight into its character.

(4) Estimation of one's character and mental qualities by a study of the face and general bodily carriage.

1350-1400: From the Late Middle English phisognomie or phisiognomie (art of judging character from facial features), from the Medieval Latin physionomia, from the Late Latin physonomia, from the Late Greek physiognōmía, a syncopated variant of physiognōmonía (art of judging a person by his features).  The construct in Greek was physio- (from physis (nature)) + gnōmōn (genitive gnōmōnos) (a judge, interpreter, indicator), from the primitive Indo-European root gno- (to know).  The word soon replaced the earlier Middle English forms fisenamie, fisnamie & fisnomie (from the Middle French fisonomie & the Old French phizonomie).  There was also the strange medieval linguistic cul-de-sac phusiognōmia, an erroneous form of phusiognōmonia (from phusis (nature) + gnōmōn (judge)).  The related form still occasionally used in technical literature is physiognomical.

There’s nothing in science which proves physiognomy can’t be anthropomorphic.  The website Cats that look like Hitler accepts submissions from owners, victims and others who contribute photographs of führeresque felines.  Everyone knows one can tell if a cat is evil by looking into its eyes.

In the same way astrology was once a respectable science taught in universities, physiognomy was well accepted by ancient Greek philosophers though it fell into disrepute in medieval times when practiced by crooks and con-men.  As a pseudo-science, it was later revived and briefly popularized by the Swiss theologian Johann Kaspar Lavater (1741–1801) who actually enjoyed a friendship with Goethe (Johann Wolfgang von Goethe; 1749–1832) until the author distanced himself, finding the priest just too weird and superstitious.  There was a revival of interest in the late nineteenth century when people started to apply their own interpretations of Darwin’s theories and observations, physiognomy and physical anthropology among the techniques used as a basis for “scientific” racism.  This period of its popularity, not coincidentally, happened during the height of European colonialism.  Of late, there’s been a revival of interest because of the rapid improvements in artificial intelligence (AI) and machine learning for facial recognition.  With big data-sets it becomes, at least theoretically, possible to determine whether correlations really do exist between facial appearance, personality and behavior.

Using just an image of Lindsay Lohan downloaded from Wikipedia, the analysis engine used by physiognomica.com was able to determine she "practices self-denial" and "gets inspired and fantasizes".  The promotional material for the website begins by asking: "Do you want to learn more about the people surrounding you?"  It promises to "open all the secrets of identity" so that:

"All human secrets will be accessible to you. Is someone capable of lying, is he/she kind, likely to get rich, intelligent or lazy? After analyzing just one photo, we will tell you about the personality and the motives that lead this person. Everything is spread before the eyes. And it is absolutely free of charge.  Nothing stays buried forever!  You will be literally reading people’s minds.  Even before having a conversation with the person you will know exactly how to behave.  From now on it will be easy to evaluate motives of the person.  No more lie detectors and without long discussions with the psychologist."

Tuesday, October 24, 2023

Dot

Dot (pronounced dot)

(1) A small, roundish mark made with or as if with a pen.

(2) A minute or small spot on a surface; speck.

(3) Anything relatively small or speck-like.

(4) A small specimen, section, amount, or portion (the use meaning “a lump or clod” long obsolete).

(5) In grammar, a punctuation mark used to indicate the end of a sentence or an abbreviated part of a word; a full stop; a period.

(6) In the Latin script, a point used as a diacritical mark above or below various letters, as in Ȧ, Ạ & Ċ.

(7) In computing, a differentiation point in internet addresses etc and, in file names, a separation device (historically a marker between the filename and file type when only one dot per name was permitted in early file systems, the best known of which was the 8.3 convention used by the various iterations of CP/M & DOS (command.com, image.tif, config.sys etc)).

(8) In music, a point placed after a note or rest, to indicate that the duration of the note or rest is to be increased one half. A double dot further increases the duration by one half the value of the single dot; a point placed under or over a note to indicate that it is to be played staccato.

(9) In telegraphy, a signal of shorter duration than a dash, used in groups along with groups of dashes (-) and spaces to represent letters, as in Morse code.

(10) In printing, an individual element in a halftone reproduction.

(11) In printing, the mark that appears above the main stem of the letters i, j.

(12) In the sport of cricket, as “dot ball”, a delivery from which no runs are scored.

(13) In ballistic slang, as “dotty”, (1) buckshot, the projectile fired from a shotgun or (2) the weapon itself.

(14) A female given name, a clipped form of Dorothea or Dorothy.

(15) An abbreviation in many jurisdictions for Department of Transportation (or Transport).

(16) In mathematics and logic, a symbol (·) indicating multiplication or logical conjunction; an indicator of the dot product of vectors: X · Y.

(17) In mathematics, the decimal point (.), used for separating the fractional part of a decimal number from the whole part.

(18) In computing and printing, as dot matrix, a reference to the method of assembling shapes by the use of dots (of various shapes) in a given space (a sketch of the idea appears below, after these definitions).  In casual (and commercial) use, the term attached to impact printers which used hammers with dot-shaped heads to strike a ribbon against the paper (or other surface) to produce representations of shapes, including text.  Technically, laser printers also use a dot matrix in shape formation but the use to describe impact printers caught on and became generic.  The term “dots per inch” (DPI) is a measure of resolution, literally the number of dots in a given area.  Historically, impact printers were sold on the basis of the number of pins (hammers; typically 9, 18 or 24) in the print head, which was indicative of the quality of print although some software could enhance the effect.

(19) In civil law, a woman's dowry.

(20) In video gaming, the abbreviation for “damage over time”, an attack that results in light or moderate damage when it is dealt, but that wounds or weakens the receiving character, who continues to lose health in small increments for a specified period of time, or until healed by a spell or some potion picked up.

(21) To mark with or as if with a dot or dots; to make a dot-like shape.

(22) To stud or diversify with or as if with dots (often in the form “…dotting the landscape…” etc).

(23) To form or cover with dots (such as “the dotted line”).

(24) In colloquial use, to punch someone.

(25) In cooking, to sprinkle with dabs of butter, chocolate etc.
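For those curious about the mechanics mentioned in definition 18, here is a minimal sketch (in Python; the 5x7 bitmap is hand-drawn for illustration, not taken from any real printer's character ROM) of how a dot matrix builds a glyph from individual dots, one row of pins at a time:

    # Each row is five bits; a set bit means "fire the pin here".
    GLYPH_I = [
        0b00100,
        0b00000,
        0b01100,
        0b00100,
        0b00100,
        0b00100,
        0b01110,
    ]

    def print_glyph(rows, on="*", off=" "):
        # Emit one text line per pin row, a mark wherever a bit is set.
        for row in rows:
            print("".join(on if row & (1 << (4 - col)) else off for col in range(5)))

    print_glyph(GLYPH_I)  # renders a crude lower-case "i" built entirely from dots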

Pre 1000: It may have been related to the Old English dott (head of a boil) although there’s no evidence of such use in Middle English.  Dottle & dit were both derivatives of the Old English dyttan (to stop up (and again, probably from dott)) and were cognate with the Old High German tutta (nipple), the Norwegian dott and the Dutch dott (lump).  Unfortunately there seems no link between dit and the modern slang zit (pimple), a creation of US English unknown until the 1960s.  The Middle English dot & dotte were from the Old English dott in the de-elaborated sense of “a dot, a point on a surface”, from the Proto-West Germanic dott, from the Proto-Germanic duttaz (wisp) and were cognate with the Saterland Frisian Dot & Dotte (a clump), the Dutch dot (lump, knot, clod), the Low German Dutte (a plug) and the Swedish dott (a little heap, bunch, clump).  The use in civil law where dot refers to “a woman's dowry” dates from the early 1820s and was from the French, from the Latin dōtem, accusative of dōs (dowry) and related to dōtāre (to endow) and dāre (to give).  For technical or descriptive reasons dot is a modifier or modified as required, the forms including centered dot, centred dot, middle dot, polka dot, chroma dot, day dot, dot-com, dot-comer (or dot-commer), dot release and dots per inch (DPI).  The synonyms can (depending on context) include dab, droplet, fleck, speck, pepper, sprinkle, stud, atom, circle, grain, iota, jot, mite, mote, particle, period, pinpoint, point, spot and fragment.  Dot & dotting are nouns & verbs, dotter is a noun, dotlike & dotal are adjectives, dotted is an adjective & verb and dotty is a noun & adjective; the noun plural is dots.

Although in existence for centuries, and revived with the modern meaning (mark) in the early sixteenth century, the word appears not to have been in common use until the eighteenth and in music, the use to mean “point indicating a note is to be lengthened by half” appears by at least 1806.  The use in the Morse code used first on telegraphs dates from 1838 and the phrase “on the dot” (punctual) is documented since 1909, in reference to the (sometimes imagined) dots on a clock’s dial face.  In computing, “dot-matrix” (printing and screen display) seems first to have been used in 1975 although the processes referenced had by then been in use for decades.  The term “dotted line” is documented since the 1690s.  The verb dot (mark with a dot or dots) developed from the noun and emerged in the mid eighteenth century.  The adjective dotty as early as the fourteenth century meant “someone silly” and was from "dotty poll" (dotty head), the first element from the earlier verb dote.  By 1812 it meant also literally “full of dots” while the use to describe shotguns, their loads and the pattern made on a target was from the early twentieth century.  The word microdot was adopted in 1971 to describe tiny capsules of lysergic acid diethylamide (LSD or “acid”); in the early post-war years (most sources cite 1946) it was used in the espionage community to describe an extremely reduced photograph able to be disguised as a period dot on a typewritten manuscript.

Lindsay Lohan in polka-dots, enjoying a frozen hot chocolate, Serendipity 3 restaurant, New York, 7 January 2019.

The polka-dot (a pattern consisting of dots of uniform size and arrangement, especially on fabric) dates from 1844 and was from the French polka, from the German Polka, probably from the Czech polka (the dance, literally "Polish woman" (Polish Polka), the feminine form of Polak (a Pole)).  The word might instead be a variant of the Czech půlka (half; půl the truncated version of půlka used in special cases (eg telling the time à la the English “half four”)), a reference to the half-steps of Bohemian peasant dances.  It may even be influenced by or an actual merger of both.  The dance first came into vogue in 1835 in Prague, reaching London in the spring of 1842; Johann Strauss (the younger) wrote many polkas.  Polka was a verb by 1846 as (briefly) was polk; notoriously, polka-dot is sometimes mispronounced as poke-a-dot.

In idiomatic use, to “dot one's i's and cross one's t's” is to be meticulous in seeking precision; an attention to even the smallest detail.  To be “on the dot” is to be exactly correct or to have arrived exactly at the time specified.  The idea of “joining the dots” or “connecting the dots” is to make connections between various pieces of data to produce useful information.  In software, the process is literal in that it refers to a program “learning” how accurately to fill in the missing pieces of information between the data points generated or captured.  “The year dot” is an informal expression which means “as long ago as can be remembered”.  To “sign on the dotted line” is to add one’s signature in the execution of a document (although there may be no actual dotted line on which to sign).
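As a minimal illustration of the software sense (a Python sketch; the data points are invented for the example), “joining the dots” amounts to interpolating between known values:

    # Known "dots": sorted x positions and the values measured there.
    known_x = [0.0, 1.0, 2.0, 4.0]
    known_y = [0.0, 2.0, 3.0, 5.0]

    def interpolate(x, xs, ys):
        # Linear interpolation: find the bracketing pair of dots and blend.
        for i in range(len(xs) - 1):
            if xs[i] <= x <= xs[i + 1]:
                t = (x - xs[i]) / (xs[i + 1] - xs[i])
                return ys[i] + t * (ys[i + 1] - ys[i])
        raise ValueError("x outside the known range")

    print(interpolate(3.0, known_x, known_y))  # 4.0, halfway between the dots at x=2 and x=4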

Dots, floating points, the decimal point and the Floating Point Unit (FPU) 

When handling numbers, decimal points (the dot) are of great significance.  In cosmology a tiny difference in values beyond the dot can mean the difference between hitting one’s target and missing by thousands of miles and in finance the placement can dictate the difference between ending up rich or poor.  Vital then, although not all were much bothered: when Lord Randolph Churchill (1849–1895) was Chancellor of the Exchequer (1886), he found the decimal point “tiresome”, telling the Treasury officials “those damned dots” were not his concern and, according to the mandarins, he was inclined to “round up to the nearest thousand or million as the case may be”.  His son (Winston Churchill (1875-1965; UK prime-minister 1940-1945 & 1951-1955)) paid greater attention to the dots when Chancellor (1924-1929) but his term at 11 Downing Street, although longer, remains less well-regarded.

In some (big, small or complex) mathematical computations performed on computers, the placement of the dot is vital.  What are called “floating-point operations” are accomplished using a representation of real numbers which can’t be handled in the usual way; real numbers, decimals & fractions can all be defined or approximated using floating-point representation, a numerical value being represented by (1) a sign, (2) a significand and (3) an exponent.  The sign indicates whether the number is positive or negative, the significand is a representation of the fractional part of the number and the exponent determines the number’s scale.  In computing, the attraction of floating-point representation is that a range of values can be represented with a relatively small number of bits and although the capability of computers has massively increased, so have the ambitions of those performing big, small or complex number calculations, so the utility remains important.  At the margins however (very big & very small), the finite precision of traditional computers will inevitably result in “rounding errors” so there can be some degree of uncertainty, something compounded by there being even an “uncertainty about the uncertainty”.  Floating-point calculations therefore solve many problems and create others, the core problem being there will be instances where the problems are not apparent.  Opinion seems divided on whether quantum computing will mean the uncertainty will vanish (at least with the very big if not the very small).
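A short demonstration (in Python, whose floats are the IEEE 754 double-precision type used by most modern hardware; nothing here is specific to any one system) of both the rounding errors and the sign/significand/exponent decomposition described above:

    import math

    # Rounding error: 0.1 and 0.2 have no exact binary representation,
    # so their sum is not exactly 0.3.
    x = 0.1 + 0.2
    print(x == 0.3)     # False
    print(f"{x:.17g}")  # 0.30000000000000004

    # math.frexp() splits a float into significand and exponent:
    # m * 2**e with 0.5 <= abs(m) < 1; the sign travels with m.
    m, e = math.frexp(-6.25)
    print(m, e)  # -0.78125 3 (and indeed -0.78125 * 2**3 == -6.25)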

In computer hardware, few pieces have so consistently been the source of problems as floating-point units (FPUs), the so-called “math co-processors”.  Co-processors were an inherent part of the world of the mainframes but came to be thought of as something exotic in personal computers (PCs) because there was such a focus on the central processing unit (CPU) (8086, 68020, i486 et al) and some co-processors (notably graphical processing units (GPUs)) have assumed a cult-like following.  The evolution of the FPU is interesting in that, as manufacturing techniques improved, FPUs were often integrated into the CPU architecture before being separated again when the PC era began, Intel’s early 808x & 8018x complemented by the optional 8087 FPU, the model replicated by the 80286 & 80287 pairing, the latter remaining the only available FPU for almost two years after the introduction of the 80386 (later renamed i386DX in an attempt to differentiate genuine “Intel Inside” silicon from the competition which had taken advantage of the difficulties in trade-marking numbers).  The delay was due to the increasing complexity of FPU designs and flaws were found in the early 387s.

Intel i487SX & i486SX.

Those problems were well-managed by Intel but with the release of the i487SX in 1991 they kicked an own goal.  First displayed in 1989, the i486DX had been not only a considerable advance but also included an integrated FPU (also with some soon-corrected flaws).  That was good but, to grab some of the market share from those making fast 80386DX clones, Intel introduced the i486SX, marketed as a lower-cost chip which was said to be an i486 with a reduced clock speed and without the FPU.  For many users that made sense because anyone doing mostly word processing or other non-number intensive tasks really had little use for the FPU but then Intel introduced the i487SX, an FPU which, in the traditional way, plugged into a socket on the system-board (as even then motherboards were coming to be called) à la a 287 or 387.  However, it transpired the i487SX was functionally almost identical to an i486DX, the only difference being that when plugged-in, it checked to ensure the original i486SX was still on-board, the reason being Intel wanted to ensure no market for used i486SXs (then selling new for hundreds of dollars) emerged.  To achieve this trick, the socket for the i487SX had an additional pin and it was the presence of this which told the system board to disable the i486SX.  The i487SX was not a success and Intel suffered what was coming to be called “reputational damage”.

Dual socket system-board with installed i486SX, the vacant socket able to handle either the i486DX or the i487SX.

The i487SX affair was however a soon-forgotten minor blip in Intel’s upward path.  In 1993, Intel released the first of the Pentium CPUs, all of which were sold with an integrated FPU, establishing what would become Intel’s standard architectural model.  Like the early implementations of the 387 & 487, there were flaws and upon becoming aware of the problem, Intel initiated a rectification programme.  They did not however issue a recall or offer replacements to anyone who had already purchased a flawed Pentium and, after pressure was exerted, undertook to offer replacements only to those users who could establish their pattern of use indicated they would actually be in some way affected.  Because of the nature of the bug, that meant “relatively few”.  The angst however didn’t subside and a comparison was made with a defect in a car which would manifest only if speeds in excess of 125 mph (200 km/h) were sustained for prolonged periods.  Although in that case only “relatively few” might suffer the fault, nobody doubted the manufacturer would be compelled to rectify all examples sold and such was the extent of the reputational damage that Intel was compelled to offer what amounted to a “no questions asked” replacement.  The corporation’s handling of the matter has since often been used as a case study in academic institutions by those studying law, marketing, public relations and such.

Anorexia

Anorexia (pronounced an-uh-rek-see-uh)

(1) In clinical medicine, loss of appetite and inability to eat.

(2) In psychiatry, as anorexia nervosa, a defined eating disorder characterized by fear of becoming fat and refusal of food, leading to debility and even death.

(3) A widely-used (though clinically incorrect) short name for anorexia nervosa.

1590–1600: From the New Latin, from the Ancient Greek ἀνορεξία (anorexía), the construct being ἀν- (an-) (without) + ὄρεξις (órexis) (appetite; desire).  In both the Greek and Latin, it translated literally as "without appetite".  Órexis (appetite, desire) is from oregein (to desire, stretch out) and was cognate with the Latin regere (to keep straight, guide, rule).  Although adopted as a metaphorical device to describe even inanimate objects, anorexia is most often (wrongly) used as verbal shorthand for the clinical condition anorexia nervosa.  The former is the relatively rare condition in which appetite is lost for no apparent reason; the latter the more common eating disorder related in most cases to body image.  Interestingly, within the English-speaking world, there are no variant pronunciations.

Anorexia Nervosa and the DSM

The pro-ana community has created its own sub-set of standard photographic angles, rather as used car sites typically feature certain images such as the interior, the odometer, the engine etc.  Among the most popular images posted on "thinspiration" pages are those which show bone definition through skin and, reflecting the superior contrast possible, there's a tendency to use grayscale, usually converted from color originals.  The favored body parts include the spine, hip bones, clavicles (collar bones) and the shoulder blades.

Although documented since antiquity, the condition in its modern form wasn't noted in western medical literature until an 1873 paper presented to the Royal College of Physicians (RCP) called “Anorexia Hysterica”, a description of a loss of appetite without an apparent gastric cause.  That same year, a similar condition was mentioned in a French publication, called “l’anorexie hystérique”, which described food refusal combined with hyperactivity.  Although the author of the earlier work had within a year changed the descriptor to “Anorexia Nervosa”, the implication in all these papers was of an affliction exclusively female, something very much implied in “l’anorexie hystérique”, hysteria then a mainstream diagnosis and one thought inherently "a condition of women".

A slight Lindsay Lohan demonstrates "an anorexic look" which is something distinct from the clinically defined condition "anorexia nervosa" although there's obviously some overlap.

After its acceptance as a psychogenic disorder in the late nineteenth century, anorexia nervosa (AN) was the first eating disorder placed in the American Psychiatric Association's (APA) Diagnostic and Statistical Manual of Mental Disorders (DSM).  In the first edition (DSM-I (1952)), it was considered a psycho-physiological reaction (a neurotic illness).  In the DSM-II (1968), it was listed with special symptoms & feeding disturbances, which also included pica and rumination.  In DSM-III (1980), eating disorders were classified under disorders of childhood or adolescence, perhaps, at least in part, contributing to the under-diagnosis of later-onset cases.  At that time, the APA created two specific categories that formally recognized the diagnosis of eating disorders: AN and binge eating (called bulimia in DSM-III and bulimia nervosa (BN; binge eating followed by compensatory purging) in both the revised DSM-III (1987) and DSM-IV (1994)).  In the DSM-IV, all other clinically significant eating disorder symptoms were absorbed by the residual categories of eating disorder not otherwise specified (EDNOS) and binge-eating disorder (BED), noting the disorders were the subjects for further research.  Subsequently, when the DSM-IV was revised (2000), eating disorders moved to an independent section.  The DSM-5 (2013) chapter for eating disorders added to the alphabet soup.  In addition to pica, AN, BN and BED, DSM-5 added avoidant/restrictive food intake disorder (ARFID) and other specified feeding or eating disorder (OSFED), the latter including some other peculiar pathological eating patterns, like atypical AN (where all other criteria for AN are met, but weight is in the normal range).

Monday, October 23, 2023

Mini

Mini (pronounced min-ee)

(1) A skirt or dress with a hemline well above the knee, popular since the 1960s.

(2) A small car, built by Austin, Morris, associated companies and successor corporations between 1959-2000, later reprised by BMW in a retro-interpretation.

(3) As minicomputer, a generalized (historic) descriptor for a multi-user computer system smaller than a mainframe; the colloquial term mini was rendered meaningless by technological change (briefly, personal computers (PCs) were known as micros).

(4) A term for anything of a small, reduced, or miniature size.

Early 1900s: A shortened form of miniature, ultimately from the Latin minium (red lead; vermilion), a development influenced by the similarity to minimum and minus.  In English, miniature was borrowed from the late sixteenth century Italian miniatura (manuscript illumination), from miniare (rubricate; to illuminate), from the Latin miniō (to color red), from minium (red lead).  Although uncertain, the source of minium is thought to be Iberian; the vivid shade of vermilion was used to mark particular words in manuscripts.  Despite the almost universal consensus mini is a creation of the twentieth century, there is a suggested link in the 1890s connected with Yiddish and Hebrew.

As a prefix, mini- is a word-forming element meaning "miniature, minor", again abstracted from miniature, with the sense presumed to have been influenced by minimum.  The vogue for mini- as a prefix in English word creation dates from the early 1960s, the prime influences thought to be (1) the small British car, (2) the dresses & skirts with high hemlines and (3) developments in electronic components which permitted smaller versions of products to be created as low-cost consumer goods, although there had been earlier use, a minicam (a miniature camera) advertised as early as 1937.  The mini-skirt (skirt with a hem-line well above the knee) dates from 1965 and the first use of mini-series (television series of short duration and on a single theme) was labelled such in 1971 and since then, mini- has been prefixed to just about everything possible.  To Bridget Jones (from Bridget Jones's Diary (1996), a novel by Helen Fielding (b 1958)), a mini-break was a very short holiday; in previous use in lawn tennis, a mini-break was a point won against the serve in a tiebreak.

Jean Shrimpton and the mini-skirt

Jean Shrimpton, Flemington Racecourse, Melbourne, 1965.

The Victorian Racing Club (VRC) had in 1962 added Fashions on the Field to Melbourne’s Spring Racing Carnival at Flemington and for three years, women showed up with the usual hats and accessories, including gloves and stockings, then de rigueur for ladies of the Melbourne establishment.  Then, on the VRC’s Derby Day in 1965, English model Jean Shrimpton (b 1942) wore a white mini, its hem a daring four inches (100 mm) above the knee.  It caused a stir.

The moment has since been described as the pivotal moment for the introduction of the mini to an international audience, which is probably overstating things, but for Melbourne it was certainly quite a moment.  Anthropologists have documented evidence of the mini in a variety of cultures over the last 4000-odd years so, except perhaps in Melbourne circa 1965, it was nothing new but that didn’t stop the fashion industry having a squabble about who “invented” the mini.  French designer André Courrèges (1923-2016) explicitly claimed the honor, accusing his London rival to the claim, Mary Quant (1930–2023), of merely “commercializing it”.  Courrèges had shown minis at shows in both 1964 and 1965 and his sketches date from 1961.  Quant’s designs are even earlier but given the anthropologists’ findings, it seems a sterile argument.

Minimalism: Lindsay Lohan and the possibilities of the mini.

The Mini

1962 Riley Elf.

The British Motor Corporation (BMC) first released their Mini in 1959, the Morris version called the Mini Minor (a link to the larger Minor, a model then in production) while the companion Austin was the Seven, a re-use of the name of a tiny car of the inter-war years.  The Mini name however caught on and the Seven was re-named Mini early in 1962 although the up-market (and, with modifications to the body, slightly more than merely badge-engineered) versions by Riley and Wolseley were never called Mini, instead adopting names either from or hinting at their more independent past: the Elf and Hornet respectively.  The Mini name was in 1969 separated from Austin and Morris, marketed as a stand-alone marque until 1980 when the Austin name was again appended, an arrangement which lasted until 1988 when finally it reverted to Mini although some were badged as Rovers for export markets.  The Mini remained in production until 2000, by then long antiquated but still out-lasting the Metro, its intended successor.

1969 Austin Maxi 1500.

The allure of the Mini name obviously impressed BMC.  By 1969, BMC had, along with a few others, been absorbed into the Leyland conglomerate and the first release of the merged entity was in the same linguistic tradition: The Maxi.  A harbinger of what was to come, the Maxi encapsulated all that would go wrong within Leyland during the 1970s; a good idea, full of advanced features, poorly developed, badly built, unattractive and with an inadequate service network.  The design was so clever that to this day the space utilization has rarely been matched and had it been a Renault or a Citroën, the ungainly appearance and underpowered engine might have been forgiven because of the functionality but the poor quality control, lack of refinement and clunky aspects of some of the drivetrain meant success was only ever modest.  Like much of what Leyland did, the Maxi should have been a great success but even car thieves avoided the thing; for much of its life it was reported as the UK's least stolen vehicle.          

1979 Vanden Plas Mini (a possibly "outlaw" project by Leyland's outpost in South Africa).

Curiously, given the fondness of BMC (and subsequently Leyland) for badge-engineering, there was never an MG version of the Mini (although a couple of interpretations were privately built), the competition potential instead explored by a joint-venture with the Formula One constructor Cooper, the name still used for some versions of the current BMW Mini.  Nor was there a luxury version finished by coachbuilders Vanden Plas which, with the addition of much timber veneer and leather to vehicles mundane, provided the parent corporations with highly profitable status-symbols with which to delight the middle-class.  There was however a separate development by Leyland's South African operation (Leykor), their Vanden Plas Mini sold briefly between 1978-1979, although the photographic evidence suggests it didn’t match the finish or appointment level of the English-built cars, which may account for the short life-span; it's unclear whether the head office approved or even knew of this South African novelty prior to its few months of life.  In the home market, third-party suppliers of veneer and leather such as Radford found a market among those who appreciated the Mini's compact practicality but found its stark functionalism just too austere.

The Twini

Mini Coopers (1275 S) through the cutting, Mount Panorama, Bathurst, Australia, 1966.

In that year's Gallaher 500, Mini Coopers finished first to ninth.  It was the last occasion on which anything with a naturally-aspirated four-cylinder engine would win the annual endurance classic, an event which has since been won on all but a handful of occasions by V8-powered cars (memorably, a V12 Jaguar XJS triumphed in 1985 when Conrod Straight was still at its full length), a statistic distorted somewhat by the rule change in 1995 which stipulated only V8s were allowed to run.

Although it seemed improbable when the Mini was released in 1959 as a small, utilitarian economy car, the performance potential proved extraordinary; in rallies and on race tracks it was a first-rate competitor for over a decade, remaining popular in many forms of competition to this day.  The joint venture with the Formula One constructor Cooper provided the basis for most of the success but by far the most intriguing possibility for more speed was the model which was never developed beyond the prototype stage: the twin-engined Twini.

Prototype twin-engined Moke while undergoing snow testing, 1962.

It wasn’t actually a novel approach.  BMC, inspired apparently by English racing driver Paul Emery (1916–1993) who in 1961 had built a twin-engined Mini, used the Mini’s underpinnings to create an all-purpose cross-country vehicle, the Moke, equipped with a second engine and coupled controls.  Officially it was “an engineering exercise” but it had actually been built to interest the Ministry of Defence in the idea of a cheap, all-wheel-drive utility vehicle, so light and compact it could be carried by small transport aircraft and serviced anywhere in the world.  The army did test the Moke and were impressed by its capabilities and the flexibility the design offered but ultimately rejected the concept because the lack of ground-clearance limited the terrain on which it could be deployed.  Based on the low-slung Mini, that was one thing which couldn’t easily be rectified.  Instead, using just a single engine in a front-wheel-drive (FWD) configuration, the Moke was re-purposed as a civilian model, staying in production between 1964-1989 and offered in various markets.  Such is the interest in the design that several companies have resumed production, including in electric form, and it remains available today.

Cutaway drawing of Cooper’s Twini.

John Cooper (1923-2000), aware of previous twin-engined racing cars, had tested the prototype military Moke and immediately understood the potential the layout offered for the Mini (ground clearance not a matter of concern on race tracks) and within six weeks the Cooper factory had constructed a prototype.  To provide the desired characteristics, the rear engine was larger and more powerful, the combination, in a car weighing less than 1600 lb (725 kg), delivering a power-to-weight ratio similar to a contemporary Ferrari Berlinetta and, to complete the drive-train, two separate gearboxes with matched ratios were fitted.  Typically for Cooper, it was a well thought-out design.  The lines for the brake and clutch hydraulics and those of the main electrical feed to the battery were run along the right-hand reinforcing member below the right-hand door while on the left side were the oil and water leads, the fuel supply line to both engines fed from a central tank.  The electrical harness was ducted through the roof section and there was a central throttle link, control of the rear carburetors being taken from the accelerator, via the front engine linkage, back through the centre of the car.  It sounded intricate but the distances were short and everything worked.

Twini replica.

John Cooper immediately began testing the Twini, evaluating its potential for competition and, as was done with race cars in those happy days, that testing was on public roads where it proved to be fast, surprisingly easy to handle and well-balanced.  Unfortunately, de-bugging wasn't complete and during one night session, the rear engine seized, resulting in a rollover which left Cooper seriously injured and the car destroyed.  Both BMC and Cooper abandoned the project because the standard Mini-Coopers were proving highly successful and, to qualify for any sanctioned competition, at least one hundred Twinis would have to have been built; neither organization could devote the necessary resources for development or production, especially because no research had been done to work out whether a market existed for such a thing, were it sold at a price which guaranteed at least it would break even.

Twini built by Downton Engineering.  Driven by Sir John Whitmore (1937–2017) & Paul Frère (1917–2008) in the 1963 Targa Florio, it finished 27th and 5th in class.

The concept however did intrigue others interested in entering events which accepted one-offs, with no homologation rules stipulating minimum production volumes.  Downton Engineering built one and contested the 1963 Targa Florio where it proved fast but fragile, plagued by an overheating rear engine and the bugbear of previous twin-engined racing cars: excessive tire wear.  It finished 27th (and last) but it did finish, unlike some of the more illustrious thoroughbreds which fell by the wayside.  Interestingly, the Downton engineers chose to use a pair of the 998 cm3 (61 cubic inch) versions of the BMC A-Series engine, a regular production iteration and thus in the under-square (long stroke) configuration typical of almost all the A-Series.  The long stroke tradition in British engines was a hangover from the time when the road-taxation system was based on the cylinder bore, a method which had simplicity and ease of administration to commend it but little else, generations of British engines distinguished by their dreary, slow-revving characteristics.  The long stroke design did however provide good torque over a wide engine-speed range and on a road course like the Targa Florio, run over a mountainous Sicilian circuit, the ample torque spread would have appealed more to drivers than ultimate top-end power.  For that reason, although examples of the over-square 1071 cm3 (65 cubic inch) version were available, it was newly developed and a still uncertain quantity and never considered for installation.  The 1071 was used in the Mini Cooper S only during 1963-1964 (with a companion 970 cm3 (59 cubic inch) version created for use in events with a 1000 cm3 capacity limit) and the pair are a footnote in A-Series history as the only over-square versions released for sale.

Twin-engined BMW Mini (Binni?).

In the era, it’s thought around six Twinis were built (and there have been a few since) but the concept proved irresistible and twin-engined versions of the "new" Mini (built since 2000 by BMW) have been made.  It was fitting the idea was replicated because what was striking in 2000, when BMW first displayed their Mini, was that its lines were actually closer to some of the original conceptual sketches from the 1950s than were those of the BMC Mini on its debut.  BMW, like others, now routinely adds electric motors to fossil-fuel powered cars so in that sense twin (indeed, sometimes multi-) engined cars are now common but the use of more than one piston engine remains rare.  Except for the very specialized place which is the drag-strip, the only successful examples have been off-road or commercial vehicles and, as John Cooper and a host of others came to understand, while the advantages were there to be had, there were easier, more practical ways in which they could be gained.  Unfortunately, so inherent were the drawbacks that the problems proved insoluble.

Chthonian

Chthonian (pronounced thoh-nee-uhn)

(1) In Classical mythology, of or relating to the deities, spirits, and other beings dwelling under the earth.

(2) Dwelling in, or under, the earth.

1840–1850: From the Greek chthóni(os), the construct being chthon- (stem of chthṓn (earth)) + -ios (the adjectival suffix, accusative masculine plural of -ius) + -an (from the Latin -ānus, which forms adjectives of belonging or origin from a noun).  It was akin to the Latin humus (earth).  The alternative spelling in Ancient Greek was khthonios (in or under the earth), from χθών (khthṓn) (earth, ground, soil).

The Furies (Erinyes)

In Greek mythology, the Furies were the three chthonic female deities of vengeance; known also as Erinyes (the avengers), their counterparts in Roman mythology, the Dirae.  The names of this grumpy triumvirate were Alecto (the angry one), Tisiphone (the avenger) and Megaera (the grudging one).  In the literature, they’re sometimes called the infernal goddesses.

In Internet mythology, three chthonic female deities of vengeance.  Opinion might be divided about the allocation of the labels Angry, Avenging & Grudging. 

There are several myths of the birth of the Furies.  The most popular is that they were born simultaneously with Aphrodite but in his Theogony, Hesiod claimed the Furies were born of Uranus’ blood while Aphrodite was being born from sea foam, after the Titan Cronus castrated his father Uranus and cast his genitals into the sea; implicit in this version is that the Furies preceded the Olympian Gods.  Another myth suggests they were born of a union between air and sea while according to the Roman poets (Ovid (Publius Ovidius Naso; 43 BC–17 AD) in Metamorphoses and Virgil (Publius Vergilius Maro; 70–19 BC) in the Aeneid), they were the daughters of Nyx (the Night).  In some old Greek hymns and in the Greco-Roman poet Statius’ (Publius Papinius Statius; circa 45-circa 96) Thebaid, they were the daughters of Hades and Persephone, serving them in the kingdom of the underworld.  The Furies listened to the complaints and callings of victims in the world when these people cursed the wrongdoers.  Those who murdered their mothers or fathers were especially important to the Furies because (at least according to the Ancient Greek poet Hesiod (8th-7th century BC)) they were born of a child’s wrongdoing to his father.  They punished people who committed crimes against gods, crimes of disrespect, perjury and those who broke their oaths, but they thought murder the most vile crime and one demanding the most cruel punishment.

Virgil pointing out the Erinyes (1890), engraving by Gustave Doré (1832–1883).

The Furies served Hades and Persephone in the underworld.  When souls of the dead came to the kingdom of Hades, first they were judged by three judges; that done, the Furies purified the souls the judges deemed good and permitted their passage.  Souls deemed wicked were condemned to the Dungeons of the Damned in Tartarus to be subjected to the most awful torture, overseen by the Furies.  Descriptions of the Furies (almost always by male writers or artists) varied in detail but mostly they were depicted as ugly, with serpents about their hair and arms, wearing black robes with whips in their hands.  Some claimed they also had the wings of a bat or bird, with burning breath and poisonous blood dripping from their eyes.