
Friday, October 18, 2024

Quartervent

Quartervent (pronounced kwawr-ter-vent)

A small, pivoted, framed (or semi-framed) pane in the front or rear side-windows of a car, provided to optimize ventilation.

1930s: The construct was quarter + vent.  Dating from the late thirteenth century, the noun quarter (in its numerical sense) was from the Middle English quarter, from the Anglo-Norman quarter, from the Old French quartier, from the Latin quartarius (a Roman unit of liquid measure equivalent to about 0.14 litre), from quartus (fourth).  Quartus was from the primitive Indo-European kweturtos (four) (from which the Ancient Greek gained τέταρτος (tétartos), the Sanskrit चतुर्थ (caturtha), the Proto-Balto-Slavic ketwirtas and the Proto-Germanic fedurþô).  It was cognate to quadrus (square), drawn from the sense of “four-sided”.  The Latin suffix -arius was from the earlier -ās-(i)jo-, the construct being -āso- (from the primitive Indo-European -ehso- (which may be compared with the Hittite appurtenance suffix -ašša-)) + the relational adjectival suffix -yós (belonging to).  The suffix (the feminine -āria, the neuter -ārium) was a first/second-declension suffix used to form adjectives from nouns or numerals.  The nominative neuter form -ārium (when appended to nouns) formed derivative nouns denoting a “place where stuff was kept”.  The Middle English verb quarteren was derivative of the noun.  Dating from the mid fourteenth century, vent was from the Middle English verb venten (to furnish (a vessel) with a vent), a shortened form of the Old French esventer (the construct being es- + -venter), a verbal derivative of vent, from the Latin ventus (wind), in later use derivative of the English noun.  The English noun was derived partly from the French vent, partly by a shortening of the French évent (from the Old French esvent, a derivative of esventer) and partly from the English verb.  The hyphenated form quarter-vent is also used and may be preferable.  Quartervent is a noun; the noun plural is quartervents.  The action of using the function provided by a quartervent obviously can be described with terms like quarterventing or quartervented but no derived forms are recognized as standard.

The 1959 Cadillac Eldorado Biarritz of course had quarter vents (“vent windows” to the Americans) but unlike most in the world, they were electrically-activated.  This was a time when the company's slogan “Standard of the World” really could be taken seriously.

The now close to extinct quarter vents were small, pivoted, framed (or semi-framed) panes of glass installed in the front or rear side windows of a car or truck; their purpose was to provide occupants with a source of ventilation, using the air-flow of the vehicle while in motion.  The system had all the attributes of other admirable technologies (such as the pencil) in that it was cheap to produce, simple to use, reliable and effective in its intended purpose.  Although not a complex concept, General Motors (GM) in 1932 couldn’t resist giving the things an impressively long name, calling them “No Draft Individually Controlled Ventilation” (NDICV being one of history’s less mnemonic initialisms).  GM’s marketing types must have prevailed because eventually the snappier “ventiplanes” was adopted, the same process of rationality which overtook Chrysler in 1969 when the public decided “shaker” was a punchier name for their rather sexy scoop which, attached directly to the induction system and protruding through a carefully shaped lacuna in the hood (bonnet), shook with the engine, delighting the males aged 17-39 at whom it was intended to appeal.  “Shaker” supplanted Chrysler’s original “Incredible Quivering Exposed Cold Air Grabber” (IQECAG, another dud); sometimes less is more.  Adolf Hitler (1889-1945; Führer (leader) and German head of government 1933-1945 & head of state 1934-1945) suggested a good title for his book might be Viereinhalb Jahre [des Kampfes] gegen Lüge, Dummheit und Feigheit (Four and a Half Years of Struggle against Lies, Stupidity and Cowardice) but his publisher thought that a bit ponderous and preferred the more succinct Mein Kampf: Eine Abrechnung (My Struggle: A Reckoning) and for publication even that was clipped to Mein Kampf.
Unfortunately, the revised title was the best thing about it; the style and content were truly ghastly, the text long and repetitious, the ideas within easily able to be reduced to a few dozen pages (some suggest fewer but the historical examples cited for context do require some space).

The baroque meets mid-century modernism: 1954 Hudson Italia by Carrozzeria Touring.  

Given how well the things worked, there’s long been some regret at their demise, a process which began in the 1960s with the development of “through-flow ventilation”, the earliest implementation of which seems to have appeared in the Hudson Italia (1954-1955), an exclusive, two-door coupé co-developed by Hudson in Detroit and the Milan-based Italian coachbuilder Carrozzeria Touring.  Although some of the styling gimmicks perhaps haven’t aged well, the package was more restrained than some extravagances of the era and fundamentally, the lines were well-balanced and elegant.  Unfortunately the mechanical underpinnings were uninspiring and the trans-Atlantic production process (even though Italian unit-labor costs were lower than in the US, Touring’s methods were labor-intensive) involved two-way shipping (the platforms sent to Milan for bodies and then returned to the US) so the Italia was uncompetitively expensive: at a time when the bigger and more capable Cadillac Coupe de Ville listed at US$3,995, the Italia was offered for US$4,800 and while it certainly had exclusivity, it was a time when there was still a magic attached to the Cadillac name and of the planned run of 50, only 26 Italias were produced (including the prototype).  Of those, 21 are known still to exist and they’re a fixture at the concours d’élégance (a sort of car show for the rich, the term an unadapted borrowing from the French (literally “competition of elegance”)) and on the auction circuit where they’re exchanged between collectors for several hundred thousand dollars per sale.  Although a commercial failure (and the Hudson name would soon disappear), the Italia does enjoy the footnote of being the first production car equipped with what came to be understood as “flow-through ventilation”, provided with a cowl air intake and extraction grooves at the top of the rear windows, the company claiming the air inside an Italia changed completely every ten minutes.
For the quartervent, flow-through ventilation was a death-knell although some lingered on until the effective standardization of air-conditioning proved the final nail in the coffin.

1965 Ford Cortina GT with eyeball vents and quartervents.

The car which really legitimized flow-through ventilation was the first generation (1962-1966) of the Ford Cortina, produced over four generations (some claim it was five) by Ford’s UK subsidiary between 1962-1982.  When the revised model was displayed at the Earls Court Motor Show in October 1964, something much emphasized was the new “Aeroflow”, Ford’s name for through-flow ventilation, the system implemented with “eyeball” vents on the dashboard and extractor vents on the rear pillars.  Eyeball vents probably are the best way to do through-flow ventilation but the accountants came to work out they were more expensive to install than the alternatives so less satisfactory devices came to be used.  Other manufacturers soon phased-in similar systems, many coining their own marketing trademarks including “Silent-Flow-Ventilation”, “Astro-Ventilation” and the inevitable “Flow-thru ventilation”.  For the Cortina, Ford took a “belt & braces” approach to ventilation, retaining the quartervents even after the “eyeballs” were added, apparently because (1) the cost of re-tooling to use a single pane for the window was actually higher than continuing to use the quartervents, (2) it wasn’t clear if there would be general public acceptance of their deletion and (3) smoking rates were still high and drivers were known to like being able to flick the ash out via the quartervent (and, more regrettably, the butts too).  Before long, the designers found a way economically to replace the quartervents with “quarterpanes” or “quarterlights” (a fixed piece of glass with no opening mechanism) so early Cortinas were built with both although in markets where temperatures tended to be higher (notably South Africa and Australia), the hinged quartervents remained standard equipment.  When the Mark III Cortina (TC, 1970-1976) was released, the separate panes in any form were deleted and the side glass was a single pane.

Fluid dynamics in action: GM's Astro-Ventilation.

So logically a “quartervent” would describe a device with a hinge so it could be opened to provide ventilation while a “quarterpane”, “quarterlight” or “quarterglass” would be something in the same shape but unhinged and thus fixed.  It didn’t work out that way and the terms tended to be used interchangeably (though presumably “quartervent” was most applied to those with the functionality).  However, the mere existence of the fixed panes does raise the question of why they exist at all.  In the case of rear doors, they were sometimes a necessity because the shape of the door was dictated by the intrusion of the wheel arch and adding a quarterpane was the only way to ensure the window could completely be wound down.  With the front doors, the economics were sometimes compelling, especially in cases when the opening vents were optional but there were also instances where the door’s internal mechanisms (the door opening & window-winding hardware) were so bulky the only way to make everything fit was to reduce the size of the window.

1976 Volkswagen Passat without quartervents, the front & rear quarterpanes fixed.

The proliferation of terms could have come in handy if the industry had decided to standardize and the first generation Volkswagen Passat (1973-1980) was an illustration of how they might have been used.  The early Passats were then unusual in that the four-door versions had five separate pieces of side glass and, reading from left-to-right, they could have been classified thus: (1) a front quarterpane, (2) a front side-window, (3) a rear side-window, (4) a rear quarterpane and (5) a quarterwindow.  The Passat was one of those vehicles which used the quarterpanes as an engineering necessity to permit the rear side-window fully to be lowered.  However the industry didn’t standardize and in the pre-television (and certainly pre-internet) age when language tended to evolve with greater regional variation, not even quarterglass, quartervent, quarterwindow & quarterpane were enough and the things were known variously also as a “fly window”, “valence window”, “triangle window” and “auto-transom”, the hyphen used and not.

1967 Chevrolet Camaro 327 with vent windows (left), 1969 Chevrolet Camaro ZL1 without vent windows (centre) and Lindsay Lohan & Jamie Lee Curtis (b 1958) in Chevrolet Camaro convertible during filming of the remake of Freaky Friday (2003) (provisionally called Freakier Friday), Los Angeles, August 2024.  The Camaro can be identified as a 1968 or 1969 model because the vent windows were deleted from the range after 1967 when “Astro-Ventilation” (GM’s name for flow-through ventilation) was added.  In North American use, the devices typically are referred to as “vent windows” while a “quarter light” is a small lamp mounted (in pairs) in the lower section of the front bodywork and a “quarter vent” is some sort of (real or fake) vent installed somewhere on the quarter panels.

Ford Australia’s early advertising copy for the XA Falcon range included publicity shots both with and without the optional quartervents (left).  One quirk of the campaign was the first shot released (right) of the “hero model” of the range (the Falcon GT) had the driver’s side quartervent airbrushed out (how “Photoshop jobs” used to be done), presumably because it was thought to clutter a well-composed picture.  Unfortunately, the artist neglected to defenestrate the one on the passenger’s side.

Released in Australia in March 1972, Ford’s XA Falcon was the first in the lineage to include through-flow ventilation, the previously standard quartervent windows moved to the option list (as RPO (Regular Production Option) 86).  Because it’s a hot place and many Falcons were bought by rural customers, Ford expected quite a high take-up rate of RPO86 (it was a time when air-conditioning was expensive and rarely ordered) so the vent window hardware was stockpiled in anticipation.  However, the option didn’t prove popular but with a warehouse full of the parts, they remained available on the subsequent XB (1973-1976) and XC (1976-1979) although the take-up rate never rose, less than 1% of each range so equipped, and when the XD (1979-1983) was introduced, there was no such option and this continued on all subsequent Falcons until Ford ceased production in Australia in 2016, by which time air conditioning was standard equipment.

Friday, October 11, 2024

Floppy

Floppy (pronounced flop-ee)

(1) A tendency to flop.

(2) Limp; not hard, firm or rigid; flexible; hanging loosely.

(3) In IT, a clipping of “floppy diskette”.

(4) In historic military slang (Apartheid-era South Africa & Rhodesia (now Zimbabwe)), an insurgent in the Rhodesian Bush War (the “Second Chimurenga” (from the Shona chimurenga (revolution)) 1964-1979), the use a reference to the way they were (in sardonic military humor) said to “flop” when shot.

(5) In informal use, a publication with covers made with a paper stock little heavier and more rigid than that used for the pages; used mostly for comic books.

(6) In slang, a habitué of a flop-house (a cheap hotel, often used as permanent or semi-permanent accommodation by the poor or itinerant who would go there to “flop down” for a night) (archaic).

(7) In slang, as “floppy cats”, the breeders’ informal term for the ragdoll breed of cat, so named for their propensity to “go limp” when picked up (apparently because of a genetic mutation).

1855-1860: The construct was flop + -y.  Flop dates from 1595–1605 and was a variant of the verb “flap” (with the implication of a duller, heavier sound).  Flop has over the centuries gained many uses in slang and idiomatic form but in this context it meant “loosely to swing; to flap about”.  The sense of “fall or drop heavily” was in use by the mid-1830s and it was used to mean “totally to fail” in 1919 in the wake of the end of World War I (1914-1918), the conflict which wrote finis to centuries of dynastic rule by the Romanovs in Russia, the Habsburgs in Austria-Hungary and the Ottomans in Constantinople, although in the 1890s it had been recorded as meaning “some degree of failure”.  The comparative is floppier, the superlative floppiest.  Floppy is a noun & adjective, floppiness is a noun, flopped & flopping are verbs, floppier & floppiest are adjectives and floppily is an adverb; the noun plural is floppies.  The adjective floppish is non-standard and used in the entertainment & publishing industries to refer to something which hasn’t exactly “flopped” (failed) but which has not fulfilled the commercial expectations.

Lindsay Lohan in "floppy-brim" hat, on-set during filming of Liz & Dick (2012).  In fashion, many "floppy-brim" hats actually have a stiff brim, formed in a permanently "floppy" shape.  The true "floppy hats" are those worn while playing sport or as beachwear etc.

The word is used as a modifier in pediatric medicine (floppy baby syndrome; floppy infant syndrome) and as “floppy-wristed” (synonymous with “limp-wristed”) was used as a gay slur.  “Flippy-floppy” was IT slang for “floppy diskette” and unrelated to the previous use of “flip-flop” or “flippy-floppy” which, dating from the 1880s, was used to mean “a complete reversal of direction or change of position” and used in politics to suggest inconsistency.  In the febrile world of modern US politics, to be labelled a “flip-flopper” can be damaging because it carries with it the implication what one says can’t be relied upon and campaign “promises” might thus not be honored.  Whether that differs much from the politicians’ usual behaviour can be debated but still, few enjoy being accused of flip-floppery (definitely a non-standard noun).  The classic rejoinder to being called a flip-flopper is the quote: “When the facts change, I change my mind. What do you do, sir?”  That’s often attributed to the English economist and philosopher Lord Keynes (John Maynard Keynes, 1883-1946) but it was said originally by US economist Paul Samuelson (1915–2009), the 1970 Nobel laureate in Economics.  In the popular imagination Keynes is often the “go to” economist for quote attribution in the way William Shakespeare (1564–1616) is a “go to” author and Winston Churchill (1875-1965; UK prime-minister 1940-1945 & 1951-1955) a “go to” politician, both credited with things they never said but might have said.  In phraseology, the quality of the “Shakespearian” or the “Churchillian” is not exactly definable but is certainly recognizable.  In the jargon of early twentieth century electronics, a “flip-flop” was a reference to switching circuits that alternate between two states.
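The electronic sense can be illustrated in software; what follows is a toy sketch (illustrative only, not a description of any actual circuit): a bistable element which holds one of two states and toggles on each clock pulse.

```python
# A minimal software analogue of the electronic "flip-flop": a
# bistable element holding one of two states, toggling on each pulse.
class ToggleFlipFlop:
    def __init__(self):
        self.state = 0          # starts in the "0" state

    def clock(self):
        self.state ^= 1         # each pulse flips the stored bit
        return self.state

ff = ToggleFlipFlop()
print([ff.clock() for _ in range(4)])   # [1, 0, 1, 0]
```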

Childless cat lady Taylor Swift with her “floppy cat”, Benjamin Button (as stole).  Time magazine cover, 25 December 2023, announcing Ms Swift as their 2023 Person of the Year.  “Floppy cat” is the breeders' informal term for the ragdoll breed, an allusion to their tendency to “go limp” when picked up, a behavior believed caused by a genetic mutation.

The other use of flop in IT is the acronym FLOPS (floating point operations per second).  Floating-point (FP) arithmetic is a way of handling big real numbers using an integer with a fixed precision, scaled by an integer exponent of a fixed base; FP doesn’t really make possible what would not in theory be achievable using real numbers but does make this faster and practical and the concept became familiar in the 1980s when Intel made available FPUs (floating point units, also known as math co-processors) which could supplement the CPUs (central processing units) of their x86 family.  The 8087 FPU worked with the 8086 CPU and others followed (80286/80287, 80386/80387, i486/i487 etc) until eventually the FPU for the Pentium range was integrated into the CPU, the early implementation something of a debacle still used as a case study in a number of fields including management and public relations.
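The “integer with a fixed precision, scaled by an integer exponent of a fixed base” idea can be demonstrated in a few lines of Python (a minimal sketch using the standard library's math.frexp; the helper decompose is illustrative only):

```python
import math

def decompose(x: float):
    """Split a float into an integer significand and a base-2 exponent,
    so that x == significand * 2**exponent exactly (IEEE 754 doubles
    carry a 53-bit significand)."""
    mantissa, exp = math.frexp(x)        # x == mantissa * 2**exp, 0.5 <= |mantissa| < 1
    significand = int(mantissa * 2**53)  # scale the mantissa to a 53-bit integer
    return significand, exp - 53

# 0.1 cannot be represented exactly; what is stored is the nearest
# integer-times-power-of-two, which is why FP is fast but approximate.
s, e = decompose(0.1)
assert s * 2**e == 0.1   # reconstructs the stored value exactly
```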

FLOPS are an expression of specific performance and are used to measure those computations requiring floating-point calculations (typically in math-intensive work) and for purposes of “benchmarking” or determining “real-world” performance under those conditions, it’s a more informative number than the traditional rating of instructions per second (IPS).  The FLOPS became something of a cult in the 1990s when the supercomputers of the era first breached the trillion FLOP mark and as speeds rose, the appropriate terms were created:

kiloFLOPS: (kFLOPS, 10³)
megaFLOPS: (MFLOPS, 10⁶)
gigaFLOPS: (GFLOPS, 10⁹)
teraFLOPS: (TFLOPS, 10¹²)
petaFLOPS: (PFLOPS, 10¹⁵)
exaFLOPS: (EFLOPS, 10¹⁸)
zettaFLOPS: (ZFLOPS, 10²¹)
yottaFLOPS: (YFLOPS, 10²⁴)
ronnaFLOPS: (RFLOPS, 10²⁷)
quettaFLOPS: (QFLOPS, 10³⁰)
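As a minimal sketch (the helper format_flops is illustrative, not any standard tool), the prefixes above can be applied to render a raw operations-per-second figure:

```python
# SI prefixes for FLOPS, largest first (scales as powers of ten).
PREFIXES = [
    ("quettaFLOPS", 10**30), ("ronnaFLOPS", 10**27),
    ("yottaFLOPS", 10**24), ("zettaFLOPS", 10**21),
    ("exaFLOPS", 10**18), ("petaFLOPS", 10**15),
    ("teraFLOPS", 10**12), ("gigaFLOPS", 10**9),
    ("megaFLOPS", 10**6), ("kiloFLOPS", 10**3),
]

def format_flops(ops_per_second: float) -> str:
    """Render a rate using the largest prefix not exceeding it."""
    for name, scale in PREFIXES:
        if ops_per_second >= scale:
            return f"{ops_per_second / scale:.2f} {name}"
    return f"{ops_per_second:.0f} FLOPS"

print(format_flops(1.2e12))   # 1.20 teraFLOPS
```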

In the mysterious world of quantum computing, FLOPS are not directly applicable because the architecture and methods of operation differ fundamentally from those of classical computers.  Rather than FLOPS, the performance of quantum computers tends to be measured in qubits (quantum bits) and quantum gates (the operations that manipulate qubits).  The architectural difference is profound and explained with the concepts of superposition and entanglement: because a qubit simultaneously can represent both “0” & “1” (superposition) and qubits can be entangled (a relationship in which distance is, at least in theory, irrelevant), under such parallelism performance cannot easily be reduced to simple arithmetic or floating-point operations, which remain the domain of classical computers operating on the binary distinction between “0” (off) and “1” (on).
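Superposition and entanglement can be illustrated with the standard textbook formalism (a toy sketch, not tied to any real quantum hardware): a qubit is a pair of amplitudes whose squared magnitudes give measurement probabilities, and the Bell state shows the perfect correlation entanglement produces.

```python
import math

# A single qubit in equal superposition of |0> and |1>: two amplitudes.
plus = [1 / math.sqrt(2), 1 / math.sqrt(2)]
prob_zero = abs(plus[0]) ** 2   # ~0.5: half the measurements give "0"

# A two-qubit Bell state (|00> + |11>)/sqrt(2); amplitudes over 00,01,10,11.
bell = [1 / math.sqrt(2), 0.0, 0.0, 1 / math.sqrt(2)]
probs = [abs(a) ** 2 for a in bell]
# Only 00 and 11 ever occur: measuring one qubit fixes the other,
# regardless of the distance between them.
print(prob_zero, probs)
```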

Evolution of the floppy diskette: 8 inch (left), 5¼ inch (centre) & 3½ inch (right).  The track of the floppy for the past half-century has been emblematic of the IT industry in toto: smaller, higher capacity and cheaper.  Genuinely, it was one of the design parameters for the 3½ inch format that it fit into a man's shirt pocket.

In IT, the term “floppy diskette” first appeared in 1971 (soon doubtless clipped to “floppy” although the first known use of this dates from 1974), the earliest units operating on the WORM (write once, read many, ie "read only" after being written) principle.  The first floppy diskettes were in an 8 inch (203 mm) format which may sound profligate for something with a capacity of 80 kB (kilobyte) but the 10-20 MB (megabyte) hard drives of the time were typically the same diameter as the aperture of a domestic front-loading washing machine so genuinely they deserved the diminutive suffix (-ette, from the Middle English -ette, a borrowing from the Old French -ette, from the Latin -itta, the feminine form of -ittus; it was used to form nouns meaning a smaller form of something).  They were an advance also in convenience because until they became available, the usual way to transfer files between devices was to hard-wire them together.  Introduced by IBM in 1971, the capacity was two years later raised to 256 kB and by 1977 to a heady 1.2 MB (megabyte) with the advent of a double-sided, double-density format.  However, even then it was obvious the future was physically smaller media and in 1978 the 5¼ inch (133 mm) floppy debuted, initially with a formatted capacity of 360 kB but by 1982 this too had been raised to 1.2 MB using the technological advance of a HD (high density) file system and it was the 5¼ floppy which would become the first widely adopted industry “standard” for both home and business use, creating the neologism “sneakernet”, the construct being sneaker + net(work), the image being of IT nerds in their jeans and sneakers walking between various (unconnected) computers and exchanging files via diskette.  Until well into the twenty-first century the practice was far from functionally extinct and it persists even today with the use of USB sticks.

Kim Jong-un (Kim III, b 1982; Supreme Leader of DPRK (North Korea) since 2011) with 3½ inch floppy diskette (believed to be a HD (1.44 MB)).

The meme-makers use the floppy because it has become a symbol of technological obsolescence.  In OS (operating system) GUIs (graphical user interfaces) however, it does endure as the "save" icon and all the evidence to date does suggest that symbolic objects like icons do tend to outlive their source, thus the ongoing use in IT iconography of analogue, rotary dial phones and the sound of a camera's physical shutter in smart phones.  Decades from now, we may still see representations of floppy diskettes.

The last of the mainstream floppy diskettes was the 3½ inch (89 mm) unit, introduced in 1983 in double density form with a capacity of 720 kB (although in one of their quixotic moves IBM used a unique 360 kB version for their JX range aimed at the educational market) but the classic 3½ was the HD 1.44 MB unit, released in 1986.  That really was the end of the line for the format because although in 1987 a 2.88 MB version was made available, few computer manufacturers offered the gesture of adding support at the BIOS (basic input output system) so adoption was infinitesimal.  The 3½ inch diskette continued in wide use and there was even the DMF (Distribution Media Format) with a 1.7 MB capacity which attracted companies like Microsoft, not because it wanted more space but to attempt to counter software piracy; within hours of Microsoft Office appearing in shrink-wrap, copying cracks appeared on the bulletin boards (where nerds did stuff before the www (worldwide web)).  It was clear the floppy diskette was heading for extinction although slightly larger versions with capacities as high as 750 MB did appear but, expensive and needing different drive hardware, they were only ever a niche product seen mostly inside corporations.  By the time the CD-ROM (Compact Disc-Read-only Memory) reached critical mass in the mid-late 1990s the once ubiquitous diskette began rapidly to fade from use, the release in the next decade of the USB sticks (pen drives) a final nail in the coffin for most.
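The famous “1.44 MB” can be derived from the diskette's standard geometry (80 tracks per side, 2 sides, 18 sectors per track, 512 bytes per sector), the quoted figure being a marketing hybrid of decimal and binary units:

```python
# Capacity of the classic HD 3.5 inch diskette from its disk geometry.
tracks, sides, sectors, bytes_per_sector = 80, 2, 18, 512
capacity = tracks * sides * sectors * bytes_per_sector
print(capacity)                              # 1474560 bytes

# The advertised "1.44 MB" divides by a mixed unit of 1000 * 1024 bytes,
# neither a true (decimal) megabyte nor a true (binary) mebibyte.
print(round(capacity / (1000 * 1024), 2))    # 1.44
```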

In the mid 1990s, installing OS/2 Warp 4.0 (Merlin) with the optional packs and a service pack could require a user to insert and swap up to 47 diskettes.  It could take hours, assuming one didn't suffer the dreaded "floppy failure".

That was something which pleased everyone except the floppy diskette manufacturers who had in the early 1990s experienced a remarkable boom in demand for their product when Microsoft Windows 3.1 (7 diskettes) and IBM’s OS/2 2.0 (21 diskettes) were released.  Not only was the CD-ROM a cheaper solution than multiple diskettes (a remarkably labor-intensive business for software distributors) but it was also much more reliable; tales of an installation process failing on the “final diskette” were legion and while some doubtlessly were apocryphal, "floppy failure" was far from unknown.  By the time OS/2 Warp 3.0 was released in 1994, it required a minimum of 23 floppy diskettes and version 4.0 shipped with a hefty 30 for a base installation.  Few mourned the floppy diskette; users quickly learned to love the CD-ROM.

What lay inside a 3½ inch floppy diskette.

Unlike optical discs (CD-ROM, DVD (Digital Versatile Disc) & Blu-Ray) which were written and read with the light of a laser, floppy diskettes were read with magnetic heads.  Inside the vinyl sleeve was a woven liner impregnated with a lubricant, this to reduce friction on the spinning media and help keep the surfaces clean.

Curiously though, niches remained where the floppy lived on and it was only in 2019 the USAF (US Air Force) finally retired the use of floppy diskettes which since the 1970s had been the standard method for maintaining and distributing the data related to the nation’s nuclear weapons deployment.  The attractions of the system for the military were (1) it worked, (2) it was cheap and (3) it was impervious to outside tampering.  Global thermo-nuclear war being a serious business, the USAF wanted something secure and knew that once data was on a device in some way connected to the outside world there was no way it could be guaranteed to be secure from those with malign intent (ayatollahs, the Secret Society of the Les Clefs d'Or, the CCP (Chinese Communist Party), the Freemasons, those in the Kremlin or Pyongyang et al) whereas a diskette locked in a briefcase or a safe was, paradoxically, the state of the art in twenty-first century security, the same philosophy which has seen some diplomatic posts in certain countries revert to typewriters & carbon paper for the preparation of certain documents.  In 2019 however, the USAF announced that after much development, the floppies had been retired and replaced with what the Pentagon described as a “highly-secure solid-state digital storage solution” which works with the Strategic Automated Command and Control System (SACCS).

It can still be done: Although no longer included in PCs & laptops, USB floppy diskette drives remain available (although support for Windows 11 systems is said to be "inconsistent").  Even 5¼ inch units have been built.

It thus came as a surprise in 2024 to learn Japan, the nation which had invented motorcycles which didn’t leak oil (the British thought they’d proved that couldn’t be done) and the QR (quick response) code, finally was abandoning the floppy diskette.  Remarkably, even in 2024, the government of Japan still routinely asked corporations and citizens to submit documents on floppies, over 1000 statutes and regulations mandating the format.  The official in charge of updating things (in 2021 he’d “declared war” on floppy diskettes) in July 2024 announced “We have won the war on floppy disks!” which must have been satisfying because he’d earlier been forced to admit defeat in his attempt to defenestrate the country’s facsimile (fax) machines, the “pushback” just too great to overcome.  The news created some interest on Japanese social media, one tweet on X (formerly known as Twitter) damning the modest but enduring floppy as a “symbol of an anachronistic administration”, presumably as much a jab at the “tired old men” of the ruling LDP (Liberal Democratic Party) as the devices.  There may however have been an element of technological determinism in the reform because Sony, the last manufacturer of the floppy, ended production of them in 2011 so while many remain extant, the world’s supply is dwindling.  In some ways so modern and innovative, in other ways Japanese technology sometimes remains frozen, many businesses still demanding official documents be endorsed using carved personal stamps called the 印鑑 (inkan) or 判子 (hanko); despite the government's efforts to phase them out, their retirement is said to be proceeding at a “glacial pace”.  The other controversial aspect of the hanko is that the most prized are carved from ivory and it’s believed a significant part of the demand for black-market ivory comes from the hanko makers, most apparently passing through Hong Kong, for generations a home to “sanctions busters”.

Saturday, September 28, 2024

Melodrama

Melodrama (pronounced mel-uh-drah-muh or mel-uh-dram-uh)

(1) A dramatic form (used in theatre, literature, music etc) that does not observe the laws of cause and effect and that exaggerates emotion and emphasizes plot or action at the expense of characterization.

(2) Loosely (sometimes very loosely), behavior or events thought “melodramatic” (overly dramatic displays of emotion or behavior, applied especially to situations in which “things are blown out of proportion”).

(3) In formal definition (seventeenth, eighteenth & nineteenth centuries), a romantic dramatic composition characterized by sensational incident with music interspersed.

(4) A poem or part of a play or opera spoken to a musical accompaniment (technically, a passage in which the orchestra plays a somewhat descriptive accompaniment, while the actor speaks).

(5) A popular nickname conferred on highly-strung young women with a Mel*.* given name (Melanie, Melissa, Melina, Melinda, Melisandre, Melodie, Melody et al).

1784 (used in 1782 as melodrame): From the French mélodrame (a dramatic composition in which music is used), the construct being mélo-, from the Ancient Greek μέλος (mélos) (limb, member; musical phrase, tune, melody, song) + drame (refashioned by analogy with the Ancient Greek δρᾶμα (drâma) (deed, theatrical act)) and cognate to the German Melodram, the Italian melodramma and the Spanish melodrama.  The adjective melodramatic (pertaining to, suitable for, or characteristic of a melodrama) came into use in 1789 (unrelated to political events that year).  Melodrama, melodramaticism, melodramaturgy, melodramatics & melodramatist are nouns, melodramatize, melodramatizing & melodramatized are verbs, melodramatic is an adjective and melodramatically is an adverb; the noun plural is melodramas or melodramata.

As late as the mid-nineteenth century “melodrama” was still used of stage-plays (usually romantic & sentimental) in which songs were interspersed, the action accompanied by orchestral music appropriate to the situations.  By the 1880s, the shift had begun towards a melodrama being understood as “a romantic and sensational dramatic piece with a happy ending” and this proved influential, the musical element ceasing gradually to be an essential feature, the addition of recorded sound to “moving pictures” (movies) the final nail in the coffin.  Since then, a “melodrama” has been understood to be “a dramatic piece characterized by sensational incidents and violent appeals to the emotions, but with (usually) a happy ending”.

The origins of melodrama lie in late sixteenth century Italian opera and reflect an attempt to convince audiences (or more correctly, composers and critics) that the form (ie opera or melodrama) was a revival of the Classical Greek tragedy.  It was a time in Europe when there was a great reverence for the cultures of Antiquity, something the result of the scholars and archivists (and frankly the publicists) of the Renaissance building a somewhat idealized construct of the epoch, and the content providers adopted the labels, the German-British Baroque composer George Frideric Handel (1685–1759) using both for his works.  In the late eighteenth century French dramatists began to develop melodrama as a distinct genre by elaborating the dialogue and adding spectacle, action and violence to the plot-lines, a technique still familiar in the 2020s, sensationalism and extravagant emotionalism as effective for click-bait now as they were for ticket sales in earlier times.

The use of “melodrama” to refer to the life of a troubled popular culture figure represents a bit of a jump in meaning but it’s now well-understood.

The path of the musical form had earlier been laid in text, something becoming a more significant influence as the spread of the printing press made mass-market publications more accessible and they spread even within non-literate populations because public and private readings became common forms of entertainment.  Although elements of what would later be understood as melodrama exist in the gloomy tragedies of the French novelist Claude Prosper Jolyot de Crébillon (1707–1777), more of an influence on the composers would be those who wrote with a lighter touch, including the Swiss philosopher Jean-Jacques Rousseau (1712–1778), whose Pygmalion (1775), and the French theatre director and playwright René-Charles Guilbert de Pixerécourt (1773–1844), whose Le Pèlerin blanc ou les Enfants du hameau (The White Pilgrim (or The Children of the Village)), both came to be regarded as part of the inchoate framework of the genre.  Literary theorists still debate the matter of cause & effect between melodrama and the growing vogue of the Gothic novel, one of fiction’s more emotionally manipulative paths, many concluding the relationship between the two was symbiotic.

There was also the commercial imperative.  Literary historians have documented the simultaneous proliferation of melodramas produced for the English stage during the nineteenth century (notably adaptations of novels by popular authors such as Charles Dickens (1812–1870) and Sir Walter Scott (1771–1832)) and the paucity of original work of substance.  There are some who have argued the writers had “lost their ear” for dramatic verse and prose but it is more likely they realized they had “lost their audiences” and these were people with bills to pay (the term “potboiler” was coined later to describe “books written only to provide food for the stove” but few authors of popular fiction have ever been far removed from concerns with their sales).  The reason the melodramas which flourished in the 1800s were so popular will be unsurprising to modern film-makers, political campaign strategists and other content providers for they can be deconstructed as a class of naively sensational entertainment in which the protagonists & antagonists were excessively virtuous or exceptionally evil (thus all tiresome complexities reduced to something black & white), the conflict played out with blood, thrills and violence (spectres, ghouls, witches & vampires or the sordid realism of drunkenness, infidelity or personal ruin used as devices as required).

The word “melodrama” appears often in commentaries on politics, a trend probably accelerated by the presentation moving for most purposes to screens; Donald Trump (b 1946; US president 2017-2021) revolutionizing the business by applying the tricks & techniques of reality TV (itself an oxymoron) meant the whole process can now be thought an unfolding melodrama, indeed, the Trump model cannot work as anything else.  The idea of “politics as theatre” was first discussed in the US in the 1960s but then a phenomenon like Mr Trump would have been thought absurdly improbable.

Because of the popularity of the form, melodrama has rarely found much favor with the critics and that old curmudgeon Henry Fowler (1858–1933) in A Dictionary of Modern English Usage (1926) noted (and, one suspects, not without some satisfaction) that the term generally was “…used with some contempt, because the appeal of such plays as are acknowledged to deserve the title is especially to the unsophisticated & illiterate whose acquaintance with human nature is superficial, but whose admiration for goodness and detestation of wickedness is ready & powerful.”  Henry Fowler moved among only a certain social stratum.  He added that the task of the melodramatist was to establish in the audience’s mind the notion of the dichotomous characters as good & wicked and then “…provide striking situations that shall provoke and relieve anxieties on behalf of poetic justice.”  One device once used to produce the desired effect was of course music and a whole academic industry emerged in the mid-twentieth century to explain how different sounds could be used to suggest or summon certain emotions; because music increasingly ceased to be an essential part of the melodramatic form, the situations, dialogue and events in purely textual productions became more exaggerated.

Tuesday, September 17, 2024

Sin-eater

Sin-eater (pronounced sin-ee-ter or sin-ee-tah)

(1) An individual (in the historic texts usually a man) who, by the act of eating a piece of bread laid upon the breast of the corpse (although in many depictions the food is placed on the lid of the coffin (casket)), absorbs the sins of the deceased, enabling them to “enter the kingdom of heaven”.

(2) Figuratively, as a thematic device in literature, a way to represent themes of guilt, atonement, sacrifice and societal exclusion (used variously to explore the moral complexities inherent in assuming the sins (or guilt) of another, the act of mercy and the implications of personal damnation).

Late 1600s (although the cultural practice long pre-dates evidence of the first use of the term):  The construct was sin + eat + -er.  Sin (in the theological sense of “a violation of divine will or religious law; sinfulness, depravity, iniquity; misdeeds”) was from the Middle English sinne, synne, sunne & zen, from the Old English synn (sin), from the Proto-West Germanic sunnju, from the Proto-Germanic sunjō (truth, excuse) and sundī & sundijō (sin), from the primitive Indo-European hs-ónt-ih, from hsónts (being, true) (implying a verdict of “truly guilty” against an accusation or charge), from hes- (to be) (which may be compared with the Old English sōþ (true)).  Eat (in the sense of “to ingest; to be ingested”) was from the Middle English eten, from the Old English etan (to eat), from the Proto-West Germanic etan, from the Proto-Germanic etaną (to eat), from the primitive Indo-European hédti, from hed- (to eat).  The –er suffix was from the Middle English –er & -ere, from the Old English -ere, from the Proto-Germanic -ārijaz, thought most likely to have been borrowed from the Latin –ārius where, as a suffix, it was used to form adjectives from nouns or numerals.  In English, the –er suffix, when added to a verb, created an agent noun: the person or thing doing the action indicated by the root verb.  The use in English was reinforced by the synonymous but unrelated Old French –or & -eor (the Anglo-Norman variant -our), from the Latin -ātor & -tor, from the primitive Indo-European -tōr.  When appended to a noun, it created the noun denoting an occupation or describing the person whose occupation is the noun.  Sin-eater is a noun and sin-eating is a verb; the noun plural is sin-eaters.  The term often appears as “sin eater” but (untypically for English) seemingly not as “sineater”.

The first documented evidence of the term “sin-eater” appears in texts dating from the late seventeenth century but cultural anthropologists believe the actual practice to be ancient and variations of the idea are seen in many societies so the ritual predates the term, the roots apparently in European and British folk traditions, particularly rural England and Wales.  The earliest (authenticated) known documented mention of a sin-eater occurs in Remaines of Gentilisme and Judaisme (1686) by English antiquary John Aubrey (1626–1697), in which is described the custom of a person eating “bread and drinking ale” placed on the chest of a deceased person in order that their “many sins” could be eaten, thus allowing the untainted soul to pass to the afterlife, cleansed of “earthly wrongdoings”.  Aubrey would write of a “sin-eater living along the Rosse road” who regularly would be hired to perform the service, describing him as a “gaunt, ghastly, lean, miserable, poor rascal”.  He mentioned also there was a popular belief that sin-eating would prevent the ghost of the deceased from walking the earth, a useful benefit at a time when it was understood ghosts of tormented souls, unable to find rest, haunted the living.  Whether this aspect of the tradition was widespread or a localism (a noted phenomenon in folklore) isn't known.  Interestingly, in rural England and Wales the practice survived the Enlightenment and became more common (or at least better documented) in the eighteenth & nineteenth centuries.  In the turbulent, troubled Middle East, a macabre variation of the sin-eater has been documented.  There, it's reported that a prisoner sentenced to death can bribe the jailors and secure their freedom, another executed in their place, the paperwork appropriately altered.

Paris Hilton (b 1981, left) and Lindsay Lohan (b 1986, right) discussing their “manifold sins and wickedness” while shopping, Los Angeles, 2004.

The ritual was of interest not only to social anthropologists but also to economic historians because while it was clear sin-eaters did receive payment (either in cash or in-kind (typically food)), there’s much to suggest those so employed were society’s “outcasts”, part of the “underclass” sub-set (beggars, vagrants, vagabonds etc) which in the West was a less formalized thing than something like the Dalits in Hinduism.  The Dalits (better known in the West as the “untouchables”) are often regarded as the “lowest rung” in the caste system but in Hindu theology the point was they were so excluded they were “outside” the system (a tiresome technical distinction often either lost on or ignored by the colonial administrators of the Raj) and relegated to the least desirable occupations.  Being a sin-eater sounds undesirable and theologically that’s right because in absolving the dead of their sins, the sin-eater becomes eternally burdened with the wickedness absorbed.  Presumably, a sin-eater could also (eventually) have their sins “eaten” but because they were from the impoverished strata of society, it was probably unlikely many would be connected to those with the economic resources required to secure such a service.  As a literary device, a sin-eater (often not explicitly named as such) is a character who in some way “takes on” the sins of others and they can be used to represent themes of guilt, atonement, sacrifice and societal exclusion.  In popular culture, the dark concept is quite popular and there, rather than in symbolism, the role usually is explored with the character being explicitly depicted as a “sin-eater”, an example being The Sin Eater (2020) by Megan Campisi (b 1976), a dystopian novel in which a young woman is forced into the role as a punishment.

Nice work if you can get it: The Sin-Eater, Misty Annual 1986.  Misty was a weekly British comic magazine for girls which, unusually, was found also to enjoy a significant male readership.  Published by UK house Fleetway, it existed only between 1978-1980 although Misty Annual appeared until 1986.  The cover always featured the eponymous, raven haired beauty.

There’s the obvious connection with Christianity although aspects of the practice have been identified in cultures where they arose prior to contact with the West.  The novel The Last Sin Eater by born-again US author Francine Rivers (b 1947) was set in a nineteenth century Appalachian community and dealt with sin, guilt & forgiveness, tied to the “atonement of the sins of man” by the crucifixion of Jesus Christ and thematically that was typical of the modern use.  However, the relationship between sin-eating and the Christian ritual of communion is theologically tenuous.  The communion, in which bread symbolizes the body of Christ and wine symbolizes His blood, is actually literal in the Roman Catholic Church under the doctrine of transubstantiation which holds that during the sacrament of the Eucharist (or Holy Communion), the bread and wine offered by the priest to the communicants transform into the body and blood of Christ.  That obviously requires faith to accept because while the appearances of the bread (usually a form of wafer) and wine (ie their taste, texture, and outward properties) remain unchanged, their substance (what truly they are at the metaphysical level) is said to transform into the body and blood of Christ.  Once unquestioned by most (at least publicly), the doctrine now attracts from the Vatican the modern theological fudge of the general statement: “You need not believe it but you must accept it”.

Sin-eating and communion both involve the consumption of food and drink in a symbolic manner.  In sin-eating, a sin-eater consumes food placed near or on the corpse symbolically to “absorb” their sins so the soul of the deceased may pass to the afterlife free from guilt while in the Christian Eucharist, the taking of bread and wine is a ritual to commemorate the sacrifice of Jesus who, on the cross at Golgotha, died to atone for the sins of all mankind.  So the central difference is the matter of who bears the sins.  In sin-eating, that’s the sin-eater who personally takes on the spiritual burden in exchange for a small payment, thus becoming spiritually tainted in order that another may spiritually be cleansed.  In other words, the dead may “out-source” the cost of their redemption in exchange for a few pieces of silver.  In the Christian communion, it’s acknowledged Jesus has already borne the sins of humanity through His crucifixion, the ritual an acknowledgment of His sacrificial act which offered salvation and forgiveness of sin to all who believe and take Him into their hearts.  One can see why priests were told to discourage sin-eating by their congregants but historically the church, where necessary, adapted to local customs and it's likely the practice was in places tolerated.

Wednesday, September 11, 2024

Platform

Platform (pronounced plat-fawrm)

(1) A horizontal surface or structure raised above the surrounding area, used for appearances, performances etc (speeches, music, drama etc) and known also as a dais or podium if used for public speaking.

(2) A raised floor constructed for any purpose (an area for workers during construction, the mounting of weapons etc).

(3) The raised area (usually a constructed structure) between or alongside the tracks of a railroad station, designed to provide passenger or freight ingress & egress.

(4) The open entrance area, or vestibule, at the end of a railroad passenger car.

(5) A landing in a flight of stairs.

(6) A public statement of the principles, objectives, and policy (often referred to as “planks”, the metaphor being the timber planks used to build physical platforms) on the part of a political party, especially as put forth by the representatives of the party in a convention to nominate candidates for an election; a body of principles on which a person or group takes a stand in appealing to the public; program; a set of principles; plan.

(7) Figuratively, a place or an opportunity to express one's opinion (historically also referred to as a tribune); a place for public discussion; a forum.

(8) Figuratively, something (a strategy, a campaign etc) which provides the basis on which some project or cause can advance (described also as a foundation or stage).

(9) A deck-like construction on which the drill rig of an offshore oil or gas well is erected.

(10) In naval architecture, a light deck, usually placed in a section of the hold or over the floor of the magazine (also used in general nautical design).

(11) In structural engineering, a relatively flat member or construction for distributing weight, as a wall plate, grillage etc.

(12) In military jargon, solid ground on which artillery pieces are mounted or a metal stand or base attached to certain types of artillery pieces.

(13) In geology, a vast area of undisturbed sedimentary rocks which, together with a shield, constitutes a craton (often the product of wave erosion).

(14) In footwear design, a thick insert of leather, cork, or other sturdy material between the uppers and the sole of a shoe, usually intended for stylish effect or to give added height; technically an ellipsis of “platform shoe”, “platform boot” etc.

(15) In computing (as an ellipsis of “computing platform”), a certain combination of operating system or environment & hardware (with the software now usually functioning as a HAL (hardware abstraction layer) to permit the use of non-identical equipment); essentially a standardized system which allows software from a variety of vendors seamlessly to operate.

(16) In internet use (especially of social media and originally as an ellipsis of “digital platform”), software system used to provide online and often multi-pronged interactive services.

(17) In manufacturing, a standardised design which permits variations to be produced without structural change to the base.

(18) In automotive manufacturing (as an ellipsis of “car platform”), a set of components able to be shared by several models (and sometimes shared even between manufacturers).  The notion of the platform evolved from the literal platforms (chassis) used to build the horse-drawn carriages of the pre-modern era.

(19) A plan, sketch, model, pattern, plan of action or conceptual description (obsolete).

(20) In Myanmar (Burma), the footpath or sidewalk.

1540–1550: From the Middle English platte forme (used also as plateforme), from the Middle French plateforme (a flat form), the construct being plate (flat) from the Old French plat, from the Ancient Greek πλατύς (platús) (flat) + forme (form) from the Latin fōrma (shape; figure; form).  It was related to flatscape which survived into modern English as a rare literary & poetic device and which begat the derogatory blandscape (a bland-looking landscape), encouraging the derived “dullscape”, “beigescape”, “shitscape” etc.  Platform & platforming are nouns & verbs, platformer & platformization are nouns, platformed is a verb; the noun plural is platforms.  The noun & adjective platformative and the noun & adverb platformativity are non-standard.

In English, the original sense was “plan of action, scheme, design” which by the 1550s was used to mean “ground-plan, drawing, sketch”, these uses long extinct and replaced by “plan”.  The sense of a “raised, level surface or place” was in use during the 1550s, used particularly of a “raised frame or structure with a level surface”.  In geography, by the early nineteenth century a platform was a “flat, level piece of ground”, distinguished from a “plateau” which was associated exclusively with natural elevated formations; geologists by mid-century standardized their technical definition (a vast area of undisturbed sedimentary rocks which, together with a shield, constitutes a craton (often the product of wave erosion)).  The use in railroad station design meaning a “raised area (usually a constructed structure) between or alongside the tracks of a railroad station, designed to provide passenger or freight ingress & egress” dates from 1832.

Donald Trump on the platform, Butler, Pennsylvania, 13 July 2024.

For politicians, the platform can be a dangerous place and the death toll of those killed while on the hustings is not inconsiderable.  Since the attempted assassination of Donald Trump (b 1946; US president 2017-2021), he and his running mate in the 2024 presidential election (JD Vance (b 1984; US senator (Republican-Ohio) since 2023)) speak behind bullet-proof glass, the threat from homicidal "childless cat ladies" apparently considered "plausible".

The familiar modern use in politics was a creation of US English meaning a “statement of political principles & policies which would be adopted or implemented were the candidates of the party to secure a majority at the upcoming election” and appeared first in 1803.  The use would have been derived from the literal platform (ie “on the hustings”) on which politicians stood to address crowds although some etymologists suggest it may have been influenced by late sixteenth century use in England to describe a “set of rules governing church doctrine” (1570s).  During the nineteenth century, platform came to be used generally as a figurative device alluding to “the function of public speaking” and even for a while flourished as a verb (“to address the public as a speaker”).

Lindsay Lohan in Saint Laurent Billy leopard-print platform boots (Saint Laurent part number 532469), New York, March 2019.

On the internet, "cancelling" or "cancel culture" refers to the social (media) phenomenon in which institutions or individuals (either “public figures” or those transformed into public figures by virtue of an incautious tweet or post (in the case of decades-old statements, sometimes something uncontroversial at the time)) are publicly “shamed”, criticized, or boycotted for a behaviour, statement or action deemed to be offensive (“problematic” often the preferred term) or harmful.  Cancelling is now quite a thing and part of the culture wars but the practice is not new, the verb deplatform (often as de-platform) used in UK university campus politics as early as 1974 in the sense of “attempt to block the right of an individual to speak at an event (usually on campus)”; the comparative noun & verb being “deplatforming”.  The unfortunate noun & verb “platforming” began in railway use in the sense of (1) the construction of platforms and (2) the movement of passengers or freight on a platform but in the early 2010s it gained a new meaning among video gamers who used it to describe the activity of “jumping from one platform to another.”  Worse still is “platformization” which refers to (1) the increasing domination of the internet by a small number of large companies whose products function as markets for content and (2) also in internet use, the conversion of a once diverse system into a self-contained platform.  Software described as “cross-platform” or “platform agnostic” is able to run on various hardware and software combinations.
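In practice, “platform agnostic” code hides operating-system differences behind a neutral interface so callers never branch on the OS themselves.  As a minimal sketch (the directory layout shown is a common convention, not a specification, and the function name is invented for illustration):

```python
import sys
from pathlib import Path

def config_dir(app: str) -> Path:
    """Return a plausible per-user configuration directory for `app`.

    Illustrative only: the single OS-specific branch lives here,
    so the rest of the program stays platform agnostic.
    """
    if sys.platform.startswith("win"):
        base = Path.home() / "AppData" / "Roaming"
    elif sys.platform == "darwin":
        base = Path.home() / "Library" / "Application Support"
    else:  # assume a POSIX-style system
        base = Path.home() / ".config"
    return base / app

# Callers use the same expression everywhere; only the base differs per OS.
print(config_dir("myapp"))
```

Python's `pathlib` does similar work for path separators: the caller writes `/` joins and the library emits the platform's native form.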

IBM: In 1983 things were looking good.

In computing, the term “platform” was in use long before “social media platforms” became part of the vernacular.  The significance of “platform” was compatibility, the rationale being that software sold by literally thousands of vendors could be run on machines produced by different companies, sometimes with quite significantly different hardware (the “bus wars” used to be a thing).  The compatibility was achieved by an operating system (OS) creating what was called the HAL (hardware abstraction layer), meaning that by a variety of techniques (most notably “device drivers”), an operating system could make disparate hardware manifest as “functionally identical” to application-level software.  So, in a sign of the times, the once vital concept of “IBM compatibility” came to be supplanted by “Windows compatibility” and the assertion in 1984 by NEC when releasing the not “wholly” compatible APC-III that “IBM compatibility is just a state of mind” was the last of its ilk; the APC-III architecture proving a one-off.  The classic computing platform became the “WinTel” (sometimes as “Wintel”, a portmanteau word, the construct being Win(dows) + (In)tel), the combination of the Microsoft Windows OS and the Intel central processing unit (CPU), an evolution traceable to IBM’s decision in 1980 to produce their original PC-1 with an open architecture using Microsoft’s DOS (disk operating system) and Intel’s 8088 (8/16 bit) CPU rather than use in-house products.  In the IBM boardroom, that at the time would have seemed a good idea but it was one which within a decade almost doomed the corporation as the vast ecosystem of “clone” PCs enriched Microsoft & Intel while cannibalizing the corporate market which had built IBM into a huge multi-national.  It is the Wintel platform which for more than forty years has underpinned the digital revolution and, like the steam engine, transformed the world.
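The HAL idea can be reduced to a toy: application software is written against an abstract device interface and per-vendor “drivers” implement it, so disparate hardware appears functionally identical from above.  The sketch below is illustrative only (the class names and hardware quirks are all invented, and no real OS driver model is this simple):

```python
from abc import ABC, abstractmethod

class DisplayDriver(ABC):
    """The abstraction layer: what the OS promises to applications."""
    @abstractmethod
    def write_text(self, text: str) -> str: ...

class VendorADisplay(DisplayDriver):
    # Hypothetical hardware quirk: this device truncates to 40 columns.
    def write_text(self, text: str) -> str:
        return text[:40]

class VendorBDisplay(DisplayDriver):
    # A different hypothetical vendor: firmware upper-cases everything.
    def write_text(self, text: str) -> str:
        return text.upper()

def application(display: DisplayDriver) -> str:
    # Application-level code sees only the abstract interface; it cannot
    # tell (and need not care) which hardware sits underneath.
    return display.write_text("hello, platform")

print(application(VendorADisplay()))  # → hello, platform
print(application(VendorBDisplay()))  # → HELLO, PLATFORM
```

The same program runs unchanged on either “device”; only the driver supplied at the bottom differs, which is the whole point of a platform.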

The noun & adjective platformative and the noun & adverb platformativity are non-standard.  Platformative was built on the model of “performative” which (1) in structural linguistics and philosophy is used to mean “being enacted as it is said” (ie follows the script) and (2) in post-modernist deconstructionist theory refers to something done as a “performance” for purposes of “spectacle or to create an impression”.  “Platformative” is understood as some sort of event or situation which is (1) dependent on the platform on which it is performed or (2) something which exists to emphasise the platform rather than itself.  Platformativity was built on the model of performativity which as a noun (1) in philosophy referred to the capacity of language and expressive actions to perform a type of being and (2) the quality of being performative.  As an adverb, it described something done “in a performative manner”.  The actual use of platformativity seems often mysterious but usually the idea is the extent to which the meaning of a “statement or act” (ie the text) is gained or changed depending upon the platform on which it transpired (something of a gloss on the idea “The medium is the message” which appeared in Understanding Media: The Extensions of Man (1964) by Marshall McLuhan (1911-1980)).

The automotive platform

In automobile mass-production, the term “platform” tended to be generic until the post-war years and was used (if at all) interchangeably with “chassis” or “frame”, the basic underlying structure used to mount the mechanical components and add the bodywork.  The growing adoption of “unitary” construction during the mid-century years radically changed the way cars were manufactured but didn’t much change the language, the underpinnings still often referred to as the “chassis” even though engineers cheerfully would point out one no longer existed.  What did change the language was the sudden proliferation of models offered by the US industry in the 1960s; where once, each line (apart from the odd speciality) had a single model, emerging in the 1960s would be ranges consisting of the (1) full-size, (2) intermediate, (3) compact and (4) sub-compact.  The strange foreign cars were so small they were often described variously as “micros” & “sub-micros”.  With such different sizes now being built, different platforms were required and these came to be known usually with titles like “A Platform”, “C Platform”, “E Platform” etc (although “A Body”, “C Body” etc were also used interchangeably).  Such nomenclature had actually been in use in Detroit as early as the 1920s but there was little public perception of the use which rarely appeared outside engineering departments or corporate boardrooms.  The concept of the platform was in a sense “engineering agnostic” because the various platforms could be unitary, with a traditional separate chassis or one of a variety of BoF (body-on-frame) constructions (X-Frame, Perimeter-Frame, Ladder-Frame etc).  Regardless, in the language of internal designation, anything could be a “platform”.

With the coming of the 1960s, the “platform” concept became the standard industry language, quickly picked up by the motoring press which observed the most notable aspect of the concept was that the design of platforms emphasised the ability to be adapted to a number of different models, often with little more structural adjustment than a (quick & cheap) stretch of the wheelbase or a slightly wider track, both things able to be accommodated on the existing production line without the need to re-tool.  The designers of platforms needed to be cognizant not only of the vehicles which would be mounted atop but also production line rationalization.  What this implies is that the more models which could be produced using a single platform, all else being equal, the more profitable that platform tended to be and some of the long-running platforms proved great cash cows.  However, if a platform (1) proved more expensive to produce than the industry average and (2) was used only on a single or limited number of lines, it could be what Elon Musk (b 1971) would now call a “money furnace”.  Such a fate befell Chrysler’s “E Platform” (usually called the “E Body”), produced between 1969-1974 for two close-to-identical companion lines.  Exacerbating the E Platform's woes was it being released (1) just before its market segment suffered a precipitous decline in sales, (2) government mandated rules began to make it less desirable, (3) rising insurance costs limited the appeal of the most profitable models and (4) the first oil shock of 1973-1974 drove a final nail into the coffin.

1960 Ford Falcon (US, left) and 1976 Ford PC LTD (Australia, right).  Both built on the "Falcon Platform", the 1960 original was on a 109½ inch (2781 mm) wheelbase and fitted with a 144 cubic inch (2.4 litre) straight-six.  By 1973, Ford Australia had stretched the platform to a 121 inch (3100 mm) wheelbase and fitted a 351 cubic inch (5.8 litre) (335 series "Cleveland") V8.      

Ford in North America introduced the Falcon in 1960 in response to the rising sales of smaller imports, a phenomenon the domestic industry had brought upon itself by making their own mainstream production bigger and heavier during the late 1950s and tellingly, when later they would introduce their “intermediate” ranges, these vehicles were about the size cars had been in 1955; they proved very popular although rising prosperity did mean sales of the full-size lines would remain buoyant until mugged by economic reality in a post oil-shock world.  The Falcon began modestly enough and while the early versions were very obviously built to a (low) price and intended to be a commodity to be disposed of when “used up”, it found a niche and Ford knew it was onto something.  The early platform wasn’t without its flaws, as Australian buyers would discover when they took their stylish new 1960 Falcon onto the outback roads which the frumpy but robust Holden handled without complaint, but it proved adaptable: in North America, the Falcon was produced between 1960-1969; it lasted from 1962-1991 in Argentina and in Australia, in a remarkable variety of forms, it was offered between 1960-2016.

On the Falcon platform: 1965 Mustang (6 cylinder, left) and 1969 Mustang Boss 429 (right).  In the vibrant market for early Mustangs, although it's the high-performance versions and Shelby American's derivatives which attract the collectors, massively out-selling such things were the so-called "grocery-getters", configured typically with small (in US terms) 6 cylinder engines and automatic transmissions.  The "grocery-getters" used to be known as "secretary's" or "librarian's" cars but such sexist stereotyping would now attract cancellation (once known as "de-platforming").

In North America however, the platform wasn’t retired when the last of the Falcons was sold in 1970 because it was used also for other larger Fords (and companion Mercury & even (somewhat improbably) Lincoln models) including the Fairlane (1962-1970), Maverick (1970–1977) & Granada (1975-1980).  Most famously of course, it was the Falcon platform which was the basis for the first generation Mustang (1964-1973); if the development costs for the Falcon hadn’t been amortized by the time the Mustang was released, the extraordinary popularity of the new “pony car” meant the profits were huge.  It’s of course misleading to suggest a machine like the 1969 Mustang Boss 429 (7.0 litre) was “underneath the body just a Falcon with a big engine” but the basic design is the same and between the early versions of the two, there are many interchangeable parts.  Later, Ford would maintain other long-lasting platforms.  The Fox platform would run between 1979-1993 (the SN95 platform (1994-2004) is sometimes called the “Fox/SN95” because it was “a Fox update” but it was so substantial most engineers list it separately) and the larger Panther platform enjoyed an even more impressive longevity; released in 1979, the final Panther wasn’t produced until 2012.