Saturday, September 20, 2025

Snarge

Snarge (pronounced snn-arj)

(1) In military & civil aviation, slang, the remains of a bird after it has collided with an airplane (ie bird strike), originally of impacts with turbine engines but latterly applied also to residue left on wings, fuselages etc.

(2) By adoption, the remains of birds and insects left on the windscreens of trains, cars, motorcycle fairings etc.

Early 2000s (probably): A portmanteau word, a blend of sn(ot) + (g)ar(ba)ge.  Snot (used here in the usual sense of “mucus, especially that from the nose”) was from the Middle English snot & snotte, from the Old English ġesnot & snott, from the Proto-West Germanic snott & snutt, from the Proto-Germanic snuttuz (nasal mucus), from the same base as snout and related to snite.  It was cognate with the North Frisian snot (snot), the Saterland Frisian Snotte (snot), the West Frisian snotte (snot), the Dutch snot (snot), the German Low German Snött (snot), the dialectal German Schnutz (snot), the Danish snot (snot) and the Norwegian snott (snot).  Trans-linguistically, “snot” is commendably consistent and its other uses (a misbehaving (often “snotty”) child; a disreputable man; the flamed-out wick of a candle) all reference something unwanted or undesirable.  That said, snot (mucus) is essential for human life: a natural, protective and lubricating substance produced by mucous membranes throughout the body, it keeps tissues moist, acts as a barrier against pathogens and irritants like dust and allergens, traps foreign particles and contains antimicrobial agents to fight infection.  So, when “out-of-sight & out-of-mind” it’s helpful mucus but when oozing (or worse) from the nostrils, it’s disgusting snot.

Garbage (waste material) was from the late Middle English garbage (the offal of a fowl, giblets, kitchen waste (though in earlier use “refuse, that which is purged away”)), from the Anglo-Norman, from the Old French garber (to refine, make neat or clean), of Germanic origin, from the Frankish garwijan (to make ready).  It was akin to the Old High German garawan (to prepare, make ready) and the Old English ġearwian (to make ready, adorn).  The alternative spelling was garbidge (obsolete or eye dialect).  Garbage can be used of physical waste or figuratively (ideas, concepts, texts, music etc judged to be of poor quality) and became popular in computing, used variously to mean (1) output judged nonsensical (for whatever reason), (2) corrupted data, (3) memory which although allocated was no longer in use and was awaiting de-allocation or (4) valid data misinterpreted as another kind of data.  Synonyms include junk, refuse, rubbish, trash & waste.  Charlie Chaplin (1889–1977) used “Herr Garbitsch” (pronounced as “garbage”) as the name of the character who in The Great Dictator (1940) represented Dr Joseph Goebbels (1897-1975; Nazi propaganda minister 1933-1945).  Snarge is a noun and no derived forms have ever been listed but a creature which has become snarge would have been snarged and the process (ie point of impact) would have been the act of snarging.  Snarge is inherently the result of a fatality so an adjective like snargish is presumably superfluous but traces of an impact which may not have been fatal presumably could be described as snargelike or snargesque.

Dr Carla Dove at work in the Smithsonian's Feather Identification Laboratory, Washington DC.

The aptronymic Dr Carla Dove (b 1962) is manager of the Feather Identification Laboratory at the Smithsonian Institution’s National Museum of Natural History in Washington DC where she heads a team identifying the types or species of birds that collide with military and civil aircraft.  She calls snarge “a term of art” (clearly she’s of the “eye of the beholder” school) and notes that although the scientific discipline of using snarge to determine the species involved in bird strikes began at the Smithsonian in 1960, the term doesn’t seem to have been coined there and its origin, like much slang with a military connection, is murky.  Although a 2003 article in Flying Safety magazine is sometimes cited as the source of the claim the word was “invented at the Feather Identification Laboratory”, Dr Dove is emphatic the staff there “borrowed it” from preparators (the technicians who prepare bird specimens for display or other uses by museums).  It certainly seems to have been in general use (in its specialized niche in military & civil aviation and wildlife safety circles) by at least the early-to-mid 2000s and the zeitgeisters at Wired magazine were in 2005 printing it without elaboration, suggesting at least in their editorial team it was already established slang.  So, it may have been colloquial jargon in museums or among those working in military or civil aviation long before it appeared in print but no documentary evidence seems to exist.

The origin of the scientific discipline is however uncontested and the world’s first forensic ornithologist was the Smithsonian’s Roxie Laybourne (1910–2003).  In October 1960, a Lockheed L-188 Electra flying as Eastern Air Lines Flight 375 out of Boston Logan Airport had cleared the runway by only a few hundred feet when it flew into a flock of birds, the most unfortunate of which damaged all four engines, resulting in a catastrophic loss of power which caused the craft to nosedive into Boston Harbor, killing 62 of the 72 aboard.  Although the engines were turbo-props rather than jets, they too were highly susceptible to bird-strike damage.  At the time, this was the greatest loss of life attributed to a bird strike and the FAA (Federal Aviation Administration) ordered all avian remains be sent to the Smithsonian Institution for examination.  There, Ms Laybourne received the box of mangled bone, blood & feathers and began her investigation, her career taking a trajectory which would include not only the development of protocols designed to reduce the likelihood of bird strikes damaging airliners but also involvement with the USAF (US Air Force) & NASA (National Aeronautics and Space Administration).  Additionally, her work with the FBI (Federal Bureau of Investigation) and various police forces proved forensic ornithology could be of use as a diagnostic tool in crime-solving, her evidence helping to convict murderers, kidnappers and poachers.  In 2025, journalist Chris Sweeney published The Feather Detective: Mystery, Mayhem, and the Magnificent Life of Roxie Laybourne, a vivid telling of the tale of a woman succeeding in a world where feminism had not yet wrought its changes.

Snarge on the nosecone of a Cessna Citation, Eisenhower Airport, Wichita, Kansas, July 2021.  The dent indicates the point of impact, the airflow holding the corpse in place.  By the time of landing, the leaked body fluids had congealed to act as a kind of glue.

The study of aviation bird strikes is obviously a specialized field but snarge has come also to be used in the matter of insect deaths, specifically what has come to be called the “windscreen phenomenon” (also “windshield phenomenon”, depending on linguistic tradition).  What that refers to is the increasingly common instances of people reporting they are seeing far fewer dead insects on the windscreens of their cars, many dating the onset of the decline to the late 1990s; the most common explanations offered are (1) climate change, (2) habitat loss and (3) the increasing use (or potency) of pesticides.  Individual observations of one’s windscreen now tending to accumulate less snarge than in years gone by are of course impressionistic and caution must be taken not to extrapolate the existence of a global trend from one piece of glass in one tiny part of the planet: what needs to be avoided is a gaboso (the acronym for Generalized Association Based On Single-Observation, also used as the derived noun & verb), the act of taking one identifiable feature of someone or something and using it as the definitional reference for a group (a relation of the logical fallacy of hasty generalization).  However, the reports of increasingly snargeless windscreens were widespread and numerous so while that didn’t explain why it was happening, it did suggest that happening it was.

There was also the matter of social media platforms, which have meant the volume of messages about a particular topic in the twenty-first century is not comparable with years gone by.  It’s simply impossible to calculate the extent to which these mass-market (free) platforms have operated as an accelerant (ie a force-multiplier of messaging) but few doubt the effect is considerable.  Still, it is striking the same observations were being made in the northern & southern hemispheres, the reference to the decline beginning in the late 1990s was also consistent and a number of studies in Europe and the US have found a precipitous drop in insect populations over the last three decades.  One interesting “quasi theory” was that the improved aerodynamic efficiency of the modern automobile meant the entomological slaughter was reduced but aeronautical engineers quickly debunked that, pointing out a slippery shape has a “buffer zone” very close to the surface which means “bugs” have a greater chance of being sucked in towards the speeding surface because of the differential between negative & positive pressure.  However, on most older vehicles, the “buffer zone” could extend as much as 3 feet (close to a metre) from the body.  A bug heading straight for the glass would still be doomed but the disturbed air all around would have deflected a few.

Lindsay Lohan with Herbie in Herbie: Fully Loaded (2005).

Herbie was a 1963 Volkswagen Type 1 (Beetle, 1938-2003) and despite the curves which made it look streamlined, its measured Cd (drag coefficient) was typically around 0.48-0.50, some 8% worse than contemporary vehicles of comparable frontal area.  What that meant was its buffer zone would extend somewhat further than that of the “New Beetle” (1997-2011), which had a Cd between 0.38-0.41, again not as good as the competition because the shape was compromised by the need to maintain a visual link with the way things were done in 1938.  On the 1963 models (like Herbie) the flat, upright windscreen created significant drag and was obviously a good device for “snarge harvesting” but the later curved screen (introduced in 1973 with the 1303) probably didn’t spare many insects.
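
For those who like to see the numbers behind such comparisons (a rough sketch only, since real-world results also depend on frontal area and air density): aerodynamic drag follows Fd = ½ρv²CdA, where ρ is the air density, v the speed, Cd the drag coefficient and A the frontal area.  At equal speed and frontal area, a Cd of 0.48 thus produces about 0.48 ÷ 0.38 ≈ 1.26 times (ie some 26% more) the drag of a Cd of 0.38 and, because drag rises with the square of speed, doubling the velocity quadruples the force.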

Dr Manu Saunders' graphic example of insect snarge on a windscreen during the "locust plague" in western NSW (New South Wales), Australia, April 2010.

Dr Manu Saunders is a Senior Lecturer in Ecology and Biology at the School of Environmental and Rural Science at Australia’s UNE (University of New England) and she pointed out that “anecdata is not scientific evidence” and just because anecdotes are commonly presented as “evidence of global insect decline” (the so-called “insectageddon”), that doesn’t of necessity make locally described conditions globally relevant.  The problem she identified was that although there have been well-conducted longitudinal studies of snarge on windscreens using sound statistical methods, all have used data taken from a relatively small geographical area while around the planet there are more than 21 million km (13 million miles) of “roads”, a distance equivalent to more than 25 round trips to the Moon.  Dr Saunders does not deny the aggregate number of insects is in decline but cautions against one data set being used to assess the extent of a phenomenon with a number of causal factors.
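
As a check on that lunar comparison (assuming the mean Earth-Moon distance of circa 384,400 km): a round trip is some 768,800 km and 21,000,000 ÷ 768,800 ≈ 27, so the road network would indeed stretch there and back more than 25 times.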

Still snarge-free: The famous photograph of the 25 917s assembled for inspection outside the Porsche factory, Stuttgart, 1969.  The FIA’s homologation inspectors declined the offer to test-drive the 25 which was just as well because, hastily assembled (secretaries, accountants and such drafted in to help), some were capable of driving only a short distance in first gear.

Fortunately for Porsche, in 1969, although the decline in global insect numbers may already have begun, they were still buzzing around in sufficient numbers to produce the snarge which provided the clue needed to resolve the problem of chronic (and potentially lethal) instability afflicting the first 917s to be tested at speed.  The 917 had been developed in great haste after the Fédération Internationale de l'Automobile (the FIA; the International Automobile Federation and world sport's dopiest regulatory body) “relaxed” the rules which previously had set a threshold of 50 identical units for cars classified as Group 4 (5 litre (305 cubic inch)) sports cars, reducing this to a minimum of 25.  What that meant was Porsche needed to develop both a car and a twelve cylinder engine, both items bigger and more complex than anything they’d before attempted, things perhaps not overly challenging had the typical two years been available, but the factory needed something ready for final testing in less than half that time.  Remarkably, they accomplished the task in ten months.

Porsche 917 LH Chassis 001 in the livery of the IAA (Internationale Automobil-Ausstellung (International Automobile Exhibition)) used for the Frankfurt Motor Show.

The brief gestation period was impressive but there were teething problems.  The fundamentals, the 908-based space-frame and the 4.5 litre (275 cubic inch) air-cooled flat-12 engine (essentially, two of Porsche’s 2.25 litre (137 cubic inch) flat-sixes joined together), were robust and reliable from the start but the sudden jump in horsepower (HP) meant much higher speeds and it took some time to tame the car’s behaviour at high speed.  Aerodynamics was then still an inexact science and the maximum speed the 917 was able to attain on Porsche’s test track was around 180 mph (290 km/h) but when unleashed on the circuits with long straights where over 200 mph (320 km/h) was possible, the early 917s proved highly unstable, the tail “wandering from side-to-side”, something disconcerting at any speed but beyond 200 mph, frightening even for professional race drivers.

On Mulsanne Straight, Le Mans: The slippery 917 LH (left) which proved "unsafe at high speed" and the (slightly) slower 917 K (right) which, in the hands of experts, was more manageable.

The instability needed to be rectified because the 917 had been designed with "a bucket of Deutsche Marks in one hand and a map of the Le Mans circuit in the other" and these were the days before the FIA started insisting chicanes be spliced into any straight where high speeds beckoned; the Mulsanne Straight at Le Mans was then an uninterrupted 6 km (3.7 mile) straight line.  There, the test results and slide-rule calculations predicted, the 917s would achieve in excess of 360 km/h (224 mph).  Serendipitously, physics and nature combined to show the team where the problem lay: after one alarming high-speed run, it was noticed that while the front and central sections of the bodywork were plastered with bloodied snarge, the fibreglass of the rear sections remained a pristine white, the obvious conclusion being that while the airflow was inducing the desired degree of down-force on the front wheels, it was passing over the rear of the body, producing the lift which induced the wandering.  Some rapid improvisation with pieces of aluminium and much duct tape (to this day a vital tool in the business) to create an ad-hoc, shorter, upswept tail transformed the behaviour and was the basis for what emerged from the factory's subsequent wind-tunnel testing as the 917 K (K for Kurzheck (short-tail)).  The rest is history.

Dodge Public Relations announces the world now has "spoilers".  Actually they'd been around for a while but, as Dodge PR knew, until it happens in America, it hasn't happened.

What happened to the 917 wasn’t novel.  In 1966, Dodge had found the slippery shape of its new fastback Charger delivered the expected speed on the NASCAR ovals but it came at the cost of dangerous lift at the rear, drivers graphically describing the experience at speed as something like “driving on ice”.  The solution was exactly what Porsche three years later would improvise: a spoiler on the lip of the trunk (boot) lid which, although only 1½ inches (38 mm) high, generated at some 150 mph (240 km/h) sufficient down-force to tame the instability.  Of course, being NASCAR, things didn’t end there and to counter the objection the spoiler was a “non-stock” modification and thus not within the rules, Dodge cited the “safety measure” clause, noting an unstable car on a racetrack was a danger to all.  NASCAR agreed and allowed the device, which upset the other competitors who cited the “equalization formula clause” and demanded they too be allowed to fit spoilers.  NASCAR agreed but set the maximum height at 1½ inches and specified they could be no wider than the trunk lid.  That left Dodge disgruntled because, in a quirk of the styling, the Charger had a narrower trunk lid than the rest of the field so everybody else’s spoilers worked better, which seemed unfair given it was Dodge which had come up with the idea.  NASCAR ignored that objection so for 1967 the factory added to the catalogue two small “quarter panel extensions”, each with its own part number (left & right); once installed, the Charger gained a full-width spoiler.

Saturday, September 6, 2025

Deodand

Deodand (pronounced dee-uh-dand)

(1) In English law (prior to 1846), an animal or a personal chattel (the scope later extended) that, having been the immediate, accidental cause of the death of a human being, was forfeited to the Crown to be sold with the money gained applied originally to pious uses.

(2) In English law (prior to 1846), a fine paid to the Crown, equal to the value of a deodand, paid by the owner of the object and applied originally to pious uses.

1520–1530: From the late thirteenth century Anglo-French deodande, from the Medieval Latin deōdandum ((a thing) to be given to God), the construct being the Classical Latin deō (to God; dative singular of deus (god)) + dand(um) (to be given; neuter gerund of dare (to give)), from the primitive Indo-European root do- (to give).  Deus was from the primitive Indo-European root dyeu- (to shine and, in derivatives, “sky, heaven, god”).  Deodand is a noun; the noun plural is deodands.

That the doctrine of deodand was a medieval legal relic (the earliest recorded instances of use in England dating from the eleventh century) is not that remarkable because it was one of a number; what’s remarkable is it remained part of the common law until the mid-1800s.  The concept was first well documented in thirteenth century legal texts and historians have concluded this “semi-codification” reflected the earlier religious tradition which held an object which caused a death was “tainted” and should be removed from profane use.  In that, it inherited the older notion from Roman civil law of noxae deditio (literally “surrender for the wrongdoing” and in English law written usually as “noxal surrender”), the construct being noxae (harm, injury, wrongdoing) + deditio (surrender, giving up).  Noxae deditio was a legal mechanism (in response to what would now be called a writ) with which the owner of an animal or slave (the Romans really did make a distinction) could avoid liability for delicts (wrongs) committed by them by surrendering the animal or slave to the injured party as an alternative to paying damages.  Intriguingly, at certain times, the doctrine was extended to sons (though apparently not daughters) in circumstances where an action was brought against a paterfamilias (the head of a household), on the basis he was held to be responsible for the son’s acts.  Literally, the son could be “handed over”, either until he attained statutory adulthood or for a specified period, depending on the damages assessed.  A similar idea was the Old English wergeld, from the Proto-West Germanic werageld, the construct being wer (man) +‎ ġield (payment).  It was a form of compensation paid by a transgressor to a victim or (as “blood money”) to the victim's family if the victim were dead (the quantum decided by social rank).  The concept is familiar in many societies and is sometimes formalized in Islamic systems under Sharia law, where the victim’s family can be involved in determining not only how much blood money should be paid but also whether a payment should be accepted as an alternative to a death sentence.

What evolved in English common law was the rule under which, if a person was killed by an animal, vehicle, tool or other inanimate object, that object was declared a “deodand” to be forfeited to the Crown.  Reflecting the theological basis for this, notionally the surrender was “to God”, but quickly the standard practice became to appraise the value of the beast or object and levy a fine in that sum.  Although the documentary evidence is patchy, it appears originally the forfeited property (or cash from the fine) was devoted to pious uses such as alms (ie charity for the poor) or (as was the usual trend when a revenue stream was identified) ecclesiastical purposes such as building churches or stained glass windows.  Later (another trend being squabbles between church & state), deodands became a source of consolidated royal revenue.  The rationale was partly religious (atonement), partly superstitious (removing the dangerous object) and partly fiscal (Crown revenue).

The school bus scene: In Mean Girls (2004), had Regina George (Rachel McAdams (b 1978)) been killed by the school bus, the vehicle would have been declared a deodand and forfeited to the state although the usual practice was for its value to be assessed and an order for a payment in that sum to be served on the owner.

It was a simple concept but because there was much variation in the circumstances in which a deodand could be declared, the case law reveals inconsistencies in the verdicts.  Were someone to be killed by being run over by a horse-drawn cart, depending on this and that, the deodand might be found to be the cart and horse, the cart or horse alone or even just the particular wheel which crushed the unfortunate deceased.  One of the reasons for the variance was that in many instances the matter was determined not by a judge or magistrate working from precedent but (at coroners’ inquests) by juries which would both define the deodand and assess its value.  Given that, on what appear to be similar facts (a sailor who drowned after being struck by a mast), the deodand might be found to be the whole vessel or merely the mast.  In such cases, the issue was which object (or part of an object) should be held to be the “guilty instrument”, a question not simple to resolve, things made more difficult still by the opinions of jury members being so diverse and prone to be influenced by the identity of both the victim(s) and the owner of the object(s).

Aftermath of the explosion of a locomotive’s steam boiler.  If reduced to scrap by the event in which someone died, the jury could assess the value of the object in its "pre-event" condition.

By the eighteenth century, deodands had become largely devices of reference: actual confiscation of objects was rare, the standard practice being to assess their monetary value and set the fine to be paid accordingly.  Lawyers, politicians and (especially) those in commerce were critical of the system as irrational and even then there were traces of what would evolve as the modern notions of negligence and responsibility; critiques of deodand came both from what would now be described as “the right” and “the left”.  Those who owned the objects which became lethal instruments argued it was unfair they be punished so severely for what were, however tragic, “mere accidents”, pointing out the system discouraged industrial enterprise, while those advocating for victims pointed out it was the state which gained the proceeds of the fines while victims’ families (many of which had lost their sole breadwinner) gained nothing.  What finally brought about the end of deodand was it being overtaken by the industrial age in which deaths came routinely to occur in clusters.  It was the multiple fatalities in marine and train accidents (infamously the Hull Tragedy (1838) and the Sonning Cutting Disaster (1841)) which attracted press coverage and public debate; in each case a “certificate of deodand” was attached to the machinery and, given the cavalier attitude of railway operators towards safety, it was hardly surprising coroners’ juries had little hesitation in declaring a locomotive and its rolling-stock a deodand.  That was obviously an expensive threat to capitalism and the lobbying by these vested interests resulted in parliament abolishing deodands by the Deodands Act 1846 (9 & 10 Vict. c.62).

Tallahassee Democrat, 13 October 1991.

The Daytona Yellow 1969 Chevrolet Corvette ZL1 coupé is the rarest and most valuable C3 Corvette (1968-1982) made, the “other ZL1”, a Monaco Orange roadster, having a less pure pedigree (although at auction in January 2023 it realized US$3.14 million).  The yellow ZL1 last changed hands in October 1991 when it was sold in a government forfeiture auction for US$300,000 (then a lot of money) after being seized by the DEA (Drug Enforcement Administration).

The Act however was part of a reform process and the early initiatives included the statutes which would by the mid twentieth century evolve into modern negligence and compensation law, the most significant of the early steps being the Fatal Accidents Act 1846 (Lord Campbell’s Act) which for the first time codified the idea of the “wrongful death claim” and permitted families to sue on this basis.  Although now largely forgotten, the 1846 act was a significant marker of the transition of English law from a medieval, semi-religious system of atonement to a modern, rationalized law of tort, product liability and compensation.

Echoes do however remain in certain legal doctrines of forfeiture (such as state seizures of the proceeds of crime) and the US practice of civil asset forfeiture does, at least in a philosophical sense, sometimes treat property as “guilty”.  The US law provides for property (cars, boats, money etc) connected with the commission of a crime to be seized by the state even if the owner, personally, wasn’t “guilty”; it’s a modern interpretation of the medieval view the object itself bore responsibility.  What this means is the legal rationale is structurally similar to what once was the religious justification: what once was “given to God” as expiation for sin translates now into deterrence as an expression of public policy (removing dangerous tools or preventing criminals from profiting).  As a kind of “legal fiction”, under both regimes the object is treated as if it possesses some kind of independent agency.  Intriguingly, as an administrative convenience, that idea survived in Admiralty Law under which vessels can in suits be “personified”, thus cases like “The SS <ship name> v. Cargo”, the model for civil asset forfeiture procedures in which the object is the defendant (such as United States v. One 1969 Chevrolet Corvette).

Building from Biblical tradition, the idea of independent agency had a curious history in the legal systems of Christendom and in Europe from the Middle Ages through the early modern period, animals could be put on trial (in both secular and ecclesiastical courts) for murder.  These trials followed legal procedures similar to those in which a human was the accused although, obviously, cross-examination was somewhat truncated.  The most commonly tried animals were pigs, simply because it wasn’t uncommon for them freely to roam in urban areas and attacks on babies and infants were frequent.  In Normandy in 1386, a sow was dressed in human clothing and publicly executed for killing a child while at Châlons in 1499, a sow and her six piglets were tried; the sow was executed for killing a man, while the piglets were acquitted due to “lack of evidence”.  Nor were the defendants exclusively porcine, bulls and horses occasionally being executed for killing people, and in ecclesiastical courts there are many records of rodents and insects being charged with damaging crops.  Presumably because every day of the week rodents and insects were killed just for “being guilty of being rodents and insects”, ceremonial executions wouldn’t have had much symbolic value so the usual result handed down was excommunication(!) or a demand (from God, as it were) the creatures vacate the fields in which they were consuming the crops.

Perpetually hungry weevils enjoying lunch in a granary.

Sometimes the ecclesiastical courts could be imaginative.  In the Italian region of Tyrol in 1713, the priests ordered the hungry weevils to leave the vineyards where they were such a plague but in compensation granted them occupation of a barren piece of land as an alternative habitat.  The reaction of the insects to the ruling would have been rather as King Cnut (better known as Canute, circa 990–1035; King of England 1016-1035) would have predicted but despite that, there’s no record of the weevils being held in contempt of court.  Regrettably, there's no generally accepted collective noun for weevils but weevilage (a portmanteau word, the blend being weevil + (vill)age) seems more compelling than Adelognatha (the scientific term referring to a group of Curculionidae (a family of weevils) characterized by a specific anatomical feature).  There was at least some theological basis for the ecclesiastical courts claiming entomological jurisdiction because in scripture it was written beasts are God’s creatures like all others and over them God granted dominion to man (Genesis 1:26-28, King James Version of the Bible (KJV, 1611)):

26 And God said, Let us make man in our image, after our likeness: and let them have dominion over the fish of the sea, and over the fowl of the air, and over the cattle, and over all the earth, and over every creeping thing that creepeth upon the earth.

27 So God created man in his own image, in the image of God created he him; male and female created he them.

28 And God blessed them, and God said unto them, Be fruitful, and multiply, and replenish the earth, and subdue it: and have dominion over the fish of the sea, and over the fowl of the air, and over every living thing that moveth upon the earth.

Bovine trial in progress, rendered as a line drawing by Vovsoft.

The principle was animals could be held accountable for causing harm and this was taken especially seriously when the harm caused was something like that of a crime a human might commit (like murder) and in the secular courts, if the victim was someone of some importance, the proceedings could involve defense lawyers, witnesses, and formal sentencing.  In the ecclesiastical courts, it was more symbolic or ritualistic: insects and rodents might be “summoned” but of course they never turned up so excommunication or other curses were invoked.  By the eighteenth century, the thinkers of the Enlightenment had prevailed and the idea of animals as moral agents was so ridiculed the practice of charging them was almost wholly abandoned although in certain circumstances an owner could be held liable for the damage they caused.  There was though the odd, rural holdout.  In Normandy in 1845 a sow was executed for killing a child (in the legal archives listed as the last “classic pig trial” (the last in the US held in New Hampshire in 1819)) and in Switzerland in 1906 a dog was sentenced to death for a similar offence (this believed to be Europe’s last “animal trial”).

Tuesday, September 2, 2025

Suicide

Suicide (pronounced soo-uh-sahyd)

(1) The intentional taking of one's own life.

(2) By analogy, acts or behavior which, whether intentional or not, lead to the self-inflicted destruction of one's own interests or prospects.

(3) In automotive design, a slang term for rear doors hinged from the rear.

(4) In fast food advertising, a niche-market descriptor of high-calorie products deliberately or absurdly high in salt, sugar and fat.

(5) A trick in the game Diabolo where one of the sticks is released and allowed to rotate 360° round the Diabolo until it is caught by the hand that released it.

(6) In Queensland (Australia) political history, as suicide squad, the collective name for the additional members of the Legislative Council (upper house) appointed in 1921 solely for the purpose of voting for its abolition.

(7) In sardonic military slang, as suicide mission, a description for an operation expected to suffer a very high casualty rate.

(8) A children's game of throwing a ball against a wall and at other players, who are eliminated by being struck.

(9) Pertaining to a suicide bombing, the companion terms being suicide belt & suicide vest.

(10) In electrical power, as "suicide cable (or cord, lead etc)", a power cord with male connections at each end, used to inject power from a generator into a structure's wiring system (highly dangerous if incorrectly used).

(11) In drug slang, the depressive period that typically occurs midweek (reputedly mostly on Tuesdays) following weekend drug use.

(12) In US slang, a beverage combining all available flavors at a soda fountain (known also as the "graveyard" or "swamp water").

(13) As "suicide runs" or "suicide sprints", a form of high-intensity sports training consisting of a series of sprints of increasing lengths, each followed immediately by a return to the start, with no pause between one and the next.

1651: From the New Latin suīcīdium (killing of oneself), from suīcīda, thought probably of English origin, the construct being the Latin suī (genitive singular of the reflexive pronoun se (one’s self), from suus (one’s own)) + -cīdium (whence the suffix forms -cīda & -cide), from caedere (to kill).  The primitive Indo-European root was s(u)w-o (one's own), from the earlier s(w)e; the new coining displaced the native Old English selfcwalu (literally “self-slaughter”).  Pedantic scholars of Latin have never approved of the word because, technically, the construct could as well be translated as “the killing of a sow” but, in medieval times, purity had long deserted Latin and never existed in English.  The modern meaning dates from 1728; the term in the earlier Anglo-Latin was the vaguely euphemistic felo-de-se (one guilty concerning himself).  It may be an urban myth but there was a story that a 1920s editor of the New York Times had a rule that anyone who died in a Stutz Bearcat would be granted a NYT obituary unless the death was a suicide.  Suicide is a noun & verb, suicidal is a noun & adjective, suicider, suicidology, suicidalist, suicidality, suicidalness & suicidism are nouns, suicidogenic is an adjective, suicided is a verb & adjective, suiciding is a verb and suicidally is an adverb; the noun plural is suicides.

Terms like “professional suicide”, “commercial suicide” and “career suicide” are, even in the era of trigger warnings, still used, as is “political suicide”, and it is a word politicians like to use (of their opponents).  Paul Keating (b 1944; Prime Minister of Australia 1991-1996), having read the Fightback! political manifesto prepared for the 1993 general election by the Liberal Party’s then leader Dr John Hewson (b 1946; leader of the Liberal Party of Australia 1990-1994), declared it “the longest suicide note in Australian political history”, a critique which seems first to have been made by a member of the Canberra press gallery although a similar phrase had a decade earlier been used in the UK by Labour Party politician Sir Gerald Kaufman (1930–2017) when damning his own party’s 1983 platform.  An extraordinary 650(!) pages, Fightback! reflected well on Dr Hewson’s background as an academic neo-liberal economist but as something to persuade voters to vote Liberal it was monumentally bizarre and nobody has since attempted anything like it.  Dubbed at the time (for many a good reason) “the unlosable election”, lose in 1993 Dr Hewson did and to this day Fightback! is blamed.

Bloody Bob hasn't!, John Clarke (1948–2017) and Bryan Dawe (b 1948), ABC Television 7:30 report, Monday 15 March, 1993.  

A footnote to the unexpected result in the 1993 election was an exposure of the dangers inherent in pre-recording television material for later broadcast.  The conventional wisdom was a significant factor in Labor's impending defeat was that Mr Keating had allowed his personal ambition to become prime minister to prevail over the interests of the party and that in deposing Bob Hawke (1929–2019; Prime Minister of Australia 1983-1991), who'd won the previous four elections, he'd sacrificed any hope of gaining a fifth term.  The satirists John Clarke and Bryan Dawe produced a skit using their “pseudo interview” technique in which they followed the documentary model of the ABC’s (Australian Broadcasting Corporation) Labor in Power series, depicting the political rivals as two children squabbling over whose turn it was with the toy.  The final question asked “Paul” which of them now had the toy, to which he replied “Bloody Bob hasn’t!”.  The punch-line would have worked had Mr Keating had the decency to lose the election but of course he won, so the joke went flat.

UK Prime Minister Lord Salisbury (Robert Arthur Talbot Gascoyne-Cecil, 1830–1903; UK Prime Minister for thirteen years variously 1885-1902) remarked of the long, sad decline of Lord Randolph Churchill (1849–1895) that the deceased had proved to be “chief mourner at his own protracted funeral” and confided to colleagues “the man committed suicide as surely as if he had blown his brains out”.  Kaiser Wilhelm II (1859–1941; German Emperor & King of Prussia 1888-1918) remarked of the ill-advised book published by one politician whose career had imploded that it was probably the “…first time a man has committed suicide twice”.  Not noted for his wit, that may have been Wilhelm’s finest moment although it does vie with his observation on hearing that, in deference to the state of war between their two nations, the British Royal family was changing its name to “Windsor”: the Kaiser said he hoped soon to attend a performance in Berlin of William Shakespeare’s (1564–1616) “The Merry Wives of Saxe-Coburg and Gotha”.

Comrade Stalin (1878-1953; Soviet leader 1924-1953) arranged a few “suicides” and in a nice touch sometimes appeared at the funeral as chief mourner whereas Adolf Hitler (1889-1945; Führer (leader) and German head of government 1933-1945 & head of state 1934-1945) in similar circumstances seems to have restricted himself to sending a wreath and, for the especially exalted, authorizing a state funeral.  Although doubtlessly it's all just bad luck and coincidence, it is striking how many sources on various platforms have compiled lists of the remarkable number of "suicides" in some way associated with Bill (b 1946; US president 1993-2001) & crooked Hillary Clinton (b 1947; US secretary of state 2009-2013).  It's an impressively large toll but, in fairness, Socks (1989-2009; FCOTUS (First Cat of the United States 1993-2001)) did live an untypically long 20-odd years although he escaped the Clintons' clutches after 2001.

A pioneer in the field of suicidology, Dr Shneidman’s publication record was indicative of his specialization.

Dr Edwin Shneidman (1918-2009) was a clinical psychologist who practiced as a thanatologist (a practitioner in the field of thanatology, the scientific study of death and the practices associated with it, including the study of the needs of the terminally ill and their families); the construct of thanatology being thanato- (from the Ancient Greek θάνατος (thánatos) (death)) + -logy.  The suffix -ology was formed from -o- (as an interconsonantal vowel) + -logy.  The origin in English of the -logy suffix lies with loanwords from the Ancient Greek, usually via Latin and French, where the suffix (-λογία) is an integral part of the word loaned (eg astrology from astrologia), a practice dating from the sixteenth century.  French picked up -logie from the Latin -logia, from the Ancient Greek -λογία (-logía).  Within Greek, the suffix is an -ία (-ía) abstract from λόγος (lógos) (account, explanation, narrative), that being a verbal noun from λέγω (légō) (I say, speak, converse, tell a story).  In English the suffix became extraordinarily productive, used notably to form names of sciences or disciplines of study, analogous to the names traditionally borrowed from the Latin (eg astrology from astrologia; geology from geologia) and by the late eighteenth century, the practice (despite the disapproval of the pedants) extended to terms with no connection to Greek or Latin such as those building on French or German bases (eg insectology (1766) after the French insectologie; terminology (1801) after the German Terminologie).  Within a few decades of the intrusion of modern languages, combinations emerged using English terms (eg undergroundology (1820); hatology (1837)).  In this evolution, the development may be thought similar to the latter-day proliferation of “-isms” (fascism; feminism etc).  Like many working in the field, Dr Shneidman discussed the effect suicides have on the friends and family of those who took their own lives and there are to these events many responses beyond the obvious.

Geli Raubal.

One especially curious relationship in the anyway strange life of Adolf Hitler was that he enjoyed with his niece Geli Raubal (1908–1931), the daughter of his elder half-sister Angela (1883–1949) who acted as his housekeeper; despite much speculation, it has never fully been explained and quite what transpired between them will probably never be known.  Most historians have concluded Hitler was obsessed with Geli although whether that meant he was “in love” divides opinion, a substantial body of those working in the field suspecting Hitler was no more capable of love than he was of true friendship.  One day in 1931, in the room he’d allotted to her in his Munich apartment, after Hitler had been driven off for a speaking engagement in Hamburg, Geli committed suicide, shooting herself with her uncle’s Walther PP pistol; she was then 23.  The (pre Nazi-state) Munich police ruled the death a suicide but, inevitably, there has long been speculation about her death, the most popular “theory” being Hitler, in a rage, accidentally or intentionally shooting her after discovering her pregnancy, variations of the speculation suggesting the unborn child was either his or that of another man.  There is no substantive evidence to support any of these notions but Hitler’s subsequent reaction and apparent grief was well documented and from the moment he heard of her death he never again ate meat, telling the noted hunter and definitely carnivorous Hermann Göring (1893–1946; leading Nazi 1922-1945, Hitler's designated successor & Reichsmarschall 1940-1945): “It’s like eating a corpse.”

Suicide Squads

Henry Asquith (1852-1928) and his youthful friend Venetia Stanley (1887–1948).

Although few were quite as vituperative as Paul Keating, who once described the members of the Australian Senate as "unrepresentative swill", governments in the twentieth century often found upper houses to be such a nuisance they schemed and plotted ways to curb their powers or, preferably, do away with them entirely.  As the electoral franchise was extended, governments were sometimes elected with what they considered a mandate to pursue liberal or progressive policies while upper houses, by virtue of their composition and tenure (some with life-time appointments), often acted as an obstruction, rejecting legislation or imposing interminable delays by sending proposed laws to be “discussed to death” in committees from which “nothing ever emerged”.  This was the situation which confronted the glittering Liberal Party cabinet of HH Asquith (1852–1928; UK prime minister 1908-1916) which in 1909 found the Lords, in defiance of long established convention, blocking passage of the budget.  The Lords was wholly unelected, its membership mostly inherited, sometimes by virtue of some service (virtuous or otherwise) by an ancestor hundreds of years before.  Successive elections didn’t resolve the crisis and Asquith resolved to pursue the only lawful mechanism available: the creation of as many peers as would be necessary (in the hundreds) to secure the passage of his legislation.

Terry Richardson's (b 1965) suicide-themed shoot with Lindsay Lohan, 2012.

That of course required royal assent and the newly enthroned George V (1865–1936; King of the United Kingdom & Emperor of India 1910-1936), while making his reservations clear, proved a good constitutional monarch and made it known he would follow the advice of his prime minister.  As it turned out, the “suicide squad” wasn’t required, their Lordships, while not at all approving of the government, being more appalled still at the thought of their exclusive club being swamped with “jumped-up grocers” in “bad hats”, and they allowed the legislation to pass.  Actually, “castration squad” might have been a more accurate description because while the Lords survived, Asquith ensured it would be less of an obstacle, substituting for the road block of its power of veto a speed-bump: a right to impose a two-year delay (in 1949 reduced to one year).  The New Labour administration (1997-2010) introduced further reforms which were designed eventually to remove from the Lords all those who held seats by virtue of descent and even the Tories later moved in that direction although the efforts have stalled and a few of the hereditary peers remain.  As things now stand, the last remaining absolute veto the Lords retain is to stop an attempt by a government to extend a parliament's life beyond five years.

The preserved Legislative Council chamber in Queensland's Parliament House.

Some upper house assassins however truly were a suicide squad.  In Australia, the state of Queensland followed the usual convention whereby the sub-national parliaments were bicameral, the Legislative Council being the upper house and, like the others, a bastion of what might now be called "those representing the interests of the 1%" and a classic example of white privilege.  Actually, at the time, the lower houses were also places of white privilege but the Australian Labor Party (ALP) had long regarded the non-elected Legislative Council (and upper houses in general) as undemocratic and reactionary so in 1915, after securing a majority in the Legislative Assembly (the lower house) which permitted the party to form government, it sought abolition.  The Legislative Council predictably rejected the bills passed by the government in 1915 & 1916 and a referendum conducted in 1917 was decisively lost; undeterred, in 1920, the government requested the governor appoint sufficient additional ALP members to the chamber to provide an abolitionist majority.  In this, the ALP followed the example of the Liberal Party in the UK which in 1911 prevailed upon the king to agree to appoint as many new peers as might be needed for their legislation to pass unimpeded through an otherwise unsympathetic House of Lords.  That wasn’t needed as things transpired but in Queensland, the new members of the Legislative Council duly took their places and on 26 October 1921, the upper house voted in favor of abolition, the new appointees known forever as "the suicide squad".  Despite the success, the trend didn't spread and the Commonwealth parliament and those of the other five states remain bicameral although the two recent creations, established when limited self-government was granted to the Northern Territory (NT) and the Australian Capital Territory (ACT), both have unicameral assemblies.

Margot Robbie (b 1990) in costume as Harley Quinn (a comic book character created by DC Comics), Suicide Squad (2016).

Across the Tasman Sea (which locals call "the ditch"), the New Zealand upper house lasted another three decades but its eventual demise came about not because of conflict but because the institution was increasingly viewed as comatose, rejecting nothing, contributing little and rarely inclined even to criticize.  Unlike in England and Queensland, in New Zealand the abolition movement enjoyed cross-party support, left and right (although the latter in those days were pretty leftist), united in their bored disdain.  One practical impediment was the New Zealand parliament couldn’t amend the country’s constitution because no government had ever bothered to adopt the Statute of Westminster (1931) by which the Imperial Parliament had granted effective independence to the Dominions but in 1947 this was done.  Despite that, the Labour Party didn’t act and it was a National Party administration which, after prevailing in the 1949 general election, passed the Legislative Council Abolition Act, its passage assured after a twenty-member “suicide squad” was appointed, and the upper house’s meeting of 1 December 1950 proved its last.  Opposition from within the chamber had actually been muted, presumably because, to sweeten the deal, the government used some of the money saved to pay some generous “retirement benefits” to the displaced politicians.  New Zealand since has continued as a unitary state with a unicameral legislature.

Pineapples.

In the Far East (the practice documented in Japan, the PRC (People's Republic of China) and the renegade province of Taiwan), fruit sellers offer pineapples for sale on the basis of “Murder” (谋杀 and variants) or “Suicide” (自殺する and variants).  Ominous as it sounds, it's just commercial shorthand.  Pineapples being more difficult to handle than many fruits, fruit shops offer the “murder” service in which staff will (for a small fee) peel and chop as required.  Those prepared to do their own preparation at home can take the “suicide” option and (at a lesser cost) purchase the whole fruit, skin and all.  There are many reasons to eat pineapple.

Suicide doors

1928 Mercedes-Benz Nürburg (W08) with four rear-hinged doors.

It wasn’t until the 1950s the practice of hinging doors from the front became (almost) standardized.  Prior to that, they’d opened from the front or rear, some vehicles featuring both.  The rear-hinged doors became known as suicide doors because they were genuinely dangerous (in the pre-seat belt era): if one opened while the car was at speed, the physics of the airflow flung it fully open, dragging the passenger into the airstream.  Additionally, it was said they were more likely to injure people if struck by passing vehicles while being opened although the consequences of being struck by a car sound severe whatever the circumstances.

2021 Rolls-Royce Phantom VIII Tempus.

Still used in the 1960s by Lincoln, Ford and Rolls-Royce, they were phased out as post-Nader safety regulations began to be applied to automotive design and were thought extinct when the four door Ford Thunderbirds ceased production in 1971.  However, after being seen in a few design exercises over the decades, Rolls-Royce included them on the Phantom VII, introduced in 2003, the feature carried over to the Phantom VIII in 2017.  Like other manufacturers, Rolls-Royce has no fondness for the term suicide doors, preferring to call them coach doors; nomenclature from other marketing departments including flex doors and freestyle doors.  Engineers are less impressed by silly words, noting the correct term is rear-hinged and these days, mechanisms are included to ensure they can be opened only when the vehicle is at rest.  Encouraged by the reaction, Rolls-Royce brought back the rear-hinged door for their fixed (FHC) and drop-head (DHC) coupés although, despite the retro-touch, the factory seems now content usually to call them simply coupés and convertibles.  

1971 Ford Thunderbird Landau.

In a nod to a shifting market, when the fifth generation Thunderbird was introduced in 1967, the four-door replaced the convertible which had been a staple of the line since 1955.  The four-door was unique to the 1967-1971 generation, its replacement offered only as a coupé.  The decision effectively to reposition the model was taken to avoid a conflict with the new Mercury Cougar, the Thunderbird moving to the "personal coupé" segment which would become so popular.  So popular in fact that within a short time Ford would find space both for the Thunderbird and the Continental Mark III, changing tastes by the 1970s meaning the Cougar would also be positioned there along with a lower-priced Thunderbird derivative, the Elite.  Such was the demand for the personal coupé that one manufacturer successfully could support four models in the space, sometimes with over-lapping price-points depending on the options.  The four-door Thunderbirds are also unique in being probably the only car ever built whose appearance was improved by a vinyl roof, the unusual semi-integration of the rear door with the C pillar necessitating something be done to try to conceal the ungainliness, the fake "landau irons" part of the illusion.

1967 Lincoln Continental convertible.  The later cars with the longer wheelbase are popular as wedding cars because the suicide doors can make ingress & egress more elegant for brides with big dresses although those with big hair often veto the lowering of the roof until after the photos have been taken.

The combination of the suicide door, the four-door coachwork and perhaps even the association with the death of President Kennedy has long made the convertible a magnet for collectors but among American cars of the era, it is different in that although the drive-train is typical of the simple, robust engineering then used, it's packed also with what can be an intimidating array of electrical and hydraulic systems which require both expertise and equipment properly to maintain.  That need has kept a handful of specialists in business for decades, often rectifying the mistakes of others.  It was unique; after the last of the even rarer Mercedes-Benz 300d Cabriolet Ds left the line in 1962, Lincoln alone offered anything in the once well-populated niche.

LBJ's 1964 Lincoln Continental convertible.

The four-door convertible's most famous owner was Lyndon Johnson (LBJ, 1908–1973; US president 1963-1969) who would use it to drive visitors around his Texas ranch (often with an opened can of Pearl beer in hand, according to LBJ folklore).  While never a big seller (21,347 made over seven years and fewer than 4,000 sales even in its best year), it was the most publicized of the line and to this day remains a staple in film & television productions needing verisimilitude of the era.  The convertible was discontinued after 1967, when 2,276 were built, the two-door hardtop introduced the year before out-selling it five to one.  The market had spoken; it would be the last convertible Lincoln ever produced and it's now a collectable, LBJ's 1964 model selling at auction in 2024 for US$200,000 and fully restored examples without a celebrity connection regularly trading at well into five figures, illustrating the magic of the coach-work.

A mother watching her daughter enter her 1963 Lincoln Continental, the door held open by the girl's brother.  These are two of the family's 2.66 (1964 average) children.

Ford's advertising agency rose to the occasion when producing copy for the four-door convertible.  They certainly had scope because the car was unique, so many superlatives and adjectives which usually were little more than "mere puffery" would in this case have been literally true.  It was though a case of making "a silk purse from a sow's ear" because Lincoln adopted the suicide doors only because the car's wheelbase was too short for conventionally (forward) hinged doors to provide a sufficiently wide gap for entry and exit.  While that may sound a strange thing to plague a new design, the 1961 Continental was built on the platform of a proposed Ford Thunderbird which would have been available only with a two-door body and despite what the advertising copy suggests, even with the use of suicide doors, access to the rear compartment was tight, something not rectified until the wheelbase (123 inches (3,124 mm) for 1961-1963 & 126 inches (3,200 mm) for 1964-1969) was extended.

Lincoln Continental concepts, Los Angeles Motor Show, 2002 (left) and New York Motor Show, 2015 (right).

The Lincoln Continental for decades remained successful after the "great down-sizing" began in 1979 and despite the perceptions of some, the generation which was least-well received was that (1982-1987) based on Ford's smaller "Fox" platform, sales rebounding when the larger eighth generation (1988-1994) made its debut, and that was despite the switch to FWD (front-wheel-drive) and the lack of a V8.  Clearly, for Lincoln buyers it was size which mattered rather than the details of what lay beneath and presumably many neither knew, nor could tell, nor cared it was FWD, a configuration which anyway increased interior space, something of more tangible benefit to most than what could be achieved on a slalom course.  Interest by the late 1990s was however dwindling and the nameplate suffered a fourteen-year hiatus between 2002 and 2016.  Unfortunately, the resuscitation (without suicide doors) used as its inspiration the concept car displayed at the 2015 New York International Auto Show rather than the one so admired at Los Angeles in 2002.  The LA concept might not have been original but it was an elegant and accomplished design, unlike what was offered in NYC thirteen years later: a dreary mash-up which looked something like a big Hyundai or a Chinese knock-off of a Maybach.  The public response was muted.

2019 Lincoln Continental Eightieth Anniversary Edition.

The tenth generation (2017-2020) managed what were by historic standards modest sales and by 2019 it seemed clear the thing was on death-watch, but Lincoln surprised the industry with a batch of eighty LWB (long wheelbase) models with suicide doors to mark the eightieth anniversary of the Continental’s introduction in 1939.  Although there were those who suggested the relatively cheap process of a stretch and a re-hinge of the back-doors was a cynical way to turn a US$72K car into one costing US$102K and was likely aimed at the Chinese market (where a higher price tag and more shiny stuff is thought synonymous with good taste), the anniversary models were in fact sold only in the home market.  Although even at the high price there was enough demand to induce Ford to do a run of another 150 (non-commemorative) suicide-door versions for 2020, the retro gesture proved not enough to save the breed and it was announced production would end on 30 October 2020 with no replacement listed.  Not only was the announcement expected but so was the reaction; the market having long lost interest in the uninspiring twenty-first century Continentals, few expressed regret.  The name-plate however, one of the most storied in the Ford cupboard, will doubtless one day return.  What it will look like is unpredictable but few expect it will match the elegance of what was done in the 1960s.

Haile Selassie I (1892-1975; Emperor of Ethiopia 1930-1974) being received by a ceremonial guard after alighting from the 1966 Vanden Plas Princess 4 Litre (DM4) Limousine of the Governor-General of Jamaica, 21 April 1966 (left) and Vanden Plas Princess with suicide doors open (right).

Emperor Haile Selassie’s 1966 state visit to Jamaica and the Caribbean has since been celebrated by Rastafari as “Grounation Day”, the term based on the emperor declining to walk on the red carpet provided in accordance with protocol because he wished to “make contact with the soil”.  Among many of the Rastafari (a movement which emerged in the 1930s, taking its name from Ras Tafari, the emperor’s pre-imperial name and title), Haile Selassie was worshipped as God incarnate, the messiah who delivered the peoples of Africa and the African diaspora to freedom from colonial oppression.  The limousine had been delivered to the island some six weeks earlier for the use of Elizabeth II (1926-2022; Queen of the UK and other places, 1952-2022) during her royal tour, after which she returned to London and the car was re-allocated to Government House as the viceroy’s official vehicle.  While it looked like something left over from pre-war days, for its intended purpose it was ideal, the rear compartment capacious, luxuriously trimmed and tall, making it suitable for those wearing even the highest plumed hats.  Into this welcoming space, occupants stepped through suicide doors which offered unparalleled ease of entry and departure, especially for the diminutive Haile Selassie who would barely have needed to bow his head.

1965 Vanden Plas Princess 4 Litre (DM4) Limousine Landaulette (left) and 1940s advertisement for Dickson automatic rear door-locks.

Based on a car which, even upon its debut in 1952, seemed old-fashioned, by 1968 when production finally ended the Vanden Plas Princess was, stylistically and technically, a true relic and it’s remarkable that, complete with a split windscreen of two flat panes, it was a contemporary of machines like the Lincoln Continental, Jaguar XJ6 and NSU Ro80.  It was very much a case of it being better to be inside a DM4 among burled walnut and West of England Cloth (durable leather was for chauffeurs and other servants who rode up front) looking (and for some, waving) out than on the outside looking in.  What must seem even more remarkable is that despite the doors picking up a nickname like “suicide doors”, governments for decades did nothing to compel manufacturers to fit the small, cheap mechanisms (available on the aftermarket for US$3.95 a pair) which would prevent the doors opening while the car was in motion.  These potentially life-saving devices were not expensive and had they been installed in bulk on production lines, the unit cost would not much have exceeded US$1.00.  It was another world and not until the 1960s did the rising death toll compel legislatures to take seriously the matter of automotive safety.

1968 Vanden Plas Princess 4 Litre (DM4) "facelift prototype".

Vanden Plas did in 1968 belatedly plan an update of the DM4 which sort of "brought it into the 1950s" although for the target market, that may have been no bad thing.  By then however Harold Wilson's (1916–1995; UK prime minister 1964-1970 & 1974-1976) Labour Party government had engineered the "great coming together" which was the ultimately doomed British Leyland and with Jaguar also in the conglomerate, their much more advanced Daimler DS420 (1968-1992) limousine was obviously superior and there was no place for the "modernized DM4", the grafting of quad headlights and a one-piece windscreen not enough to save the relic from extinction.  Along with New Zealand's curious hybrid model of the 1970s, the Wilson government's experiment was among the West's few serious attempts to combine political freedom with a quasi-socialist planned (if not quite command) economy and the reactions to the lessons provided by British Leyland (and other state ventures) contributed to the hegemony of the neo-liberal model which for the last four decades-odd has done what it's done.

When used by the wedding and hire car industries, some operators took advantage of many of the English limousines of the 1950s & 1960s being fitted with a version of the GM (General Motors) Hydramatic automatic transmission, installing in each centre-post a dead-bolt activated by an electrical solenoid, the system triggered “on” by the shift lever being moved to Drive (locking the rear doors) and “off” by moving the lever to Neutral (withdrawing the bolt).  Vanden Plas did at least on some models include on the dashboard a pair of red lights which brightly would glow if the corresponding left or right door was not completely closed.  The much more expensive Rolls-Royce limousines had no such “safety lights”; passengers in those were on their own.  It was not a theoretical problem because there were many documented cases of passengers (especially those sitting, without seat belts, in the jump-seats) leaning against the doors and sometimes pressing down the handle, causing a door to open.
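For those who like their mechanisms made explicit, the interlock reduces to two rules: bolt thrown when the lever sits in Drive, bolt withdrawn when it is moved to Neutral.  The few lines of Python below are a minimal sketch only, assuming nothing beyond that description; the function and names are invented for illustration and the real hardware was of course a solenoid wired to the shift linkage, not software:

# Illustrative sketch only of the shift-lever/door-bolt interlock described above;
# the period hardware was an electrical solenoid, not code (names are hypothetical).
def rear_door_bolt_engaged(lever_position: str) -> bool:
    """True = solenoid energized, dead-bolt thrown, rear doors locked."""
    if lever_position == "drive":
        return True    # lever in Drive: bolt thrown, doors cannot open
    if lever_position == "neutral":
        return False   # lever in Neutral: bolt withdrawn, doors free
    return False       # other positions: treated here as unlocked

# The chauffeur selects Drive before moving off...
assert rear_door_bolt_engaged("drive") is True
# ...and Neutral at the kerb so passengers may alight.
assert rear_door_bolt_engaged("neutral") is False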

1960 Facel Vega Excellence EX1.

The four-door Facel Vegas featured suicide doors which were among the most potentially dangerous because of the dubious (though elegant) engineering in the locking mechanisms.  Note also the "dog leg" of the A-Pillar (windscreen), a styling trend borrowed from Detroit which caused many injuries to knees; one victim was Richard Nixon (1913-1994; US president 1969-1974) who in August 1960 banged a knee on one during his doomed campaign for that year's presidential election.  It resulted in a staphylococcal infection which for two weeks confined him to bed in Walter Reed Hospital at a time when his opponent, John Kennedy (JFK, 1917–1963; US president 1961-1963), was travelling the country campaigning and for a born politician like Nixon it wouldn't have been much consolation that his bedside well-wishers included Lyndon Johnson and Barry Goldwater (1909–1998; Republican Party nominee for the 1964 US presidential election); hearing those two were walking down the corridor, he may have wondered if he could fake his own death.  One biographer suggested the injury happened because his team deliberately chose to use a cheaper Chevrolet rather than a "larger" Cadillac in order to project a less elitist image.  While that account of the choice of car may be true, the impact injury would anyway likely have happened because, beginning with the 1959 range, for reasons of production-line rationalization, Chevrolet & Cadillac (along with corporate stable-mates Buick, Oldsmobile & Pontiac) all shared GM's common corporate body shell and while between divisions there were sometimes dimensional differences (notably in wheelbases), the front doors, A-Pillars and seat mounting points were identical in all.

If compatible (which seems improbable given the novelty of this French approach to door-latch design), the Dickson locks would have been a worthwhile addition to the Facel Vega Excellence (1956-1964) which, in a triumph of fashion over function, had no B-Pillar (ie the central one between the doors) at all, the suicide doors secured only by a locking mechanism in the door sill, something which worked well in static testing but on the road, the lateral stresses induced during cornering meant the doors were apt to “fly open”, something to ponder in the pre-seat-belt era.  The completely pillarless styling did however look good so there was that.  One of the most glamorous machines of the era, the Facel Vegas drew many celebrities but the most infamous association was with the author Albert Camus (1913–1960), killed instantly when the FV3B in which he was a passenger crashed into a tree; the car was being driven by his publisher, Michel Gallimard (1917–1960), who was mortally injured, dying within days.  Although the accident happened on a long, straight section of road, the conditions were icy and the official cause was listed as "...a loss of control while travelling at an excessive speed for the conditions".  The FV3B was a two-door coupé so there was no link with the suicide doors used on the Excellence, the possibility of tyre failure has always been speculative and there's now little support for the conspiracy theory (which long circulated) suggesting the KGB may have sabotaged the car because of the author's anti-Soviet stance.  Powered by a variety of Chrysler V8s (the "Hemi", "Poly" & "Wedge" all at times used), the "big" Facel Vegas (1954-1964; some 506 coupés, 156 sedans and a reputed 11 cabriolets) were France's finest cars of the post-war years but the decision to produce a smaller range doomed the company.  The concept was sound, the market existed and the product was well-designed but the French-made four-cylinder engine proved chronically (and insolubly) unreliable; by the time a version powered by a robust Volvo unit was ready, warranty claims and the costs of the re-engineering had driven Facel Vega bankrupt.

Lure of the tragic

Evelyn McHale: "The most beautiful suicide".

Predictably, it’s the suicides of celebrities (however defined) which attract most interest but there’s a fascination also with those by young women and that’s understandable because of the lure of youthful beauty and tragedy.  The photograph remembered as “the most beautiful suicide” was taken by photography student Robert Wiles (1909-1991), some four minutes after the victim's death.  Evelyn Francis McHale (1923–1947) was a bookkeeper who threw herself to her death from the 86th-floor observation deck of New York's Empire State Building, landing on a Cadillac limousine attached to the General Assembly of the United Nations (UN) which was parked on 34th Street, some 200 feet (60 m) west of Fifth Ave.  The police would later find her last note which read: “I don’t want anyone in or out of my family to see any part of me. Could you destroy my body by cremation?  I beg of you and my family – don’t have any service for me or remembrance for me.  My fiance asked me to marry him in June.  I don’t think I would make a good wife for anybody. He is much better off without me.  Tell my father, I have too many of my mother’s tendencies.”  It was reported her mother suffered from “an undiagnosed and untreated depression”.

Mary Miller and the "Genesee Hotel Suicide".  Earlier postcard of the Genesee Hotel with the eighth-floor ledge indicated by a yellow arrow (left), Mr Sorgi's photograph (centre) and the suicide's aftermath (right).

In many parts of the world, it’s now unusual if someone is not carrying a device able instantly to capture HD (high-definition) images & video footage but until relatively recently, cameras rarely were taken from the home unless for use at set-piece events such as vacations or parties.  Not only are people now able to record what they see but within seconds, images and clips can be transmitted just about anywhere in the world, some “going viral”.  This proliferation of content has had many implications, one noted phenomenon being that it now seems more likely someone will film a person at imminent risk of death or injury than offer to assist; psychiatrists, sociologists and such have offered views on that but the behaviour, at least in some cases, might be better explained by lawyers and economists.

In 1942 it was mostly professional photographers who routinely would have a camera to hand and the devices were not then like the instantly available “point & shoot” technology of the digital age, the process then a cocktail of loading physical film-stock, assessing the light, adjusting the aperture and maybe even swapping lenses.  The photograph (taken with the lens wide open and the shutter set to 1/1000th of a second) of Mary Miller (1907-1942), mid-flight in her leap to death from an eighth-floor ledge of the Genesee Hotel in Buffalo, New York was a thing most unusual: an anyway rare event happening when someone stood ready to take the picture.  When published, the photograph was captioned “Suicide” or “The Genesee Hotel Suicide” but the popular press couldn’t resist embellishment, one using the title “The Despondent Divorcee” which was in the tabloid tradition of “making stuff up”; Ms Miller had never been married and was not in a relationship.  She left no suicide note.

Ignatius Russell Sorgi (1912-1995) was a staff photographer on Buffalo’s Courier Express who on 7 May 1942 happened to take a different route back to the office when he saw a police car speeding down the road, sirens blaring.  Accordingly, in the “ambulance chasing” tradition, he followed, not knowing what he’d see but knowing it might be newsworthy and gain him a front-page credit: “I snatched my camera from the car and took two quick shots as she seemed to hesitate…As quickly as possible I shoved the exposed film into the case and reached for a fresh holder.  I no sooner had pulled the slide out and got set for another shot than she waved to the crowd below and pushed herself into space.  Screams and shouts burst from the horrified onlookers as her body plummeted toward the street.  I took a firm grip on myself, waited until the woman passed the second or third story, and then shot.