
Tuesday, October 7, 2025

Viscacha or vizcacha

Viscacha or vizcacha (pronounced vi-skah-chuh)

A gregarious burrowing hystricomorph rodent of the family Chinchillidae, the best known being the plains viscacha (Lagostomus maximus), about the size of a groundhog, inhabiting the pampas of Paraguay and Argentina and allied to the smaller chinchilla.  The name is applied also to the mountain viscacha, a related rodent of the genus Lagidium (of the Andes), about the size of a squirrel, having rabbit-like ears and a squirrel-like tail.

1595–1605: From the Spanish viscacha, from the Quechua wisk’acha (also transcribed wiskácha).  The Spanish name of the language, Quechua, is from qhichwa (literally “temperate valley”).  With use depending on prevailing practice, both the spellings viscacha & vizcacha are used in various branches of biology and zoology, the older alternatives biscacha, biscacho & bizcacha now rare except in historic citation.  The noun plural is viscachas and the derived term is viscachera (plural viscacheras), which describes a warren inhabited by viscachas.

Vizcacha moments: Time for the world-weary to take a nap; Lindsay Lohan (left) joining a viscacha (right) in a yawn.

The viscachas or vizcachas, of which there are five extant species, are rodents of the genera Lagidium and Lagostomus within the family Chinchillidae.  Native to South America, despite looking similar to rabbits or hares, they’re not related to either and are thus of interest to evolutionary biologists as an example of convergent evolution.  When biologists first saw the viscacha they noted the question of heritage: was this mammal part of the Leporidae (rabbit) family or the Chinchillidae (chinchilla) family?  Sharing the large ears, powerful hind legs and small front paws, vizcachas do bear a striking resemblance to the rabbit family but are distinguished by their long bushy tail, a trait characteristic of the Chinchillidae.  Helpful for biologists as a species indicator, for the small rodent the tail is a marker of its state of mind: extended when distressed and curled when at ease.

Residing throughout southern and western South America, they tend to stay close to their underground burrows but possess surprising dexterity as climbers, able to jump from rock to rock so effortlessly and with such alacrity that observers report their progress is hard to track with the naked eye.  They live in colonies that can be barely a dozen or number in the hundreds and have acquired an extensive repertoire of vocalizations used in social interactions.  Small they may be, but vizcachas are voluble and, belying their sleepy appearance, are noted for their gregarious behavior.

Up to two feet (0.6 m) in length and typically weighing around 3.5 lbs (1.6 kg), vizcachas are relatively large by rodent standards but are small compared to their carnivorous neighbors, the puma and the culpeo fox.  These two are fierce predators but the fast, agile vizcacha has the advantage of inhabiting a mountainous environment littered with boulders and rocks which makes for difficult hunting ground, so the species doesn’t suffer greatly from predation-induced population decline.  The main threat is humans, less from the habitat loss which threatens some species than from illegal hunting for their meat and fur, luxury items in some markets.

There are spiritual traditions in which exists the concept of the spirit animal, a creature the spirit of which is said to help guide or protect a person on a journey and the characteristics of which that person shares or embodies.  The apparently ancient concept is prominent in a number of indigenous (notably Native American) religions and cultures and was embraced by Pagan and Wiccan communities as recently as the 1990s, the term totem sometimes used.  Totem was from the Native North American Ojibwe ᑑᑌᒼ or ᑑᑌᒻ (doodem) and referred to a sacred object, symbol or spirit; in a sense it can be thought of as the equivalent of a flag (in the case of a tribe) or coat of arms (in the case of a clan).  The word totem became widely used by anthropologists when discussing cultural practices in many places (and not just in North America).  In academic use, where it's a neutral descriptor, this is usually not controversial but in general use it can be a form of cultural misappropriation.  In the West, the idea of spirit animals was picked up in the weird world of the new age, dolphins and other charismatic creatures predictably popular.  The concept turned out also to have appeal to some among the less spiritual who adopted the viscacha as their spirit animal because there is seemingly no living thing on earth with an appearance which so encapsulates melancholy, world-weariness and the need to take a nap.


Vizcacha moments: Jiang Zemin (1926–2022; General Secretary of the Chinese Communist Party (CCP) (and thus paramount leader) 1989-2002 & President of the People's Republic of China (PRC) 1993-2003), yawning (left) and resting his eyes (right) during one of the less interesting speeches delivered as part of the otherwise riveting proceedings of the nineteenth congress of the CCP, Beijing, October 2017.  Western diplomats noted that, unusually among those in the senior echelon of the CCP, Mr Jiang could at times seem almost "exuberant" (a contrast to his two more dour successors) but in retirement he may have adopted the viscacha as his spirit animal, the creature quite suited to his more somnolent lifestyle.

The Ciano Diaries, 1939-1943.  Although from a literary genre not always renowned for accuracy, historians regard Ciano's as among the more reliable.

One can understand Mr Jiang taking a moment to rest his eyes during the congress.  After half a lifetime in politics, some of it in the era when “a fatal error” was not a figurative phrase, he’d probably heard it all before and could sense when he could “tune out” for a while.  Cases have often been documented of those for whom continued attention becomes just too much and one who caused more vizcacha moments than most was Adolf Hitler (1889-1945; Führer (leader) and German head of government 1933-1945 & head of state 1934-1945) whose repetitive and seemingly endless monologues (touching discursively on subjects such as art, architecture, dog breeding, artificial honey, the church, philosophy and vegetarianism) came to be dreaded by almost all compelled to sit and endure a session.  Count Galeazzo Ciano (1903–1944; Italian foreign minister 1936-1943 (and the son-in-law of Benito Mussolini, who ordered his execution)) was, like us all, a flawed character but he had a diarist’s eye and in his entries left some of the most vivid recollections of the World War II era.  In the Austrian city of Salzburg in May 1942 he attended a series of meetings along with Benito Mussolini (1883-1945; Duce (leader) & Prime Minister of Italy 1922-1943) and the two senior figures from the OKW (Oberkommando der Wehrmacht (the German military's high command)), Generalfeldmarschall (Field Marshal) Wilhelm Keitel (1882–1946; chief of OKW 1938-1945) and Generaloberst (Colonel General) Alfred Jodl (1890–1946; chief of the OKW operations staff 1939-1945), noting in his diary one “epic struggle”:

Hitler talks, talks, talks, talks.  Mussolini suffers—he, who is in the habit of talking himself, and who, instead, practically has to keep quiet.  On the second day, after lunch, when everything had been said, Hitler talked uninterruptedly for an hour and forty minutes. He omitted absolutely no argument: war and peace, religion and philosophy, art and history.  Mussolini automatically looked at his wrist watch, I had my mind on my own business, and only Cavallero, who is a phenomenon of servility, pretended he was listening in ecstasy, continually nodding his head in approval.  Those, however, who dreaded the ordeal less than we did were the Germans.  Poor people.  They have to take it every day, and I am certain there isn’t a gesture, a word, or a pause which they don’t know by heart.  General Jodl, after an epic struggle, finally went to sleep on the divan. Keitel was reeling, but he succeeded in keeping his head up.  He was too close to Hitler to let himself go as he would have liked to do.

Saturday, September 20, 2025

Snarge

Snarge (pronounced snn-arj)

(1) In military & civil aviation, slang, the remains of a bird after it has collided with an airplane (ie bird strike), originally of impacts with turbine engines but latterly applied also to residue left on wings, fuselages etc.

(2) By adoption, the remains of birds and insects left on the windscreens of trains, cars, motorcycle fairings etc.

Early 2000s (probably): A portmanteau word, a blend of sn(ot) + (g)ar(ba)ge.  Snot (used here in the usual sense of “mucus, especially that from the nose”) was from the Middle English snot & snotte, from the Old English ġesnot & snott, from the Proto-West Germanic snott & snutt, from the Proto-Germanic snuttuz (nasal mucus), from the same base as snout and related to snite.  It was cognate with the North Frisian snot (snot), the Saterland Frisian Snotte (snot), the West Frisian snotte (snot), the Dutch snot (snot), the German Low German Snött (snot), the dialectal German Schnutz (snot), the Danish snot (snot) and the Norwegian snott (snot).  Trans-linguistically, “snot” is commendably consistent and its other uses (a misbehaving (often “snotty”) child, a disreputable man, the flamed-out wick of a candle) all reference something unwanted or undesirable.  That said, snot (mucus) is essential for human life, being a natural, protective and lubricating substance produced by mucous membranes throughout the body to keep tissues moist and act as a barrier against pathogens and irritants like dust and allergens, working to trap foreign particles; it also contains antimicrobial agents to fight infection.  So, when “out-of-sight & out-of-mind” it’s helpful mucus but when oozing (or worse) from the nostrils, it’s disgusting snot.

Garbage (waste material) was from the late Middle English garbage (the offal of a fowl, giblets, kitchen waste (though in earlier use “refuse, that which is purged away”)), from the Anglo-Norman, from the Old French garber (to refine, make neat or clean), of Germanic origin, from the Frankish garwijan (to make ready).  It was akin to the Old High German garawan (to prepare, make ready) and the Old English ġearwian (to make ready, adorn).  The alternative spelling was garbidge (obsolete or eye dialect).  Garbage can be used of physical waste or figuratively (of ideas, concepts, texts, music etc judged to be of poor quality) and became popular in computing, used variously to mean (1) output judged nonsensical (for whatever reason), (2) corrupted data, (3) memory which, although allocated, was no longer in use and awaiting de-allocation or (4) valid data misinterpreted as another kind of data.  Synonyms include junk, refuse, rubbish, trash & waste.  Charlie Chaplin (1889–1977) used “Herr Garbage” as the name of the character who in The Great Dictator (1940) represented Dr Joseph Goebbels (1897-1975; Nazi propaganda minister 1933-1945).  Snarge is a noun and no derived forms have ever been listed but a creature which has become snarge would have been snarged and the process (ie point of impact) would have been the act of snarging.  Snarge is inherently the result of a fatality so an adjective like snargish is presumably superfluous but traces of an impact which may not have been fatal presumably could be described as snargelike or snargesque.
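Of the computing senses, (3) is the one formalized as “garbage collection”.  A minimal sketch in Python (the Node class is purely illustrative) of allocated-but-unreachable memory being identified and de-allocated:

```python
import gc

class Node:
    """Simple object used here only to create a reference cycle."""
    def __init__(self):
        self.ref = None

# Two objects referencing each other; dropping the external names leaves
# the pair allocated but unreachable -- "garbage" in sense (3).
a, b = Node(), Node()
a.ref, b.ref = b, a
del a, b

# The cycle collector finds the unreachable objects and de-allocates them.
unreachable = gc.collect()
print(f"collected {unreachable} unreachable objects")
```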

Dr Carla Dove at work in the Smithsonian's Feather Identification Laboratory, Washington DC.

The aptly-surnamed Dr Carla Dove (b 1962) is manager of the Feather Identification Laboratory at the Smithsonian Institution’s National Museum of Natural History in Washington DC where she heads a team identifying the types or species of birds that collide with military and civil aircraft.  She calls snarge “a term of art” (clearly she’s of the “eye of the beholder” school) and notes that although the scientific discipline of using snarge to determine the species involved in bird strikes began at the Smithsonian in 1960, the term doesn’t seem to have been coined there and its origin, like much slang with a military connection, is murky.  Although a 2003 article in Flying Safety magazine is sometimes cited as the source of the claim the word was “invented at the Feather Identification Laboratory”, Dr Dove is emphatic the staff there “borrowed it” from preparators (the technicians who prepare bird specimens for display or other uses by museums).  It certainly seems to have been in general use (in its specialized niche in military & civil aviation and wildlife safety circles) by at least the early-to-mid 2000s and the zeitgeisters at Wired magazine were in 2005 printing it without elaboration, suggesting at least in their editorial team it was already established slang.  So, it may have been colloquial jargon in museums or among those working in military or civil aviation long before it appeared in print but no documentary evidence seems to exist.

The origin of the scientific discipline is however uncontested and the world’s first forensic ornithologist was the Smithsonian’s Roxie Laybourne (1910–2003).  In October 1960, a Lockheed L-188 Electra flying as Eastern Airlines Flight 375 out of Boston Logan Airport had cleared the runway by only a few hundred feet when it flew into a flock of birds, the most unfortunate of which damaged all four engines, resulting in a catastrophic loss of power and causing the craft to nosedive into Boston Harbor, killing 62 of the 72 aboard.  Although the engines were turbo-props rather than jets, they too are highly susceptible to bird-strike damage.  At the time, this was the greatest loss of life attributed to a bird strike and the FAA (then the Federal Aviation Agency) ordered all avian remains be sent to the Smithsonian Institution for examination.  There, Ms Laybourne received the box of mangled bone, blood & feathers and began her investigation, her career taking a trajectory which would include not only the development of protocols designed to reduce the likelihood of bird strikes damaging airliners but also involvement with the USAF (US Air Force) & NASA (National Aeronautics and Space Administration).  Additionally, her work with the FBI (Federal Bureau of Investigation) and various police forces proved forensic ornithology could be of use as a diagnostic tool in crime-solving, her evidence helping to convict murderers, kidnappers and poachers.  In 2025, journalist Chris Sweeney published The Feather Detective: Mystery, Mayhem, and the Magnificent Life of Roxie Laybourne, a vivid telling of the tale of a woman succeeding in a world where feminism had not yet wrought its changes.

Snarge on the nosecone of a Cessna Citation, Eisenhower Airport, Wichita, Kansas, July 2021.  The dent indicates the point of impact, the airflow holding the corpse in place.  By the time of landing, the leaked body fluids had congealed to act as a kind of glue.

The study of aviation bird strikes is obviously a specialized field but snarge has come also to be used in the matter of insect deaths, specifically what has come to be called the “windscreen phenomenon” (also as “windshield phenomenon”, depending on linguistic tradition).  What that refers to is the increasingly common instances of people reporting they are seeing far fewer dead insects on the windscreens of their cars, many dating the onset of the decline to the late 1990s, and the most common explanations offered for this are (1) climate change, (2) habitat loss and (3) the increasing use (or potency) of pesticides.  Individual observations of one’s windscreen now tending to accumulate less snarge than in years gone by are of course impressionistic and caution must be taken not to extrapolate the existence of a global trend from one piece of glass in one tiny part of the planet: what needs to be avoided is a gaboso (the acronym of Generalized Association Based On Single Observation, used also as the derived noun & verb), the act of taking one identifiable feature of someone or something and using it as the definitional reference for a group (it ties in with logical fallacies).  However, the reports of increasingly snargeless windscreens were widespread and numerous so while that didn’t explain why it was happening, it did suggest that happening it was.

There was also the matter of social media platforms, which have meant the volume of messages about a particular topic in the twenty-first century is not comparable with years gone by.  It’s simply impossible to calculate the extent to which these mass-market (free) platforms have operated as an accelerant (ie a force-multiplier of messaging) but few doubt it’s a considerable effect.  Still, it is striking the same observations were being made in the northern & southern hemispheres, the reference to the decline beginning in the late 1990s was also consistent and a number of studies in Europe and the US have found a precipitous drop in insect populations over the last three decades.  One interesting “quasi theory” was that the improved aerodynamic efficiency of the modern automobile meant the entomological slaughter was reduced but aeronautical engineers quickly debunked that, pointing out a slippery shape has a “buffer zone” very close to the surface which means “bugs” have a greater chance of being sucked in towards the speeding surface because of the differential between negative & positive pressure.  However, on most older vehicles, the “buffer zone” could extend as much as 3 feet (close to a metre) from the body.  A bug heading straight for the glass would still be doomed but the disturbed air all around would have deflected a few.

Lindsay Lohan with Herbie in Herbie: Fully Loaded (2005).

Herbie was a 1963 Volkswagen Type 1 (Beetle, 1938-2003) and despite the curves which made it look streamlined, its measured Cd (drag coefficient) was typically around 0.48-0.50, some 8% worse than contemporary vehicles of comparable frontal area.  What that meant was its buffer zone would extend somewhat further than that of the “New Beetle” (1997-2011) which had a Cd of 0.38-0.41, again not as good as the competition because it was compromised by the need to maintain a visual link with the way things were done in 1938.  On the 1963 models (like Herbie) the flat, upright windscreen created significant drag and was obviously a good device for “snarge harvesting” but the later curved screen (introduced in 1973 with the 1303) probably didn’t spare many insects.
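For a sense of what those Cd numbers imply, aerodynamic drag scales linearly with Cd at a given speed and frontal area, as the standard drag equation shows.  A back-of-the-envelope sketch (the sea-level air density is standard; the 1.8 m² frontal area is an assumption for illustration, not a measured figure):

```latex
% Drag force at 90 km/h (25 m/s), assuming rho = 1.225 kg/m^3 and A = 1.8 m^2
F_D = \tfrac{1}{2}\rho v^{2} C_d A
    = \tfrac{1}{2}(1.225)(25)^{2}(0.48)(1.8) \approx 331\ \text{N}
```

Substituting the New Beetle’s circa 0.40 gives roughly 276 N, a cut of about 17% in drag at the same speed, the sort of difference which also shifts the “buffer zone” and hence the snarge harvest.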

Dr Manu Saunders' graphic example of insect snarge on a windscreen during the "locust plague" in western NSW (New South Wales), Australia, April 2010.

Dr Manu Saunders is a Senior Lecturer in Ecology and Biology at the School of Environmental and Rural Science in Australia’s UNE (University of New England) and she pointed out that “anecdata is not scientific evidence” and just because anecdotes are commonly presented as “evidence of global insect decline” (the so-called “insectageddon”), that doesn’t of necessity make locally described conditions globally relevant.  The problem she identified was that although there have been well-conducted longitudinal studies of snarge on windscreens using sound statistical methods, all have used data taken from a relatively small geographical area while around the planet there are more than 21 million km (13 million miles) of “roads”, a distance equivalent to more than 25 round trips to the Moon.  Dr Saunders does not deny the aggregate number of insects is in decline but cautions against one data set being used to assess the extent of a phenomenon with a number of causal factors.

Still snarge-free: The famous photograph of the 25 917s assembled for inspection outside the Porsche factory, Stuttgart, 1969.  The FIA’s homologation inspectors declined the offer to test-drive the 25, which was just as well because, hastily assembled (secretaries, accountants and such drafted in to help), some were capable of driving only a short distance in first gear.

Fortunately for Porsche, in 1969, although the decline in global insect numbers may already have begun, they were still buzzing around in sufficient numbers to produce the snarge which provided the necessary clue required to resolve the problem of chronic (and potentially lethal) instability which was afflicting the first 917s to be tested at speed.  In great haste, the 917 had been developed after the Fédération Internationale de l'Automobile (the FIA; the International Automobile Federation and world sport's dopiest regulatory body) “relaxed” the rules which previously had set a threshold of 50 identical units for cars classified as Group 4 (5 litre (305 cubic inch)) sports cars, reducing this to a minimum of 25.  What that meant was Porsche needed to develop both a car and a twelve cylinder engine, both items bigger and more complex than anything they’d before attempted, things perhaps not overly challenging had the typical two years been available but the factory needed something which would be ready for final testing in less than half the time.  Remarkably, they accomplished the task in ten months.

Porsche 917 LH Chassis 001 in the livery of the IAA (Internationale Automobil-Ausstellung (International Automobile Exhibition)) used for the Frankfurt Motor Show.

The brief gestation period was impressive but there were teething problems.  The fundamentals, the 908-based space-frame and the 4.5 litre (275 cubic inch) air-cooled flat-12 engine (essentially two of Porsche’s 2.25 litre (137 cubic inch) flat-sixes joined together), were robust and reliable from the start but the sudden jump in horsepower (HP) meant much higher speeds and it took some time to tame the problems of the car’s behaviour at high speed.  Aerodynamics was then still an inexact science and the maximum speed the 917 was able to attain on Porsche’s test track was around 180 mph (290 km/h) but when unleashed on the circuits with long straights where over 200 mph (320 km/h) was possible, the early 917s proved highly unstable, the tail “wandering from side-to-side”, something disconcerting at any speed but beyond 200 mph, frightening even for professional race drivers.

On Mulsanne Straight, Le Mans: The slippery 917 LH (left) which proved "unsafe at high speed" and the (slightly) slower 917 K (right) which, in the hands of experts, was more manageable.

The instability needed to be rectified because the 917 had been designed with "a bucket of Deutsche Marks in one hand and a map of the Le Mans circuit in the other" and these were the days before the FIA started insisting chicanes be spliced into any straight where high speeds beckoned; the Mulsanne Straight at Le Mans was then an uninterrupted 6 km (3.7 mile) straight line.  There, the test results and slide-rule calculations predicted, the 917s would achieve in excess of 360 km/h (224 mph).  Serendipitously, physics and nature combined to show the team where the problem lay: after one alarming high speed run, it was noticed that while the front and central sections of the bodywork were plastered with bloodied snarge, the fibreglass of the rear sections remained a pristine white, the obvious conclusion drawn that while the airflow was inducing the desired degree of down-force on the front wheels, it was passing over the rear of the body, thus the lift which induced the wandering.  Some rapid improvisation with pieces of aluminium and much duct tape (to this day a vital tool in the business) to create an ad-hoc, shorter, upswept tail transformed the behaviour and was the basis for what emerged from the factory's subsequent wind-tunnel testing as the 917 K (K for Kurzheck (short-tail)).  The rest is history.

Dodge Public Relations announces the world now has "spoilers".  Actually they'd been around for a while but, as Dodge PR knew, until it happens in America, it hasn't happened.

What happened to the 917 wasn’t novel.  In 1966, Dodge had found the slippery shape of its new fastback Charger delivered the expected speed on the NASCAR ovals but it came at the cost of dangerous lift at the rear, drivers graphically describing the experience at speed as something like “driving on ice”.  The solution was exactly what Porsche three years later would improvise, a spoiler on the lip of the trunk (boot) lid which, although only 1½ inches (38 mm) high, at some 150 mph (240 km/h) generated sufficient down-force to tame the instability.  Of course, being NASCAR, things didn’t end there and to counter the objection the spoiler was a “non-stock” modification and thus not within the rules, Dodge cited the “safety measure” clause, noting an unstable car on a racetrack was a danger to all.  NASCAR agreed and allowed the device, which upset the other competitors who cited the “equalization formula clause” and demanded they too be allowed to fit spoilers.  NASCAR agreed but set the maximum height at 1½ inches and specified they could be no wider than the trunk lid.  That left Dodge disgruntled because, in a quirk of the styling, the Charger had a narrower trunk lid than the rest of the field so everybody else’s spoilers worked better, which seemed unfair given it was Dodge which had come up with the idea.  NASCAR ignored that objection so for 1967 the factory added to the catalogue two small “quarter panel extensions”, each with its own part number (left & right); once installed, the Charger gained a full-width spoiler.
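Why so small a lip can matter is down to dynamic pressure, which grows with the square of speed.  A minimal sketch (sea-level air density assumed, and the 60 inch spoiler width is an illustrative guess, not Dodge’s specification):

```latex
% Dynamic pressure at 150 mph (about 67 m/s), assuming rho = 1.225 kg/m^3
q = \tfrac{1}{2}\rho v^{2} = \tfrac{1}{2}(1.225)(67)^{2} \approx 2.75\ \text{kPa}
% Over a 1.5 in x 60 in lip (about 0.058 m^2), capturing even a fraction of q
% as down-force yields on the order of 10^2 N (tens of kilograms-force).
```

At half that speed the available pressure falls to a quarter, which is why the spoiler’s effect was felt only at racing velocities.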

Saturday, September 6, 2025

Deodand

Deodand (pronounced dee-uh-dand)

(1) In English law (prior to 1846), an animal or a personal chattel (the scope later extended) that, having been the immediate, accidental cause of the death of a human being, was forfeited to the Crown to be sold, the money gained applied originally to pious uses.

(2) In English law (prior to 1846), a fine paid to the Crown, equal to the value of a deodand, paid by the owner of the object and applied originally to pious uses.

1520–1530: From the late thirteenth century Anglo-French deodande, from the Medieval Latin deōdandum ((a thing) to be given to God), the construct being the Classical Latin deō (“to God”, dative singular of deus (god)) + dand(um) (“to be given”, neuter gerundive of dare (to give)), from the primitive Indo-European root do- (to give).  Deus was from the primitive Indo-European root dyeu- (to shine and, in derivatives, “sky, heaven, god”).  Deodand is a noun; the noun plural is deodands.

That the doctrine of deodand was a medieval legal relic (the earliest recorded instances of use in England dating from the eleventh century) is not that remarkable because it was one of a number; what’s remarkable is it remained part of the common law until the mid-1800s.  The concept was first well documented in thirteenth century legal texts and historians have concluded this “semi-codification” reflected the earlier religious tradition which held an object which caused a death was “tainted” and should be removed from profane use.  In that, it inherited an older notion from Roman civil law, noxae deditio (literally “surrender for the wrongdoing” and in English law written usually as “noxal surrender”), the construct being noxae (harm, injury, wrongdoing) + deditio (surrender, giving up).  Noxae deditio was a legal mechanism (in response to what would now be called a writ) with which the owner of an animal or slave (the Romans really did make a distinction) could avoid liability for delicts (wrongs) committed by them by surrendering the animal or slave to the injured party as an alternative to paying damages.  Intriguingly, at certain times, the doctrine was extended to sons (though apparently not daughters) in circumstances where an action was brought against a paterfamilias (the head of a household), on the basis he was held to be responsible for the son’s acts.  Literally, the son could be “handed over”, either until he attained statutory adulthood or for a specified period, depending on the damages assessed.  A similar idea was the Old English wergeld, from the Proto-West Germanic werageld, the construct being wer (man) +‎ ġield (payment).  It was a form of compensation paid by a transgressor to a victim, or (as “blood money”) to the victim's family if the victim were dead (the quantum decided by social rank).  The concept is familiar in many societies and is sometimes formalized in Islamic systems under Sharia law, where the victim’s family can be involved in determining not only how much blood money should be paid but also whether there should be a payment as an alternative to a death sentence.

What evolved in English common law was the rule under which, if a person was killed by an animal, vehicle, tool or other inanimate object, that object was declared a “deodand” to be forfeited to the Crown.  Reflecting the theological basis for this, notionally the surrender was “to God”, but quickly the standard practice became to appraise the value of the beast or object and levy a fine in that sum.  Although the documentary evidence is patchy, it appears originally the forfeited property (or cash from the fine) was devoted to pious uses such as alms (ie charity for the poor) or (as was the usual trend when a revenue stream was identified) ecclesiastical purposes such as building churches or stained glass windows.  Later (another trend being squabbles between church & state), deodands became a source of consolidated royal revenue.  The rationale was partly religious (atonement), partly superstitious (removing the dangerous object), and partly fiscal (Crown revenue).

The school bus scene: In Mean Girls (2004), had Regina George (Rachel McAdams (b 1978)) been killed by the school bus, the vehicle would have been declared a deodand and forfeited to the state although the usual practice was for its value to be assessed and an order for a payment in that sum to be served on the owner.

It was a simple concept but because there was much variation in the circumstances in which a deodand could be declared, the case law reveals inconsistencies in the verdicts.  Were someone to be killed by being run over by a horse-drawn cart, depending on this and that, the deodand might be found to be the cart and horse, the cart or horse alone or even just the particular wheel which crushed the unfortunate deceased.  One of the reasons for the variance is that in many instances the matter was determined not by a judge or magistrate working from precedent but (at coroners’ inquests) by juries which would both define the deodand and assess its value.  Given that, on what appear to be similar facts (a sailor who drowned after being struck by a mast), the deodand might be found to be the whole vessel or merely the mast.  In such cases, the issue was which object (or part of an object) should be held to be the “guilty instrument” and that was a process not simple to define, things made more difficult still by the opinions of jury members being so diverse and prone to be influenced by the identity of both the victim(s) and the owner of the object(s).

Aftermath of the explosion of a locomotive’s steam boiler.  If reduced to scrap by the event in which someone died, the jury could assess the value of the object in its "pre-event" condition.

By the eighteenth century, deodands had become largely devices of reference in that actual confiscation of objects was rare, the standard practice being an assessment of their monetary value to set the fine to be paid.  Lawyers, politicians and (especially) those in commerce were critical of the system as irrational and even then there were traces of what would evolve as the modern notions of negligence and responsibility; critiques of deodand came both from what would now be described as “the right” and “the left”.  Those who owned the objects which became lethal instruments argued it was unfair they be punished so severely for what were, however tragic, “mere accidents”, pointing out the system discouraged industrial enterprise, while those advocating for victims pointed out it was the state which gained the proceeds of the fines while victims’ families (many of which had lost their sole breadwinner) gained nothing.  What finally brought about the end of deodand was it being overtaken by the industrial age, in which deaths came routinely to occur in clusters.  It was the multiple fatalities in marine and train accidents (infamously the Hull Tragedy (1838) and the Sonning Cutting Disaster (1841)) which attracted press coverage and public debate; in each case a “certificate of deodand” was attached to the machinery and, given the cavalier attitude of railway operators towards safety, it was hardly surprising coroners’ juries had little hesitation in declaring a locomotive and its rolling-stock a deodand.  That was obviously an expensive threat to capitalism and the lobbying by these vested interests resulted in parliament abolishing deodands by the Deodands Act 1846 (9 & 10 Vict. c.62).

Tallahassee Democrat, 13 October 1991.

The Daytona Yellow 1969 Chevrolet Corvette ZL1 coupé is the rarest and most valuable C3 Corvette (1968-1982) made, the “other ZL1”, a Monaco Orange roadster, having a less pure pedigree (although at auction in January 2023 it realized US$3.14 million).  The yellow ZL1 last changed hands in October 1991 when it was sold in a government forfeiture auction for US$300,000 (then a lot of money) after being seized by the DEA (Drug Enforcement Administration).

The Act however was part of a reform process and the early initiatives included the statutes which would by the mid twentieth century evolve into modern negligence and compensation law, the most significant of the early steps being the Fatal Accidents Act 1846 (Lord Campbell’s Act) which for the first time codified the idea of the “wrongful death claim” and permitted families to sue on this basis.  Although now largely forgotten, the 1846 act was a significant marker of the transition of English law from a medieval, semi-religious system of atonement to a modern, rationalized law of tort, product liability and compensation.

Echoes do however remain in certain legal doctrines of forfeiture (such as state seizures of the proceeds of crime) and the US practice of civil asset forfeiture does, at least in a philosophical sense, sometimes treat property as “guilty”.  The US law provides for property (cars, boats, money etc) connected with the commission of a crime to be seized by the state even if the owner, personally, wasn’t “guilty”; it’s a modern interpretation of the medieval view the object itself bore responsibility.  What this means is the legal rationale is structurally similar to what once was the religious justification: what once was “given to God” as expiation for sin translates now into deterrence as an expression of public policy (removing dangerous tools or preventing criminals from profiting).  As a kind of “legal fiction”, under both regimes the object is treated as if it possesses some kind of independent agency.  Intriguingly, as an administrative convenience, that idea survived in Admiralty Law under which vessels can in suits be “personified”, thus cases like “The SS <ship name> v. Cargo”, the model for civil asset forfeiture procedures in which the object is the defendant (such as United States v. One 1969 Chevrolet Corvette).

Building on Biblical tradition, the idea of independent agency had a curious history in the legal systems of Christendom and in Europe from the Middle Ages through the early modern period, animals could be put on trial (in both secular and ecclesiastical courts) for murder.  These trials followed legal procedures similar to those in which a human was the accused although, obviously, cross-examination was somewhat truncated.  The most commonly tried animals were pigs, simply because it wasn’t uncommon for them freely to roam in urban areas and attacks on babies and infants were frequent.  In Normandy in 1386, a sow was dressed in human clothing and publicly executed for killing a child while at Châlons in 1499, a sow and her six piglets were tried; the sow was executed for killing a man, while the piglets were acquitted due to “lack of evidence”.  Nor were the defendants exclusively porcine: bulls and horses were occasionally executed for killing people and in ecclesiastical courts there are many records of rodents and insects being charged with damaging crops.  Presumably because every day of the week rodents and insects were killed just for “being guilty of being rodents and insects”, ceremonial executions wouldn’t have had much symbolic value so the usual result handed down was excommunication(!) or a demand (from God, as it were) the creatures vacate the fields in which they were consuming the crops.

Perpetually hungry weevils enjoying lunch in a granary.

Sometimes the ecclesiastical courts could be imaginative.  In the Italian region of Tyrol in 1713, the priests ordered the hungry weevils to leave the vineyards where they were such a plague but in compensation granted them occupation of a barren piece of land as an alternative habitat.  The reaction of the insects to the ruling would have been rather as King Cnut (better known as Canute, circa 990–1035; King of England 1016-1035) would have predicted but despite that, there’s no record of the weevils being held in contempt of court.  Regrettably, there's no generally accepted collective noun for weevils but weevilage (a portmanteau word, the blend being weevil + (vill)age) seems more compelling than Adelognatha (the scientific term referring to a group of Curculionidae (a family of weevils) characterized by a specific anatomical feature).  There was at least some theological basis for the ecclesiastical courts claiming entomological jurisdiction because in scripture it was written beasts are God’s creatures like all others and over them God granted dominion to man (Genesis 1:26-28, King James Version of the Bible (KJV, 1611)):

26 And God said, Let us make man in our image, after our likeness: and let them have dominion over the fish of the sea, and over the fowl of the air, and over the cattle, and over all the earth, and over every creeping thing that creepeth upon the earth.

27 So God created man in his own image, in the image of God created he him; male and female created he them.

28 And God blessed them, and God said unto them, Be fruitful, and multiply, and replenish the earth, and subdue it: and have dominion over the fish of the sea, and over the fowl of the air, and over every living thing that moveth upon the earth.

Bovine trial in progress, rendered as a line drawing by Vovsoft.

The principle was animals could be held accountable for causing harm and this was taken especially seriously when the harm caused was something like that of a crime a human might commit (like murder) and in the secular courts, if the victim was someone of some importance, the proceedings could involve defense lawyers, witnesses, and formal sentencing.  In the ecclesiastical courts, it was more symbolic or ritualistic: insects and rodents might be “summoned” but of course they never turned up so excommunication or other curses were invoked.  By the eighteenth century, the thinkers of the Enlightenment had prevailed and the idea of animals as moral agents was so ridiculed the practice of charging them was almost wholly abandoned although in certain circumstances an owner could be held liable for the damage they caused.  There was though the odd, rural holdout.  In Normandy in 1845 a sow was executed for killing a child (in the legal archives listed as the last “classic pig trial” (the last in the US held in New Hampshire in 1819)) and in Switzerland in 1906 a dog was sentenced to death for a similar offence (this believed to be Europe’s last “animal trial”).

Monday, September 1, 2025

Booby

Booby (pronounced boo-bee)

(1) A gannet-like seabird of the genus Sula, having a bright bill, bright feet, or both; some species are listed as threatened or endangered.

(2) A slang term for someone thought stupid or a dunce, ignorant or foolish (although still used in the mid-twentieth century, it's probably now obsolete, the meaning crowded out by intrusion of newer slang, some of which has also fallen victim to the linguistic treadmill).

(3) The losing player in a game (the historic UK usage "booby prize", now largely obsolete except in informal use).

(4) One of the many slang terms for the human female's breasts and related to the more common boob, boobs and boobie.

(5) In croquet, a ball that has not passed through the first wicket. 

1590s: From the Spanish bobo (stupid person; slow bird), thought to be from an imitative root of the Latin balbus (stuttering); an alternative suggestion is the earlier pooby, apparently a blend of (the now obsolete in this context) poop (to befool) and baby.  Balbus was from the primitive Indo-European balb- & balbal- (tongue-tied) and was cognate with the Ancient Greek βαμβαίνω (bambaínō) & βαμβαλύζω (bambalúzō) (I chatter with the teeth), the Russian болтать (boltatʹ) (to chatter, to babble), the Lithuanian balbė́ti (to talk, to babble), the Sanskrit बल्बला (balbalā) (stammering) and the Albanian belbët (stammering).  The booby prize dates from 1883, a prize given to the loser in a game, a concept which persists in some sporting competitions as "the wooden spoon", the idea being something as removed as possible from the usual silverware given as trophies.  The booby trap was first noted in 1850, originally a schoolboy prank (ie something only a "boob" would fall for); the more lethal sense developed during World War I and remains in common military and para-military use.  Booby and boobyism are nouns, boobyish and (the non-standard but potentially useful) boobyesque are adjectives; the noun plural is boobies.

Boobies: found usually in pairs

A nice pair of boobies.  Charmingly, blue-footed boobies are known to be monogamous, pairs often staying together for life.

A booby is a seabird in the genus Sula, part of the Sulidae family.  Boobies are closely related to the gannets (Morus), which were formerly included in Sula, the genus created in 1760 by the French naturalist Mathurin Jacques Brisson (1723-1806).  The name is derived from súla, the Old Norse and Icelandic word for the other member of the family Sulidae, the gannet.  The English name booby was based on the Spanish bobo (stupid) as the tame birds often landed on board sailing ships, where they were easily captured and eaten.  As well as being a popular addition to the diet of sailors for whom meat other than fish was a rarity, the birds' tameness was fortuitous for many, the Admiralty's archives revealing boobies often were caught and eaten by shipwrecked sailors.  In taxonomic classification, variations include Abbott's booby (Papasula abbotti), blue-footed booby (Sula nebouxii), brown booby (Sula leucogaster), masked booby (Sula dactylatra), Nazca booby (Sula granti), Peruvian booby (Sula variegata), red-footed booby (Sula sula) & Tasman booby (Sula dactylatra tasmani).

One step at a time.

The distinctive blue feet (the result of pigments ingested from their diet of fish) also play a part in the booby’s mating ritual although not exactly in the podophilic sense familiar in a sub-set of humans.  In the spring mating season, the bird’s feet become a bright turquoise blue and, to demonstrate their health and vitality, conspicuously they will display them to potential partners.  The job done, as their eggs hatch, the blue hue fades to something less vivid.  One aspect of their behaviour which amused the ornithologists who first observed it was that if, among fishers unloading their catch, a booby is tossed a small fish from the by-catch, it will take it and waddle off somewhere to enjoy it in solitude rather than gulping it down, as is common in many species.  Like penguins, although ungainly on land, they are skilled plunge divers which use their streamlined bodies and air sacs to “fly” through the water, catching their prey at high speed, and they hunt in "packs", coordinating their movement to maximize the catch.  Boobies have been recorded diving from as high as 90 m (300 feet), their speed upon entry estimated at around 100 km/h (60 mph).
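Those two figures are physically consistent: neglecting air resistance, a body falling freely from 90 m would hit the water rather faster than the estimated entry speed, so the quoted 100 km/h implies drag (and the birds’ control of their descent) bleeds off some of the velocity.  A back-of-the-envelope check, assuming g ≈ 9.8 m/s²:

```latex
% Free-fall impact speed from h = 90 m, drag neglected
v = \sqrt{2gh} = \sqrt{2(9.8)(90)} \approx 42\ \text{m/s} \approx 151\ \text{km/h}
```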

Boobies in time, in step.

Based on the use by mainstream internet sites (including nominally reputable news organizations), boob (more commonly in the plural as boobs) seems to have emerged as the preferred slang for breasts, probably because it seems the term women find most acceptable and the one they most often use, not infrequently as their default descriptor.  The origin appears to lie in bubby (plural bubbies), a slang term for the female breast dating from the 1680s which is thought to be imitative of a baby's cry or the sucking sound heard during lactation.  It was most associated with south-east England although that may reflect more extensive documentation rather than proof of regionalism.  Being anyway an inherently oral form, the alternative pronunciations included buhb-ee, boo-bee & boob-ee, so the evolution to boob was perhaps not unexpected although most dictionaries list the earliest known instance of booby (in this sense) as a late 1930s Americanism, with the back-formed clippings boob & boobs not appearing until the early post-war years, initially as a vulgarism, women not embracing use for decades, although that their approval seems to have coincided with late second-wave feminism is presumably coincidental.

Fully loaded: Lindsay Lohan in boobie-top with crash helmet in Herbie: Fully Loaded (2005).

In fashion, the boobie top (less commonly booby-top) is a style of clothing (including dresses) which in some way draws attention to or emphasizes the breasts.  The design is most associated with generous displays of cleavage or skin but is used also to refer to garments which wholly cover the breasts in such a way as to highlight the size, shape or movement.  In the industry, a “boobie top” differs from a “boob tube” in that the former seeks to highlight the breasts as a feature (either by using the fabric tightly to shift the focus to the size and shape or with a cut which displays the cleavage component of the décolletage) while a “boob tube” is a different interpretation of the minimalist: it completely envelops the breasts (ie little or no visible cleavage) but otherwise exposes the torso.

A “tube top” in the original style (left) and a “boob tube” (right), both now likely to be advertised as a “boob tube”.

The style was in 1972 first described as a “tube top” (strapless and extending from the armpits to the navel; such garments had earlier been available but the name was new) and the companion “tube skirt” appeared the next season (again, a re-labeling).  The first “boob tubes” were advertised in 1977 and the early ones were all a truncated version of the “tube top” in that they wrapped only around the breasts; inherently it was a midriff-baring creation and could be thought of as a kind of strapless, bandeau bra designed for outdoor wear (on warmer days).  Constructed with elasticized fabrics, they were designed to be worn without a bra but, like all forms of structural engineering, physics does limit what's possible and they came later to be available also with a “built in bra”.  Others just chose boob tubes made with a thicker material so a strapless bra unobtrusively could be worn beneath but, VBS (visible bra-straps) no longer being a sin against fashion, some now choose to make the bra part of the look.  In truth, the terms “tube top” & “boob tube” were always a bit misleading because it was only the material covering the breasts which tended toward the truly tubular, the rest being more or less flat, and a better description might have been “flange” but this wouldn’t have had the same appeal in a boutique so “tubes” they became.  In product descriptions, the distinction between “tube top” and “boob tube” quickly became blurred and the latter tends now often to be used of both types.

US Army booby trap messaging, 1942.  Such infections have for centuries been a significant concern of military medicine because STIs would often reduce unit strength (ie the number of "battle-ready" troops).

During World War II (1939-1945), the US military kept up with the evolution of slang, something reflected in advertising which lent a new definition to "booby trap", a familiar concept in which soldiers were well-drilled.  Despite the efforts of padres, it was rare for commanders to attempt to impose morality and when on deployment it was common for there to be "authorized" brothels (often separate facilities for officers and other ranks) with the prostitutes subject to regular inspection by medical staff and allowed to practice their ancient profession only if the supervising doctor issued a "clean" certificate.  Until well into the twentieth century (and the beginning of the antibiotic era), it wasn't unusual for the losses of combat-ready troops to illness & disease to exceed those caused by battlefield casualties and although the numbers were dwarfed by conditions such as malaria, preventing and treating sexually transmitted diseases (STDs, then called venereal disease (VD)) was an important component of military medicine.  It wasn’t until the 1970s the initialism VD began to be replaced by STD (VD thought to have gained too many specific associations) but fortunately for AT&T, in 1951 they renamed their STD (Subscriber Toll Dialing) service (for long-distance phone calls) to DDD (Direct Distance Dialing), apparently for no better reason than the alliterative appeal although it's possible they just wanted to avoid mentioning “toll” with all that implies.  Many countries in the English-speaking world continued to use STD for phone calls, even after the public health specialists had re-purposed the initialism.  In clinical use, STI (Sexually Transmitted Infection) seems now the preferred term.

The other booby trap: Helpful advertising circa 1950.

In Western legal systems, two aspects of consumer protection which greatly advanced in the twentieth century were product liability and “truth in advertising”.  What the changes in product liability did was break the nexus of “privity of contract”, meaning it was no longer required that, to seek redress or compensation, an injured party had to be the purchaser of the defective goods.  That reform took shape during the inter-war years but “truth in advertising”, although an old concept enforced in contract law, really became a movement in the post-war years; it was designed to remove from commerce “deceptive or misleading” claims although advertising agencies still had wide scope to be “economical with the truth” if they could make their assertions fit into the “mere puffery” rubric.  One field never policed was women’s shapewear (corsets and such) which, with a judicious placement of struts, elasticized panels, ribs and padding, could variously make body parts appear curvier, straighter, smaller, larger or higher.  The Wonderbra (and its many imitators) was probably the best known example because among the many garments and devices it was the one which most dramatically deceived and misled.  On all this trickery the law remained silent and the sage advice remained: caveat emptor.