
Thursday, September 15, 2022

Catafalque

Catafalque (pronounced kat-uh-fawk, kat-uh-fawlk or kat-uh-falk)

(1) A (temporary or permanent) raised structure on which the body of a deceased person lies or is carried in state.

(2) A hearse (obsolete).

1635–1645: The orthodoxy is that catafalque is from the seventeenth century French catafalque, from the Italian catafalco, from the Late Latin catafalicum (scaffold), the construct being cata- (from the Ancient Greek κατά (katá) (downwards (and used in Medieval Latin with a sense of “beside, alongside”))) + fal(a) (wooden siege tower, thought to be from the Etruscan) + -icum (neuter of -icus, the suffix used to denote "belonging to; derived from or pertaining to").  However, etymologists are divided on the origin.  Some believe English picked up the word directly from the Italian and not via French and regard the Italian as of uncertain origin, the connection with the Late Latin only speculative.  From the Medieval Latin catafalicum, Old French gained chaffaut & chafaud (scaffold) which survives in the Modern French échafaud (scaffold).  Catafalque (the rare alternative spelling is catafalco) is a noun; the noun plural is catafalques.

The coffin carrying Queen Elizabeth II resting on its catafalque for the lying in state, Westminster Hall, London, September 2022.

A catafalque is the platform upon which the body of the dead lies before their funeral.  In the West the modern practice is for the body to be placed in a coffin but historically the body was sometimes wrapped and this remains the practice for burials at sea.  Catafalques can be elaborately decorated or constructed with austere simplicity and can be mobile or stationary.  Although associated with state funerals they are a common fixture in crematoria and chapels, existing so the coffin may sit at an appropriate height for ceremonial purposes, most obviously during “open-casket” services.  Those used by undertakers (funeral directors) are usually mobile (on wheels) so the coffin may easily be moved from one place to another, by one staff member if need be.  Thus, any appropriately elevated surface used for the purpose can be thought of (if only temporarily) as a catafalque although the name-proper is attached only to dedicated devices.  A catafalque party is a military formation, traditionally numbering four (though it may be more or fewer), assembled to stand guard over the coffin while the body is lying in state or at some other site of memorial.

The 1865 (Abraham Lincoln) catafalque in 2006, after the most recent replacement of its fabric covering.

One catafalque noted for its longevity is the hastily (and, from the point-of-view of a professional carpenter or cabinet maker, rather crudely) fabricated structure used for the coffin of Abraham Lincoln (1809–1865; US president 1861-1865) and still in use today.  Of simple construction and using plain framing timber, it’s not at all ornate and gains its aura from its long history of use, having been used in the funerals of some four dozen US figures from politics, the judiciary, the military or society (most recently Senator Harry Reid (1939–2021; US senator (Democrat, Nevada))).  Over the years, it has been enlarged and strengthened to accommodate increasingly heavy coffins and the fabric covering has several times been replaced but almost all the original structure remains so it’s not a “grandfather’s axe”.  The simplicity has sometimes been emulated with intent, Pope John Paul II’s (Karol Wojtyła, 1920–2005; Pope of the Roman Catholic Church 1978-2005) plain cypress coffin sitting atop a catafalque so basic it might have been built by Christ himself.  It’s thought JPII’s successor might choose something just as simple.

Voltaire's catafalque.

Voltaire (François-Marie Arouet, 1694–1778), the radical writer of the French Enlightenment, as controversial in death as in life, was buried quietly some way distant from Paris because his friends feared church and state would seek to deny him the proper rites of burial and it was only some thirteen years after his death, just after the French Revolution, that his body was disinterred and moved to the Panthéon in Paris, a site created to honor illustrious citizens.  His catafalque was an impressive three-tiered construction inscribed: “Poet, philosopher, historian, he made a great step forward in the human spirit.  He prepared us to become free”.

David Lloyd George's funeral bier, Good Friday (30 March) 1945, Llanystumdwy, Wales.

In the context of funerals, definitionally, there is no difference between a bier and a catafalque.  Bier ((1) a litter to transport the body of a dead person, (2) a platform or stand where a body or coffin is placed & (3) a count of forty threads in the warp or chain of woolen cloth) was from the Middle English beer, beere & bere, from the Old English bēr, from the West Saxon bǣr (stretcher, bier), from the Proto-West Germanic bāru, from the Proto-Germanic bērō, from the primitive Indo-European bher (to carry, bear) and was cognate with the Saterland Frisian Beere (stretcher, bier), the Dutch baar (bier) and the German Bahre (bier, stretcher).  It’s thus functionally the same as a catafalque and the only point of differentiation in modern use seems to be the convention that catafalque is used when the funeral is grand while for more modest affairs (like David Lloyd George’s (1863–1945; UK prime-minister 1916-1922) “farm cart funeral”), bier is preferred.  The pyre (from the Ancient Greek πυρά (pyrá), from πῦρ (pyr) (fire)), also known as a funeral pyre, is a structure, made almost always of wood, constructed for the purpose of burning a body as part of a funeral rite and thus a form of ceremonial cremation.  Dimensionally, it may be far larger than is required for purposes of combustion because big fires were often an important aspect of the spectacle.

A member of Queen Elizabeth II's catafalque party fainted shortly before his shift was due to end.  He was not seriously injured.

Saturday, March 9, 2024

Tsar

Tsar (pronounced zahr)

(1) An emperor or king.

(2) Title of the former emperors of Russia and several Slavonic states.

(3) Slang term for an autocratic ruler or leader.

(4) Slang term for a person exercising great authority or power in a particular field.

1545-1555: From the Old Russian tsĭsarĭ (emperor or king), akin to the Old Church Slavonic tsěsarĭ, the Gothic kaisar and the Greek kaîsar, all ultimately derived from the Latin Caesar (an emperor, a ruler, a dictator) while the Germanic form of the word was the source of the Finnish keisari and the Estonian keisar.  The prehistoric Slavic was tsesar; tsar was first adopted as an imperial title by Ivan IV (Ivan Vasilyevich, 1530–1584, better remembered as Ivan the Terrible; Grand Prince of Moscow and all Russia 1533-1584 & Tsar of all Russia 1547-1584) in 1547.  There’s a curious history to spelling tsar as czar.  Spelled thus, it’s contrary to the usage of all Slavonic languages; the word was so spelt by the Carniolan diplomat & historian Siegmund Freiherr von Herberstein (1486–1566) in his work (in Latin) Rerum Moscoviticarum Commentarii (Notes on Muscovite Affairs (1549)) which was such a seminal early source of knowledge of Russia in Western Europe that "czar" passed into the Western languages; despite that history, "tsar" definitely is the proper Latinization.  It still appears and some linguistic academics insist the lineage means it should be regarded as archaic use rather than a mistake and, as a fine technical point, that’s correct in that, for example, the female form czarina is from 1717 (from the Italian czarina and the German zarin).  In Russian, the female form is tsaritsa and a tsar’s son is a tsarevitch, his daughter a tsarevna.

Nicholas II (Nikolai II Alexandrovich Romanov, 1868–1918; last Tsar of Russia, 1894-1917).  He cut an imposing figure for the portraitists but his cousin Kaiser Wilhelm II (1859–1941; German Emperor & King of Prussia 1888-1918) reckoned the tsar's mental abilities rendered him most suitable to "a cottage in the country where he can grow turnips".  Wilhelm got much wrong in his life but historians seem generally to concur that in this he was a fair judge of things.

Tsar and its variants were the official titles of (1) the First Bulgarian Empire (913–1018), (2) the Second Bulgarian Empire (1185–1396), (3) the Serbian Empire (1346–1371), (4) the Tsardom of Russia (1547–1721) (technically replaced in 1721 by imperator but remaining in use outside Russia (and officially in relation to certain regions until 1917)) and (5) the Tsardom of Bulgaria (1908–1946).  So, although most associated with Russia, the first ruler to adopt the title was Simeon I (usually styled Simeon the Great; circa 865-927, ruler of Bulgaria 893-927), and that about halfway through his reign, and nobody since Simeon II (Simeon Borisov Saxe-Coburg-Gotha, b 1937; (last) Tsar of the Kingdom of Bulgaria 1943-1946) has been a tsar.  The transferred sense of "person with dictatorial powers" seems first to have appeared in English in 1866 as an adoption in American English, initially as a disapproving reference to President Andrew Johnson (1808–1875; US President 1865-1869) but it has come to be applied neutrally (health tsar, transport tsar) and use does sometimes demand deconstruction: drug tsar has been applied both to organised crime figures associated with the distribution of narcotics and to government appointees responsible for policing the trade.  In some countries, some overlap between the two roles has been noted.

Comrade Stalin agitprop.

Volgograd, the southern Russian city, was between 1925-1961 named Stalingrad (Stalin + -grad).  Grad (град in Cyrillic) was from the Old Slavic and translates variously as "town, city, castle or fortified settlement"; it once existed in many languages as gord and can be found still as grad, gradić, horod or gorod in many place-names.  Before it was renamed in honour of comrade Stalin (1878-1953; leader of the USSR 1924-1953), between 1589-1925 the city, at the confluence of the Tsaritsa and Volga rivers, was known as Tsaritsyn, the name from the Turkic-related Tatar dialect word sarisin meaning "yellow water" or "yellow river" but which, because of the similarity in sound and spelling, came in Russia to be associated with tsar.  Stalingrad is remembered as the scene of the epic and savage battle which culminated in the destruction in February 1943 of the German Sixth Army, something which, along with the strategic failure of the Wehrmacht in the offensive (Unternehmen Zitadelle (Operation Citadel)) in the Kursk salient five months later, marked what many military historians record as the decisive moment on the Eastern Front.  It has become common to refer to comrade Stalin as the "Red Tsar" whereas casual comparisons of Mr Putin (Vladimir Vladimirovich Putin; b 1952; president or prime minister of Russia since 1999) don't often reach to Russia's imperial past; they seem to stop with Stalin.

Caesar (an emperor, a ruler, a dictator) was from the late fourteenth century cesar (from Cæsar), originally a surname of the Julian gens in Rome, elevated to a title after Caius Julius Caesar (100-44 BC) became dictator and used as a title of emperors down to Hadrian (76–138; Roman emperor 117-138).  The name ultimately is of uncertain origin, Pliny the Elder (23–79) suggesting it came from the Latin caesaries (head of hair) because the future dictator was born with a lush growth while others have linked it to the Latin caesius (bluish-gray), an allusion to eye color.  "Caesar's wife" as the figure of a person who should be above suspicion (the phrase first recorded in English in the 1570s) comes from the biography of Julius Caesar written by the Greek Middle Platonist priest-philosopher & historian Plutarch (circa 46–circa 123).  Plutarch related the story of how Julius Caesar divorced his wife Pompeia because of rumors of infidelity, not because he believed the tales of her adultery but because, as a political position, “the wife of Caesar must not even be under suspicion”.

In late nineteenth century US slang, a sheriff was "the great seizer", an allusion to the office's role in seizing property pursuant to court order.  The use of Caesar to illustrate the distinction between a subject’s obligations to matters temporal and spiritual is from the New Testament: Matthew 22:21.

They say unto him, Caesar's. Then saith he unto them, Render therefore unto Caesar the things which are Caesar's; and unto God the things that are God's.

Christ had been answering a question posed by the Pharisees to trap Him: Is it lawful to pay taxes to Caesar (Matthew 22:15–20)?  To answer, Jesus held up a denarius, the coin with which to pay the tax, and noted that on it was the head of Caesar; by then Caesar had become a title, meaning emperor of Rome and its empire.  It was a clever answer; in saying "render unto Caesar that which is Caesar's and render unto God that which is God's", Jesus dismisses the notion of believers being conflicted by the demands of the secular state as a false dilemma because one can fulfil the requirements of the state by a mere payment of coin without any implication of accepting its doctrines or legitimacy.  Over the years much has been made of what is or should be "rendered unto Caesar", but more interesting is the inference which must be drawn: if we owe Caesar that which bears his image, what then do we owe God?  It can only be that we owe God that which bears the image of God, an impressive inventory listed in the book of Genesis and now interpreted by some Christians as "the whole universe".  To Caesar we can only ever owe money; to God we owe ourselves.

In the Old English the spelling was casere, which under the expected etymological process would have evolved into coser but, circa 1200, it was replaced in the Middle English by keiser, from the Norse or Low German, and later by the French or Latin form of the name.  Cæsar also is the root of the German Kaiser and the Russian tsar and was once thought linked with the Modern Persian shah.  Despite the common assumption, "caesar" wasn’t an influence on the English "king".  King was from the Middle English king & kyng, from the Old English cyng & cyning (king), from the Proto-West Germanic kuning, from the Proto-Germanic kuningaz & kunungaz (king), kin being the root.  It was cognate with the Scots keeng (king), the North Frisian köning (king), the West Frisian kening (king), the Dutch koning (king), the Low German Koning & Köning (king), the German König (king), the Danish konge (king), the Norwegian konge (king), the Swedish konung & kung (king), the Icelandic konungur & kóngur (king), the Finnish kuningas (king) and the Russian князь (knjaz) (prince) & княги́ня (knjagínja) (princess).  It eclipsed the non-native Middle English roy (king) and the Early Modern English roy, borrowed from the Old French roi, rei & rai (king).

The Persian Shah was from the Old Persian xšāyaθiya (king), once thought a borrowing from the Median as it was compared to the Avestan xšaϑra- (power; command), corresponding to the Sanskrit (the Old Indic) kṣatra- (power; command), source of kṣatriya (warrior).  However, recent etymological research has confirmed xšāyaθiya was a genuine, inherited Persian formation meaning “pertaining to reigning, ruling”.  The word, with the original suffix -iya, was from a deverbal abstract noun xšāy-aθa- (rule, ruling), from the Old Persian verb xšāy- (to rule, reign).  In the Old Persian, the full title of the Achaemenid rulers of the First Empire was Xšāyaθiya Xšāyaθiyānām (in Modern Persian, Šāhe Šāhān (King of Kings)), best translated as "Emperor", a title with ancient Near Eastern and Mesopotamian precedents.  The earliest known instance of such a title dates from the Middle Assyrian period as šar šarrāni, used by the Assyrian ruler Tukulti-Ninurta I (1243–1207 BC).

Tsar Bomba: the Tsar bomb

Tupolev Tu-95 in flight (left) and a depiction of the October 1961 test detonation of the Tsar Bomb.

Царь-бомба (Tsar Bomba (Tsar-bomb)) was the Western nickname for the Soviet RDS-220 hydrogen bomb (project code AN602; code name Ivan or Vanya), the most powerful nuclear weapon ever detonated.  The test on 30 October 1961 remains the biggest man-made explosion in history and was rated with a yield of 50-51 megatons although the design was technically able to produce a maximum yield in excess of 100.  For a long time the US estimated the yield at 54 megatons and the Russians at 58 but after the fall of the Soviet Union in 1991, it was confirmed the true yield was 50-51 megatons.  Only one was ever built and it was detonated on an island off the Russian arctic coast.  The decision to limit the size of the blast was related to the need to ensure (1) reduced nuclear fall-out and (2) that the aircraft dropping the thing would be able to travel a safe distance from the blast radius (the Kremlin's attitude to the lives of military personnel had changed since comrade Stalin's time).  No nuclear power has since expressed any interest in building weapons even as large as the Tsar Bomb and for decades the trend in strategic arsenals has been towards more and smaller weapons, a decision taken on the pragmatic military grounds that it's pointless to destroy things many times over.  It's true that higher yield nuclear weapons would produce "smaller rubble" but to the practical military mind such a result represents just "wasted effort".

Progress 1945-1961.

The Tupolev Tu-95 (NATO reporting name: Bear) which dropped the Tsar Bomb was a curious fork in aviation history, noted also for its longevity.  A four-engined turboprop-powered strategic bomber and missile platform, it entered service in 1956 and is expected still to be in operational use in 2040, an expectation the United States Air Force (USAF) shares for its big strategic bomber, the Boeing B-52 which first flew in 1952, the first squadrons formed three years later.  Both airframes have proven remarkably durable and amenable to upgrades; as heavy lift devices and delivery systems they could be improved upon with a clean-sheet design but the relatively small advantages gained would not justify the immense cost, thus the ongoing upgrade programmes.  The Tu-95's design was, inter alia, notable for being one of the few propeller-driven aircraft with swept wings and the only one ever to enter large-scale production.  It's also very loud, the tips of those counter-rotating propellers sometimes passing through the sound barrier.

Footage of the Tsar Bomb test de-classified and released after the dissolution of the Soviet Union (1922-1991).

The Tsar Bomb was in a sense the “ultimate” evolution of the centuries-long history of the bomb although it wasn’t the end of innovation, designers seemingly never running out of ideas to refine the concept of the device, the purpose of which is to (1) blow stuff up and (2) kill people.  Bomb was from the French bombe, from the Italian bomba, from the Latin bombus (a booming sound), from the Ancient Greek βόμβος (bómbos) (booming, humming, buzzing), the word imitative of the sound itself.  Bomb was used originally of “projectiles; mortar shells etc”, the more familiar “explosive device placed by hand or dropped from airplane” said by many sources to date from 1908 although the word was in the latter sense already used when describing the anarchist terrorism of the late nineteenth century.  As a footnote, the nickname of Hugh Trenchard (1873-1956), the first Marshal of the Royal Air Force (RAF), was “Boom” but this was related to his tone of voice rather than an acknowledgement of him being one of the earliest advocates of strategic bombing.

The figurative uses have been wide-ranging, among them “a dilapidated car” (often as “old bomb”, the use based presumably on the perception such vehicles are often loud).  The bombshell was originally literally a piece of military equipment but it was later co-opted (most memorably as “blonde bombshell”) to describe a particularly fetching young woman.  So, used figuratively, “bomb” could mean either “very bad” or “very good” and in his weekly Letter from America (broadcast by the BBC World Service 1946-2004), Alistair Cooke (1908–2004) noted a curious trans-Atlantic dichotomy.  In the world of showbiz, Cooke observed, “bomb” was used in both the US & UK to describe the reaction to a play, movie or whatever but in the US, if called “a bomb”, the production was a flop, a failure whereas in the UK, if something was called “quite a bomb”, it meant it was a great success.

I Know Who Killed Me (2007)

I Know Who Killed Me bombed (in the traditional US sense) but in the way these things sometimes happen, the film has since enjoyed a second life with a cult-following and screenings on the specialized festival circuit.  Additionally, DVD & Blu-Ray sales (it's said to be a popular, if sometimes ironic, gift) meant eventually it generated a profit although it has never exactly become a "bomb" (in the UK sense).  However, while it now enjoys a following among a small sub-set of the public, the professional critics have never softened their view.

Tuesday, November 1, 2022

Herringbone

Herringbone (pronounced her-ing-bohn)

(1) A pattern, the weave resembling the skeleton of a herring fish, consisting of adjoining vertical rows of slanting lines, any two contiguous lines forming either a V or an inverted V, used in masonry, textiles, embroidery etc.  Also called chevron, chevron weave or herringbone weave; a type of twill weave having this pattern.

(2) A fabric constructed with this weave.

(3) A garment made from such a fabric, applied especially to jackets and coats.

(4) In skiing, a method of going up a slope in which a skier sets the skis in a form resembling a V, and, placing weight on the inside edges, advances the skis by turns using the poles from behind for push and support.

(5) A type of cirrocumulus cloud.

1645–1655: The construct was herring + bone.  Herring was from the Middle English hering, from the Old English hǣring, from the Proto-West Germanic hāring (herring) of unknown origin but it may be related to the Proto-Germanic hērą (hair) due to the similarity of the fish’s fine bones to hair.  It was cognate with the Scots hering & haring, the Saterland Frisian Hiering & Häiring, the West Frisian hjerring, the Dutch haring, the German and Low German Hereng & Hering, the French hareng, the Norman ĥéren and the Latin haringus; all borrowings from the Germanic.  Bone was from the Middle English bon, from the Old English bān (bone, tusk; bone of a limb), from the Proto-Germanic bainą (bone), from bainaz (straight), from the primitive Indo-European bheyhz (to hit, strike, beat).  It was cognate with the Scots bane, been, bean, bein & bain (bone), the North Frisian bien (bone), the West Frisian bien (bone), the Dutch been (bone; leg), the Low German Been & Bein (bone), the German Bein (leg), the German Gebein (bones), the Swedish ben (bone; leg), the Norwegian and Icelandic bein (bone), the Breton benañ (to cut, hew), the Latin perfinēs (break through, break into pieces, shatter) and the Avestan byente (they fight, hit).  It was related also to the Old Norse beinn (straight, right, favorable, advantageous, convenient, friendly, fair, keen), from which Middle English gained bain, bayne, bayn & beyn (direct, prompt), the Scots bein & bien (in good condition, pleasant, well-to-do, cozy, well-stocked, keen), the Icelandic beinn (straight, direct, hospitable) and the Norwegian bein (straight, direct, easy to deal with).  The use to describe a type of cirrocumulus cloud dates from 1903.  The alternative form is herring-bone (not herring bone which would be a bone of a herring).

The herringbone shape (left) and a herring's bones (right).

The herringbone pattern picked up its rather fanciful name because of a resemblance to the fine bones of the fish.  First used in masonry, the motif has for centuries been used in wallpaper, mosaics, upholstery, fabrics, clothing and jewellery.  In engineering, the pattern is found also in the shape cut for some gears but there the choice is functional rather than decorative.
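For those who prefer the geometry made explicit, below is a minimal Python sketch (purely illustrative; the function name and dimensions are inventions for the example) which prints a crude ASCII herringbone: adjoining vertical rows of slanting lines, the direction alternating so any two contiguous rows meet in a V or an inverted V.

# Print a crude ASCII herringbone: each column is a band of parallel
# slashes and the direction alternates from column to column, the
# junctions forming the characteristic V / inverted V.
def herringbone(columns=6, width=3, rows=8):
    for _ in range(rows):
        print("".join(("/" if c % 2 == 0 else "\\") * width for c in range(columns)))

herringbone()   # prints rows of ///\\\///\\\... approximating the weave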

Roman herringbone brickwork, Villa Rustica, Mehring, Trier-Saarburg, Rhineland, Germany.

The original herringbone design was a type of masonry construction (called opus spicatum, literally "spiked work”) used first in Ancient Rome, widely adopted during medieval times and especially associated with Gothic Revival architecture; it’s commonly seen today.  It’s defined by bricks, tiles or cut stone laid in a herringbone pattern and is a happy coincidence of style and structural integrity.  Although most associated with decorative use, in many cases the layout was an engineering necessity because if tiles or bricks are laid in straight lines, the structure is inherently weak whereas if built using oblique angles, under compression, loads are more evenly distributed.  One of the reasons so much has survived from antiquity is the longevity of the famously sticky Roman concrete, the durability thought in part due to chemical reactions with an unusual Roman ingredient: volcanic ash.

Lindsay Lohan in herringbone flat-cap.

Of gears

Although the term “herringbone cut gears” is more poetic, to engineers they’re known as double helical gears.  In both manufacture and operation they present challenges, the tooling needed in their production demanding unusually fine tolerances and in installation a higher degree of alignment must be guaranteed.  Additionally, depending on use, there is sometimes the need periodically to make adjustments for backlash (although in certain applications they can be designed to have minimal backlash).  However, because of the advantages the herringbone structure offers over straight cut, spur or helical gears, the drawbacks can be considered an acceptable trade-off, the principal benefits being:

(1) Smoothness of operation and lower vibration: The herringbone shape inherently balances the load on the teeth, reducing vibration and generated noise.

(2) A high specific load capacity: The symmetrical design of herringbone gears offers a high surface area and an even distribution of load, meaning larger and more robust teeth may be used, making the design ideal for transmitting high torque or power.

(3) A reduction in axial thrust: Probably the reason engineers so favour the herringbone is that axial thrust can be reduced (in certain cases to the point of effective elimination).  With helical gears, the axial force imposed inherently acts to force gears apart whereas herringbone gears have two helical sections facing each other, the interaction cancelling the axial thrust and vastly improving mechanical stability (a worked example appears after this list).

(4) Self-regulating tolerance for misalignment: Herringbone gears handle small variations in alignment better than spur gears or single helical gears, the opposing helix angles assisting in compensating for any axial misalignment, contributing to smoother gear meshing and extending the life of components.

(5) Heat dissipation qualities: The symmetrical structure assists heat dissipation because the opposing helices create a distribution of heat through a process called mutual heat-soak, reducing the risk of localized overheating, something which improves thermal efficiency by making the heat distribution pattern more uniform.
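The cancellation in point (3) is simple trigonometry: the axial (thrust) component of the tooth load on a helical gear is the tangential force multiplied by the tangent of the helix angle and, because the two halves of a herringbone gear are of opposite hand, their thrust components are equal and opposite.  A minimal Python sketch (the 5 kN load and 30 degree helix angle are assumed figures chosen only for illustration, as is the function name):

import math

# Axial (thrust) force on a helical gear: Fa = Ft * tan(beta), where Ft is
# the tangential tooth load (newtons) and beta the helix angle (degrees).
def axial_thrust(ft_newtons, helix_angle_deg):
    return ft_newtons * math.tan(math.radians(helix_angle_deg))

ft, beta = 5000.0, 30.0                 # assumed: 5 kN load, 30 degree helix
single = axial_thrust(ft, beta)         # single helical: ~2887 N pushed into the bearings
left = axial_thrust(ft / 2, beta)       # herringbone: each half carries Ft/2...
right = axial_thrust(ft / 2, -beta)     # ...cut with the opposite hand
print(f"single helical thrust: {single:.0f} N")
print(f"herringbone net thrust: {left + right:.0f} N")   # ~0 N; the halves cancel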

Gears: helical (left), herringbone (or double helical) (centre) and straight-cut (right).  Although road cars long ago abandoned them, straight-cut gears are still used in motorsport where drivers put up with their inherent whine and learn the techniques needed to handle the shifting.

Saturday, March 4, 2023

Vermiform

Vermiform (pronounced vur-muh-fawrm)

Resembling or having the long, thin, cylindrical shape of a worm; long and slender.

1720-1730: From the Medieval Latin vermiformis, the construct being vermis (worm) + forma (form).  Vermis was from the primitive Indo-European wr̥mis and cognates included the Ancient Greek ῥόμος (rhómos) and the Old English wyrm (worm (which evolved into the Modern English worm)).  Form was from -fōrmis (having the form of), from fōrma (a form, contour, figure, shape, appearance, looks).  The root of the Latin vermis was the primitive Indo-European wer- (to turn, bend), an element most productive, contributing to: adverse; anniversary; avert; awry; controversy; converge; converse (as the adjectival sense of "exact opposite”); convert; diverge; divert; evert; extroversion; extrovert; gaiter; introrse; introvert; invert; inward; malversation; obverse; peevish; pervert; prose; raphe; reverberate; revert; rhabdomancy; rhapsody; rhombus; ribald; sinistrorse; stalwart; subvert; tergiversate; transverse; universe; verbena; verge (as the verb meaning "tend, incline"); vermeil; vermicelli; vermicular; vermiform; vermin; versatile; verse (in the sense of the noun "poetry"); version; verst; versus; vertebra; vertex; vertigo; vervain; vortex; -ward; warp; weird; worm; worry; worth (in the adjectival sense of "significant, valuable, of value"); worth (as the verb "to come to be"); wrangle; wrap; wrath; wreath; wrench; wrest; wrestle; wriggle; wring; wrinkle; wrist; writhe; wrong; wroth & wry.  Vermiform is an adjective.

Commonly used in medicine to describe the appendix, the word also passed from Latin into Modern French as the adjective vermiforme (plural vermiformes), the medical use being appendice vermiforme; the Spanish is apéndice vermiforme (plural apéndices vermiformes).  The only known derived form in English is the adjective subvermiform, used apparently exclusively in the disciplines of zoology, including entomology.  The meaning was defined in a dictionary from 1898 as “shaped somewhat like a worm” which is surprisingly imprecise for the language of science but that vagueness appears adequate for the purposes to which it’s put.  For whatever reason, vermiform was a word much favored by the US humorist HL Mencken (1880-1956).

The female Eumillipes persephone: 1,306 legs & 330 segments.  

Because the scientific literature has for some time been dominated by COVID-19 and all that flowed from it, the brief, sudden prominence of two vermiform creatures, one ancient, the other more recent, was an amusing distraction.  The younger animal was a new species of millipede which not only boasted more legs than any other creature on the planet but was the first of its kind to live up to its name.

Since circa 1600, the term millipede has been applied to any of the many elongated arthropods of the class Diplopoda (a taxonomic class within the subphylum Myriapoda of the phylum Arthropoda (the centipedes, millipedes and similar creepy-crawlies)) with cylindrical bodies that have two pairs of legs for each one of their many body segments and, although milliped was long regarded as the correct spelling by scientists who work with myriapods, millipede is by far the most common form in general use (although there’s the odd specialist who insists on millepede).  Millipede was from the Latin millipeda (wood louse), the construct being mille (thousand) + pes (genitive pedis) (foot), from the primitive Indo-European root ped (foot) (probably a loan-translation of the Greek khiliopous).  When named, it wasn’t intended as a mathematically precise definition, only to suggest the things had lots of legs, though certainly many fewer than a thousand.  The creature has always possessed a certain comical charm because, despite having usually twice the number of legs as centipedes, the millipede is entirely harmless whereas there are centipedes which can be quite nasty.  For centuries millipede was thought a bit of a misnomer, with no example ever observed with more than 750 legs and that deep-soil dweller was an outlier, most having fewer, with a count in two figures quite common.  The new species also lives in the depths: Eumillipes persephone (Persephone, the daughter of Zeus who was taken by Hades to the underworld), a female of which was found to be sprouting 1,306 legs.  Pale and eyeless, it’s vermiform in the extreme, the body-length almost a hundred times its width and instead of vision, it uses large antennae to navigate through darkness to feed on fungi.

The sheer length of the thing does suggest a long lifespan by the standards of the species, most of which tend not to survive much beyond two years.  The persephone however, based on a count of the body segments (which grow predictably in the manner of tree rings), seems likely to live perhaps as long as a decade.  One factor which accounts for the longevity is the absence of predators, the persephone’s natural environment being banded iron formations and volcanic rock some 200 feet (60 m) beneath the surface of a remote part of Western Australia.  Entomologists didn’t actually venture that deep to explore, instead using the simple but effective method of lowering buckets of tempting vegetation down shafts drilled by geologists exploring for minerals, returning later to collect whatever creatures had been tempted to explore.

Artist’s impression of an Arthropleura: half a metre wide and perhaps nearly three metres in length, the latter dimension similar to a small car.  

Days after the announcement from the Western Australian desert, livescience.com also reported that researchers in the UK had found the fossilized exoskeleton of an Arthropleura, the largest arthropod yet known to have lived.  The length of a modern car, the giant millipede-like creatures appear to have done most of their creeping and crawling during the Carboniferous Period, between 359 million and 299 million years ago.

Although the Arthropleura have long been known from the fossil record, there’d not before been any suggestion they ever grew quite so large and the find was serendipitous, the fossil discovered on a beach in a block of sandstone which had recently fallen and cracked apart.  The exoskeleton fragment is 30 inches (750 mm) long and 22 inches (550 mm) wide which means the giant millipede would have been around 102 inches (2600 mm) long and weighed around 110 lbs (50 kg), making it the biggest land animal of the Carboniferous era.  Despite its bulk however, the physics of movement and the need to support its own weight mean the leg count is nowhere near as impressive as that of its young relation on what is now the other side of the world (what are now the Australian and European land masses were closer together during the Carboniferous) and it’s still not clear whether Arthropleura had two legs per segment or two legs every two segments but either way it adds up to many fewer than a hundred.

Ultimately, Arthropleura was a victim of changing conditions.  In its time, it would have been living in a benign equatorial environment but, over millions of years, the equator can shift because of the phenomenon of TPW (true polar wander) in which the outer layer of Earth shifts around the core, tilting the crust relative to the planet’s axis; this last happened some eighty-four million years ago.  So, the conditions which for so long had been ideal changed and changed suddenly and Arthropleura was unable to adapt, going extinct after having flourished for nearly fifty million years.  The reasons for their demise are those seen repeatedly in the fossil record: in an abruptly changed environment, there was suddenly more competition for fewer resources and the Arthropleura lost out to animals which were stronger, more efficient and better able to adapt.

The human appendix.

Thousands of years after first being described, the human appendix, the small blind-ended vermiform structure at the junction of the large and the small bowel, remains something of a mystery.  For centuries the medical orthodoxy was that it was vestigial, an evolutionary dead-end and a mere quirk of human development but the current thinking is it exists as a kind of “safe-house” for the good bacteria resident in the bowel, enabling them to repopulate as required.  However, being blind-ended, although intestinal contents easily can enter, in certain circumstances it can operate as a kind of one-way, non-return valve, making exit impossible, which results in inflammation.  This is the medical condition appendicitis and in acute cases, the appendix must surgically be removed.  That's usually fine if undertaken in good time because it's a simple, commonly performed procedure but unfortunately, in a small number of cases, a residual "stump" of the structure may escape the knife and in this inflammation may recur, something surgeons resentfully label “stumpitis”.  Apparently the most useless part of the human anatomy, there is nothing in the medical literature to suggest anyone has noticed any aspect of their life being changed by not having an appendix.

Wednesday, October 18, 2023

Cimarron

Cimarron (pronounced sim-uh-ron, sim-uh-rohn or sim-er-uhn)

(1) A Maroon (an African or one of African descent who escaped slavery in the Americas, or a descendant thereof, especially a member of the Cimarron people of Panama).

(2) In Latin America (1) feral animals or those which have returned to the wild, (2) rural areas (campestral) and the inhabitants there dwelling & (3) wild plants.

(3) A name used in the US for both rivers & localities.

(4) A not fondly remembered small "Cadillac", built between 1981-1988.

1840–1850: From the Colonial Spanish cimarrón (a maroon (used also casually of feral animals, wild rams etc)), from the Spanish and thought likely equivalent to the Old Spanish cimarra (brushwood, thicket), the construct being cim(a) (peak, summit (from the Latin cȳma (spring shoots of a vegetable), from the Ancient Greek κῦμα (kûma))) + -arrón (the adjectival suffix).  Most etymologists appear to accept the Spanish cimarrón was a native Spanish formation from cima (summit, peak), referring to slaves who escaped to seek refuge in the mountains but the alternative theory is that it was a borrowing from the Taíno símaran (wild (like a stray arrow)), from símara (arrow).  The feminine was cimarrona, the masculine plural cimarrones & the feminine plural cimarronas.  The verb maroon (put ashore on a desolate island or some isolated and remote coast by way of punishment) dates from 1724 and was from maroon (fugitive black slave living in the wilder parts of Dutch Guyana or Jamaica and other West Indies islands) which has always been assumed to be a corruption of the Spanish cimmaron & cimarrón.  Cimarron is a noun & proper noun (the adjective cimarific (based on Cimar(ron) + (horr)ific) was a sardonic slur relating to the Cadillac); the noun plural is Cimarrons.

The Cadillac Cimarron, 1982-1988

For those who can remember the way things used to be done: 1968 Cadillac Coupe DeVille convertible.

The path of the reputation of the unfortunate Cadillac Cimarron was unusual in that the more it was upgraded and improved, the further it seemed to fall in the estimation of the motoring press.  Despite the impression which seems over the decades to have become embedded, the early critical reaction to the Cimarron was generally polite and even positive, while acknowledging the inadequacies of the original engine-transmission combinations.  The journalists may however have been in a mood to be unusually forgiving because in 1981, when the first examples were provided for press evaluation, that a Cadillac was for the first time since 1914 fitted with a four-cylinder engine, and one with a displacement smaller than 2.0 litres (122 cubic inch) for the first time since 1908, was a sign of how much the universe had shifted; not even ten years earlier every Cadillac on sale used an 8.2 litre (500 cubic inch) V8.  The ripples of the first oil shock would see the big-block V8 twice downsized but so much had the rising cost (and the threatened scarcity) of gas scarred the consumer that even Cadillac owners wanted more efficient vehicles.  They still wanted to drive Cadillacs and while demand for the full-sized cars remained, it was obvious to General Motors (GM) that the segment was in decline and the alternatives proving popular were not the traditional Lincoln and Imperial but the premium-brand Europeans: Mercedes-Benz, BMW and (as a niche player) Jaguar.

The cleverly engineered 1976 Cadillac Seville which hid its origins well.

The Europeans produced very different machines to the Cadillacs and it would have taken much time and money to match them in sophistication but what could be done quickly and at relatively low cost was to make a Cadillac out of a Chevrolet and that was the path chosen, the long-serving Chevrolet Nova re-styled, re-trimmed, re-engined (with the 5.7 litre (350 cubic inch) Oldsmobile V8) and re-badged as the Cadillac Seville.  On paper, it didn’t sound promising but on the road it actually worked rather well, essentially because Chevrolet had done a creditable job with the Nova, so the task of making it drive something like a Cadillac with some Mercedes-like characteristics wasn’t that onerous; Cadillac’s engineers did it well and the Seville was a great success, something especially pleasing to GM because the thing retailed at some four times what Chevrolet charged for Novas.  That made the Seville one of the most famously profitable lines ever to emerge from Detroit which was good but what was not was that most people who bought one weren’t conquests from Mercedes-Benz or BMW (and definitely not from Jaguar) but those who would otherwise have bought a Cadillac.  Still, the Seville did its bit and contributed to a brief era of record sales and high profits for GM.

Cadillac’s new enemy: 1982 BMW 320i (E21).

By the early 1980s however, Cadillac decided it needed to do the same thing again, this time on a smaller scale.  A second oil shock had struck in 1979 and this time the US economy wasn’t bouncing back as it had in the mid-1970s; the recession of the early 1980s was nasty indeed.  One market segment which was a bright spot however was the “small executive sedan”, dominated then by the BMW 3-Series, soon to be joined by what would become known as the Mercedes-Benz C-Class: compact, high-quality and high-priced cars being bought by what to Cadillac was a most attractive demographic, the then newly defined yuppies (young upwardly-mobile professionals).  Cadillac had nothing which appealed to this market and its plans for an entry were years away even from the initial design phases.  The economic situation of the time however had made the matter urgent and so, at a very late stage, Cadillac was appended to GM’s ambitious programme to use one “world car” platform across the divisions which produced cars in the planet’s major markets (the US, UK, Europe, Japan & Australasia).  This one front-wheel drive platform would provide a family-sized car in Japan, the UK and Europe, a medium-sized entrant in Australasia and a small car in the US with the highest possible degree of component interchangeability and a consequent reduction in the time and cost to bring the lines to production.

1982 Holden Camira SL/E (1982-1989), the Australian version of the “World Car”.

The longevity of the GM “World Car” (the J-Car (J-Body in US nomenclature)), the last produced in 2005, attests to the quality of GM’s fundamental engineering and over the decades more than 10 million would be sold as Vauxhalls (UK), Opels (Europe), Holdens (Australia & New Zealand), Isuzus and even Toyotas (Japan) and Chevrolets, Buicks, Oldsmobiles, Pontiacs & Cadillacs (US).  By the standards of the time they were good cars (although they did prove less suited to Australian driving conditions) but they could not, and certainly not in the eleven months available, be made into what would be thought of as “a Cadillac”.  To do that, given the technology available at the time, ideally the platform would have been widened, a small version of one of the corporate V8s (perhaps as small as 3.5 litres (215 cubic inch)) fitted and the configuration changed to accommodate rear-wheel drive (RWD) and independent rear suspension (IRS).  The J-Body could have accommodated all this and, thus configured, coupled with the lashings of leather expected in the interior, GM would have had an appropriately sized small executive sedan, executed in a uniquely American way.  Like the Seville, it may not have made much of a dent in the business Mercedes-Benz and BMW were doing but it would have had real appeal and it’s doubtful it would have cannibalized the sales of the bigger Cadillacs.  Additionally, it would have been ideally placed to take advantage of the rapid fall in gas prices which came with the 1980s “oil glut”.  Alas, such a thing would have taken too long to develop and it would have been such an expensive programme that Cadillac might as well have asked the GM board to accelerate the development of its own small car.  So, needing something small to put in the showrooms because that’s what Cadillac dealers were clamouring for, the decision was taken to tart up the J-Body.

1982 Cadillac Cimarron (1982-1988), the origins of which were obvious.

That, for the 1982 model year, was exactly what was done.  The Cadillac Cimarron was nothing more than a Chevrolet Cavalier with a lot of extra stuff bolted or glued on.  Apparently, the name “Cimarron” was chosen because it had in the US been used to refer to the wild and untamed horses which once roamed freely in the American West, the company hoping to add the idea of an “untamed spirit” to the (even if by then slightly tarnished) reputation for luxury and elegance once associated with Cadillac.  Whether much thought was given to the name’s association with slavery isn’t known.  That aside, the spirit wasn’t exactly untamed because the already anaemic performance of the Chevrolet was hampered further by all the extra weight of the luxury fittings which adorned the Cimarron, something which was tolerated (indeed probably expected) in what Chevrolet was selling as an “economy car”; luxury buyers had higher expectations.

Cadillac found that bigger was better: Yuppie Lindsay Lohan entering Cadillac Escalade, May 2012.

Most would conclude it made things worse.  Had it been sold as the Chevrolet Caprice II (a la Ford’s approach with the LTD II), the Cimarron would probably have been a hit and while there would have been the same criticisms, in a car costing so much less, they would have been less pointed.  However, that would have meant the Cadillac dealers not having product to put in their showrooms which was of course the point of the whole Cimarron venture.  As it was, sales never came close to Cadillac’s optimistic projections, numbers influenced presumably by the Seville’s stellar performance a few years earlier and this time the mark-up was less, a Cimarron costing only twice as much as a Cavalier.  That wasn’t enough however and nor were the constant upgrades, the most notable of which was the introduction of Chevrolet’s 2.8 litre (173 cubic inch) V6 in 1985; that did induce a surge in sales (though still to nothing like the once hoped-for levels) but it was short-lived and after production ended in 1988, Cadillac offered no replacement and has not since attempted to build anything on this scale.