Tuesday, October 8, 2024

Decorum

Decorum (pronounced dih-kawr-uhm or dih-kohr-uhm)

(1) Dignified propriety of behavior, speech, dress, demeanour etc.

(2) The quality or state of being decorous, or exhibiting such dignified propriety; orderliness; regularity.

(3) The conventions of social behaviour; an observance or requirement of one’s social group (sometimes in the plural “decorums”, the use alluding to the many rules of etiquette (the expectations or requirements defining “correct behaviour” which, although most associated with “polite society”, do vary between societal sub-sets, differing at the margins)).

1560–1570: A learned borrowing (in the sense of “that which is proper or fitting in a literary or artistic composition”) from the Latin decōrum, a noun use of the neuter of decōrus (proper, decent (ie decorous)), from decor (beauty, elegance, charm, grace, ornament), probably from decus (an ornament; splendor, honor), from the Proto-Italic dekos (dignity), from the primitive Indo-European deḱos (that which is proper), from deḱ- (take, perceive) (and used in the sense of “to accept” on the notion of “to add grace”).  By the 1580s the use of decorum had spread from its literary adoption from the Latin to the more generalized sense of “propriety of speech, behavior or dress; formal politeness”, a resurrection of the original sense in Latin (polite, correct in behaviour, that which is seemly).  Decorously (in a decorous manner) is an adverb, decorousness (the state or quality of being decorous; a behavior considered decorous) is a noun and indecorous (improper, immodest, or indecent) & undecorous (not decorous) are adjectives.  The adjective dedecorous (disgraceful; unbecoming) is extinct.  Decorum is a noun; the noun plural is decora or decorums.

Whether on rugby pitches, race tracks, in salons & drawing rooms or in geo-politics, disagreements over matters of decorum have over millennia been the source of innumerable squabbles, schisms and slaughter but linguistically, the related adjective decorous (characterized by dignified propriety in conduct, manners, appearance, character, etc) has also not been trouble-free.  Decorous seems first to have appeared in the 1650s from the Latin decōrus, akin to both decēre (to be acceptable, be fitting) and docēre (to teach (in the sense of “to make fitting”)), with the adjectival suffix -ōsus appended.  In Latin, the -ōsus suffix (full, full of) was a doublet of -ose in an unstressed position and was used to form adjectives from nouns, to denote possession or presence of a quality in any degree, commonly in abundance.  English picked this up via the Middle English -ous, from the Old French -ous & -eux, from the Latin -ōsus and it became productive.  In chemistry, it has a specific technical application, used in the nomenclature to name chemical compounds in which a specified chemical element has a lower oxidation number than in the equivalent compound whose name ends in the suffix -ic.  For example, sulphuric acid (H2SO4) has more oxygen atoms per molecule than sulphurous acid (H2SO3).  Decorous is an adjective, decorousness is a noun and decorously is an adverb.

In use there are two difficulties with decorous: (1) the negative forms and (2) how it should be pronounced, both issues with which mercifully few will be troubled (or even see what the fuss is about) but among a pedantic subset, much noted.  The negative forms are undecorous & indecorous (both of which rarely are hyphenated) but the meanings differ.  Undecorous means simply “not decorous” which can be bad enough but indecorous is used to convey “improper, immodest, or indecent” which truly can be damning in some circles so the two carefully should be applied.  There’s also the negative nondecorous but it seems never to have been a bother.  The problem is made worse by the adjective dedecorous (disgraceful; unbecoming) being extinct; it would have been a handy sort of intermediate state between the “un-” & “in-” forms and the comparative (more dedecorous) & superlative (most dedecorous) would have provided all the nuance needed.  The related forms are the nouns nondecorousness, undecorousness & indecorousness and the adverbs nondecorously, undecorously & indecorously.

The matter of the pronunciation of decorous is one for the pedants but there’s a lot of them about and like décor, the use is treated as a class-identifier, the correlation between pedantry and class-identifiers probably high; the two schools of thought are dek-er-uhs & dih-kawr-uhs (the second syllable -kohr- more of a regionalism) and in 1926 when the stern Henry Fowler (1858–1933) published his A Dictionary of Modern English Usage, he in his prescriptive way insisted on the former.  By 1965, when the volume was revised by Sir Ernest Gowers (1880–1966), he noted the “pronunciation has not yet settled down”, adding that “decorum pulls one way and decorate the other”.  In his revised edition, Sir Ernest distinguished still between right & wrong (a position from which, regrettably, subsequent editors felt inclined to retreat) but had become more descriptive than his predecessor, recording how things were done rather than how they “ought to be” done, and added that while “most authorities” had come to prefer dih-kawr-uhs, that other arbiter, the Oxford English Dictionary (OED), had listed dek-er-uhs first and it thus “may win”.  By the 2020s, impressionistically, it would seem it has.

Décor is another where the pronunciation can be a class-identifier and in this case it extends to the spelling, something directly related.  In English, the noun décor dates from 1897 in the sense of “scenery and furnishings” and was from the eighteenth century French décor, a back-formation from the fourteenth century décorer (to decorate), from the Latin decorare (to decorate, adorn, embellish, beautify), the modern word thus duplicating the Latin decor.  The original use in English was of theatre stages and such but the term “home décor” was in use late in the 1890s to describe the technique of hanging copies of old masters as home decoration.  From this evolved the general use (decorations and furnishings of a room, building etc), well established by the mid 1920s and it’s been with us ever since.  Typically sensibly, the French l'accent aigu (acute accent) (the “é” pronounced ay in French) was abandoned by the Americans without corroding society but elsewhere, décor remained preferred among certain interior decorators and their clients, the companion French pronunciation obligatory too.

Courtroom decorum: Lindsay Lohan arriving at court, Los Angeles, 2011-2012.  All the world's a catwalk.

Top row; left to right: 9 Feb 2011; 23 Feb 2011; 10 Mar 2011; 22 Apr 2011.
Centre row; left to right: 23 Jun 2011; 19 Oct 2011; 2 Nov 2011; 14 Dec 2011.
Bottom row; left to right: 17 Dec 2011; 30 Jan 2012; 22 Feb 2012; 28 Mar 2012.

In English, the original use of decorum was in the technical jargon of what would come to be called literary theory, decorum describing a structuralist adherence to formal convention.  It was applied especially to poetry where rules of construction abound and it was about consistency with the “canons of propriety” (in this context defined usually as “good taste, good manners & correctness” which in our age of cultural (and linguistic) relativism is something many would label as “problematic” but all are free to “plug-in” their own standards).  Less controversially perhaps, decorum was understood as the matter of behavior on the part of the poet qua ("in the capacity or character of; as being", drawn from the legal Latin qua (acting in the capacity of, acting as, or in the manner of)) their poem and therefore what is proper and becoming in the relationship between form and substance.  That needs to be deconstructed: decorum was not about what the text described because the events variously could be thought most undecorous or indecorous but provided the author respected the character, thought and language appropriate to each, the literary demands of decorum were satisfied.  Just as one would use many different words to describe darkness compared to those used of sunlight, a work on a grand and profound theme should appear in a dignified and noble style while the trivial or humble might be earthier.

The tradition of decorum is noted as a theme in the works of the Classical authors from Antiquity but the problem there is that we have available only the extant texts and they would be but a fragment of everything created; it’s acknowledged there was much sifting and censoring undertaken in the Medieval period (notably by priests and monks who cut out “the dirty bits”) and it’s not known how much was destroyed because it was thought “worthless” or, worse, “obscene”.  What has survived may be presumed to be something of the “best of” Antiquity and there’s no way of knowing if in Athens and Rome there were proto-post modernists who cared not a fig for literary decorum.  The Greek and Roman tradition certainly seems to have been influential however because decorum is obvious in Elizabethan plays.  In William Shakespeare’s (1564–1616) Much Ado About Nothing (circa 1598), the comic passages such as the badinage between Beatrice and Benedick appear for amusing effect in colloquial dramatic prose while the set-piece romantic episodes are in formal verse; the very moment Benedick and Beatrice realize they are in love, that rise in the emotional temperature is signified by them suddenly switching to poetic verse.

Lindsay Lohan and her lawyer in court, Los Angeles, December 2011.

By contrast, in rhetoric, the conventions of literary decorum were probably most useful when being flouted.  Winston Churchill’s (1875-1965; UK prime-minister 1940-1945 & 1951-1955) World War II (1939-1945) speeches are remembered now for their eloquence and grandeur but there’s much evidence that at the time many listeners regarded their form as an anachronism and preferred something punchier; what made them effective was the way he could mix light & dark, high and low, to lend his words a life which transcended the essential artificiality of a speech.  Once, when discussing a serious matter of international relations and the legal relationships between formerly belligerent powers, he paused to suggest that while Germany might be treated harshly after all that had happened, the Italians “…might be allowed to work their passage back.” [to the community of the civilized world].  What the flouting of decorum could do was make something worthy but dull seem at least briefly interesting, or at least amusing, avoiding what the British judge Lord Birkett (1883–1962) would have called listening to “the ‘refayned’ and precious accents of a decaying pontiff”.

In English literature, it was during the seventeenth & eighteenth centuries that decorum became what might now be called a fetish, a product of the reverence for what were thought to be the “Classical rules and tenets” although quite how much these owed to a widespread observance in Antiquity and how much to the rather idealized picture of the epoch painted by medieval and Renaissance scholars really isn’t clear.  Certainly, in the understanding of what decorum was there were influences ancient & modern, Dr Johnson (Samuel Johnson (1709-1784)) observing that while terms like “cow-keeper” or “hog-herd” would be thought too much the vulgar talk of the peasantry to appear in “high poetry”, to the Ancient Greeks there were no finer words in the language.  Some though interpolated the vulgarity of the vernacular just because of the shock value the odd discordant word or phrase could have, the English poet Alexander Pope (1688-1744) clearly enjoying mixing elegance, wit and grace with the “almost brutal forcefulness” of “the crude, the corrupt and the repulsive” and it’s worth noting he made his living also as a satirist.  His example must have appealed to the Romantic poets because they sought to escape the confines imposed by the doctrines of Neoclassicism, William Wordsworth (1770–1850) writing in the preface to Lyrical Ballads (1798, co-written with Samuel Taylor Coleridge (1772-1834)) that these poems were written to rebel against “false refinement” and “poetic diction”.  He may have had in mind the odd “decaying pontiff”.

Sunday, October 6, 2024

Monolith

Monolith (pronounced mon-uh-lith)

(1) A column, large statue etc, formed originally from a single block of stone but latter-day use applies the term to structures formed from any material and not of necessity a single piece (although technically such a thing should be described using the antonym “polylith”).

(2) Used loosely, a synonym of “obelisk”.

(3) A single block or piece of stone, especially when used in architecture or sculpture and applied most frequently to large structures.

(4) Something (or an idea or concept) having a uniform, massive, redoubtable, or inflexible quality or character.

(5) In architecture, a large hollow foundation piece sunk as a caisson and having a number of compartments that are filled with concrete when it has reached its correct position.

(6) An unincorporated community in Kern County, California, United States (initial capital).

(7) In chemistry, a substrate having many tiny channels that is cast as a single piece, which is used as a stationary phase for chromatography, as a catalytic surface etc.

(8) In arboreal use, a dead tree whose height and size have been reduced by breaking off or cutting its branches (rare except in UK horticultural use).

1829: The construct was mono- + lith.  Mono was from the Ancient Greek, a combining form of μόνος (monos) (alone, only, sole, single), from the Proto-Hellenic mónwos, from the primitive Indo-European mey- (little; small).  It was related to the Armenian մանր (manr) (slender, small), the Ancient Greek μανός (manós) (sparse, rare), the Middle Low German mone & möne, the West Frisian meun, the Dutch meun and the Old High German muniwa, munuwa & munewa (from which German gained Münne (minnow)).  As a prefix, mono- is often found in chemical names to indicate a substance containing just one of a specified atom or group (eg a monoxide such as carbon monoxide: carbon attached to a single atom of oxygen).

In English, the noun monolith was from the French monolithe (object made from a single block of stone), from the Middle French monolythe (made from a single block of stone) and their etymon the Latin monolithus (made from a single block of stone), from the Ancient Greek μονόλιθος (monólithos) (made from a single block of stone), the construct being μονο- (mono-) (the prefix appended to convey the meaning “alone; single”), from μόνος (monos) + λίθος (líthos) (a stone; stone as a substance).  The English form was cognate with the German Monolith (made from a single block of stone).  The verb was derived from the noun.  Monolith is a noun & verb, monolithism, monolithicness & monolithicity are nouns, monolithic is an adjective and monolithically is an adverb; the noun plural is monoliths.  The adjective monolithal is listed as "an archaic form of monolithic".

Monolith also begat two parallel formations in the technical jargon of archaeology: A “microlith” is (1) a small stone tool (sometimes called a “microlite”) and (2) the microscopic acicular components of rocks.  A “megalith” is (1) a large stone slab making up a prehistoric monument, or part of such a monument, (2) a prehistoric monument made up of one or more large stones and (3) by extension, a large stone or block of stone used in the construction of a modern structure.  The terms seem not to be in use outside of the technical literature of the profession.  The transferred and figurative use in reference to a thing or person noted for indivisible unity is from 1934 and is now widely used in IT, political science and opinion polling.  The adjective monolithic (formed of a single block of stone) was in use by the early nineteenth century and within decades was used to mean “of or pertaining to a monolith”, the figurative sense noted since the 1920s.  The adjective prevailed over monolithal which seems first to have appeared in a scientific paper in 1813.  The antonym in the context of structures rendered from a single substance is “polylith” but use is rare and multi-component constructions are often described as “monoliths”.  The antonym in the context of “anything massive, uniform, and unmovable, especially a towering and impersonal cultural, political, or social organization or structure” is listed by many sources as “chimera” but terms like “diverse”, “fragmented” etc are usually more illustrative for most purposes.  In general use, there certainly has been something of a meaning-shift.  While “monolith” began as meaning “made of a single substance”, it’s now probably most used to convey the idea of “something big & tall” regardless of the materials used.

One of the Monoliths as depicted in the film 2001: A Space Odyssey (1968). 

The mysterious black structures in Sir Arthur C Clarke's (1917–2008) Space Odyssey series (1968-1997) became well known after the release in 1968 of Stanley Kubrick's (1928–1999) film of the first novel in the series, 2001: A Space Odyssey.  Although sometimes described as “obelisks”, the author noted they were really “monoliths”.  In recent years, enthusiasts, mischief makers and click-bait hunters have been erecting similar monoliths in remote parts of planet Earth, leaving them to be discovered and publicized.  With typical alacrity, modern commerce noted the interest and soon, replicas were being offered for sale, a gap in the market for Christmas gifts priced between US$10,000-45,000 apparently having been identified.

The terms “obelisk” and “monolith” are sometimes used interchangeably and while in the case of many large stone structures this can be appropriate, the two terms have distinct meanings.  Classically, an obelisk is a tall, four-sided, narrow pillar that tapers to a pyramid-like point at the top.  Obelisks often are carved from a single piece of stone (and are thus monolithic) but can also be constructed in sections and archaeologists have discovered some of the multi-part structures exist by virtue of necessity; intended originally to be a single piece of stone, the design was changed after cracks were detected.  A monolith is a large single block of stone which can be naturally occurring (such as a large rock formation) or artificially shaped; monoliths take many forms, including obelisks, statues and even buildings.  Thus, while an obelisk can be a monolith, not all monoliths are obelisks.

Highly qualified German content provider Chloe Vevrier (b 1968) standing in front of the Luxor Obelisk, Paris 2010.

The Luxor Obelisk sits in the centre of the Place de la Concorde, one of the world’s most photographed public squares.  Of red granite, 22.5 metres (74 feet) in height and weighing an estimated 227 tonnes (250 short (US) tons), it is one of a pair, the other still standing in front of the first pylon of the Luxor Temple on the east bank of the Nile River, Egypt.  The obelisk arrived in France in May 1833 and less than six months later was raised in the presence of Louis Philippe I (1773–1850; King of the French 1830-1848).  The square hadn’t always been a happy place for kings to stand; in 1789 (then known as the Place de Louis XV) it was one of the gathering points for the mobs staging what became the French Revolution and after the storming of the Bastille (one of history’s less dramatic events despite the legends), the square was renamed Place de la Revolution, living up to the name by being the place where Louis XVI (1754–1793; King of France 1774-1792), Marie Antoinette (1755–1793; Queen Consort of France 1774-1792) and a goodly number of others were guillotined.  Things were calmer by 1833 when the obelisk was erected.

The structure was a gift to France from Pasha Mehmet Ali (1769–1849, Ottoman Albanian viceroy and governor of Egypt 1805-1848) and in return Paris sent a large mechanical clock which to this day remains in place in the clock tower of the mosque at the summit of the Citadel of Cairo; of the 28 surviving ancient Egyptian obelisks, six remain in Egypt with the rest in various displays around the world.  Some 3000 years old, in its original location the Obelisk contributed to scientific history when in circa 250 BC the Greek geographer & astronomer Eratosthenes of Cyrene (circa 276 BC–circa 195 BC) used the shadow it cast to calculate the circumference of the Earth.  By comparing the shadow at a certain time with one in Alexandria, he concluded the angular difference between Alexandria and Aswan was seven degrees and 14 minutes and from this, combined with the surveyed distance between the two cities, he could work out the Earth’s circumference.
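For those inclined to check the arithmetic, the method reduces to one proportion: the measured angle is to 360 degrees as the Alexandria-Aswan distance is to the whole circumference.  A minimal sketch in Python follows, using the figures commonly cited in the standard accounts (the 5,000-stadia distance and a stadion of about 157.5 metres are assumptions, not figures from the text above):

```python
# Eratosthenes' proportion: angle / 360 = distance / circumference
angle_deg = 7 + 14 / 60       # 7 degrees 14 minutes, the measured shadow angle
distance_stadia = 5_000       # Alexandria to Aswan (Syene); an assumed figure
circumference_stadia = distance_stadia * 360 / angle_deg

stadion_m = 157.5             # one (disputed) estimate of the stadion's length
print(f"{circumference_stadia:,.0f} stadia "
      f"= {circumference_stadia * stadion_m / 1000:,.0f} km")
# ~248,800 stadia, about 39,200 km: within a few percent of the modern
# equatorial figure of 40,075 km
```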

Monolithic drivers

In IT, the term “monolithic driver” was used to refer to a software driver designed to handle multiple hardware components or functionalities within a single, large, and cohesive codebase.  In this it differed from earlier (and later) approaches which were modular or layered, in which the functionality is split into separate, smaller drivers or modules, each of which handles specific tasks or addresses only certain hardware components.  Monolithic drivers became generally available in the late 1980s, a period when both computer architecture and operating systems were becoming more sophisticated in an attempt to overcome the structural limitations imposed by the earlier designs.  It was in this era that many of the fundamental concepts which continue to underpin modern systems were conceived although the general adoption of some lay a decade or more away.
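By way of illustration only (this is schematic Python, not real driver code, and every name in it is invented), the structural difference the term captures can be sketched:

```python
# Monolithic: every function of a multi-purpose card lives in one codebase,
# so a fault in any path can destabilize the whole driver.
class MonolithicDriver:
    def initialise(self):
        self._setup_bus()
        self._setup_audio()
        self._setup_network()
        self._setup_storage()
    def _setup_bus(self): ...
    def _setup_audio(self): ...
    def _setup_network(self): ...
    def _setup_storage(self): ...

# Modular/layered: small single-purpose modules behind a common interface,
# each loadable, testable and replaceable on its own.
class AudioModule:
    def initialise(self): ...

class NetworkModule:
    def initialise(self): ...

class LayeredDriverStack:
    def __init__(self, modules):
        self.modules = modules    # a failure stays isolated to its module
    def initialise(self):
        for module in self.modules:
            module.initialise()

LayeredDriverStack([AudioModule(), NetworkModule()]).initialise()
```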

During the 1970s & 1980s, many systems were built with a tight integration between software and hardware and some operating systems (OS) were really little more than “file loaders” with a few “add-ons”; the limitations imposed were “worked around” by some programmers who more-or-less ignored the operating system and addressed the hardware directly, writing in assembly language (the human-readable notation for machine code).  That approach made for fast software but at the cost of interoperability and compatibility, such creations being hardware-specific rather than using an OS as what came to be known as the HAL (hardware abstraction layer); at the time, few OSs were like UNIX with its monolithic kernel in which the core OS services (file system management, device drivers etc) were all integrated into a single large codebase.  As the market expanded, it was obvious the multi-fork approach was commercially unattractive except for the odd niche.

After its release in 1981, use of the IBM personal computer (PC) proliferated and because of its open (licence-free) architecture, an ecosystem of third party suppliers arose, producing a remarkable array of devices which either “hung off” or “plugged into” a PC; the need for hardware drivers grew.  Most drivers at the time came from the hardware manufacturers themselves and typically were monolithic (though not yet usually described as such), written usually for specific hardware, and issues were rife, a change to an OS or even other (apparently unrelated) hardware or software sometimes inducing instability or worse.  As operating systems evolved to support more modularity, the term “monolithic driver” came into use to distinguish these large, single-block drivers from the more modular or layered approaches that were beginning to emerge.

It was the dominance of Novell’s Netware (1983-2009) on PC networks which compelled Microsoft to develop Windows NT (“New Technology”, 1993) and it featured a modular kernel architecture, something which made the distinction between monolithic and modular drivers better understood as developers increasingly embraced the modular, layered approach which better handled maintainability and scalability.  Once neutral, the term “monolithic driver” became something of a slur in IT circles, notably among system administrators (“sysadmins” or “syscons”, the latter based on the “system console”, the terminal on a mainframe hard-wired to the central processor) who accepted ongoing failures of this and that as inherent to the business but wanted to avoid a SPoF (single point of failure).

In political science, the term “monolithic” is used to describe a system, organization, or entity perceived as being unified, indivisible, and operating with a high degree of internal uniformity, often with centralized control.  When something is labeled as monolithic, it implies that it lacks diversity or internal differentiation and presents a singular, rigid structure or ideology.  Tellingly, the most common use of the term is probably when analyzing electoral behavior: groups, societies or sub-sets of either, although often depicted in the media as “monolithic” in their views, voting patterns or political behavior, are anything but and there’s usually some diversity.  In political science, such divergences within defined groups are known as “cross-cutting cleavages”.

It’s used also of political systems in which a regime is structured (or run) with power highly concentrated, typically in a single dictator or ruling party.  In such systems, usually there is little effective opposition and dissent is suppressed (although some of the more subtle informally tolerate a number of “approved dissenters” who operate within understood limits of self-censorship).  The old Soviet Union (the USSR (Union of Soviet Socialist Republics) 1922-1991), the Islamic Republic of Iran (1979-), the People’s Republic of China (run by the Chinese Communist Party (CCP), 1949-) and the DPRK (Democratic People’s Republic of Korea (North Korea), 1948-) are classic examples of monolithic systems; while the differences between them were innumerable, structurally all were (or are) politically monolithic.  The word is used also as a critique in the social sciences, Time magazine in April 2014 writing of the treatment of “Africa” as a construct in Mean Girls (2004): “Like the original Ulysses, Cady is recently returned from her own series of adventures in Africa, where her parents worked as research zoologists. It is this prior “region of supernatural wonder” that offers the basis for the mythological reading of the film. While the notion of the African continent as a place of magic is a dated, rather offensive trope, the film firmly establishes this impression among the students at North Shore High School. To them, Africa is a monolithic place about which they know almost nothing. In their first encounter, Karen inquires of Cady: “So, if you’re from Africa, why are you white?” Shortly thereafter, Regina warns Aaron that Cady plans to “do some kind of African voodoo” on a used Kleenex of his to make him like her—in fact, the very boon that Cady will come to bestow under the monomyth mode.”  It remains a sensitive issue and one of the consequences of European colonial practices on the African continent (something which included what would now be regarded as “crimes against humanity”), so the casual use of “Africa” as a monolithic construct is proscribed in a way a similar use of “Europe” would not be.

The limitations of the utility of the term mean it should be treated with caution and while there are “monolithic” aspects or features to constructs such as “the Third World”, “the West” or “the Global South”, the label does over-simplify the diversity of cultures, political systems, and ideologies within these broad categories.  Even something which is to some degree “structurally monolithic” like the United States (US) or the European Union (EU) can be highly diverse in terms of actual behavior.  In the West (and the modern-day US is the most discussed example), the recent trend towards polarization of views has become a popular topic of study and the coalesced factions are sometimes treated as “monolithic” despite in many cases being themselves intrinsically factionalized.

Saturday, October 5, 2024

Ballistic

Ballistic (pronounced buh-lis-tik)

(1) A projected object having its subsequent travel determined or describable by the laws of exterior ballistics, most used in denoting or relating to the flight of projectiles after the initial thrust has been exhausted, moving under their own momentum and subject to the external forces of gravity and the fluid dynamics of air resistance.

(2) Of or relating to ballistics.

(3) In slang and idiomatic use (as “go ballistic”, “went ballistic” etc), to become overwrought or irrational; to become enraged or frenziedly violent.  For those who need to be precise in describing such instances, the comparative is “more ballistic” and the superlative “most ballistic”.

(4) Of a measurement or measuring instrument, depending on a brief impulse or current that causes a movement related to the quantity to be measured.

(5) Of materials, those able to resist damage (within defined parameters) by projectile weapons (ballistic nylon; ballistic steel etc), the best-known use of which is the “ballistics vest”.

(6) As “ballistics gel(atin)”, a substance which emulates the characteristics and behavior under stress of human or animal flesh (used for testing the effect of certain impacts, typically shells fired from firearms).

(7) As “ballistic podiatry”, industry slang for “the act of shooting oneself in the foot”, used also by military doctors to describe soldiers with such self-inflicted injuries.  The more general term for gunshot wounds is “ballistic trauma”.

(8) In ethno-phonetics, as “ballistic syllable”, a phonemic distinction in certain Central American dialects, characterized by a quick, forceful release and a rapid crescendo to a peak of intensity early in the nucleus, followed by a rapid, un-controlled decrescendo with fade of voicing.

(9) As “ballistic parachute”, a parachute used in light aircraft and helicopters, ejected from its casing by a small explosion.

1765–1775: The construct was the Latin ballist(a) (a siege engine (ancient military machine) for throwing stones to break down fortifications), from the Ancient Greek βαλλίστρα (ballístra), from βάλλω (bállō) (I throw), + -ic.  The -ic suffix was from the Middle English -ik, from the Old French -ique, from the Latin -icus, from the primitive Indo-European -kos & -os, formed with the i-stem suffix -i- and the adjectival suffix -kos & -os.  The form existed also in the Ancient Greek as -ικός (-ikós), in Sanskrit as -इक (-ika) and the Old Church Slavonic as -ъкъ (-ŭkŭ); a doublet of -y.  In European languages, adding -kos to noun stems carried the meaning "characteristic of, like, typical, pertaining to" while on adjectival stems it acted emphatically; in English it's always been used to form adjectives from nouns with the meaning “of or pertaining to”.  A precise technical use exists in physical chemistry where it's used to denote certain chemical compounds in which a specified chemical element has a higher oxidation number than in the equivalent compound whose name ends in the suffix -ous (eg sulphuric acid (H₂SO₄) has more oxygen atoms per molecule than sulphurous acid (H₂SO₃)).  The modern use (of the big military rockets or missiles (those guided while under propulsion, but which fall freely to their point of impact (hopefully the intended target))) dates from 1949 although the technology pre-dated the label.  The term “ballistic missile” seems first to have appeared in 1954 and remains familiar in the “intercontinental ballistic missile” (ICBM).  The figurative use (“go ballistic”, “went ballistic”) to convey “an extreme reaction; to become irrationally angry” is said to have been in use only since 1981 which is surprising.  To “go thermo-nuclear” or “take the nuclear option” are companion phrases but the nuances do differ.  The noun ballistics (art of throwing large missiles; science of the motion of projectiles) seems first to have appeared in 1753 and was from the Latin ballist(a), from the Ancient Greek ballistes, from ballein (to throw, to throw so as to hit that at which the object is aimed (though used loosely also in the sense “to put, place, lay”)), from the primitive Indo-European root gwele- (to throw, reach).  In the technical jargon of the military and aerospace industries, the derived forms included (hyphenated and not) aeroballistic, antiballistic, astroballistic, ballistic coefficient, quasiballistic, semiballistic, subballistic, superballistic & thermoballistic.  In science and medicine, the forms include bioballistic, cardioballistic, electroballistic and neuroballistic.  Ballistic & ballistical are adjectives, ballisticity, ballistician & ballistics are nouns and ballistically is an adverb; the wonderful noun plural is ballisticities.

The basilisk was a class of large-bore, heavy bronze cannons used during the late Middle Ages and in their time they were a truly revolutionary weapon, able quickly to penetrate fortifications which in some cases had for centuries enabled attacks to be resisted.  Although there were tales of basilisks with bores between 18-24 inches (460-610 mm), these were almost certainly a product of the ever-fertile medieval imagination and there’s no evidence any were built with a bore exceeding 5 inches (125 mm).  As a high-velocity weapon however, that was large enough for it to be highly effective, the 160 lb (72 kg) shot carrying a deadly amount of energy and able to kill personnel or destroy structures.  Because of the explosive energy needed to project the shot, the barrels of the larger basilisks could weigh as much as 4000 lb (1,800 kg); typically they were some 10 feet (3 m) in length but the more extraordinary, built as long-range devices, could be as long as 25 feet (7.6 m).  Despite the similarity in form, the name basilisk was unrelated to “ballistics” and came from the basilisk of mythology, a fire-breathing, venomous serpent able to kill and destroy, its glance alone deadly.  It was thus a two-part allusion: (1) the idea of “spitting fire” and (2) the thought the mere sight of an enemy’s big cannons would be enough to scare an opponent into retreat.

As soon as it appeared in Europe, it was understood the nature of battlefields would change and the end of the era of the castle was nigh.  It was the deployment of the big cannons which led to the conquest of Constantinople (capital of the Byzantine Empire, now Istanbul in the Republic of Türkiye) in 1453 after a 53 day siege; the city’s great walls which for centuries had protected it from assault were worn down by the cannon fire to the point where the defenders couldn’t repair the damage at the same rate as the destruction.  In an example of the way economics is a critical component of war, the cannon makers had offered the weapons to the Byzantines but the empire was in the throes of one of the fiscal crises which determined the outcomes of so many conflicts and had no money with which to make the purchase.  The practical founders then sold their basilisks to the attacking Ottoman army and the rest is history.  Despite such successes, the biggest of the basilisks became rare after the mid sixteenth century as military tactics evolved to counter their threat by becoming more mobile; the traditional siege of static targets became less decisive and smaller, more easily transported cannon, lighter and cheaper to produce, came to dominate artillery formations.

Queen Elizabeth's Pocket Pistol, Navy, Army and Air Force Institute Building, Dover Castle, Dover, Kent, England.

Queen Elizabeth's Pocket Pistol was a basilisk built in 1544 in Utrecht (in the modern-day Netherlands), the name derived from its having been presented to Henry VIII (1491–1547; King of England (and Ireland after 1541) 1509-1547) as a gift for his daughter (the future Elizabeth I (1533–1603; Queen of England & Ireland 1558-1603)) although the first known reference to it being called “Queen Elizabeth's Pocket Pistol” dates from 1767.  Some 24 feet (7.3 m) in length and with a 4.75 inch (121 mm) bore, it was said to be able to launch a 10 lb (4.5 kg) ball almost 2000 yards (1.8 km) although as a typical scare tactic, the English made it known to the French and Spanish that its shots were heavier and able to reach seven miles (12 km).  Just to make sure the point was understood, it was installed to guard the approaches to the cliffs of Dover.  Modern understandings of the physics of ballistics and the use of computer simulations have since suggested there may have been some exaggeration in even the claim of a 2000 yard range and it was likely little more than half that.  Such use of propaganda remains part of the military arsenal to this day.
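The scepticism is easy to motivate with the drag-free “vacuum range” formula R = v² sin(2θ) / g, which sets an upper bound on any gun's reach; the sketch below assumes a muzzle velocity (the text gives none), so the numbers are illustrative only:

```python
import math

# Vacuum (zero-drag) range: an upper bound no real shot can reach
def vacuum_range_m(velocity_ms, elevation_deg, g=9.81):
    return velocity_ms ** 2 * math.sin(math.radians(2 * elevation_deg)) / g

v = 300                                   # m/s, an assumed muzzle velocity
print(f"{vacuum_range_m(v, 45):,.0f} m")  # ~9,200 m with no air resistance
# Drag on a 10 lb ball consumes most of that margin, which is why the
# simulations put the practical range nearer half the claimed 2,000 yards.
```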

It was fake news:  Responding to viral reports, the authoritative E!-News in April 2013 confirmed Lindsay Lohan did not "go ballistic" and attack her ex-assistant at a New York City club.  For some reason, far and wide, the fake news had been believed.

Despite the costs involved and the difficulties in maintaining and transporting big cannons, some militaries couldn’t resist them and predictably, Adolf Hitler (1889-1945; Führer (leader) and German head of government 1933-1945 & head of state 1934-1945), who thought just about everything (buildings, tanks, trains, monuments, cars, battleships et al) should be bigger, oversaw some of the most massive artillery pieces ever built, often referred to by historians as “super heavy guns”.  The term is no exaggeration and the most striking examples were the Schwerer Gustav and Dora.  With a bore of 31.5 inches (800 mm), the apparatus weighed 1350 tons (1225 tonnes) and could fire a projectile as heavy as 7.1 tons (6.4 tonnes) some 29 miles (47 km).  Two were built, configured as “railway guns” and thus of most utility in highly developed areas where rail tracks lay conveniently close to the targets.  The original design brief from the army ordnance office required a long-range device able to destroy heavily fortified targets and for that purpose, they could be effective.  However, each demanded a crew of several thousand soldiers, technicians & mechanics with an extensive logistical support system in place to support an operation which could manage fewer than one firing per day.  The Schwerer Gustav’s most successful deployment came during the siege of Sevastopol (1942).  Other big-bore weapons followed but success proved patchy, especially as allied control of the skies made the huge, hard-to-hide machines vulnerable to attack and even mounting them inside rock formations couldn’t resist the Royal Air Force’s (RAF) new, ground-penetrating bombs.

Schwerer Gustav being readied for a test firing, Rügenwalde, Germany, 19 March 1943, Hitler standing second from the right with Albert Speer (1905–1981; Nazi court architect 1934-1942; Nazi minister of armaments and war production 1942-1945) to his right.  Hitler referred to the huge gun as “meine stählerne Faust” (my steel fist) but it never fulfilled his high expectations and like many of the gigantic machines which so fascinated the Führer (who treated complaints about their ruinous cost as “tiresome”), it was a misallocation of scarce resources.

It was the development of modern ballistic rockets during World War II (1939-1945) which put an end to big guns (although the Iraqi army did make a quixotic attempt to resurrect the concept, something which involved having a British company “bust” UN (United Nations) sanctions by claiming their gun barrel components were “oil pipes”), the Germans’ A4 (V-2) rocket the world’s first true long-range ballistic missile.  The V-2 represented a massive leap forward in both technology and military application and briefly it would touch “the edge of space” before beginning its ballistic trajectory, reaching altitudes of over 100 km (62 miles) before descending toward its target.  Everything in the field since has to some degree been an evolution of the V-2, the three previous landmarks being (1) the Chinese “Fire Arrows” of the early thirteenth century which were the most refined of the early gunpowder-filled rockets which followed a simple ballistic path, (2) the eighteenth century Indian Mysorean Rockets with the considerable advance of metal casings, the great range a shock to soldiers of the British East India Company who had become accustomed to enjoying a technology advantage and (3) the British Congreve Rockets of the early nineteenth century, essentially a refinement of the Mysorean design enhanced by improved metallurgy and aerodynamics and made more effective still when combined with the well organized logistics of the British military.

Thursday, October 3, 2024

Houndstooth

Houndstooth (pronounced houns-tuth)

(1) A two-colour fabric pattern of broken checks (multi-color versions using the pattern do now exist and are also so-described).

(2) Fabric with a houndstooth pattern; an item of clothing made with such fabric.

(3) In botany, as Cynoglossum officinale (houndstongue, houndstooth, dog's tongue, gypsy flower (and “rats and mice” due to its smell), a herbaceous plant of the family Boraginaceae.

1936: A word based on the appearance of the design; the pattern (in architecture, decorative art, fabric etc) is ancient but the descriptive term “houndstooth” has been in use only since 1936.  The shape is sometimes referred to as dogstooth (or dog's tooth) and in French it’s the more pleasing pied-de-poule (hen's foot), preferred also by the Italians.  In 1936 there must have been pedants who insisted it should have been “hound's tooth” because that does appear in some advertisements but in commercial use, houndstooth quickly was standardized.  The name was chosen as a direct reference to a dog’s tooth, not the pattern of teeth marks left by its bite.  The construct was hounds + tooth.  Hound was from the Middle English hound, from the Old English hund, from the Proto-West Germanic hund, from the Proto-Germanic hundaz and was cognate with the West Frisian hûn, the Dutch hond, the Luxembourgish Hond, the German Hund, the German Low German Hund, the Danish hund, the Faroese hundur, the Icelandic hundur, the Norwegian Bokmål hund, the Norwegian Nynorsk hund and the Swedish hund, from the pre-Germanic ḱuntós (which may be compared with the Latvian sùnt-ene (big dog)), an enlargement of the primitive Indo-European ḱwṓ (dog).  Elsewhere, the forms included the Old Irish cú (dog), the Tocharian B ku, the Lithuanian šuõ, the Armenian շուն (šun) and the Russian сука (suka).

In England, as late as the fourteenth century, “hound” remained the word in general use to describe most domestic canines while “dog” was used of a sub-type resembling the modern mastiff and bulldog.  By the sixteenth century, dog had displaced hound as the general descriptor, the latter coming to be restricted to breeds used for hunting and in the same era, the word dog was adopted by several continental European languages as their word for mastiff.  Dog was from the Middle English dogge (source also of the Scots dug (dog)), from the Old English dogga & docga of uncertain origin.  Interestingly, the original sense appears to have been of a “common dog” (as opposed to one well-bred), much as “cur” was later used and there’s evidence it was applied especially to stocky dogs of an unpleasing appearance.  Etymologists have pondered the origin: It may have been a pet-form diminutive with the suffix -ga (compare the similar models frocga (frog) & picga (pig)), appended to a base dog- or doc- (the origin and meaning of these unclear).  Another possibility is the Old English dox (dark, swarthy) (a la frocga from frog) while some have suggested a link to the Proto-West Germanic dugan (to be suitable), the origin of the Old English dugan (to be good, worthy, useful), the English dow and the German taugen; the theory is based on the idea that it could have been a child’s epithet for dogs, used in the sense of “a good or helpful animal”.  Few support that and more are persuaded there may be some relationship with docce (stock, muscle), from the Proto-West Germanic dokkā (round mass, ball, muscle, doll), from which English gained dock (stumpy tail).  In German, the form endures as der Hund (the dog) & die Hunde (the dogs) and the houndstooth pattern is Hahnentritt.  Houndstooth is a noun; the noun plural is houndsteeth.  Strictly speaking, it may be that certain uses of the plural (such as several houndstooth jackets) should be called “houndstooths” but this is an ugly word which should be avoided and no sources seem to list it as standard.  The same practice seems to have been adopted for handling the plural of cars called “Statesman”, “statesmen” seeming just an absurdity.

Although the classic black & white remains the industry staple, designer colors are now not uncommon.

In modern use in English, a “hound” seems to be thought of as a certain sort of dog, usually large, with a finely honed sense of smell and used (often in packs) for hunting and the sense development may also have been influenced by the novel The Hound of the Baskervilles (1901-1902) by the physician Sir Arthur Conan Doyle (1859–1930).  The best regarded of Conan Doyle’s four novels, it’s set in the gloomy fog of Dartmoor in England’s West Country and is the tale of the search for a “fearsome, diabolical hound of supernatural origin”.  The author's name is an example of how conventions of use influence things.  He's long been referred to as “Sir Arthur Conan Doyle” or “Conan Doyle” which would imply the surname “Conan Doyle” but his surname was “Doyle” and he was baptized with the Christian names “Arthur Ignatius Conan”, the “Conan” from his godfather.  Some academic and literary libraries do list him as “Doyle” but he's now referred to almost universally as “Conan Doyle” and the name “Arthur Doyle” would be as un-associated with him as “George Shaw” would be with George Bernard Shaw (GBS; 1856-1950).  Conan Doyle's most famous creation was of course the detective Sherlock Holmes, who wore a houndstooth deerstalker cap.  Tooth (a hard, calcareous structure present in the mouth of many vertebrate animals, generally used for biting and chewing food) was from the Middle English tothe, toth & tooth, from the Old English tōþ (tooth), from the Proto-West Germanic tanþ, from the Proto-Germanic tanþs (tooth), from the primitive Indo-European h₃dónts (tooth) and related to tusk.

Lindsay Lohan in monochrome check jacket, Dorchester Hotel, London, June 2017 (left), Lindsay Lohan in L.A.M.B. Lambstooth Sweater, Los Angeles, April 2005 (centre) and racing driver Sir Lewis Hamilton (b 1985) in a Burberry Houndstooth ensemble, Annual FIA Prize Giving Ceremony, Baku, Azerbaijan, December 2023 (right).  The Fédération Internationale de l'Automobile (the FIA; the International Automobile Federation) is world sport's dopiest regulatory body.  Although, at a distance, a wide range of fabrics look like houndstooth, some are really simple symmetrical, monochrome checks without the distinctive pattern and where designers have varied the shape, other descriptors (and L.A.M.B. couldn’t resist “lambstooth”) are used, something which helps also with product differentiation.  Sir Lewis though, sticks to the classics.  Regarded as the most fashion conscious of the Formula One drivers of his generation, it’s clear that assiduously he studies Lohanic fashion directions.

Designers consider houndstooth part of the plaid “family”, the jagged contours of the shape the point of differentiation from most, which tend towards uniform, straight lines.  Although from the archaeological record it’s clear the concept of the design has an ancient lineage, what’s now thought of as the “classic” black & white houndstooth was defined in the mid-nineteenth century when it began to be produced at scale in the Scottish lowlands, in parallel with the plaid most associated with the culture, the tartan (although in some aspects the “history & cultural traditions” of tartan were a bit of a commercial construct).  Technically, a houndstooth is a two-tone (the term monochrome often used in the industry to convey the idea of “black & white” (a la photography) rather than being etymologically accurate) plaid in four bands, two of each color (in both the weft & warp), woven with the simple 2:2 twill.  One of the charms of the design is that with slight variations in size and scale, different effects can be achieved and while color mixes are now not uncommon, the classic black & white remains the industry staple.
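Because the construction is so mechanical, the pattern can be simulated in a few lines; here is a minimal sketch assuming the conventional four-threads-per-band colouring (the band width and twill offset used are the standard figures, not ones given above):

```python
# Simulate the weave: bands of dark & light threads in both warp and weft,
# interlaced in a 2:2 twill; each cell shows whichever thread lies on top.
def houndstooth(rows=16, cols=16, band=4):
    dark = lambda n: (n % (2 * band)) < band   # thread colour by band
    for j in range(rows):                      # j indexes weft picks
        row = ""
        for i in range(cols):                  # i indexes warp ends
            warp_on_top = (i - j) % 4 < 2      # the 2:2 twill interlacing
            row += "##" if (dark(i) if warp_on_top else dark(j)) else "  "
        print(row)

houndstooth()    # band=2 yields the smaller check the trade calls "puppycheck"
```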

The history in the Lowlands is murky but it seems certain the early fabrics were woven from wool which makes sense given the importance of sheep to the economy and the early garments were utilitarian, often cloak-like outer garments for those tending the flocks.  The early term was “shepherd’s check” which became first “dogstooth” and then “houndstooth”, canine teeth something with which shepherds would have been familiar because of the threat to their animals from the predations of wild dogs.  Fabric with smaller checks could be called “puppycheck”.  Interestingly, despite its striking appearance, the houndstooth pattern remained a generic and was never adopted as a family or clan symbol, a la the tartans.  It gained a new popularity in the 1930s when photographs began to appear of members of the British royal family and various gentry wearing houndstooth jackets while hunting or riding, thus the association with wealth and privilege which so appealed to the middle class who started wearing them too.  By the time designers began to put them on the catwalks, houndstooth’s future was assured.

Houndstooth has received the imprimatur of more than one Princess of Wales: Catherine, Princess of Wales (b 1982, left) and Diana, Princess of Wales (1961-1997, right) in a typically daring color mix.

1969 Holden Monaro GTS 350 (left), 1972 Holden Monaro GTS 308 (centre) and 1977 Chrysler Cordoba (right).  Despite the popular perception, not all the “personal luxury” Chryslers of the era and not even all the Cordobas (1975-1983) were finished in “Fine Corinthian Leather”; except for a one-off appearance in the 1975 Imperial brochures, the Corinthian hides were exclusive to the Cordoba.

For passenger car interiors, houndstooth (rendered usually with a synthetic material) enjoyed a late mid-century spate of popularity, used for what were called generically “cloth inserts” and the use of houndstooth trended towards vehicles marketed as “sporty” whereas for luxury cars plusher fabrics like velour were preferred.  The cloth inserts seem only ever to have been paired with vinyl and not used with leather.

Houndstooth (left), Pepita (Shepherd's Check) (centre) and Vichy Check (right).

For decades, it’s been common to refer to the optional upholstery offered by Porsche in the 1960s as “houndstooth” but according to Recaro Automotive Seating, the German concern which supplied the fabric, the correct name is “Pepita” (known also as “Shepherd’s Check”), a design built with interconnected squares.  What has happened is that “houndstooth” has for most purposes in colloquial English become a generic term, used to describe anything “houndstoothesque” and it’s an understandable trend given that not only would a close examination be required to determine which pattern appears on a fabric but, unless one is well-acquainted with the differences in shape, most would be none the wiser.  Nor did Recaro use “Vichy Check” for the seats they trimmed for Porsche although that erroneous claim too is sometimes made.  Further confusing the history, when Stuttgarter Karosseriewerk Reutter (Porsche’s original supplier) started production of the seats used in the Porsche 356 (1948-1965), a number of fabrics were offered including one in nylon in a similar black-and-white pattern which was neither houndstooth nor Pepita.

1967 Porsche 911S, trimmed in Recaro Pepita.

The Reutter family founded Recaro in 1963 and in December that year the first Pepita pattern fabrics were made commercially available, used on the later Porsche 356Cs, the 911 (which briefly was called the 901) & the 912.  Porsche’s best known use of the pepita fabric was on the Recaro Sportsitz (Sport seat), first displayed at the 1965 Frankfurt Motor Show and they’re a prized part of the early 911S models, the first of which were delivered in the northern summer of 1966.  At that point, the Pepita fabric became a factory option for the 911 and the last use was in the Recaro Idealsitz (Ideal seat), offered only in 1970–71 in black & white, red & beige, brown & beige and blue & green.  In a nostalgic nod, Porsche returned Pepita seats to the option list for the 911 legacy model, released in 2013 to mark the car’s 50th anniversary although Recaro was not involved in the production.

Matching numbers, matching houndstooth: 1970 Holden HG GTS 350 Monaro in Indy Orange with black detailing (paint combo code 567-122040) and houndstooth cloth seat inserts in Indy Orange & black (trim code 1199-10Z).  This car (VIN: 81837GJ255169; Model: HG81837; Chassis: HG16214M) is one of the most prized Monaros because the specification includes a 350 cubic inch (5.7 litre) small block Chevrolet V8 (L48) with the “McKinnon block”, paired with the four-speed manual Saginaw gearbox.  Holden built 405 HG GTS 350s, 264 as manuals and 141 with the two-speed Powerglide automatic transmission.  The “McKinnon block” is a reference to the General Motors (GM) McKinnon Industries plant in St. Catharines, Ontario where the engines were built; the “American” cars exported to the UK, Australia, New Zealand and elsewhere in the Commonwealth often came from Canada because of the preferential tariff arrangements.

Very 1960s: GM's Indy Orange houndstooth fabric; in the US it was also offered in the Chevrolet Camaro.

Introduced in 1968, the Holden Monaro was the car which triggered Australia’s brief flirtation with big (in local terms, the cars were “compact” size in US nomenclature) coupés, a fad which would fade away by the mid 1970s.  It had been Ford which had first tested the market with a Falcon two-door hardtop (XM, 1964-1965 & XP, 1965-1966) but when the restyled model was released, it was again based on the US Falcon and the range no longer included a two-door hardtop, the wildly successful Mustang having rendered it unnecessary.  There was still a two-door Falcon sedan but it was thought to have limited appeal in Australia and was never offered so Ford didn’t have a model comparable with the Monaro until the XA Falcon Hardtop made its debut late in 1972.