Wednesday, October 9, 2024

Decker

Decker (pronounced dek-er)

(1) Something (typically a bus, ship, aircraft, bed, sandwich etc) having a specified number of decks, floors, levels, layers and such (used usually in combination with a numerical or other expression indicating the number in the construction: double decker, triple decker, upper decker, five decker etc (sometimes hyphenated)).

(2) As “table decker” an employee who “decks” (ie sets or adorns) a table used for entertaining (used also as a “coverer”) (archaic).  The idea lives on in the verb “bedeck” (to adorn).

(3) In boxing slang, a fighter with a famously powerful punch, able to “deck” an opponent (ie knock them to the canvas with a single punch).

(4) In historic naval slang, as “quarter-decker”, a label applied to officers known more for their attention to matters of etiquette or trivial regulations than competent seamanship or ability in battle.  It was an allusion to a warship’s “quarter deck” (the part of the spar-deck of a man-of-war (warship) between the poop deck and main-mast and originally (dating from the 1620s) a smaller deck above the half-deck, covering about a quarter of the vessel’s LOA (length overall)).  In many navies, the quarter-deck was reserved as “a promenade for officers only”.

1785–1795: The construct was deck + -er.  Deck in this context was from the Middle English dekke (covering extending from side to side over part of a ship), from a nautical use of the Middle Dutch decke & dec (roof, covering), from the Middle Dutch decken, from the Proto-Germanic thakam (source also of the noun “thatch”), from the primitive Indo-European root steg- & teg- (to cover); it was related also to the Old Dutch thecken, from the Proto-West Germanic þakkjan, from the Proto-Germanic þakjaną, and to the German Decke (covering, blanket).  The –er suffix was from the Middle English –er & -ere, from the Old English -ere, from the Proto-Germanic -ārijaz, thought most likely to have been borrowed from the Latin –ārius where, as a suffix, it was used to form adjectives from nouns or numerals.  In English, the –er suffix, when added to a verb, created an agent noun: the person or thing doing the action indicated by the root verb.  The use in English was reinforced by the synonymous but unrelated Old French –or & -eor (the Anglo-Norman variant -our), from the Latin -ātor & -tor, from the primitive Indo-European -tōr.  When appended to a noun, it created a noun denoting an occupation or describing the person whose occupation is the noun.  The noun double-decker was first used in 1835 of ships with two decks above the water line and this extended to land transport (trains) in 1867.  Decker is a noun & adjective; the noun plural is deckers.

Flight deck of the US Navy's Nimitz-class aircraft carrier USS Carl Vinson (CVN 70).

The reason ships, trains, buses, aircraft and such have “decks” while buildings have “floors” or “stories (or storeys)” is traceable to nautical history and the nomenclature used in shipbuilding.  English picked up “deck” from the Middle Dutch decke & dec (roof, covering), where the use had been influenced by the Old Norse þekja (to cover).  In early shipbuilding, a “deck” was the structure which covered the hull of the ship, providing both a horizontal “working surface” and enclosing the vessel, creating a space for stores, cargo or accommodation which was protected from the elements; in that sense the first nautical decks acted as a “roof”.  As ships became larger, naval architects began to include multiple decks, analogous with the floors of buildings in that they fulfilled a similar function, providing segregated layers (ie the storeys in buildings) used for cannons, crew quarters, storage and such.  As the terminology of shipbuilding became standardized, each deck came to have a specific name depending on its purpose or position (main deck, flight deck, poop deck, gun deck etc).

Ford Mustang convertible (1965–1973) replacement floor pan (complete, part number 3648B) by Moonlight Drive Sheet Metal.

Until the nineteenth century, although the vehicles used on land became larger, they tended to get longer rather than higher, but the advent of steam propulsion made possible trains which ran on railways and these could pull carriages carrying freight or passengers.  The first “double decker” versions appeared in France in 1867; described as voitures à impériale (imperial cars), they were used on the Chemin de Fer de l'Ouest (Western Railway), the upper deck roofless and thus an “open-air experience”.  Rapidly the idea spread and double-deck carriages became common for both long-distance and commuter services.  An outlier in the terminology is car design; cars have a floor (sometimes called the “floor pan”) rather than a deck, presumably because there’s only ever one.  In the narrow technical sense there have been cars with “two floors” but they were better understood as a “double-skinned” single floor and they were used for armor or to provide a space for something specialized such as hydrogen fuel-cells, the technique often called “sandwich construction”.

Boeing 314 Clipper flying boat cutaway (left) and front schematics of Boeing 747-300 (right).  Re-using some of an earlier design for a bomber which failed to meet the military’s performance criteria, between 1938-1941, Boeing built twelve 314 Clippers, long-range flying boats with the range to cross both the Atlantic and Pacific oceans.  Although used by the military during World War II, most of their service was with the two commercial operators Pan Am (originally Pan American Airways) and BOAC (British Overseas Airways Corporation).  Very much a machine of the pre-war age, the last Clippers were retired from service between 1946-1948, the advances in aviation and ground infrastructure built during war-time rendering them obsolete and prohibitively expensive to maintain.

Because train designers adopted the nautical terminology, it naturally came to be used also of buses and aircraft, the term “flight deck” (where the pilot(s) sat) common even before multiple decks appeared on flying boats and other long-distance airframes.  The famous “bubble” of the Boeing 747 (1968-2023) remains one of the best known decks and, although most associated with the glamour of first-class international travel, it was designed originally as a freight compartment.  The multi-deck evolution continued and the Airbus A380 (2005-2021) was the first “double decker” with two passenger decks extending the full length of the fuselage (cargo & baggage carried in the space beneath), hence the frequent description of the thing as a “triple decker”.

Lindsay Lohan contemplating a three decker sandwich, now usually called a “club sandwich”.  Many menus do specify the number of decks in their clubs.

Deck widely was used of any raised flat surface upon which people could walk or stand (balcony, porch, patio, flat rooftop etc) and came to be used of the floor-like covering of the horizontal sections or compartments of a ship, a use later extended to land transport (trains, buses etc) and, in the twentieth century, to aircraft.  A pack or set of playing cards can be called a deck as (less commonly) can the dealt cards which constitute the “hand” of each player and the notion was extended to sets of just about anything vaguely similar (such as a collection of photographic slides).  Because slides tended to be called a “deck” only when in their magazine, this influenced the later use in IT when certain objects digitally were assembled for storage or use, and in audio and video use when cartridges or cassettes were loaded into “tape decks”.  In print journalism, a deck is a headline consisting of one or more full lines of text (applied especially to a sub-headline).  The slang use in the trade of illicit narcotics to describe the folded paper used for distributing drugs was a US regionalism.  There are dozens of idiomatic and other uses of deck, the best known including “all hands on deck”, “swab the decks”, “hit the deck”, “clear the decks”, “deck-chair”, “deckhand”, “deck shoes”, “flight deck”, “gun deck”, “observation deck”, “play with a full deck”, “promenade deck”, “re-arranging the deck chairs on the Titanic”, “decked out”, “stack the deck”, “sun deck”, “top deck” & “to deck someone”.

Schematic of the Royal Navy’s HMS Victory, a 104-gun first-rate ship of the line, laid down in 1759 and launched in 1765, most famous as Admiral Lord Nelson’s (1758-1805) flagship at the Battle of Trafalgar on 21 October 1805; it was on her that Nelson was killed in battle.  Uniquely, after 246 years on the active list, she is the world's oldest naval vessel still in commission.  Although the term wasn’t in use until the 1830s, Victory was a “five decker” configured thus:

Orlop Deck: The lowest deck, mainly used for storage and ship's equipment.
Lower Gun Deck: The deck housing the heaviest cannons.
Middle Gun Deck: This deck contained another set of guns, slightly lighter than those on the lower gun deck.
Upper Gun Deck: The third level of guns, with even lighter cannons.
Quarterdeck and Forecastle: The uppermost decks, where the captain and officers usually directed the ship during battle.

The early meanings in English evolved from “covering” to “platform of a ship” because of the visual similarity and it’s thought the idea of a deck being a “pack of cards” (noted in the 1590s) was based on them being stacked like the decks of a multi-deck man-of-war (warship).  The tape-deck was first so described in 1949 and was a reference to the flat surface of the old reel-to-reel tape recorders.  The first deck chairs were advertised in 1844, an allusion to the use of such things on the decks of passenger ocean liners, and deck shoes were those with sturdy rubber soles suitable for use on slippery surfaces; the modern “boat shoes” are a descendent.  The old admiralty phrase “clear the decks” dated from the days of the tall-masted warships (the best known of which was the big “ship-of-the-line”) and was a reference to the need to remove from the main deck the wreckage resulting from an attack (dislodged masts, sails, spars etc) to enable the battle to be rejoined without the obstructions.  Being made of wood, the ships were hard to sink but highly susceptible to damage, especially to the rigging which, upon fragmentation, tended to fall to the deck.  It may have been an adaptation of the French army slang débarrasser le pont (clear the bridge).

Ford 302 cubic inch (4.9 litre) Windsor V8 with the standard deck (left) and the raised deck 351 (5.8) (right).  In production in various displacements between 1961-2000, the 221 (3.6), 255 (4.2), 260 (4.3), 289 (4.7) & 302 (4.9) all used what came retrospectively to be called the “standard deck” while the 351 (5.8) was the sole “raised deck” version.

For decades, it was common for US manufacturers to increase the displacement of their V8 engines by creating a “raised deck” version, the process involving raising the height of the engine block's deck surface (the surface where the cylinder heads bolt on).  What this allowed was the use of longer connecting rods while retaining the original heads and pistons which, in combination with a “longer stroke crankshaft”, increases the displacement (the aggregate volume of all cylinders).  The industry slang for such things was “decker” and the technique was used with other block configurations but is best known from the use in the 1960s & 1970s for V8s because it’s those which tend to be fetishized.  The path to greater displacement lay either in lengthening the stroke or increasing the bore (or a combination of the two) and while there were general engineering principles (longer stroke=emphasis on more torque at the cost of reducing maximum engine speed and bigger bore=more power and higher engine speeds), there were limitations in how much a bore could safely be increased, including the available metal.  A bigger bore (ie increasing the internal diameter of the cylinder) reduces the thickness of the cylinder walls and if they become too thin, there can be problems with cooling, durability or even the structural integrity of the block.  The piston size also increases which means the weight increases and thus so too does the reciprocating mass, increasing friction and wear and potentially compromising reliability, especially at high engine speeds.
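The arithmetic behind the stroke-versus-bore trade-off is simple enough to sketch; the short Python fragment below assumes the commonly quoted specifications for the two Windsors (a shared 4.00 inch bore, with 3.00 inch and 3.50 inch strokes for the 302 & 351 respectively) and is offered as an illustration rather than a workshop reference:

```python
import math

def displacement_cubic_inches(bore, stroke, cylinders):
    """Aggregate swept volume: circular area of the bore times the stroke, per cylinder."""
    return (math.pi / 4) * bore ** 2 * stroke * cylinders

# Both engines share a 4.00 in bore; the raised-deck 351 gains its extra
# displacement purely from the longer (3.50 in vs 3.00 in) stroke.
ford_302 = displacement_cubic_inches(4.00, 3.00, 8)   # ≈ 301.6 cu in
ford_351 = displacement_cubic_inches(4.00, 3.50, 8)   # ≈ 351.9 cu in
```

As the figures suggest, the marketing designations (“302” & “351”) were roundings of the computed values, the extra fifty-odd cubic inches coming entirely from the lengthened stroke the raised deck made practicable.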

Increasing the stroke will usually enhance the torque output, something of greater benefit to most drivers, most of the time, than the “top end power” most characteristic of the “big bore” approach.  In street use, most engines spend most time at low or mid-range speed and it’s here a longer stroke tends to produce more torque, so it has been a popular approach and the advantage for manufacturers is that creating a “decker” almost always is easier, faster and cheaper than arranging one which will tolerate a bigger bore, something which can demand a new block casting and sometimes changes to the physical assembly line.  With a raised deck, there can be the need to use different intake and exhaust manifolds and some other peripheral components but it’s still usually a cheaper solution than a new block casting.  Ford’s “thinwall” Windsor V8 was one of the longest-serving deckers (although the raised-deck version didn’t see out the platform’s life, the 351 (introduced in 1969) being retired in 1997).  Confusingly, during the Windsor era, Ford also produced other 351s which belonged to a different engine family.  Ford didn’t acknowledge the biggest Windsor's raised deck in its designation but when Chrysler released a decker version of the “B Series” big-block V8 (1958-1978), it was designated “RB” (Raised B) and produced between 1959-1979.

1964 AEC Routemaster double decker Bus RM1941 (ALD941B) (left), two sightseeing AEC Routemasters in Christchurch, New Zealand (centre) and one of the "new" Routemasters, London 2023 (right).

London’s red, double-decker buses are one of the symbols most associated with the city and a fixture in literature, art and films needing something with which to capture the verisimilitude.  The classic example of the breed was the long-running AEC Routemaster, designed by the London Transport Board and built by the Associated Equipment Company (AEC) and Park Royal Vehicles.  The Routemaster entered service in 1956 and remained in production until 1968, changed over those years in many details but visually there was such continuity that it takes an expert (and buses are a thing so experts there are) to pick the model year.  The type remained in regular service until 2005 although some were retained as “nostalgia pieces” on designated “tourist” routes until COVID-19 finally saw their retirement; since then, many have been repurposed for service around the world on sightseeing duties and other tourist projects.

Boris Johnson (b 1964; UK prime-minister 2019-2022) will leave an extraordinary political legacy which in time might come to be remembered more fondly than now may appear, but one of his most enduring achievements is likely to be the “New Routemaster” which had the typically bureaucratic project name “New Bus for London” but came to be known generally as the “Boris Bus”, the honor accorded by virtue of him championing the idea while serving as Mayor of London (2008-2016).  In truth, the original Routemaster, whatever its period charm, was antiquated years before it was withdrawn from service and although the doorless design made ingress and egress convenient, it was also dangerous and apparently a dozen passenger fatalities annually was not uncommon.  The Boris Bus entered service in 2012 and by 2024 almost 1200 were in service.

1930 Lancia Omicron with 2½ deck coachwork and a clerestoried upper windscreen (left) and a “three decker” bus in Pakistan (right).

The Lancia Omicron was a bus chassis produced between 1927-1936; over 600 were built in different wheelbase lengths with both two and three-axle configurations.  Most used Lancia's long-serving, six-cylinder commercial engine but, as early as 1933, some had been equipped with diesel engines which were tested in North Africa where they proved durable and, in the Sudan, Ethiopia, Libya and Algeria, once petrol-powered Omicron chassis were being re-powered with diesel power-plants from a variety of manufacturers as late as the 1960s.  As is typical in bus use, coachbuilders fabricated many different styles of body but, in addition to the usual single and double deck arrangements, the Omicron is noted for a number of two and a half deck models, the third deck configured usually as a first-class compartment but in at least three which operated in Italy, they were advertised as “smoking rooms”, the implication presumably that the rest of the passenger compartment was smoke-free.  History doesn't record if the bus operators were any more enthusiastic about or successful in enforcing smoking bans than the usual Italian experience.  For a variety of reasons, buses with more than 2½ decks were rare and the Lancias and Alfa Romeos which first emerged in the 1920s were unusual.  However, the famously imaginative and inventive world of Pakistani commerce has produced a genuine “three decker” bus, marketed as the “limousine bus”.  What the designer did was take a long-distance, double decker coach and configure the space usually allocated to a luggage compartment as the interior of a long wheelbase (LWB) limousine, thereby creating a “first class” section, the four rows of seating accessible via six car-like (ie limousine) doors.

Tuesday, October 8, 2024

Decorum

Decorum (pronounced dih-kawr-uhm or dih-kohr-uhm)

(1) Dignified propriety of behavior, speech, dress, demeanour etc.

(2) The quality or state of being decorous, or exhibiting such dignified propriety; orderliness; regularity.

(3) The conventions of social behaviour; an observance or requirement of one’s social group (sometimes in the plural as “decorums” the use an allusion to the many rules of etiquette (the expectations or requirements defining “correct behaviour” which, although most associated with “polite society”, do vary between societal sub-sets, differing at the margins)).

1560–1570: A learned borrowing (in the sense of “that which is proper or fitting in a literary or artistic composition”) from the Latin decōrum, noun use of the neuter of decōrus (proper, decent (ie decorous)), from decor (beauty, elegance, charm, grace, ornament), probably from decus (an ornament; splendor, honor), the Proto-Italic dekos (dignity), from the primitive Indo-European os (that which is proper), from de- (take, perceive) (and used in the sense of “to accept” on the notion of “to add grace”).  By the 1580s the use of decorum had spread from its literary adoption from the Latin to the more generalized sense of “propriety of speech, behavior or dress; formal politeness”, a resurrection of the original sense in Latin (polite, correct in behaviour, that which is seemly).  Decorously (in a decorous manner) is an adverb, decorousness (the state or quality of being decorous; a behavior considered decorous) is a noun, and indecorous (improper, immodest, or indecent) and undecorous (not decorous) are adjectives.  The adjective dedecorous (disgraceful; unbecoming) is extinct.  Decorum is a noun; the noun plural is decora or decorums.

Whether on rugby pitches, race tracks, in salons & drawing rooms or geo-politics, disagreements over matters of decorum have over millennia been the source of innumerable squabbles, schisms and slaughter but linguistically, the related adjective decorous (characterized by dignified propriety in conduct, manners, appearance, character, etc) has also not been trouble-free.  Decorous seems first to have appeared in the 1650s, from the Latin decōrus and akin to both decēre (to be acceptable, be fitting) and docēre (to teach (in the sense of “to make fitting”)), with the adjectival suffix –ōsus appended.  In Latin, the -ōsus suffix (full, full of) was a doublet of -ose in an unstressed position and was used to form adjectives from nouns, to denote possession or presence of a quality in any degree, commonly in abundance.  English picked this up from the Middle English -ous, from the Old French –ous & -eux, from the Latin -ōsus and it became productive.  In chemistry, it has a specific technical application, used in the nomenclature to name chemical compounds in which a specified chemical element has a lower oxidation number than in the equivalent compound whose name ends in the suffix -ic.  For example sulphuric acid (H2SO4) has more oxygen atoms per molecule than sulphurous acid (H2SO3).  Decorous is an adjective, decorousness is a noun and decorously is an adverb.

In use there are two difficulties with decorous: (1) the negative forms and (2) how it should be pronounced, both issues with which mercifully few will be troubled (or even see what the fuss is about) but to a pedantic subset, much noted.  The negative forms are undecorous & indecorous (both of which rarely are hyphenated) but the two differ in meaning.  Undecorous means simply “not decorous” which can be bad enough but indecorous is used to convey “improper, immodest, or indecent” which truly can be damning in some circles, so the two should be applied carefully.  There’s also the negative nondecorous but it seems never to have been a bother.  The problem is made worse by the adjective dedecorous (disgraceful; unbecoming) being extinct; it would have been a handy sort of intermediate state between the “un-” & “in-” forms and the comparative (more dedecorous) & superlative (most dedecorous) would have provided all the nuance needed.  The related forms are the nouns nondecorousness, undecorousness & indecorousness and the adverbs nondecorously, undecorously & indecorously.

The matter of the pronunciation of decorous is one for the pedants but there’s a lot of them about and like décor, the use is treated as a class-identifier, the correlation between pedantry and class-identifiers probably high; the two schools of thought are  dek-er-uhs & dih-kawr-uhs (the second syllable -kohr- more of a regionalism) and in 1926 when the stern Henry Fowler (1858–1933) published his A Dictionary of Modern English Usage, he in his prescriptive way insisted on the former.  By 1965, when the volume was revised by Sir Ernest Gowers (1880–1966), he noted the “pronunciation has not yet settled down”, adding that “decorum pulls one way and decorate the other”.  In his revised edition, Sir Ernest distinguished still between right & wrong (a position from which, regrettably, subsequent editors felt inclined to retreat) but had become more descriptive than his predecessor of how things were done rather than how they “ought to be” done and added while “most authorities” had come to prefer dih-kawr-uhs, that other arbiter, the Oxford English Dictionary (OED) had listed dek-er-uhs first and it thus “may win”.  By the 2020s, impressionistically, it would seem it has.

Décor is another where the pronunciation can be a class-identifier and in this case it extends to the spelling, something directly related.  In English, the noun décor dates from 1897 in the sense of “scenery and furnishings” and was from the eighteenth century French décor, a back-formation from the fourteenth century décorer (to decorate), from the Latin decorare (to decorate, adorn, embellish, beautify), the modern word thus duplicating the Latin decor.  The original use in English was of theatre stages and such but the term “home décor” was in use late in the 1890s to describe the technique of hanging copies of old masters as home decoration.  From this evolved the general use (decorations and furnishings of a room, building etc), well established by the mid 1920s and it’s been with us ever since.  Typically sensibly, the French l'accent aigu (acute accent) (the “é” pronounced ay in French) was abandoned by the Americans without corroding society but elsewhere, décor remained preferred among certain interior decorators and their clients, the companion French pronunciation obligatory too.

Courtroom decorum: Lindsay Lohan arriving at court, Los Angeles, 2011-2013.  All the world's a catwalk.

Top row; left to right: 9 Feb 2011; 23 Feb 2011; 10 Mar 2011; 22 Apr 2011.
Centre row; left to right: 23 Jun 2011; 19 Oct 2011; 2 Nov 2011; 14 Dec 2011.
Bottom row; left to right: 17 Dec 2011; 30 Jan 2012; 22 Feb 2012; 28 Mar 2012.

In English, the original use of decorum was in the technical jargon of what would come to be called literary theory, decorum describing a structuralist adherence to formal convention.  It was applied especially to poetry where rules of construction abound and it was about consistency with the “canons of propriety” (in this context defined usually as “good taste, good manners & correctness” which in our age of cultural (and linguistic) relativism is something many would label as “problematic” but all are free to “plug-in” their own standards).  Less controversially perhaps, decorum was understood as the matter of behavior on the part of the poet qua (“in the capacity or character of; as being”, drawn from the legal Latin qua (acting in the capacity of, acting as, or in the manner of)) their poem and therefore what is proper and becoming in the relationship between form and substance.  That needs to be deconstructed: decorum was not about what the text described because the events variously could be thought most undecorous or indecorous but provided the author respected the character, thought and language appropriate to each, the literary demands of decorum were satisfied.  Just as one would use many different words to describe darkness compared to those used of sunlight, a work on a grand and profound theme should appear in a dignified and noble style while the trivial or humble might be earthier.

The tradition of decorum is noted as a theme in the works of the Classical authors from Antiquity but the problem there is that we have available only the extant texts and they would be but a fragment of everything created; it’s acknowledged there was much sifting and censoring undertaken in the Medieval period (notably by priests and monks who cut out “the dirty bits”) and it’s not known how much was destroyed because it was thought “worthless” or, worse, “obscene”.  What has survived may be presumed to be something of the “best of” Antiquity and there’s no way of knowing if in Athens and Rome there were proto-post modernists who cared not a fig for literary decorum.  The Greek and Roman tradition certainly seems to have been influential however because decorum is obvious in Elizabethan plays.  In William Shakespeare’s (1564–1616) Much Ado About Nothing (circa 1598), the comic passages such as the badinage between Beatrice and Benedick appear for amusing effect in colloquial dramatic prose while the set-piece romantic episodes are in formal verse; the very moment Benedick and Beatrice realize they are in love, that rise in the emotional temperature is signified by them suddenly switching to poetic verse.

Lindsay Lohan and her lawyer in court, Los Angeles, December, 2011.

By contrast, in rhetoric, the conventions of literary decorum were probably most useful when being flouted.  Winston Churchill’s (1875-1965; UK prime-minister 1940-1945 & 1951-1955) World War II (1939-1945) speeches are remembered now for their eloquence and grandeur but there’s much evidence that at the time many listeners regarded their form as an anachronism and preferred something punchier; what made them effective was the way he could mix light & dark, high and low, to lend his words a life which transcended the essential artificiality of a speech.  Once, when discussing a serious matter of international relations and legal relationships between formerly belligerent powers, he paused to suggest that while Germany might be treated harshly after all that had happened, the Italians “…might be allowed to work their passage back.” [to the community of the civilized world].  What the flouting of decorum could do was make something worthy but dull seem at least briefly interesting or at least amusing, avoiding what the British judge Lord Birkett (1883–1962) would have called listening to “the ‘refayned’ and precious accents of a decaying pontiff”.

In English literature, it was during the seventeenth & eighteenth centuries that decorum became what might now be called a fetish, a product of the reverence for what were thought to be the “Classical rules and tenets” although quite how much these owed to a widespread observance in Antiquity and how much to the rather idealized picture of the epoch painted by medieval and Renaissance scholars really isn’t clear.  Certainly, in the understanding of what decorum was there were influences ancient & modern, Dr Johnson (Samuel Johnson (1709-1784)) observing that while terms like “cow-keeper” or “hog-herd” would be thought too much the vulgar talk of the peasantry to appear in “high poetry”, to the Ancient Greeks there were no finer words in the language.  Some though interpolated the vulgarity of the vernacular just because of the shock value the odd discordant word or phrase could have, the English poet Alexander Pope (1688-1744) clearly enjoying mixing elegance, wit and grace with the “almost brutal forcefulness” of “the crude, the corrupt and the repulsive” and it’s worth noting he made his living also as a satirist.  His example must have appealed to the Romantic poets because they sought to escape the confines imposed by the doctrines of Neoclassicism, William Wordsworth (1770–1850) writing in the preface to Lyrical Ballads (1798, co-written with Samuel Taylor Coleridge (1772-1834)) that these poems were there to rebel against “false refinement” and “poetic diction”.  He may have had in mind the odd “decaying pontiff”.

Sunday, October 6, 2024

Monolith

Monolith (pronounced mon-uh-lith)

(1) A column, large statue etc, formed originally from a single block of stone; latter day use applies the term to structures formed from any material and not of necessity a single piece (although technically such a thing should be described using the antonym: “polylith”).

(2) Used loosely, a synonym of “obelisk”.

(3) A single block or piece of stone, especially when used in architecture or sculpture and applied most frequently to large structures.

(4) Something (or an idea or concept) having a uniform, massive, redoubtable, or inflexible quality or character.

(5) In architecture, a large hollow foundation piece sunk as a caisson and having a number of compartments that are filled with concrete when it has reached its correct position.

(6) An unincorporated community in Kern County, California, United States (initial capital).

(7) In chemistry, a substrate having many tiny channels that is cast as a single piece, which is used as a stationary phase for chromatography, as a catalytic surface etc.

(8) In arboreal use, a dead tree whose height and size have been reduced by breaking off or cutting its branches (use rare except in UK horticultural use).

1829: The construct was mono- + lith.  Mono was from the Ancient Greek, a combining form of μόνος (monos) (alone, only, sole, single), from the Proto-Hellenic mónwos, from the primitive Indo-European mey- (little; small).  It was related to the Armenian մանր (manr) (slender, small), the Ancient Greek μανός (manós) (sparse, rare), the Middle Low German mone & möne, the West Frisian meun, the Dutch meun and the Old High German muniwa, munuwa & munewa (from which German gained Münne (minnow)).  As a prefix, mono- is often found in chemical names to indicate a substance containing just one of a specified atom or group (eg a monohydrate such as carbon monoxide; carbon attached to a single atom of oxygen).

In English, the noun monolith was from the French monolithe (object made from a single block of stone), from the Middle French monolythe (made from a single block of stone) and their etymon the Latin monolithus (made from a single block of stone), from the Ancient Greek μονόλιθος (monólithos) (made from a single block of stone), the construct being μονο- (mono-) (the prefix appended to convey the meaning “alone; single”), from μόνος (monos) + λίθος (líthos) (a stone; stone as a substance).  The English form was cognate with the German Monolith (made from a single block of stone).  The verb was derived from the noun.  Monolith is a noun & verb, monolithism, monolithicness & monolithicity are nouns, monolithic is an adjective and monolithically is an adverb; the noun plural is monoliths.  The adjective monolithal is listed as "an archaic form of monolithic".

Monolith also begat two analogous formations in the technical jargon of archaeology.  A “microlith” is (1) a small stone tool (sometimes called a “microlite”) and (2) one of the microscopic acicular components of rocks.  A “megalith” is (1) a large stone slab making up a prehistoric monument, or part of such a monument, (2) a prehistoric monument made up of one or more large stones and (3) by extension, a large stone or block of stone used in the construction of a modern structure.  The terms seem not to be in use outside of the technical literature of the profession.  The transferred and figurative use in reference to a thing or person noted for indivisible unity is from 1934 and is now widely used in IT, political science and opinion polling.  The adjective monolithic (formed of a single block of stone) was in use by the early nineteenth century and within decades was used to mean “of or pertaining to a monolith”, the figurative sense noted since the 1920s.  The adjective prevailed over monolithal which seems first to have appeared in a scientific paper in 1813.  The antonym in the context of structures rendered from a single substance is “polylith” but use is rare and multi-component constructions are often described as “monoliths”.  The antonym in the context of “anything massive, uniform, and unmovable, especially a towering and impersonal cultural, political, or social organization or structure” is listed by many sources as “chimera” but terms like “diverse”, “fragmented” etc are usually more illustrative for most purposes.  In general use, there certainly has been something of a meaning-shift: while “monolith” began as meaning “made of a single substance”, it's now probably most used to convey the idea of “something big & tall” regardless of the materials used.

One of the Monoliths as depicted in the film 2001: A Space Odyssey (1968). 

The mysterious black structures in Sir Arthur C Clarke's (1917–2008) Space Odyssey series (1968-1997) became well known after the release in 1968 of Stanley Kubrick's (1928–1999) film of the first novel in the series, 2001: A Space Odyssey.  Although sometimes described as “obelisks”, the author noted they were really “monoliths”.  In recent years, enthusiasts, mischief makers and click-bait hunters have been erecting similar monoliths in remote parts of planet Earth, leaving them to be discovered and publicized.  With typical alacrity, modern commerce noted the interest and soon, replicas were being offered for sale, a gap in the market apparently identified for Christmas gifts priced between US$10,000-45,000.

The terms “obelisk” and “monolith” are sometimes used interchangeably and while in the case of many large stone structures this can be appropriate, the two terms have distinct meanings.  Classically, an obelisk is a tall, four-sided, narrow pillar that tapers to a pyramid-like point at the top.  Obelisks often are carved from a single piece of stone (and are thus monolithic) but can also be constructed in sections and archaeologists have discovered some of the multi-part structures exist by virtue of necessity; intended originally to be a single piece of stone, the design was changed after cracks were detected.  A monolith is a large single block of stone which can be naturally occurring (such as a large rock formation) or artificially shaped; monoliths take many forms, including obelisks, statues and even buildings.  Thus, while an obelisk can be a monolith, not all monoliths are obelisks.

Highly qualified German content provider Chloe Vevrier (b 1968) standing in front of the Luxor Obelisk, Paris 2010.

The Luxor Obelisk sits in the centre of the Place de la Concorde, one of the world’s most photographed public squares.  Of red granite, 22.5 metres (74 feet) in height and weighing an estimated 227 tonnes (250 short (US) tons), it is one of a pair, the other still standing in front of the first pylon of the Luxor Temple on the east bank of the Nile River, Egypt.  The obelisk arrived in France in May 1833 and less than six months later was raised in the presence of Louis Philippe I (1773–1850; King of the French 1830-1848).  The square hadn’t always been a happy place for kings to stand; in 1789 (then known as the Place de Louis XV) it was one of the gathering points for the mobs staging what became the French Revolution and after the storming of the Bastille (one of history’s less dramatic events despite the legends), the square was renamed Place de la Revolution, living up to the name by being the place where Louis XVI (1754–1793; King of France 1774-1792), Marie Antoinette (1755–1793; Queen Consort of France 1774-1792) and a goodly number of others were guillotined.  Things were calmer by 1833 when the obelisk was erected.

The structure was a gift to France from Pasha Mehmet Ali (1769–1849, Ottoman Albanian viceroy and governor of Egypt 1805-1848) and in return Paris sent a large mechanical clock which to this day remains in place in the clock tower of the mosque at the summit of the Citadel of Cairo; of the 28 surviving ancient Egyptian obelisks, six remain in Egypt with the rest in various displays around the world.  Some 3000 years old, in its original location the Obelisk contributed to scientific history when in circa 250 BC the Greek geographer & astronomer Eratosthenes of Cyrene (circa 276 BC–circa 195 BC) used the shadow it cast to calculate the circumference of the Earth.  By comparing the shadow at a certain time with one in Alexandria, he concluded that the angular difference between Alexandria and Aswan was seven degrees and 14 minutes and from this he could work out the Earth’s circumference.
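Eratosthenes' arithmetic can be reproduced in a few lines.  The sketch below is illustrative only: the 7°14' angle is from the account above, while the 5,000-stadia distance between the two cities and the 157.5 m length of a stadion are commonly cited modern assumptions, not figures given in this text.

```python
import math  # (not strictly needed; shown for clarity of the geometry)

# The shadow-angle difference between two cities on (roughly) the same
# meridian is the same fraction of a full 360-degree circle as the
# distance between them is of the Earth's circumference.
angle_deg = 7 + 14 / 60        # 7 degrees 14 minutes, per the account above
distance_stadia = 5_000        # assumed Alexandria-Syene distance
stadion_m = 157.5              # one common modern estimate of the stadion

circumference_stadia = 360 / angle_deg * distance_stadia
circumference_km = circumference_stadia * stadion_m / 1000

print(f"{circumference_km:.0f} km")  # vs the true polar circumference of ~40,008 km
```

Whatever the exact length of the stadion (estimates vary), the method lands remarkably close to the modern value, which is why the measurement is celebrated.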

Monolithic drivers

In IT, the term “monolithic driver” was used to refer to a software driver designed to handle multiple hardware components or functionalities within a single, large, cohesive codebase.  In this it differed from earlier (and later) modular or layered approaches, in which the functionality is split into separate, smaller drivers or modules, each handling specific tasks or addressing only certain hardware components.  Monolithic drivers became generally available in the late 1980s, a period when both computer architecture and operating systems were becoming more sophisticated in an attempt to overcome the structural limitations imposed by the earlier designs.  It was in this era many of the fundamental concepts which continue to underpin modern systems were conceived although the general adoption of some lay a decade or more away.
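The structural contrast can be sketched in a few lines of Python.  This is purely illustrative, assuming nothing about any real operating system's driver model: every device and function name here is hypothetical.

```python
# Monolithic style: one large handler with the logic for every device
# interleaved in a single codebase; supporting new hardware means
# editing (and risking) this one block.
def monolithic_driver(device, operation, data=None):
    if device == "serial":
        if operation == "read":
            return "serial bytes"
        if operation == "write":
            return f"serial wrote {data!r}"
    elif device == "printer":
        if operation == "write":
            return f"printed {data!r}"
    raise ValueError(f"unsupported: {device}/{operation}")

# Modular style: each device ships a small driver implementing a shared
# interface; the OS merely dispatches through a registry, so a faulty
# module can be replaced without touching the others.
class SerialDriver:
    def read(self):
        return "serial bytes"
    def write(self, data):
        return f"serial wrote {data!r}"

class PrinterDriver:
    def write(self, data):
        return f"printed {data!r}"

registry = {"serial": SerialDriver(), "printer": PrinterDriver()}

def dispatch(device, operation, *args):
    return getattr(registry[device], operation)(*args)
```

The maintainability argument made below follows directly from the shape of the code: in the monolithic version every change touches the shared block, while in the modular version each device's failure modes are isolated behind the registry.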

During the 1970s & 1980s, many systems were built with a tight integration between software and hardware and some operating systems (OS) were really little more than “file loaders” with a few “add-ons”; the limitations imposed were “worked around” by some programmers who more-or-less ignored the operating system and addressed the hardware directly using “assemblers” (writing in assembly language, a human-readable notation for machine code).  That approach made for fast software but at the cost of interoperability and compatibility, such creations being hardware-specific rather than relying on the OS as what came to be known as the HAL (hardware abstraction layer); at the time, few OSs were like UNIX with its monolithic kernel in which the core OS services (file system management, device drivers etc) were all integrated into a single large codebase.  As the market expanded, it was obvious the multi-fork approach was commercially unattractive except for the odd niche.

After its release in 1981, use of the IBM personal computer (PC) proliferated and because of its open (licence-free) architecture, an ecosystem of third-party suppliers arose, producing a remarkable array of devices which either “hung off” or “plugged into” a PC; the need for hardware drivers grew.  Most drivers at the time came from the hardware manufacturers themselves and typically were monolithic (though not yet usually described as such) and written usually for specific hardware; issues were rife, a change to an OS or even other (apparently unrelated) hardware or software sometimes inducing instability or worse.  As operating systems evolved to support more modularity, the term “monolithic driver” came into use to distinguish these large, single-block drivers from the more modular or layered approaches that were beginning to emerge.

It was the dominance of Novell’s Netware (1983-2009) on PC networks which compelled Microsoft to develop Windows NT (“New Technology”, 1993) and it featured a modular kernel architecture, something which made the distinction between monolithic and modular drivers better understood as developers increasingly embraced the modular, layered approach which better handled maintainability and scalability.  Once neutral, the term “monolithic driver” became something of a slur in IT circles, notably among system administrators (“sysadmins” or “syscons”, the latter based on the “system console”, the terminal on a mainframe hard-wired to the central processor) who accepted ongoing failures of this and that as inherent to the business but wanted to avoid a SPoF (Single Point of Failure).

In political science, the term “monolithic” is used to describe a system, organization, or entity perceived as being unified, indivisible, and operating with a high degree of internal uniformity, often with centralized control.  When something is labeled as monolithic, it implies that it lacks diversity or internal differentiation and presents a singular, rigid structure or ideology.  Tellingly, the most common use of the term is probably in analyzing electoral behavior: although groups, societies or sub-sets of either are often depicted in the media as “monolithic” in their views, voting patterns or political behavior, they are anything but and there’s usually some diversity.  In political science, such divergences within defined groups are known as “cross-cutting cleavages”.

It’s used also of political systems in which a regime is structured (or run) with power highly concentrated, typically in a single dictator or ruling party.  In such systems, usually there is little effective opposition and dissent is suppressed (although some of the more subtle informally tolerate a number of “approved dissenters” who operate within understood limits of self-censorship).  The old Soviet Union (the USSR (Union of Soviet Socialist Republics) 1922-1991), the Islamic Republic of Iran (1979-), the People’s Republic of China (run by the Chinese Communist Party (CCP)) (1949-) and the DPRK (Democratic People’s Republic of Korea (North Korea) 1948-) are classic examples of monolithic systems; while the differences between them were innumerable, structurally all were (or are) politically monolithic.  The word is used also as a critique in the social sciences, Time magazine in April 2014 writing of the treatment of “Africa” as a construct in Mean Girls (2004):  “Like the original Ulysses, Cady is recently returned from her own series of adventures in Africa, where her parents worked as research zoologists. It is this prior “region of supernatural wonder” that offers the basis for the mythological reading of the film. While the notion of the African continent as a place of magic is a dated, rather offensive trope, the film firmly establishes this impression among the students at North Shore High School. To them, Africa is a monolithic place about which they know almost nothing.
In their first encounter, Karen inquires of Cady: “So, if you’re from Africa, why are you white?” Shortly thereafter, Regina warns Aaron that Cady plans to “do some kind of African voodoo” on a used Kleenex of his to make him like her—in fact, the very boon that Cady will come to bestow under the monomyth mode.”  It remains a sensitive issue and one of the consequences of European colonial practices on the African continent (something which included what would now be regarded as “crimes against humanity”), so the casual use of “Africa” as a monolithic construct is proscribed in a way a similar use of “Europe” would not be.

The limited utility of the term means it should be treated with caution; while there are “monolithic” aspects or features to constructs such as “the Third World”, “the West” or “the Global South”, the label does over-simplify the diversity of cultures, political systems, and ideologies within these broad categories.  Even something which is to some degree “structurally monolithic” like the United States (US) or the European Union (EU) can be highly diverse in terms of actual behavior.  In the West (and the modern-day US is the most discussed example), the recent trend towards polarization of views has become a popular topic of study and the coalesced factions are sometimes treated as “monolithic” despite in many cases being themselves intrinsically factionalized.

Saturday, October 5, 2024

Ballistic

Ballistic (pronounced buh-lis-tik)

(1) Of a projected object, having its subsequent travel determined or describable by the laws of exterior ballistics; used mostly of the flight of projectiles after the initial thrust has been exhausted, moving under their own momentum and subject to the external forces of gravity and the fluid dynamics of air resistance.

(2) Of or relating to ballistics.

(3) In slang and idiomatic use (as “go ballistic”, “went ballistic” etc), to become overwrought or irrational; to become enraged or frenziedly violent.  For those who need to be precise in describing such instances, the comparative is “more ballistic” and the superlative “most ballistic”.

(4) Of a measurement or measuring instrument, depending on a brief impulse or current that causes a movement related to the quantity to be measured.

(5) Of materials, those able to resist damage (within defined parameters) by projectile weapons (ballistic nylon; ballistic steel etc), the best-known use of which is the “ballistic vest”.

(6) As “ballistics gel(atin)”, a substance which emulates the characteristics and behavior under stress of human or animal flesh (used for testing the effect of certain impacts, typically shells fired from firearms).

(7) As “ballistic podiatry”, industry slang for “the act of shooting oneself in the foot”, used also by military doctors to describe soldiers with such self-inflicted injuries.  The more general term for gunshot wounds is “ballistic trauma”.

(8) In ethno-phonetics, as “ballistic syllable”, a phonemic distinction in certain Central American dialects, characterized by a quick, forceful release and a rapid crescendo to a peak of intensity early in the nucleus, followed by a rapid, un-controlled decrescendo with fade of voicing.

(9) As “ballistic parachute”, a parachute used in light aircraft and helicopters, ejected from its casing by a small explosion.

1765–1775: The construct was the Latin ballist(a) (a siege engine (ancient military machine) for throwing stones to break down fortifications) + -ic; ballista was from the Ancient Greek βαλλίστρα (ballístra), from βάλλω (bállō) (I throw).  The -ic suffix was from the Middle English -ik, from the Old French -ique, from the Latin -icus, from the primitive Indo-European -kos & -os, formed with the i-stem suffix -i- and the adjectival suffix -kos & -os.  The form existed also in the Ancient Greek as -ικός (-ikós), in Sanskrit as -इक (-ika) and the Old Church Slavonic as -ъкъ (-ŭkŭ); a doublet of -y.  In European languages, adding -kos to noun stems carried the meaning "characteristic of, like, typical, pertaining to" while on adjectival stems it acted emphatically; in English it's always been used to form adjectives from nouns with the meaning “of or pertaining to”.  A precise technical use exists in physical chemistry where it's used to denote certain chemical compounds in which a specified chemical element has a higher oxidation number than in the equivalent compound whose name ends in the suffix -ous (eg sulphuric acid (H₂SO₄) has more oxygen atoms per molecule than sulphurous acid (H₂SO₃)).  The modern use (of the big military rockets or missiles (those guided while under propulsion, but which fall freely to their point of impact (hopefully the intended target))) dates from 1949 although the technology pre-dated the label.  The term “ballistic missile” seems first to have appeared in 1954 and remains familiar in the “intercontinental ballistic missile” (ICBM).  The figurative use (“go ballistic”, “went ballistic”) to convey “an extreme reaction; to become irrationally angry” is said to have been in use only since 1981 which is surprising.  To “go thermo-nuclear” or “take the nuclear option” are companion phrases but the nuances do differ.
The noun ballistics (art of throwing large missiles; science of the motion of projectiles) seems first to have appeared in 1753 and was from the Latin ballist(a), from the Ancient Greek ballistes, from ballein (to throw, to throw so as to hit that at which the object is aimed (though used loosely also in the sense “to put, place, lay”)), from the primitive Indo-European root gwele- (to throw, reach).  In the technical jargon of the military and aerospace industries, the derived forms included (hyphenated and not) aeroballistic, antiballistic, astroballistic, ballistic coefficient, quasiballistic, semiballistic, subballistic, superballistic & thermoballistic.  In science and medicine, the forms include bioballistic, cardioballistic, electroballistic and neuroballistic.  Ballistic & ballistical are adjectives, ballisticity, ballistician & ballistics are nouns and ballistically is an adverb; the wonderful noun plural is ballisticities.

The basilisk was a class of large-bore, heavy bronze cannons used during the late Middle Ages and in their time they were a truly revolutionary weapon, able quickly to penetrate fortifications which in some cases had for centuries enabled attacks to be resisted.  Although there were tales of basilisks with bores between 18-24 inches (460-610 mm), these were almost certainly a product of the ever-fertile medieval imagination and there’s no evidence any were built with a bore exceeding 5 inches (125 mm).  As a high-velocity weapon however, that was large enough for it to be highly effective, the 160 lb (72 kg) shot carrying a deadly amount of energy and able to kill personnel or destroy structures.  Because of the explosive energy needed to project the shot, the barrels of the larger basilisks could weigh as much as 4000 lb (1,800 kg); typically they were some 10 feet (3 m) in length but the more extraordinary, built as long-range devices, could be as long as 25 feet (7.6 m).  Despite the similarity in form, the name basilisk was unrelated to “ballistics” and came from the basilisk of mythology, a fire-breathing, venomous serpent able to kill and destroy, its glance alone deadly.  It was thus a two-part allusion: (1) the idea of “spitting fire” and (2) the thought the mere sight of an enemy’s big cannons would be enough to scare an opponent into retreat.

As soon as it appeared in Europe, it was understood the nature of battlefields would change and the end of the era of the castle was nigh.  It was the deployment of the big cannons which led to the conquest of Constantinople (capital of the Byzantine Empire, now Istanbul in the Republic of Türkiye) in 1453 after a 53-day siege; the city’s great walls which for centuries had protected it from assault were worn down by the cannon fire to the point where the defenders couldn’t repair the damage at the same rate as the destruction.  In an example of the way economics is a critical component of war, the Hungarian cannon makers had offered the cannons to the Byzantines but the empire was in the throes of one of the fiscal crises which determined the outcomes of so many conflicts and had no money with which to make the purchase.  The practical Hungarians then sold their basilisks to the attacking Ottoman army and the rest is history.  Despite such successes, the biggest of the basilisks became rare after the mid sixteenth century as military tactics evolved to counter their threat by becoming more mobile; the traditional siege of static targets became less decisive and smaller, more easily transported cannon, lighter and cheaper to produce, came to dominate artillery formations.

Queen Elizabeth's Pocket Pistol, Navy, Army and Air Force Institute Building, Dover Castle, Dover, Kent, England.

Queen Elizabeth's Pocket Pistol was a basilisk built in 1544 in Utrecht (in the modern-day Netherlands), the name derived from it being presented to Henry VIII (1491–1547; King of England (and Ireland after 1541) 1509-1547) as a gift for his daughter (the future Elizabeth I (1533–1603; Queen of England & Ireland 1558-1603)) although the first known reference to it being called “Queen Elizabeth's Pocket Pistol” dates from 1767.  Some 24 feet (7.3 m) in length and with a 4.75 inch (121 mm) bore, it was said to be able to launch a 10 lb (4.5 kg) ball almost 2000 yards (1.8 km) although as a typical scare tactic, the English made it known to the French and Spanish that its shots were heavier and able to reach seven miles (12 km).  Just to make sure the point was understood, it was installed to guard the approaches to the cliffs of Dover.  Modern understandings of the physics of ballistics and the use of computer simulations have since suggested there may have been some exaggeration in even the claim of a 2000 yard range and it was likely little more than half that.  Such use of propaganda remains part of the military arsenal to this day.
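The kind of calculation those simulations perform can be sketched crudely.  The code below compares the drag-free (“vacuum”) range of a 10 lb ball with a simple numerical integration including quadratic air drag; the muzzle velocity, drag coefficient and launch angle are all assumptions for illustration, not historical measurements, so the output indicates only how dramatically drag shortens the textbook vacuum range.

```python
import math

# Assumed inputs (only mass and bore come from the account above).
m = 4.5          # shot mass, kg (~10 lb)
d = 0.121        # bore, m (4.75 in)
v0 = 400.0       # assumed muzzle velocity, m/s
cd = 0.47        # drag coefficient of a smooth sphere (assumed)
rho = 1.225      # sea-level air density, kg/m^3
g = 9.81
k = 0.5 * rho * cd * math.pi * (d / 2) ** 2   # quadratic drag constant

# Drag-free range at the optimum 45-degree elevation.
vacuum_range = v0 ** 2 * math.sin(2 * math.radians(45)) / g

# Simple Euler integration of the same shot with air drag.
vx = v0 * math.cos(math.radians(45))
vy = v0 * math.sin(math.radians(45))
x = y = 0.0
dt = 0.001
while y >= 0:
    v = math.hypot(vx, vy)
    vx -= k * v * vx / m * dt          # drag opposes motion horizontally
    vy -= (g + k * v * vy / m) * dt    # gravity plus drag vertically
    x += vx * dt
    y += vy * dt

print(f"vacuum: {vacuum_range:.0f} m, with drag: {x:.0f} m")
```

Under these assumptions the vacuum range is over 16 km while the drag-affected range collapses to a small fraction of that, which is why claims about early cannon ranges are so sensitive to the assumed muzzle velocity and why period boasts invite scepticism.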

It was fake news:  Responding to viral reports, the authoritative E!-News in April 2013 confirmed Lindsay Lohan did not "go ballistic" and attack her ex-assistant at a New York City club.  For some reason, far and wide, the fake news had been believed.

Despite the costs involved and the difficulties in maintaining and transporting big cannons, some militaries couldn’t resist them and predictably, Adolf Hitler (1889-1945; Führer (leader) and German head of government 1933-1945 & head of state 1934-1945), who thought just about everything (buildings, tanks, trains, monuments, cars, battleships etc) should be bigger, oversaw some of the most massive artillery pieces ever built, often referred to by historians as “super heavy guns”.  The term is no exaggeration and the most striking examples were the Schwerer Gustav and Dora.  With a bore of 31.5 inches (800 mm), each weighed some 1350 tons (1225 tonnes) and could fire a projectile as heavy as 7.1 tons (6.4 tonnes) some 29 miles (47 km).  Two were built, configured as “railway guns” and thus of most utility in highly developed areas where rail tracks lay conveniently close to the targets.  The original design brief from the army ordnance office required a long-range device able to destroy heavily fortified targets and for that purpose, they could be effective.  However, each demanded a crew of several thousand soldiers, technicians & mechanics with an extensive logistical support system in place; even then, the rate of fire could be fewer than one round per day.  The Schwerer Gustav’s most successful deployment came during the siege of Sevastopol (1942).  Other big-bore weapons followed but success proved patchy, especially as allied control of the skies made the huge, hard-to-hide machines vulnerable to attack and even mounting them inside rock formations couldn’t resist the Royal Air Force’s (RAF) new, ground-penetrating bombs.

Schwerer Gustav being readied for a test firing, Rügenwalde, Germany, 19 March 1943, Hitler standing second from the right with Albert Speer (1905–1981; Nazi court architect 1934-1942; Nazi minister of armaments and war production 1942-1945) to his right.  Hitler referred to the huge gun as “meine stählerne Faust” (my steel fist) but it never fulfilled his high expectations and like many of the gigantic machines which so fascinated the Führer (who treated complaints about their ruinous cost as “tiresome”) it was a misallocation of scarce resources.

It was the development of modern ballistic rockets during World War II (1939-1945) which put an end to big guns (although the Iraqi army did make a quixotic attempt to resurrect the concept, something which involved having a British company “bust” UN (United Nations) sanctions by claiming their gun barrel components were “oil pipes”), the Germans’ A4 (V-2) rocket being the world’s first true long-range ballistic missile.  The V-2 represented a massive leap forward in both technology and military application and briefly it would touch “the edge of space” before beginning its ballistic trajectory, reaching altitudes of over 100 km (62 miles) before descending toward its target.  Everything in the field since has to some degree been an evolution of the V-2, the three previous landmarks being (1) the Chinese “Fire Arrows” of the early thirteenth century which were the most refined of the early gunpowder-filled rockets which followed a simple ballistic path, (2) the eighteenth century Indian Mysorean Rockets with the considerable advance of metal casings, the great range a shock to soldiers of the British Raj who had become accustomed to enjoying a technology advantage and (3) the British Congreve Rockets of the early nineteenth century, essentially a refinement of the Mysorean design enhanced by improved metallurgy and aerodynamics and made more effective still when combined with the well-organized logistics of the British military.

Wednesday, October 2, 2024

Swagger

Swagger (pronounced swag-er)

(1) A manner, conduct, or gait marked by an ostentatious display of arrogance and conceit.

(2) To walk or strut with a defiant or insolent air.

(3) To boast or brag noisily.

(4) To bring, drive, force, etc by means of bluster (now rare).

(5) Elegantly fashionable and confident (listed by some dictionaries as “rare” but in UK use it remains understood as a way of differentiating from “arrogant” and appears often in the form “a certain swagger” on the model of a phrase like “a certain grandeur”).

(6) In historic Australian (mostly rural) slang, an alternative name for a “swagman” or “swaggie” (an itinerant worker who carried a swag (a kind of roll-up bed)) (archaic).  Swagman remains familiar in Australia because of the opening line of the bush ballad Waltzing Matilda: “Once a jolly swagman camped by a billabong”.

1580–1590: The construct was swag + -er and it was a frequentative form of swag (in the sense of “to sway”), an early use of which appears in William Shakespeare’s (1564–1616) A Midsummer Night's Dream (1595): “What hempen homespuns have we swaggering here?” (Puck in Act III, Scene 1) and it appears also in Henry IV, Part 2 (circa 1598) & King Lear (circa 1605).  The verb swag (in the Shakespearian sense of “to strut in a defiant or insolent manner” (which then could also mean “a gait with a sway or lurch”)) was from the Middle English swaggen, swagen & swoggen, probably from the Old Norse sveggja (to swing, sway) and may be compared with the dialectal Norwegian svaga (to sway, swing, stagger).  The meaning “to boast or brag” was in use by the 1590s to describe the antics of the concurrent agent-noun swaggerer (blusterer; bully; boastful, noisy fellow), the noun appearing in the early eighteenth century in the sense of “an insolent strut; a piece of bluster; a boastful manner”.  The –er suffix was from the Middle English –er & -ere, from the Old English -ere, from the Proto-Germanic -ārijaz, thought most likely to have been borrowed from the Latin –ārius where, as a suffix, it was used to form adjectives from nouns or numerals.  In English, the –er suffix, when added to a verb, created an agent noun: the person or thing that does the action indicated by the root verb.  The use in English was reinforced by the synonymous but unrelated Old French –or & -eor (the Anglo-Norman variant -our), from the Latin -ātor & -tor, from the primitive Indo-European -tōr.  When appended to a noun, it created the noun denoting an occupation or describing the person whose occupation is the noun.  Swagger is a noun & verb, swaggerer is a noun, swaggering is an adjective and swaggeringly is an adverb; the noun plural is swaggers.  The verb (used with object) out-swagger was used as a kind of “loaded” superlative, suggesting someone’s swagger had been “topped” by that of another.

Swaggering: Lindsay Lohan in swagger coat, New York City, March 2024.

A swagger coat was a (usually) calf-length overcoat with a distinctive cut which flared out below the knee.  They became fashionable in the early decades of the twentieth century, the wide, roomy silhouette, often without a belt, allowing for a “swaggering” or flowing appearance when worn.  The relaxed fit lent the garment a casual elegance and they often were worn, cloak-like, cast over the shoulders.  Swagger coats were commonly made from heavier fabrics like wool or tweed, making them ideal for outerwear in cooler weather and their air of “quiet sophistication” has made them a timeless classic.  A swagger stick was a short stick carried by a military officer as a symbol of authority but should not be confused with a field-marshal’s baton which was a symbol of the highest military rank.  Swagger sticks were shorter than a walking-cane, tended to be made from rattan or bamboo and adorned with a polished metal tip or cap.  A symbol rather than a practical tool, they are still seen during formal parades or other ceremonial events.  A “swagger-jack” was someone who copied or imitated the actions, sayings or personal habits of another.  The word “swagger” often carries a negative connotation but there’s a long tradition in the UK of it being used to distinguish a certain stylish confidence from someone thought merely “arrogant”.  When one reviewer wrote of the Rolling Stones album Beggars Banquet (1968) as being the band “at their most swaggeringly debauched”, he really was giving them a compliment.  Much can context influence meaning.

The Swagger Portrait

A swagger portrait is a grand, usually large and often ostentatious portrait, typically commissioned by wealthy or influential individuals to display their status, power and prestige.  The term came into use in the late nineteenth century at the height of the British Empire when countless generals, admirals, politicians, governors, viceroys and others less exalted (though perhaps more deserving) decided it was something they deserved.  The distinguishing characteristics were (1) an imposing dimensionality, larger-than-life renditions not uncommon, (2) elaborate staging and poses, (3) an attention to detail, significant because the subjects often were dripping with decorations or precious jewels which demanded to be captured with precision and (4) a certain grandeur, something at which some artists excelled.  An exemplar of the breed was John Singer Sargent (1856-1925).

Portrait of Theodore Roosevelt (1903; left), oil on canvas by Théobald Chartran (1849–1907) and Portrait of Theodore Roosevelt (1903; right), oil on canvas by John Singer Sargent.

Nobel Peace Prize laureate Theodore Roosevelt (1858–1919; US President 1901-1909), famous also for waging war and shooting wildlife, after being impressed by Théobald Chartran’s portrait of his wife, invited the French artist to paint him too.  He was so displeased with the result, which he thought made him look effete, he refused to hang the work and later supervised its destruction.  Roosevelt then turned instead to expatriate US artist John Singer Sargent.  The relationship didn’t start well as the two couldn’t agree on a setting and during one heated argument, the president suddenly, hand on hip, took on a defiant air while making a point and Sargent had his pose, imploring his subject not to move.  This one delighted Roosevelt and was hung in the White House.

Portrait of Madame X (1884), oil on canvas by John Singer Sargent, Metropolitan Museum of Art, Manhattan.

A controversial work in its time, Madame X was Virginie Amélie Avegno Gautreau (née Avegno; 1859–1915), a banker's wife.  Unusually in the tradition of swagger portraits, Madame X was not a commission but undertaken on the painter's initiative and he understood the critics as well as he knew his subjects, knowing the juxtaposition of a black satin gown and porcelain-white skin would create a sensation.  However, he understood the Parisian bourgeoisie less well and after the work was exhibited at the Paris Salon of 1884, the public reception was such that Sargent was just about run out of town.  Still, the painting made his reputation and it remains his best-known work.

The Duke of Wellington (1812-1814), oil on canvas by Francisco Goya (1746-1828), The National Gallery, London.

Arthur Wellesley (1769-1852; first Duke of Wellington) was a British military hero and a less successful Tory politician although he remains remembered as a classic “Ultra”, a calling which is a hallmark of twenty-first century ideology.  Goya’s work is a typical military swagger portrait and it was his battlefield exploits rather than his career in parliament which saw him granted the rare distinction of a state funeral.

Portrait of Empress Eugénie (1854), oil on canvas by Franz Xaver Winterhalter (1805-1873), Metropolitan Museum of Art, Manhattan.

The Empress Eugénie (Eugénie de Montijo, 1826–1920, Condesa de Teba) was the wife of Napoleon III (Charles-Louis Napoléon Bonaparte, 1808–1873; first president of France (1848-1852) and the last monarch as Emperor (1852-1870)) and it wasn't an easy gig for her so she deserved a swagger portrait more than many, Winterhalter painting several.  They have many of the elements of the swagger portraiture of royalty: lavish fabrics, the subject in regal attire, almost as much an installation as the sumptuous surrounds, the message conveyed one of status, power and beauty.