Showing posts with label Economics. Show all posts

Sunday, July 13, 2025

Etceterini

Etceterini (pronounced et-set-er-rhini)

One or all of the sports cars & racing cars produced in small volumes by a number of “boutique” Italian manufacturers during the quarter-century-odd following World War II (1939-1945).

1980s (though not attaining wide currency until publication in 1990): A portmanteau word, the construct being etceter(a) + ini.  Etcetera was from the early fourteenth century Middle English et cetera (and other things; and so forth), from the Latin et cētera (and the other things; and the rest of the things), the construct being et (and) + cetera (the other things; the rest).  Et was from the Proto-Italic et, from the primitive Indo-European éti or heti and was cognate with the Ancient Greek ἔτι (éti), the Sanskrit अति (ati), the Gothic (and, but, however, yet) and the Old English prefix ed- (re-).  Cētera was the plural of cēterum, accusative neuter singular of cēterus (the other, remainder, rest), from the Proto-Italic ke-eteros, the construct being ke (here) +‎ eteros (other).  The Latin suffix -īnī was an inflection of -īnus (feminine -īna, neuter -īnum), from the Proto-Italic -īnos, from the primitive Indo-European -iHnos and was cognate with the Ancient Greek -ινος (-inos) and the Proto-Germanic -īnaz.  The suffix was added to a noun base (particularly proper nouns) to form an adjective, usually in the sense of “of or pertaining to”, and could indicate a relationship of position, possession or origin.  Because the cars referenced tended to be small (sometimes very small), some may assume the –ini element to be an Italian diminutive suffix but in Italian the diminutive suffixes are -ino, -etto, -ello & -uccio; etceterini works because the Latin suffix conveys the idea of “something Italian”.  It was used substantively or adverbially.  Until the early twentieth century, the most common abbreviation was “&c.” but “etc.” (usually with a surely now superfluous period (full-stop)) has long been the standard form.  Etcetera is a noun; the noun plural is etceteras.

The word “etcetera” (or “et cetera”) fully has been assimilated into English and (except when used in a way which makes a historic reference explicit) is for most purposes no longer regarded as “a foreign word” though the common use has long been to use the abbreviation (the standard now: “etc”).  If for whatever reason there’s a need for a “conspicuously foreign” form then the original Latin (et cētera (or even the Anglicized et cetera)) should be used.  There is no definitive date on which the assimilation can be said to have been completed (or at least generally accepted); rather it was a process.  From the 1400s, the Middle English et cetera was used and understood by educated speakers due to Latin's prominence in law, science, religion and academia and by the mid-eighteenth century it was no longer viewed as a “foreignism” (except of course among the reactionary hold-outs with a fondness for popery and ecclesiastical Latin: for them, in churches and universities, even in English texts, et cētera or et cetera remained preferred).  Scholars of structural linguistics use an interesting test to track the process of assimilation as modern English became (more or less) standardized: italicization.  With “et cetera” & “etcetera”, by the mid-eighteenth century, the once de rigueur italics had all but vanished.  That test may no longer be useful because words which remain classified as “foreign” (such as raison d'être or schadenfreude) often now appear without italics.

The so-called “pronunciation spellings” (ekcetera, ekcetra, excetera & exetera) were never common and the abbreviations followed the same assimilative path.  The acceptance of the abbreviated forms in printed English became more widespread during the 1600s because of the advantages they offered printers, typesetters much attracted by the convenience and economy.  By early in the eighteenth century it was an accepted element (usually as “&c” which soon supplanted “et cet”) in “respectable prose”, appearing in Nathan Bailey’s (circa 1690-1742) An Universal Etymological English Dictionary (1721) and gaining the imprimatur of the trend-setting Anglo-Irish author & satirist Jonathan Swift (1667–1745).  Dr Johnson (Samuel Johnson (1709-1784)) made much use of “&c” in his A Dictionary of the English Language (1755) and although Bailey’s dictionary was influential in the breadth of its comprehensiveness and remained, over 30 editions, in print until 1802, it’s Dr Johnson who is better remembered because he became a “celebrity lexicographer” (a breed which today must sound improbable).

One of the implications of linguistic assimilation is the effect on the convention applied when speaking from a written text.  Although widely ignored (probably on the basis of being widely unknown), the convention is that foreign words in a text should be spoken in the original language only if that’s necessary for emphasis or meaning (such as Caudillo, Duce or Führer).  Where foreign terms are used in writing as a kind of verbal shorthand (such as inter alia (among other things)) in oral use they should be spoken in English.  However, the convention doesn’t extend to fields where the terms have become part of the technical jargon (which need not influence a path of assimilation), as in law where terms like inter alia and obiter (a clipping of obiter dictum (something said by a judge in passing and not a substantive part of the judgment)) are so entrenched in written and oral use that to translate them potentially might be misleading.

Lindsay Lohan (b 1986, left), Britney Spears (b 1981, centre) & Paris Hilton (b 1981, right), close to dawn, Los Angeles, 29 November 2006; the car was Ms Hilton's Mercedes-Benz SLR McLaren (C199 (2003-2009)).  This paparazzo's image was from a cluster which included the one used for the front page on Rupert Murdoch's (b 1931) New York Post with the still infamous headline “BIMBO SUMMIT”.  Even by the standards of the Murdoch tabloids, it was nasty.

So, the text written as: “Lindsay Lohan, Paris Hilton, Britney Spears et al recommend that while a handbag always should contain “touch-up & quick fix-up” items such as lipstick, lip gloss, and lip liner, the more conscientious should pack more including, inter alia, mascara, eyeliner, eyebrow pencil, concealer, a powder compact, a small brush set & comb etc.” would be read aloud as: “Lindsay Lohan, Paris Hilton, Britney Spears and others recommend that while a handbag always should contain “touch-up & quick fix-up” items such as lipstick, lip gloss, and lip liner, the more conscientious should pack more including, among other things, mascara, eyeliner, eyebrow pencil, concealer, a powder compact, a small brush set & comb etcetera.”  Despite the cautions from purists (including just about every grammar text-book and style guide on the planet), the “choice” between “etc” and “et al” does seem to be becoming blurred, many seemingly using the two interchangeably.  The rules are (1) “etc” (and other things) is used of things (and according to the style guides should always appear with a period (full-stop) even though such use is archaic and another of those “needless tributes to tradition”) and (2) “et al” (and others) is used of people (especially in citations and again, always with a period).  So, “et al” can’t be used for things and “etc” shouldn’t be used of people; it’ll be interesting to see if these rules survive into the next century.  Really, it's a silly rule and because it's hardly difficult to distinguish between a text string of "people" and one of "things", if used interchangeably, the two abbreviations are unlikely to confuse.  Et al was the abbreviation of the Latin et aliī (and others).

A Unix /etc directory.

In computing, Unix-based operating systems (OS) feature a directory (the word “folder” thought effete by the Unix community, most of whom are at their happiest when typing arcane commands at the prompt) called “etc” (along with /root, /boot, /dev, /bin, /opt etc) which is used as a repository for system-wide configuration files and shell scripts used to boot and initialize the system.  Although there are many variants of the OS, typically an /etc directory will contain (1) OS configuration files (/etc/passwd; /etc/fstab; /etc/hosts), (2) system startup scripts (/etc/init.d or /etc/systemd/), (3) network configuration, (4) user login & environment configuration files and (5) application configuration files.  Originally (sometime in 1969-1970), the “etc” name was adopted because it was “an et cetera” in the literal sense of “and so on”, a place to store files which were essential but didn’t obviously belong elsewhere, a single “general purpose” directory used to avoid needless proliferation in the structure.  Rapidly Unix grew in complexity and configurability so the once “place for the miscellaneous” became the canonical location for configuration files, the original sense displaced but the name retained.  It is pronounced et-see (definitely not ee-tee-see or et-set-er-uh).  Despite their reputation, the Unix guys do have a joke (and there are unconfirmed rumors of a second).  Because so many of the files in /etc can be modified with any text-editor, in some documents earnestly it’s revealed /etc is the acronym of “Editable Text Configuration” but that is fake news; it's a backronym.
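Most files in /etc are plain text with a simple, line-oriented grammar, which is why any text editor (or a few lines of script) suffices to work with them.  As a minimal sketch, here is a parser for the /etc/hosts format (whitespace-separated fields, “#” starting a comment); the sample entries are invented for the example:

```python
# Minimal sketch of parsing an /etc/hosts-style file; the sample data is invented.

def parse_hosts(text: str) -> dict:
    """Map each hostname/alias to its IP address."""
    mapping = {}
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()   # drop comments & surrounding space
        if not line:
            continue                           # skip blanks and pure comments
        ip, *names = line.split()              # first field is the address
        for name in names:                     # remaining fields are names/aliases
            mapping[name] = ip
    return mapping

sample = """\
127.0.0.1   localhost
# static entries
192.168.1.10  fileserver fs    # local NAS
"""

print(parse_hosts(sample))
# → {'localhost': '127.0.0.1', 'fileserver': '192.168.1.10', 'fs': '192.168.1.10'}
```

In practice one would pass the contents of /etc/hosts itself; the string here stands in so the sketch is self-contained.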

The Etceterini: exquisite creations with names ending in vowels

1954 Stanguellini 750 Sport.

In the tradition of mock-Latin, the word etceterini was a late twentieth century coining created to refer to the ecosystem of the numerous small-volume Italian sports & racing cars built in the early post-war years.  A portmanteau word, the construct being etceter(a) + ini, the idea was a word which summoned the idea of “many, some obscure” with an Italianesque flavor.  Credit for the coining is claimed by both automotive historian John de Boer (who in 1990 published The Italian car registry: Incorporating the registry of Italian oddities: (the etceterini register)) and reviewer & commentator Stu Schaller, who asserts he’d used it previously.  Whoever first released it into the wild (and it seems to have been in circulation at least as early as the mid-1980s) can be content because it survived in its self-defined niche and the evocative term has become part of the lexicon used by aficionados of post-war Italian sports and racing cars.  Being language (and in this English is not unique), it is of course possible two experts, working in the same field, both coined the term independently, the timing merely a coincidence.  Etceterini seems not to have been acknowledged (even as a non-standard form) by the editors of any mainstream English dictionary and surprisingly, given how long its history of use now is, even jargon-heavy publications like those from the Society of Automotive Engineers (SAE) haven’t yet added it to their lexicons.  It does though appear in specialist glossaries, car-model registry websites and niche discussion forums, especially those tied to classic Italian car culture (OSCA, Moretti, Stanguellini, Siata, Bandini, Ermini etc).  So, as a word it has sub-cultural & linguistic clarity but no status among the linguistic establishment.

1953 Siata 208S Barchetta.

John de Boer’s comprehensive The Italian car registry: Incorporating the registry of Italian oddities: (the etceterini register) was last updated in 1994 and remains the best-known publication on the many species of the genus etceterini, including in its 350-odd pages not only a wealth of photographs and cross-referenced details of specification but also lists of chassis and engine numbers (priceless data for collectors and restoration houses in their quests for the often elusive quality of “originality”).  Nor are the personalities neglected: as well as some notable owners, the designers and builders are discussed and there are sections devoted to coach-builders, a once vibrant industry driven almost extinct by regulators and the always intrusive realities of economics.  One thing which especially delights the collectors is the photographs of some of the obscure accessories of the period, some rendered obsolete by technology, some of which became essential standard-equipment and some seriously weird.  Mr de Boer’s book was from the pre-internet age when, except for a pampered handful in a few universities, “publication” meant paper and printing presses but such things are now virtualized, “weightless publication” available instantly to all, and there are small corners of the internet curated for devotees of the etceterini such as Cliff Reuter’s Etceteriniermini, a title which certainly takes some linguistic liberties.  Some trace the breed even to the late 1930s and such machines certainly existed then but as an identifiable cultural and economic phenomenon, they really were a post-war thing and although circumstances conspired to make their survival rare by the mid-1960s, a handful lingered into the next decade.

1957 Bandini 750 Sport Siluro.

That the ecosystem of the etceterini flourished in Italy in the 1950s was a matter of place and time and while the memorable scenes depicted in La Dolce Vita (1960) might have been illusory for most, the film did capture something from their dreams.  After the war, there was a sense of renewal, the idea of the “new” Italy as a young country in which “everybody” seemed young and for those who could, sports cars and racing cars were compelling.  However, while there was a skilled labor force ready to build them and plenty of places in which they could be built, economics dictated they needed to be small and light-weight because the mechanical components upon which so many relied came from the Fiat parts bin and the most significant commonality among the etceterini was the small (often, by international standards, tiny) engines used otherwise to power the diminutive micro-cars & vans with which Fiat in the post-war years “put Italy on wheels”.  It was no coincidence so many of the small-volume manufacturers established their facilities near Fiat’s factory in Torino, the closest thing the nation had to a Detroit.  In the early years, it wasn’t unknown for a donkey and cart carrying a few engines to make the short journey from the Fiat foundry to an etceterini’s factory (which was sometimes little more than a big garage).  However, just because the things were small didn’t mean they couldn’t be beautiful and, being built by Italians, over the years there were some lovely shapes, some merely elegant but some truly sensuous.

1960 Stanguellini Formula Junior.

There was a high failure rate but many flourished for years and developed lucrative “sideline” businesses producing lines of speed equipment or accessories for majors such as Fiat or Alfa Romeo and, as has happened in other industries, sometimes the success of these overtook the original concern: Nardi soon noticed the return on capital from selling its popular custom steering wheels far exceeded what was being achieved from producing a handful of little sports cars, production of which quickly was abandoned with resources re-allocated to the accessory which had become a trans-Atlantic best-seller.  Whether things would have gone on indefinitely had the laissez-faire spirit of the time been allowed to continue can’t be known but by the 1960s, traffic volumes rapidly were increasing on the growing lengths of autostrade (the trend-setting Italian motorway system begun during the administration of Benito Mussolini (1883-1945; Duce (leader) & Prime-Minister of Italy 1922-1943)) with accident rates & the death toll both climbing.  Italy, like many jurisdictions, began to impose safety regulations which before long made small-scale production runs unviable but by then rising prosperity meant people were able to purchase their own Fiat or Alfa-Romeo and the etceterini faded into fond memory.  It is of course unthinkable such a thing could again happen because the EU (European Union) is now staffed by divisions of Eurocrats who spend their days in Masonic-like plotting and scheming to devise new reasons to say no, non, nein, nee, não etc.  Had these bloodless bureaucrats existed in the 1940s, not one etceterini would ever have reached the street.

Saturday, July 5, 2025

Futurism

Futurism (pronounced fyoo-chuh-riz-uhm)

(1) A movement in avant-garde art, developed originally by a group of Italian artists in 1909 in which forms (derived often from the then novel cubism) were used to represent rapid movement and dynamic motion (sometimes with initial capital letter).

(2) A style of art, literature, music, etc and a theory of art and life in which violence, power, speed, mechanization or machines, and hostility to the past or to traditional forms of expression were advocated or portrayed (often with initial capital letter).

(3) As futurology, a quasi-discipline practiced by (often self-described) futurologists who attempt to predict future events, movements, technologies etc.

(4) In the theology of Judaism, the Jewish expectation of the messiah in the future rather than recognizing him in the person of Christ.

(5) In the theology of Christianity, eschatological interpretations associating some Biblical prophecies with future events yet to be fulfilled, including the Second Coming.

1909: From the Italian futurismo (literally "futurism" and dating from circa 1909), the construct being futur(e) + -ism.  Future was from the Middle English future & futur, from the Old French futur (that which is to come; the time ahead), from the Latin futūrus (going to be; yet to be) which (as a noun) was the irregular suppletive future participle of esse (to be), from the primitive Indo-European bheue (to be, exist; grow).  It was cognate with the Old English bēo (I become, I will be, I am) and displaced the native Old English tōweard and the Middle English afterhede (future; literally “afterhood”) in the given sense.  The technical use in grammar (of tense) dates from the 1520s.  The –ism suffix was from the Ancient Greek ισμός (ismós) & -isma noun suffixes, often directly, sometimes through the Latin –ismus & -isma (from where English picked up -ize) and sometimes through the French –isme or the German –ismus, all ultimately from the Ancient Greek (where it tended more specifically to express a finished act or thing done).  It appeared in loanwords from Greek, where it was used to form abstract nouns of action, state, condition or doctrine from verbs and on this model, was used as a productive suffix in the formation of nouns denoting action or practice, state or condition, principles, doctrines, a usage or characteristic, devotion or adherence (criticism; barbarism; Darwinism; despotism; plagiarism; realism; witticism etc).  Futurism & futurology are nouns, futurist is a noun & adjective and futuristic is an adjective; the noun plural is futurisms.

Lindsay Lohan in Maison Martin Margiela (b 1957) Futuristic Eyewear.

As a descriptor of the movement in art and literature, futurism (as the Italian futurismo) was adopted in 1909 by the Italian poet Filippo Tommaso Marinetti (1876-1944) and the first reference to futurist (a practitioner in the field of futurism) dates from 1911 although the word had been used as early as 1842 in Protestant theology in the sense of “one who holds that nearly the whole of the Book of Revelations refers principally to events yet to come”.  The secular world did begin to use futurist to describe "one who has (positive) feelings about the future" in 1846 but for the remainder of the century, use was apparently rare.  The (now probably extinct) noun futurity was from the early seventeenth century.  The noun futurology was introduced by Aldous Huxley (1894-1963) in his book Science, Liberty and Peace (1946) and has (for better or worse), created a minor industry of (often self-described) futurologists.  In theology, the adjective futuristic came into use in 1856 with reference to prophecy but use soon faded.  In concert with futurism, by 1915 it referred in art to “avant-garde; ultra-modern” while by 1921 it was separated from the exclusive attachment to art and meant also “pertaining to the future, predicted to be in the future”, the use in this context spiking rapidly after World War II (1939-1945) when technological developments in fields such as ballistics, jet aircraft, space exploration, electronics, nuclear physics etc stimulated interest in such progress.

Untouched: Crooked Hillary Clinton (b 1947; US secretary of state 2009-2013) & Bill Clinton (b 1946; US president 1993-2001) with cattle, 92nd Annual Hopkinton State Fair, Contoocook, New Hampshire, September 2007.

Futures, a financial instrument used in the trade of currencies and commodities, appeared first in 1880; they allow (1) speculators to bet on price movements and (2) producers and sellers to hedge against price movements and in both cases profits (and losses) can be booked against movement up or down.  Futures trading can be lucrative but is also risky, those who win gaining from those who lose, and those in the markets are usually professionals.  The story behind crooked Hillary Clinton's extraordinary profits in cattle futures (not a field in which she’d previously (or has subsequently) displayed interest or expertise) while “serving” as First Lady of Arkansas (1979–1981 & 1983–1992) remains murky but it can certainly be said that for an apparent “amateur” dabbling in a market played usually by experienced professionals, she was remarkably successful and while perhaps there was some luck involved, her trading record was such it’s a wonder she didn’t take it up as a career.  While many analysts have, based on what documents are available, commented on crooked Hillary’s somewhat improbable (and apparently sometimes “irregular”) foray into cattle futures, there was never an “official governmental investigation” by an independent authority and thus no adverse findings have ever been published.
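The arithmetic of booking profits “up or down” is simple: the P&L of a futures position is the price move multiplied by the contract size and the number of contracts, signed by the direction of the trade.  A minimal sketch follows; the 40,000 lb contract size and the cents-per-pound prices are illustrative assumptions for the example, not figures from any actual trade:

```python
# Illustrative sketch of futures P&L arithmetic; the contract size and
# prices below are assumptions chosen for the example.

def futures_pnl(side: int, entry: float, exit: float,
                contracts: int, lbs_per_contract: int) -> float:
    """Dollar P&L; side is +1 for a long position, -1 for a short.
    Prices are quoted in cents per pound, hence the division by 100."""
    return side * (exit - entry) * lbs_per_contract * contracts / 100

# A speculator long 10 contracts profits when the price rises...
print(futures_pnl(+1, 60.0, 62.5, 10, 40_000))   # → 10000.0
# ...while the short side of the same trade books the mirror-image loss.
print(futures_pnl(-1, 60.0, 62.5, 10, 40_000))   # → -10000.0
```

The symmetry of the two calls is the point: in futures, one side's gain is exactly the other's loss.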

The Arrival (1913), oil on canvas by Christopher Richard Wynne Nevinson (1889-1946), Tate Gallery.

Given what would unfold over the twentieth century, it’s probably difficult to appreciate quite how optimistic was the Western world in the years leading up to World War I (1914-1918).  Such had been the rapidity of the discovery of novelties and of progress in so many fields that expectations of the future were high and, beginning in Italy, futurism was a movement devoted to displaying the energy, dynamism and power of machines and the vitality and change they were bringing to society.  It’s also often forgotten that when the first futurist exhibition was staged in Paris in 1912, the critical establishment was unimpressed, the elaborate imagery with its opulence of color offending their sense of refinement, now so attuned to the sparseness of the cubists.

The Hospital Train (1915), oil on canvas by Gino Severini (1883-1966), Stedelijk Museum.

Futurism had debuted with some impact, the Paris newspaper Le Figaro in 1909 publishing the manifesto by Italian poet Filippo Tommaso Marinetti which dismissed all that was old and celebrated change, originality, and innovation in culture and society, something which should be depicted in art, music and literature.  Marinetti exulted in the speed and power of the new technologies which were disrupting society: automobiles, aeroplanes and other clattering machines.  Whether he found beauty in the machines or the violence and conflict they delivered was something he left his readers to decide and there were those seduced by both but his stated goal was the repudiation of traditional values and the destruction of cultural institutions such as museums and libraries.  Whether this was intended as a revolutionary roadmap or just a provocation to inspire anger and controversy is something historians have debated.  Assessment of Marinetti as a poet has always been colored by his reputation as a proto-fascist and some treat as "fake mysticism" his claim that his "visions" of the future and the path to follow to get there came to him in the moment of a violent car crash.

Futurismo: Uomo Nuovo (New Man, 1918), drawing by Mario Sironi (1885-1961).

As a technique, the futurist artists borrowed much from the cubists, deploying the same fragmented and intersecting plane surfaces and outlines to render a number of simultaneous, overlaid views of an object but whereas the cubists tended to still life, portraiture and other, usually static, studies of the human form, the futurists worshiped movement, their overlays a device to depict rhythmic spatial repetitions of an object’s outlines during movement.  People did appear in futurist works but usually they weren’t the focal point, instead appearing only in relation to some speeding or noisy machine.  Some of the most prolific of the futurist artists were killed in World War I and as a political movement it didn’t survive the conflict, the industrial war dulling the public appetite for the cult of the machine.  However, the influence of the compositional techniques continued in the 1920s and contributed to art deco which, in more elegant form, would integrate the new world of machines and mass-production into motifs still in use today.

Motociclista (Motorcyclist, circa 1924), oil on canvas by Mario Sironi.

By the early twentieth century when the Futurism movement emerged, machines and mechanisms were already hundreds of years old (indeed the precursor devices pre-date Christ) but what changed was the new generations of machines had become sexy (at least in the eyes of men), associated as they were with something beyond mere functionalism: speed and style.  While planes, trains & automobiles all attracted the futurists, the motorcycle was a much-favored motif because it possessed an intimacy beyond other forms of transportation in that, literally, it was more an extension of the human body, the rider at speed conforming to the shape of the structure fashioned for aerodynamic efficiency with hands and feet all directly attached to the vital controls: machine as extension of man.

The Modern Boy No. 100, Vol 4, Week Ending 4 January, 1930.

The Modern Boy (1928-1939) was, as the name implies, a British magazine targeted at males aged 12-18 and the content reflected the state of mind in the society of the inter-war years, the 1930s a curious decade of progress, regression, hope and despair.  Although what filled much of the pages (guns, military conquest and other exploits, fast cars and motorcycles, stuff the British were doing in other peoples’ countries) would today see the editors cancelled or visited by one of the many organs of the British state concerned with the suppression of such things, it was what readers (presumably with the acquiescence of their parents) wanted.  Best remembered of the authors whose works appeared in The Modern Boy was Captain W.E. Johns (1893–1968), a World War I RFC (Royal Flying Corps) pilot who created the fictional air-adventurer Biggles.  The first Biggles tale appeared in 1932 in Popular Flying magazine (released also as Popular Aviation and still in publication as Flying) and his stories are still sometimes re-printed (although with the blatant racism edited out).  That first Biggles story had a very modern-sounding title: The White Fokker.  The Modern Boy was a successful weekly which in 1938 was re-launched as Modern Boy, the reason for the change not known although dropping superfluous words (and much else) was a feature of modernism.  In October 1939, a few weeks after the outbreak of World War II, publication ceased, Modern Boy like many titles a victim of restrictions by the Board of Trade on the supply of paper for civilian use.

Jockey Club Innovation Tower, Hong Kong (2013) by Zaha Hadid (1950-2016).

If the characteristics of futurism in art were identifiable (though not always admired), in architecture, it can be hard to tell where modernism ends and futurism begins.  Aesthetics aside, the core purpose of modernism was of course its utilitarian value and that did tend to dictate the austerity, straight lines and crisp geometry that evolved into mid-century minimalism so modernism, in its pure form, should probably be thought of as a style without an ulterior motive.  Futurist architecture however carried the agenda which in its earliest days borrowed from the futurist artists in that it was an assault on the past but later moved on and in the twenty-first century, the futurist architects seem now to be interested above all in the possibilities offered by advances in structural engineering, functionality sacrificed if need be just to demonstrate that something new can be done.  That's doubtless of great interest at awards dinners where architects give prizes to each other for this and that but has produced an international consensus that it's better to draw something new than something elegant.  The critique is that while modernism once offered “less is more”, with neo-futurist architecture it's now “less is bore”.  Art deco and mid-century modernism have aged well and it will be interesting to see how history judges the neo-futurists.

Thursday, July 3, 2025

Zugzwang

Zugzwang (pronounced tsook-tsvahng)

(1) In chess, a situation in which a player is limited to moves that cost pieces or have a damaging positional effect.

(2) A situation in which, whatever is done, makes things worse (applied variously to sport, politics, battlefield engagements etc).

(3) A situation in which one is forced to act when one would prefer to remain passive and thus a synonym of the German compound noun Zugpflicht (the rule that a player cannot forgo a move).

(4) In game theory, a move which changes the outcome from win to loss.

Circa 1858 (1905 in English): A modern German compound, the construct being zug + zwang.  Zug (move) was from the Middle High German zuc & zug, from the Old High German zug, from the Proto-Germanic tugiz, an abstract noun belonging to the Proto-Germanic teuhaną, from the primitive Indo-European dewk (to pull, lead); it was cognate with the Dutch teug and the Old English tyge.  Zwang (compulsion; force; constraint; obligation) was from the Middle High German twanc, from the Old High German geduang.  It belongs to the verb zwingen and cognates include the Dutch dwang and the Swedish tvång.  The word is best understood as "compulsion to move" or, in the jargon of chess players: "Your turn to move and whatever you do it'll make things worse for you", thus the application to game theory, military strategy and politics where there's often a need to determine the "least worse option".  Zugzwang is a noun; the noun plural is Zugzwänge.  In English, derived forms such as zugzwanged, zugzwanging, zugzwangish, zugzwanger, zugzwangesque and zugzwangee are non-standard and used usually for humorous effect.

Chess and Game Theory

Endgame: Black's turn and Zugzwang!  Daily Chess Musings' depiction of the elegance of zugzwang.

The first known use of Zugzwang in the German chess literature appears in 1858; the first appearance in English in 1905.  However, the concept of Zugzwang had been known and written about for centuries, the classic work being Italian chess player Alessandro Salvio's (circa 1575–circa 1640) study of endgames published in 1604 and he referenced Shatranj writings from the early ninth century, some thousand years before the first known use of the term.  Positions with Zugzwang are not rare in chess endgames, best known in the king-rook & king-pawn conjunctions.  Positions of reciprocal Zugzwang are important in the analysis of endgames but although the concept is easily demonstrated and understood, that's true only of the "simple Zugzwang" and the so-called "sequential Zugzwang" will typically be a multi-move thing which demands an understanding of even dozens of permutations of possibilities.
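The defining test (the player to move loses only because passing is forbidden) can be made concrete in code.  As a minimal sketch in Python, consider an invented misère subtraction game (an illustrative toy, not an example from the chess literature): each turn a player must remove one or two counters from a pile, and whoever takes the last counter loses.

```python
from functools import lru_cache

# Toy misère subtraction game (invented for illustration): a pile of n
# counters, each turn a player MUST remove 1 or 2, and whoever removes
# the last counter loses.

@lru_cache(maxsize=None)
def wins(n: int) -> bool:
    """True if the player compelled to move at a pile of n wins with best play."""
    for k in (1, 2):
        if k > n:
            continue
        if k == n:
            continue          # taking the last counter loses, so never a winning move
        if not wins(n - k):   # a move leaving the opponent in a lost position wins
            return True
    return False

def zugzwang(n: int) -> bool:
    """Zugzwang: compelled to move, the player loses; allowed a hypothetical
    pass, the opponent (then facing the same pile) would lose instead."""
    loses_moving = not wins(n)
    wins_passing = not wins(n)   # after a pass the opponent must move at the same n
    return loses_moving and wins_passing

print([n for n in range(1, 15) if zugzwang(n)])   # → [1, 4, 7, 10, 13]
```

Here the two conditions coincide because a pass hands over an identical position, so every lost pile is a mutual Zugzwang; in chess the position handed over is not symmetric, which is why the test must compare the real game tree with the hypothetical passing one.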

Rendered by Vovsoft as cartoon character: a brunette Lindsay Lohan at the chessboard.  In her youth, she was a bit of a zugzwanger.

Zugzwang describes a situation in which a player is put at a disadvantage by the obligation to move; the player would prefer to pass and make no move, but the rules do not allow it.  The fact the player must make a move means their position will be significantly weaker than the hypothetical one in which it was the opponent's turn to move.  In game theory, it specifically describes the case in which the obligation to move directly changes the outcome of the game from a win (or draw) to a loss.  Chess textbooks often cite as the classic Zugzwang a match played in Copenhagen in 1923; on that day the German Grandmaster (the title inaugurated in 1950) Friedrich Sämisch (1896–1975) played White against the Latvian-born Danish master Aron Nimzowitsch (1886-1935).  Playing Black, Nimzowitsch didn't play a tactical match in the conventional sense but instead applied positional pressure, gradually limiting his opponent's options until, as the endgame was reached, White was left with no move which didn't worsen his position; whatever he chose would lead either to material loss or strategic collapse and it's said that in his notebook Nimzowitsch concluded his entry on the match with "Zugzwang!"  A noted eccentric in a discipline where idiosyncratic behaviour is not unknown, the Polish Grandmaster Savielly Tartakower (1887-1956) observed of Nimzowitsch: "He pretends to be crazy in order to drive us all crazy."

French sculptor Auguste Rodin's (1840-1917) The Thinker (1904), Musée Rodin, Paris (left) and Boris Johnson (b 1964; UK prime-minister 2019-2022) thinking about which would be his least worst option (right).

In its classic form chess is a game between two players, played with fixed rules on a board with a known number of pieces (32) and squares (64).  Although a count of the possible permutations in a match would yield a very big number, in chess the concept of Zugzwang is simple and understood the same way by those playing Black and White; information for both sides is complete and while the concept finds expression in both combinatorial game theory (CGT) and classical game theory (GT), the paths can differ.  CGT and GT (the latter historically a tool of economic modelers and strategists in many fields) are both mathematical studies of behaviour which can be imagined as "game-like" but they differ in focus, assumptions and applications.  In CGT the basic model (as in chess) is a two-player deterministic game in which moves alternate and luck or chance is not an element.  This compares with GT, in which there may be any number of players, moves may be simultaneous, the option may exist not to move, information known to players may be incomplete (or asymmetric) and luck & chance sit among many variables (which can include all of Donald Rumsfeld's (1932–2021; US defense secretary 1975-1977 & 2001-2006) helpful categories: known knowns, known unknowns, unknown unknowns & (most intriguingly) unknown knowns).  So, while CGT is a good device for deconstructing chess and the like because such games are of finite duration and players focus exclusively on "winning" (and, if need be, switching to "avoiding defeat"), GT is a tool which can be applied to maximize advantage or utility in situations where a win/defeat dichotomy is either not sought or becomes impossible.  The difference, then, is that CGT envisages two players seeking to solve a deterministic puzzle on a win/lose basis while GT is there to describe & analyse strategic interactions between & among rational actors, some or all of whom may be operating with some degree of uncertainty.
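The win/lose logic CGT applies to such games can be made concrete with a few lines of code.  The toy game below (a misère subtraction game, invented here purely for illustration and not drawn from the chess literature) exhibits a pure zugzwang: with one token left, the player compelled to move would gladly pass, but the only legal move loses the game.

```python
from functools import lru_cache

# Misère subtraction game: from a pile of n tokens, the player to move
# MUST remove 1 or 2; whoever takes the last token loses.

@lru_cache(maxsize=None)
def mover_wins(n: int) -> bool:
    """True if the player obliged to move from a pile of n can force a win."""
    if n == 0:
        return True  # the opponent just took the last token and so lost
    # a position is a win if at least one legal move leaves the opponent a loss
    return any(not mover_wins(n - k) for k in (1, 2) if k <= n)

# Zugzwang: with n = 1 the mover would prefer to pass (then the opponent
# would have to take the last token), but the rules compel a move.
assert mover_wins(1) is False
# the losing ("zugzwang") piles turn out to be exactly n ≡ 1 (mod 3)
assert [n for n in range(1, 10) if not mover_wins(n)] == [1, 4, 7]
```

The recursion is the CGT model in miniature: two players, alternating compulsory moves, no chance, complete information, and every position evaluated purely as a win or a loss for the side to move.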

Serial zugzwanger Barnaby Joyce (b 1967; thrice (between local difficulties) deputy prime minister of Australia 2016-2022), Parliament House, Canberra.  More than many, Mr Joyce has had to sit and ponder what might at that moment be his “least worst” option.  He has made choices good and bad.

In politics and military conflicts (a spectrum condition according to the Prussian general and military theorist Carl von Clausewitz (1780–1831)), a zugzwang often is seen when parties are compelled to take their "least worst" option, even when circumstances dictate it would be better to "do nothing".  However, the zugzwang can lie in the eye of the beholder and that's why the unexpected Ardennes Offensive (Wacht am Rhein (Watch on the Rhine) the German code-name, though popularly known in the West as the Battle of the Bulge (December 1944-January 1945)) was ordered by Adolf Hitler (1889-1945; Führer (leader) and German head of government 1933-1945 & head of state 1934-1945).  It was the last major German strategic offensive of World War II (1939-1945) and among all but the most sycophantic of Hitler's military advisors it was thought not the "least worst" but rather a "worse than sensible" option (although not all the generals at the time concurred on what constituted "sensible").  Under the Nazi state's Führerprinzip (leader principle), authority in any institutional structure was vested in the designated leader, which meant ultimately that Hitler's rule was a personal dictatorship (although the extent of the fragmentation wasn't understood until after the war), so while the generals could warn, counsel & advise, ultimately decisions were based on the Führer's will, thus the Ardennes Offensive.

While the operation made no strategic sense to the conventionally-schooled generals, to Hitler it was compelling because the tide of the war had forced him to pursue the only strategy left: delay what appeared an inevitable defeat in the hope the (real but still suppressed) political tensions between his opponents would sunder their alliance, allowing him to direct his resources against one front rather than three (four if the battle in the skies is considered a distinct theatre, as many historians argue).  Like Charles Dickens' (1812–1870) Mr Micawber in David Copperfield (1849-1850), Hitler was hoping "something would turn up".  Because of the disparity in military and economic strength between the German and Allied forces, in retrospect the Ardennes Offensive appears nonsensical but, at the time, it was a rational tactic even if the strategy of "delay" was flawed.  Confronted as he was by attacks from the west, east and south, continuing to fight a defensive war would lead only to inevitable defeat; an offensive in the east was impossible because of the strength of the Red Army and even a major battlefield victory in the south would have had no strategic significance, so it was only in the west that a glimmer of success seemed to beckon.

The bulge.

In the last great example of the professionalism and tactical improvisation which was a hallmark of their operations during the war, the Wehrmacht (the German military) secretly assembled a large armored force (essentially under the eyes of the Allies) and staged a surprise attack through the Ardennes, aided immeasurably by the cover of heavy, low clouds which precluded both Allied reconnaissance and the deployment of their overwhelming strength in air-power.  Initially successful, the advance punched several holes in the line, the shape of which, when marked on a map, lent the campaign the name "Battle of the Bulge", but within days the weather cleared, allowing the Allies to bring that air power to bear almost unopposed.  This, combined with their vast military and logistical resources, doomed the Ardennes Offensive, inflicting losses from which the Wehrmacht never recovered: from mid-January on, German forces never regained the initiative, retreating on all fronts until the inevitable defeat in May.  A last throw of the dice, the offensive both failed and squandered precious (and often irreplaceable) resources badly needed elsewhere.  By December 1944, Hitler had been confronted with a zugzwang (of his own making) and while whatever he did would have made Germany's position worse, at least arguably the Ardennes Offensive was not even his "least worst" option.

Thursday, June 19, 2025

Macabre

Macabre (pronounced muh-kah-bruh, muh-kahb or muh-kah-ber)

(1) Gruesome or horrifying; grim; ghastly; horrible.

(2) Of, pertaining to, dealing with, or representing death, especially its grimmer or uglier aspects.

(3) Of or suggestive of the allegorical dance of death and related works of art.

1370s: From the French macabre, from the Middle French danse (de) Macabré, of uncertain origin.  It may have been influenced by the Medieval Latin chorēa Machabaeōrum (a representation of the deaths of Judas Maccabaeus and his brothers) but there's no documentary evidence (the Maccabees were a "liberation movement" who in the second and first centuries BC established Jewish independence in the Land of Israel).  In the popular imagination, the biblical Maccabees became associated with death because of the doctrines and prayers for the dead in 2 Maccabees 12:43-46, in which is discussed Judas Maccabeus sending money to Jerusalem as a "sin offering" for those of his soldiers who had fallen in battle while wearing idolatrous amulets, forbidden by Jewish law.  Theologically, the passage is controversial because not all accept the interpretations which focus on the significance of a Jewish belief in prayer for the dead and the concept of Purgatory as a place rather than a conceptual imagining.  The notion of "prayer & payments" as the means by which the dead could be "loosed from their earthly sins" so that in Purgatory their souls would undergo purification after death did become embedded in Christianity, later associated with the rampant corruption of clerical indulgences which would play a part in triggering the Reformation.  The alternative suggestion for the etymology is that the French form was (via the Spanish macabro) from the Arabic مَقَابِر (maqābir) (cemeteries), plural of مَقْبَرَة (maqbara) or مَقْبُرَة (maqbura).  Borrowing from the Arabic in plural form was not unusual (eg magazine, derived from the plural مخازن (maxāzin) of the Arabic singular noun مخزن (maxzan) (storehouse; depot; shop)) so etymologically the theory is possible but, like the Latin link, evidence wholly is lacking.

The abstracted sense of "characterized by gruesomeness" emerged in French in the 1840s and was picked up in English by at least 1889, dictionaries noting a racial sense from 1921.  The figurative use was extended by the sense of "a comedy that deals in themes and subjects usually regarded as serious or taboo", suggesting "something morbid".  The origin of that, although contested, is most associated with the French left and the new wave of the late 1950s (pièce noire, comédie noire), which may have been the source of the terms "black comedy" & "dark comedy" in English.  Words similar in meaning include spooky, ghastly, ghoulish, grisly, morbid, gruesome, weird, frightening, grim, lurid, cadaverous, deathly, dreadful, frightful, ghostly, hideous, horrible, offensive & scary.  The first known reference to "danse macabre" dates from 1376, in the poem Respit de la Mort (Respite from Death) by Jehan Le Fèvre, which contains the line Je fis de macabre la dance (I made the dance of Macabré):

Je fis de Macabre la danse,
Qui tout gent maine à sa trace
E a la fosse les adresse.

I made the dance of Macabré,
who leads all people in his train
and directs them to the grave.

The poet used it as a noun, inspired presumably by a near-death experience, but when it came into common use in the early-mid 1400s it was as an adjective, and during the Romantic era it assumed also meanings some distance from death (grotesque, tragic etc).  In late Middle English the spelling was Macabrees daunce (reflecting the influence of the Church) and the French pronunciation (with mute "e") was a misreading of the Middle French forms.  Macabre is an adjective, macabreness is a noun and macabrely is an adverb.  The spelling macaber is now so rare as to be functionally extinct and in popular culture macabre is used as a non-standard noun (the plural the macabres, on the model of the disparaging "the ghastlies").

Dance of Death

Danse Macabre of Basel (circa 1450), a memento mori painting by an unknown artist, Historisches Museum Basel (Basel Historical Museum), Barfüsserkirche, Basel, Switzerland.

The Danse Macabre (Dance of Death) was an artistic genre of allegory dating from the late Middle Ages; exploring the universality of death, it made clear that however exalted or humble one's station in life, death ultimately will visit all.  It was a popular artistic motif in European folklore and the most elaborated of all Medieval macabre art.  During the fourteenth century, Europe was beset by deathly horrors: recurring famines, the Hundred Years' War (1337-1453) and, looming over all, the Black Death, an outbreak of bubonic plague which between 1346-1353 may have killed as many as 50 million, making it one of history's most lethal pandemics.  In reducing the population of Europe by between a third and a half, its demographic, political and economic implications were felt for centuries.  In these difficult times, when death not infrequently would strike just about every family in some regions, the Danse Macabre culturally was assimilated across the continent, an omnipresent chance of either a sudden or a lingering, painful death spurring not only a religious desire for penance but also an urge to make the most of whatever time was left to one.

Macabre montage: Three images from Terry Richardson's (b 1965) suicide-themed shoot with Lindsay Lohan, 2012.

Especially during the fifteenth and early sixteenth centuries, the theme was a source of the vivid and stark paintings on the walls of churches and the cloisters of cemeteries and ossuaries.  Art of the Danse Macabre was typically a depiction of the personification of death summoning the doomed to dance along to the grave and the works featured characters from the exalted to the most humble; popes, emperors, lawyers, laborers & children all appearing, the popular motifs including hourglasses, skulls and extinguished candles.  Although the art was moral and allegorical, many works also had a satirical tone and, reflecting the mores of the times, although they made clear death finally would claim rich and poor alike, the living usually were arranged in an order following the conventional sense of precedence, popes, cardinals, kings, dukes and such at the head of the queue, blacksmiths, fellmongers and farm workers knowing their place; the cold gradations of decay, in the phrase of Dr Johnson (Samuel Johnson (1709-1784)).  The pieces were also among the multi-media productions of the medieval period, appearing variously in manuscript illustrations, printed books, paintings on canvas, wood & stone, engravings on stone and metal, woodcuts, sculpture, tapestry, embroidery & stained glass, as well as in prose & verse.  They were produced as mementos mori, a Latin phrase translated literally as "remember you will die".  That wasn't intended to be thought macabre but rather a gentle reminder of the brevity of life and the fragility of earthly existence, hopefully inspiring folk to live lives more fulfilling and purposeful.  The tradition, although it became increasingly detached from its religious associations, never died and has enjoyed periodic resurgences over the last six-hundred years, notably after horrific events such as epidemics or World War I (1914-1918).
The COVID-19 pandemic seemed not to stimulate similar art; popular culture’s preferred platforms have shifted.

The lure of macabre collectables 

It's macabrely ironic that the market for bits and pieces associated with RMS Titanic (1911-1912) continues to be buoyant and although for decades after the end of World War II (1939-1945) the trade in Nazi memorabilia flourished on both sides of the Atlantic, in recent years such collecting has attracted increasingly strident criticism and in some jurisdictions the (public) buying and selling of certain items has been banned.  There remains some tolerance for the trade in what would otherwise anyway be collectable (aircraft, armoured vehicles and such) and items of genuine historical significance (such as diplomatic papers) remain acceptable, but the circulation of mere ephemera with some Nazi link is increasingly condemned as macabre and the higher the prices paid, the more distasteful it's claimed to be.  Nor is it only material tainted by an association with the Nazis which is condemned by some as "trading in the macabre".

French racing driver Pierre Levegh (1905-1955) in Mercedes-Benz 300 SLR (chassis 0006/55, left), the wreckage after the fire finally was extinguished (centre) and the surviving Elektron panel (right).

In 2023, a battered metal panel from the Mercedes-Benz 300 SLR (W196S, chassis 0006/55) which crashed during the running of the 1955 Le Mans 24 Hour endurance classic sold at auction for US$37,000.  That would have been unremarkable except that it was in the aftermath of that crash that more than 80 spectators were killed and many more badly injured; it remains the most lethal single event in the history of the sport and one which led to some profound changes, many of which remain in force to this day.  Footage of the crash is available on-line and it will shock those accustomed to modern safety standards to see the cars continuing to race despite the carnage in the grandstand only metres away, the driver's corpse lying on the track and the wreckage of the 300 SLR continuing to burn, the water used by fire-fighters making the intensity of the fire worse because of the exotic Elektron (a magnesium alloy) used in the lightweight construction.  The surviving panel (a cover placed for aerodynamic advantage over the passenger-side of the cockpit) was retrieved by a track marshal and it remained in his family's possession until offered at auction by the nephew who inherited it.  Based on the unique underside markings, the factory confirmed the provenance and the auction house described it as "an authentic relic" from one of the "most exclusive models in the history of the automobile", its special significance coming from involvement in "one of the most significant events in the history of international motor sport".  Some thought it macabre to be trading in something which gained its notoriety from so much death but the interest in such stuff is long-standing, the Austin-Healey also involved in the incident selling in 2011 for US$1.3 million, although it had subsequently been repaired and continued to race, so it would anyway have been a collectable on the historic racing circuit, though doubtless it would have commanded a lower price.

US film star James Dean (1931–1955) with 1955 Porsche 550 Spyder (chassis 550-0055) shortly before his death, the 1955 Ford Country Squire with tandem-axle trailer the team's tow vehicle (left), the wrecked Porsche (centre) and the salvaged transaxle in display mounting (right).

The death toll need not be in the dozens for collectors to be drawn to relics associated with tragedy; one celebrity can be enough.  In 2021, the four-speed transaxle from film star James Dean's 1955 Porsche 550 Spyder (550-0055) sold in an on-line auction for US$382,000.  Again, based on the serial number (10 046) & part number (113 301 102), the factory verified the authenticity of the auction lot; only the transaxle had been salvaged from the wreck, the display stand and peripheral bits & pieces (axles, axle tubes, brake assemblies etc) all being fabricated.  The crash happened on SR (State Route) 466 (now SR 46) near Cholame, California, en route to October's upcoming Salinas Road Races; Mr Dean was driving to familiarize himself with his new 550 Spyder which, although mid-engined and thus with a preferable weight distribution compared with the rear-engined 356 he had previously campaigned, had characteristics different from those he had before experienced.  In the dimming light of the late afternoon, the Porsche collided with the passenger-side of a 1950 Ford Tudor (two-door sedan) which had just entered the highway, driven by California Polytechnic State University student Donald Turnupseed (1932-1995).  Mr Turnupseed (later cleared by authorities of any blame) suffered only minor injuries while Mr Dean, less than an hour later, was pronounced DoA (dead on arrival) at hospital.