Monday, July 7, 2025

Blazon

Blazon (pronounced bley-zuhn)

(1) In heraldry, an escutcheon or coat of arms or a banner depicting a coat of arms.

(2) In heraldry, a description (verbal or written or in an image) of a coat of arms.

(3) In heraldry, a formalized language for describing a coat of arms (the heraldic description of armorial bearings).

(4) An ostentatious display, verbal or otherwise.

(5) A description or recording (especially of the good qualities of a person or thing).

(6) In literature, verses which dwelt upon and described various parts of a woman's body (usually in admiration). 

(7) Conspicuously or publicly to set forth; display; proclaim.

(8) To adorn or embellish, especially brilliantly or showily.

(9) To depict (heraldic arms or the like) in proper form and color.

(10) To describe a coat of arms.

1275-1300: From the late thirteenth century Middle English blazon (armorial bearings, coat of arms), from the twelfth century Old French blason (shield; blazon (also “collar bone”)).  Of the words in the Romance languages (the Spanish blasón, the Italian blasone, the Portuguese brasão & the Provençal blezo), the first two are said to be French loan-words and the origins of all remain uncertain.  According to the OED (Oxford English Dictionary), the suggestion by nineteenth century French etymologists of connections with Germanic words related to the English blaze is dubious because of the sense disparities.  The verb blazon (to depict or paint (armorial bearings)) dates from the mid sixteenth century and was from the noun, from the French blasonner (from the French noun), or both.  In English, it had earlier in the 1500s been used to mean “to set forth descriptively”, especially (by at least the 1530s) specifically “to vaunt or boast”, and in that sense it was probably at least influenced by the English blaze.  Blazon & blazoning are nouns & verbs, blazoner, blazonry & blazonment are nouns and blazoned & blazonable are adjectives; the noun plural is blazons.

A coat of arms, possibly of dubious provenance. 

The now more familiar verb emblazon (inscribe conspicuously) seems first to have been used around the 1590s in the sense of “extol” and the still common related forms (emblazoning; emblazoned) emerged almost simultaneously.  The construct of emblazon was en- +‎ blazon (from the Old French blason in its primary sense of “shield”).  The en- prefix was from the Middle English en- (en-, in-), from the Old French en- (also an-), from the Latin in- (in, into).  It was also an alteration of in-, from the Middle English in-, from the Old English in- (in, into), from the Proto-Germanic in (in).  Both the Latin & Germanic forms were from the primitive Indo-European en (in, into).  The intensive use of the Old French en- & an- was due to confluence with the Frankish intensive prefix an-, which was related to the Old English intensive prefix on-.  It formed transitive verbs whose sense was to render the attached stem (1) in, into, (2) on, onto or (3) covered.  It was used also to denote “caused” or as an intensifier.  The prefix em- was (and still is) used before certain consonants, notably the labials “b” & “p”.

Google ngram: It shouldn’t be surprising there seems to have been a decline in the use of “blazon” while “emblazoned” has, by comparison, in recent decades flourished.  That would reflect matters of heraldry declining in significance, their appearance in printed materials correspondingly reduced in volume.  However, because of the way Google harvests data for its ngrams, they’re not literally a tracking of the use of a word in society but can be usefully indicative of certain trends (although one is never quite sure which trend(s)), especially over decades.  As a record of actual aggregate use, ngrams are not wholly reliable because: (1) the sub-set of texts Google uses is slanted towards the scientific & academic and (2) there are technical limitations imposed by the use of OCR (optical character recognition) when handling older texts of sometimes dubious legibility (a process AI should improve).  Where numbers bounce around, this may reflect either: (1) peaks and troughs in use for some reason or (2) some quirk in the data harvested.
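For those wanting to probe the same comparison, the Ngram Viewer can be queried programmatically.  Below is a minimal sketch in Python; note the JSON endpoint and its parameters (content, year_start, year_end, corpus, smoothing) are unofficial and undocumented, so everything here is an assumption liable to break rather than a stable API.

```python
# Hedged sketch: query Google's unofficial Ngram Viewer JSON endpoint.
# The URL, parameters and response shape are assumptions (undocumented
# and liable to change), not a supported API.
import requests

def ngram_frequencies(phrases, year_start=1800, year_end=2019):
    """Fetch a relative-frequency timeseries for each phrase."""
    resp = requests.get(
        "https://books.google.com/ngrams/json",
        params={
            "content": ",".join(phrases),
            "year_start": year_start,
            "year_end": year_end,
            "corpus": "en-2019",  # assumed corpus label
            "smoothing": 3,
        },
        timeout=30,
    )
    resp.raise_for_status()
    # Each entry is expected to carry an "ngram" name and a "timeseries"
    # of relative frequencies, one value per year.
    return {item["ngram"]: item["timeseries"] for item in resp.json()}

# Usage (uncomment to fetch the trajectories discussed above):
# series = ngram_frequencies(["blazon", "emblazoned"])
```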

Self-referential emblazoning: Lindsay Lohan's selfie of her modeling a sweater by Ashish, her visage emblazoned in sequins, London, November 2014.

Impressionistic though this assumption is, few would doubt “blazon” is now rare while “emblazoned” is far from uncommon.  While “emblazon” began with the meaning “that which the emblazoner does” (ie (1) to adorn prominently, (2) to inscribe upon and (3) to draw a coat of arms), it evolved by the mid-nineteenth century to the familiar modern sense of “having left in the mind a vivid impression” (often in the form “emblazoned on one’s memory”).  In English, there’s nothing unusual in a derived or modified form of a word becoming more common than its original root, even to the point where the original is rendered rare, unfamiliar or even obsolete, a phenomenon due to changes in usage patterns, altered conventions in pronunciation or shifts in meaning that make the derived form more practical or culturally resonant.  That’s just how English evolves.

Other examples include (1) ruthless vs ruth (ruth (pity; compassion) was once a common noun in Middle English but has long been extinct while ruthless, there being many who demand the description, remains popular), (2) unkempt vs kempt (kempt (neatly kept) would have been listed as extinct were it not for it finding a niche as a literary and poetic form; it has also been used humorously or ironically), (3) disheveled vs sheveled (sheveled was from the Old French chevelé (having hair) and was part of mainstream vocabulary as late as the eighteenth century but, except in jocular use, is effectively non-existent in modern English) and (4) redolent vs dolent (dolent (sorrowful), from the Latin dolēre (to feel pain), is long obsolete while redolent (evocative of; fragrant), from the Latin redolēre (to emit a scent), remains current; here the survivor resembles rather than derives from the lost form).

Etymologists think of these as part of the linguistic fossil record, noting there’s no single reason for the phenomenon beyond what survives being better adapted to cultural or conversational needs.  In that, these examples differ from the playful fork of back-formation which has produced (1) combobulate (a back-formation from discombobulate (to confuse or disconcert; to throw into a state of confusion), itself a humorous mock-Latin creation in mid-nineteenth century US English), (2) couth (a nineteenth century back-formation from uncouth and used as a humorous form meaning “refined”), (3) gruntled (a twentieth century back-formation meaning “happy or contented; satisfied”, the source being disgruntled (unhappy; malcontented); most sources indicate it first appeared in print in 1926 but the most celebrated example comes from PG Wodehouse (1881–1975) who in The Code of the Woosters (1938) penned: “He spoke with a certain what-is-it in his voice, and I could see that, if not actually disgruntled, he was far from being gruntled.”  Long a linguistic joke, gruntled is now taken seriously by some but the OED remains thus far unmoved) and (4) ept (a back-formation from inept (not proficient; incompetent or not competent (there is a functional difference between those two)), which was from the Middle French inepte, from the Latin ineptus).

Literary use

In literary use, “blazon” was a technical term used by the Petrarchists (devotees of Francis Petrarch (1304-1374), a scholar & poet of the early Italian Renaissance renowned for his love poems & sonnets and regarded also as one of the earliest humanists).  Blazon in this context (a subset of what literary theorists call “catalogue verse”) was adopted because, like the structured and defined elements of heraldic symbolism, Petrarch’s poems contained what might be thought an “inventory” of verses which dwelt upon and detailed the various parts of a woman's body; a sort of catalogue of her physical attributes.  Petrarch’s approach wasn’t new because as a convention in lyric poetry it was well-known by the mid thirteenth century, most critics crediting the tradition to the writings of Geoffrey of Vinsauf, a figure about whom little is known although it’s believed he was born in Normandy.  In England the Elizabethan sonneteers honed the technique as a devotional device, often, in imaginative ways, describing the bits of their mistresses they found most pleasing, a classic example being a fragment from Amoretti and Epithalamion (1595), a wedding day ode by the English poet Edmund Spenser (circa 1552-1599) to his bride (Elizabeth Boyle) in 1594:

Her goodly eyes like sapphires shining bright.
Her forehead ivory white,
Her cheeks like apples which the sun hath rudded,
Her lips like cherries charming men to bite,
Her breast like to a bowl of cream uncrudded,
Her paps like lilies budded,
Her snowy neck like to a marble tower,
And all her body like a palace fair.



Two bowls of cream uncrudded.

So objectification of the female form is nothing new and the poets saw little wrong with plagiarism, most of the imagery summoned being salvaged from the works of Antiquity by elegiac Roman and Alexandrian Greek poets.  Most relied for their effect on brevity, almost always a single, punchy line, and none seems ever to have attempted the scale of the “epic simile”.  As can be imagined, the novelty of the revival didn’t last and the lines soon were treated by readers (some of whom were fellow poets) as clichés to be parodied (a class which came to be called the “contrablazon”), the London-based courtier Sir Philip Sidney (1554–1586) borrowing from the Italian poet Francesco Berni (1497–1535) the trick of using terms in the style of Petrarch but “mixing them up”, thus creating an early form of body dysmorphia: Mopsa's forehead being “jacinth-like”, cheeks of “opal”, twinkling eyes “bedeckt with pearl” and lips of “sapphire blue”.

William Shakespeare (1564–1616) however saw other possibilities in the blazon and in Sonnet 130 (1609) turned the idea on its head, listing the imperfections in his mistress's parts and characteristics yet concluding, despite all that, he adored her like no other (here rendered in more accessible English):

My mistress' eyes are nothing like the sun;
Coral is far more red than her lips' red;
If snow be white, why then her breasts are dun;
If hairs be wires, black wires grow on her head.
I have seen roses damasked, red and white,
But no such roses see I in her cheeks;
And in some perfumes is there more delight
Than in the breath that from my mistress reeks.
I love to hear her speak, yet well I know
That music hath a far more pleasing sound;
I grant I never saw a goddess go;
My mistress, when she walks, treads on the ground.
   And yet, by heaven, I think my love as rare
   As any she belied with false compare.

Saturday, July 5, 2025

Futurism

Futurism (pronounced fyoo-chuh-riz-uhm)

(1) A movement in avant-garde art, developed originally by a group of Italian artists in 1909, in which forms (derived often from the then novel cubism) were used to represent rapid movement and dynamic motion (sometimes with initial capital letter).

(2) A style of art, literature, music, etc and a theory of art and life in which violence, power, speed, mechanization or machines, and hostility to the past or to traditional forms of expression were advocated or portrayed (often with initial capital letter).

(3) As futurology, a quasi-discipline practiced by (often self-described) futurologists who attempt to predict future events, movements, technologies etc.

(4) In the theology of Judaism, the expectation of the messiah in the future rather than recognizing him in the person of Christ.

(5) In the theology of Christianity, eschatological interpretations associating some Biblical prophecies with future events yet to be fulfilled, including the Second Coming.

1909: From the Italian futurismo (literally "futurism" and dating from circa 1909), the construct being futur(e) + -ism.  Future was from the Middle English future & futur, from the Old French futur (that which is to come; the time ahead), from the Latin futūrus (going to be; yet to be) which (as a noun) was the irregular suppletive future participle of esse (to be), from the primitive Indo-European bheue (to be, exist; grow).  It was cognate with the Old English bēo (I become, I will be, I am) and displaced the native Old English tōweard and the Middle English afterhede (future; literally “afterhood”) in the given sense.  The technical use in grammar (of tense) dates from the 1520s.  The –ism suffix was from the Ancient Greek ισμός (ismós) & -isma noun suffixes, often directly, sometimes through the Latin –ismus & isma (from where English picked up -ize) and sometimes through the French –isme or the German –ismus, all ultimately from the Ancient Greek (where it tended more specifically to express a finished act or thing done).  It appeared in loanwords from Greek, where it was used to form abstract nouns of action, state, condition or doctrine from verbs and on this model, was used as a productive suffix in the formation of nouns denoting action or practice, state or condition, principles, doctrines, a usage or characteristic, devotion or adherence (criticism; barbarism; Darwinism; despotism; plagiarism; realism; witticism etc).  Futurism & futurology are nouns, futurist is a noun & adjective and futuristic is an adjective; the noun plural is futurisms.

Lindsay Lohan in futuristic eyewear by Maison Martin Margiela (the label of Martin Margiela (b 1957)).

As a descriptor of the movement in art and literature, futurism (as the Italian futurismo) was adopted in 1909 by the Italian poet Filippo Tommaso Marinetti (1876-1944) and the first reference to futurist (a practitioner in the field of futurism) dates from 1911 although the word had been used as early as 1842 in Protestant theology in the sense of “one who holds that nearly the whole of the Book of Revelations refers principally to events yet to come”.  The secular world did begin to use futurist to describe "one who has (positive) feelings about the future" in 1846 but for the remainder of the century, use was apparently rare.  The (now probably extinct) noun futurity dates from the early seventeenth century.  The noun futurology was introduced by Aldous Huxley (1894-1963) in his book Science, Liberty and Peace (1946) and has (for better or worse) created a minor industry of (often self-described) futurologists.  In theology, the adjective futuristic came into use in 1856 with reference to prophecy but use soon faded.  In concert with futurism, by 1915 it referred in art to “avant-garde; ultra-modern” while by 1921 it was separated from the exclusive attachment to art and meant also “pertaining to the future, predicted to be in the future”, the use in this context spiking rapidly after World War II (1939-1945) when technological developments in fields such as ballistics, jet aircraft, space exploration, electronics, nuclear physics etc stimulated interest in such progress.

Untouched: Crooked Hillary Clinton (b 1947; US secretary of state 2009-2013) & Bill Clinton (b 1946; US president 1993-2001) with cattle, 92nd Annual Hopkinton State Fair, Contoocook, New Hampshire, September 2007.

Futures, a financial instrument used in the trade of currencies and commodities, appeared first in 1880; they allow (1) speculators to bet on price movements and (2) producers and sellers to hedge against price movements and in both cases profits (and losses) can be booked against movement up or down.  Futures trading can be lucrative but is also risky, those who win gaining from those who lose, and those in the markets are usually professionals.  The story behind crooked Hillary Clinton's extraordinary profits in cattle futures (not a field in which she’d previously (or has subsequently) displayed interest or expertise) while “serving” as First Lady of Arkansas (1979–1981 & 1983–1992) remains murky but it can certainly be said that for an apparent “amateur” dabbling in a market played usually by experienced professionals, she was remarkably successful and while perhaps there was some luck involved, her trading record was such it’s a wonder she didn’t take it up as a career.  While many analysts have, based on what documents are available, commented on crooked Hillary’s somewhat improbable (and apparently sometimes “irregular”) foray into cattle futures, there was never an “official governmental investigation” by an independent authority and thus no adverse findings have ever been published.
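The zero-sum arithmetic described above is simple enough to sketch in code.  Below is a minimal illustration in Python: the prices, contract count and the 40,000 lb contract size are hypothetical stand-ins for the purpose of the example, not a record of any actual trade.

```python
# Minimal sketch (hypothetical numbers): futures P&L is symmetrical,
# booked against movement up or down; one side's gain is the other's loss.
def futures_pnl(entry_price: float, exit_price: float, contracts: int,
                units_per_contract: int, long: bool = True) -> float:
    """P&L = (exit - entry) x total units, with the sign flipped for a short."""
    move = exit_price - entry_price
    if not long:
        move = -move
    return move * contracts * units_per_contract

# A speculator long 10 hypothetical cattle contracts of 40,000 lb each,
# bought at $0.60/lb and sold at $0.66/lb:
print(futures_pnl(0.60, 0.66, 10, 40_000))               # ~ +24,000 (gain)
# The hedger holding the opposite (short) side books the mirror image:
print(futures_pnl(0.60, 0.66, 10, 40_000, long=False))   # ~ -24,000 (loss)
```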

The Arrival (1913), oil on canvas by Christopher Richard Wynne Nevinson (1889-1946), Tate Gallery.

Given what would unfold during the twentieth century, it’s probably difficult to appreciate quite how optimistic was the Western world in the years leading up to World War I (1914-1918).  Such had been the rapidity of the discovery of novelties and of progress in so many fields that expectations of the future were high and, beginning in Italy, futurism was a movement devoted to displaying the energy, dynamism and power of machines and the vitality and change they were bringing to society.  It’s also often forgotten that when the first futurist exhibition was staged in Paris in 1912, the critical establishment was unimpressed, the elaborate imagery with its opulence of color offending their sense of refinement, by then so attuned to the sparseness of the cubists.

The Hospital Train (1915), oil on canvas by Gino Severini (1883-1966), Stedelijk Museum.

Futurism had debuted with some impact, the Paris newspaper Le Figaro in 1909 publishing the manifesto by the Italian poet Filippo Tommaso Marinetti which dismissed all that was old and celebrated change, originality and innovation in culture and society, something which should be depicted in art, music and literature.  Marinetti exulted in the speed and power of the new technologies which were disrupting society: automobiles, aeroplanes and other clattering machines.  Whether he found beauty in the machines or in the violence and conflict they delivered was something he left his readers to decide and there were those seduced by both, but his stated goal was the repudiation of traditional values and the destruction of cultural institutions such as museums and libraries.  Whether this was intended as a revolutionary roadmap or just a provocation to inspire anger and controversy is something historians have debated.  Assessment of Marinetti as a poet has always been colored by his reputation as a proto-fascist and some treat as "fake mysticism" his claim that his "visions" of the future and the path to follow to get there came to him in the moment of a violent car crash.

Futurismo: Uomo Nuovo (New Man, 1918), drawing by Mario Sironi (1885-1961).

As a technique, the futurist artists borrowed much from the cubists, deploying the same fragmented and intersecting plane surfaces and outlines to render a number of simultaneous, overlaid views of an object but whereas the cubists tended to still life, portraiture and other, usually static, studies of the human form, the futurists worshiped movement, their overlays a device to depict rhythmic spatial repetitions of an object’s outlines during movement.  People did appear in futurist works but usually they weren’t the focal point, instead appearing only in relation to some speeding or noisy machine.  Some of the most prolific of the futurist artists were killed in World War I and as a political movement it didn’t survive the conflict, the industrial war dulling the public appetite for the cult of the machine.  However, the influence of the compositional techniques continued in the 1920s and contributed to art deco which, in more elegant form, would integrate the new world of machines and mass-production into motifs still in use today.

Motociclista (Motorcyclist, circa 1924), oil on canvas by Mario Sironi.

By the early twentieth century when the Futurism movement emerged, machines and mechanisms were already hundreds of years old (indeed the precursor devices pre-date Christ) but what changed was that the new generations of machines had become sexy (at least in the eyes of men), associated as they were with something beyond mere functionalism: speed and style.  While planes, trains & automobiles all attracted the futurists, the motorcycle was a much-favored motif because it possessed an intimacy beyond other forms of transportation in that, literally, it was more an extension of the human body, the rider at speed conforming to the shape of a structure fashioned for aerodynamic efficiency, with hands and feet directly attached to the vital controls: machine as extension of man.

The Modern Boy No. 100, Vol 4, Week Ending 4 January, 1930.

The Modern Boy (1928-1939) was, as the name implies, a British magazine targeted at males aged 12-18 and the content reflected the state of mind in the society of the inter-war years, the 1930s a curious decade of progress, regression, hope and despair.  Although much of what filled the pages (guns, military conquest and other exploits, fast cars and motorcycles, stuff the British were doing in other peoples’ countries) would today see the editors cancelled or visited by one of the many organs of the British state concerned with the suppression of such things, it was what readers (presumably with the acquiescence of their parents) wanted.  Best remembered of the authors whose works appeared in The Modern Boy was Captain W.E. Johns (1893–1968), a World War I RFC (Royal Flying Corps) pilot who created the fictional air-adventurer Biggles.  The first Biggles tale appeared in 1932 in Popular Flying magazine (released also as Popular Aviation and still in publication as Flying) and his stories are still sometimes re-printed (although with the blatant racism edited out).  The first Biggles story had a very modern-sounding title: The White Fokker.  The Modern Boy was a successful weekly which in 1938 was re-launched as Modern Boy, the reason for the change not known although dropping superfluous words (and much else) was a feature of modernism.  In October 1939, a few weeks after the outbreak of World War II, publication ceased, Modern Boy like many titles a victim of restrictions by the Board of Trade on the supply of paper for civilian use.

Jockey Club Innovation Tower, Hong Kong (2013) by Zaha Hadid (1950-2016).

If the characteristics of futurism in art were identifiable (though not always admired), in architecture it can be hard to tell where modernism ends and futurism begins.  Aesthetics aside, the core purpose of modernism was of course its utilitarian value and that did tend to dictate the austerity, straight lines and crisp geometry that evolved into mid-century minimalism, so modernism, in its pure form, should probably be thought of as a style without an ulterior motive.  Futurist architecture however carried an agenda which in its earliest days borrowed from the futurist artists in that it was an assault on the past but later moved on and in the twenty-first century the futurist architects seem to be interested above all in the possibilities offered by advances in structural engineering, functionality sacrificed if need be just to demonstrate that something new can be done.  That's doubtless of great interest at awards dinners where architects give prizes to each other for this and that but has produced an international consensus that it's better to draw something new than something elegant.  The critique is that while modernism once offered “less is more”, with neo-futurist architecture it's now “less is a bore”.  Art deco and mid-century modernism have aged well and it will be interesting to see how history judges the neo-futurists.

Thursday, July 3, 2025

Zugzwang

Zugzwang (pronounced tsook-tsvahng)

(1) In chess, a situation in which a player is limited to moves that cost pieces or have a damaging positional effect.

(2) A situation in which whatever is done makes things worse (applied variously to sport, politics, battlefield engagements etc).

(3) A situation in which one is forced to act when one would prefer to remain passive and thus a synonym of the German compound noun Zugpflicht (the rule that a player cannot forgo a move).

(4) In game theory, a move which changes the outcome from win to loss.

Circa 1858 (1905 in English): A modern German compound, the construct being Zug + Zwang.  Zug (move) was from the Middle High German zuc & zug, from the Old High German zug, from the Proto-Germanic tugiz, an abstract noun belonging to the Proto-Germanic teuhaną, from the primitive Indo-European dewk (to pull, lead); it was cognate with the Dutch teug and the Old English tyge.  Zwang (compulsion; force; constraint; obligation) was from the Middle High German twanc, from the Old High German geduang.  It belongs to the verb zwingen and cognates include the Dutch dwang and the Swedish tvång.  The word is best understood as "compulsion to move" or, in the jargon of chess players: "Your turn to move and whatever you do it'll make things worse for you", thus the application to game theory, military strategy and politics where there's often a need to determine the "least worst option".  Zugzwang is a noun; the noun plural is Zugzwänge.  In English, derived forms such as zugzwanged, zugzwanging, zugzwangish, zugzwanger, zugzwangesque, zugzwangee et al are non-standard and used usually for humorous effect.

Chess and Game Theory

Endgame: Black's turn and Zugzwang! Daily Chess Musings depiction of the elegance of Zugzwang.

The first known use of Zugzwang in the German chess literature appears in 1858; the first appearance in English in 1905.  However, the concept of Zugzwang had been known and written about for centuries, the classic work being the Italian chess player Alessandro Salvio's (circa 1575–circa 1640) study of endgames published in 1604, which referenced Shatranj writings from the early ninth century, some thousand years before the first known use of the term.  Positions with Zugzwang are not rare in chess endgames, best known in the king-rook & king-pawn conjunctions.  Positions of reciprocal Zugzwang are important in the analysis of endgames but although the concept is easily demonstrated and understood, that's true only of the "simple Zugzwang"; the so-called "sequential Zugzwang" typically is a multi-move affair which demands an understanding of dozens of permutations of possibilities.

Rendered by Vovsoft as a cartoon character: a brunette Lindsay Lohan at the chessboard.  In her youth, she was a bit of a zugzwanger.

Zugzwang describes a situation where one player is put at a disadvantage because they have to make a move although the player would prefer to pass and make no move.  The fact the player must move means their position will be significantly weaker than the hypothetical one in which it is the opponent's turn to move.  In game theory, it specifically means a move which directly changes the outcome of the game from a win to a loss.  Chess textbooks often cite as the classic Zugzwang a match in Copenhagen in 1923; on that day the German Grandmaster (the title inaugurated in 1950) Friedrich Sämisch (1896–1975) played White against the Latvian-born Dane Aron Nimzowitsch (1886-1935).  Playing Black, Nimzowitsch didn’t play a tactical match in the conventional sense but instead built positional advantage, gradually limiting his opponent’s options until, as the endgame was reached, White was left with no move which didn’t worsen his position; whatever he chose would lead either to material loss or strategic collapse and it’s said that in his notebook, Nimzowitsch concluded his entry on the match with “Zugzwang!”  A noted eccentric in a discipline where idiosyncratic behaviour is not unknown, the Polish Grandmaster Savielly Tartakower (1887-1956) observed of Nimzowitsch: “He pretends to be crazy in order to drive us all crazy.”

French sculptor Auguste Rodin's (1840-1917) The Thinker (1904), Musée Rodin, Paris (left) and Boris Johnson (b 1964; UK prime-minister 2019-2022) thinking about which would be his least worst option (right).

In its classic form chess is a game between two, played with fixed rules on a board with a known number of pieces (32) and squares (64).  Although a count of the possible permutations in a match would yield a very big number, in chess the concept of Zugzwang is simple and understood the same way by those playing black and white; information for both sides is complete and while the concept can find an expression in both combinatorial game theory (CGT) and classical game theory (GT), the paths can be different.  CGT and GT (the latter historically a tool of economic modelers and strategists in many fields) are both mathematical studies of behaviour which can be imagined as “game-like” but differ in focus, assumptions and applications.  In CGT the basic model (as in chess) is of a two-player deterministic game in which the moves alternate and luck or chance is not an element.  This compares with GT, in which there may be any number of players, moves may be simultaneous, the option exists not to move, information known to players may be incomplete (or asymmetric) and luck & chance exist among many variables (which can include all of Donald Rumsfeld’s (1932–2021; US defense secretary 1975-1977 & 2001-2006) helpful categories: known knowns, known unknowns, unknown unknowns & (most intriguingly) unknown knowns).  So, while CGT is a good device for deconstructing chess and such because such games are of finite duration and players focus exclusively on “winning” (and if need be switching to “avoiding defeat”), GT is a tool which can be applied to maximize advantage or utility in situations where a win/defeat dichotomy is either not sought or becomes impossible.  The difference then is that CGT envisages two players seeking to solve a deterministic puzzle on a win/lose basis while GT is there to describe & analyse strategic interactions between & among rational actors, some or all of whom may be operating with some degree of uncertainty.
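That deterministic, complete-information quality means the defining test for Zugzwang can be expressed in a few lines of code.  Below is a toy sketch in Python (the position, move and evaluation functions are hypothetical stand-ins supplied by the caller, not a real chess engine): a position is zugzwang-like when every legal move leaves the mover worse off than the hypothetical “pass” the rules forbid.  Chess programmers meet the same idea from the other direction: the common null-move pruning heuristic is unsound precisely in such positions.

```python
# Toy sketch: Zugzwang as "every legal move is worse than standing pat".
# All callables are hypothetical stand-ins supplied by the caller.
def is_zugzwang(position, legal_moves, apply_move, evaluate):
    """True if the side to move would prefer a (forbidden) pass:
    every legal move lowers the evaluation below the stand-pat value."""
    stand_pat = evaluate(position)  # value if a null move were legal
    best_after_moving = max(
        evaluate(apply_move(position, m)) for m in legal_moves(position)
    )
    return best_after_moving < stand_pat

# Trivial toy model: the "position" is a score and every move costs points,
# so the obligation to move is itself the disadvantage.
print(is_zugzwang(
    5,                   # current position/score
    lambda p: [1, 2],    # legal moves: lose 1 or 2 points
    lambda p, m: p - m,  # applying a move reduces the score
    lambda p: p,         # evaluation is the score itself
))  # -> True
```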

Serial zugzwanger Barnaby Joyce (b 1967; thrice (between local difficulties) deputy prime minister of Australia 2016-2022), Parliament House, Canberra.  More than many, Mr Joyce has had to sit and ponder what might at that moment be his “least worst” option.  He has made choices good and bad.

In politics and military conflicts (a spectrum condition according to the Prussian general and military theorist Carl von Clausewitz (1780–1831)), a zugzwang often is seen as parties are compelled to take their “least worst” option, even when circumstances dictate it would be better to “do nothing”.  However, the zugzwang can lie in the eye of the beholder and that is why the unexpected Ardennes Offensive (code-named Wacht am Rhein (Watch on the Rhine) though popularly known in the West as the Battle of the Bulge; December 1944-January 1945) was ordered by Adolf Hitler (1889-1945; Führer (leader) and German head of government 1933-1945 & head of state 1934-1945).  It was the last major German strategic offensive of World War II (1939-1945) and among all but the most sycophantic of Hitler’s military advisors it was thought not the “least worst” but rather a “worse than the sensible” option (although not all the generals at the time concurred with what constituted “sensible”).  Under the Nazi state’s Führerprinzip (leader principle) the concept was that in any institutional structure authority was vested in the designated leader and that meant ultimately Hitler’s rule was a personal dictatorship (although the extent of the fragmentation wasn’t understood until after the war) so while the generals could warn, counsel & advise, ultimately decisions were based on the Führer’s will, thus the Ardennes Offensive.

While the operation made no strategic sense to the conventionally-schooled generals, to Hitler it was compelling because the tide of the war had forced him to pursue the only strategy left: delay what appeared an inevitable defeat in the hope the (real but still suppressed) political tensions between his opponents would sunder their alliance, allowing him to direct his resources against one front rather than three (four if the battle in the skies is considered a distinct theatre, as many historians argue).  Like Charles Dickens’ (1812–1870) Mr Micawber in David Copperfield (1849-1850), Hitler was hoping “something would turn up”.  Because of the disparity in military and economic strength between the German and Allied forces, in retrospect the Ardennes Offensive appears nonsensical but, at the time, it was a rational tactic even if the strategy of “delay” was flawed.  Confronted as he was by attacks from the west, east and south, continuing to fight a defensive war would lead only to an inevitable defeat; an offensive in the east was impossible because of the strength of the Red Army and even a major battlefield victory in the south would have had no strategic significance, so it was only in the west a glimmer of success seemed to beckon.

The bulge.

In the last great example of the professionalism and tactical improvisation which was a hallmark of their operations during the war, the Wehrmacht (the German military) secretly assembled a large armored force (essentially under the eyes of the Allies) and staged a surprise attack through the Ardennes, aided immeasurably by the cover of heavy, low clouds which precluded both Allied reconnaissance and the deployment of their strength in air-power.  Initially successful, the advance punched several holes in the line, the shape of which, when marked on a map, lent the campaign the name “Battle of the Bulge” but within days the weather cleared, allowing the Allies to unleash almost unopposed their overwhelming superiority in the air.  This, combined with their vast military and logistical resources, doomed the Ardennes Offensive, inflicting losses from which the Wehrmacht never recovered: from mid-January on, German forces never regained the initiative, retreating on all fronts until the inevitable defeat in May.  A last throw of the dice, the offensive both failed and squandered precious (and often irreplaceable) resources badly needed elsewhere.  By December 1944, Hitler had been confronted with a zugzwang (of his own making) and while whatever he did would have made Germany’s position worse, at least arguably, the Ardennes Offensive was not even his “least worst” option.

Tuesday, July 1, 2025

Sherry

Sherry (pronounced sher-ee)

(1) A fortified, amber-colored wine, originally from the Jerez region of southern Spain or any of various similar wines made elsewhere; usually drunk as an apéritif.  Technically, a white wine.

(2) A female given name, a form of Charlotte.

(3) A reddish color in the amber-brown spectrum.

1590-1600: A (mistaken singular) back formation from the earlier sherris (1530s), from the Spanish (vino de) Xeres ((wine from) Xeres).  Xeres is now modern-day Jerez (the Roman urbs Caesaris) in Spain, near the port of Cadiz, where the wine was made.  The official name is Jerez-Xérès-Sherry, one of Spain's wine regions, a Denominación de Origen Protegida (DOP).  The word sherry is an anglicisation of Xérès (Jerez) and the drink was previously known as sack, from the Spanish saca (extraction), a reference to extraction from the solera.  In EU law, sherry has protected designation of origin status and, under Spanish law, to be so labelled, the product must be produced in the "Sherry Triangle", an area in the province of Cádiz between Jerez de la Frontera, Sanlúcar de Barrameda and El Puerto de Santa María.  In 1933 the Jerez denominación de origen was the first Spanish denominación officially thus recognized, named D.O. Jerez-Xeres-Sherry and sharing the same governing council as D.O. Manzanilla Sanlúcar de Barrameda.  The name "sherry" continues to be used by US producers where, to conform to domestic legislation, it must be labeled with a region of origin (such as Oregon Sherry) but such products can’t be sold in the EU (European Union) because of the protected status laws.  Both Canadian and Australian winemakers now use the term Apera instead of Sherry, although customers seem still to favor the original.  Sherry is a noun; the noun plural is sherries.

Sherry Girl (in bold two-copa ‘Sherry Stance’) and the ultimate sherry party.

Held annually since 2014 (pandemics permitting), Sherry Week is a week-long celebration of “gastronomical and cultural events” enjoyed by the “vibrant global Sherry community” which gathers to “showcase the wine’s incredible diversity, from the dry crispness of Fino to the velvety sweetness of Cream.”  Although the multi-venue Sherry Week is now the best-known meeting on the Sherry calendar, worldwide, since 2014 some 20,000 events have taken place with the approval of the Consejo Regulador for Jerez-Xérès-Sherry and Manzanilla; to date there have been more than half a million attendees and in 2024 alone there were over 3,000 registered events in 29 countries in cities including London, Madrid, São Paulo, Tokyo, Buenos Aires, Auckland and Shanghai.  Daringly, the publicity for the 2025 gatherings introduced “Sherry Girl” whose “bold two-copa ‘Sherry Stance’” is now an icon for the drink.  Sherry Girl is new but dedicated sherryphiles will be pleased to learn the traditional “Sherry Ruta” (Sherry route) remains on the schedule, again in “multi-venue routes offering exclusive pairing experiences”, described as “not a typical wine crawl but a triumphant strut with tipples, tastings, and tapas.”  For the adventurous, participants are able to use the interactive venue map to curate their own Sherry Ruta in their city of choice.  The 2025 event will be held between 3-9 November.

Dry Sack, a sherry preferred by many because of its balance; straddling sweet and dry.  Purists tend to the dry finos while sweeter cream sherries are recommended for neophytes.

The name "sherry" continues to be used by US producers where, to conform to domestic legislation, it must be labeled with a region of origin such as Oregon Sherry but can’t be sold in the EU because of their protected status laws.  Both Canadian and Australian winemakers now use the term Apera instead of Sherry, although customers seem still to favor the original.  For the upper-middle class and beyond, sherry parties were a fixture of late-Victorian and Edwardian social life but the dislocations of the World War I (1914-1918) seemed to render them extinct. It turned out however to be a postponement and sherry parties were revived, the height of their popularity being enjoyed during the 1930s until the post-war austerity the UK endured after World War II (1939-1945) saw them a relic restricted moistly to Oxbridge dons, the genuinely still rich, Church of England bishops and such although they never quite vanished and those who subscribe to magazines like Country Life or Tatler probably still exchange invitations to each other's sherry parties.

For Sherry and Cocktail Parties, trade literature by Fortnum and Mason, Regent Street, Piccadilly, London, circa 1936.  The luxury department store, Fortnum & Mason, used the services of the Stuart Advertising Agency, which employed designers to produce witty and informative catalogues and the decorative art is illustrative of British commercial art in this period.

For the women who tended to be hostess and organizer, there were advantages compared with the tamer tea party.  Sherry glasses took less space than cups of tea, with all the associated paraphernalia of spoons, milk and sugar and, it being almost impossible to eat and drink while balancing a cup and saucer and conveying cake to the mouth, the tea party demanded tables and chairs.  The sherry glass and finger-food were easier because while one must sit for tea, one can stand for sherry, so twice the number of guests could be asked.  Sherry parties indeed needed to be tightly packed affairs, the mix of social intimacy and alcohol encouraging mingling, and they also attracted more men, for whom the offer of tea held little attraction.  The traditional timing between six and eight suited the male lifestyle of the time and the men were doubtless more attracted to women drinking sherry than women drinking tea for while the raffish types knew it wasn't quite the "leg-opener" gin was renowned to be, every little bit helps.

In hair color and related fields, "sherry red" (not to be confused with the brighter "cherry red") is a rich hue on the spectrum from amber to dark brown: Lindsay Lohan (who would be the ideal "Sherry Girl" model) demonstrates on the red carpet at the Liz & Dick premiere, Los Angeles, 2012.

Sherry party planner.

Novelist Laura Troubridge (Lady Troubridge (née Gurney; 1867-1946)), who in 1935 published what became the standard English work on the topic, Etiquette and Entertaining: to help you on your social way, devoted an entire chapter to the sherry party.  She espoused an informal approach as both cheap and chic, suggesting guests be invited by telephone or with “Sherry, six to eight” written on a visiting card and popped in an envelope.  She recommended no more than two-dozen guests, a half-dozen bottles of sherry, a couple of heavy cut-glass decanters and some plates of “dry and biscuity” eats: cheese straws, oat biscuits, cubes of cheddar.  This, she said, was enough to supply the makings of a “…jolly kind of party, with plenty of cigarettes and talk that will probably last until half past seven or eight.”

Cocktail Party by Laurence Fellows (1885-1964), Esquire magazine, September 1937.

The sherry party should not be confused with the cocktail party.  Cocktail parties in drawing rooms at which Martinis were served were often much more louche affairs.  Note the elegantly sceptical expressions on the faces of the women, all of whom have become inured to the tricks of “charming men in suits”.  For women, sherry parties were more welcoming places.