
Saturday, November 29, 2025

Grammatology

Grammatology (pronounced gram-uh-tol-uh-jee)

(1) Historically, the scientific study of systems of writing.

(2) In latter-day use, a critique of orthodox linguistics.

Early 1800s (in its original sense): The construct was gramma(r) + -t- + -ology; the modern (some would say post-modern) re-purposing was first used in 1967.  Dating from the mid fourteenth century, grammar was from the Middle English gramery & gramere, from the Old French gramaire (classical learning), from the unattested Vulgar Latin grammāria, an alteration of the Classical Latin grammatica, from the Ancient Greek γραμματική (grammatikḗ) (skilled in writing), from γράμμα (gramma) (line of writing), from γράφω (gráphō) (write), from the primitive Indo-European gerbh (to carve, to scratch).  It displaced the native Old English stæfcræft; a doublet of glamour, glamoury, gramarye & grimoire.  In English, grammar is used to describe the system of rules and principles for the structure of a language (or of languages in general) but in colloquial use it’s applied also to morphology (the internal structure of words) and syntax (the structure of phrases and sentences of a language).  In English, generative grammar (the body of rules producing all the sentences permissible in a given language, while excluding all those not permissible) has for centuries been shifting and it’s now something policed by the so-called “grammar Nazis”, some of whom insist on enforcing “rules” regarded by most as defunct as early as the nineteenth century.

The suffix -ology was formed from -o- (as an interconsonantal vowel) +‎ -logy.  The origin in English of the -logy suffix lies with loanwords from the Ancient Greek, usually via Latin and French, where the suffix (-λογία) is an integral part of the word loaned (eg astrology from astrologia), a pattern established since the sixteenth century.  French picked up -logie from the Latin -logia, from the Ancient Greek -λογία (-logía).  Within Greek, the suffix is an -ία (-ía) abstract from λόγος (lógos) (account, explanation, narrative), and that a verbal noun from λέγω (légō) (I say, speak, converse, tell a story).  In English the suffix became extraordinarily productive, used notably to form names of sciences or disciplines of study, analogous to the names traditionally borrowed from the Latin (eg astrology from astrologia; geology from geologia) and by the late eighteenth century, the practice (despite the disapproval of the pedants) extended to terms with no connection to Greek or Latin such as those building on French or German bases (eg insectology (1766) after the French insectologie; terminology (1801) after the German Terminologie).  Within a few decades of the intrusion of modern languages, combinations emerged using English terms (eg undergroundology (1820); hatology (1837)).  In this evolution, the development may be thought similar to the latter-day proliferation of “-isms” (fascism; feminism et al).  Grammatology & grammatologist are nouns, grammatological is an adjective and grammatologically is an adverb; the noun plural is grammatologies.

Google ngram (a quantitative and not qualitative measure): Because of the way Google harvests data for their ngrams, they’re not literally a tracking of the use of a word in society but can be usefully indicative of certain trends (although one is never quite sure which trend(s)), especially over decades.  As a record of actual aggregate use, ngrams are not wholly reliable because: (1) the sub-set of texts Google uses is slanted towards the scientific & academic and (2) the technical limitations imposed by the use of OCR (optical character recognition) when handling older texts of sometimes dubious legibility (a process AI should improve).  Where numbers bounce around, this may reflect either: (1) peaks and troughs in use for some reason or (2) some quirk in the data harvested.

Grammatology in its re-purposed sense was from the French grammatologie, introduced to the world by French philosopher Jacques Derrida (1930-2004) in his book De la grammatologie (Of Grammatology (1967)).  It may be unfair to treat Derrida’s use as a “re-purposing” because although the word grammatology (literally “the study of writing”) had existed since the early nineteenth century, it was a neologism, one of an expanding class of “-ology” words (some of them coined merely for ironic or humorous effect) and there was prior to 1967 scant evidence of use, those studying languages, literature or linguistics able satisfactorily to undertake their work without much needing “grammatology”.  On the basis of the documents thus far digitized, “grammatology” was never an accepted or even commonly used term in academia and although it seems occasionally to have been used variously in fields related to “the study of writing systems” (apparently as a synonym for paleography, epigraphy, writing-system classification or orthographic description) it was only in passing.  Until the modern era, words “going viral” happened relatively infrequently and certainly slowly and, as used prior to 1967, “grammatology” was attached to no theoretical construct or school of thought and described no defined discipline, the word indicative, empirical and neutral.  If “pre-modern” grammatology could be summed up (a probably dubious exercise), it would be thought a technical term for those concerned with scripts, alphabets, symbols and the historical development of writing systems.  Tempting though it may seem, it cannot be thought of as proto-structuralism.

The novelty Derrida introduced was to argue the need for a discipline examining the history, structure and philosophical implications of writing, his particular contention that writing is not secondary to speech, a notion at odds with centuries of Western metaphysics.  At the time, it was seen as a radical departure from orthodoxy, Derrida exploring (in the broadest imaginable way), the possibilities of writing, not simply the familiar physical inscriptions, but anything that functions as “trace,” “différance,” or symbolic marking, the core argument being writing is not secondary to speech (although in the narrow technical sense it may be consequent); rather, it reveals the instability and “constructedness” of language and thereby meaning.

De la grammatologie (First edition, 1967) by Jacques Derrida.

Ambitiously, what Derrida embarked upon was to do to the study something like what Karl Marx (1818-1883) claimed to have done to the theories of Hegel (Georg Wilhelm Friedrich Hegel (1770-1831)): “turn things on their head”, a process that can be classified under four themes: (1) Writing as prior to speech (as opposed to the earlier “Writing is derivative of speech”).  What this meant was writing had to be considered as “originary”, implying structures of difference could precede both writing and speech. (2) Writing (the act as opposed to the content) as a philosophical concept rather than a finite collection of technical objects to be interpreted or catalogued on the basis of their form of assembly.  (3) Grammatology becomes a critique (as opposed to the earlier descriptive tool) of science, reimagining it as a critical discipline exposing the logocentrism of Western thought.  Logocentrism describes the tendency to prioritize “logos” (in academic use a word encompassing words, speech or reason), as the ultimate foundation for truth and meaning (with speech often privileged over writing).  Logocentrism was at the core of the Western philosophical tradition that assumed language accurately and directly can express an external reality, the companion notion being rational thought represents the highest form of knowledge.  Derrida labelled this a false hierarchy that devalued writing and other non-verbal forms of communication and feeling. (4) Writing is expanded beyond literal inscriptions.  Whereas the traditional Western view had been that writing was simply the use of an alphabet, cuneiform, hieroglyphs and such, what Derrida suggested was the concept of writing should be extended to any system of differences, traces, or marks; the condition for meaning itself.

So Derrida took grammatology from a dusty corner of the academy where it meant (for the small number of souls involved) something like “a hypothetical technical study of writing systems” and re-invented it as a philosophical discipline analysing the deeper structures that make any representation or meaning possible.  The notion of it as a tool of analysis is important because deconstruction, the word Derrida and other “celebrity philosophers” made famous (or infamous depending on one’s stance on things postmodern) is often misunderstood as something like “destruction” when really it is a form of analysis.  Had Derrida’s subversive idea been presented thirty years earlier (had the author been able to find a publisher), it’s possible it would have been ignored or dismissed by the relatively few who then read such material.  However, in the post-war years there was an enormous expansion in both the number of universities and the cohorts of academics and students studying in fields which would come to be called “critical theory” so there was a receptive base for ideas overturning orthodoxy, thus the remarkable path deconstruction and postmodernism for decades tracked.

Deconstruction in art, Girl With Balloon by street artist Banksy, before, during & after a (successful) test deconstruction (left) and in its final form (right), London, October 2018.

There is an ephemeral art movement but usually it involves works which wholly are destroyed or entirely disappear.  Banksy’s Girl With Balloon belonged to a sub-category where (1) the deconstruction process was part of the art and (2) the residual elements were “the artwork”.  Banksy’s trick with this one was as the auctioneer’s hammer fell (at Stg£1m), an electric shredder concealed at the base of the frame was activated, the plan being to reduce the work “to shreds” in a pile below.  However, it’s claimed there was a technical glitch and the shredder stopped mid-shred, meaning half remained untouched and half, neatly sliced, hung from the bottom.  As a headline-grabbing stunt it worked well but the alleged glitch worked better still, art experts mostly in agreement that the work as “half shredded” was more valuable than had it been “wholly shredded” and certainly more than had it remained untouched in the frame.  Thus: “meaning is just another construct which emerges only through differences and deferrals”.

From a distance of sixty-odd years, in the milieu of the strands of thought which are in a sense part of a “new orthodoxy”, it can be hard to understand just what an impact Derrida and his fellow travellers (and, just as significantly, his critics) had and what an extraordinary contribution deconstruction made to the development in thought of so many fields.  Derrida in 1967 of course did not anticipate the revolutionary movement he was about to trigger, hinted at by his book starting life as a doctoral thesis entitled: De la grammatologie: Essai sur la permanence de concepts platonicien, aristotélicien et scolastique de signe écrit. (Of Grammatology: Essay on the Permanence of Platonic, Aristotelian and Scholastic Concepts of the Written Sign).  A typically indigestible title of the type beloved by academics, the clipping for wider distribution was on the same basis as Adolf Hitler’s (1889-1945; Führer (leader) and German head of government 1933-1945 & head of state 1934-1945) publisher deciding Mein Kampf (My Struggle) was snappier than Viereinhalb Jahre (des Kampfes) gegen Lüge, Dummheit und Feigheit (Four and a Half Years [of Struggle] Against Lies, Stupidity and Cowardice).  There’s a reason authors usually don’t have the final say on titles and cover art.

Derrida acknowledged linguistics in the twentieth century had become a sophisticated form of study but maintained the discipline was failing to examine its most fundamental assumptions; indeed his point was those core values couldn’t be re-evaluated because they provided the framework by which language was understood.  What Derrida identified as the superstructure which supported all was the commitment to the primacy of speech and presence and because the prevailing position in linguistics was that speech was primary, the assumption worked to shape all that followed.  It was the influence of the Swiss philosopher & semiotician Ferdinand de Saussure (1857–1913) which was profound in positioning speech as the natural, original, living form of language with writing as a secondary, derivative (and, in a sense, artificial although this was never wholly convincing) representation of speech.  What made the Saussurean position seem compelling was it sounded logical, given the consensus it was human speech which predated the development of writing, the latter thus the product of the former and so persuasive was the thesis the hierarchy came to provide the framework for other disciplines within linguistics including phonology (the study of the way sounds function in languages) and morphology (the study of the internal structure of morphemes (the smallest linguistic unit within a word able to carry a meaning)).  What this meant was syntax was also defined by speech (writing a mere convenient means of exchange) with phonetics (the study of the physical sounds of human speech) the true source of the material of language.  Thus for generations, in academic discourse, historical linguistics was documented primarily by an analysis of changes in sound, with orthography (the methods by which a language or its sounds are represented by written symbols) regarded as a mechanical by-product.

Deconstruction in fashion.  Lindsay Lohan in Theia gown, amfAR gala, New York City, February 2013 (left) and after “deconstruction by scissors” (right).

All gowns are “constructed” (some 3D printed or even “sprayed-on”) but sometimes circumstances demand they be “deconstructed”.  On the night, the shimmering nude and silver bugle-beaded fringe gown from Theia’s spring 2011 collection was much admired but there was an “unfortunate incident” (ie the fabric was torn) and, apparently using a pair of scissors, there was some ad-hoc seamstressery to transform the piece into something described as a “mullet minidress”.  That turned out to be controversial because the gown was on loan for the night but such things are just part of the cost of doing business and, with its Lohanic re-imagining, it’s now an artefact.

Derrida didn’t dispute the historic timelines; his point was that in defining linguistics based on this hierarchy, it became impossible to question the orthodoxy from within.  In a classic example of how deconstruction works, he argued the hierarchy was based not on the historical sequence of events (ie writing coming after speech) but was a culturally defined attachment to the idea of presence, voice and authentic meaning; with speech entrenched in its primacy, no discipline within linguistics was able fully to study writing because of this structural prejudice positioning writing as an auxiliary system, a mere notation of sounds encoding the pre-existing spoken language.  That didn’t mean writing couldn’t be studied (as for centuries it had been) but that it could be considered only a tool or artefact used to record speech and never a primary object of meaning.  While there were all sorts of reasons to be interested in writing, for the reductionists who needed to get to the essence of meaning, writing could only ever be thought something mechanistic and thus was philosophically uninteresting.  So, if linguistics was unable to analyse writing as (1) a structure independent of speech, (2) a fundamental element of thought processes, (3) a source of new or changed meanings or (4) a construct where cultural and philosophical assumptions are revealed, that would imply only speech could create meaning with writing a mere form of its expression.  Daringly thus, what Derrida demanded was for writing to be seen as conceptually prior to speech, even if as a physical phenomenon it came later.  In 1967, linguistics couldn’t do that while maintaining the very foundations on which it was built.

Never has there been published a "Grammatology for Dummies" but there is The Complete Idiot's Guide to Literary Theory and Criticism (2013) by Dr Steven J. Venturino.

At this point things became more technical but Derrida did provide a simplified model, explaining linguistics existed as the study of signs and not of traces, his work depending ultimately on certain distinctions: (1) Signs assume stable signifieds and (2) traces imply meaning is always deferred but never present.  For orthodox linguistics to work, the assumption had to be that signs enjoy a stability of meaning within a system; this Derrida dismissed as illusory, arguing (1) meaning is just another construct which emerges only through differences and deferrals, (2) no signified is ever (or can ever fully be) “present” and (3) speech is no closer to meaning than writing.  By its own definitions in 1967, linguistics could not accommodate that because (1) its methods depended on systematic relations sufficiently stable to permit analysis, (2) it needed constant objects (definable units such as phonemes, morphemes and syntactic structures) and (3) it relied on signs which could be described with the required consistency (ie “scientifically”).  Any approach grounded in trace and difference lay beyond the boundaries of orthodox linguistics.

So the conflict would seem irreconcilable but that’s true only if viewed through the lens of a particular method; really, linguistics was empirical and grammatology was philosophical and in that were alternative rather than competing or even parallel paths.  If linguistics was a system of codification, then grammatology was a critique of the foundations of linguistics and Derrida made clear he was not attempting to reform linguistics simply because that couldn’t be done; any attempt to interpolate his ideas into the discipline would have meant it ceased to be linguistics.  He wanted a new discipline, one which rather than empirically describing and categorising language and its elements, stood back and asked what in the first place made such systems possible.  That meant it was a transcendental rather than empirical process, one studying the conditions of representation and the metaphysics implicit in the idea of signification.  Writing thus was not merely marks on a surface but a marker of a difference in being.

The twist in the tale is that although De la grammatologie was highly influential (especially after an edition appeared in English in 1976), grammatology never became a defined, institutionalised academic field in the way Derrida envisioned, at least supplementing departments of linguistics, anthropology and philosophy.  That was due less to the well-documented phenomenon of institutional inertia than it proving impossible for any consensus to be reached about what exactly “grammatological analysis” was or what constituted “grammatological research”.  Pleasingly, it was the structuralists who could account for that by explaining grammatology was a critique of the metaphysics underlying other disciplines rather than a method for generating new empirical knowledge.  Fields, they noted, were likely organically to grow as the tools produced were picked up by others to be applied to tasks; grammatology was a toolbox for dismantling tools.

Jacques Derrida with pipe, deconstructing some tobacco.

Even if Derrida’s concepts proved sometimes too vague even for academics, the influence was profound and, whether as a reaction or something deterministic (advances in computer modelling, neurology and such), the discipline of linguistics became more rather than less scientific, the refinements in the field of generative grammar in particular seen as something of a “doubling down” of resistance to Derrida’s critique, something reflected too in anthropology which came even more to value fieldwork and political economy, philosophical critiques of writing thought less helpful.  So the specialists not only clung to their speciality but made it more specialized still.  Grammatology did however help create genuinely new movements in literary theory, the most celebrated (and subsequently derided) being deconstruction where Derrida’s ideas such as interpretation being an infinite play of differences and the meaning of texts being inherently unstable created one of the more radical schools of thought in the post-war West, introducing to study concepts such as paratext (how academics “read between and beyond the lines”), the trace (the mark of something absent, a concept that disrupts the idea of pure presence and self-contained meaning) and marginalia (used here as an abstract extension of what an author may have “written in the margins” to encompass that which may seem secondary to the main point but is actually crucial to understanding the entire structure of thought, blurring the (literal) line between what lies inside and outside a text).

Derrida for Beginners (2007) by Jim Powell (illustrated by Van Howell).  One has to start somewhere.

The movement became embedded in many English and Comparative Literature departments as well as in post-structuralism and Continental philosophy.  Modern beasts like media studies & cultural theory are (in their understood form) unthinkable without deconstruction and if grammatology didn’t become “a thing”, its core elements (difference, trace etc) for decades flourished (sometimes to the point of (published) absurdity) and although not all agree, some do argue it was Derrida’s subversion in 1967 which saw the field of semiotics emerge to “plug the gaps” left by the rigidity of traditional linguistics.  Of course, even if grammatology proved something of a cul-de-sac, Derrida’s most famous fragment: “Il n'y a pas de hors-texte” (literally “there is no outside-text”) endured to underpin deconstruction and postmodernism generally.  Intriguingly for a concept from linguistics, the phrase took on a new life in the English-speaking world where it came to be understood as “everything is text”, an interpretation which created a minor publishing industry.  In English, it’s a marvellously literalist use and while it does to an extent overlap with the author’s original intention, Derrida meant there is (1) no access to pure, unmediated presence and (2) no meaning outside interpretation and no experience outside context.  In using texte he was referring to the interplay of differences, traces, relations, and contexts that make meaning possible (ie not literally the words as they appear on a page).  What that meant was all acts were “textual” in that they must be interpreted and are intelligible only within systems of meaning; the phrase a philosophical statement about signification and mediation, not characters printed on a page.

Fiveable's diagram of what we need to know to understand literature.  Hope this helps.

However, demonstrating (in another way) the power of language, the “everything is text” movement (“cult” may once have been a better word) in English came to be understood as meaning no reality could exist beyond language; everything (literally!) is text because it is words and discourse which both construct and describe reality.  That notion might have remained in an obscure ivory tower were it not for the delicious implication that values such as right & wrong and true & false are also pieces of text with meanings able to be constructed and deconstructed.  That meant there was no stable “truth” and nothing objectively was “wrong”; everything just a construct determined by time, place and circumstances.  That Derrida never endorsed this shocking relativism was noted by some but academics and students found so intoxicating the notion of right & wrong being variables that “everything is text” took on a life of its own as a kind of selective nihilism which is, of course, quite postmodern.  Again, language was responsible because the French texte was from the Latin textus, from texō (weave) and while in French it can mean “text” (in the English sense), among philosophers it was used metaphorically to suggest “weave together”; “an interconnected structure” in the sense of the Latin textus (woven fabric); it was this meaning Derrida used.  Had the English-speaking world remained true to the original spirit of Il n'y a pas de hors-texte it would have entered the textbooks as something like “There is nothing outside the interplay of signs and contexts; there is no meaning outside systems of interpretation” and perhaps have been forgotten but “everything is text” defined and seduced a movement.  Thus, it can be argued things either were “lost in translation” or “transformed by translation” but for the neo-Derridaists there’s the satisfaction of knowing the meaning shift was an example of “grammatology in action”.

Saturday, August 23, 2025

Suffrage

Suffrage (pronounced suhf-rij)

(1) The right to vote, especially in publicly contested, democratic elections; the franchise.

(2) The exercise of such a right; casting a vote.

(3) In ecclesiastical use, a prayer, especially a short intercessory prayer (especially those offered for the faithful dead) or a short petition (such as those after the creed in matins and evensong).

(4) Aid, intercession (now rare).

(5) Testimony; attestation; witness; approval (now rare).

(6) The collective opinion of a body of persons (archaic and probably extinct).

1350–1400: From the Middle English suffrage (intercessory prayers or pleas on behalf of another), from the thirteenth century Old French sofrage (plea, intercession), via Medieval Latin, from the Latin suffragium (voting tablet, a vote cast in an assembly (for a law or candidate), an act of voting or the exercise of the right to vote, the decision reached by a vote, an expression of approval, influence or promotion on behalf of a candidate), the construct being suffrag(ari) (genitive suffrāgiī or suffrāgī) (to express public support, vote or canvass for, support) + -ium (the noun suffix).  The -ium suffix (used most often to form adjectives) was applied as (1) a nominal suffix (2) a substantivisation of its neuter forms and (3) as an adjectival suffix.  It was associated with the formation of abstract nouns, sometimes denoting offices and groups, a linguistic practice which has long fallen from fashion.  In the New Latin, as the neuter singular morphological suffix, it was the standard suffix to append when forming names for chemical elements.  The derived forms included nonsuffrage, presuffrage, prosuffrage & antisuffrage (the latter a once well-populated field).  Suffrage, suffragist, suffragette, suffragettism & suffragent are nouns and suffraged is an adjective; the noun plural is suffrages.

The sense in English of “vote” or “right to vote” was derived directly from the Classical Latin and it came by the late nineteenth century to be used with modifiers, chosen depending on the campaign being advocated (manhood suffrage, universal suffrage, women's suffrage, negro suffrage etc) and the forms were sometimes combined (universal manhood suffrage).  Because the case for women became the most prominent of the political movements, “suffrage” became the verbal shorthand (ie technically a clipping of woman suffrage).  The meaning “a vote for or against anything” was in use by the 1530s and by the turn of the century this had assumed the specific sense “a vote or voice in deciding a question or in a contest for office”.  By the 1660s, widely it was held to mean “act of voting in a representative government” and this is the origin of the modern idea of the franchise: “the political right to vote as a member of a body” codified in 1787 in the US Constitution (in reference to the states).

Exercising her suffrage: Wearing “I voted” sticker, Lindsay Lohan leaves polling station after casting her vote in the 2008 US presidential election, West Hollywood, 4 November 2008.  In California, the Democratic ticket (Barack Obama (b 1961; US president 2009-2017) & Joe Biden (b 1942; US president 2021-2025)) gained all 55 electors in the Electoral College with 8,274,473 votes (61.01%) against the 5,011,781 (36.95%) gained by the Republican ticket (John McCain (1936–2018) & Sarah Palin (b 1964)).

In zoology the suffrago (as a learned borrowing from Latin suffrāgō (the pastern, or hock)) describes the joint between the tibia and tarsus, such as the hock of a horse's hind leg or the heel of a bird.  Always rare (and now probably extinct), the companion term in clinical use was suffraginous, from the Latin suffraginosus (diseased in the hock), from suffrāgō, used in the sense of “of or relating to the hock of an animal”.  So, there’s an etymological relationship between the English noun “suffrago” (in zoology, the joint between the tibia and tarsus) and “suffrage” (an individual's right to vote) and while there are many strange linkages in the language, that one seems weirder than most.  The anatomical term describes what is essentially the hock in quadrupeds (although it was used also of birds) and that was from the Classical Latin, suffrāgō (ankle-bone, hock or the part of the leg just above the heel) and traditionally, etymologists analyzed this as related to sub- (under) + a base meaning “break, fracture” or “support” although there were scholars who connected it with frag- (to break) from frangere (to break).  The functionalists weren’t impressed by that, suggesting it was a transferred anatomical term.

The Suffragist, 7 July, 1917.

Printed originally in 1913 as a single-sheet pamphlet, in November that year The Suffragist was first issued as a weekly, eight-page tabloid newspaper, noted for its cover art which was a kind of proto-agitprop.  A classic single-issue political movement, the pamphlets had been produced by the CU (Congressional Union), an affiliate of the NAWSA (National American Woman Suffrage Association) but The Suffragist was an imprint of the CUWS (Congressional Union for Woman Suffrage), created (with a unique legal personality to avoid corporate liability) as a publicity and activist organ; in 1917 it became the NWP (National Woman's Party).  After its aims were in 1918 realised, The Suffragist ceased publication and the activists shifted their attention to the promotion of the ERA (Equal Rights Amendment) which, more than a century on, has still not been ratified and has thus never been interpolated into the constitution.

Suffrage came ultimately from suffrāgium (which had a number of senses relating to “voting”) and writers from Antiquity documented their takes on the etymology.  In De lingua latina libri XXV (On the Latin Language in 25 Books), the Roman scholar Varro (Marcus Terentius Varro, 116–27 BC) held it arose metaphorically from suffrāgō (ankle-bone), the rationale being that votes originally were cast with pebbles, sherds (now more commonly called “shards”) or other small tokens, possibly with astragali (knuckle or ankle-bones typically from sheep or goats) used like dice or counters.  Animal bones widely were used for many purposes, Pliny the Elder (24-79) in his encyclopaedic Naturalis historia (Natural History (37 thematic books in ten conceptual volumes)) noting people re-purposing astragali for tasks as diverse as teaching arithmetic, gambling, divination, or decision-making.  The Roman statesman Cicero (106-43 BC) seems not directly to have commented on the etymology; in his De Legibus (On the Laws) he used suffrāgium in the common sense of “voting” & “vote” and applied it also as a rhetorical device to suggest “support”, so while not endorsing the link with bones, neither does he contradict the popular notion that as an ankle-bone supports the human structure, votes support a candidate.

The Suffragist, 15 September, 1917.

The medieval grammarians also took an interest, Isidore of Seville (circa 560-636) covering all bases by noting (1) suffrāgium’s link with fragor (breaking) implied the idea of “breaking one’s voice” in approval (voting then often done in town squares “by the voice”) and (2) the role of the ankle-bone in supporting the body, as a vote cast supports a proposition or candidate in an election.  Because only fragments of texts from thousands of years ago remain extant, it’s impossible to be emphatic about how such things happened but the consensus among modern etymologists appears to favour the purely metaphorical “support” rather than any use of bones as electoral tokens or calculation devices.  Better documented is the migration of suffrāgium to ecclesiastical use, entering Church Latin where it was used to mean “prayers of intercession”; it was from here the English suffrage first entered the language.  As the Roman world Christianized, many words were re-purposed in a religious context and suffrāgium was picked up in the sense of “spiritual support”, manifested in prayers of intercession which originally were those offered for the “faithful dead”: in Confessiones (Confessions, 397-400), Saint Augustine of Hippo (354–430) wrote of suffragia sanctorum (the suffrages of the saints), by which he meant their intercessory prayers but, as was not uncommon, although the “masses for the dead” remained the standard, there was some theological mission creep and the prayers could assume a wider vista, extending also to the living.

Heartfelt advice in 1918 from a “suffragette wife” to young ladies contemplating marriage.

The Old French sofrage came directly from Church Latin, entering Middle English in the fourteenth century, suffrages being prayers of intercession, often described as “petitions” to God or (in the case of specific topics) to the relevant saint or saints, and “suffrage” seems to have entered the vernacular, Geoffrey Chaucer (circa 1344-1400) using the word merely as a synonym for “prayers” of whatever type.  Having thus arrived in the Church, the use was extended to the ecclesiastical structure, the first suffragan bishops appointed in the sixteenth century, their role being that of a “bishop who assists another bishop”, and the office seems to have been envisaged as something of a clerical plateau, intended as an appointment for one either “unsuitable” for an ordinary jurisdiction or with no desire to ascend the hierarchy.  The use came directly from the thirteenth century Old French suffragan, from the Medieval Latin suffraganeus (an assistant), a noun use of the adjective (assisting, supporting), from the Latin suffragium (support).  The title endures to this day although between denominations there can be variations in the role (ie job description), including some being appointed as assistants to bishops while others directly administer geographical regions within a supervising bishop’s diocese.  That means the title alone does not describe the nature of the office and although a priest may be styled diocesan bishop, titular bishop, coadjutor bishop, auxiliary bishop or suffragan bishop, not all of the same type necessarily fulfil the same duties and there may be overlap.
While engaged in wartime cryptographic work for the UK government, the troubled mathematician Dr Alan Turing (1912-1954) became well-acquainted with the organizational structure of the British Army and was struck by the similarities between that institution and the Church of England as described in Anthony Trollope’s (1815-1882) The Chronicles of Barsetshire (published in a series of six novels between 1855-1867).  Ever the mathematician, Dr Turing devised a table, having concluded a lieutenant-colonel was a dean while a major-general was a bishop.  A brigadier was a suffragan bishop, the rationale for that being they were the “cheapest kind of bishop”.

The Suffragist, 3 October, 1917.

It was during the “re-discovery” of the Classical world (ironically, often through the archives or writings of Islamic scholars) in the Renaissance and Reformation that Western scholars and translators re-visited the Latin sources, reviving the political sense of suffrāgium in English, restoring “vote” and “right to vote” alongside what had become the standard (religious) sense.  Even then, although there was in most places rarely a wide franchise, voting did happen (among a chosen few) and by the seventeenth century “suffrage” (a vote in an election) was part of common English use; in the 1700s & 1800s, as various forces began to coalesce into democratic movements, it assumed the meaning “a right to vote” which evolved gradually (via manhood suffrage, woman suffrage, negro suffrage etc) into the now familiar “universal adult suffrage”.  In English, suffrage has thus enjoyed a palimpsestic past: its ancestral roots anatomical, adapted in Antiquity for matters electoral, taken up in Christendom as a form of prayer before returning again to use in democratic politics.

The most famous derived form was of course the noun suffragette, which seems first to have appeared in print in the UK in 1906, used as a term of derision (by a man).  It was an opportunist coining which can be deconstructed as an (etymologically incorrect) feminine form of the noun suffragist (an advocate of the grant or extension of political suffrage) but it owed its existence to the women who in the UK began to take militant action.  Whereas a suffragist might have been someone (male or female) who wrote learned letters on the subject to the editor of The Times, the suffragette chained herself to the railings outside the Houses of Parliament and engaged in other forms of civil disobedience, with at least one fatality recorded.

The end of civilization as men knew it: Postcard marking the granting of voting rights to women by the colonial government in New Zealand (1893), printed & published in England by the Artists' Suffrage League, Chelsea, London.

Only four countries (New Zealand, Australia, Finland & Norway) and 11 US states extended the franchise to women prior to World War I.  France (birthplace of “Liberté, égalité, fraternité”) denied women the vote until after World War II (1939-1945), Charles de Gaulle's (1890-1970; President of France 1959-1969) provisional government in Algiers granting “full suffrage” on 21 April 1944 with the first exercise of the right in the municipal elections of 29 April, 1945.  Swiss women gained the right to vote (at the federal level) in 1971, following a national referendum in which a majority approved the idea.  At the cantonal (regional) level, some cantons had earlier granted women voting rights, Vaud the first in 1959.  The last was Appenzell Innerrhoden, which did so only in 1990, to comply with a ruling by the Swiss Federal Supreme Court.

As the campaign stepped up, techniques were borrowed from anarchists and revolutionaries, including fire-bombings of institutions of “the establishment”; if imprisoned, the suffragettes would stage hunger strikes, compelling the home secretary to order either their release or force-feeding (a practice previously most associated with lunatic asylums).  Although the suffragettes generated international publicity and encouraged similar movements in other places, and despite New Zealand in 1893 having granted the vote to women on the same basis as men without the country descending into some kind of feminized Hell, little progress was made and it was only the social and economic disruptions brought about by World War I which induced change, women over 30 becoming able to vote in elections and be elected to parliament in 1918.  In 1928, this was extended to all women over 21, thus aligning their franchise with that which men had enjoyed since 1918.  The 1928 settlement remains the classic definition of “universal suffrage” in the sense of “all adults” and all that has changed is the threshold age has been lowered to 18, although the UK government has suggested it will seek further to lower this to 16.  If that’s enacted, it’ll still be less permissive than what the ayatollahs (not usually thought paragons of liberalism) in Iran permitted during the 1980s, when 15 year olds got the vote.

"Love, honor and obey" was a bride's traditional wedding vow but in the nuclear weapons treaty business between the US & USSR the principle was: "trust but verify".  

As the meme-makers knew, even after women voting became a thing, some husbands knew they still had to check to make sure their wives got it right:  Donald Trump (b 1946; US president 2017-2021 and since 2025) verifying the vote of Melania Trump (b 1970, US First Lady 2017-2021 and since 2025) while exercising her “secret ballot” in the 2016 US presidential election, Polling Station 59 (a school), Manhattan, New York, 8 November 2016.

The -ette suffix was from the Middle English -ette, a borrowing from the Old French -ette, from the Latin -itta, the feminine form of -ittus.  It was used to form nouns meaning a smaller form of something and the use in English to create informal feminine forms has long upset some, including Henry Fowler (1858–1933) who in his A Dictionary of Modern English Usage (1926) condemned the formation of “suffragette”: “A more regrettable formation than others such as leaderette & flannelette, in that it does not even mean a sort of suffrage as they mean a sort of leader & of flannel, & therefore tends to vitiate the popular conception of the termination's meaning. The word itself may now be expected to die, having lost its importance; may its influence on word-making die with it!”  Whether one might read into that damnation the suggestion Henry Fowler regretted women getting the vote can be pondered but, to be fair, the old linguistic curmudgeon may have been a proto-feminist who approved.  There were anyway some reactionaries who became converted to the cause.  After a satisfactory election result, Winston Churchill (1875-1965; UK prime-minister 1940-1945 & 1951-1955) was reminded by his wife Clementine Churchill (1885–1977) that he’d received more votes from women than from men, having apparently been forgiven for having once been in the vanguard of the opposition to woman suffrage.  “Quite right”, cheerfully he agreed; a practical democrat, he by then welcomed votes regardless of their origin.

Woman Suffrage Headquarters, Euclid Avenue, Cleveland Ohio, 1912.

The word “suffrage” came by the late 1860s to be attached to activists advocating extending the franchise to women, “woman suffragist” & “female suffragist” both used in US publications, and the divergence in the movement was reflected in the UK by the adoption of the terms “manhood suffragist” (by at least 1866) and “woman suffragist” (by 1871), although the first reference of the latter was to actions in the US, the existence of the breed in England not acknowledged for a further three years.  Historically, both “woman suffrage” & “women's suffrage” were used but the former overwhelmingly was the standard phrasing late in the 1800s and into the next century when the matter became a great political issue.  To modern eyes “woman suffrage” looks awkwardly wrong but is grammatically correct, “woman” used as a noun adjunct (ie a noun modifying a following noun).  Singular noun adjuncts are common, such as “student union” even though the institution has a membership of many students.  In English, a singular noun can function attributively (like an adjective) to describe a category or class (manpower, horse racing etc).  The possessive (women’s suffrage) emphasizes ownership: the notion of suffrage (in the linguistic sense) “belonging” to women, and in modern use that appears to be the common form; “woman suffrage” was a formal, abstract construction from more exacting times, reflected in uses like “manhood suffrage”, “child labor”, “slave trade” etc.  In structural linguistics, the shift to a preference for possessive forms (workers’ unions, children’s rights, women’s movement etc) is thought a marker of the increasingly fashionable concepts of agency and belonging.

“Kaiser Wilson” protest sign criticizing Woodrow Wilson (1856–1924; US president 1913-1921) for not keeping his 1916 election “promise” to fight for woman suffrage: “Have you forgotten your sympathy with the poor Germans because they were not self-governed?  20,000,000 American women are not self-governed.  Take the beam out of your own eye.”  The quote “Take the beam out of your own eye” comes from Biblical scripture:

Matthew 7:3-5 (King James Version, (KJV, 1611))

3 And why beholdest thou the mote that is in thy brother's eye, but considerest not the beam that is in thine own eye?

4 Or how wilt thou say to thy brother, Let me pull out the mote out of thine eye; and, behold, a beam is in thine own eye?

5 Thou hypocrite, first cast out the beam out of thine own eye; and then shalt thou see clearly to cast out the mote out of thy brother's eye.

What’s discussed in Matthew 7:3-5 is hypocrisy, the metaphor being a speck of dust in one’s brother's eye and a plank in one's own, and the teaching is that one should first rectify one's own significant flaws (the “plank”) before criticizing the minor flaws of others (the “speck”).  What reading the passage should do is encourage humility and self-reflection, persuading individuals to acknowledge their own shortcomings before judging others.  The passage was part of the Sermon on the Mount, regarded by Christians as a central element in Christ’s moral teachings, and Woodrow Wilson, the son of a preacher and himself a noted (if selective) moralist, would have been well acquainted with the text.

Watched by an approving comrade Vyacheslav Molotov (1890–1986; Soviet foreign minister 1939-1949 & 1953-1956), comrade Stalin (1878-1953; Soviet leader 1924-1953) casts his vote in the 1937 election for the Supreme Soviet.  To the left, Comrade Marshal Kliment Voroshilov (1881–1969) watches Comrade Nikolai Yezhov (1895–1940, head of the NKVD 1936-1938).

Those voting in 1937 may have had high hopes for the future because, read literally, the 1936 Constitution of the Soviet Union (adopted 5 December 1936) described a democratic utopia.  Unfortunately, within months, comrade Stalin embarked on his Great Purge and turned his country into a kind of combination of prison camp and abattoir, many of those involved in drafting the constitution either sent to the Gulag or shot.  In 1937 the CPSU (Communist Party of the Soviet Union) was declared to have won 99% of the vote so it was not an exceptional result but the photograph is unusual in that it’s one of the few in which the usually dour comrade Molotov is smiling.  It was comrade Vladimir Lenin (1870–1924; head of government of Russia or the Soviet Union 1917-1924) who dubbed Molotov “stone ass” because of his famous capacity (rare among the Bolsheviks) to sit for hours at his desk and process the flow of paperwork the CPSU’s bureaucracy generated.  Precise in every way, Molotov would correct those who suggested Lenin’s moniker had been “iron ass” but, disapproving of “shameful bureaucratism”, Lenin may have used several variants in the same vein and, in another nod to Molotov’s centrality in the administrative machinery of government, he was known also as “comrade paper-clip”.

On paper, between 1936-1991, the Supreme Soviet was the highest institution of state authority in the Soviet Union (1922-1991) but was in reality a “rubber stamp parliament” which existed only to ratify, adding a veneer of legality to laws sent down by the executive, controlled exclusively by the CPSU, although it was valued for photo-opportunities, enthralled delegates always seen attentively listening to comrade Stalin’s speeches.  On election night comrade Stalin was quoted in the Soviet press as saying: “Never in the history of the world have there been such really free and really democratic elections -- never!  History knows no other example like it...our universal elections will be carried out as the freest elections and the most democratic compared with elections in any other country in the world.  Universal elections exist and are also held in some capitalist countries, so-called democratic countries.  But in what atmosphere are elections held there?… In an atmosphere of class conflicts, in an atmosphere of class enmity.”  The statement often attributed to comrade Stalin, “It's not who votes that counts, it's who counts the votes”, probably was apocryphal but is indicative of how he did things, and his psephological model has been an inspiration to figures such as Saddam Hussein (1937–2006; president of Iraq 1979-2003) and Kim Jong-Un (Kim III, b 1982; Supreme Leader of DPRK (North Korea) since 2011).

Thursday, July 3, 2025

Zugzwang

Zugzwang (pronounced tsook-tsvahng)

(1) In chess, a situation in which a player is limited to moves that cost pieces or have a damaging positional effect.

(2) A situation in which whatever is done makes things worse (applied variously to sport, politics, battlefield engagements etc).

(3) A situation in which one is forced to act when one would prefer to remain passive and thus a synonym of the German compound noun Zugpflicht (the rule that a player cannot forgo a move).

(4) In game theory, a move which changes the outcome from win to loss.

Circa 1858 (1905 in English): A modern German compound, the construct being zug + zwang.  Zug (move) was from the Middle High German zuc & zug, from the Old High German zug, from the Proto-Germanic tugiz, an abstract noun belonging to the Proto-Germanic teuhaną, from the primitive Indo-European dewk (to pull, lead); it was cognate with the Dutch teug and the Old English tyge.  Zwang (compulsion; force; constraint; obligation) was from the Middle High German twanc, from the Old High German geduang.  It belongs to the verb zwingen and cognates include the Dutch dwang and the Swedish tvång.  The word is best understood as "compulsion to move" or, in the jargon of chess players: "Your turn to move and whatever you do it'll make things worse for you", thus the application to game theory, military strategy and politics where there's often a need to determine the "least worst option".  Zugzwang is a noun; the noun plural is Zugzwänge.  In English, derived forms such as zugzwanged, zugzwanging, zugzwangish, zugzwanger, zugzwangesque and zugzwangee are non-standard and used usually for humorous effect.

Chess and Game Theory

Endgame: Black's turn and Zugzwang! Daily Chess Musings depiction of the elegance of zugzwang.

The first known use of Zugzwang in the German chess literature appears in 1858; the first appearance in English in 1905.  However, the concept of Zugzwang had been known and written about for centuries, the classic work being Italian chess player Alessandro Salvio's (circa 1575–circa 1640) study of endgames published in 1604 and he referenced Shatranj writings from the early ninth century, some thousand years before the first known use of the term.  Positions with Zugzwang are not rare in chess endgames, best known in the king-rook & king-pawn conjunctions.  Positions of reciprocal Zugzwang are important in the analysis of endgames but although the concept is easily demonstrated and understood, that's true only of the "simple Zugzwang" and the so-called "sequential Zugzwang" will typically be a multi-move thing which demands an understanding of even dozens of permutations of possibilities.

Rendered by Vovsoft as cartoon character: a brunette Lindsay Lohan at the chessboard.  In her youth, she was a bit of a zugzwanger.

Zugzwang describes a situation where one player is put at a disadvantage because they have to make a move although the player would prefer to pass and make no move.  The fact the player must make a move means their position will be significantly weaker than the hypothetical one in which it is the opponent's turn to move.  In game theory, it specifically means a move which directly changes the outcome of the game from a win to a loss.  Chess textbooks often cite as the classic Zugzwang a match in Copenhagen in 1923; on that day the German Grandmaster (the title inaugurated in 1950) Friedrich Sämisch (1896–1975) played White against the Latvian-born Dane Aron Nimzowitsch (1886-1935).  Playing Black, Nimzowitsch didn’t play a tactical match in the conventional sense but instead applied positional advantage, gradually limiting his opponent’s options until, as the endgame was reached, White was left with no move which didn’t worsen his position; whatever he chose would lead either to material loss or strategic collapse and it’s said in his notebook Nimzowitsch concluded his entry on the match with “Zugzwang!”  A noted eccentric in a discipline where idiosyncratic behaviour is not unknown, the Polish Grandmaster Savielly Tartakower (1887-1956) observed of Nimzowitsch: “He pretends to be crazy in order to drive us all crazy.”
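The game-theoretic point (that the compulsion to move, rather than the position itself, is what loses) can be made concrete by comparing a position's value with and without a hypothetical "pass" move.  The sketch below is illustrative only: the tiny game tree and its node names are invented for the purpose, not drawn from any real game, and the scores are negamax values for the side to move (+1 win, -1 loss).

```python
# Minimal negamax sketch of zugzwang: position "P" is lost if the player
# MUST move, but won if a (hypothetical, normally illegal) pass is allowed.
# The tree is invented for illustration; terminal values are scores for
# the side to move at that node.

TREE = {
    "P": {"moves": ["x"], "pass": "Q"},  # P: to move; only real move goes to x
    "Q": {"moves": ["y"]},               # Q: same board, opponent to move
    "x": {"value": +1},  # whoever must move at x wins, so moving INTO x loses
    "y": {"value": +1},  # likewise: the opponent moving into y loses
}

def negamax(node, allow_pass=False):
    """Value for the side to move; optionally permit a 'pass' edge."""
    entry = TREE[node]
    if "value" in entry:                 # terminal position
        return entry["value"]
    scores = [-negamax(child, allow_pass) for child in entry["moves"]]
    if allow_pass and "pass" in entry:   # passing hands the turn over
        scores.append(-negamax(entry["pass"], allow_pass))
    return max(scores)

print(negamax("P"))                   # -1: compelled to move, P is lost
print(negamax("P", allow_pass=True))  # +1: allowed to pass, P is won
```

The flip from -1 to +1 is exactly the textbook definition above: the move obligation alone changes the outcome from win to loss.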

French sculptor Auguste Rodin's (1840-1917) The Thinker (1904), Musée Rodin, Paris (left) and Boris Johnson (b 1964; UK prime-minister 2019-2022) thinking about which would be his least worst option (right).

In its classic form chess is a game between two players, played with fixed rules on a board with a known number of pieces (32) and squares (64).  Although a count of the possible permutations in a match would yield a very big number, in chess the concept of Zugzwang is simple and understood the same way by those playing black and white; information for both sides is complete and while the concept can find an expression in both combinatorial game theory (CGT) and classical game theory (GT), the paths can be different.  CGT and GT (the latter historically a tool of economic modelers and strategists in many fields) are both mathematical studies of behaviour which can be imagined as “game-like” but they differ in focus, assumptions and applications.  In CGT the basic model (as in chess) is of a two-player deterministic game in which the moves alternate and luck or chance is not an element.  This compares with GT, in which there may be any number of players, moves may be simultaneous, the option exists not to move, information known to players may be incomplete (or asymmetric) and luck & chance exist among many variables (which can include all of Donald Rumsfeld’s (1932–2021: US defense secretary 1975-1977 & 2001-2006) helpful categories: known knowns, known unknowns, unknown unknowns & (most intriguingly) unknown knowns).  So, while CGT is a good device for deconstructing chess and the like because such games are of finite duration and players focus exclusively on “winning” (and if need be switching to “avoiding defeat”), GT is a tool which can be applied to maximize advantage or utility in situations where a win/defeat dichotomy is either not sought or becomes impossible.  The difference then is that CGT envisages two players seeking to solve a deterministic puzzle on a win/lose basis while GT is there to describe & analyse strategic interactions between & among rational actors, some or all of whom may be operating with some degree of uncertainty.

Serial zugzwanger Barnaby Joyce (b 1967; thrice (between local difficulties) deputy prime minister of Australia 2016-2022), Parliament House, Canberra.  More than many, Mr Joyce has had to sit and ponder what might at that moment be his “least worst” option.  He has made choices good and bad.

In politics and military conflicts (a spectrum condition according to the Prussian general and military theorist Carl von Clausewitz (1780–1831)), a zugzwang often is seen as parties are compelled to take their “least worst” option, even when circumstances dictate it would be better to “do nothing”.  However, the zugzwang can lie in the eye of the beholder and that is why the unexpected Ardennes Offensive (Wacht am Rhein (Watch on the Rhine) the German code-name, though popularly known in the West as the Battle of the Bulge; December 1944-January 1945) was ordered by Adolf Hitler (1889-1945; Führer (leader) and German head of government 1933-1945 & head of state 1934-1945).  It was the last major German strategic offensive of World War II (1939-1945) and among all but the most sycophantic of Hitler’s military advisors it was thought not “least worst” but rather “worse than the sensible” option (although not all the generals at the time concurred on what constituted “sensible”).  Under the Nazi state’s Führerprinzip (leader principle) the concept was that in any institutional structure authority was vested in the designated leader and that meant ultimately Hitler’s rule was a personal dictatorship (although the extent of the fragmentation wasn’t understood until after the war), so while the generals could warn, counsel & advise, ultimately decisions were based on the Führer’s will, thus the Ardennes Offensive.

While the operation made no strategic sense to the conventionally-schooled generals, to Hitler it was compelling because the tide of the war had forced him to pursue the only strategy left: delay what appeared an inevitable defeat in the hope the (real but still suppressed) political tensions between his opponents would sunder their alliance, allowing him to direct his resources against one front rather than three (four if the battle in the skies is considered a distinct theatre, as many historians argue).  Like Charles Dickens’ (1812–1870) Mr Micawber in David Copperfield (1849-1850), Hitler was hoping “something would turn up”.  Because of the disparity in military and economic strength between the German and Allied forces, in retrospect the Ardennes Offensive appears nonsensical but, at the time, it was a rational tactic even if the strategy of “delay” was flawed.  Confronted as he was by attacks from the west, east and south, continuing to fight a defensive war would lead only to an inevitable defeat; an offensive in the east was impossible because of the strength of the Red Army and even a major battlefield victory in the south would have had no strategic significance, so it was only in the west a glimmer of success seemed to beckon.

The bulge.

In the last great example of the professionalism and tactical improvisation which was a hallmark of their operations during the war, secretly the Wehrmacht (the German military) assembled a large armored force (essentially under the eyes of the Allies) and staged a surprise attack through the Ardennes, aided immeasurably by the cover of heavy, low clouds which precluded both Allied reconnaissance and deployment of their overwhelming strength in air-power.  Initially successful, the advance punched several holes in the line, the shape of which, when marked on a map, lent the campaign the name “Battle of the Bulge” but within days the weather cleared, allowing the Allies to unleash almost unopposed their overwhelming superiority in air power.  This, combined with their vast military and logistical resources, doomed the Ardennes Offensive, inflicting losses from which the Wehrmacht never recovered: from mid-January on, German forces never regained the initiative, retreating on all fronts until the inevitable defeat in May.  A last throw of the dice, the offensive both failed and squandered precious (and often irreplaceable) resources badly needed elsewhere.  By December 1944, Hitler had been confronted with a zugzwang (of his own making) and while whatever he did would have made Germany’s position worse, at least arguably, the Ardennes Offensive was not even his “least worst” option.