
Saturday, November 29, 2025

Grammatology

Grammatology (pronounced gram-uh-tol-uh-jee)

(1) Historically, the scientific study of systems of writing.

(2) In latter-day use, a critique of orthodox linguistics.

Early 1800s (in its original sense): The construct was gramma(r) + -t- + -ology; the modern (some would say post-modern) re-purposing was first used in 1967.  Dating from the mid fourteenth century, grammar was from the Middle English gramery & gramere, from the Old French gramaire (classical learning), from the unattested Vulgar Latin grammāria, an alteration of the Classical Latin grammatica, from the Ancient Greek γραμματική (grammatikḗ) (skilled in writing), from γράμμα (gramma) (line of writing), from γράφω (gráphō) (write), from the primitive Indo-European gerbh- (to carve, to scratch).  It displaced the native Old English stæfcræft; a doublet of glamour, glamoury, gramarye & grimoire.  In English, grammar is used to describe the system of rules and principles for the structure of a language (or of languages in general) but in colloquial use it’s applied also to morphology (the internal structure of words) and syntax (the structure of phrases and sentences of a language).  In English, generative grammar (the body of rules producing all the sentences permissible in a given language, while excluding all those not permissible) has for centuries been shifting and it’s now something policed by the so-called “grammar Nazis”, some of whom insist on enforcing “rules” regarded by most as defunct as early as the nineteenth century.

The suffix -ology was formed from -o- (as an interconsonantal vowel) + -logy.  The origin in English of the -logy suffix lies with loanwords from the Ancient Greek, usually via Latin and French, where the suffix (-λογία) is an integral part of the word loaned (eg astrology from astrologia) since the sixteenth century.  French picked up -logie from the Latin -logia, from the Ancient Greek -λογία (-logía).  Within Greek, the suffix is an -ία (-ía) abstract from λόγος (lógos) (account, explanation, narrative), and that a verbal noun from λέγω (légō) (I say, speak, converse, tell a story).  In English the suffix became extraordinarily productive, used notably to form names of sciences or disciplines of study, analogous to the names traditionally borrowed from the Latin (eg astrology from astrologia; geology from geologia) and by the late eighteenth century, the practice (despite the disapproval of the pedants) extended to terms with no connection to Greek or Latin such as those building on French or German bases (eg insectology (1766) after the French insectologie; terminology (1801) after the German Terminologie).  Within a few decades of the intrusion of modern languages, combinations emerged using English terms (eg undergroundology (1820); hatology (1837)).  In this evolution, the development may be thought similar to the latter-day proliferation of “-isms” (fascism; feminism et al).  Grammatology & grammatologist are nouns, grammatological is an adjective and grammatologically is an adverb; the noun plural is grammatologies.

Google ngram (a quantitative and not qualitative measure): Because of the way Google harvests data for their ngrams, they’re not literally a tracking of the use of a word in society but can be usefully indicative of certain trends (although one is never quite sure which trend(s)), especially over decades.  As a record of actual aggregate use, ngrams are not wholly reliable because: (1) the sub-set of texts Google uses is slanted towards the scientific & academic and (2) the technical limitations imposed by the use of OCR (optical character recognition) when handling older texts of sometimes dubious legibility (a process AI should improve).  Where numbers bounce around, this may reflect either: (1) peaks and troughs in use for some reason or (2) some quirk in the data harvested.
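To make the “quantitative, not qualitative” point concrete, below is a minimal Python sketch (not anything Google supplies) of how a raw count might be pulled from one of the downloadable Google Books Ngram export files, which are published as tab-separated lines of the form ngram, year, match_count, volume_count; the file name used here is hypothetical and the raw counts would still need to be divided by the per-year totals Google publishes separately before they could be read as a true relative frequency.

import csv
from collections import defaultdict

def yearly_counts(path, word):
    # Sum the raw match_count per year for one word from a (hypothetical) local
    # extract of a Google Books Ngram export file: tab-separated lines of
    # ngram, year, match_count, volume_count.  Raw counts only; a true frequency
    # would divide by the per-year totals published separately.
    counts = defaultdict(int)
    with open(path, encoding="utf-8") as fh:
        for row in csv.reader(fh, delimiter="\t"):
            if len(row) == 4 and row[0].lower() == word.lower():
                counts[int(row[1])] += int(row[2])
    return dict(sorted(counts.items()))

# "grammatology_1grams.tsv" is a hypothetical local file name, used only for illustration.
print(yearly_counts("grammatology_1grams.tsv", "grammatology"))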

Grammatology in its re-purposed sense was from the French grammatologie, introduced to the world by French philosopher Jacques Derrida (1930-2004) in his book De la grammatologie (Of Grammatology (1967)).  It may be unfair to treat Derrida’s use as a “re-purposing” because although the word grammatology (literally “the study of writing”) had existed since the early nineteenth century, it was a neologism, one of an expanding class of “-ology” words (some of them coined merely for ironic or humorous effect) and there was prior to 1967 scant evidence of use, those studying languages, literature or linguistics able satisfactorily to undertake their work without much needing “grammatology”.  On the basis of the documents thus far digitized, “grammatology” was never an accepted or even commonly used term in academia and although it seems occasionally to have been used variously in fields related to “the study of writing systems” (apparently as a synonym for paleography, epigraphy, writing-system classification or orthographic description) it was only in passing.  Until the modern era, words “going viral” happened relatively infrequently and certainly slowly and, as used prior to 1967, “grammatology” was attached to no theoretical construct or school of thought and described no defined discipline, the word indicative, empirical and neutral.  If “pre-modern” grammatology could be summed up (a probably dubious exercise), it would be thought a technical term for those concerned with scripts, alphabets, symbols and the historical development of writing systems.  Tempting though it may seem, it cannot be thought of as proto-structuralism.

The novelty Derrida introduced was to argue the need for a discipline examining the history, structure and philosophical implications of writing, his particular contention that writing is not secondary to speech, a notion at odds with centuries of Western metaphysics.  At the time, it was seen as a radical departure from orthodoxy, Derrida exploring (in the broadest imaginable way) the possibilities of writing, not simply the familiar physical inscriptions, but anything that functions as “trace,” “différance,” or symbolic marking, the core argument being writing is not secondary to speech (although in the narrow technical sense it may be consequent); rather, it reveals the instability and “constructedness” of language and thereby meaning.

De la grammatologie (First edition, 1967) by Jacques Derrida.

Ambitiously, what Derrida embarked upon was to do to the study of language something like what Karl Marx (1818-1883) claimed to have done to the theories of Hegel (Georg Wilhelm Friedrich Hegel (1770-1831)): “turn things on their head”, a process that can be classified under four themes: (1) Writing as prior to speech (as opposed to the earlier “Writing is derivative of speech”).  What this meant was writing had to be considered as “originary”, implying structures of difference could precede both writing and speech. (2) Writing (the act as opposed to the content) as a philosophical concept rather than a finite collection of technical objects to be interpreted or catalogued on the basis of their form of assembly.  (3) Grammatology becomes a critique (as opposed to the earlier descriptive tool) of science, reimagining it as a critical discipline exposing the logocentrism of Western thought.  Logocentrism describes the tendency to prioritize “logos” (in academic use a word encompassing words, speech or reason) as the ultimate foundation for truth and meaning (with speech often privileged over writing).  Logocentrism was at the core of the Western philosophical tradition that assumed language can accurately and directly express an external reality, the companion notion being rational thought represents the highest form of knowledge.  Derrida labelled this a false hierarchy that devalued writing and other non-verbal forms of communication and feeling. (4) Writing is expanded beyond literal inscriptions.  Whereas the traditional Western view had been that writing was simply the use of an alphabet, cuneiform, hieroglyphs and such, what Derrida suggested was the concept of writing should be extended to any system of differences, traces, or marks; the condition for meaning itself.

So Derrida took grammatology from a dusty corner of the academy where it meant (for the small number of souls involved) something like “a hypothetical technical study of writing systems” and re-invented it as a philosophical discipline analysing the deeper structures that make any representation or meaning possible.  The notion of it as a tool of analysis is important because deconstruction, the word Derrida and other “celebrity philosophers” made famous (or infamous depending on one’s stance on things postmodern) is often misunderstood as something like “destruction” when really it is a form of analysis.  Had Derrida’s subversive idea been presented thirty years earlier (had the author been able to find a publisher), it’s possible it would have been ignored or dismissed by the relatively few who then read such material.  However, in the post-war years there was an enormous expansion in both the number of universities and the cohorts of academics and students studying in fields which would come to be called “critical theory” so there was a receptive base for ideas overturning orthodoxy, thus the remarkable path deconstruction and postmodernism for decades tracked.

Deconstruction in art, Girl With Balloon by street artist Banksy, before, during & after a (successful) test deconstruction (left) and in its final form (right), London, October 2018.

There is an ephemeral art movement but usually it involves works which wholly are destroyed or entirely disappear.  Banksy’s Girl With Balloon belonged to a sub-category where (1) the deconstruction process was part of the art and (2) the residual elements were “the artwork”.  Banksy’s trick with this one was that as the auctioneer’s hammer fell (at Stg£1m), an electric shredder concealed at the base of the frame was activated, the plan being to reduce the work “to shreds” in a pile below.  However, it’s claimed there was a technical glitch and the shredder stopped mid-shred, meaning half remained untouched and half, neatly sliced, hung from the bottom.  As a headline-grabbing stunt it worked well but the alleged glitch worked better still, art experts mostly in agreement the work as “half shredded” was more valuable than had it been “wholly shredded” and certainly more than had it remained untouched in the frame.  Thus: “meaning is just another construct which emerges only through differences and deferrals”.

From a distance of sixty-odd years, in the milieu of the strands of thought which are in a sense part of a “new orthodoxy”, it can be hard to understand just what an impact Derrida and his fellow travellers (and, just as significantly, his critics) had and what an extraordinary contribution deconstruction made to the development in thought of so many fields.  Derrida in 1967 of course did not anticipate the revolutionary movement he was about to trigger, hinted at by his book starting life as a doctoral thesis entitled: De la grammatologie: Essai sur la permanence de concepts platonicien, aristotélicien et scolastique de signe écrit. (Of Grammatology: Essay on the Permanence of Platonic, Aristotelian and Scholastic Concepts of the Written Sign).  A typically indigestible title of the type beloved by academics, the clipping for wider distribution was on the same basis as Adolf Hitler’s (1889-1945; Führer (leader) and German head of government 1933-1945 & head of state 1934-1945) publisher deciding Mein Kampf (My Struggle) was snappier than Viereinhalb Jahre (des Kampfes) gegen Lüge, Dummheit und Feigheit (Four and a Half Years [of Struggle] Against Lies, Stupidity and Cowardice).  There’s a reason authors usually don’t have the final say on titles and cover art.

Derrida acknowledged linguistics in the twentieth century had become a sophisticated form of study but maintained the discipline was failing to examine its most fundamental assumptions; indeed his point was those core values couldn’t be re-evaluated because they provided the framework by which language was understood.  What Derrida identified as the superstructure which supported all was the commitment to the primacy of speech and presence and because the prevailing position in linguistics was that speech was primary, the assumption worked to shape all that followed.  It was the influence of the Swiss linguist & semiotician Ferdinand de Saussure (1857–1913) which was profound in positioning speech as the natural, original, living form of language with writing as a secondary, derivative (and, in a sense, artificial although this was never wholly convincing) representation of speech.  What made the Saussurean position seem compelling was it sounded logical, given the consensus it was human speech which predated the development of writing, the latter thus the product of the former and so persuasive was the thesis the hierarchy came to provide the framework for other disciplines within linguistics including phonology (the study of the way sounds function in languages) and morphology (the study of the internal structure of words and of morphemes, the smallest linguistic units able to carry a meaning).  What this meant was syntax was also defined by speech (writing a mere convenient means of exchange) with phonetics (the study of the physical sounds of human speech) the true source of the material of language.  Thus for generations, in academic discourse, historical linguistics was documented primarily by an analysis of changes in sound, with orthography (the methods by which a language or its sounds are represented by written symbols) treated as a mechanical by-product.

Deconstruction in fashion.  Lindsay Lohan in Theia gown, amfAR gala, New York City, February 2013 (left) and after “deconstruction by scissors” (right).

All gowns are “constructed” (some 3D printed or even “sprayed-on”) but sometimes circumstances demand they be “deconstructed”.  On the night, the shimmering nude and silver bugle-beaded fringe gown from Theia’s spring 2011 collection was much admired but there was an “unfortunate incident” (ie the fabric was torn) and, apparently using a pair of scissors, there was some ad-hoc seamstressery to transform the piece into something described as a “mullet minidress”.  That turned out to be controversial because the gown was on loan for the night but such things are just part of the cost of doing business and, with its Lohanic re-imagining, it’s now an artefact.

Derrida didn’t dispute the historic timelines; his point was that in defining linguistics based on this hierarchy, it became impossible to question the orthodoxy from within.  In a classic example of how deconstruction works, he argued the hierarchy was based not on the historical sequence of events (ie writing coming after speech) but on a culturally defined attachment to the idea of presence, voice and authentic meaning; with speech entrenched in its primacy, no discipline within linguistics was able fully to study writing because of this structural prejudice positioning writing as an auxiliary system, a mere notation of sounds encoding the pre-existing spoken language.  That didn’t mean writing couldn’t be studied (as for centuries it had been) but that it could be considered only a tool or artefact used to record speech and never a primary object of meaning.  While there were all sorts of reasons to be interested in writing, for the reductionists who needed to get to the essence of meaning, writing could only ever be thought something mechanistic and thus was philosophically uninteresting.  So, if linguistics was unable to analyse writing as (1) a structure independent of speech, (2) a fundamental element of thought processes, (3) a source of new or changed meanings or (4) a construct where cultural and philosophical assumptions are revealed, that would imply only speech could create meaning with writing a mere form of its expression.  Daringly thus, what Derrida demanded was for writing to be seen as conceptually prior to speech, even if as a physical phenomenon it came later.  In 1967, linguistics couldn’t do that while maintaining the very foundations on which it was built.

Never has there been published a "Grammatology for Dummies" but there is The Complete Idiot's Guide to Literary Theory and Criticism (2013) by Dr Steven J. Venturino.

At this point things became more technical but Derrida did provide a simplified model, explaining linguistics existed as the study of signs and not of traces, his work depending ultimately on certain distinctions: (1) Signs assume stable signifieds and (2) traces imply meaning is always deferred but never present.  For orthodox linguistics to work, the assumption had to be that signs enjoy a stability of meaning within a system; this Derrida dismissed as illusory, arguing (1) meaning is just another construct which emerges only through differences and deferrals, (2) no signified is ever (or can ever fully be) “present” and (3) speech is no closer to meaning than writing.  By its own definitions in 1967, linguistics could not accommodate that because (1) its methods depended on systematic relations sufficiently stable to permit analysis, (2) it needed constant objects (definable units such as phonemes, morphemes and syntactic structures) and (3) it relied on signs which could be described with the required consistency (ie “scientifically”).  Any approach grounded in trace and différance lay beyond the boundaries of orthodox linguistics.

So the conflict would seem irreconcilable but that’s true only if viewed through the lens of a particular method; really, linguistics was empirical and grammatology was philosophical and in that sense they were alternative rather than competing or even parallel paths.  If linguistics was a system of codification, then grammatology was a critique of the foundations of linguistics and Derrida made clear he was not attempting to reform linguistics simply because that couldn’t be done; any attempt to interpolate his ideas into the discipline would have meant it ceased to be linguistics.  He wanted a new discipline, one which rather than empirically describing and categorising language and its elements, stood back and asked what in the first place made such systems possible.  That meant it was a transcendental rather than empirical process, one studying the conditions of representation and the metaphysics implicit in the idea of signification.  Writing thus was not merely marks on a surface but a marker of a difference in being.

The twist in the tale is that although De la grammatologie was highly influential (especially after an edition appeared in English in 1976), grammatology never became a defined, institutionalised academic field in the way Derrida envisioned, at least as a supplement to departments of linguistics, anthropology and philosophy.  That was due less to the well-documented phenomenon of institutional inertia than to it proving impossible for any consensus to be reached about what exactly “grammatological analysis” was or what constituted “grammatological research”.  Pleasingly, it was the structuralists who could account for that by explaining grammatology was a critique of the metaphysics underlying other disciplines rather than a method for generating new empirical knowledge.  Fields, they noted, were likely organically to grow as the tools produced were picked up by others to be applied to tasks; grammatology was a toolbox for dismantling tools.

Jacques Derrida with pipe, deconstructing some tobacco.

Even if Derrida’s concepts proved sometimes too vague even for academics, the influence was profound and, whether as a reaction or something deterministic (advances in computer modelling, neurology and such), the discipline of linguistics became more rather than less scientific, the refinements in the field of generative grammar in particular seen as something of a “doubling down” of resistance to Derrida’s critique, something reflected too in anthropology which came even more to value fieldwork and political economy, philosophical critiques of writing thought less helpful.  So the specialists not only clung to their speciality but made it more specialized still.  Grammatology did however help create genuinely new movements in literary theory, the most celebrated (and subsequently derided) being deconstruction where Derrida’s ideas such as interpretation being an infinite play of differences and the meaning of texts being inherently unstable created one of the more radical schools of thought in the post-war West, introducing to literary study concepts such as paratext (how academics “read between and beyond the lines”), the trace (the mark of something absent, a concept that disrupts the idea of pure presence and self-contained meaning) and marginalia (used here as an abstract extension of what an author may have “written in the margins” to encompass that which may seem secondary to the main point but is actually crucial to understanding the entire structure of thought, blurring the (literal) line between what lies inside and outside a text).

Derrida for Beginners (2007) by Jim Powell (illustrated by Van Howell).  One has to start somewhere.

The movement became embedded in many English and Comparative Literature departments as well as in post-structuralism and Continental philosophy.  Modern beasts like media studies & cultural theory are (in their understood form) unthinkable without deconstruction and if grammatology didn’t become “a thing”, its core elements (difference, trace etc) for decades flourished (sometimes to the point of (published) absurdity) and although not all agree, some do argue it was Derrida’s subversion in 1967 which saw the field of semiotics emerge to “plug the gaps” left by the rigidity of traditional linguistics.  Of course, even if grammatology proved something of a cul-de-sac, Derrida’s most famous fragment: “Il n'y a pas de hors-texte” (literally “there is no outside-text”) endured to underpin deconstruction and postmodernism generally.  Intriguingly for a concept from linguistics, the phrase took on a new life in the English-speaking world where it came to be understood as “everything is text”, an interpretation which created a minor publishing industry.  In English, it’s a marvellously literalist use and while it does to an extent overlap with the author’s original intention, Derrida meant there is (1) no access to pure, unmediated presence and (2) no meaning outside interpretation and no experience outside context.  In using texte he was referring to the interplay of differences, traces, relations, and contexts that make meaning possible (ie not literally the words as they appear on a page).  What that meant was all acts were “textual” in that they must be interpreted and are intelligible only within systems of meaning; the phrase a philosophical statement about signification and mediation, not characters printed on a page.

Fiveable's diagram of what we need to know to understand literature.  Hope this helps.

However, demonstrating (in another way) the power of language, the “everything is text” movement (“cult” may once have been a better word) in English came to be understood as meaning no reality could exist beyond language; everything (literally!) is text because it is words and discourse which both construct and describe reality.  That notion might have remained in an obscure ivory tower were it not for the delicious implication that values such as right & wrong and true & false are also pieces of text with meanings able to be constructed and deconstructed.  That meant there was no stable “truth” and nothing objectively was “wrong”; everything just a construct determined by time, place and circumstances.  That Derrida never endorsed this shocking relativism was noted by some but academics and students found so intoxicating the notion of right & wrong being variables that “everything is text” took on a life of its own as a kind of selective nihilism which is, of course, quite postmodern.  Again, language was responsible because the French texte was from the Latin textus, from texō (weave) and while in French it can mean “text” (in the English sense), among philosophers it was used metaphorically to suggest “weave together”; “an interconnected structure” in the sense of the Latin textus (woven fabric); it was this meaning Derrida used.  Had the English-speaking world remained true to the original spirit of Il n'y a pas de hors-texte it would have entered the textbooks as something like “There is nothing outside the interplay of signs and contexts; there is no meaning outside systems of interpretation” and perhaps have been forgotten but “everything is text” defined and seduced a movement.  Thus, it can be argued things either were “lost in translation” or “transformed by translation” but for the neo-Derridaists there’s the satisfaction of knowing the meaning shift was an example of “grammatology in action”.

Saturday, November 8, 2025

Patent

Patent (pronounced pat-nt or peyt-nt)

(1) The exclusive right, granted by a government to an inventor (or owner of the invention) to manufacture, use or sell an invention for a certain length of time.

(2) An invention or process protected by an exclusive right to manufacture, use, or sell it.

(3) An official document conferring on the inventor the exclusive right to manufacture, use, or sell an invention; letters patent.

(4) Protected by an exclusive right given to an inventor to manufacture, use, or sell an invention; patented; the holding of an exclusive right to manufacture, use, or sell an invention.

(5) Relating to, concerned with, or dealing with the granting of exclusive rights to sell or manufacture something, especially inventions (ie the matter of “patent law” dealt with by a “patent attorney”).

(6) Of or pertaining to a right, privilege etc conferred by a patent.

(7) To take out a patent on; obtain the exclusive rights to (an invention, process, etc) by securing a patent.

(8) In US law, the instrument by which the federal government conveys a legal title in fee-simple (freehold) to public land.

(9) An ellipsis of patent leather (a varnished, high-gloss leather used in fashion for shoes, handbags, coats and such).

(10) As patent leather, a hide treated in a way which results in a very shiny surface.

(11) Of plate glass, ground and polished on both sides.

(12) In pharmaceuticals, (of a medication) sold without a prescription and usually protected by an exclusive legal right to manufacture (described often as “patent remedies” or “patent drugs”).

(13) In medicine, (of a duct or passage in the body) open or unobstructed.

(14) In medicine (including veterinary medicine) of an infection, in the phase when the organism causing it can be detected by clinical tests.

(15) In phonetics, open, in various degrees, to the passage of the breath stream.

(16) In metallurgy, to heat a metal above a transformation temperature and then quench (cool) it in preparation for cold-drawing, wire pulling etc.

(17) In gambling, the combination of seven bets on three selections, offering a return even if only one selection wins (a worked sketch of the arithmetic appears after this list).

(18) In baking (of flour), fine, and consisting mostly of the inner part of the endosperm of the grain from which it is milled.

(19) In botany (and sometimes in horticulture and agriculture generally), expanded or spreading.

(20) Lying open; not enclosed or shut in (often as “a patent field” and applied also to open doorways, passages and such).

(21) Readily open to notice or observation; evident, unconcealed, conspicuous, palpable, clear (usually in the phrase “patently obvious”).

(22) To originate and establish as one's own.

(23) A characteristic or quality that one possesses; in particular (hyperbolic) as if exclusively; a monopoly (often in the form “got a patent on”).

(24) An official document granting a right (the significance of the "patent" element in "letters patent" being it indicated the document was openly published and accessible to all (ie in the sense of the Latin patēns)).

(25) Any right granted by such a document.
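The “seven bets” in the gambling sense at (17) are the three singles, three doubles and one treble which can be formed from three selections, and a combination bet pays only if every selection in it wins, which is why a single winner still produces a return (from its single).  The sketch below is a minimal illustration in Python, with hypothetical decimal odds, not a model of any bookmaker’s actual settlement rules.

from itertools import combinations

def patent_return(stake_per_bet, odds, winners):
    # Settle a Patent: 3 singles, 3 doubles and 1 treble over three selections.
    # odds maps each selection to (hypothetical) decimal odds; winners is the set
    # of selections which won.  A combination bet pays only if all its selections win.
    selections = list(odds)
    assert len(selections) == 3, "a Patent covers exactly three selections"
    bets = [c for n in (1, 2, 3) for c in combinations(selections, n)]  # 3 + 3 + 1 = 7 bets
    total = 0.0
    for bet in bets:
        if all(s in winners for s in bet):
            payout = stake_per_bet
            for s in bet:
                payout *= odds[s]
            total += payout
    return total

# Only selection "A" wins: just its single pays, returning 3.0 for a 1.0 unit stake per bet.
print(patent_return(1.0, {"A": 3.0, "B": 2.5, "C": 4.0}, {"A"}))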

1250–1300: As an adjective, patent was from the Middle English patent, from the Latin patent-, stem of patēns (open, standing open), present participle of patēre (to stand open, lie open).  The Middle English noun patent (document granting an office, property, right, title, etc.; document granting permission, licence; papal indulgence, pardon) was either a clipping of “letters patent”, a translation of the Medieval Latin littera patēns or litterae patentēs (open letters) or was directly from the Anglo-Norman and Middle French patente (which endures in modern French as patent) or a clipping of the Anglo-Norman lettres patentes, Middle French lettres patentes, lettre patente and Old French patentes lettres (document granting an office, privilege, right, etc or making a decree).  The adjective patent (granting a right, privilege, or power) emerged late in the fourteenth century while the sense of “open to view, plain, clear” was in use by at least 1505 and use as an adverb dates from the mid fifteenth century.

The verb dates from the 1670s and was derived from the Middle English nouns patent & patente (wide open; clear, unobstructed; unlimited; of a document: available for public inspection), from the Anglo-Norman & Middle French patent and directly from their etymon the Latin patēns (open; accessible, passable; evident, manifest; exposed, vulnerable), the present active participle of pateō (to be open; to be accessible, attainable; to be exposed, vulnerable; of frontiers or land: to extend, increase), from the primitive Indo-European pete or peth- (to spread out; to fly).  The verb originally was used in the sense of "to obtain right to land by securing letters patent" while the meaning "obtain a copyright to an invention" was in use by at least 1822, building on the earlier meaning (recorded in 1789) "obtain an exclusive right or monopoly", a privilege granted by the Crown by the issue of letters patent.  Patents issued thus (for a licence granted by a government covering a new and useful invention, conferring exclusive right to exploit the invention for a specified term of years) came into use in the 1580s.  Patent is a noun, verb & adjective, patenter, patentor, patentee, patentholder, patency, patentability, impatency & prepatent are nouns, patented is a verb & adjective, patenting is a verb, patentable, antipatent, patentlike, patentfree, patentless & impatent are adjectives and patentably & patently are adverbs; the noun plural is patents.

Alice Geek Teck™ strapless bra.  The product was released with Geek Teck™ still in its "Pat. pend." phase.

The Alice Geek Teck™ strapless bra was released in 2015, its novelty being the use of “Patent-Pending Geek Teck™” panels which exploited the van der Waals forces (intermolecular electrostatic attractive forces) created by their silicone construction with microscopic hair-like structures known as setae (analogous to those found on the feet of geckos, famous for their ability to attach themselves (upside-down) to ceilings, using, if need be, only one foot).  The theory was the Geek Teck™ panels would “stick to” the wearer’s skin thereby enhancing the most important design imperative of the strapless bra: staying up.  US patent 9,402,424 was assigned to Kellie K apparel LLC but the product seems not to have succeeded which is unfortunate because there’s a gap in the market for a genuinely gravity-defying strapless bra.

The familiar term “patent pending” (often seen stamped on products in the abbreviated form “Pat. pend.”) is used to indicate a patent application has been filed but has not yet been granted.  The significance of the use is: (1) it can act to deter competitors, signalling to potential “copycats” that patent protection is expected to be granted, thus discouraging attempts at imitation, (2) it’s thought to lend credibility to a product, thus conferring a marketing advantage, (3) it can make a product more attractive to potential investors because a patent grants years of protection from competition and (4) the existence of the label can in subsequent infringement proceedings lead to a higher award of damages because it can be used as evidence the other party did not act “in good faith”.  However, the mere existence of a “Pat. Pend.” label does not provide legal protection and others may still (at their own risk) copy and sell the product, something of significance because patent applications can take months (or even years in complex or contested matters) to process and there have been cases where a company violating a subsequently granted patent has “come and gone” (taking with them their profits) by the time a patent is granted.  Importantly, a manufacturer cannot mark something as “Pat. Pend.” just to try to ward off potential competition and in most jurisdictions it’s unlawful to use the term if no application has been filed.  In legal slang, “patentese” and “patentspeak” are terms referring to the legal and technical jargon used in the handling of patents while “patentometrics” is the statistical analysis of patents.

In law, “patent troll” is an informal term used (usually disparagingly) to describe an individual or company which acquires and enforces patents in an aggressive and opportunistic manner, often with no intention of producing, marketing, or promoting the subjects of the patents.  The term is based on the similar concepts “trademark troll” and “copyright troll” and in more formal use a “patent troll” is usually styled a “patent assertion entity” or a “non-practicing entity”.  The seemingly curious business model (making money by neither producing nor selling stuff to which one holds the exclusive patent) works usually through litigation or (more typically) the threat of litigation, exploiting the cost–benefit imbalance between contesting versus settling a lawsuit.  Sometimes speculatively but usually because potential targets have been identified, patent trolls will (1) buy older or unused patents from bankrupt companies, small inventors or concerns which have no further use for them or (2) file new patents that are broad or vague, something especially prevalent in highly technical fields where change is rapid (anything IT related the classic example) and specialists can amass hundreds or even thousands of patents, some unambiguously enforceable, some with enough of a hint of validity to be a credible threat.  Thus equipped, patent trolls search for possible targets for litigation, the ideal victims being (1) companies so big they might settle a claim for what is (for them) a small sum (though most lucrative for the trolls who may have done little more than send a C&D (cease & desist letter)) or (2) smaller companies which cannot afford the cost of litigation (they might settle for less but it’s still a profit to the troll) because even if a case successfully is defended, the cost of doing so can, in the US, run to millions.

What that means is the troll’s business model has three potential revenue streams: (1) licensing fees, (2) one-off settlements and (3) court-awarded damages (in the rare instances in which a case goes to trial).  With no costs associated with R&D (research & development), product testing, production or marketing, a troll’s overheads are comparatively minimal and limited usually to legal and administrative fees.  Highly developed practitioners of trolling also use elaborate company structures made up of trusts, shelf companies and such, often in trans-national form, the jurisdictions chosen on the basis of which is most advantageous for a certain purpose (secrecy, taxation arrangements, limitations of liability etc); all these layers can protect a troll’s assets from counter-claims.  Patents are also “just another asset” and once assembled become a portfolio which can be leveraged as investment vehicles, something done often by the device of bundling them in securitized form, sometimes S&Ded (sliced & diced) for sale to investors, not as individual patents but as a percentage of the whole.

Some products become known as “patent something” because they gained their original uniqueness by virtue of patent protection.  In nautical use, a “patent log” is a mechanical device dragged from the stern of the vessel and used to indicate the craft’s speed through the water; most consist of a rotator (ie on the principle of a propeller) and reading unit, connected by a stiff line (usually covered with a flexible, protective skin).  Even in the age of electronic sensors, patent logs remain in use because they are simple, reliable, low maintenance units which require no external power source, the rotator spinning as it proceeds astern, the rotations of the connecting line registered by a wheelwork and dial mounted to the vessel's rail.  The earliest versions of mechanical logs had the counting mechanism attached directly to the rotator, meaning the apparatus had to be hauled aboard to “take a reading” so the US innovation in the 1860s of a connecting line (spinning à la the mechanical speedometers which later would appear in automobiles) was an advance which made the thing a “real time” device.

An 1881 Patent Log by Thomas Walker, on display at the Smithsonian Natural History Museum.

For many reasons, to know a vessel’s true speed was an important part of seamanship and the “log” element in the name came from the old way sailors determined speed.  Since the sixteenth century, the technique had been to attach a knotted rope to a wooden log which was heaved overboard and, the knots being tied at regular intervals, the number of knots counted off over a short period indicating the speed.  From this came the standard unit of speed at sea being the “knot” (one knot being equal to one nautical mile per hour and few things annoy old salts more than the expression “knots per hour”).  The log method obviously was inexact because of the variables to which it was subject so the mechanical device was a great advance.  A company founded by Thomas Walker (1805-1873) as a nautical instrument maker based in Birmingham (in England about as far as one can get from the sea) received a patent for a mechanical log in 1878, sometime before one was granted by the US patent office although that application was submitted in 1877.
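For those who like the arithmetic spelled out, the figures usually cited for the old knotted-line method are knots spaced about 47 feet 3 inches apart and a count taken over a 28-second sand glass, proportions chosen so that each knot paid out corresponds to one nautical mile (roughly 6,076 feet) per hour; the short Python sketch below simply restates that conversion and the traditional spacing and timing are quoted as an illustration rather than from this post’s sources.

NAUTICAL_MILE_FT = 6076.1  # approximate feet in one nautical mile

def speed_in_knots(knots_counted, knot_spacing_ft=47.25, glass_seconds=28):
    # Convert a chip-log count to speed in nautical miles per hour ("knots").
    # With ~47 ft 3 in spacing and a 28-second glass, one knot counted comes out
    # very close to one nautical mile per hour, which is why the unit kept the name.
    distance_ft = knots_counted * knot_spacing_ft
    ft_per_hour = distance_ft * 3600 / glass_seconds
    return ft_per_hour / NAUTICAL_MILE_FT

print(round(speed_in_knots(8), 2))  # a count of 8 is almost exactly 8 knots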

Lindsay Lohan during blonde phase in Lanvin patent leather coat, New York City, May 2007.

“Patent leather” describes a hide which has been coated, by one of several processes, with a substance which produces a high-gloss finish, so shiny as to be described as “like a polished, glazed ceramic”.  In fashion, the attraction of patent leather is that despite the brittle appearance, it retains all the flexible qualities and durability of leather while being almost waterproof (although water can of course intrude at the seams).  Most associated with shoes, boots, handbags and coats, the original patent leather seems exclusively to have been produced in black but a wide range of colors have long been available so the material quickly became a favourite of designers.  In the late 1700s when patent leather first became commercially available in England, the lacquer coating was linseed oil-based but what revolutionized things and made mass-production more viable was the invention by metallurgist Alexander Parkes (1813–1890) of Parkesine, the first man-made “plastic”; it was one of his dozens of patented inventions, thus the name “patent leather”.  It was Parkesine which enabled the development of multi-colored patent leathers and because the product literally is “leather with a synthetic coating”, it’s one of the natural products most easily emulated (in appearance) by a plastic alternative although the imitations never possessed the same qualities.  Interestingly, many of the various processes used early in the nineteenth century to produce patent leather were never patented.

The former Court of Star Chamber (1836), drawing by unknown artist.

There were also “patent theatres”.  In England letters patent were for years a standard device in the administration of censorship, something that attracted increased interest from governments as soon as the printing presses began to operate at scale.  The printing press was one of the great creations of civilization but its availability appalled priest and politician alike because the last thing they wanted was “the common people” being given ideas (which they knew quickly would become heresy and sedition).  Under Henry VIII (1491–1547; King of England (and Ireland after 1541) 1509-1547) proclamations against heretical and seditious publications soon appeared and in 1538 a statute was added declaring books must be licensed for printing by the Privy Council or other royal nominees.  What this did was create a flourishing black market for works produced by illegal presses and this battle between censorship and “underground” publications would for some 450 years characterize the way things were done in England.  One critical development came in 1557 when the Stationers' Company was granted a “charter of incorporation” which provided that only members of the company (or others holding a special patent) were allowed to print any work for sale in the kingdom.  In 1586, the ever imaginative Court of Star Chamber devised an ordinance which directed that no printing press might be set up in any place other than London (with the exception of one each for the university towns Oxford and Cambridge) and the Star Chamber enforced this law with its usual zeal; although the court was in 1641 abolished by the Long Parliament, governments didn’t lose their fondness for censorship; under the Commonwealth restrictions were tightened with all “unofficial periodicals” banned (a move aimed at troublesome “newsletters”, precursors to modern magazines and newspapers) and while the Rump Parliament of 1659 permitted “licensed newsbooks”, their issue was severely restricted.

During the Restoration period neither the government’s strategy nor tactics much changed and material deemed libellous or offensive (values which cast a wide net) to the state or Church could see offenders fined, imprisoned, pilloried or hanged (the last invoked if the offence was judged “high treason”).  By the eighteenth century things had somewhat been relaxed but Thomas Paine (1737-1809) was nevertheless compelled to flee to France when his book Rights of Man (1791) was declared “subversive” and a warrant issued for his arrest; even an article condemning the use of disciplinary flogging by the military could attract a fine of Stg£1,000 (then a small fortune) and two years in prison.  Being popular entertainment and accessible to even the illiterate, the theatre was an obvious target for censorship and the licensing of individual plays long predated the inspired piece of legislation of 1572 which deemed all players (actors) “rogues and vagabonds” unless they belonged to (1) a baron of the realm, (2) somebody of higher rank or (3) were licensed by two justices.

Theatre Royal, Drury Lane, London, one of the original two "patent theatres".

Later, London’s theatrical companies worked under royal patents created by issue of the appropriate letters patent.  Curiously, governments, while much concerned with the preservation of political & ecclesiastical power, had rather neglected public morality but the Puritans were appalled by even the idea of the theatre.  Oliver Cromwell (1599–1658; Lord Protector of the Commonwealth 1653-1658) and his ilk thought the stage a place of immorality and in 1642 the Long Parliament prohibited all dramatic performances.  Inevitably, with theatres closed, an underground movement arose, the best documented of which were the Droll-Humours.  At or after the Restoration, letters patent were issued so companies could be formed and in 1662 these conferred on the recipients the exclusive right to present plays in public within the City of Westminster.  It proved a lucrative business and after the deaths of the original holders of the rights, lawyers began their squabble over to whom or what entity the letters patent should be conveyed; the disputes dragged on for some time before ultimately they were settled on the Covent Garden and Drury Lane theatres.  These enduring institutions thus came to be called the “patent theatres” and what the letters called “drama” was confined to the patent theatres.  However, nobody had bothered to define exactly what constituted “legitimate drama” and that remained a source of dispute among critics and lawyers, resolved only when the Theatres Act (1843) rendered the original letters patent inoperative.

Drawing of patent hammer, attached to Mr Richards' application to the US Patent Office.  The image is from the Trowel and Masonry Tool Collector Resource.

In stone-masonry, a “patent hammer” is a specialized hammer used by stonemasons for dressing stone, the head having two faces each formed by a number of “cuts” (blades): broad, thin chisels bolted side by side (at least 2 but usually 4, 6, 8, 10 or 12); the bolts could be loosened, allowing the blades to be removed for re-sharpening or replacement.  The head of a patent hammer was heavy and the tool was used for finishing granite or the harder grades of sandstone, the choice of which to use dictated by the nature of the stone and the finish desired.  Historically, the most commonly used jaw opening was ⅞ inch but other graduations between ½ and one inch were widely produced and in the jargon of the trade, the number of cuts per nominal inch became the nominal description (eg an “8-cut finish”).  Essentially a time-saving device, use of a patent hammer allowed a stonemason to render a grooved surface more quickly and with more consistency than when using a single hand chisel.  The tools were in various places known also as the “patent bush hammer”, “Scotia hammer” and “patent Scotia hammer” although, as a general principle, the Scotias usually were lighter and featured smaller jaw openings.  The tool gained its name from the patent granted in 1828 to Joseph Richards (1784-1848) of Braintree, Massachusetts and although the evidence suggests similar devices had for centuries been in use (presumably created ad-hoc by stonemasons or tool-makers), the issue of the 1828 patent triggered an onrush of patent applications for stonemasonry tools and the US Patent Office (which classed them as “bush hammers” or “mill picks” to distinguish them from other hammers) soon had literally dozens of variants on the books.

In English law, letters patent and royal decrees (now more commonly styled as royal proclamations) are instruments with which the Crown exercises its prerogative powers, but they differ in form, purpose, and legal effect.  Letters patent are formal, written documents issued under the Great Seal, open for public inspection, declaring the monarch’s will in a matter of public record; they are addressed to all subjects, not to an individual or private recipient.  As an administrative device, letters patent are used to confirm rights, titles, offices, or privileges (including creating or conferring peerages or knighthoods), granting corporate charters (universities or city incorporations etc), issuing patents of invention or land grants and making appointments to public offices of state (governors, judges etc).  As legal devices, they operate as instruments of grant rather than command and, unusually, take effect by virtue of being published, not by their delivery, registration or some form of gazetting.  Importantly, they can be subject to judicial challenge and voided if found to have been issued ultra vires (a legal maxim from the Latin ultra vires (beyond the power), meaning (in this case) held to be beyond the monarch’s lawful prerogative) so although sounding something of an echo of the days of absolute power being exercised from the throne, they do operate within modern constitutional limits.

A royal proclamation is a command or declaration made by the monarch and issued over their signature but almost always drafted by the responsible ministers in government and published in the Gazette.  While a term like “royal proclamation” sounds like it might be used for commands like “off with their heads”, in modern use, typically, they’re invoked to announce or enforce policies, orders, or regulations and that this is done under the royal prerogative is merely procedural.  So, while most are prosaic (the regulation of this and that; announcing public holidays or public ceremonies etc), historically, royal proclamations have declared war and routinely still are the instrument summoning or dissolving parliament.  In the narrow technical sense the royal proclamation operates as an executive command rather than a grant but has a valid force of law only when issued under a lawful prerogative or statutory authority; since the Bill of Rights (1689), proclamations cannot create new offences or change existing law without the consent of both houses of parliament (as modified by the Parliament Acts (1911 & 1949)).

Mr Andrew Mountbatten Windsor (the former Prince Andrew, Duke of York) in the Garter robe he no longer dons (at least not when in public view).  Mr Mountbatten Windsor is the great grandson of King George V.

Because most are procedural, letters patent usually barely register in the public consciousness but, around the world, their use in late 2025 in the matter of Andrew Albert Christian Edward Mountbatten Windsor (b 1960) certainly made headlines.  Mr Mountbatten Windsor once was styled HRH (His Royal Highness) Prince Andrew, Duke of York, KG (Knight Companion of the Most Noble Order of the Garter), GCVO (Knight Grand Cross of the Royal Victorian Order) but the controversy about his alleged conduct with certain young women associated with the convicted child sex offender Jeffrey Epstein (1953–2019) meant that between 2022 and 2025, almost all his many titles gradually were (in one way or another) put into abeyance before his brother Charles III (b 1948; King of the United Kingdom since 2022) issued the letters patent effectively removing all.

Until that point, the gradual nibbling away of Mr Mountbatten Windsor’s array of titles had been an example of inept crisis management with him in 2022 ceasing to be an “HRH” in a “public capacity” but remaining one in his “private capacity”.  That didn’t mean he could use it only in his bedroom but meant it couldn’t be used were he to appear at any “official public event”.  While one being able to call oneself “HRH” only in private (presumably among consenting adults) might sound a bit of a slap on the royal wrist, it is possession of styles and titles which determines one’s place in the “order of precedence”, something of great significance to those who move in certain circles because where one sits on the pecking order determines things like who has to bow or curtsy to whom and whether at events one gets to sit somewhere nice with the dukes & earls or is shunted off into a corner with the provincial mayors and eldest sons of knights.  As a weapon, the removal of the “HRH” has been used against the Duchess of Windsor (Wallis Simpson; 1896–1986), Diana, Princess of Wales (1961-1997) and the Duchess of Sussex (Meghan Markle; b 1981).  Although Mr Mountbatten Windsor’s notorious television interview (approved by the palace courtiers against the advice of the media pros) seemed at the time the nadir of the crisis management of the “Andrew problem” (ranking with Boeing’s handling of the 737 Max’s “issues” and Intel’s attempt to “non-handle” the flaws in the original Pentium’s inbuilt math co-processor), the “drip feed” of the way his styles and titles gradually were eroded made things worse still.  As a footnote, the former Prince Andrew is now known as “Andrew Mountbatten Windsor” rather than “Andrew Windsor” because his father (Prince Philip, Duke of Edinburgh (1921–2021)) was upset his sons wouldn’t bear his name so the “Mountbatten” was added.

Revelations about his alleged conduct continued to emerge and in mid-October 2025, it was announced that following discussions with the king, he would cease to make use of the styles of address to which he was entitled as a duke and twice a knight of the realm (both knighthoods being in orders of chivalry in the personal gift of the sovereign (his mother) with no involvement by government).  That didn’t mean he ceased to be a duke (with subsidiary peerages) or the possessor of two knighthoods in orders of chivalry, just that he would no longer “use them”.  That meant for all public purposes he would revert to what he was by virtue of his birth: plain old “Prince Andrew”.  Had the revelations stopped there, the “fix” might have worked but as fresh accusations continued to appear, not only was the press making trouble but there were suggestions “the Andrew problem” might be discussed on the floor of the House of Commons where members enjoy what’s called “parliamentary privilege” (the right to make even defamatory statements without risk of legal action).  What appeared to be of particular interest to some politicians was Mr Mountbatten Windsor remaining eighth in the line of succession to the British throne (and thus the monarchies of Australia, the Bahamas, Belize, Canada and such).

Accordingly, on 30 October 2025, the palace announced the king would be removing all his brother's styles, titles, and honours.  While technically this does not revoke the peerages, it does mean they are no longer “effective” and thus do not affect the vital order of precedence.  On 3 November, the king issued letters patent stripping Andrew of both the style “HRH” and the title “prince”.  That the king can do this by the mere inking of a sheet of vellum is because (1) letters patent are a powerful tool and (2) in 1917 George V (1865–1936; King of the United Kingdom & Emperor of India 1910-1936) effectively codified the monarch’s authority in such matters; no involvement by parliament being required.  In 1917 the UK was at war with the German Empire so anti-German sentiment was about and as well as changing the royal family’s name from the obviously Teutonic Saxe-Coburg-Gotha to Windsor, the opportunity was taken for an “agonizing reappraisal” of the domestic structure.

Letters Patent issued by George V, 30 November 1917.  When mention was made of the "Great Seal of the United Kingdom of Great Britain and Ireland", the reference was literally to a big wax seal.

Thus, King George V issued letters patent restricting use of the titles “Prince” & “Princess” and the style “HRH” to certain close relatives of the monarch: (1) the children of the sovereign, (2) the male-line grandchildren of the sovereign and (3) the eldest living son of the eldest son of the Prince of Wales (ie the heir apparent’s eldest son).  Other descendants of the monarch would be styled as children of dukes (Lord or Lady).  In doing this George V wasn’t claiming or asserting a new royal prerogative (it had long been acknowledged) but his issue of the 1917 Letters Patent was the moment it was codified and assumed the force of a formal decree.  That’s why it’s misleading to say the UK doesn’t have a written constitution; it’s just all the bits and pieces don’t appear in one consolidated document à la the US, Australia or the old Soviet Union.  The words of the 1917 Letters Patent were:

Whitehall, 30th November, 1917.  The KING has been pleased by Letters Patent under the Great Seal of the United Kingdom of Great Britain and Ireland, bearing date the 30th day of November, 1917, to declare that the children of any Sovereign of these Realms and the children of the sons of any such Sovereign and the eldest living son of the eldest son of the Prince of Wales shall have and at all times hold and enjoy the style, title or attribute of Royal Highness with their titular dignity of Prince or Princess prefixed to their respective Christian names or with their other titles of honour; and that the grandchildren of the sons of any such Sovereign in the direct male line (save only the eldest living son of the eldest son of the Prince of Wales) shall have and enjoy in all occasions the style and title enjoyed by the children of Dukes of these Our Realms.

And forasmuch as it has become expedient that the usage whereby the style, title or attribute of Royal Highness and of Prince or Princess shall be borne by other descendants of Our said Grandfather of blessed memory shall cease, We do hereby further declare that the said styles, titles or attributes shall not henceforth be borne by such descendants of Our said Grandfather save those above mentioned.

Legally, “Our said Grandfather” actually referred to Victoria (1819–1901; Queen of the UK 1837-1901) and what the proclamation did was revoke the practice from Victoria’s time where almost all male-line descendants of the monarch were styled as princes or princesses.  Some countries still operate on the Victorian basis and a particular example is Saudi Arabia, a nation where, under their interpretation of the Sharia, kings and princes may enjoy more than the four wives which is the accepted limit in most Islamic nations which permit polygyny.  The royal scions have thus proliferated and if one moves in certain exalted circles, apart from the odd waiter or hairdresser, it can be possible to go through life and never meet a Saudi who is not a prince or princess.  In Saudi Arabia, for many reasons, it would be difficult to change the system but in Denmark there recently was a cull of princes and princesses (the titles that is) with those who didn’t make the cut reverting to being counts and countesses of this and that.  For almost a century the 1917 Letters Patent remained the convention followed but on 31 December 2012, Elizabeth II (1926-2022; Queen of the UK and other places, 1952-2022) issued letters patent extending both HRH and Prince or Princess status to all the children of the eldest son of the Prince of Wales:

Whitehall, 31st December, 2012.  The QUEEN has been pleased by Letters Patent under the Great Seal of the Realm dated the 31st day of December 2012 to declare that all the children of the eldest son of The Prince of Wales should have and enjoy the style, title and attribute of Royal Highness with the titular dignity of Prince or Princess prefixed to their Christian names or with such other titles of honour.

What that achieved was a bit of “title creep”.  Under the George V rule, only the eldest living son of the eldest son of the Prince of Wales would have been styled a prince; younger siblings would not have been princes or princesses but rather Lord or Lady Mountbatten-Windsor.  What Elizabeth II’s 2012 Letters Patent did was equalize things so all the children of the eldest son of the Prince of Wales would be both HRH and princes or princesses; it’s a thoughtful great-grandmother who thinks of a way to avoid sibling rivalry.  There have since been no further general amendments to the 1917 convention although the royal prerogative has been used to grant or remove titles individually, such as the letters patent issued granting the titles prince & princess to the Duke of Sussex’s children.

Windsor Castle, September, 2025.

The UK government's state banquet in honor of the visiting Donald Trump (b 1946; US president 2017-2021 and since 2025), hosted in Windsor Castle in September 2025.  Where one sits on the UK's order of precedence will influence (1) whether one is invited and (2) whether one gets a "good" seat.  Among US presidents, Mr Trump's second state visit was unprecedented.

So, titles and styles are quite a thing in royal families because they operate as a pecking order atop a pecking order.  Despite the frequency with which the claim is made, the British royal family is not wholly averse to change and one change they would welcome would be things going back to how they were done decades or centuries ago: In 1938, George VI (1895–1952; King of the United Kingdom 1936-1952), being driven through Surrey in the company of a US journalist, gestured through the window towards Runnymede, telling his companion: “That’s where the troubles started”.  For the institution of the monarchy, there have since 1215 been many troubles, some quite serious but apart from a brief, aberrant, republican interlude, one royal household or another has remained in place, challenges dealt with as they’ve arisen.  For the royal family, the matter of “the Andrew problem” is not so much what he’s alleged to have done (which could have been handled with the odd wry smile and otherwise never spoken of) but the ghastliness of it becoming public knowledge among “the common people”.  The attraction of “fixing things” by the use of letters patent is it’s quick and (it’s hoped) will mean “the Andrew problem” doesn’t end up being discussed in the House of Commons.  That would be bad enough but once such things start they can get out of hand and if one matter about the royal family is being discussed in parliament, there’s no guarantee it wouldn’t lead to other aspects being questioned.  There are many things about the royal family and their place in the UK’s constitutional apparatus which they’d prefer not be discussed and certainly not in the House of Commons.  As a tactic, the letters patent may well keep the commoners in the Commons at bay but Mr Mountbatten-Windsor’s life may yet get worse because various institutions in the US are interested in questioning him in relation to alleged offences committed on US soil and an extradition request is not impossible.

Tuesday, November 4, 2025

Chopstick

Chopstick (pronounced chop-stik)

(1) A harmonically and melodically simple waltz for piano played typically with the forefinger of each hand and sometimes having an accompanying part for a second player.  Originally, it was called The Celebrated Chop Waltz, written in 1877 by British composer Arthur de Lulli (the pen name of Euphemia Allen (1861-1949)); it’s used often as a two-finger exercise for those learning the piano and the name comes from the idea of the two fingers being arrayed in a chopstickesque way (should be used with an initial capital).

(2) In hand games, a game in which players hold up a number of fingers on each hand and try, through certain moves, to eliminate their opponent's hands.

(3) A pair of thin sticks (of ivory, wood, plastic etc), typically some 9-10 inches (230-255 mm) in length, used as eating utensils by the Chinese, Japanese, and others in East Asia as well as by those anywhere in the world eating food associated with these places.

(4) As an ethnic slur, a person of East Asian appearance.

(5) In fishing gear, a long straight stick forming part of various fishing tackle arrangements (obsolete).

(6) In parts of Australia where individuals are subject to “attack” by “swooping” magpies, the use of cable ties on bicycle helmets to produce long, thin (ie chopstickish) protrusions which act as a “bird deterrent”.

(7) In automotive slang, the “parking guides” (in some places known as “gutter scrapers”) mounted at a vehicle’s extremities to assist when parking or navigating tight spaces.  They have since been replaced by sensors and cameras but were in their day an impressively effective low-tech solution.

1590s (contested): The construct was chop + stick.  The use to describe the eating utensil was first documented in 1637 and may have been a transfer of the sense from the earlier use to describe fishing tackle (in use since at least 1615) which was based on the physical resemblance (ie long & thin).  The “chop” element was long listed by dictionaries as being from the Chinese Pidgin English chop (-chop) (quick), a calque from the Chinese 筷子 (kuàizi) (chopstick), from 快 (kuài) (quick) but this is now thought improbable because there is no record of Chinese Pidgin English until the eighteenth century.  The notion of the link with Chinese Pidgin English appeared first in the 1880s with the rationale: “The Chinese name of the article is ‘kwai-tsz’ (speedy-ones)” which was a decade later refined with the explanation “Possibly the inventor of the present word, hearing that the Chinese name had this meaning, and accustomed to the phrase chop-chop for ‘speedily,’ used chop as a translation.”  This became orthodoxy after being picked up for inclusion in the OED (Oxford English Dictionary (1893)), a publication so authoritative the explanation spread to most other English language dictionaries from the late 19th century onwards.  The chronological impossibility of the Pidgin English theory was first noted by Kingsley Bolton (b 1947) in Chinese English: A Sociolinguistic History (2003).  The English form is now thought to come simply from the use of the Chinese, modified over time and by oral transmission.  The current orthodoxy is the Pidgin English chop (quick; fast) was from the Cantonese word chāu (快) (quick).  The construct of the Chinese kuàizi (筷子) was kuài (筷) (quick) + zi (子) (a diminutive suffix).  Stick was from the Middle English stikke (stick, rod, twig), from the Old English sticca (twig or slender branch from a tree or shrub (also “rod, peg, spoon”)), from the Proto-West Germanic stikkō, from the Proto-Germanic stikkô (pierce, prick), from the primitive Indo-European verb stig, steyg & teyg- (to pierce, prick, be sharp).  It was cognate with the Old Norse stik, the Middle Dutch stecke & stec, the Old High German stehho, the German Stecken (stick, staff), the Saterland Frisian Stikke (stick) and the West Flemish stik (stick).  The word stick was applied to many long, slender objects closely or vaguely resembling twigs or sticks including (by the early eighteenth century) candles, dynamite by 1869 and cigarettes by 1919 (the slang later extended to “death sticks” & “cancer sticks”).  Chopstick, chopstickful, chopstickery & chopsticker are nouns, chopsticking & chopsticked are verbs and chopstickish & chopstick-like are adjectives; the noun plural is chopsticks and the word is almost always used in the plural (sometimes as “a pair of chopsticks”).  The adjective chopstickesque is non-standard.

Niche market: a pair of chopsticks in 18-carat gold, diamonds, pearls, and ebony by Erotic Jewellery, Gold Coast, Australia.  The chopsticks were listed at Aus$139,000 and have the environmental benefit of being endlessly reusable; they are also dual-purpose, the pearl mounted at the end of one chopstick being detachable and able to be worn as a necklace.

In English, chopstick has proved productive.  A chopsticker is one who uses chopsticks, chopstickery describes the skill or art of using chopsticks, a chopstickful describes the maximum quantity of food which can be held in one pair of chopsticks (à la “mouthful”), chopstick land was a slang term for China (used sometimes of East Asia generally) but is now listed as a microaggression, chopstick legs (always in the plural) is a fashion industry term describing long, thin legs (a usually desirable trait), chopstickology is a humorous term used by those teaching others the art of using chopsticks (on the model of “mixology” (the art of making cocktails), “Lohanology” (the study of Lindsay Lohan and all things Lohanic), “sockology” (the study of socks) etc), a chopstick rest is a small device upon which one's chopsticks may be placed while not in use (known also as a chopstick stand), chopstickless means lacking or not using chopsticks, and chopsticky is an adjective (the comparative “more chopsticky”, the superlative “most chopsticky”) meaning (1) resembling a chopstick (ie “long and thin”) (chopstick-like & chopstickish the alternative adjectives in this context), (2) suitable for the use of chopsticks or (3) characterized by the use of chopsticks (the companion noun chopstickiness meaning “the state of being chopstickish”).  Chopstickism was once used of things considered Chinese or Asian in character but is now regarded as a racist slur (the non-standard chopstickistic similarly now proscribed).

They may be slender and light but because annual consumption of disposable pairs runs into the billions, there is a significant environmental impact associated with chopsticks including deforestation, waste and carbon emissions.  Beginning in the early twenty-first century, a number of countries in East Asia have taken measures designed to reduce the extent of the problem including regulatory impositions, technological innovation and public awareness campaigns.  In 2006, the Chinese government levied a 5% consumption tax on disposable wooden chopsticks and later began a “Clean Your Plate” publicity campaign to encourage sustainable dining practices.  In Japan, although disposable chopsticks (waribashi) remain common, some local governments (responsible for waste management) promote reusable options and businesses have been encouraged to offer reusable or bamboo-based alternatives.  The RoK (Republic of Korea (South Korea)) went further, promoting reusable metal chopsticks, devices which can last a lifetime.

The Chork

Although the materials used in construction and the possibilities of recycling have attracted some interest, there has for hundreds of years been no fundamental change in the chopstick’s design, simply because it long ago was (in its core function) perfected and can’t be improved upon.  However, in 2016, the US fast food chain Panda Express (which specializes in what it describes as “American Chinese cuisine”) displayed the chork (the construct being ch(opstick) + (f)ork).  Designed presumably for the benefit of barbaric Westerners unable to master a pair of chopsticks (one of the planet’s simplest machines), the chork had been developed by Brown Innovation Group (BIG) which first revealed its existence in 2010.  BIG has created a website for the chork which explains the three correct ways to use the utensil: (1) employ the fork end as one might a conventional fork, (2) break the chork in two and use like traditional chopsticks or (3) use what BIG call cheater/training mode in which the chopstick component is used with the fork part still attached.  Unfortunately for potential chorkers, Panda Express used the chork only as a promotional tool for the "General Tso's Chicken" launch but they remain available from BIG in packs of 12 & 24, both manufactured in the PRC (People's Republic of China).

Richard Nixon, détente and soupgate

Comrade Nikita Khrushchev (1894–1971; Soviet leader 1953-1964, left) and (then vice president) Richard Nixon (1913-1994; US president 1969-1974, right) during the Кухонные дебаты (Kukhonnye debaty) (kitchen debate), conducted in a “model American kitchen” built for the American National Exhibition, Sokolniki Park, Moscow, 24 July 1959.  The pair (through interpreters) debated the respective virtues of communism versus capitalism, the backdrop being what was said to be a model of a “typical American kitchen”, packed with labor-saving appliances and recreational stuff “able to be afforded by the typical American family”.  Neither party persuaded the other but when finally able to choose between dialectical materialism and consumer materialism, most former Soviet comrades opted for the latter.

Richard Nixon (right) and HR Haldeman (1926–1993; White House chief of staff 1969-1973, left), the White House, 1 January 1972.

Although this photograph is sometimes captioned as being taken in the Oval Office, Nixon used that room only for formal meetings or ceremonial events and usually worked from this smaller, adjoining office.  The stacks of paper are not untypical examples of what workplaces often were like before personal computers transformed things and although the printed page has proved remarkably enduring, the days of the stacks mostly are done.  There was though one exception to that.  When in 2014 the House Select Committee on Benghazi (one of the many scandals involving crooked Hillary Clinton (b 1947; US secretary of state 2009-2013)) was sitting, the State Department requested crooked Hillary provide all the emails stored on her “personal mail server” which, controversially, she’d used for official US government business and "other purposes".  A period of negotiations with her legal team ensued (given crooked Hillary’s past, it was a busy team) and what ended up being provided was a dozen file boxes filled with print-outs of over 30,000 emails (calculated to be around 64 reams of paper or a stack some 10½ feet (3.25 metres) high).  The reason crooked Hillary refused to provide the material in digital form was presumed to be (1) in digital form it would have been easier for analysts to search for data and (2) concerns that even though she’d had her staff delete from the server some 32,000 messages (claimed to be “personal”), a forensic analysis of a granular message file might have revealed all or some of what had been deleted.  Crooked Hillary’s use of her so-called "home-brew" mail server has never satisfactorily been explained and the contents of the deleted emails may never be known.
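As a rough back-of-the-envelope check (assuming the standard 500-sheet ream, each a little over 2 inches (50 mm) thick, and the 30,000-odd emails printing to slightly more than a page each), the quoted figures are at least mutually consistent:

$$
64 \text{ reams} \times 500 \tfrac{\text{sheets}}{\text{ream}} = 32{,}000 \text{ sheets};\qquad 64 \times 2\text{ in} = 128\text{ in} \approx 10.7\text{ ft} \approx 3.25\text{ m}
$$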

Richard Nixon became famous for some things and infamous for others but one footnote in the history of his administration was that he banned soup.  In 1969, Nixon hosted a state dinner for Pierre Trudeau (1919–2000; prime minister of Canada 1968-1979 & 1980-1984) and the next day complained to HR Haldeman that formal dinners “take forever”, suggesting “Why don’t we just leave out the soup course?”, adding “Men don’t really like soup.” (other than wives & waitresses, state dinners were then substantially a male preserve).  Well-acquainted with the social ineptitude of his boss, Haldeman had his suspicions so called the president's valet and asked: “Was there anything wrong with the president’s suit after that dinner last night?”  “Why yes…”, the valet responded, “…he spilled soup down the vest.”  Not until Gerald Ford (1913–2006; US president 1974-1977) assumed the presidency was soup restored to White House menus, to the relief of the chefs who couldn’t believe a dinner was really a dinner without a soup course.

Richard Nixon, détente and chopsticks

A chopstick neophyte in Beijing: Comrade Zhou Enlai (1898–1976; premier of the People's Republic of China (PRC) 1949-1976, left), Richard Nixon (centre) and comrade Zhang Chunqiao (1917–2005, right) at the welcome banquet for President Nixon's visit to the PRC, Tiananmen Square, Beijing, 26 February 1972.  After the death of comrade Chairman Mao (Mao Zedong 1893–1976; chairman of the Chinese Communist Party (CCP) 1949-1976), in a CCP power struggle, Zhang (a prominent figure in the Cultural Revolution (1966-1976)) was arrested, becoming one of the “Gang of Four” (which included the late chairman’s last wife).  After a typically efficient CCP-approved trial, he was sentenced to death but was granted a two-year reprieve and his sentence was later commuted to life in prison before being reduced to 18 years.  Released on humanitarian grounds in 1998 to enable him to receive treatment for cancer, he died in 2005.

The event in Beijing was not a “state visit” because at the time no formal diplomatic relations existed between the two nations (the US still recognized the Taiwan-based RoC (Republic of China (which Beijing still regards as a “renegade province”)) as the legitimate government of China).  For that reason, the trip was described as an “official visit”, a term not part of diplomatic protocol.  There are in history a few of these fine distinctions: technically, diplomatic relations were never re-established between Berlin and Paris after the fall of the Third Republic in 1940 so ambassadors were never accredited, which means Otto Abetz (1903-1958), who fulfilled the role between 1940 and 1944, should be referred to as “de facto” German ambassador (as the letters patent made clear, he acted with full ambassadorial authority).  In July 1949, a French court handed Abetz a twenty-year sentence for crimes against humanity; released in 1954, he died in 1958 in a traffic accident on the Cologne-Ruhr autobahn and there are conspiracy theorists who suspect the death was “an assassination”.  The de facto ambassador was the great uncle of Eric Abetz (b 1958; Liberal Party senator for Tasmania, Australia 1994-2022, member of the Tasmanian House of Assembly since 2024), noted in Australian legal history for being the first solicitor in the city of Hobart to include color on his firm's letterhead.

Longing for a chork.

Still, whatever the detail of the protocol, the PRC's hospitality was lavish and it certainly looked (and tasted) like a state visit.  Both the US and the PRC had their own reasons for wishing to emerge from the “diplomatic deep-freeze” (Moscow being something of a pivot) and it was this event which was instrumental in beginning the process of integrating the PRC into the international system.  The “official visit” also introduced into English the idiomatic phrase “Nixon in China” (there are variations) which describes the ability of a politician with an impeccable reputation for upholding particular political values to perform an action in seeming defiance of them without jeopardizing his support or credibility.  For his whole political career Nixon had been a virulent anti-communist and was thus able to make the tentative approach to the PRC (and later détente with the Soviet Union) in a way which would not have been possible for someone without the same history.  In the same way, the Democratic Party’s Bill Clinton (b 1946; US president 1993-2001) was able during the 1990s to embark on social welfare “reform” in a way no Republican administration could have achieved.

The chopstick as a hair accessory: Lindsay Lohan (b 1986, left) in The Parent Trap (1998) and Hilary Duff (b 1987, right) at Nickelodeon's 15th Annual Kids Choice Awards, Barker Hangar, Santa Monica, California, April, 2002.  These outfits might now be described as "cultural appropriation".

Following the visit, there was also a culinary ripple in the US.  Since the nineteenth century, Chinese restaurants had been a fixture in many US cities but the dishes they served were often very different from those familiar in China and some genuinely were local creations; fortune cookies began in San Francisco courtesy of a paperback edition of “Chinese Proverbs” and all the evidence suggests egg rolls were invented in New York.  The news media’s coverage of the visit attracted great attention and stimulated interest in “authentic” Chinese food after some of the menus were published.  Noting the banquet on the first night featured shark’s fin soup, steamed chicken with coconut and almond junket (a type of pudding), one enterprising chap was within 24 hours offering in his Manhattan Chinese restaurant recreations of each dish, a menu which remained popular for some months after the president’s return.  Mr Nixon’s favorite meal during the visit was later revealed to be Peking duck and around the US, there was a spike in demand for duck.

One of the menus from the official visit (not from a banquet but one of the "working dinners").  Clearly, the president's fondness for duck had been conveyed to the chef.

The graphic is the National Emblem of the People's Republic of China and in a red circle depicts a representation of Tiananmen Gate, the entrance gate to the Forbidden City imperial palace complex, where in 1949 comrade Chairman Mao Zedong declared the foundation of the PRC (People's Republic of China).  The five stars are those from the national flag, the largest representing the CCP, the others the four revolutionary social classes defined in Maoism (the peasantry, proletariat, petty bourgeoisie & national bourgeoisie).  Although Maoism was criticized by comrade Stalin (1878-1953; Soviet leader 1924-1953) and others for being “ideologically primitive”, it has over the decades proved a practical and enduring textbook for insurgencies and revolutionary movements, especially where those involved substantially are rural-dwellers.  Although comrade Stalin may have been sceptical about comrade Mao's contribution to Marxist theory, Maoism has endured and its many (bloody) successes would have surprised Karl Marx (1818-1883) who saw the potential for revolution only in the urban proletariat slaving in factories, grumbling that peasants were impossible to harness as a movement because they "...were like potatoes, all the same and yet all different."