
Saturday, November 29, 2025

Grammatology

Grammatology (pronounced gram-uh-tol-uh-jee)

(1) Historically, the scientific study of systems of writing.

(2) In latter-day use, a critique of orthodox linguistics.

Early 1800s (in its original sense): The construct was gramma(r) + -t- + -ology; the modern (some would say post-modern) re-purposing was first used in 1967.  Dating from the mid fourteenth century, grammar was from the Middle English gramery & gramere, from the Old French gramaire (classical learning), from the unattested Vulgar Latin grammāria, an alteration of the Classical Latin grammatica, from the Ancient Greek γραμματική (grammatikḗ) (skilled in writing), from γράμμα (gramma) (line of writing), from γράφω (gráphō) (write), from the primitive Indo-European gerbh (to carve, to scratch).  It displaced the native Old English stæfcræft; a doublet of glamour, glamoury, gramarye & grimoire.  In English, grammar is used to describe the system of rules and principles for the structure of a language (or of languages in general) but in colloquial use it’s applied also to morphology (the internal structure of words) and syntax (the structure of phrases and sentences of a language).  In English, generative grammar (the body of rules producing all the sentences permissible in a given language, while excluding all those not permissible) has for centuries been shifting and it’s now something policed by the so-called “grammar Nazis”, some of whom insist on enforcing “rules” regarded by most as defunct as early as the nineteenth century.

The suffix -ology was formed from -o- (as an interconsonantal vowel) +‎ -logy.  The origin in English of the -logy suffix lies with loanwords from the Ancient Greek, usually via Latin and French, where the suffix (-λογία) is an integral part of the word loaned (eg astrology from astrologia), in use since the sixteenth century.  French picked up -logie from the Latin -logia, from the Ancient Greek -λογία (-logía).  Within Greek, the suffix is an -ία (-ía) abstract from λόγος (lógos) (account, explanation, narrative), and that a verbal noun from λέγω (légō) (I say, speak, converse, tell a story).  In English the suffix became extraordinarily productive, used notably to form names of sciences or disciplines of study, analogous to the names traditionally borrowed from the Latin (eg astrology from astrologia; geology from geologia) and by the late eighteenth century, the practice (despite the disapproval of the pedants) extended to terms with no connection to Greek or Latin such as those building on French or German bases (eg insectology (1766) after the French insectologie; terminology (1801) after the German Terminologie).  Within a few decades of the intrusion of modern languages, combinations emerged using English terms (eg undergroundology (1820); hatology (1837)).  In this evolution, the development may be thought similar to the latter-day proliferation of “-isms” (fascism; feminism et al).  Grammatology & grammatologist are nouns, grammatological is an adjective and grammatologically is an adverb; the noun plural is grammatologies.

Google ngram (a quantitative and not qualitative measure): Because of the way Google harvests data for their ngrams, they’re not literally a tracking of the use of a word in society but can be usefully indicative of certain trends (although one is never quite sure which trend(s)), especially over decades.  As a record of actual aggregate use, ngrams are not wholly reliable because: (1) the sub-set of texts Google uses is slanted towards the scientific & academic and (2) the technical limitations imposed by the use of OCR (optical character recognition) when handling older texts of sometimes dubious legibility (a process AI should improve).  Where numbers bounce around, this may reflect either: (1) peaks and troughs in use for some reason or (2) some quirk in the data harvested.
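Where numbers do bounce around, one common way to make the underlying trend easier to see is to smooth the series (the Ngram Viewer itself offers a "smoothing" setting which works on this principle).  A minimal sketch of the idea, using a centred moving average over hypothetical per-year frequencies (the figures below are invented for illustration, not real ngram data):

```python
def moving_average(values, window=3):
    """Return a centred moving average; edges use a truncated window."""
    half = window // 2
    smoothed = []
    for i in range(len(values)):
        lo = max(0, i - half)              # clamp window to start of series
        hi = min(len(values), i + half + 1)  # clamp window to end of series
        smoothed.append(sum(values[lo:hi]) / (hi - lo))
    return smoothed

# Hypothetical relative frequencies (per million words) for one term,
# bouncing between peaks and troughs year on year.
freqs = [0.2, 0.9, 0.3, 1.1, 0.4, 1.0, 0.5]
print(moving_average(freqs))
```

Smoothing of course cannot distinguish a genuine trough in usage from a quirk in the harvested data; it merely stops any single noisy year dominating the picture.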

Grammatology in its re-purposed sense was from the French grammatologie, introduced to the world by French philosopher Jacques Derrida (1930-2004) in his book De la grammatologie (Of Grammatology (1967)).  It may be unfair to treat Derrida’s use as a “re-purposing” because although the word grammatology (literally “the study of writing”) had existed since the early nineteenth century, it was a neologism, one of an expanding class of “-ology” words (some of them coined merely for ironic or humorous effect) and there was prior to 1967 scant evidence of use, those studying languages, literature or linguistics able satisfactorily to undertake their work without much needing “grammatology”.  On the basis of the documents thus far digitized, “grammatology” was never an accepted or even commonly used term in academia and although it seems occasionally to have been used variously in fields related to “the study of writing systems” (apparently as a synonym for paleography, epigraphy, writing-system classification or orthographic description) it was only in passing.  Until the modern era, words “going viral” happened relatively infrequently and certainly slowly and, as used prior to 1967, “grammatology” was attached to no theoretical construct or school of thought and described no defined discipline, the word indicative, empirical and neutral.  If “pre-modern” grammatology could be summed up (a probably dubious exercise), it would be thought a technical term for those concerned with scripts, alphabets, symbols and the historical development of writing systems.  Tempting though it may seem, it cannot be thought of as proto-structuralism.

The novelty Derrida introduced was to argue the need for a discipline examining the history, structure and philosophical implications of writing, his particular contention that writing is not secondary to speech, a notion at odds with centuries of Western metaphysics.  At the time, it was seen as a radical departure from orthodoxy, Derrida exploring (in the broadest imaginable way) the possibilities of writing, not simply the familiar physical inscriptions, but anything that functions as “trace”, “différance” or symbolic marking, the core argument being writing is not secondary to speech (although in the narrow technical sense it may be consequent); rather, it reveals the instability and “constructedness” of language and thereby meaning.

De la grammatologie (First edition, 1967) by Jacques Derrida.

Ambitiously, what Derrida embarked upon was to do to the study of writing something like what Karl Marx (1818-1883) claimed to have done to the theories of Hegel (Georg Wilhelm Friedrich Hegel (1770-1831)): “turn things on their head”, a process that can be classified under four themes: (1) Writing as prior to speech (as opposed to the earlier “Writing is derivative of speech”).  What this meant was writing had to be considered as “originary”, implying structures of difference could precede both writing and speech. (2) Writing (the act as opposed to the content) as a philosophical concept rather than a finite collection of technical objects to be interpreted or catalogued on the basis of their form of assembly.  (3) Grammatology becomes a critique (as opposed to the earlier descriptive tool) of science, reimagining it as a critical discipline exposing the logocentrism of Western thought.  Logocentrism describes the tendency to prioritize “logos” (in academic use a word encompassing words, speech or reason) as the ultimate foundation for truth and meaning (with speech often privileged over writing).  Logocentrism was at the core of the Western philosophical tradition that assumed language accurately and directly can express an external reality, the companion notion being rational thought represents the highest form of knowledge.  Derrida labelled this a false hierarchy that devalued writing and other non-verbal forms of communication and feeling. (4) Writing is expanded beyond literal inscriptions.  Whereas the traditional Western view had been that writing was simply the use of an alphabet, cuneiform, hieroglyphs and such, what Derrida suggested was the concept of writing should be extended to any system of differences, traces, or marks; the condition for meaning itself.

So Derrida took grammatology from a dusty corner of the academy where it meant (for the small number of souls involved) something like “a hypothetical technical study of writing systems” and re-invented it as a philosophical discipline analysing the deeper structures that make any representation or meaning possible.  The notion of it as a tool of analysis is important because deconstruction, the word Derrida and other “celebrity philosophers” made famous (or infamous depending on one’s stance on things postmodern), is often misunderstood as something like “destruction” when really it is a form of analysis.  Had Derrida’s subversive idea been presented thirty years earlier (had the author been able to find a publisher), it’s possible it would have been ignored or dismissed by the relative few who then read such material.  However, in the post-war years there was an enormous expansion in both the number of universities and the cohorts of academics and students studying in fields which would come to be called “critical theory” so there was a receptive base for ideas overturning orthodoxy, thus the remarkable path deconstruction and postmodernism for decades tracked.

Deconstruction in art, Girl With Balloon by street artist Banksy, before, during & after a (successful) test deconstruction (left) and in its final form (right), London, October 2018.

There is an ephemeral art movement but usually it involves works which wholly are destroyed or entirely disappear.  Banksy’s Girl With Balloon belonged to a sub-category where (1) the deconstruction process was part of the art and (2) the residual elements were “the artwork”.  Banksy’s trick with this one was as the auctioneer’s hammer fell (at Stg£1m), an electric shredder concealed at the base of the frame was activated, the plan being to reduce the work “to shreds” in a pile below.  However, it’s claimed there was a technical glitch and the shredder stopped mid-shred, meaning half remained untouched and half, neatly sliced, hung from the bottom.  As a headline-grabbing stunt it worked well but the alleged glitch worked better still, art experts mostly in agreement the work as “half shredded” was more valuable than had it been “wholly shredded” and certainly more than had it remained untouched in the frame.  Thus: “meaning is just another construct which emerges only through differences and deferrals”.

From a distance of sixty-odd years, in the milieu of the strands of thought which are in a sense part of a “new orthodoxy”, it can be hard to understand just what an impact Derrida and his fellow travellers (and, just as significantly, his critics) had and what an extraordinary contribution deconstruction made to the development in thought of so many fields.  Derrida in 1967 of course did not anticipate the revolutionary movement he was about to trigger, hinted at by his book starting life as a doctoral thesis entitled: De la grammatologie: Essai sur la permanence de concepts platonicien, aristotélicien et scolastique de signe écrit. (Of Grammatology: Essay on the Permanence of Platonic, Aristotelian and Scholastic Concepts of the Written Sign).  A typically indigestible title of the type beloved by academics, the clipping for wider distribution was on the same basis as Adolf Hitler’s (1889-1945; Führer (leader) and German head of government 1933-1945 & head of state 1934-1945) publisher deciding Mein Kampf (My Struggle) was snappier than Viereinhalb Jahre (des Kampfes) gegen Lüge, Dummheit und Feigheit (Four and a Half Years [of Struggle] Against Lies, Stupidity and Cowardice).  There’s a reason authors usually don’t have the final say on titles and cover art.

Derrida acknowledged linguistics in the twentieth century had become a sophisticated form of study but maintained the discipline was failing to examine its most fundamental assumptions; indeed his point was those core values couldn’t be re-evaluated because they provided the framework by which language was understood.  What Derrida identified as the superstructure which supported all was the commitment to the primacy of speech and presence and because the prevailing position in linguistics was that speech was primary, the assumption worked to shape all that followed.  It was the influence of the Swiss linguist & semiotician Ferdinand de Saussure (1857–1913) which was profound in positioning speech as the natural, original, living form of language with writing as a secondary, derivative (and, in a sense, artificial although this was never wholly convincing) representation of speech.  What made the Saussurean position seem compelling was it sounded logical, given the consensus it was human speech which predated the development of writing, the latter thus the product of the former and so persuasive was the thesis the hierarchy came to provide the framework for other disciplines within linguistics including phonology (the study of the way sounds function in languages) and morphology (the study of the internal structure of morphemes (the smallest linguistic unit within a word able to carry a meaning)).  What this meant was syntax was also defined by speech (writing a mere convenient means of exchange) with phonetics (the study of the physical sounds of human speech) the true source of the material of language.  Thus for generations, in academic discourse, historical linguistics was documented primarily by an analysis of changes in sound, with orthography (the methods by which a language or its sounds are represented by written symbols) a mere mechanical by-product.

Deconstruction in fashion.  Lindsay Lohan in Theia gown, amfAR gala, New York City, February 2013 (left) and after “deconstruction by scissors” (right).

All gowns are “constructed” (some 3D printed or even “sprayed-on”) but sometimes circumstances demand they be “deconstructed”.  On the night, the shimmering nude and silver bugle-beaded fringe gown from Theia’s spring 2011 collection was much admired but there was an “unfortunate incident” (ie the fabric was torn) and, apparently using a pair of scissors, there was some ad-hoc seamstressery to transform the piece into something described as a “mullet minidress”.  That turned out to be controversial because the gown was on loan for the night but such things are just part of the cost of doing business and, with its Lohanic re-imagining, it’s now an artefact.

Derrida didn’t dispute the historic timelines; his point was that in defining linguistics based on this hierarchy, it became impossible to question the orthodoxy from within.  In a classic example of how deconstruction works, he argued the hierarchy was based not on the historical sequence of events (ie writing coming after speech) but was a culturally defined attachment to the idea of presence, voice and authentic meaning; with speech entrenched in its primacy, no discipline within linguistics was able fully to study writing because of this structural prejudice positioning writing as an auxiliary system, a mere notation of sounds encoding the pre-existing spoken language.  That didn’t mean writing couldn’t be studied (as for centuries it had been) but that it could be considered only a tool or artefact used to record speech and never a primary object of meaning.  While there were all sorts of reasons to be interested in writing, for the reductionists who needed to get to the essence of meaning, writing could only ever be thought something mechanistic and thus was philosophically uninteresting.  So, if linguistics was unable to analyse writing as (1) a structure independent of speech, (2) a fundamental element of thought processes, (3) a source of new or changed meanings or (4) a construct where cultural and philosophical assumptions are revealed, that would imply only speech could create meaning with writing a mere form of its expression.  Daringly thus, what Derrida demanded was for writing to be seen as conceptually prior to speech, even if as a physical phenomenon it came later.  In 1967, linguistics couldn’t do that while maintaining the very foundations on which it was built.

Never has there been published a "Grammatology for Dummies" but there is The Complete Idiot's Guide to Literary Theory and Criticism (2013) by Dr Steven J. Venturino.

At this point things became more technical but Derrida did provide a simplified model, explaining linguistics existed as the study of signs and not of traces, his work depending ultimately on certain distinctions: (1) Signs assume stable signifieds and (2) traces imply meaning is always deferred but never present.  For orthodox linguistics to work, the assumption had to be that signs enjoy a stability of meaning within a system; this Derrida dismissed as illusory, arguing (1) meaning is just another construct which emerges only through differences and deferrals, (2) no signified is ever (or can ever fully be) “present” and (3) speech is no closer to meaning than writing.  By its own definitions in 1967, linguistics could not accommodate that because (1) its methods depended on systematic relations sufficiently stable to permit analysis, (2) it needed constant objects (definable units such as phonemes, morphemes, rules of syntax and syntactic structures) and (3) it relied on signs which could be described with the required consistency (ie “scientifically”).  Any approach grounded in trace and difference lay beyond the boundaries of orthodox linguistics.

So the conflict would seem irreconcilable but that’s true only if viewed through the lens of a particular method; really, linguistics was empirical and grammatology was philosophical and in that were alternative rather than competing or even parallel paths.  If linguistics was a system of codification, then grammatology was a critique of the foundations of linguistics and Derrida made clear he was not attempting to reform linguistics simply because that couldn’t be done; any attempt to interpolate his ideas into the discipline would have meant it ceased to be linguistics.  He wanted a new discipline, one which rather than empirically describing and categorising language and its elements, stood back and asked what in the first place made such systems possible.  That meant it was a transcendental rather than empirical process, one studying the conditions of representation and the metaphysics implicit in the idea of signification.  Writing thus was not merely marks on a surface but a marker of a difference in being.

The twist in the tale is that although De la grammatologie was highly influential (especially after an edition appeared in English in 1976), grammatology never became a defined, institutionalised academic field in the way Derrida envisioned, at least supplementing departments of linguistics, anthropology and philosophy.  That was due less to the well-documented phenomenon of institutional inertia than it proving impossible for any consensus to be reached about what exactly “grammatological analysis” was or what constituted “grammatological research”.  Pleasingly, it was the structuralists who could account for that by explaining grammatology was a critique of the metaphysics underlying other disciplines rather than a method for generating new empirical knowledge.  Fields, they noted, were likely organically to grow as the tools produced were picked up by others to be applied to tasks; grammatology was a toolbox for dismantling tools.

Jacques Derrida with pipe, deconstructing some tobacco.

Even if Derrida’s concepts proved sometimes too vague even for academics, the influence was profound and, whether as a reaction or something deterministic (advances in computer modelling, neurology and such), the discipline of linguistics became more rather than less scientific, the refinements in the field of generative grammar in particular seen as something of a “doubling down” of resistance to Derrida’s critique, something reflected too in anthropology which came even more to value fieldwork and political economy, philosophical critiques of writing thought less helpful.  So the specialists not only clung to their speciality but made it more specialized still.  Grammatology did however help create genuinely new movements in literary theory, the most celebrated (and subsequently derided) being deconstruction, where Derrida’s ideas such as interpretation being an infinite play of differences and the meaning of texts being inherently unstable created one of the more radical schools of thought in the post-war West, introducing to study concepts such as paratext (how academics “read between and beyond the lines”), the trace (the mark of something absent, a concept that disrupts the idea of pure presence and self-contained meaning) and marginalia (used here as an abstract extension of what an author may have “written in the margins” to encompass that which may seem secondary to the main point but is actually crucial to understanding the entire structure of thought, blurring the (literal) line between what lies inside and outside a text).

Derrida for Beginners (2007) by Jim Powell (illustrated by Van Howell).  One has to start somewhere.

The movement became embedded in many English and Comparative Literature departments as well as in post-structuralism and Continental philosophy.  Modern beasts like media studies & cultural theory are (in their understood form) unthinkable without deconstruction and if grammatology didn’t become “a thing”, its core elements (difference, trace etc) for decades flourished (sometimes to the point of (published) absurdity) and although not all agree, some do argue it was Derrida’s subversion in 1967 which saw the field of semiotics emerge to “plug the gaps” left by the rigidity of traditional linguistics.  Of course, even if grammatology proved something of a cul-de-sac, Derrida’s most famous fragment: “Il n'y a pas de hors-texte” (literally “there is no outside-text”) endured to underpin deconstruction and postmodernism generally.  Intriguingly for a concept from linguistics, the phrase took on a new life in the English-speaking world where it came to be understood as “everything is text”, an interpretation which created a minor publishing industry.  In English, it’s a marvellously literalist use and while it does to an extent overlap with the author’s original intention, Derrida meant there is (1) no access to pure, unmediated presence and (2) no meaning outside interpretation and no experience outside context.  In using texte he was referring to the interplay of differences, traces, relations, and contexts that make meaning possible (ie not literally the words as they appear on a page).  What that meant was all acts were “textual” in that they must be interpreted and are intelligible only within systems of meaning; the phrase a philosophical statement about signification and mediation, not characters printed on a page.

Fiveable's diagram of what we need to know to understand literature.  Hope this helps.

However, demonstrating (in another way) the power of language, the “everything is text” movement (“cult” may once have been a better word) in English came to be understood as meaning no reality could exist beyond language; everything (literally!) is text because it is words and discourse which both construct and describe reality.  That notion might have remained in an obscure ivory tower were it not for the delicious implication that values such as right & wrong and true & false are also pieces of text with meanings able to be constructed and deconstructed.  That meant there was no stable “truth” and nothing objectively was “wrong”; everything just a construct determined by time, place and circumstances.  That Derrida never endorsed this shocking relativism was noted by some but academics and students found so intoxicating the notion of right & wrong being variables that “everything is text” took on a life of its own as a kind of selective nihilism which is, of course, quite postmodern.  Again, language was responsible because the French texte was from the Latin textus, from texō (weave) and while in French it can mean “text” (in the English sense), among philosophers it was used metaphorically to suggest “weave together”; “an interconnected structure” in the sense of the Latin textus (woven fabric); it was this meaning Derrida used.  Had the English-speaking world remained true to the original spirit of Il n'y a pas de hors-texte it would have entered the textbooks as something like “there is nothing outside the interplay of signs and contexts; there is no meaning outside systems of interpretation” and perhaps have been forgotten but “everything is text” defined and seduced a movement.  Thus, it can be argued things either were “lost in translation” or “transformed by translation” but for the neo-Derridaists there’s the satisfaction of knowing the meaning shift was an example of “grammatology in action”.

Thursday, November 20, 2025

Ultracrepidarian

Ultracrepidarian (pronounced uhl-truh-krep-i-dair-ee-uhn)

Of or pertaining to a person who criticizes, judges, or gives advice outside their area of expertise.

1819: An English adaptation of the historic words sūtor, ne ultra crepidam, uttered by the Greek artist Apelles and reported by Pliny the Elder.  Translating literally as “let the shoemaker venture no further” and sometimes cited as ne supra crepidam sūtor judicare, the translation something like “a cobbler should stick to shoes”.  From the Latin, ultra is beyond, sūtor is cobbler and crepidam is accusative singular of crepida (from the Ancient Greek κρηπίς (krēpís)) and means sandal or sole of a shoe.  Ultracrepidarian is a noun & adjective and ultracrepidarianism is a noun; the noun plural is ultracrepidarians.  For humorous purposes, forms such as ultracrepidarist, ultracrepidarianish, ultracrepidarianize & ultracrepidarianesque have been coined; all are non-standard.

Ultracrepidarianism describes the tendency among some to offer opinions and advice on matters beyond their competence.  The word entered English in 1819 when used by English literary critic and self-described “good hater”, William Hazlitt (1778–1830), in an open letter to William Gifford (1756–1826), editor of the Quarterly Review, a letter described by one critic as “one of the finest works of invective in the language” although another suggested it was "one of his more moderate castigations", a hint that though now neglected, for students of especially waspish invective, he can be entertaining; the odd quote from him would certainly lend a varnish of erudition to trolling.  Ultracrepidarian comes from a classical allusion, Pliny the Elder (circa 24-79) recording the habit of the famous Greek painter Apelles (a fourth century BC contemporary of Alexander the Great (Alexander III of Macedon, 356-323 BC)), to display his work in public view, then conceal himself close by to listen to the comments of those passing.  One day, a cobbler paused and picked fault with Apelles’ rendering of sandals and the artist immediately took his brushes and palette and touched up the errant straps.  Encouraged, the amateur critic then let his eye wander above the ankle and suggested how the leg might be improved but this Apelles rejected, telling him to speak only of shoes and otherwise maintain a deferential silence.  Pliny hinted the artist's words of dismissal may not have been polite.

So critics should comment only on that about which they know.  The phrase in English is usually “cobbler, stick to your last” (a last being a shoemaker’s pattern, ultimately from a Germanic root meaning “to follow a track”, hence “footstep”) and exists in many European languages: zapatero a tus zapatos is the Spanish, schoenmaker, blijf bij je leest the Dutch, skomager, bliv ved din læst the Danish and schuster, bleib bei deinen leisten, the German.  Pliny’s actual words were ne supra crepidam judicaret (crepidam a sandal or the sole of a shoe), but the idea is conveyed in several ways in Latin tags, such as Ne sutor ultra crepidam (sutor means “cobbler”, a word which survives in Scotland in the spelling souter).  The best-known version is the abbreviated tag ultra crepidam (beyond the sole), and it’s that which Hazlitt used to construct ultracrepidarian.  Crepidam is from the Ancient Greek κρηπίς (krēpís) and has no link with words like decrepit or crepitation (which are from the Classical Latin crepare (to creak, rattle, or make a noise)) or crepuscular (from the Latin word for twilight); crepidarian is an adjective, rare perhaps to the point of extinction, meaning “pertaining to a shoemaker”.

The related terms are "Nobel disease" & "Nobel syndrome" which are used to describe some of the opinions offered by Nobel laureates on subjects beyond their specialization.  In some cases this is "demand" rather than "supply" driven because, once a prize winner is added to a media outlet's "list of those who comment on X", if they turn out to give answers which generate audience numbers, controversy or clicks, they become "talent" and may be asked questions about matters of which they know little.  This happens because some laureates in the three "hard" prizes (physics, chemistry, physiology or medicine) operate in esoteric corners of their discipline; asking a particle physicist something about plasma physics on the basis of their having won the physics prize may not elicit useful information.  Of course those who have won the economics gong or one of what are now the DEI (diversity, equity and inclusion) prizes (peace & literature) may be assumed to have helpful opinions on everything.

Jackson Pollock (1912-1956): Blue Poles

Number 11 (Blue poles, 1952), Oil, enamel and aluminum paint with glass on canvas.

In 1973, when a million dollars was still a lot of money, the NGA (National Gallery of Australia), a little controversially, paid Aus$1.3 million for Jackson Pollock’s (1912-1956) Number 11, 1952, popularly known as Blue Poles since it was first exhibited in 1954, the new name reputedly chosen by the artist.  It was some years ago said to be valued at up to US$100 million but, given the increase in the money supply (among the rich who trade this stuff) over the last two decades odd, that estimate may now be conservative although the suggestion in 2016 the value may have inflated to as much as US$350 million was thought to be "on the high side".  Blue Poles emerged during Pollock’s "drip period" (1947-1950), a method which involved techniques such as throwing paint at a canvas spread across the floor.  The art industry liked these (often preferring the more evocative term "action painting") and they remain his most popular works, although at this point, he abandoned the dripping and moved to his “black pourings phase”, a darker, simpler style which didn’t attract the same commercial interest.  He later returned to more colorful ways but his madness and alcoholism worsened; he died in a drink-driving accident.

Alchemy (1947), Oil, aluminum, alkyd enamel paint with sand, pebbles, fibres, and broken wooden sticks on canvas.

Although the general public remained uninterested (except in the price tags) or sceptical, there were critics, always drawn to a “troubled genius”, who praised Pollock’s work and the industry approves of any artist who (1) had the decency to die young and (2) produced lots of stuff which can sell for millions.  US historian of art, curator & author Helen A Harrison (b 1943; director (1990-2024) of the Pollock-Krasner House and Study Center, the former home and studio of the Abstract Expressionist artists Jackson Pollock and Lee Krasner in East Hampton, New York) is an admirer, noting the “pioneering drip technique…” which “…introduced the notion of ‘action painting’, where the canvas became the space with which the artist actively would engage”.  As a thumbnail sketch she offered:

Number 14: Gray (1948), Enamel over gesso on paper.

“Reminiscent of the Surrealist notions of the subconscious and automatic painting, Pollock's abstract works cemented his reputation as the most critically championed proponent of Abstract Expressionism. His visceral engagement with emotions, thoughts and other intangibles gives his abstract imagery extraordinary immediacy, while his skillful use of fluid pigment, applied with dance-like movements and sweeping gestures that seldom actually touched the surface, broke decisively with tradition. At first sight, Pollock's vigorous method appears to create chaotic labyrinths, but upon close inspection his strong rhythmic structures become evident, revealing a fascinating complexity and deeper significance.  Far from being calculated to shock, Pollock's liquid medium was crucial to his pictorial aims.  It proved the ideal vehicle for the mercurial content that he sought to communicate 'energy and motion made visible - memories arrested in space'.”

Number 13A: Arabesque (1948), Oil and enamel on canvas.

Critics either less visionary or more fastidious seemed often as appalled by Pollock’s violence of technique as they were by the finished work (or “products” as some labelled the drip paintings), questioning whether any artistic skill or vision even existed, one finding them “…mere unorganized explosions of random energy, and therefore meaningless.”  The detractors used the language of academic criticism but meant the same thing as the frequent phrase of an unimpressed public: “That’s not art, anyone could do that.”

Number 1, 1949 (1949), Enamel and metallic paint on canvas. 

There have been famous responses to “That’s not art, anyone could do that” but Ms Harrison's was practical: offering people the opportunity to try.  To the view that “…people thought it was arbitrary, that anyone can fling paint around”, Ms Harrison conceded it was true anybody could “fling paint around” but that was her point: anybody could, but having flung, they wouldn’t “…necessarily come up with anything”, by which she meant they wouldn't necessarily come up with anything of which the critical establishment (a kind of freemasonry of the art business) would approve (ie could put a price tag on).

Helen A Harrison, The Jackson Pollock Box (Cider Mill Press, 96pp, ISBN-10:1604331860, ISBN-13:978-1604331868).

In 2010, Ms Harrison released The Jackson Pollock Box, a kit which, in addition to an introductory text, included paint brushes, drip bottles and canvases so people could do their own flinging and compare the result against a Pollock.  After that, they may agree with collector Peggy Guggenheim (1898-1979) that Pollock was “...the greatest painter since Picasso” or remain unrepentant ultracrepidarians.  Of course, many who thought their own eye for art quite well-trained didn't agree with Ms Guggenheim.  In 1945, just after the war, Duff Cooper (1890–1954), then serving as Britain's ambassador to France, came across Pablo Picasso (1881–1973) leaving an exhibition of paintings by English children aged 5-10 and in his diary noted the great cubist saying he "had been much impressed".  "No wonder" added the ambassador, "the pictures are just as good as his".

Dresses & drips: Three photographs by Cecil Beaton (1904-1980), shot for a three-page feature in Vogue (March 1951) titled American Fashion: The New Soft Look which juxtaposed Pollock’s paintings hung in New York’s Betty Parsons Gallery with the season’s haute couture by Irene (1872-1951) & Henri Bendel (1868-1936).

Beaton chose the combinations of fashion and painting; pairing Lavender Mist (1950, left) with a short black ball gown of silk paper taffeta with a large pink bow at one shoulder and an asymmetrical hooped skirt best illustrates the value of his trained eye.  Critics and social commentators have always liked these three pages, relishing the opportunity to comment on the interplay of so many of the clashing forces of modernity: the avant-garde and fashion, production and consumption, abstraction and representation, painting and photography, autonomy and decoration, masculinity and femininity, art and commerce.  Historians of art note it too because it was the abstract expressionism of the 1940s which was both uniquely an American movement and the one which in the post-war years saw New York supplant Paris as the centre of Western art.  There have been interesting discussions about when last it could be said Western art had a "centre".

Blue Poles, upside down.

Although the suggestion might offend the trained and discerning eyes of art critics, it’s doubtful that for ultracrepidarians the experience of viewing Blue Poles would be much different were it to be hung upside down.  Fortunately, the world does have a goodly stock of art critics who can explain that while Pollock did more than once say his works should be interpreted “subjectively”, their intended orientation is a part of the whole and an inversion would change the visual dynamics and gravitational illusions upon which the abstraction's effects depend.  It would still be a painting but, in a sense, not the one the artist painted.  Because the drip technique involved “flinging and pouring paint” onto a canvas spread across a studio’s floor, there was not exactly a randomness in where the paint landed but physics did mean gravity exerted some pull (in flight and on the ground), lending layers and rivulets what must be a specific downward orientation.  Thus, were the work to be hung inverted, what was in the creative process a downward flow would be seen as “flowing uphill” as it were.  The compositional elements which lent the work its name were of course the quasi-vertical “poles” placed at slight angles and it's these which are the superstructure anchoring the rest of the drips and, being intrinsically “directional”, they too have a “right way up”.  There is in the assessment of art the “eye of the beholder” but although it may be something they leave unstated, most critics will be of the “some eyes are more equal than others” school.

Mondrian’s 1941 New York City 1 as it (presumably correctly) sat in the artist's studio in 1944 (left) and as it has since 1945 been exhibited (upside-down) in New York and Düsseldorf (right).  Spot the difference.

So although ultracrepidarians may not “get it” (even after digesting the critics’ explanations) and wouldn’t be able to tell whether or not it was hung correctly, that’s because they’re philistines.  In the world of abstract art however, even the critics can be fooled: in 2022, it was revealed a work in Piet Mondrian’s (1872-1944) 1941 New York City 1 series had for 77 years been hanging upside down.  First exhibited in 1945 in New York’s MoMA (Museum of Modern Art), the piece was created with multi-colored adhesive paper tape and, in an incorrect orientation, it has since 1980 hung in Düsseldorf as part of the Kunstsammlung Nordrhein-Westfalen’s collection.  The decades-long, trans-Atlantic mistake came to light during a press conference held to announce the Kunstsammlung’s new Mondrian exhibition and the conclusion was the error may have been caused by something as simple as the packing-crate being overturned or misleading instructions being given to the staff.  1941 New York City 1 will remain upside down because of the condition of the adhesive strips.  “The adhesive tapes are already extremely loose and hanging by a thread” a curator was quoted as saying, adding that if it were now to be turned over, “…gravity would pull it into another direction.  And it’s now part of the work’s story.”  Mondrian was one of the more significant theorists of abstract art and its withdrawal from nature and natural subjects.  “Denaturalization” he proclaimed to be a milestone in human progress, adding: “The power of neo-plastic painting lies in having shown the necessity of this denaturalization in painterly terms... to denaturalize is to abstract... to abstract is to deepen.”  Now even ultracrepidarians can understand.

Eye of the beholder: Portrait of Lindsay Lohan in the style of Claude Monet (1840–1926) at craiyon.com and available at US$26 on an organic cotton T-shirt made in a factory powered by renewable energy.

Whether the arguments about what deserves to be called “art” began among prehistoric “artists” and their critics in caves long ago isn’t known but it’s certainly a dispute with a long history.  In the sense it’s a subjective judgment, the matter was doubtless often resolved by a potential buyer declining to purchase but during the twentieth century it became a contested topic and there were celebrated exhibits and squabbles which for decades played out before, in the post-modern age, the final answer appeared to be that something was art if variously (1) the creator said it was or (2) an art critic said it was or (3) it was in an art gallery or (4) the price tag was sufficiently impressive.

So what constitutes “art” is a construct of time, place & context which evolves, shaped by historical, cultural, social, economic, political & personal influences, factors which in recent years have had to be cognizant of the rise of cultural equivalency, the recognition that Western concepts such as the distinction between “high” (or “fine”) art and “folk” (or “popular”) art can’t be applied to work from other traditions where cultural objects are not classified by a graduated hierarchy.  In other words, everybody’s definition is equally valid.  That doesn’t mean there are no longer gatekeepers because the curators in institutions such as museums, galleries & academies all discriminate and thus play a significant role in deciding what gets exhibited, studied & promoted, even though few would now dare to suggest what is art and what is not: that would be cultural imperialism.

Eye of the prompt 1.0: An AI (artificial intelligence) generated portrait of Lindsay Lohan by ChatGPT imagined in "drip painting style", this one using an interpretation which overlaid "curated drips" over "flung paint".  This could be rendered using Ms Harrison's Jackson Pollock Box but would demand some talent.

In the twentieth century, it seemed to depend on artistic intent, something which transcended a traditional measure such as aesthetic value but as the graphic art in advertising and that with a political purpose such as agitprop became bigger, brighter and more intrusive, such forms also came to be regarded as art or at least worthy of being studied or exhibited on the same basis, in the same spaces, as oil on canvas portraits & landscapes.  Once though, an unfamiliar object in such places could shock, as French painter & sculptor Marcel Duchamp (1887-1968) managed in 1917 when he submitted a porcelain urinal as his piece for an exhibition in New York, his rationale being “…everyday objects raised to the dignity of a work of art by the artist's act of choice.”  Even then it wasn’t a wholly original approach but the art establishment has never quite recovered and from that urinal to Dadaism, to soup cans to unmade beds, it became accepted that “anything goes” and people should be left to make of it what they will.  Probably the last remaining reliable guide to what really is "art" remains the price tag.

Eye of the prompt 1.1: An AI (artificial intelligence) generated portrait of Lindsay Lohan by ChatGPT imagined in "drip painting style", this one closer to Pollock’s “action painting” technique.

His drip period being wholly non-representational, Pollock didn’t produce recognizable portraiture, so applying the technique for this purpose demands guesswork.  As AI illustrates, it can be done but, in blending two incompatible modes, whether it looks much like what Pollock would have produced had he accepted a “paint Lindsay Lohan” commission is wholly speculative.  What is more likely is that, even as some sort of hybrid, a portrait by Pollock would have been an abstraction altogether more chaotic and owing little to the structure on which such works usually depend, in that there probably would have been no central focal point, fewer hints of symmetry and a use of shading producing a face not lineal in its composition.  That’s what his sense of “continuous motion” dictated: no single form becoming privileged over the rest.  So, this too is not for the literalists schooled in the tradition of photo-realism but as a work it’s also an example of how most of those armed with Ms Harrison's Jackson Pollock Box could with "drip & fling" produce this but not necessarily would produce this, chaos on canvas needing talent too.

1948 Cisitalia 202 GT (left; 1947-1952) and 1962 Jaguar E-Type (1961-1974; right), Museum of Modern Art (MoMA), New York City.

Urinals tend not to be admired for their aesthetic qualities but there are those who find beauty in stuff as diverse as math equations and battleships.  Certain cars have long been objects which can exert an emotional pull on those with a feeling for such things and if the lines are sufficiently pleasing, many flaws in execution or engineering can be forgiven.  New York’s MoMA in 1972 acknowledged such creations can be treated as works of art when they added a 1948 Cisitalia 202 GT finished in “Cisitalia Red” (MoMA object number 409.1972) to their collection, the press release noting it was “…the first time that an art museum in the U.S. put a car into its collection.”  Others appeared from time-to-time and while the 1953 Willys-Overland Jeep M-38A1 Utility Truck (MoMA object number 261.2002) perhaps is not conventionally beautiful, its brutish functionalism has a certain simplicity of form and in the exhibition notes MoMA clarified somewhat by describing it as a “rolling sculpture”, presumably in the spirit of a urinal being a “static sculpture”, both to be admired as pieces of design perfectly suited to their intended purpose, something of an art in itself.  Of the 1962 Jaguar E-Type (known informally in the US as the XKE or XK-E) open two seater (OTS, better known as a roadster and acquired as MoMA object number 113.996), there was no need to explain because it’s one of the most seductive shapes ever rendered in metal.  Enzo Ferrari (1898-1988) attended the 1961 Geneva International Motor Show (now defunct although, on much the same basis as manufacturers east of Suez buying brand-names such as MG, Jaguar and such, the name has been purchased for use by an event staged in Qatar) when the E-Type made its stunning debut and part of folklore is he called it “the most beautiful car in the world”.
Whether those words ever passed his lips isn’t certain because the sources vary slightly in detail and il Commendatore apparently never confirmed or denied the sentiment but it’s easy to believe and to this day many agree just looking at the thing can be a visceral experience.  The MoMA car is finished in "Opalescent Dark Blue" with a grey interior and blue soft-top (there are those who would prefer it in BRG (British Racing Green) over tan leather) and although as a piece of design it's not flawless, anyone who can't see the beauty in a Series 1 E-Type OTS is truly an ultracrepidarian.   

Friday, October 31, 2025

Bob

Bob (pronounced bobb)

(1) A short, jerky motion.

(2) Quickly to move up and down.

(3) In Sterling and related currencies, a slang term for one shilling (10c); it survived decimalisation in phrases like "two bob watch", still used by older generations.

(4) A type of short to medium length hairstyle.

(5) A docked horse’s tail.

(6) A dangling or terminal object, as the weight on a pendulum or a plumb line.

(7) A short, simple line in a verse or song, especially a short refrain or coda.

(8) In angling, a float for a fishing line.

(9) Slang term for a bobsled.

(10) A bunch, or wad, especially a small bouquet of flowers (Scottish).

(11) A polishing wheel of leather, felt, or the like.

(12) An affectionate diminutive of the name Robert.

(13) To curtsy.

(14) Any of various hesperiid butterflies.

(15) In computer graphics (using "Bob" as a contraction of Blitter object), a graphical element (GEL) used by the Amiga computer (the first consumer-level computer which handled multi-tasking convincingly).  Technically, Bobs were hardware-generated objects which could be moved on the screen by the blitter coprocessor.  Bobs were an object of some veneration among the demosceners (the computer art subculture that produces and watches demos (audio-visual computer programs)), Bobs rated according to the volume and dynamics of their movement.

(16) In Scotland, a bunch, cluster, or wad, especially a small bouquet of flowers.

(17) A walking beam (obsolete).

1350–1400: From the Middle English bobben (to strike in cruel jest, beat; fool, make a fool of, cheat, deceive), the meaning "move up and down with a short, jerking motion" perhaps imitative of the sound, the sense of mocking or deceiving perhaps connected to the Old French bober (mock, deride), which, again, may have an echoic origin.  The sense "snatch with the mouth something hanging or floating", as in bobbing for apples (or cherries), is recorded by 1799 and the phrase “bob and weave” in boxing commentary is attested from 1928.  Bob seems first to have been used to describe the short hair-style in the 1680s, a borrowing probably of the use since the 1570s to refer to "a horse's tail cut short", that derived from the earlier bobbe (cluster (as of leaves)) dating from the mid fourteenth century, perhaps of Celtic origin and perhaps connected in some way with the baban (tassel, cluster) and the Gaelic babag.  Bob endures still in Scots English as a dialectal term for a small bunch of flowers.  Bob is a noun & verb, bobber & bobby are nouns, bobbing is a noun & verb, bobbed is a verb & adjective, bobbish is an adjective and bobbingly & bobbishly are adverbs; the noun plural is bobs.  When used as a proper noun, there's an initial capital.

Australian politician Bob Katter (b 1945) with cane toad.

The cane toad being an introduced pest, Mr Katter's idea is that children should be given guns (air rifles) to hunt them, each carcass attracting a bounty of 40 cents ("four bob" in the old slang).  This photograph is thus potentially "a four bob".  Affectionately, Mr Katter is known as “his Bobness” and, depending on who is asked, is either (1) an intellectual or (2) barking mad.  Between 1974-1992, Mr Katter served in the Queensland state parliament but since 1993 has been the member for Kennedy (at 567,377 km² (219,066 sq miles) about the size of metropolitan France) in the Commonwealth House of Representatives.  Until 2001 he was a member of the National Party (the old Country Party), after which he sat as an independent.  The suggestion which circulated implying he was asked by the Nationals to vacate his seat after an IQ test revealed he was "too intelligent for the National Party" was fake news and wholly malicious.

Two two bob coins: Obverse (heads, left) and reverse (tails, right) of two 1945 Australian florins, minted in the same year as Mr Katter.

The coin at the top is one which spent some time in circulation while the more lustrous example below is a UNC (uncirculated coin) which would likely have spent its entire existence in collections.  Numismatists (coin collectors) will pay a premium for a UNC, a 1945 UNC Australian florin typically trading at four times the price of a circulated coin in good condition.  Now nominally equivalent to 20 cents (although a florin’s purchasing power was greater), it was worth two shillings (thus “two bob” in slang).  The group of "bob words" in English is beyond obscure and mostly mysterious.  Most are surely colloquial in origin and probably at least vaguely imitative, but have long become entangled and merged in form and sense (bobby pin, bobby sox, bobsled, bobcat etc).  As a noun, it has been used over the centuries in various senses connected by the notion of "round, hanging mass" and of weights at the end of a fishing line (1610s), pendulum (1752) or plumb-line (1832).  As a description of the hair style, although dating from the 1680s, it entered popular use only in the 1920s when use spiked.  As a slang word for “shilling” (the modern 10c coin), it’s recorded from 1789 but no connection has ever been found and the origin of this is unknown.  In certain countries, among older generations, the term in this sense endures in phrases like “two bob watch” to suggest something of low quality and dubious reliability.

UK Prime Minister Lord Salisbury (Robert Arthur Talbot Gascoyne-Cecil, 1830–1903; UK Prime Minister for thirteen years variously 1885-1902).

The third marquess was, in the words of Winston Churchill (1874-1965; UK prime-minister 1940-1945 & 1951-1955): "prime-minister since God knows when" and the affectionate diminutive of his grandson (Robert Arthur James Gascoyne-Cecil 1893-1972; Fifth Marquess of Salisbury 1947-1972) was "Bobbety".  The phrase "Bob's your uncle" is said often to have its origin in the nepotism allegedly extended by Lord Salisbury to his favorite nephew Arthur Balfour (1848–1930; UK Prime Minister 1902-1905), unexpectedly promoted to a number of big jobs during the 1880s.  The story has never convinced etymologists but it certainly impressed the Greeks who made up a big part of Australia's post-war immigration programme, "Spiro is your uncle" in those years often heard in Sydney and Melbourne to denote nepotism among their communities there.

The other potential source is the Scottish music hall, the first known instance being in a Dundee newspaper in 1924, reviewing a musical revue called Bob's Your Uncle.  The phrase however wasn't noted as part of the vernacular until 1937, six years after the release of the song written by JP Long, "Follow your uncle Bob", which alluded to the nepotistic theme in the lyrics:

Bob's your uncle
Follow your Uncle Bob
He knows what to do
He'll look after you

Partridge's Dictionary of Slang and Unconventional English (1937) notes the phrase but dates it to the 1890s though without attribution and it attained no currency in print until the post-war years.  Although it's impossible to be definitive, the musical connection does seem more convincing, the connection with Lord Salisbury probably retrospective.  It could however have even earlier origins, an old use noted in the Canting Dictionary (1725) in an entry reporting "Bob ... signifies Safety, ... as, It's all Bob, ie All is safe, the Bet is secured."

Of hair

A bob cut or bob is a short to shoulder-length haircut for women.  Historically, in the West, it’s regarded as a twentieth-century style although evidence of it exists in the art of antiquity and even some prehistoric cave-paintings hint it may go way back, hardly surprising given the functionality.  In 1922, The Times (of London), never much in favor of anything new, ran a piece by its fashion editor predicting the demise of the fad, suggesting it was already passé (fashion editors adore the word passé) although the photographic record for the rest of the decade does suggest it took the bright young things of the age a while to take the paper's hint.  Certainly, bobs were less popular by the difficult 1930s but in the 1960s, a variety of social and economic forces saw a resurgence which has never faded and the twenty-first century association with the Karen hasn't lessened demand (although the A-line variant, now known in the industry as the "speak to the manager", seems now to be avoided by all except those for whom there are few viable alternatives).  The connection with the Karen is the second time the bob has assumed some socio-political meaning; when flaunted by the proto-feminists of the 1920s, it was regarded as a sign of radicalism.  The popularity in the 1920s affected the millinery trades too as it was the small cloche which fitted tightly on the bobbed head which became the hat of choice.  Manufacturers of milliner's materials, hair-nets and hair-pins all suffered depressed demand, the fate too of the corset makers, victims of an earlier social change, a phenomenon which would in the post-war years devastate the industries supporting the production of hats for men.  In the 1970s, some optimists (some of whom may have been men), noting one well-publicized (though not widely practiced) aspect of second-wave feminism, predicted the demise of the bra but that garment endured and flourishes to this day.

Actor Lily Collins (b 1989) in a semi-sheer white Calvin Klein ensemble, the cropped spaghetti-strap top and knee-length pencil skirt both embellished with scale sequins, New York Fashion Week, New York City, September 2025.  Note the pleasing definition of the sinews (arrowed, centre).  The hair-style is a chin-length bob.

Variations on a theme of bob, Marama Corlett (b 1984, left) and Lindsay Lohan (b 1986, right), Sick Note, June 2017.

Hairdressers have a number of terms for the variations.  The motifs can in some cases be mixed and even within styles, lengths can vary, a classic short bob stopping somewhere between the tips of the ears and well above the shoulders, a long bob extending from there to just above the shoulders; although the term is often used, the concept of the medium bob really makes no sense and there are just fractional variations of short and long, everything happening at the margins.  So, a bob starts with the fringe and ends cut in a straight line; length can vary but the industry considers shoulder-length a separate style and the point at which bobs stop and something else begins.  Descriptions like curly and ringlet bobs refer more to the hair than the style but do hint at one caveat: not all styles suit all hair types, a caution which extends also to face shapes.

Greta Thunberg: BB (before-bob) and AB (after-bob).

The style received an unexpected imprimatur when Greta Thunberg (b 2003) opted for a bob (one straddling chin & shoulder-length).  Having gained fame as a weather forecaster, the switch to shorter hair appears to have coincided with her branching out from environmental activism to political direct action in the Middle East.  While there's no doubt she means well, it’s something that will end badly because while the matter of greenhouse gasses in the atmosphere can (over centuries) be fixed, some problems are insoluble and the road to the Middle East is paved six-feet deep with good intentions.  Ms Thunberg seems not to have discussed why she got a bob (and how she made her daily choice of "one braid or two" also remained mysterious) but her braids were very long and she may have thought them excessive and contributing to climate change.  While the effect individually would be slight, over the entire population there would be environmental benefits if all those with long hair got a bob because: (1) use of shampoo & conditioner would be lowered (reduced production of chemicals & plastics), (2) water use would be reduced (washing the hair and rinsing out all that product uses much), (3) electricity use would fall (hair dryers, styling wands & straighteners would be employed for a shorter duration) and (4) carbon emissions would drop because fewer containers of shampoo & conditioner would be shipped or otherwise transported.

Sydney Sweeney (b 1997) with new bob, Variety's Power of Women 2025 Event, Beverly Hills Hotel in Los Angeles, California, October 2025.

Actor Sydney Sweeney (b 1997) seemed not to have revealed whether it was Greta Thunberg who inspired her to get a bob but the symmetrical cut made quite a splash when she appeared on the red carpet at Variety's Power of Women 2025 Event.  The reaction universally was favourable but also noted by critics was her sparkling silver full-length gown from the spring 2026 collection of Christian Cowan (b 1995) & Elias Matso (b 2002); it’s fair to say the dress overshadowed the hair, fetching though the latter was.  The gown was called “Twisted Crystal Mesh Tee” and for deconstructionist fashionistas, the piece was a delight of detail in sheer fabric including bell sleeves, a scooped neckline and a form-fitting bodice with an intricately crafted twisted waist, lending a cinched effect which merged effortlessly into a lace-up fastener at the back, constructed with a corset-tie motif: coming or going, she looked good.  So lovely is Sydney Sweeney she would look good in just about anything but she certainly knows how to get the most from a garment, her underwear limited to “nude knickers”, worn with diamond drop earrings and rings from EFFY.

Variety's clip of Sydney Sweeney (moving slightly) with new bob.

Her appearance in that dress of course provoked the digital traffic she would have expected and it’s hard to disagree with the feminist critics who suggested the juxtaposition of well-filled gown with the speech she delivered at the event was a device intended deliberately to illustrate the behavioral phenomenon she’d discussed in earlier interviews: that women can be defined as sexy or serious but not both simultaneously.  As evidence of that, the extent of the on-line coverage of how Ms Sweeney looked in the dress may be compared with the minimal attention afforded the speech she delivered from the podium, the former already joining the Alexandre Vauthier (b 1971) LRD (little red dress) worn by Bella Hadid (b 1996) at the Cannes Film Festival in May 2016 as one of the dresses of the twenty-first century.  Of her words, most of the “cultural commentators” seemed intent on criticizing what they deemed the apparent discontinuity between her wishing to be taken seriously while looking so stunningly sexy, apparently missing the point that in bundling her body, the garment in which it was wrapped and the text she delivered as a single installation, she made her point well, dress and body just part of her text.

Sydney Sweeney with new bob.

Ever since the Canadian theorist Marshall McLuhan (1911–1980) explained the concept in Understanding Media: The Extensions of Man (1964), it’s been understood “the medium is the message”, his theory being it is the channel or technology through which information is transmitted which matters more than the content in shaping human experience and society.  While that obviously wasn’t an absolute rule, the notion was helpful, decades before TikTok, in providing a model of the way a structure can have social effects independent of its content.  To define “medium”, McLuhan cast a wide net, including not only the then familiar (and dominant) television & print, but any channel through which information passes, including speech, gesture and appearance.  The person delivering a message is thus a medium and the reaction of an audience to the words of a glamorous, attractive woman can be very different to that extended to someone plain, even if both recite the same text with the same tonal technique.

Sydney Sweeney with new bob.

So, Ms Sweeney’s dress wasn’t just packaging, it was part of the meaning and that was not what she implied but what the audience inferred; what “the medium is the message” meant was the form of delivery and the embodied qualities of the communicator are inseparable from the content’s impact.  This was heady stuff in 1964 and, thirty-odd years on, the internet would gain critical mass and, at scale, prove his principle but his idea wasn’t new, the line of thought running through Western philosophy from Aristotle (384-322 BC), who called it “ethos”, to Leo Strauss (1899–1973), who wrote of a kind of “authenticity”.  Unfortunately, Strauss was disturbed by the way the writings of Friedrich Nietzsche (1844–1900) were so accessible they were there for Nazis and others to make of them something else, so his own meaning(s) existed in a kind of elaborated code it took some time to learn, but definitely he was in the Aristotelian tradition McLuhan would have understood.  It’s a long way from Nietzsche to Sweeney but from her back to McLuhan, it’s not that far.

Bob identification: By their bob they shall be known

Asymmetrical Bob: Another general term which describes a bob cut with different lengths left and right; they can look good but should not be applied to all styles.  The effect is often most dramatic when combined with some variant of the Shaggy (JBF).

A-line bob: A classic bob which uses slightly longer strands in front, framing the face and, usually, curling under the chin; stylists caution this doesn’t suit all face shapes.

Buzz-cut bob: Known also as the undercut (pixie) bob, and often seen in asymmetric form, this is a kind of extreme inverted mullet: the usual length(s) in the front and close-cropped at the back.  It can be a dramatic look but really doesn’t suit those above a certain BMI or age (although the former seem often unable to resist the look).

Chin-length bob: Cut straight to the chin, with or without a fringe (bangs) but, if the latter is chosen, it’s higher maintenance, needing more frequent trims to retain the sharpness on which the look depends; whether it works best with or without the fringe depends on the face shape.

Inverted bob: A variation on the A-line which uses graduated layers at the back, the perimeter curved rather than cut straight.  Known also as the graduated bob; to look best, the number of layers should be dictated by the thickness of growth.

Shaggy bob: A deliberately messy bob of any style, neatness deprecated with strategic cutting either with scissors or razor, a styling trick best done by experts, otherwise it can look merely unkempt.  The unkempt thing can be a thing if that’s what one wants but, like dyeing hair gray or silver, it’s really suitable only for the very young.  Some call this the choppy and it’s known in the vernacular of hairdressing as the JBF (just been fucked).

Spiky bob: This differs from a JBF in that it’s more obviously stylised.  It can differ in extent but with some types of hair is very high maintenance, demanding daily application of product to retain the directions in which the strands have to travel.  Not all hair is suited to the look and while product can compensate for much, beyond a certain point, there is a law of diminishing returns. 

Shingle bob: A cut tapered very short in the back, exposing the hairline at the neck with the sides shaped into a single curl, the tip of which sits at a chosen point on each cheek.  This needs to be perfectly symmetrical or it looks like a mistake.

Shoulder-length bob: A blunt bob that reaches the shoulders and has very few layers; with some hair it can even be done with all strands the same length.  Inherently, this is symmetrical and a remarkably different effect is created depending on whether it’s done with or without a fringe.  Hairdressers caution this is not a style best suited to “round” faces and with those it can be necessary to experiment, a fringe sometimes improving things, sometimes not.

Speak to the manager bob: Not wishing to lose those customers actually named Karen, the industry shorthand for the edgy (and stereotypically in some strain of blonde) bob didn’t become “Karen”.  The classic SttM is an asymmetric blonde variation of the A-line with a long, side-swept fringe contrasted with a short, spiky cut at the back; emblematic of the style are the “tiger stripes” created by chunky, unblended highlights.  It’s now unfashionable though still seen because it remains the “go-to” cut for women of a certain age who have been persuaded the style they’ve stuck to since they were 19 is no longer flattering.