
Tuesday, May 21, 2024

Mullet

Mullet (pronounced muhl-it)

(1) Any of various teleost food fishes, marine or freshwater and usually gray, of the family Mugilidae (the grey mullets (order Mugiliformes)) or Mullidae (the red mullets (order Syngnathiformes)), having a nearly cylindrical body; a goatfish; a sucker, especially of the genus Moxostoma (the redhorses).

(2) A hairstyle in which the hair is short in the front and at the sides of the head, and longer in the back; called also the “hockey player haircut” and the “soccer rocker”; the most extreme form is called the skullet, replacing the earlier “hockey hair”.

(3) In heraldry, a star-like charge having five or six points unless a greater number is specified, used especially as the cadency (any one of several systems used to distinguish between similar coats of arms belonging to members of the same family) mark of a third son; known also as American star & Scottish star.  The alternative spelling is molet.

(4) In slang (apparently always in the plural), a reference to one’s children (two or more).

(5) In slang, a person who mindlessly follows a fad, trend or leader; a generally dim-witted person.

(6) In dress design, a design based on the hairstyle, built around the concept of things being longer at the back, tapering progressively shorter towards the sides and the front.  The name is modern; variations of the style go back centuries.

1350-1400: The use in heraldry is from the Middle English molet(te), from the Old French molete (rowel of a spur), the construct being mole (millstone (the French meule)) + -ette (the diminutive suffix).  The reference to the fish species dates from 1400–50, from the late Middle English molet, mulet & melet, from the Old French mulet (red mullet), from the Medieval Latin muletus, from the Latin muletus & moletus, from mullus (red mullet), from the Ancient Greek μύλλος (múllos & mýllos) (a Pontic fish), which may be related to melas (black) but the link is speculative.

The use to describe the hairstyle is said to date from 1994, thought to be a shortening of the slang mullethead (blockhead, fool, idiot ("mull" used in the sense of "to dull or stupefy")), popularized and possibly coined by US pop-music group the Beastie Boys in their song Mullet Head (1994), acknowledged by the Oxford English Dictionary (OED) as the first use "in print" although the origin of the use is contested.  Mullethead was also a name used in the mid nineteenth century for a large, flat-headed North American freshwater fish which gained a reputation for stupidity (ie it was easily caught).  As a surname, Mullet is attested in both France and England from the late thirteenth century, the French form thought related to the Old French mul (mule), the English from the Middle English molet, melet & mulet (mullet), a metonymic occupational name for a fisherman or seller of these fish, although some sources do suggest a link to a nickname derived from mule (a beast with a reputation for (1) an ability to carry a heavy burden and (2) stubbornness).  The now less fashionable Australian slang form "stunned mullet" is used to imply that someone appears "especially or unusually dim-witted".

The "mullet" label casts a wide net: Red mullet (Goatfish) (left) and grey mullet (right).

In ichthyology, fish of the family Mugilidae are distinguished variously by modifiers including black mullet, bright mullet, bully mullet, callifaver mullet, grey mullet, diamond mullet, finger mullet, flathead mullet, hardgut mullet, Lebranche mullet, mangrove mullet, pearl mullet, popeye mullet, red mullet, river mullet, sea mullet, so-iuy mullet & striped mullet.  Mullet is a noun and mullety, mulletlike & mulleted are adjectives (as verbs, mulleted and mulleting are non-standard, as is the adjective mulletesque).  The noun plural is mullet if applied collectively to two or more species of the fish and mullets for other purposes (such as two or more fish of the same species and the curious use as a (class-associated) slang term parents use to refer to their children if there are two or more, although use in the singular isn’t recorded; apparently they can have two (or more) mullets but not one mullet).

The Mullet  

Proto-mullet.

The mullet hairstyle goes back a long way.  The Great Sphinx of Giza is thought to be some four and a half thousand years old but evidence of men & women with hair cut short at the front and sides, long at the back, exists from thousands of years earlier.  It’s assumed by historians the cut would variously have been adopted for functional reasons (warmth for the neck and freedom from obstruction of the eyes & face) although aesthetics has probably always been a feature of the human character so it may also have been a preferred style.  There are many findings in the archaeological record and references to the hairstyle appear in the histories of many cultures.  In the West, the acceptability of longer cuts for men was one of the social changes of the 1960s and the mullet was one style to again arise; from there it’s never gone away although, as the mullet came to be treated as a class-identifier, use did become more nuanced, some claiming to wear one ironically.  The other sense in which "proto-mullet" is used is of a mulletlike hairstyle which at the back is shorter than the full-fledged mullet (such a cut once also called the "tailgate" or "mudflap").

Rime of the Ancient Mullet: Samuel Taylor Coleridge (1772–1834).

Opinion remains divided and some schools have gone as far as to ban mullets because of an alleged association with anti-social or disruptive behavior.  At the other end of the spectrum there are mullet competitions with prizes including trophies and bottles of bourbon whiskey.  It's suspected those who disapprove of the style, if asked to pick the "worst mullet", would likely choose the same contestants winning "best mullet" in their categories.  The competitions seem popular and are widely publicized, although the imagery can be disturbing for those with delicate sensibilities not often exposed to certain sub-cultures.  Such folk are perhaps more familiar with the Romantic poet Samuel Taylor Coleridge but there was a time when he wore a mullet, although the portraits which survive suggest his might not have been sufficiently ambitious to win any modern contests.

Emos with variegated tellums: Black & copper (left) and black, magenta, blue & grey (right).

Associated initially with that most reliable of trend-setters, the emo, the tellum (mullet spelled backwards), more helpfully described as the “reverse mullet”, is, exactly as suspected, long in front and short at the back.  It is definitely a thing exclusively of style because it discards the functionality which presumably was the original rationale for the mullet; emos often combine the look with one or more lurid colors, the more patient sometimes adopting a spiky look which can be enlivened with a different color for each spike.  That’s said to be quite high-maintenance.  The asymmetric tellum can be engineered to provide a dramatic look, concealing much of the face, the power of the effect said to lie in forcing the focus onto the one exposed eye.  True obsessives use colored contact lenses to match whatever is the primary hue applied to the hair.

Martina Navratilova (b 1956) playing a backhand shot.

On a tennis court, a mullet is functional and there are headband users who wrongly have been accused of being mulleteers.  No more monolithic than any others, it’s probably absurd to think of any of the component parts of the LGBTQQIAAOP as being a visually identifiable culture but there appears to have been a small lesbian sub-set in the 1980s which adopted the mullet although motives were apparently mixed, varying from (1) chauvinistic assertiveness of the lesbionic, through (2) blatant signalling when advertising for a mate, to (3) just another haircut.  Despite that, there's little to suggest that in isolation a mullet on a woman tends to be used as a GABOSO (general association based on single observation) to assume she's a lesbian.

Caitlyn Jenner (when identified as Bruce) with mullet at different stages of transition.

It also featured in a recent, celebrated case of gender-fluidity, Bruce Jenner (b 1949) photographed sporting a mullet shortly before beginning his transition to Caitlyn Jenner.  However, the mullet may be unrelated to the change, the photographic record confirming his long-time devotion to the cut and, since transitioning to Caitlyn, it seems to have been retired for styles more overtly traditionally feminine.

A MulletFest entrant in the Junior (14 to 17 years) category.

In Australia, the mullet is much associated with the bogan, one of sociology’s more striking cross-cultural overlaps.  The correlation is of course not 1:1: while the perception that all mullet-wearers are bogans is probably about right, not all bogans sport a mullet and they’re even credited with at least popularizing the “skull mullet” which takes the “short at the sides” idea down almost to the skin.  At the institutional level, there’s MulletFest which tours the nation conducting “Best Mullet Competitions” at appropriate events (rodeos, agricultural shows, meetings for those displaying hotted-up cars etc) with inclusive categories including five for children (age-based), rangas (redheads), vintage (for the over 50s), grubby (the criteria unclear) and the mysterious “extreme”.  All entrants are “…judged on their haircut, overall presentation and stage presence, and the person with the ‘Best Mullet of them All’ is crowned on the day and takes home that worthy honour.”  Proceeds from MulletFest events are donated to local charities.

The Mullet Skirt

Charles II (1630–1685; King of Scotland 1649-1651, King of Scotland, England and Ireland 1660-1685), an early adopter of the mullet style, in his coronation robes (circa 1661), oil on canvas by John Michael Wright (1617–1694) (left) and two views of Lindsay Lohan, also with much admired legs, following the example of the House of Stuart, Los Angeles, August 2012 (centre & right).  Charles II got more fun out of life than his father, Charles I (1600–1649; King of England, Scotland & Ireland 1625-1649), and possibly more even than Charles III (b 1948; King of the United Kingdom since 2022), the House of Windsor's latest monarch.  Both Charles I & Charles III also rocked the mullet look for their coronations and fashionistas can debate who wore it best.

Sewing pattern for mullet dress (left) and, on the catwalk, Miranda Kerr (b 1983) demonstrating a pale pink high-low celebrity, prom or graduation party dress, Liverpool Fashion Fest Runway, Mexico City, March 2011 (right).

The style of the mullet skirt long pre-dates the use of the name and the same concept used to be called "tail skirt", "train skirt" or "high-low circle skirt" (which in commercial use often appeared as "Hi-Lo skirt"), the terms still often used by those who find the mere mention of mullet distasteful.  The pattern for the fabric cut is deceptively simple but, as in any project involving anything other than straight lines, it can be difficult to execute and the less volume that's desired in the garment, the harder it becomes to produce with precision.  That so many mullet dresses are bulky is probably a stylistic choice but the volume of fabric is handy for obscuring any inconsistencies.

The cheat cut mullet skirt.

Seamstresses do however have a trick which can work to convert an existing skirt into a mullet although, again, it works best if there's a lot of fabric.  Essentially, the trick is to lay the skirt perfectly flat, achieved by aligning the side seams (if there are no side seams, describe two with chalk lines); use a true, hard surface like a hardwood floor or a table to ensure no variations intrude.  Then draw the cutting line, describing the shape to permit the extent of mulletness desired.  Unless absolutely certain, it's best to cut less, then try on the garment; if it's not enough, re-cut, repeating the process as necessary.  Because a hem will be needed, the cut should allow for the loss of ½ inch (13 mm) of fabric.
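
For those who like to check the numbers before taking scissors to fabric, the taper can be roughed out in advance.  What follows is a minimal, illustrative sketch only: the cosine easing, the function name and the example measurements are assumptions rather than any patternmaking standard, though the hem allowance matches the ½ inch (13 mm) mentioned above.

```python
# Hypothetical helper: mark out a mullet cutting line on a flattened skirt.
# The cosine easing is an assumption chosen to avoid a hard corner at the
# side seam; real patterns may use a different curve.
import math

def cut_depths(front_cm: float, back_cm: float, half_width_cm: float,
               points: int = 9, hem_allowance_cm: float = 1.3):
    """Return (distance from centre front, cut length) pairs along the
    flattened skirt, easing from the short front to the long back and
    adding the hem allowance (1.3 cm ~ 1/2 inch) to every mark."""
    marks = []
    for i in range(points):
        t = i / (points - 1)                    # 0 = centre front, 1 = centre back
        ease = (1 - math.cos(math.pi * t)) / 2  # smooth 0 -> 1 transition
        finished = front_cm + ease * (back_cm - front_cm)
        marks.append((t * half_width_cm, finished + hem_allowance_cm))
    return marks

# Example: 45 cm at the front rising to 80 cm at the back over a 50 cm half-width
for x, depth in cut_depths(45, 80, 50):
    print(f"{x:5.1f} cm from centre front: cut at {depth:.1f} cm")
```

Chalk the computed marks and join them in a smooth curve to approximate the drawn cutting line; the advice above still applies, ie cut conservatively first and re-cut if the effect is insufficient.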

January Jones (b 1978, left) wore a blue “sea wave” piece from the Atelier Versace Spring 2010 collection to that year’s Emmy Awards ceremony and it was definitely a mullet.  Emma Stone’s (b 1988, centre left & centre right) sequined dress from Chanel's Fall 2009 haute couture collection, worn at the 2011 Vanity Fair Oscar party, was one of the season’s most admired outfits but it is not a mullet because it resembles one only when viewed at a certain angle; it should be regarded as an interpretation of the “train skirt”.  Caitlin FitzGerald (b 1983, right) appeared at the 2014 Golden Globes award ceremony in an Emilia Wickstead dress which featured an anything but straight hemline but it was not a mullet because the designer's intent was not to seek a “mullet effect”; it was a dress with a “swishy” skirt.  So, conceptually, the mullet dress is something like adding an “integrated cloak” to an outfit and the implications of that mean the result will sit somewhere on a spectrum; as with all mullets, there is a beginning, a middle and an end.

Monday, November 29, 2021

Accouterment

Accouterment (pronounced uh-koo-ter-muhnt or uh-koo-truh-muhnt)

(1) A clothing accessory or a piece of equipment regarded as an accessory (sometimes essential, sometimes not, depending on context).

(2) In military jargon, a piece of equipment carried by a soldier, excluding weapons and items of uniform.

(3) By extension, an identifying yet superficial characteristic; a characteristic feature, object, or sign associated with a particular niche, role, situation etc.

(4) The act of accoutering; furnishing (archaic since Middle English).

1540-1550: From the Middle French accoutrement & accoustrement, from accoustrer, from the Old French acostrer (arrange, sew up).  As in English, in French the noun accoutrement was used usually in the plural (accoutrements) in the sense of “personal clothing and equipment”, from accoustrement, from accoustrer, from the Old French acostrer (arrange, dispose, put on (clothing); sew up).  In French, the word was used in a derogatory way to refer to “over-elaborate clothing” but was used neutrally in the kitchen, chefs using the word of additions to food which enhanced the flavor.  The verb accouter (also accoutre) (to dress or equip (especially in military uniforms and other gear)) was from the French acoutrer, from the thirteenth century acostrer (arrange, dispose, put on (clothing)), from the Vulgar Latin accosturare (to sew together, sew up), the construct being ad- (to) + consutura (a sewing together), from consutus, past participle of consuere (to sew together), the construct being con- + suere (to sew), from the primitive Indo-European root syu- (to bind, sew).  The Latin prefix con- was from the preposition cum (with), from the Old Latin com, from the Proto-Italic kom, from the primitive Indo-European ḱóm (next to, at, with, along).  It was cognate with the Proto-Germanic ga- (co-), the Proto-Slavic sъ(n) (with) and the Proto-Germanic hansō.  It was used with certain words to add a notion similar to those conveyed by with, together, or joint or with certain words to intensify their meaning.  The synonyms include equipment, gear, trappings & accessory.  The spelling accoutrement (accoutrements the plural) remains common in the UK and much of the English-speaking world which emerged from the old British Empire; the spelling in North America universally is accouterment.  The English spelling reflects the French pronunciation used in the sixteenth century.  Accouterment is a noun; the noun plural (by far the most commonly used form) is accouterments.

In the military, the equipment supplied to (and at different times variously worn or carried by) personnel tends to be divided into "materiel" and "accouterments".  Between countries, at the margins, there are differences in classification but as a general principle: Materiel: The core equipment, supplies, vehicles, platforms etc used by a military force to conduct its operations.  This definition casts a wide vista and covers everything from a bayonet to an inter-continental ballistic missile (ICBM), from motorcycles to tanks and from radio equipment to medical supplies.  Essentially, in the military, “materiel” is used broadly to describe tangible assets and resources used in the core business of war.  Accouterments: These are the items or accessories associated with a specific activity or role.  In some cases, an item classified as an accouterment could with some justification be called materiel and there is often a tradition associated with the classification.  In the context of clothing for example, the basic uniform is materiel whereas things like belts, holsters, webbing and pouches are accouterments, even though the existence of these pieces is essential to the efficient operation of weapons which are certainly materiel.

The My Scene Goes Hollywood Lindsay Lohan Doll was supplied with a range of accessories and accouterments.  Items like sunglasses, handbags, shoes & boots, earrings, necklaces, bracelets and the faux fur "mullet" frock-coat were probably accessories.  The director's chair, laptop, popcorn, magazines, DVD, makeup case, stanchions (with faux velvet rope) and such were probably accouterments.

In the fashion business, one perhaps might be able to create the criteria by which it could be decided whether a certain item should be classified as “an accessory” or “an accouterment” but it seems a largely pointless exercise and were one to reverse the index, a list of accessories would likely be as convincing as a list of accouterments.  Perhaps the most plausible distinction would be to suggest accessories are items added to an outfit to enhance or complete the look (jewelry, handbags, scarves, hats, sunglasses, belts etc) while accouterments are something thematically related but in some way separate; while one might choose the same accessories for an outfit regardless of the event to be attended, the choice of accouterments might be event-specific.  So, the same scarf might be worn because it works so well with the dress but the binoculars would be added only if going to the races, the former an accessory to the outfit, the latter an accouterment for a day at the track.  That seems as close as possible to a working definition but many will continue to use the terms interchangeably.

Wednesday, October 20, 2021

Puffer

Puffer (pronounced puhf-er)

(1) A person or thing that puffs.

(2) Any of various fishes of the family Tetraodontidae, noted for the defense mechanism of inflating (puffing up) its body with water or air until it resembles a globe, the spines in the skin becoming erected; several species contain the potent nerve poison tetrodotoxin.  Also called the blowfish or globefish.

(3) In contract law, the casual term for someone who produces “mere puff” or “mere puffery”, the term for the type of exaggerated commercial claim tolerated by law.

(4) In cellular automaton modelling (a branch of mathematics and computer science), a finite pattern which moves across the grid while leaving behind a trail of debris (see the sketch following this list).

(5) In auctioneering, one employed by the owner or seller of goods sold at auction to bid up the price; a by-bidder (now rare, the term “shill bidders” or “shills” more common).

(6) In marine zoology, the common (or harbour) porpoise.

(7) A kier used in dyeing.

(8) In glassblowing, a soffietta (a usually swan-necked metal tube, attached to a conical nozzle).

(9) Early post-war slang for one who takes drugs by smoking and inhaling.

(10) In mountaineering (and latterly in fashion), an insulated, often highly stylized puffy jacket or coat, stuffed with various forms of insulation.

(11) As Clyde puffer, a type of cargo ship used in the Clyde estuary and off the west coast of Scotland.

(12) In electronics and electrical engineering, a type of circuit breaker.

(13) A manually operated medical device used for delivering medicines into the lungs.

(14) As puffer machine, a security device used to detect explosives and illegal drugs at airports and other sensitive facilities.

(15) In automotive engineering, a slang term for forced induction (supercharger & turbocharger), although always less common than “blower”.
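
Of the senses above, the cellular automaton puffer (definition (4)) is the one which can be demonstrated rather than merely described.  Below is a minimal sketch of a Conway's Game of Life engine in Python (the sparse-set representation is just one common approach and nothing here comes from any particular library).  For brevity it is seeded with a glider, a “clean” spaceship whose five cells are easy to verify by hand; a true puffer (such as Gosper's 1971 puffer train, whose larger cell list would typically be loaded from a pattern file) run on the same engine moves in the same way but leaves stable debris behind, so its live-cell count grows without bound.

```python
# Minimal Game of Life engine over a sparse set of live cells (rule B3/S23).
from collections import Counter

def step(live: set[tuple[int, int]]) -> set[tuple[int, int]]:
    """Advance one generation: tally the neighbours of every live cell,
    then apply birth-on-3 and survival-on-2-or-3."""
    counts = Counter(
        (x + dx, y + dy)
        for (x, y) in live
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

# A glider: moves diagonally and leaves no debris, so its population stays at 5.
# Seeding a genuine puffer pattern instead would see len(cells) keep rising
# as stable debris accumulates in its wake.
cells = {(0, 1), (1, 2), (2, 0), (2, 1), (2, 2)}
for _ in range(20):
    cells = step(cells)
print(len(cells))  # 5 for the glider; unbounded growth marks a puffer
```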

1620–1630: A compound word, puff + -er.  Puff is from the Middle English puff & puf, from the Old English pyf (a blast of wind, puff).  It was cognate with the Middle Low German puf & pof.  The –er suffix is from the Middle English –er & -ere, from the Old English -ere, from the Proto-Germanic -ārijaz, thought usually to have been borrowed from the Latin –ārius and reinforced by the synonymous but unrelated Old French –or & -eor (the Anglo-Norman variant was -our), from the Latin -(ā)tor, from the primitive Indo-European -tōr.  It is added to verbs (typically denoting a person or thing that does the action indicated by the root verb) and forms an agent noun.  The original form from the 1620s was an agent noun from the verb puff, the earliest references to those who puffed on tobacco, soon extended to steamboats and steam engines generally when they appeared.  The sense of "one who praises or extols with exaggerated commendation" is from 1736 which, as “mere puff” or “mere puffery”, in 1892 entered the rules of contract law in Carlill v Carbolic Smoke Ball Company ([1892] 2 QB 484 (QBD)) as part of the construction limiting the definition of misrepresentation.  The remarkable fish which inflates itself in defense was first noted in 1814, the meanings relating to machinery being adopted as the industrial revolution progressed although the more virile “blower” was always preferred as a reference to supercharging, puffer more appropriate for the hand-held inhalers used by those suffering a variety of respiratory conditions.

Puffer Jackets and beyond

Calf-length puffer coats.

The first down jacket, a lightweight, waterproof and warm coat for use in cold places or at altitude and known originally as an eiderdown coat, appears to be the one designed by Australian chemist George Finch (1888-1970) for the 1922 Everest expedition but a more recognizable ancestor was the Skyliner, created by American Eddie Bauer (1899-1986) in 1936, his inspiration being the experience of nearly losing his life to hypothermia on a mid-winter fishing trip.  Using trapped air warmed by the body as a diver’s wet suit uses water, Bauer’s imperative was warmth and protection, but he created also a visual style, one copied in 1939 by Anglo-American fashion designer, Charles James (1906-1978) for his pneumatic jacket, the Michelin Man-like motif defining the classic puffer look to this day.

Lindsay Lohan in puffer vest with Ugg boots, Salt Lake City, Utah, 2013 (left) and in puffer jacket, New York City, 2018 (right).

It was in the late 1940s that it began to enjoy acceptance as a fashion item, marketed as evening wear, and it was sold in this niche in a variety of materials until the 1970s when a new generation of synthetic fibres offered designers more possibilities, including the opportunity to create garments with the then novel characteristic of being simultaneously bulky and lightweight yet able to retain sculptured, stylized shapes.  These attributes enabled puffer jackets to be made for the women’s market, some of which used a layering technique to create their effect, and these were instantly popular.  Although initially in mostly dark or subdued colors, by the 1980s vibrant colors had emerged as a trend, especially in Italy and England.  By the twenty-first century, although available across a wide price spectrum, the puffer as a style cut across class barriers although those selling the more expensive did deploy their usual tricks to offer their buyers class identifiers, some discreet, some not.

The puffer started life as a jacket and it took a long time to grow but by the 2000s, calf-length puffers had appeared as a retail item after attracting comment, not always favorable, on the catwalks.  Although not selling in the volumes of the jackets, the costs of lengthening can’t have been high because ankle and even floor-length puffers followed.  Down there it might have stopped but, in their fall 2018 collection released during Milan Fashion Week, Italian fashion house Moncler, noted for their skiwear, showed puffer evening gowns, the result of a collaborative venture with Valentino’s designers.  Available in designer colors as well as glossy black, the line was offered as a limited edition, which was probably one of the industry’s less necessary announcements given the very nature of the things would tend anyway to limit sales.  The ploy though did seem to work; even at US$2,700 for the long dress and a bargain US$3,565 for the cocoon-like winter cape, demand was said to exceed supply so, even if not often worn, puffer gowns may be a genuine collector’s item.

A Dalek.

It wasn’t clear what might have been the inspiration for the conical lines although the ubiquity of the shape in industrial equipment was noted.  It seemed variously homage to the burka, a sculptural installation of sleeping bags or the stair-challenged Daleks, the evil alien hybrids of the BBC's Dr Who TV series.  It also picked up existing motifs from fashion design, appearing even as the playful hybrid of the mullet dress and a cloak.

A monolith somewhere may also have been a reference point but the puffer gown was not stylistically monolithic.  Although to describe the collection as mix-n-match might be misleading, as well as designer colors, some of the pieces technically were jackets, there were sleeves long and short and, though most hems went to the floor, the mullet offered variety, especially for those drawn to color combination.  Most daring, at least in this context, were the sleeveless, some critics suggesting this worked best with gowns cinched at the middle.

By the time of the commercial release early in 2019, solid colors weren’t the only offering, the range reflecting the influence of Ethiopian patterns although, in a nod to the realities of life, only puffer jackets were made available for purchase.  Tantalizingly (or ominously, depending on one’s view), Moncler indicated the work was part of what they called their “genius series”, the brand intending in the future to collaborate with other designers as well as creating a series of Moncler events in different cities, the stated aim to “showcase the artistic genius found in every city”.  The venture was pursued but in subsequent collections, many found the quality of genius perhaps too subtly executed for anyone but fellow designers and magazine editors to applaud.  The shock of the new has become harder to achieve.

Saturday, November 29, 2025

Grammatology

Grammatology (pronounced gram-uh-tol-uh-jee)

(1) Historically, the scientific study of systems of writing.

(2) In latter-day use, a critique of orthodox linguistics.

Early 1800s (in its original sense): The construct was gramma(r) + -t- + -ology; the modern (some would say post-modern) re-purposing was first used in 1967.  Dating from the mid fourteenth century, grammar was from the Middle English gramery & gramere, from the Old French gramaire (classical learning), from the unattested Vulgar Latin grammāria, an alteration of the Classical Latin grammatica, from the Ancient Greek γραμματική (grammatikḗ) (skilled in writing), from γράμμα (gramma) (line of writing), from γράφω (gráphō) (write), from the primitive Indo-European gerbh (to carve, to scratch).  It displaced the native Old English stæfcræft; a doublet of glamour, glamoury, gramarye & grimoire.  In English, grammar is used to describe the system of rules and principles for the structure of a language (or of languages in general) but in colloquial use it’s applied also to morphology (the internal structure of words) and syntax (the structure of phrases and sentences of a language).  In English, generative grammar (the body of rules producing all the sentences permissible in a given language, while excluding all those not permissible) has for centuries been shifting and it’s now something policed by the so-called “grammar Nazis”, some of whom insist on enforcing “rules” regarded by most as defunct as early as the nineteenth century.

The suffix -ology was formed from -o- (as an interconsonantal vowel) + -logy.  The origin in English of the -logy suffix lies with loanwords from the Ancient Greek, usually via Latin and French, where the suffix (-λογία) is an integral part of the word loaned (eg astrology from astrologia) since the sixteenth century.  French picked up -logie from the Latin -logia, from the Ancient Greek -λογία (-logía).  Within Greek, the suffix is an -ία (-ía) abstract from λόγος (lógos) (account, explanation, narrative), and that a verbal noun from λέγω (légō) (I say, speak, converse, tell a story).  In English the suffix became extraordinarily productive, used notably to form names of sciences or disciplines of study, analogous to the names traditionally borrowed from the Latin (eg astrology from astrologia; geology from geologia) and by the late eighteenth century, the practice (despite the disapproval of the pedants) extended to terms with no connection to Greek or Latin such as those building on French or German bases (eg insectology (1766) after the French insectologie; terminology (1801) after the German Terminologie).  Within a few decades of the intrusion of modern languages, combinations emerged using English terms (eg undergroundology (1820); hatology (1837)).  In this evolution, the development may be thought similar to the latter-day proliferation of “-isms” (fascism; feminism et al).  Grammatology & grammatologist are nouns, grammatological is an adjective and grammatologically is an adverb; the noun plural is grammatologies.

Google ngram (a quantitative and not qualitative measure): Because of the way Google harvests data for their ngrams, they’re not literally a tracking of the use of a word in society but can be usefully indicative of certain trends (although one is never quite sure which trend(s)), especially over decades.  As a record of actual aggregate use, ngrams are not wholly reliable because: (1) the sub-set of texts Google uses is slanted towards the scientific & academic and (2) the technical limitations imposed by the use of OCR (optical character recognition) when handling older texts of sometimes dubious legibility (a process AI should improve).  Where numbers bounce around, this may reflect either: (1) peaks and troughs in use for some reason or (2) some quirk in the data harvested.

Grammatology in its re-purposed sense was from the French grammatologie, introduced to the world by French philosopher Jacques Derrida (1930-2004) in his book De la grammatologie (Of Grammatology (1967)).  It may be unfair to treat Derrida’s use as a “re-purposing” because although the word grammatology (literally “the study of writing”) had existed since the early nineteenth century, it was a neologism, one of an expanding class of “-ology” words (some of them coined merely for ironic or humorous effect) and there was prior to 1967 scant evidence of use, those studying languages, literature or linguistics able satisfactorily to undertake their work without much needing “grammatology”.  On the basis of the documents thus far digitized, “grammatology” was never an accepted or even commonly used term in academia and although it seems occasionally to have been used variously in fields related to “the study of writing systems” (apparently as a synonym for paleography, epigraphy, writing-system classification or orthographic description) it was only in passing.  Until the modern era, words “going viral” happened relatively infrequently and certainly slowly and, as used prior to 1967, “grammatology” was attached to no theoretical construct or school of thought and described no defined discipline, the word indicative, empirical and neutral.  If “pre-modern” grammatology could be summed up (a probably dubious exercise), it would be thought a technical term for those concerned with scripts, alphabets, symbols and the historical development of writing systems.  Tempting though it may seem, it cannot be thought of as proto-structuralism.

The novelty Derrida introduced was to argue the need for a discipline examining the history, structure and philosophical implications of writing, his particular contention being that writing is not secondary to speech, a notion at odds with centuries of Western metaphysics.  At the time, it was seen as a radical departure from orthodoxy, Derrida exploring (in the broadest imaginable way) the possibilities of writing, not simply the familiar physical inscriptions, but anything that functions as “trace”, “différance” or symbolic marking, the core argument being writing is not secondary to speech (although in the narrow technical sense it may be consequent); rather, it reveals the instability and “constructedness” of language and thereby meaning.

De la grammatologie (First edition, 1967) by Jacques Derrida.

Ambitiously, what Derrida embarked upon was to do for the study of writing something like what Karl Marx (1818-1883) claimed to have done to the theories of Hegel (Georg Wilhelm Friedrich Hegel (1770-1831)): “turn things on their head”, a process that can be classified under four themes: (1) Writing as prior to speech (as opposed to the earlier “writing is derivative of speech”).  What this meant was writing had to be considered as “originary”, implying structures of difference could precede both writing and speech.  (2) Writing (the act as opposed to the content) as a philosophical concept rather than a finite collection of technical objects to be interpreted or catalogued on the basis of their form of assembly.  (3) Grammatology becomes a critique (as opposed to the earlier descriptive tool) of science, reimagining it as a critical discipline exposing the logocentrism of Western thought.  Logocentrism describes the tendency to prioritize “logos” (in academic use a word encompassing words, speech or reason) as the ultimate foundation for truth and meaning (with speech often privileged over writing).  Logocentrism was at the core of the Western philosophical tradition that assumed language accurately and directly can express an external reality, the companion notion being rational thought represents the highest form of knowledge.  Derrida labelled this a false hierarchy that devalued writing and other non-verbal forms of communication and feeling.  (4) Writing is expanded beyond literal inscriptions.  Whereas the traditional Western view had been that writing was simply the use of an alphabet, cuneiform, hieroglyphs and such, what Derrida suggested was the concept of writing should be extended to any system of differences, traces or marks; the condition for meaning itself.

So Derrida took grammatology from a dusty corner of the academy where it meant (for the small number of souls involved) something like “a hypothetical technical study of writing systems” and re-invented it as a philosophical discipline analysing the deeper structures that make any representation or meaning possible.  The notion of it as a tool of analysis is important because deconstruction, the word Derrida and other “celebrity philosophers” made famous (or infamous depending on one’s stance on things postmodern), is often misunderstood as something like “destruction” when really it is a form of analysis.  Had Derrida’s subversive idea been presented thirty years earlier (had the author been able to find a publisher), it’s possible it would have been ignored or dismissed by the relatively few who then read such material.  However, in the post-war years there was an enormous expansion in both the number of universities and the cohorts of academics and students studying in fields which would come to be called “critical theory” so there was a receptive base for ideas overturning orthodoxy, thus the remarkable path deconstruction and postmodernism for decades tracked.

Deconstruction in art, Girl With Balloon by street artist Banksy, before, during & after a (successful) test deconstruction (left) and in its final form (right), London, October 2018.

There is an ephemeral art movement but usually it involves works which wholly are destroyed or entirely disappear.  Banksy’s Girl With Balloon belonged to a sub-category where (1) the deconstruction process was part of the art and (2) the residual elements were “the artwork”.  Banksy’s trick with this one was as the auctioneer’s hammer fell (at Stg£1m), an electric shredder concealed at the base of the frame was activated, the plan being to reduce the work “to shreds” in a pile below.  However, it’s claimed there was a technical glitch and the shredder stopped mid-shred, meaning half remained untouched and half, neatly sliced, hung from the bottom.  As a headline grabbing stunt it worked well but the alleged glitch worked better still, art experts mostly in agreement the work as “half shredded” was more valuable than had it been “wholly shredded” and certainly more than had it remained untouched in the frame.  Thus: “meaning is just another construct which emerges only through differences and deferrals”.

From a distance of sixty-odd years, in the milieu of the strands of thought which are in a sense part of a “new orthodoxy”, it can be hard to understand just what an impact Derrida and his fellow travellers (and, just as significantly, his critics) had and what an extraordinary contribution deconstruction made to the development in thought of so many fields.  Derrida in 1967 of course did not anticipate the revolutionary movement he was about to trigger, hinted at by his book starting life as a doctoral thesis entitled: De la grammatologie: Essai sur la permanence de concepts platonicien, aristotélicien et scolastique de signe écrit (Of Grammatology: Essay on the Permanence of Platonic, Aristotelian and Scholastic Concepts of the Written Sign).  A typically indigestible title of the type beloved by academics, the clipping for wider distribution was on the same basis as Adolf Hitler’s (1889-1945; Führer (leader) and German head of government 1933-1945 & head of state 1934-1945) publisher deciding Mein Kampf (My Struggle) was snappier than Viereinhalb Jahre (des Kampfes) gegen Lüge, Dummheit und Feigheit (Four and a Half Years [of Struggle] Against Lies, Stupidity and Cowardice).  There’s a reason authors usually don’t have the final say on titles and cover art.

Derrida acknowledged linguistics in the twentieth century had become a sophisticated form of study but maintained the discipline was failing to examine its most fundamental assumptions; indeed his point was those core values couldn’t be re-evaluated because they provided the framework by which language was understood.  What Derrida identified as the superstructure which supported all was the commitment to the primacy of speech and presence and, because the prevailing position in linguistics was that speech was primary, the assumption worked to shape all that followed.  It was the influence of the Swiss philosopher & semiotician Ferdinand de Saussure (1857–1913) which was profound in positioning speech as the natural, original, living form of language with writing as a secondary, derivative (and, in a sense, artificial although this was never wholly convincing) representation of speech.  What made the Saussureian position seem compelling was it sounded logical, given the consensus it was human speech which predated the development of writing, the latter thus the product of the former, and so persuasive was the thesis the hierarchy came to provide the framework for other disciplines within linguistics including phonology (the study of the way sounds function in languages) and morphology (the study of the internal structure of words, built from morphemes, the smallest linguistic units able to carry meaning).  What this meant was syntax was also defined by speech (writing a mere convenient means of exchange) with phonetics (the study of the physical sounds of human speech) the true source of the material of language.  Thus for generations, in academic discourse, historical linguistics was documented primarily by an analysis of changes in sound, with orthography (the methods by which a language or its sounds are represented by written symbols) a mechanical by-product.

Deconstruction in fashion.  Lindsay Lohan in Theia gown, amfAR gala, New York City, February 2013 (left) and after “deconstruction by scissors” (right).

All gowns are “constructed” (some 3D printed or even “sprayed-on”) but sometimes circumstances demand they be “deconstructed”.  On the night, the shimmering nude and silver bugle-beaded fringe gown from Theia’s spring 2011 collection was much admired but there was an “unfortunate incident” (ie the fabric was torn) and, apparently using a pair of scissors, there was some ad-hoc seamstressery to transform the piece into something described as a “mullet minidress”.  That turned out to be controversial because the gown was on loan for the night but such things are just part of the cost of doing business and, with its Lohanic re-imagining, it’s now an artefact.

Derrida didn’t dispute the historic timelines; his point was that in defining linguistics based on this hierarchy, it became impossible to question the orthodoxy from within.  In a classic example of how deconstruction works, he argued the hierarchy was based not on the historical sequence of events (ie writing coming after speech) but was a culturally defined attachment to the idea of presence, voice and authentic meaning; with speech entrenched in its primacy, no discipline within linguistics was able fully to study writing because of this structural prejudice positioning writing as an auxiliary system, a mere notation of sounds encoding the pre-existing spoken language.  That didn’t mean writing couldn’t be studied (as for centuries it had been) but that it could be considered only a tool or artefact used to record speech and never a primary object of meaning.  While there were all sorts of reasons to be interested in writing, for the reductionists who needed to get to the essence of meaning, writing could only ever be thought something mechanistic and thus was philosophically uninteresting.  So, if linguistics was unable to analyse writing as (1) a structure independent of speech, (2) a fundamental element of thought processes, (3) a source of new or changed meanings or (4) a construct where cultural and philosophical assumptions are revealed, that would imply only speech could create meaning with writing a mere form of its expression.  Daringly thus, what Derrida demanded was for writing to be seen as conceptually prior to speech, even if as a physical phenomenon it came later.  In 1967, linguistics couldn’t do that while maintaining the very foundations on which it was built.

Never has there been published a "Grammatology for Dummies" but there is The Complete Idiot's Guide to Literary Theory and Criticism (2013) by Dr Steven J. Venturino.

At this point things became more technical but Derrida did provide a simplified model, explaining linguistics existed as the study of signs and not of traces, his work depending ultimately on certain distinctions: (1) signs assume stable signifieds and (2) traces imply meaning is always deferred but never present.  For orthodox linguistics to work, the assumption had to be that signs enjoy a stability of meaning within a system; this Derrida dismissed as illusory, arguing (1) meaning is just another construct which emerges only through differences and deferrals, (2) no signified is ever (or can ever fully be) “present” and (3) speech is no closer to meaning than writing.  By its own definitions in 1967, linguistics could not accommodate that because (1) its methods depended on systematic relations sufficiently stable to permit analysis, (2) it needed constant objects (definable units such as phonemes, morphemes and syntactic structures) and (3) it relied on signs which could be described with the required consistency (ie “scientifically”).  Any approach grounded in trace and difference lay beyond the boundaries of orthodox linguistics.

So the conflict would seem irreconcilable but that’s true only if viewed through the lens of a particular method; really, linguistics was empirical and grammatology was philosophical and in that they were alternative rather than competing or even parallel paths.  If linguistics was a system of codification, then grammatology was a critique of the foundations of linguistics and Derrida made clear he was not attempting to reform linguistics simply because that couldn’t be done; any attempt to interpolate his ideas into the discipline would have meant it ceased to be linguistics.  He wanted a new discipline, one which rather than empirically describing and categorising language and its elements, stood back and asked what in the first place made such systems possible.  That meant it was a transcendental rather than empirical process, one studying the conditions of representation and the metaphysics implicit in the idea of signification.  Writing thus was not merely marks on a surface but a marker of a difference in being.

The twist in the tale is that although De la grammatologie was highly influential (especially after an English edition appeared in 1976), grammatology never became a defined, institutionalised academic field in the way Derrida envisioned, at least supplementing departments of linguistics, anthropology and philosophy.  That was due less to the well-documented phenomenon of institutional inertia than to it proving impossible for any consensus to be reached about what exactly “grammatological analysis” was or what constituted “grammatological research”.  Pleasingly, it was the structuralists who could account for that by explaining grammatology was a critique of the metaphysics underlying other disciplines rather than a method for generating new empirical knowledge.  Fields, they noted, were likely organically to grow as the tools produced were picked up by others to be applied to tasks; grammatology was a toolbox for dismantling tools.

Jacques Derrida with pipe, deconstructing some tobacco.

Even if Derrida’s concepts proved sometimes too vague even for academics, the influence was profound and, whether as a reaction or something deterministic (advances in computer modelling, neurology and such), the discipline of linguistics became more rather than less scientific, the refinements in the field of generative grammar in particular seen as something of a “doubling down” of resistance to Derrida’s critique, something reflected too in anthropology which came even more to value fieldwork and political economy, philosophical critiques of writing thought less helpful.  So the specialists not only clung to their speciality but made it more specialized still.  Grammatology did however help create genuinely new movements in literary theory, the most celebrated (and subsequently derided) being deconstruction where Derrida’s ideas, such as interpretation being an infinite play of differences and the meaning of texts being inherently unstable, created one of the more radical schools of thought in the post-war West, introducing to study concepts such as paratext (how academics “read between and beyond the lines”), the trace (the mark of something absent, a concept that disrupts the idea of pure presence and self-contained meaning) and marginalia (used here as an abstract extension of what an author may have “written in the margins” to encompass that which may seem secondary to the main point but is actually crucial to understanding the entire structure of thought, blurring the (literal) line between what lies inside and outside a text).

Derrida for Beginners (2007) by Jim Powell (illustrated by Van Howell).  One has to start somewhere.

The movement became embedded in many English and Comparative Literature departments as well as in post-structuralism and Continental philosophy.  Modern beasts like media studies & cultural theory are (in their understood form) unthinkable without deconstruction and if grammatology didn’t become “a thing”, its core elements (difference, trace etc) for decades flourished (sometimes to the point of (published) absurdity) and although not all agree, some do argue it was Derrida’s subversion in 1967 which saw the field of semiotics emerge to “plug the gaps” left by the rigidity of traditional linguistics.  Of course, even if grammatology proved something of a cul-de-sac, Derrida’s most famous fragment, “Il n'y a pas de hors-texte” (literally “there is no outside-text”), endured to underpin deconstruction and postmodernism generally.  Intriguingly for a concept from linguistics, the phrase took on a new life in the English-speaking world where it came to be understood as “everything is text”, an interpretation which created a minor publishing industry.  In English, it’s a marvellously literalist use and while it does to an extent overlap with the author’s original intention, Derrida meant there is (1) no access to pure, unmediated presence and (2) no meaning outside interpretation and no experience outside context.  In using texte he was referring to the interplay of differences, traces, relations and contexts that make meaning possible (ie not literally the words as they appear on a page).  What that meant was all acts were “textual” in that they must be interpreted and are intelligible only within systems of meaning; the phrase a philosophical statement about signification and mediation, not characters printed on a page.

Fiveable's diagram of what we need to know to understand literature.  Hope this helps.

However, demonstrating (in another way) the power of language, the “everything is text” movement (“cult” may once have been a better word) in English came to be understood as meaning no reality could exist beyond language; everything (literally!) is text because it is words and discourse which both construct and describe reality.  That notion might have remained in an obscure ivory tower were it not for the delicious implication that values such as right & wrong and true & false are also pieces of text with meanings able to be constructed and deconstructed.  That meant there was no stable “truth” and nothing objectively was “wrong”; everything just a construct determined by time, place and circumstances.  That Derrida never endorsed this shocking relativism was noted by some but academics and students found so intoxicating the notion of right & wrong being variables that “everything is text” took on a life of its own as a kind of selective nihilism which is, of course, quite postmodern.  Again, language was responsible because the French texte was from the Latin textus, from texō (weave) and while in French it can mean “text” (in the English sense), among philosophers it was used metaphorically to suggest “weave together”; “an interconnected structure” in the sense of the Latin textus (woven fabric); it was this meaning Derrida used.  Had the English-speaking world remained true to the original spirit of Il n'y a pas de hors-texte it would have entered the textbooks as something like “there is nothing outside the interplay of signs and contexts; there is no meaning outside systems of interpretation” and perhaps have been forgotten but “everything is text” defined and seduced a movement.  Thus, it can be argued things either were “lost in translation” or “transformed by translation” but for the neo-Derridaists there’s the satisfaction of knowing the meaning shift was an example of “grammatology in action”.