
Friday, April 22, 2022

Interpolate & Extrapolate

Interpolate (pronounced in-tur-puh-leyt)

(1) To introduce (something additional or extraneous) between other things or parts; interject; interpose; intercalate; to make additions, interruptions, or insertions.

(2) In mathematics, to estimate (a value of a function) between the values already known or determined.

(3) To alter a text by the insertion of new matter (with a long history of being applied especially if done deceptively or without authorization but technically a neutral term and can be used either way).

(4) To insert (additional or spurious material) in this manner.

1605–1615: From the Latin interpolātus, past participle of interpolāre (to make new, refurbish, touch up; to give a new appearance to), the construct being inter- (between, among, together) + -polā- (verb stem akin to polīre (to smooth or polish)) + -tus (the past participle suffix), from the primitive Indo-European root pel- (to thrust, strike, drive), the connecting notion being "to full cloth".  The sense evolved in Latin from the neutral "refurbish" to the slightly more loaded "alter the appearance of" to the frankly accusatory "falsify" (especially or specifically by adding new material).  By the early fifteenth century Middle English had gained interpolen in a similar sense and by the 1650s also interpolator, from the Late Latin interpolator (one who corrupts or spoils), agent noun from the past-participle stem of the Latin interpolāre.  The noun interpolation (that which is interpolated) dates from the 1670s and appears to have evolved both from the seventeenth century French interpolation and directly from the Latin interpolationem (nominative interpolatio), from the past-participle stem of interpolāre.  Interpolate, interpolated & interpolating are verbs, interpolater (or interpolator) & interpolation are nouns, interpolable, interpolatory & interpolative are adjectives and interpolatively is an adverb.

Extrapolate (pronounced ik-strap-uh-leyt)

(1) To infer (an unknown) from something that is known; an evidence-based conjecture.

(2) In statistics, to estimate (the value of a variable) outside the tabulated or observed range.

(3) In mathematics, to estimate (a function that is known over a range of values of its independent variable) to values outside the known range.

(4) To perform extrapolation.

1830s: The construct was extra- + -polate (extracted and borrowed from interpolate).  The verb extrapolate in the sense of "make an approximate calculation by inferring unknown values from trends in the known data" became popular among astronomers, statisticians, economists & mathematicians after appearing in an 1862 Harvard Observatory account of Donati's Comet (C/1858 L1 & 1858 VI), observed in 1858.  In contemporary accounts, it was said to have been a word used since the 1830s by the English mathematician and astronomer Sir George Airy (1801-1892).  Extrapolation (an approximate calculation made by inferring unknown values from trends in the known data) dates from 1867 and was the noun of action from extrapolate, by analogy with the long-established interpolation, although the original sense was "an inserting of intermediate terms in a mathematical series", the transferred sense of "drawing of a conclusion about the future based on present tendencies" adopted from 1889.  Extrapolate, extrapolated & extrapolating are verbs, extrapolater (or extrapolator) & extrapolation are nouns, extrapolable, extrapolatory & extrapolative are adjectives and extrapolatively is an adverb.

Extrapolation and Interpolation

The common root of the words is the Latin verb polīre (to polish), which in this context means "adding finish" to a data-set by supplying what's missing, but it is the prefix which is most useful in distinguishing between the two: inter- meaning "between" or "among", and extra-, "outside" or "beyond".  The two words look similar and at first glance it'd be not unreasonable to assume they might be antonyms but, although related in use and tangled in history, they are used in different ways: one is highly nuanced while the other is sometimes applied correctly yet still induces the drawing of erroneous or at least misleading conclusions.  Interpolation refers to inserting something between other things, while extrapolation is the act of drawing conclusions about something unknown based on what is known.  In mathematics, the meanings are uncontroversial: interpolation is the process of determining an unknown value within a sequence based on other points in that set, while extrapolation is the process of determining an unknown value outside of a set based on the existing data (often expressed as a "curve").  Interpolation is a commonly used tool of mathematicians, statisticians and others in the data-based sciences where it's necessary to determine a function's value based on the value of other points: an unknown value within the sequence is determined based on what else is in the sequence.
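To make the mathematical distinction concrete, here is a minimal sketch (assuming NumPy is available; the data points are invented for illustration) which interpolates a value between known points and then, because np.interp will not venture beyond the data, fits a line to extrapolate past the edge of the set.

```python
# A minimal sketch of interpolation vs extrapolation, assuming NumPy.
# The data set (y = 2x) is invented purely for illustration.
import numpy as np

known_x = np.array([1.0, 2.0, 3.0, 4.0])   # points where the function is known
known_y = np.array([2.0, 4.0, 6.0, 8.0])   # the observed values

# Interpolation: estimate a value *between* the known points.
y_between = np.interp(2.5, known_x, known_y)        # -> 5.0

# Extrapolation: estimate a value *beyond* the known range.  np.interp
# clamps at the edges, so fit a line and evaluate it outside the data;
# the answer is only as good as the assumption the trend continues.
slope, intercept = np.polyfit(known_x, known_y, 1)
y_beyond = slope * 6.0 + intercept                  # -> 12.0

print(y_between, y_beyond)
```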

Interpolation, used beyond mathematics, can be a loaded word because it's the act of introducing something (additional or extraneous) between other parts, usually in text or musical notation, and thus the technical equivalent of "insert" or (sometimes) "interject" or "interpose".  Interpolation can thus be a merely neutral description but, because of the history of the word (in Latin it evolved from the neutral "refurbish" to the slightly more loaded "alter the appearance of" to the frankly accusatory "falsify", especially by adding new material), it can imply that what has been inserted is spurious, false, misleading or done with some other nefarious purpose.  It's thus a word which needs to be used with caution lest implications be drawn where no inference was intended.

A big word with lots of syllables, interpolate may be unfamiliar to many and that's maybe why it has sometimes been used apparently in an attempt to impart some sense of gravitas, or perhaps to disguise what's really happening.  In pop music, sampling (the interpolation of other people's music into one's own) is now probably a sub-genre and it's well understood although, despite the involvement of courts and copyright lawyers, the distinctions between sampling, interpretation and actual appropriation, although well-trimmed, remain frayed at the margins and all three can be interpolated.  Once derided as a form of plagiarism, sampling seems to have gained respectability, at least among those who practice the art, the critical legal device apparently being to sample by using a fragment from a previously recorded song, but re-recording rather than directly copying the original.  The origin of the practice appears to be as a work-around for when the copyright holder refuses to license the original for sampling purposes.  Used in this way, only a publisher's permission is required although in some common-law jurisdictions, the original can be subject to a compulsory licensing regime.

Extrapolation is related to deduction, an act of drawing a conclusion about something unknown based on what is known, so the verb extrapolate is often used synonymously with infer and deduce.  However, in mathematics, while the act of interpolation involves a closed data set with defined low and high values, extrapolation involves estimating the value of a variable or function outside an observed range, so it can be necessary to understand the context (social, economic etc) of the numbers being used in the exercise.  A Rolls-Royce dealership which has a good month and sells ten cars should probably not from that data-set extrapolate that in the year ahead it will sell 120; other factors need to be considered beyond the simple math.
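As a sketch of that caution (the sales figure is the example's; the discount factor below is invented for illustration, not an industry number), the arithmetic of naive extrapolation is trivial and the error lies entirely in the unstated assumption that one good month's conditions persist for twelve.

```python
# The dealership example: naive extrapolation vs a hedged forecast.
monthly_sales = 10                    # one good month (the known data)
naive_annual = monthly_sales * 12     # 120 cars: the trend blindly extended

# A cautious forecast discounts for seasonality, supply and demand
# saturation; the 0.4 persistence factor is invented for illustration.
assumed_persistence = 0.4
hedged_annual = round(naive_annual * assumed_persistence)

print(naive_annual, hedged_annual)    # 120 vs 48
```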

Xanax (Alprazolam), a fast-acting benzodiazepine.  It is marketed as an anti-anxiety medication.

Lindsay Lohan released the track Xanax in 2019.  With a contribution from Finnish pop star Alma (Alma-Sofia Miettinen; b 1996), the accompanying music video was said to be "a compilation of vignettes of life", Xanax reported as being inspired by Ms Lohan's "personal life, including an ex-boyfriend and toxic friends".  Structurally, Xanax was quoted as being based around an interpolation of "Better Off Alone" by Dutch Eurodance-pop collective Alice Deejay, slowed to a Xanax-appropriate tempo.


Lindsay Lohan risked going straight to Hell by creating a promotional meme featuring Pope Francis (b 1936; Roman Catholic Pope since 2013).  Cryptically captioned Blessed Be The Fruit, it included an image of the art-work used for her debut album Speak (2004).  Given the problems he's expected to manage, solve or conceal (depending on the circumstances), most would forgive the pope if he popped the odd Xanax.


The original photograph (top left) was taken in 2013 during a mass conducted in the Catedral Basílica do Santuário Nacional de Nossa Senhora Aparecida (Cathedral Basilica of the National Shrine of Our Lady Aparecida) in Aparecida, Brazil.  His Holiness was at the time administering communion.  It has since proved a popular photograph for meme-makers interpolating optical discs.

Xanax by Lindsay Lohan

I don't like the parties in LA, I go home
In a bad mood, pass out, wake up alone
Just to do it all over again, oh
Looking for you

Only one reason I came here
Too many people, I can't hear
Damn, I got here at ten
Now it's 4 AM

I can't be in this club
It's too crowded and I'm fucked
Ain't nobody here for love
Ain't nobody care about us
I got social anxiety, but you're like Xanax to me, yeah
Social anxiety, when you kiss me, I can't breathe
No, I can't be in this club
It's too crowded and I'm fucked
Ain't nobody here for love
Ain't nobody care 'bout us
 
I got social anxiety, but you're like Xanax to me, yeah
Social anxiety, when you kiss me, I can't breathe, yeah
 
But you're like Xanax to me
When you kiss me, I can't breathe
 
I try to stay away from you, but you get me high
Only person in this town that I like
Guess I can take one more trip for the night
Just for the night
 
Only one reason I came here
Too many people, I can't hear
Damn, I got here at ten
Now it's 4 AM
 
I can't be in this club
It's too crowded and I'm fucked
Ain't nobody here for love
Ain't nobody care about us
I got social anxiety, but you're like Xanax to me, yeah
Social anxiety, when you kiss me, I can't breathe
No, I can't be in this club
It's too crowded and I'm fucked
Ain't nobody here for love
Ain't nobody care 'bout us
 
I got social anxiety, but you're like Xanax to me, yeah
Social anxiety, when you kiss me, I can't breathe, yeah
 
But you're like Xanax to me
When you kiss me, I can't breathe
 
But you're like Xanax to me
When you kiss me, I can't breathe

Xanax lyrics © Universal Music Publishing Group


Wednesday, November 18, 2020

Epitaph

Epitaph (pronounced ep-i-taf or ep-i-tahf)

(1) A commemorative inscription on a tomb or mortuary monument about the person buried at that site.

(2) A brief poem or written passage composed in commemoration of a dead person.

(3) A final judgment on a person or thing.

(4) To commemorate in or with an epitaph.

(5) To write or speak after the manner of an epitaph. 

1350–1400: From the Middle English epitaphe (inscription on a tomb or monument), from the twelfth century Old French epitaphe, from the Latin epitaphium (funeral oration, eulogy), from the Ancient Greek epitáphion (over or at a tomb; a funeral oration), a noun use of the neuter of ἐπιτάφιος (epitáphios) ((words) spoken on the occasion of a funeral), the construct being epi- (from the Ancient Greek ἐπί (epí) (at, over; on top of; in addition to (in a special use in chemistry, it denotes an epimeric form))) + τάφος (táphos) (tomb) + -ion (the noun-adjectival suffix).  Táphos (tomb, burial, funeral) was related to taphē (interment) & tháptō (to bury), of uncertain origin.  It has long been thought derived (like the Armenian damban (tomb)) from the primitive Indo-European root dhembh- (to dig, bury) but recent scholarship has cast doubts and some etymologists suggest both the Armenian and Greek could be borrowings.  There were equivalent words in the Old English and regional variations were many; the one which survived longest was byrgelsleoð.

The companion words, which differ not only in nuance but in convention of use, include eulogy (an oration about the dead, delivered usually at a funeral or memorial service), obituary (something in written form published soon after death which provides a potted biography) and epigraph (a quote engraved on a tombstone, variously plaintive, humorous or barbed).  Not quite the same but very much to the point is the Latin hic jacet (literally "here lies").  Epitaph is a noun and verb (used with object), epitaphic, epitaphial, epitaphed & epitaphless are adjectives, epitaphically is an adverb and epitaphist is a noun.  The noun plural is epitaphs.

Jonathan Swift's marble memorial, St Patrick's Cathedral, Dublin.

One of the most celebrated epitaphs in English was saeva indignatio (literally “savage indignation”) which appeared on the tomb of the delightfully wicked Anglo-Irish satirist & poet Jonathan Swift (1667-1745), expressing a resigned contempt at human folly.  Swift is probably best remembered for Gulliver's Travels (1726) but it was A Modest Proposal (1729) which defined the genre of satire and work in this vein is often still labeled "Swiftian".  Swift started his political life as a Whig but ended it a Tory, becoming an Anglican cleric who was appointed Dean of St Patrick's Cathedral, Dublin.

Swift not only wrote his own epitaph but left instructions also for the stonemason and the authorities of Saint Patrick’s Cathedral, the memorial to be rendered in black marble, mounted seven feet from the ground, the large letters to be deeply cut and strongly gilded.  His specifications were followed but the stridency of Swift's Latin displeased a few who, finding it harsh or inelegant, didn't always reproduce it with complete fidelity.  The translation into modern English is Here is laid the body of Jonathan Swift.....where savage indignation can no longer tear his heart. Depart, wayfarer, and imitate if you can a man who to his utmost strenuously championed liberty.  Fellow Irish poet William Butler Yeats (1865–1939) rendered it as the punchier Swift has sailed into his rest; savage indignation there cannot lacerate his breast.  Imitate him if you dare, world-besotted traveller; he served human liberty.

Epitaph (1990) by Charles Mingus (CBS–466631 2).

Charles Mingus (1922–1979) was an American double bassist, pianist, composer and bandleader and one of the seminal figures in jazz.  Although lauded for the way his bands would interpolate passages of collective improvisation into performance pieces, he was influential also in his structured compositions, some of which were, by the standards of the genre, unusually long.  None however matched his Epitaph, comprising over four-thousand measures (a grouping of beats, which indicates the meter of a particular piece of music) and demanding more than two hours to perform, ranking with epic-length pieces such as Wynton Marsalis’s (b 1961) Blood On The Fields (1997) and Carla Bley’s (b 1936) Escalator Over The Hill (1968-1971); only Wadada Leo Smith’s (b 1941) sprawling Ten Freedom Summers (2012), unfolding over five hours, runs longer.

It’s not clear how long Mingus worked on Epitaph and its gestation may have absorbed as long as Ten Freedom Summers (thirty-four years in the making) because fragments of Epitaph were performed as early as 1962 although whether it was then envisaged as what it became is unknown.  It was only after his death, while Mingus’s work was being catalogued, that the whole of Epitaph was assembled and the score compiled.  This enabled the piece to be performed in 1989 by a thirty-piece orchestra, conducted by Gunther Schuller (1925-2015) and produced by Mingus's widow, Sue Graham Mingus (b circa 1933).  It has since had a number of performances, several in 2007, and the complete score has been published.

Lindsay Lohan reading the epitaphs, graveyard scene in I know who killed me (2007).

Epitaph, full of melodies, is rewarding and not entirely unfamiliar because Mingus over the years included several snatches in live recordings and concerts performed with smaller bands, playfully sampling the music of a few others in sections although that's not typical of Epitaph, a work all have noted for its originality.  A two-hour suite for thirty-one musicians is not necessarily unwieldy but Epitaph is complicated and really demands a band both familiar with each other and well-rehearsed.  It's not the sort of piece suited to an ensemble, however virtuosic, assembled for a one-off performance and the definitive performance which one day will be released will likely have been carefully edited and polished from any number of studio sessions.  Technically, it's challenging for a conductor: there are shifts between melodic strains which sometimes are sudden and sometimes overlap, parts apparently unresolved skid to a stop, tempos pick up at various paces and there's an underlying cross-talking between extreme-register instruments; doubtless it's no less difficult for the musicians, two pianists, two bassists, a drummer and two percussionists needing peacefully to co-exist although, this being Mingus, creative tension lives between the notes.  Even once détente was established however, there's still the piece itself to conquer, not all of it in the familiar language of jazz, for there are vertiginous jumps in register, fast phrases slurring effortlessly to the languid and sometimes the jar of the polytonality of which American composers of the twentieth century were so fond.  Critics and other aficionados of the art were enchanted but it's suspected there were those who dipped in and out of their CD and listened just to the bits they liked.

Saturday, November 29, 2025

Grammatology

Grammatology (pronounced gram-uh-tol-uh-jee)

(1) Historically, the scientific study of systems of writing.

(2) In latter-day use, a critique of orthodox linguistics.

Early 1800s (in its original sense): The construct was gramma(r) + -t- + -ology; the modern (some would say post-modern) re-purposing was first used in 1967.  Dating from the mid fourteenth century, grammar was from the Middle English gramery & gramere, from the Old French gramaire (classical learning), from the unattested Vulgar Latin grammāria, an alteration of the Classical Latin grammatica, from the Ancient Greek γραμματική (grammatikḗ) (skilled in writing), from γράμμα (gramma) (line of writing), from γράφω (gráphō) (write), from the primitive Indo-European gerbh- (to carve, to scratch).  It displaced the native Old English stæfcræft; a doublet of glamour, glamoury, gramarye & grimoire.  In English, grammar is used to describe the system of rules and principles for the structure of a language (or of languages in general) but in colloquial use it's applied also to morphology (the internal structure of words) and syntax (the structure of phrases and sentences of a language).  In English, generative grammar (the body of rules producing all the sentences permissible in a given language, while excluding all those not permissible) has for centuries been shifting and it's now something policed by the so-called "grammar Nazis", some of whom insist on enforcing "rules" regarded by most as defunct as early as the nineteenth century.

The suffix -ology was formed from -o- (as an interconsonantal vowel) + -logy.  The origin in English of the -logy suffix lies with loanwords from the Ancient Greek, usually via Latin and French, where the suffix (-λογία) is an integral part of the word loaned (eg astrology from astrologia) since the sixteenth century.  French picked up -logie from the Latin -logia, from the Ancient Greek -λογία (-logía).  Within Greek, the suffix is an -ία (-ía) abstract from λόγος (lógos) (account, explanation, narrative), and that a verbal noun from λέγω (légō) (I say, speak, converse, tell a story).  In English the suffix became extraordinarily productive, used notably to form names of sciences or disciplines of study, analogous to the names traditionally borrowed from the Latin (eg astrology from astrologia; geology from geologia) and by the late eighteenth century, the practice (despite the disapproval of the pedants) extended to terms with no connection to Greek or Latin such as those building on French or German bases (eg insectology (1766) after the French insectologie; terminology (1801) after the German Terminologie).  Within a few decades of the intrusion of modern languages, combinations emerged using English terms (eg undergroundology (1820); hatology (1837)).  In this evolution, the development may be thought similar to the latter-day proliferation of "-isms" (fascism; feminism et al).  Grammatology & grammatologist are nouns, grammatological is an adjective and grammatologically is an adverb; the noun plural is grammatologies.

Google ngram (a quantitative and not qualitative measure): Because of the way Google harvests data for their ngrams, they're not literally a tracking of the use of a word in society but can be usefully indicative of certain trends (although one is never quite sure which trend(s)), especially over decades.  As a record of actual aggregate use, ngrams are not wholly reliable because: (1) the sub-set of texts Google uses is slanted towards the scientific & academic and (2) the technical limitations imposed by the use of OCR (optical character recognition) when handling older texts of sometimes dubious legibility (a process AI should improve).  Where numbers bounce around, this may reflect either: (1) peaks and troughs in use for some reason or (2) some quirk in the data harvested.

Grammatology in its re-purposed sense was from the French grammatologie, introduced to the world by French philosopher Jacques Derrida (1930-2004) in his book De la grammatologie (Of Grammatology (1967)).  It may be unfair to treat Derrida’s use as a “re-purposing” because although the word grammatology (literally “the study of writing”) had existed since the early nineteenth century, it was a neologism, one of an expanding class of “-ology” words (some of them coined merely for ironic or humorous effect) and there was prior to 1967 scant evidence of use, those studying languages, literature or linguistics able satisfactorily to undertake their work without much needing “grammatology”.  On the basis of the documents thus far digitized, “grammatology” was never an accepted or even commonly used term in academia and although it seems occasionally to have been used variously in fields related to “the study of writing systems” (apparently as a synonym for paleography, epigraphy, writing-system classification or orthographic description) it was only in passing.  Until the modern era, words “going viral” happened relatively infrequently and certainly slowly and, as used prior to 1967, “grammatology” was attached to no theoretical construct or school of thought and described no defined discipline, the word indicative, empirical and neutral.  If “pre-modern” grammatology could be summed up (a probably dubious exercise), it would be thought a technical term for those concerned with scripts, alphabets, symbols and the historical development of writing systems.  Tempting though it may seem, it cannot be thought of as proto-structuralism.

The novelty Derrida introduced was to argue the need for a discipline examining the history, structure and philosophical implications of writing, his particular contention that writing is not secondary to speech, a notion at odds with centuries of Western metaphysics.  At the time, it was seen as a radical departure from orthodoxy, Derrida exploring (in the broadest imaginable way), the possibilities of writing, not simply the familiar physical inscriptions, but anything that functions as “trace,” “differance,” or symbolic marking, the core argument being writing is not secondary to speech (although in the narrow technical sense it may be consequent); rather, it reveals the instability and “constructedness” of language and thereby meaning.

De la grammatologie (First edition, 1967) by Jacques Derrida.

Ambitiously, what Derrida embarked upon was to do to the study of writing something like what Karl Marx (1818-1883) claimed to have done to the theories of Hegel (Georg Wilhelm Friedrich Hegel (1770-1831)): "turn things on their head", a process that can be classified under four themes: (1) Writing as prior to speech (as opposed to the earlier "writing is derivative of speech").  What this meant was writing had to be considered as "originary", implying structures of difference could precede both writing and speech.  (2) Writing (the act as opposed to the content) as a philosophical concept rather than a finite collection of technical objects to be interpreted or catalogued on the basis of their form of assembly.  (3) Grammatology becomes a critique (as opposed to the earlier descriptive tool) of science, reimagining it as a critical discipline exposing the logocentrism of Western thought.  Logocentrism describes the tendency to prioritize "logos" (in academic use a word encompassing words, speech or reason) as the ultimate foundation for truth and meaning (with speech often privileged over writing).  Logocentrism was at the core of the Western philosophical tradition which assumed language can accurately and directly express an external reality, the companion notion being that rational thought represents the highest form of knowledge.  Derrida labelled this a false hierarchy which devalued writing and other non-verbal forms of communication and feeling.  (4) Writing is expanded beyond literal inscriptions.  Whereas the traditional Western view had been that writing was simply the use of an alphabet, cuneiform, hieroglyphs and such, what Derrida suggested was the concept of writing should be extended to any system of differences, traces or marks; the condition for meaning itself.

So Derrida took grammatology from a dusty corner of the academy where it meant (for the small number of souls involved) something like "a hypothetical technical study of writing systems" and re-invented it as a philosophical discipline analysing the deeper structures that make any representation or meaning possible.  The notion of it as a tool of analysis is important because deconstruction, the word Derrida and other "celebrity philosophers" made famous (or infamous, depending on one's stance on things postmodern), is often misunderstood as something like "destruction" when really it is a form of analysis.  Had Derrida's subversive idea been presented thirty years earlier (had the author been able to find a publisher), it's possible it would have been ignored or dismissed by the relatively few who then read such material.  However, in the post-war years there was an enormous expansion in both the number of universities and the cohorts of academics and students studying in fields which would come to be called "critical theory" so there was a receptive base for ideas overturning orthodoxy, thus the remarkable path deconstruction and postmodernism for decades tracked.

Deconstruction in art, Girl With Balloon by street artist Banksy, before, during & after a (successful) test deconstruction (left) and in its final form (right), London, October 2018.

There is an ephemeral art movement but usually it involves works which wholly are destroyed or entirely disappear.  Banksy's Girl With Balloon belonged to a sub-category where (1) the deconstruction process was part of the art and (2) the residual elements were "the artwork".  Banksy's trick with this one was that as the auctioneer's hammer fell (at Stg£1m), an electric shredder concealed in the base of the frame was activated, the plan being to reduce the work "to shreds" in a pile below.  However, it's claimed there was a technical glitch and the shredder stopped mid-shred, meaning half remained untouched and half, neatly sliced, hung from the bottom.  As a headline-grabbing stunt it worked well but the alleged glitch worked better still, art experts mostly in agreement the work as "half shredded" was more valuable than had it been "wholly shredded" and certainly more than had it remained untouched in the frame.  Thus: "meaning is just another construct which emerges only through differences and deferrals".

From a distance of sixty-odd years, in the milieu of the strands of thought which are in a sense part of a "new orthodoxy", it can be hard to understand just what an impact Derrida and his fellow travellers (and, just as significantly, his critics) had and what an extraordinary contribution deconstruction made to the development of thought in so many fields.  Derrida in 1967 of course did not anticipate the revolutionary movement he was about to trigger, hinted at by his book starting life as a doctoral thesis entitled: De la grammatologie: Essai sur la permanence de concepts platonicien, aristotélicien et scolastique de signe écrit (Of Grammatology: Essay on the Permanence of Platonic, Aristotelian and Scholastic Concepts of the Written Sign).  A typically indigestible title of the type beloved by academics, the clipping for wider distribution was on the same basis as Adolf Hitler's (1889-1945; Führer (leader) and German head of government 1933-1945 & head of state 1934-1945) publisher deciding Mein Kampf (My Struggle) was snappier than Viereinhalb Jahre (des Kampfes) gegen Lüge, Dummheit und Feigheit (Four and a Half Years [of Struggle] Against Lies, Stupidity and Cowardice).  There's a reason authors usually don't have the final say on titles and cover art.

Derrida acknowledged linguistics in the twentieth century had become a sophisticated form of study but maintained the discipline was failing to examine its most fundamental assumptions; indeed his point was those core values couldn't be re-evaluated because they provided the framework by which language was understood.  What Derrida identified as the superstructure which supported all was the commitment to the primacy of speech and presence and, because the prevailing position in linguistics was that speech was primary, the assumption worked to shape all that followed.  It was the influence of the Swiss philosopher & semiotician Ferdinand de Saussure (1857–1913) which was profound in positioning speech as the natural, original, living form of language with writing as a secondary, derivative (and, in a sense, artificial although this was never wholly convincing) representation of speech.  What made the Saussureian position seem compelling was it sounded logical, given the consensus it was human speech which predated the development of writing, the latter thus the product of the former, and so persuasive was the thesis the hierarchy came to provide the framework for other disciplines within linguistics including phonology (the study of the way sounds function in languages) and morphology (the study of the internal structure of morphemes, the smallest linguistic units within a word able to carry a meaning).  What this meant was syntax was also defined by speech (writing a mere convenient means of exchange) with phonetics (the study of the physical sounds of human speech) the true source of the material of language.  Thus for generations, in academic discourse, historical linguistics were documented primarily by an analysis of changes in sound, with orthography (the methods by which a language or its sounds are represented by written symbols) a mechanical by-product.

Deconstruction in fashion.  Lindsay Lohan in Theia gown, amfAR gala, New York City, February 2013 (left) and after “deconstruction by scissors” (right).

All gowns are “constructed” (some 3D printed or even “sprayed-on”) but sometimes circumstances demand they be “deconstructed”.  On the night, the shimmering nude and silver bugle-beaded fringe gown from Theia’s spring 2011 collection was much admired but there was an “unfortunate incident” (ie the fabric was torn) and, apparently using a pair of scissors, there was some ad-hoc seamstressery to transform the piece into something described as a “mullet minidress”.  That turned out to be controversial because the gown was on loan for the night but such things are just part of the cost of doing business and, with its Lohanic re-imagining, it’s now an artefact.

Derrida didn’t dispute the historic timelines; his point was that in defining linguistics based on this hierarchy, it became impossible to question the orthodoxy from within.  In a classic example of how deconstruction works, he argued the hierarchy was based not on the historical sequence of events (ie writing coming after speech) but was a culturally defined attachment to the idea of presence, voice and authentic meaning; with speech entrenched in its primacy, no discipline within linguistics was able fully to study writing because of this structural prejudice positioning writing as an auxiliary system, a mere notation of sounds encoding the pre-existing spoken language.  That didn’t mean writing couldn’t be studied (as for centuries it had been) but that it could be considered only a tool or artefact used to record speech and never a primary object of meaning.  While there were all sorts of reasons to be interested in writing, for the reductionists who needed to get to the essence of meaning, writing could only ever be thought something mechanistic and thus was philosophically uninteresting.  So, if linguistics was unable to analyse writing as (1) a structure independent of speech, (2) a fundamental element of thought processes, (3) a source of new or changed meanings or (4) a construct where cultural and philosophical assumptions are revealed, that would imply only speech could create meaning with writing a mere form of its expression.  Daringly thus, what Derrida demanded was for writing to be seen as conceptually prior to speech, even if as a physical phenomenon it came later.  In 1967, linguistics couldn’t do that while maintaining the very foundations on which it was built.

Never has there been published a "Grammatology for Dummies" but there is The Complete Idiot's Guide to Literary Theory and Criticism (2013) by Dr Steven J. Venturino.

At this point things became more technical but Derrida did provide a simplified model, explaining linguistics existed as the study of signs and not of traces, his work depending ultimately on certain distinctions: (1) signs assume stable signifieds and (2) traces imply meaning is always deferred but never present.  For orthodox linguistics to work, the assumption had to be that signs enjoy a stability of meaning within a system; this Derrida dismissed as illusory, arguing (1) meaning is just another construct which emerges only through differences and deferrals, (2) no signified is ever (or can ever fully be) "present" and (3) speech is no closer to meaning than writing.  By its own definitions in 1967, linguistics could not accommodate that because (1) its methods depended on systematic relations sufficiently stable to permit analysis, (2) it needed constant objects (definable units such as phonemes, morphemes and syntactic structures) and (3) it relied on signs which could be described with the required consistency (ie "scientifically").  Any approach grounded in trace and difference lay beyond the boundaries of orthodox linguistics.

So the conflict would seem irreconcilable but that's true only if viewed through the lens of a particular method; really, linguistics was empirical and grammatology was philosophical and in that sense they were alternative rather than competing or even parallel paths.  If linguistics was a system of codification, then grammatology was a critique of the foundations of linguistics and Derrida made clear he was not attempting to reform linguistics, simply because that couldn't be done; any attempt to interpolate his ideas into the discipline would have meant it ceased to be linguistics.  He wanted a new discipline, one which, rather than empirically describing and categorising language and its elements, stood back and asked what in the first place made such systems possible.  That meant it was a transcendental rather than empirical process, one studying the conditions of representation and the metaphysics implicit in the idea of signification.  Writing thus was not merely marks on a surface but a marker of a difference in being.

The twist in the tale is that although De la grammatologie was highly influential (especially after an English edition appeared in 1976), grammatology never became a defined, institutionalised academic field in the way Derrida envisioned it, at least supplementing departments of linguistics, anthropology and philosophy.  That was due less to the well-documented phenomenon of institutional inertia than to it proving impossible for any consensus to be reached about what exactly "grammatological analysis" was or what constituted "grammatological research".  Pleasingly, it was the structuralists who could account for that by explaining grammatology was a critique of the metaphysics underlying other disciplines rather than a method for generating new empirical knowledge.  Fields, they noted, were likely organically to grow as the tools produced were picked up by others to be applied to tasks; grammatology was a toolbox for dismantling tools.

Jacques Derrida with pipe, deconstructing some tobacco.

Even if Derrida's concepts proved sometimes too vague even for academics, the influence was profound and, whether as a reaction or something deterministic (advances in computer modelling, neurology and such), the discipline of linguistics became more rather than less scientific, the refinements in the field of generative grammar in particular seen as something of a "doubling down" of resistance to Derrida's critique, something reflected too in anthropology, which came even more to value fieldwork and political economy, philosophical critiques of writing thought less helpful.  So the specialists not only clung to their speciality but made it more specialized still.  Grammatology did however help create genuinely new movements in literary theory, the most celebrated (and subsequently derided) being deconstruction, where Derrida's ideas, such as interpretation being an infinite play of differences and the meaning of texts being inherently unstable, created one of the more radical schools of thought in the post-war West, introducing to study concepts such as the paratext (how academics "read between and beyond the lines"), the trace (the mark of something absent, a concept that disrupts the idea of pure presence and self-contained meaning) and marginalia (used here as an abstract extension of what an author may have "written in the margins" to encompass that which may seem secondary to the main point but is actually crucial to understanding the entire structure of thought, blurring the (literal) line between what lies inside and outside a text).

Derrida for Beginners (2007) by Jim Powell (illustrated by Van Howell).  One has to start somewhere.

The movement became embedded in many English and Comparative Literature departments as well as in post-structuralism and Continental philosophy.  Modern beasts like media studies & cultural theory are (in their understood form) unthinkable without deconstruction and if grammatology didn't become "a thing", its core elements (difference, trace etc) for decades flourished (sometimes to the point of (published) absurdity) and although not all agree, some do argue it was Derrida's subversion in 1967 which saw the field of semiotics emerge to "plug the gaps" left by the rigidity of traditional linguistics.  Of course, even if grammatology proved something of a cul-de-sac, Derrida's most famous fragment, "Il n'y a pas de hors-texte" (literally "there is no outside-text"), endured to underpin deconstruction and postmodernism generally.  Intriguingly for a concept from linguistics, the phrase took on a new life in the English-speaking world where it came to be understood as "everything is text", an interpretation which created a minor publishing industry.  In English, it's a marvellously literalist use and while it does to an extent overlap with the author's original intention, Derrida meant there is (1) no access to pure, unmediated presence and (2) no meaning outside interpretation and no experience outside context.  In using texte he was referring to the interplay of differences, traces, relations and contexts that make meaning possible (ie not literally the words as they appear on a page).  What that meant was all acts were "textual" in that they must be interpreted and are intelligible only within systems of meaning; the phrase a philosophical statement about signification and mediation, not characters printed on a page.

Fiveable's diagram of what we need to know to understand literature.  Hope this helps.

However, demonstrating (in another way) the power of language, the "everything is text" movement ("cult" may once have been a better word) in English came to be understood as meaning no reality could exist beyond language; everything (literally!) is text because it is words and discourse which both construct and describe reality.  That notion might have remained in an obscure ivory tower were it not for the delicious implication that values such as right & wrong and true & false are also pieces of text with meanings able to be constructed and deconstructed.  That meant there was no stable "truth" and nothing objectively was "wrong"; everything just a construct determined by time, place and circumstances.  That Derrida never endorsed this shocking relativism was noted by some but academics and students found so intoxicating the notion of right & wrong being variables that "everything is text" took on a life of its own as a kind of selective nihilism which is, of course, quite postmodern.  Again, language was responsible because the French texte was from the Latin textus, from texō (weave) and while in French it can mean "text" (in the English sense), among philosophers it was used metaphorically to suggest "weave together"; "an interconnected structure" in the sense of the Latin textus (woven fabric); it was this meaning Derrida used.  Had the English-speaking world remained true to the original spirit of Il n'y a pas de hors-texte it would have entered the textbooks as something like "there is nothing outside the interplay of signs and contexts; there is no meaning outside systems of interpretation" and perhaps have been forgotten but "everything is text" defined and seduced a movement.  Thus, it can be argued things either were "lost in translation" or "transformed by translation" but for the neo-Derridaists there's the satisfaction of knowing the meaning shift was an example of "grammatology in action".

Saturday, February 17, 2024

Algorithm

Algorithm (pronounced al-guh-rith-um)

(1) A set of rules for solving a problem in a finite number of steps.

(2) In computing, a finite set of unambiguous instructions performed in a prescribed sequence to achieve a goal, especially a mathematical rule or procedure used to compute a desired result.

(3) In mathematics and formal logic, a recursive procedure whereby an infinite sequence of terms can be generated.

1690s: From the Middle English algorisme & augrym, from the Anglo-Norman algorisme & augrim, from the French algorithme, re-fashioned (under a mistaken connection with the Greek αριθμός (arithmos) (number)) from the Old French algorisme (the Arabic numeral system), from the Medieval Latin algorismus, a (not untypical) mangled transliteration of the Arabic الخَوَارِزْمِيّ (al-Khwārizmiyy), the nisba (the part of an Arabic name consisting of a derivational adjective) of the ninth century Persian mathematician Muḥammad ibn Mūsā al-Khwārizmī, a toponymic name meaning "person from Chorasmia" (native of Khwarazm (modern Khiva in Uzbekistan)).  It was Muḥammad ibn Mūsā al-Khwārizmī's works which introduced to the West some sophisticated mathematics (including algebra).  The earlier form in Middle English was the thirteenth century algorism from the Old French and in English, it was first used in about 1230 and then by the English poet Geoffrey Chaucer (circa 1344-1400) in 1391.  English adopted the French term, but it wasn't until the late nineteenth century that algorithm began to assume its modern sense.  Before that, by 1799, the adjective algorithmic (the construct being algorithm + -ic) was in use and the first use in reference to symbolic rules or language dates from 1881.  The suffix -ic was from the Middle English -ik, from the Old French -ique, from the Latin -icus, from the primitive Indo-European -kos & -os, formed with the i-stem suffix -i- and the adjectival suffix -kos & -os.  The form existed also in the Ancient Greek as -ικός (-ikós), in Sanskrit as -इक (-ika) and the Old Church Slavonic as -ъкъ (-ŭkŭ); a doublet of -y.  In European languages, adding -kos to noun stems carried the meaning "characteristic of, like, typical, pertaining to" while on adjectival stems it acted emphatically; in English it's always been used to form adjectives from nouns with the meaning "of or pertaining to".  A precise technical use exists in physical chemistry where it's used to denote certain chemical compounds in which a specified chemical element has a higher oxidation number than in the equivalent compound whose name ends in the suffix -ous (eg sulphuric acid (H₂SO₄) has more oxygen atoms per molecule than sulphurous acid (H₂SO₃)).  The noun algorism, from the Old French algorisme, was an early alternative form of algorithm; algorismic was a related form.  The meaning broadened to any method of computation and from the mid twentieth century became especially associated with computer programming to the point where, in general use, this link is often thought exclusive.  The spelling algorism has been obsolete since the 1920s.  Algorithm, algorithmist, algorithmizability, algorithmocracy, algorithmization & algorithmics are nouns, algorithmize is a verb, algorithmic & algorithmizable are adjectives and algorithmically is an adverb; the noun plural is algorithms.

Babylonian and later algorithms

An early Babylonian algorithm in clay.

Although there is evidence multiplication algorithms existed in Egypt (circa 1700-2000 BC), a handful of Babylonian clay tablets dating from circa 1800-1600 BC are the oldest yet found and thus record the world's first known algorithms.  The calculations described on the tablets are not solutions to specific individual problems but a collection of general procedures for solving whole classes of problems.  Translators consider them best understood as an early form of instruction manual.  When translated, one tablet was found to include the still familiar "This is the procedure", a phrase the essence of every algorithm.  There must have been many such tablets but there's a low survival rate of stuff from 40 centuries ago not regarded as valuable.
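By way of illustration, one procedure often attributed to Babylonian practice (the divide-and-average square-root approximation implied by tablet YBC 7289's remarkably accurate value for the square root of 2) survives essentially unchanged; below is a minimal sketch in Python, the loop being a modern convenience where the tablets record the steps as worked examples.

```python
# The Babylonian (divide-and-average) square-root procedure: repeatedly
# average a guess with n/guess until the two converge.
def babylonian_sqrt(n, guess=1.0, steps=6):
    for _ in range(steps):
        guess = (guess + n / guess) / 2   # "This is the procedure"
    return guess

print(babylonian_sqrt(2))   # ~1.4142135..., matching the tablet's value
```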

So associated with computer code has the word "algorithm" become that it's likely a goodly number of those hearing it assume this was its origin and that any instance of use happens in software.  The use in this context, while frequent, is not exclusive but the general perception might be that it's just that.  It remains technically correct that almost any set of procedural instructions can be dubbed an algorithm but, given the pattern of use from the mid-twentieth century, to do so would likely mislead or confuse many who might assume they were being asked to write the source code for software.  Of course, the sudden arrival of mass-market generative AI (artificial intelligence) has meant anyone can, in conversational (though hopefully unambiguous) text, ask their tame AI bot to produce an algorithm in the syntax of the desired coding language.  That is passing an algorithm (using the structures of one language) to a machine which interprets the text and converts it to language in another structure, something programmers have for decades been doing for their clients.

A much-distributed general purpose algorithm (really more of a flow-chart) which seems so universal it can be used by mechanics, programmers, lawyers, physicians, plumbers, carpet layers, concreting contractors and just about anyone whose profession is object or task-oriented.   

The AI bots have proved especially adept at such tasks.  While a question such as: "What were the immediate implications for Spain of the formation of the Holy Alliance?" produces varied results from generative AI which seem to range from the workmanlike to the inventive, when asked to produce computer code the results seem usually to be in accord with a literal interpretation of the request.  That shouldn't be unexpected; a discussion of early nineteenth century politics in the Iberian Peninsula is by its nature going to be discursive while the response to a request for code to locate instances of split infinitives in a text file is likely to vary little between AI models.  Computer languages of course impose a structure where syntax needs exactly to conform to defined parameters (even the most basic of the breed, such as that PC/MS-DOS used for batch files, was intolerant of a single missing or mis-placed character) whereas something like the instructions to make a cup of tea (which is an algorithm even if not commonly thought of as one) greatly can vary in form even though the steps and end results can be the same.
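For instance, the split-infinitive request above might come back from a bot as something like the minimal sketch below (the file name is hypothetical and the pattern, an -ly adverb wedged between "to" and a following word, is a crude heuristic which will both miss cases and over-match; a serious tool would need part-of-speech tagging).

```python
# A crude split-infinitive finder: flag "to <adverb ending in -ly> <word>".
import re

PATTERN = re.compile(r"\bto\s+(\w+ly)\s+(\w+)", re.IGNORECASE)

def find_split_infinitives(path):
    """Print each line containing a naive split-infinitive match."""
    with open(path, encoding="utf-8") as handle:
        for line_no, line in enumerate(handle, start=1):
            for match in PATTERN.finditer(line):
                print(f"line {line_no}: ...{match.group(0)}...")

find_split_infinitives("sample.txt")   # hypothetical input file
```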

An example of a "how to make a cup of tea" algorithm.  This is written for a human and thus contains many assumptions of knowledge; one written for a humanoid robot would be much longer and include steps such as "turn cold tap clockwise" and "open refrigerator door".
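Rendered (loosely) as runnable code, the human version of the tea algorithm might look like the sketch below: each step is a stub which merely announces itself, and it is exactly these stubs a robot's version would have to expand into many lower-level instructions.

```python
# The "cup of tea" algorithm with the human's assumed knowledge left as
# stubs; a robot would need each print expanded into explicit sub-steps.
def make_tea(milk=True, sugars=0):
    print("Fill kettle with cold water")    # assumes a tap and a kettle
    print("Boil kettle")
    print("Put tea bag in cup")
    print("Pour boiling water; steep for three minutes")
    print("Remove tea bag")
    if milk:
        print("Add milk")                   # assumes a stocked refrigerator
    for _ in range(sugars):
        print("Add a spoon of sugar and stir")

make_tea(milk=True, sugars=1)
```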

The so-called "rise of the algorithm" is something that has attracted much comment since social media gained critical mass; prior to that, algorithms had been used increasingly in all sorts of places but it was the particular intimacy social media engenders which meant awareness increased and perceptions changed.  The new popularity of the word encouraged the coining of derived forms, some of which were originally (at least to some degree) humorous but beneath the jocularity, many discovered the odd truth.  An algorithmocracy describes a "rule by algorithms", a critique in political science which discusses the implications of political decisions being made by algorithms, something which in theory would make representative and responsible government not so much obsolete as unnecessary.  Elements of this have been identified in the machinery of government, such as the "Robodebt" scandal in Australia in which one or more algorithms were used to raise and pursue what were alleged to be debts incurred by recipients of government transfer payments.  Despite those in charge of the scheme and relevant cabinet ministers being informed the algorithm was flawed and there had been suicides among those wrongly accused, the politicians did nothing to intervene until forced by various legal actions.  While defending Robodebt, the politicians found it very handy essentially to disavow connection with the processes, which were attributed to the algorithm.

The feeds generated by Instagram, Facebook, X (formerly known as Twitter) and such are also sometimes described as algorithmocracies in that it's the algorithm which determines what content is directed to which user.  Activists have raised concerns about the way the social media algorithms operate, creating "feedback loops" whereby feeds become increasingly narrow and one-sided in focus, acting only to reinforce opinions rather than inform.  In fairness, that wasn't the purpose of the design, which was simply to keep the user engaged, thereby allowing the platform to harvest more of the product (the user's attention) they sell to consumers (the advertisers).  Everything else is an unintended consequence and an industry joke was that the word "algorithm" was used by tech company CEOs when they didn't wish to admit the truth.  A general awareness of that now exists but filter bubbles won't be going away; what the phenomenon did produce were the words algorithmophobe (someone unhappy or resentful about the impact of algorithms in their life) and algorithmophile (which technically should mean "a devotee or admirer of algorithms" but is usually applied in the sense of "someone indifferent to or uninterested in the operations of algorithms"), the latter represented by the great mass of consumers digitally bludgeoned into a state of acquiescent insensibility.
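The feedback loop is easy to caricature in a few lines; the toy simulation below (topics, weights and the engagement probability all invented for illustration) serves whichever topic currently ranks highest and boosts it whenever the user engages, and the feed duly narrows.

```python
# A toy filter-bubble simulation: engagement reinforces whatever is shown.
import random

random.seed(1)
weights = {"politics": 1.0, "sport": 1.0, "cooking": 1.0}

for _ in range(20):
    shown = max(weights, key=weights.get)   # serve the top-weighted topic
    if random.random() < 0.7:               # the user engages...
        weights[shown] *= 1.2               # ...and the topic is boosted

print(weights)   # one topic now dominates: a filter bubble in miniature
```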

Some of the products are fighting back: The Algorithm: How AI Decides Who Gets Hired, Monitored, Promoted, and Fired and Why We Need to Fight Back Now (2024) by Hilke Schellmann, pp 336, Hachette Books (ISBN-13: 978-1805260981).

Among nerds, there are also fine distinctions.  There is the subalgorithm (sub-algorithm seems not a thing), a (potentially stand-alone) algorithm within a larger one, a concept familiar in many programming languages as a "sub-routine", although distinct from a remote procedure call (RPC), which is a subroutine executed in a different address space.  The polyalgorithm (again, hyphens just not cool) is a set of two or more algorithms (or subalgorithms) with instructions for choosing among them in some way integrated; a sketch appears below.  A very nerdy dispute does exist within mathematics and computer science around whether an algorithm, at the definitional level, really does need to be restricted to a finite number of steps.  The argument can eventually extend to the very possibility of infinity (or types of infinity according to some) so it really is the preserve of nerds.  In real-world application, a program is an algorithm only if (even eventually) it stops; it need not have a middle but must have a beginning and an end.
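The promised sketch: two sorting subalgorithms plus the integrated rule for choosing between them (the 32-element threshold echoes hybrid sorts such as Timsort but the figure here is illustrative, not prescriptive).

```python
# A polyalgorithm: two sorting subalgorithms and a rule to pick one.
def insertion_sort(items):              # subalgorithm 1: wins on tiny inputs
    for i in range(1, len(items)):
        j = i
        while j > 0 and items[j - 1] > items[j]:
            items[j - 1], items[j] = items[j], items[j - 1]
            j -= 1
    return items

def merge_sort(items):                  # subalgorithm 2: wins on large inputs
    if len(items) <= 1:
        return items
    mid = len(items) // 2
    left, right = merge_sort(items[:mid]), merge_sort(items[mid:])
    merged = []
    while left and right:
        merged.append(left.pop(0) if left[0] <= right[0] else right.pop(0))
    return merged + left + right

def poly_sort(items):                   # the choosing rule: 32 is illustrative
    return insertion_sort(items) if len(items) < 32 else merge_sort(items)

print(poly_sort([5, 3, 8, 1, 9, 2]))    # -> [1, 2, 3, 5, 8, 9]
```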

There is also the mysterious pseudoalgorithm, something less suspicious than it may first appear.  Pseudoalgorithms exist usually for didactic purposes and will usually interpolate (sometimes large) fragments of a real algorithm but it may be in a syntax which is not specific to a particular (or any) programming language, the purpose being illustrative and explanatory.  Intended to be read by humans rather than a machine, all a pseudoalgorithm has to achieve is clarity in imparting information, the algorithmic component there only to illustrate something conceptual rather than be literally executable.  The pseudoalgorithm model is common in universities and textbooks and can be simplified because millions of years of evolution mean humans can do their own error correction on the fly.
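As an illustration of the genre, below is a classroom staple (binary search) transcribed into runnable Python with the sort of pseudoalgorithm a textbook might print carried in the comments; the didactic content is in the comments and the code merely happens to execute.

```python
# Binary search, with the textbook-style pseudoalgorithm kept alongside
# as comments.
def binary_search(sorted_items, target):
    low, high = 0, len(sorted_items) - 1
    while low <= high:                      # WHILE the search range is non-empty
        mid = (low + high) // 2             #   LET mid be the midpoint
        if sorted_items[mid] == target:
            return mid                      #   IF found, RETURN the position
        if sorted_items[mid] < target:
            low = mid + 1                   #   ELSE discard the half which
        else:                               #   cannot contain the target
            high = mid - 1
    return None                             # RETURN "not found"

print(binary_search([1, 3, 5, 7, 9], 7))    # -> 3
```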

Of the algorithmic

The Netflix algorithm in action: Lindsay Lohan (with body-double) during filming of Irish Wish (2024).  The car is a Triumph TR4 (1961-1967), one of the early versions with a live rear axle, a detail probably of no significance in the plot-line.

The adjective algorithmic has also emerged as an encapsulated criticism, applied to everything from restaurant menus, coffee shop décor and choices of typefaces to background music.  An entire ecosystem (Instagram, TikTok etc) has been suggested as the reason for this multi-culture standardization in which a certain "look, sound or feel" becomes "commoditised by acclamation" as the "standard model" of whatever is being discussed.  That critique has by some been dismissed as something reflective of the exclusivity of the pattern of consumption by those who form theories about what seem not very important matters; it's just they only go to the best coffee shops in the nicest parts of town.  In popular culture though the effect of the algorithmic is widespread, entrenched and well-understood and already the AI bots are using algorithms to write music which will be popular, needing (for now) only human performers.  Some algorithms have become well-known, such as the "Netflix algorithm" which presumably doesn't exist as a conventional algorithm might but is understood as the sets of conventions, plot-lines, casts and themes which producers know will have the greatest appeal to the platform.  The idea is nothing new; for decades hopeful authors who sent manuscripts to Mills & Boon would receive one of the more gentle rejection slips, telling them their work was very good but "not a Mills & Boon book".  To help, the letter would include a brochure which was essentially a "how to write a Mills & Boon book" guide and it included a summary of the acceptable plot lines, of which there were at one point reputedly some two dozen.  The "Netflix algorithm" was referenced when Falling for Christmas, the first fruits of Lindsay Lohan's three-film deal with the platform, was released in 2022.  It was an example of a blending of several genres (redemption, Christmas movie, happy ending etc) and the upcoming second film (Irish Wish) is of the "…always a bridesmaid, never a bride — unless, of course, your best friend gets engaged to the love of your life, you make a spontaneous wish for true love, and then magically wake up as the bride-to-be." school; plenty of familiar elements there so it'll be interesting to see if the algorithm was well-tuned.

Math of the elliptic curve: the Cox–Zucker machine can help.

Some algorithms have become famous and others can be said even to have attained a degree of infamy, notably those used by the search engines, social media platforms and such, the Google and TikTok algorithms much debated by those concerned by their consequences.  There is though an algorithm remembered as a footnote in the history of linguistic oddities and that is the Cox–Zucker machine, published in 1979 by Dr David Cox (b 1948) and Dr Steven Zucker (1949–2019).  The Cox–Zucker machine (which may be called the CZM in polite company) is used in arithmetic geometry and provides a solution to one of the many arcane questions which only those in the field understand but the title of the paper in which it first appeared (Intersection numbers of sections of elliptic surfaces) gives something of a hint.  Apparently it wasn't formally dubbed the Cox–Zucker machine until 1984 but, impressed by the phonetic possibilities, the pair had been planning joint publication of something as long ago as 1970 and undergraduate humor can't be blamed because they met as graduate students at Princeton University.  The convention in academic publishing is for authors' surnames to appear in alphabetical order and the temptation proved irresistible.