
Saturday, November 29, 2025

Grammatology

Grammatology (pronounced gram-uh-tol-uh-jee)

(1) Historically, the scientific study of systems of writing.

(2) In latter-day use, a critique of orthodox linguistics.

Early 1800s (in its original sense): The construct was gramma(r) + -t- + -ology; the modern (some would say post-modern) re-purposing was first used in 1967.  Dating from the mid fourteenth century, grammar was from the Middle English gramery & gramere, from the Old French gramaire (classical learning), from the unattested Vulgar Latin grammāria, an alteration of the Classical Latin grammatica, from the Ancient Greek γραμματική (grammatikḗ) (skilled in writing), from γράμμα (grámma) (line of writing), from γράφω (gráphō) (write), from the primitive Indo-European gerbh (to carve, to scratch).  It displaced the native Old English stæfcræft; a doublet of glamour, glamoury, gramarye & grimoire.  In English, grammar is used to describe the system of rules and principles for the structure of a language (or of languages in general) but in colloquial use it’s applied also to morphology (the internal structure of words) and syntax (the structure of phrases and sentences of a language).  In English, generative grammar (the body of rules producing all the sentences permissible in a given language, while excluding all those not permissible) has for centuries been shifting and it’s now something policed by the so-called “grammar Nazis”, some of whom insist on enforcing “rules” regarded by most as defunct as early as the nineteenth century.

The suffix -ology was formed from -o- (as an interconsonantal vowel) + -logy.  The origin in English of the -logy suffix lies with loanwords from the Ancient Greek, usually via Latin and French, where the suffix (-λογία) is an integral part of the word loaned (eg astrology from astrologia) since the sixteenth century.  French picked up -logie from the Latin -logia, from the Ancient Greek -λογία (-logía).  Within Greek, the suffix is an -ία (-ía) abstract from λόγος (lógos) (account, explanation, narrative), and that a verbal noun from λέγω (légō) (I say, speak, converse, tell a story).  In English the suffix became extraordinarily productive, used notably to form names of sciences or disciplines of study, analogous to the names traditionally borrowed from the Latin (eg astrology from astrologia; geology from geologia) and by the late eighteenth century, the practice (despite the disapproval of the pedants) extended to terms with no connection to Greek or Latin such as those building on French or German bases (eg insectology (1766) after the French insectologie; terminology (1801) after the German Terminologie).  Within a few decades of the intrusion of modern languages, combinations emerged using English terms (eg undergroundology (1820); hatology (1837)).  In this evolution, the development may be thought similar to the latter-day proliferation of “-isms” (fascism; feminism et al).  Grammatology & grammatologist are nouns, grammatological is an adjective and grammatologically is an adverb; the noun plural is grammatologies.

Google ngram (a quantitative and not qualitative measure): Because of the way Google harvests data for their ngrams, they’re not literally a tracking of the use of a word in society but can be usefully indicative of certain trends (although one is never quite sure which trend(s)), especially over decades.  As a record of actual aggregate use, ngrams are not wholly reliable because: (1) the sub-set of texts Google uses is slanted towards the scientific & academic and (2) the technical limitations imposed by the use of OCR (optical character recognition) when handling older texts of sometimes dubious legibility (a process AI should improve).  Where numbers bounce around, this may reflect either: (1) peaks and troughs in use for some reason or (2) some quirk in the data harvested.

Grammatology in its re-purposed sense was from the French grammatologie, introduced to the world by French philosopher Jacques Derrida (1930-2004) in his book De la grammatologie (Of Grammatology (1967)).  It may be unfair to treat Derrida’s use as a “re-purposing” because although the word grammatology (literally “the study of writing”) had existed since the early nineteenth century, it was a neologism, one of an expanding class of “-ology” words (some of them coined merely for ironic or humorous effect) and there was prior to 1967 scant evidence of use, those studying languages, literature or linguistics able satisfactorily to undertake their work without much needing “grammatology”.  On the basis of the documents thus far digitized, “grammatology” was never an accepted or even commonly used term in academia and although it seems occasionally to have been used variously in fields related to “the study of writing systems” (apparently as a synonym for paleography, epigraphy, writing-system classification or orthographic description) it was only in passing.  Until the modern era, words “going viral” happened relatively infrequently and certainly slowly and, as used prior to 1967, “grammatology” was attached to no theoretical construct or school of thought and described no defined discipline, the word indicative, empirical and neutral.  If “pre-modern” grammatology could be summed up (a probably dubious exercise), it would be thought a technical term for those concerned with scripts, alphabets, symbols and the historical development of writing systems.  Tempting though it may seem, it cannot be thought of as proto-structuralism.

The novelty Derrida introduced was to argue the need for a discipline examining the history, structure and philosophical implications of writing, his particular contention that writing is not secondary to speech, a notion at odds with centuries of Western metaphysics.  At the time, it was seen as a radical departure from orthodoxy, Derrida exploring (in the broadest imaginable way) the possibilities of writing, not simply the familiar physical inscriptions, but anything that functions as “trace,” “différance,” or symbolic marking, the core argument being writing is not secondary to speech (although in the narrow technical sense it may be consequent); rather, it reveals the instability and “constructedness” of language and thereby meaning.

De la grammatologie (First edition, 1967) by Jacques Derrida.

Ambitiously, what Derrida embarked upon was to do to the study of writing something like what Karl Marx (1818-1883) claimed to have done to the theories of Hegel (Georg Wilhelm Friedrich Hegel (1770-1831)): “turn things on their head”, a process that can be classified under four themes: (1) Writing as prior to speech (as opposed to the earlier “Writing is derivative of speech”).  What this meant was writing had to be considered as “originary”, implying structures of difference could precede both writing and speech. (2) Writing (the act as opposed to the content) as a philosophical concept rather than a finite collection of technical objects to be interpreted or catalogued on the basis of their form of assembly.  (3) Grammatology becomes a critique (as opposed to the earlier descriptive tool) of science, reimagining it as a critical discipline exposing the logocentrism of Western thought.  Logocentrism describes the tendency to prioritize “logos” (in academic use a word encompassing words, speech or reason) as the ultimate foundation for truth and meaning (with speech often privileged over writing).  Logocentrism was at the core of the Western philosophical tradition that assumed language accurately and directly can express an external reality, the companion notion being rational thought represents the highest form of knowledge.  Derrida labelled this a false hierarchy that devalued writing and other non-verbal forms of communication and feeling. (4) Writing is expanded beyond literal inscriptions.  Whereas the traditional Western view had been that writing was simply the use of an alphabet, cuneiform, hieroglyphs and such, what Derrida suggested was the concept of writing should be extended to any system of differences, traces, or marks; the condition for meaning itself.

So Derrida took grammatology from a dusty corner of the academy where it meant (for the small number of souls involved) something like “a hypothetical technical study of writing systems” and re-invented it as a philosophical discipline analysing the deeper structures that make any representation or meaning possible.  The notion of it as a tool of analysis is important because deconstruction, the word Derrida and other “celebrity philosophers” made famous (or infamous depending on one’s stance on things postmodern) is often misunderstood as something like “destruction” when really it is a form of analysis.  If Derrida’s subversive idea had been presented thirty years earlier (had the author been able to find a publisher), it’s possible it would have been ignored or dismissed by the relative few who then read such material.  However, in the post-war years there was an enormous expansion in both the number of universities and the cohorts of academics and students studying in fields which would come to be called “critical theory” so there was a receptive base for ideas overturning orthodoxy, thus the remarkable path deconstruction and postmodernism for decades tracked.

Deconstruction in art, Girl With Balloon by street artist Banksy, before, during & after a (successful) test deconstruction (left) and in its final form (right), London, October 2018.

There is an ephemeral art movement but usually it involves works which wholly are destroyed or entirely disappear.  Banksy’s Girl With Balloon belonged to a sub-category where (1) the deconstruction process was part of the art and (2) the residual elements were “the artwork”.  Banksy’s trick with this one was as the auctioneer’s hammer fell (at Stg£1m), an electric shredder concealed at the base of the frame was activated, the plan being to reduce the work “to shreds” in a pile below.  However, it’s claimed there was a technical glitch and the shredder stopped mid-shred, meaning half remained untouched and half, neatly sliced, hung from the bottom.  As a headline-grabbing stunt it worked well but the alleged glitch worked better still, art experts mostly in agreement the work as “half shredded” was more valuable than had it been “wholly shredded” and certainly more than had it remained untouched in the frame.  Thus: “meaning is just another construct which emerges only through differences and deferrals”.

From a distance of sixty-odd years, in the milieu of the strands of thought which are in a sense part of a “new orthodoxy”, it can be hard to understand just what an impact Derrida and his fellow travellers (and, just as significantly, his critics) had and what an extraordinary contribution deconstruction made to the development in thought of so many fields.  Derrida in 1967 of course did not anticipate the revolutionary movement he was about to trigger, hinted at by his book starting life as a doctoral thesis entitled: De la grammatologie: Essai sur la permanence de concepts platonicien, aristotélicien et scolastique de signe écrit. (Of Grammatology: Essay on the Permanence of Platonic, Aristotelian and Scholastic Concepts of the Written Sign).  A typically indigestible title of the type beloved by academics, the clipping for wider distribution was on the same basis as Adolf Hitler’s (1889-1945; Führer (leader) and German head of government 1933-1945 & head of state 1934-1945) publisher deciding Mein Kampf (My Struggle) was snappier than Viereinhalb Jahre (des Kampfes) gegen Lüge, Dummheit und Feigheit (Four and a Half Years [of Struggle] Against Lies, Stupidity and Cowardice).  There’s a reason authors usually don’t have the final say on titles and cover art.

Derrida acknowledged linguistics in the twentieth century had become a sophisticated form of study but maintained the discipline was failing to examine its most fundamental assumptions; indeed his point was those core values couldn’t be re-evaluated because they provided the framework by which language was understood.  What Derrida identified as the superstructure which supported all was the commitment to the primacy of speech and presence and because the prevailing position in linguistics was that speech was primary, the assumption worked to shape all that followed.  It was the influence of the Swiss philosopher & semiotician Ferdinand de Saussure (1857–1913) which was profound in positioning speech as the natural, original, living form of language with writing as a secondary, derivative (and, in a sense, artificial although this was never wholly convincing) representation of speech.  What made the Saussurean position seem compelling was it sounded logical, given the consensus it was human speech which predated the development of writing, the latter thus the product of the former and so persuasive was the thesis the hierarchy came to provide the framework for other disciplines within linguistics including phonology (the study of the way sounds function in languages) and morphology (the study of the internal structure of words and of morphemes, the smallest linguistic units able to carry a meaning).  What this meant was syntax was also defined by speech (writing a mere convenient means of exchange) with phonetics (the study of the physical sounds of human speech) the true source of the material of language.  Thus for generations, in academic discourse, historical linguistics was documented primarily by an analysis of changes in sound, with orthography (the methods by which a language or its sounds are represented by written symbols) treated as a mechanical by-product.

Deconstruction in fashion.  Lindsay Lohan in Theia gown, amfAR gala, New York City, February 2013 (left) and after “deconstruction by scissors” (right).

All gowns are “constructed” (some 3D printed or even “sprayed-on”) but sometimes circumstances demand they be “deconstructed”.  On the night, the shimmering nude and silver bugle-beaded fringe gown from Theia’s spring 2011 collection was much admired but there was an “unfortunate incident” (ie the fabric was torn) and, apparently using a pair of scissors, there was some ad-hoc seamstressery to transform the piece into something described as a “mullet minidress”.  That turned out to be controversial because the gown was on loan for the night but such things are just part of the cost of doing business and, with its Lohanic re-imagining, it’s now an artefact.

Derrida didn’t dispute the historic timelines; his point was that in defining linguistics based on this hierarchy, it became impossible to question the orthodoxy from within.  In a classic example of how deconstruction works, he argued the hierarchy was based not on the historical sequence of events (ie writing coming after speech) but was a culturally defined attachment to the idea of presence, voice and authentic meaning; with speech entrenched in its primacy, no discipline within linguistics was able fully to study writing because of this structural prejudice positioning writing as an auxiliary system, a mere notation of sounds encoding the pre-existing spoken language.  That didn’t mean writing couldn’t be studied (as for centuries it had been) but that it could be considered only a tool or artefact used to record speech and never a primary object of meaning.  While there were all sorts of reasons to be interested in writing, for the reductionists who needed to get to the essence of meaning, writing could only ever be thought something mechanistic and thus was philosophically uninteresting.  So, if linguistics was unable to analyse writing as (1) a structure independent of speech, (2) a fundamental element of thought processes, (3) a source of new or changed meanings or (4) a construct where cultural and philosophical assumptions are revealed, that would imply only speech could create meaning with writing a mere form of its expression.  Daringly thus, what Derrida demanded was for writing to be seen as conceptually prior to speech, even if as a physical phenomenon it came later.  In 1967, linguistics couldn’t do that while maintaining the very foundations on which it was built.

Never has there been published a "Grammatology for Dummies" but there is The Complete Idiot's Guide to Literary Theory and Criticism (2013) by Dr Steven J. Venturino.

At this point things became more technical but Derrida did provide a simplified model, explaining linguistics existed as the study of signs and not of traces, his work depending ultimately on certain distinctions: (1) Signs assume stable signifieds and (2) traces imply meaning is always deferred but never present.  For orthodox linguistics to work, the assumption had to be that signs enjoy a stability of meaning within a system; this Derrida dismissed as illusory arguing (1) meaning is just another construct which emerges only through differences and deferrals, (2) no signified is ever (or can ever fully be) “present” and (3) speech is no closer to meaning than writing.  By its own definitions in 1967, linguistics could not accommodate that because (1) its methods depended on systematic relations sufficiently stable to permit analysis, (2) it needed constant objects (definable units such as phonemes, morphemes and syntactic structures) and (3) it relied on signs which could be described with the required consistency (ie “scientifically”).  Any approach grounded in trace and difference lay beyond the boundaries of orthodox linguistics.

So the conflict would seem irreconcilable but that’s true only if viewed through the lens of a particular method; really, linguistics was empirical and grammatology was philosophical and in that sense they were alternative rather than competing or even parallel paths.  If linguistics was a system of codification, then grammatology was a critique of the foundations of linguistics and Derrida made clear he was not attempting to reform linguistics simply because that couldn’t be done; any attempt to interpolate his ideas into the discipline would have meant it ceased to be linguistics.  He wanted a new discipline, one which rather than empirically describing and categorising language and its elements, stood back and asked what in the first place made such systems possible.  That meant it was a transcendental rather than empirical process, one studying the conditions of representation and the metaphysics implicit in the idea of signification.  Writing thus was not merely marks on a surface but a marker of a difference in being.

The twist in the tale is that although De la grammatologie was highly influential (especially after an English edition appeared in 1976), grammatology never became a defined, institutionalised academic field in the way Derrida envisioned it, at least supplementing departments of linguistics, anthropology and philosophy.  That was due less to the well-documented phenomenon of institutional inertia than to it proving impossible for any consensus to be reached about what exactly “grammatological analysis” was or what constituted “grammatological research”.  Pleasingly, it was the structuralists who could account for that by explaining grammatology was a critique of the metaphysics underlying other disciplines rather than a method for generating new empirical knowledge.  Fields, they noted, were likely organically to grow as the tools produced were picked up by others to be applied to tasks; grammatology was a toolbox for dismantling tools.

Jacques Derrida with pipe, deconstructing some tobacco.

Even if Derrida’s concepts proved sometimes too vague even for academics, the influence was profound and, whether as a reaction or something deterministic (advances in computer modelling, neurology and such), the discipline of linguistics became more rather than less scientific, the refinements in the field of generative grammar in particular seen as something of a “doubling down” of resistance to Derrida’s critique, something reflected too in anthropology which came even more to value fieldwork and political economy, philosophical critiques of writing thought less helpful.  So the specialists not only clung to their speciality but made it more specialized still.  Grammatology did however help create genuinely new movements in literary theory, the most celebrated (and subsequently derided) being deconstruction where Derrida’s ideas such as interpretation being an infinite play of differences and the meaning of texts being inherently unstable created one of the more radical schools of thought in the post-war West, introducing to study concepts such as paratext (how academics “read between and beyond the lines”), the trace (the mark of something absent, a concept that disrupts the idea of pure presence and self-contained meaning) and marginalia (used here as an abstract extension of what an author may have “written in the margins” to encompass that which may seem secondary to the main point but is actually crucial to understanding the entire structure of thought, blurring the (literal) line between what lies inside and outside a text).

Derrida for Beginners (2007) by Jim Powell (illustrated by Van Howell).  One has to start somewhere.

The movement became embedded in many English and Comparative Literature departments as well as in post-structuralism and Continental philosophy.  Modern beasts like media studies & cultural theory are (in their understood form) unthinkable without deconstruction and if grammatology didn’t become “a thing”, its core elements (difference, trace etc) for decades flourished (sometimes to the point of (published) absurdity) and although not all agree, some do argue it was Derrida’s subversion in 1967 which saw the field of semiotics emerge to “plug the gaps” left by the rigidity of traditional linguistics.  Of course, even if grammatology proved something of a cul-de-sac, Derrida’s most famous fragment: “Il n'y a pas de hors-texte” (literally “there is no outside-text”) endured to underpin deconstruction and postmodernism generally.  Intriguingly for a concept from linguistics, the phrase took on a new life in the English-speaking world where it came to be understood as “everything is text”, an interpretation which created a minor publishing industry.  In English, it’s a marvellously literalist use and while it does to an extent overlap with the author’s original intention, Derrida meant there is (1) no access to pure, unmediated presence and (2) no meaning outside interpretation and no experience outside context.  In using texte he was referring to the interplay of differences, traces, relations, and contexts that make meaning possible (ie not literally the words as they appear on a page).  What that meant was all acts were “textual” in that they must be interpreted and are intelligible only within systems of meaning; the phrase a philosophical statement about signification and mediation, not characters printed on a page.

Fiveable's diagram of what we need to know to understand literature.  Hope this helps.

However, demonstrating (in another way) the power of language, the “everything is text” movement (“cult” may once have been a better word) in English came to be understood as meaning no reality could exist beyond language; everything (literally!) is text because it is words and discourse which both construct and describe reality.  That notion might have remained in an obscure ivory tower were it not for the delicious implication that values such as right & wrong and true & false are also pieces of text with meanings able to be constructed and deconstructed.  That meant there was no stable “truth” and nothing objectively was “wrong”; everything just a construct determined by time, place and circumstances.  That Derrida never endorsed this shocking relativism was noted by some but academics and students found so intoxicating the notion of right & wrong being variables that “everything is text” took on a life of its own as a kind of selective nihilism which is, of course, quite postmodern.  Again, language was responsible because the French texte was from the Latin textus, from texō (weave) and while in French it can mean “text” (in the English sense), among philosophers it was used metaphorically to suggest “weave together”; “an interconnected structure” in the sense of the Latin textus (woven fabric); it was this meaning Derrida used.  Had the English-speaking world remained true to the original spirit of Il n'y a pas de hors-texte it would have entered the textbooks as something like “There is nothing outside the interplay of signs and contexts; there is no meaning outside systems of interpretation” and perhaps have been forgotten but “everything is text” defined and seduced a movement.  Thus, it can be argued things either were “lost in translation” or “transformed by translation” but for the neo-Derridaists there’s the satisfaction of knowing the meaning shift was an example of “grammatology in action”.

Saturday, November 16, 2024

Parole

Parole (pronounced puh-rohl or pa-rawl (French))

(1) In penology, the (supervised) conditional release of an inmate from prison prior to the end of the maximum sentence imposed.

(2) Such a release or its duration.

(3) An official document authorizing such a release (archaic except as a modifier).

(4) In military use, the promise (usually in the form of a written certificate) of a prisoner of war, that if released they either will return to custody at a specified time or will not again take up arms against their captors.

(5) Any password given by authorized personnel in passing by a guard (archaic but still used in video gaming).

(6) In military use, a watchword or code phrase; a password given only to officers, distinguished from the countersign, given to all guards (archaic but still used in video gaming).

(7) A word of honor given or pledged (archaic).

(8) In US immigration legislation, the temporary admission of non-U.S. citizens into the US for emergency reasons or on grounds considered in the public interest, as authorized by and at the discretion of the attorney general.

(9) In structural linguistics, language as manifested in the individual speech acts of particular speakers (ie language in use, as opposed to language as a system).

(10) To place or release on parole.

(11) To admit a non-US citizen into the US as provided for in the parole clauses in statute.

(12) Of or relating to parole or parolees.

(13) A parole record (technical use only).

1610–1620: From the Middle French parole (word, formal promise) (short for parole d'honneur (word of honor)), from the Old French parole, from the Late Latin parabola (speech), from the Classical Latin parabola (comparison), from the Ancient Greek παραβολή (parabolḗ) (a comparison; parable (literally “a throwing beside”, hence “a juxtaposition”)).  The verb was derived from the noun and appeared early in the eighteenth century; originally, it described “what the prisoner did” (in the sense of a “pledge”) but this sense has long been obsolete.  The transitive meaning “put on parole, allow to go at liberty on parole” was in use by the early 1780s while the use to refer to “release (a prisoner) on his own recognizance” doesn’t appear for another century.  The adoption in English was by the military in the sense of a “word of honor” specifically that given by a prisoner of war not to escape if allowed to go about at liberty, or not to take up arms again if allowed to return home while the familiar modern sense of “a (supervised) conditional release of an inmate before their full term is served” was a part of criminal slang by at least 1910.  An earlier term for a similar thing was ticket of leave.  In law-related use, parol is the (now rare) alternative spelling.  Parole is a noun & verb, parolee is a noun, paroled & paroling are verbs and parolable, unparolable, unparoled & reparoled are adjectives (hyphenated use is common); the noun plural is paroles.

A parole board (or parole authority, parole panel etc) is a panel of people who decide whether a prisoner should be released on parole and if released, the parolee is placed for a period under the supervision of a parole officer (a law enforcement officer who supervises offenders who have been released from incarceration and, often, recommends sentencing in courts of law).  In some jurisdictions the appointment is styled as “probation officer”.  The archaic military slang pass-parole was an un-adapted borrowing from French passe-parole (password) and described an order passed from the front to the rear by word of mouth.  Still sometimes used in diplomatic circles, the noun porte-parole (plural porte-paroles) describes “a spokesperson, one who speaks on another's behalf” and was an un-adapted borrowing from mid sixteenth century French porte-parole, from the Middle French porteparolle.

The Parol Evidence Rule

In common law systems, the parol evidence rule is a legal principle in contract law which restricts the use of extrinsic (outside) evidence to interpret or alter the terms of a written contract.  The operation of the parol evidence rule means that if two or more parties enter into a written agreement intended to be a complete and final expression of their terms, any prior or contemporaneous oral or written statements that contradict or modify the terms of that written agreement cannot be used in court to challenge the contract’s provisions.  The rule applies only to properly constructed written contracts which can be regarded as “final and complete written agreements” and the general purpose is to protect the integrity of the document.  Where a contract is not “held to be final and complete”, parol evidence may be admissible, including cases of fraud, misrepresentation, mistake, illegality or where the written contract is ambiguous.  The most commonly used exceptions are (1) Ambiguity (if a court declares a contract term ambiguous, external evidence may be introduced to clarify the meaning), (2) Void or voidable contracts (if a contract was entered into under duress or due to fraud or illegality, parol evidence can be used to prove this.  In cases of mistakes, the scope is limited but it can still be possible), (3) Incomplete contracts (if a court determines a written document doesn’t reflect the full agreement between the parties, parol evidence may be introduced to “complete it”) and (4) Subsequent agreements (modifications or agreements made after the written contract can generally be proven with parol evidence although in the narrow technical sense such additions may be found to constitute a “collateral contract”).

Parole & probation

Depending on the jurisdiction, “parole” & “probation” can mean much the same thing or things quite distinct, not helped by parolees in some places being supervised by “probation officers” and vice versa.

In the administration of criminal law, “parole” and “probation” are both forms of supervised release but between jurisdictions the terms can either mean the same thing or be applied in different situations.  As a general principle, parole is the conditional release of a prisoner before completing their full sentence and those paroled usually are supervised by a parole officer and must adhere to certain conditions such as regular meetings, drug testing and maintaining employment and certain residential requirements.  The purpose of parole is (1) a supervised reintegration of an inmate into society and (2) a reward for good behavior in prison.  Should a parolee violate the conditions of their release, they can be sent back to prison to serve the remainder of their sentence.  As the word typically is used, probation is a court-ordered period of supervision in the community instead of, or in addition to, a prison sentence.  A term of probation is often imposed at sentencing, either as an alternative to incarceration or as a portion of the sentence after release.  Like parolees, individuals on probation are monitored, often by a probation officer (although they may be styled a “parole officer”) and are expected to follow specific conditions.  Probation is in many cases the preferred sentencing option for first offenders, those convicted of less serious offences and those for whom a custodial sentence (with all its implications) would probably be counter-productive.  It has the advantage also of reducing overcrowding in prisons and is certainly cheaper for the state than incarceration.  Those who violate the terms of their probation face consequences such as an extended probation or being sent to jail.  The word “parole” in this context was very much a thing of US English until the post-war years when it spread first to the UK and later elsewhere in the English-speaking world.

Langue & parole

In structural linguistics, the terms “langue” & “parole” were introduced by the groundbreaking Swiss semiotician Ferdinand de Saussure (1857-1913) and remain two of the fundamental concepts in the framework of structuralism and are treated as important building blocks in what subsequently was developed as the science of human speech.  Within the profession, “langue” & “parole” continue to be regarded as “French words” because the sense in that language better describes things than the English translations (“language” & “speech” respectively) which are “approximate but inadequate”.  Langue denotes the system (or totality) of language shared by the “collective consciousness” so it encompasses all elements of a language as well as the rules & conventions for their combination (grammar, spelling, syntax etc).  Parole is the use individuals make of the resources of language, which the system produces and combines in speech, writing or other means of transmission.  As de Saussure explained it, the conjunction and interaction of the two create an “antinomy of the social and shared”, a further antinomy implied in the idea that langue is abstract and parole is concrete.

The construct of the noun antinomy was a learned borrowing from the Latin antinom(ia) + the English suffix “-y” (used to form abstract nouns denoting a condition, quality, or state).  The Latin antinomia was from the Ancient Greek ἀντινομία (antinomía), the construct being ἀντι- (anti- (the prefix meaning “against”), ultimately from the primitive Indo-European hent- (face; forehead; front)) + νόμος (nómos) (custom, usage; law, ordinance), from νέμω (némō) (to deal out, dispense, distribute), from the primitive Indo-European nem- (to distribute; to give; to take) + -ίᾱ (-íā) (the suffix forming feminine abstract nouns).  The English word is best understood as anti- (in the sense of “against”) + -nomy (the suffix indicating a system of laws, rules, or knowledge about a body of a particular field).  In law, it was once used to describe “a contradiction within a law, or between different laws or a contradiction between authorities” (a now archaic use) but by extension it has come to be used in philosophy, political science and linguistics to describe “any contradiction or paradox”.  A sophisticated deconstruction of the concept was provided by the German philosopher Immanuel Kant (1724–1804) who in Kritik der reinen Vernunft (Critique of Pure Reason (1781)) explained that apparent contradictions between valid conclusions (a paradox) could be resolved once it was understood the two positions came from distinct and exclusive sets, meaning no paradox existed, the perception of one merely the inappropriate application of an idea from one set to another.

So langue is what people use when thinking and conceptualizing (abstract) while parole is what they use in speaking or writing (concrete), Saussure’s evaluative distinction explained as “The proper object of linguistic study is the system which underlies any particular human signifying practice, not the individual utterance.” and the implication of that was that langue is of more importance than parole.  In the English-speaking world, it was the work of US Professor Noam Chomsky (b 1928) which made the concept of langue & parole well-known through his use of the more accessible terms “competence” & “performance”.  Chomsky’s latter day role as a public intellectual (though a barely broadcast one in his home country) commenting on matters such as US foreign policy or the contradictions of capitalism has meant his early career in linguistics is often neglected by those not in the profession (the highly technical nature of the stuff does mean it’s difficult for most to understand) but his early work truly was revolutionary.

Noam Chomsky agitprop by Shepard Fairey (b 1970) on Artsy.

Chomsky used “competence” to refer to a speaker's implicit knowledge of the rules and principles of a language, something which permits them to understand and generate grammatically correct sentences which can be understood by those with a shared competence.  Competence is the idealized, internalized system of linguistic rules that underlies a speaker's ability to produce and comprehend language.  It reflects one’s mental grammar, independent of external factors like memory limitations or social context.  Performance refers to the actual use of language IRL (in real life), influenced by psychological and physical factors such as memory, attention, fatigue, and social context.  Performance includes the errors, hesitations, and corrections that occur in everyday speech and Chomsky made the important point these do not of necessity reveal lack of competence.  Indeed, understood as “disfluencies” (the “ums & ahs” etc), these linguistic phenomena turned out to be elements it was essential to interpolate into the “natural language” models used to train AI (artificial intelligence) (ro)bots to create genuinely plausible “human analogues”.  Chomsky argued competence should be the primary domain of inquiry for theoretical linguistics and he focused on these abstract, universal principles in his early work which provoked debates which continue to this day.  Performance, subject to errors, variability and influenced by non-linguistic factors, he declared better studied by those in fields like sociolinguistics and psycholinguistics.

Saturday, February 17, 2024

Algorithm

Algorithm (pronounced al-guh-rith-um)

(1) A set of rules for solving a problem in a finite number of steps.

(2) In computing, a finite set of unambiguous instructions performed in a prescribed sequence to achieve a goal, especially a mathematical rule or procedure used to compute a desired result.

(3) In mathematics and formal logic, a recursive procedure whereby an infinite sequence of terms can be generated.

1690s: From the Middle English algorisme & augrym, from the Anglo-Norman algorisme & augrim, from the French algorithme, re-fashioned (under mistaken connection with Greek ἀριθμός (arithmós) (number)) from the Old French algorisme (the Arabic numeral system) from the Medieval Latin algorismus, a (not untypical) mangled transliteration of the Arabic الخَوَارِزْمِيّ (al-Khawārizmiyy), the nisba (the part of an Arabic name consisting of a derivational adjective) of the ninth century Persian mathematician Muḥammad ibn Mūsā al-Khwārizmī and a toponymic name meaning “person from Chorasmia” (native of Khwarazm (modern Khiva in Uzbekistan)).  It was Muḥammad ibn Mūsā al-Khwārizmī's works which introduced to the West some sophisticated mathematics (including algebra). The earlier form in Middle English was the thirteenth century algorism from the Old French and in English, it was first used in about 1230 and then by the English poet Geoffrey Chaucer (circa 1344-1400) in 1391.  English adopted the French term, but it wasn't until the late nineteenth century that algorithm began to assume its modern sense.  Before that, by 1799, the adjective algorithmic (the construct being algorithm + -ic) was in use and the first use in reference to symbolic rules or language dates from 1881.  The suffix -ic was from the Middle English -ik, from the Old French -ique, from the Latin -icus, from the primitive Indo-European -kos & -os, formed with the i-stem suffix -i- and the adjectival suffix -kos & -os.  The form existed also in the Ancient Greek as -ικός (-ikós), in Sanskrit as -इक (-ika) and the Old Church Slavonic as -ъкъ (-ŭkŭ); a doublet of -y.  In European languages, adding -kos to noun stems carried the meaning "characteristic of, like, typical, pertaining to" while on adjectival stems it acted emphatically; in English it's always been used to form adjectives from nouns with the meaning “of or pertaining to”.  A precise technical use exists in physical chemistry where it's used to denote certain chemical compounds in which a specified chemical element has a higher oxidation number than in the equivalent compound whose name ends in the suffix -ous (eg sulphuric acid (H₂SO₄) has more oxygen atoms per molecule than sulphurous acid (H₂SO₃)).  The noun algorism, from the Old French algorisme was an early alternative form of algorithm; algorismic was a related form.  The meaning broadened to any method of computation and from the mid twentieth century became especially associated with computer programming to the point where, in general use, this link is often thought exclusive.  The spelling algorism has been obsolete since the 1920s.  Algorithm, algorithmist, algorithmizability, algorithmocracy, algorithmization & algorithmics are nouns, algorithmize is a verb, algorithmic & algorithmizable are adjectives and algorithmically is an adverb; the noun plural is algorithms.

Babylonian and later algorithms

An early Babylonian algorithm in clay.

Although there is evidence multiplication algorithms existed in Egypt (circa 2000-1700 BC), a handful of Babylonian clay tablets dating from circa 1800-1600 BC are the oldest yet found and thus record the world's first known algorithms.  The calculations described on the tablets are not solutions to specific individual problems but a collection of general procedures for solving whole classes of problems.  Translators consider them best understood as an early form of instruction manual.  When translated, one tablet was found to include the still familiar “This is the procedure”, a phrase which is the essence of every algorithm.  There must have been many such tablets but there's a low survival rate of stuff from 40 centuries ago not regarded as valuable.

So associated with computer code has the word "algorithm" become that it's likely a goodly number of those hearing it assume this was its origin and that any instance of its use happens in software.  The use in this context, while frequent, is not exclusive but the general perception might be it's just that.  It remains technically correct that almost any set of procedural instructions can be dubbed an algorithm but given the pattern of use from the mid-twentieth century, to do so would likely mislead or confuse many who might assume they were being asked to write the source code for software.  Of course, the sudden arrival of mass-market generative AI (artificial intelligence) has meant anyone can, in conversational (though hopefully unambiguous) text, ask their tame AI bot to produce an algorithm in the syntax of the desired coding language.  That is passing an algorithm (using the structures of one language) to a machine which interprets the text and converts it to language in another structure, something programmers have for decades been doing for their clients.

A much-distributed general purpose algorithm (really more of a flow-chart) which seems so universal it can be used by mechanics, programmers, lawyers, physicians, plumbers, carpet layers, concreting contractors and just about anyone whose profession is object or task-oriented.   

The AI bots have proved especially adept at such tasks.  While a question such as: "What were the immediate implications for Spain of the formation of the Holy Alliance?" produces varied results from generative AI which seem to range from the workmanlike to the inventive, when asked to produce computer code the results seem usually to be in accord with a literal interpretation of the request.  That shouldn't be unexpected; a discussion of early nineteenth century politics in the Iberian Peninsula is by its nature going to be discursive while the response to a request for code to locate instances of split infinitives in a text file is likely to vary little between AI models.  Computer languages of course impose a structure where syntax needs exactly to conform to defined parameters (even the most basic of the breed such as that PC/MS-DOS used for batch files was intolerant of a single missing or mis-placed character) whereas something like the instructions to make a cup of tea (which is an algorithm even if not commonly thought of as one) greatly can vary in form even though the steps and end results can be the same.
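By way of illustration only (nothing here is drawn from any AI model's actual output and the file name example.txt is invented for the sketch), a naive Python version of such a “split infinitive finder” might look like the following; the heuristic merely hunts for “to” followed by an “-ly” word and another word, so it is no substitute for real grammatical analysis, flagging the blameless “to fly south” while missing adverbs which don't end in -ly:

import re

# Naive, purely illustrative heuristic for spotting likely split infinitives:
# "to" + a word ending in -ly + another word.  Real detection needs
# part-of-speech tagging; expect false positives and misses.
SPLIT_INFINITIVE = re.compile(r"\bto\s+\w+ly\s+\w+", re.IGNORECASE)

def find_split_infinitives(path):
    """Return (line number, matched text) pairs for each suspected instance."""
    hits = []
    with open(path, encoding="utf-8") as handle:
        for number, line in enumerate(handle, start=1):
            for match in SPLIT_INFINITIVE.finditer(line):
                hits.append((number, match.group(0)))
    return hits

if __name__ == "__main__":
    # example.txt is a hypothetical input file, named only for the example
    for number, text in find_split_infinitives("example.txt"):
        print(f"line {number}: {text}")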

An example of a "how to make a cup of tea" algorithm.  This is written for a human and thus contains many assumptions of knowledge; one written for a humanoid robot would be much longer and include steps such as "turn cold tap clockwise" and "open refrigerator door".
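For those who prefer their tea algorithmic, a minimal sketch (the step lists are invented for the purpose and make no claim to good tea) shows the point about granularity: the same task expressed in five coarse steps for a human and sixteen fussy ones for a hypothetical robot:

# Illustrative only; the point is granularity, not tea-making.  The "human"
# list assumes background knowledge; the "robot" list must spell out what a
# person would never need to be told.
HUMAN_STEPS = [
    "boil the kettle",
    "put a tea bag in the cup",
    "pour the boiling water over the tea bag",
    "let it steep, then remove the bag",
    "add milk to taste",
]

ROBOT_STEPS = [
    "move to sink", "turn cold tap clockwise", "fill kettle to 500 ml",
    "turn cold tap anti-clockwise", "place kettle on base", "press switch",
    "wait until water reaches 100 C", "grip cup handle", "place tea bag in cup",
    "tilt kettle 45 degrees over cup", "wait 180 seconds", "remove tea bag",
    "open refrigerator door", "grip milk bottle", "pour 20 ml milk",
    "close refrigerator door",
]

def run(steps):
    # Print each step in order; both procedures terminate, so each qualifies
    # as an algorithm in the "must eventually stop" sense.
    for number, step in enumerate(steps, start=1):
        print(f"step {number}: {step}")

run(HUMAN_STEPS)   # five coarse steps
run(ROBOT_STEPS)   # sixteen fine-grained steps for the same task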

The so-called “rise of the algorithm” is something that has attracted much comment since social media gained critical mass; prior to that algorithms had been used increasingly in all sorts of places but it was the particular intimacy social media engenders which meant awareness increased and perceptions changed.  The new popularity of the word encouraged the coining of derived forms, some of which were originally (at least to some degree) humorous but beneath the jocularity, many discovered the odd truth.  An algorithmocracy describes a “rule by algorithms”, a critique in political science which discusses the implications of political decisions being made by algorithms, something which in theory would make representative and responsible government not so much obsolete as unnecessary.  Elements of this have been identified in the machinery of government such as the “Robodebt” scandal in Australia in which one or more algorithms were used to raise and pursue what were alleged to be debts incurred by recipients of government transfer payments.  Despite those in charge of the scheme and relevant cabinet ministers being informed the algorithm was flawed and there had been suicides among those wrongly accused, the politicians did nothing to intervene until forced by various legal actions.  While defending Robodebt, the politicians found it very handy essentially to disavow connection with the processes which were attributed to the algorithm.

The feeds generated by Instagram, Facebook, X (formerly known as Twitter) and such are also sometimes described as algorithmocracies in that it’s the algorithm which determines what content is directed to which user.  Activists have raised concerns about the way the social media algorithms operate, creating “feedback loops” whereby feeds become increasingly narrow and one-sided in focus, acting only to reinforce opinions rather than inform.  In fairness, that wasn’t the purpose of the design which was simply to keep the user engaged, thereby allowing the platform to harvest more of the product (the user’s attention) they sell to consumers (the advertisers).  Everything else is an unintended consequence and an industry joke was the word “algorithm” was used by tech company CEOs when they didn’t wish to admit the truth.  A general awareness of that now exists but filter bubbles won’t be going away; what the phenomenon did produce were the words algorithmophobe (someone unhappy or resentful about the impact of algorithms in their life) and algorithmophile (which technically should mean “a devotee or admirer of algorithms” but is usually applied in the sense of “someone indifferent to or uninterested in the operations of algorithms”, the latter represented by the great mass of consumers digitally bludgeoned into a state of acquiescent insensibility).

Some of the products are fighting back: The Algorithm: How AI Decides Who Gets Hired, Monitored, Promoted, and Fired and Why We Need to Fight Back Now (2024) by Hilke Schellmann, pp 336, Hachette Books (ISBN-13: 978-1805260981).

Among nerds, there are also fine distinctions.  There is the subalgorithm (sub-algorithm seems not a thing), a (potentially stand-alone) algorithm within a larger one, a concept familiar in many programming languages as a “sub-routine” although distinct from a remote procedure call (RPC) which is a subroutine being executed in a different address space.  The polyalgorithm (again, hyphens just not cool) is a set of two or more algorithms (or subalgorithms) integrated with instructions for choosing between them.  A very nerdy dispute does exist within mathematics and computer science around whether an algorithm, at the definitional level, really does need to be restricted to a finite number of steps.  The argument can eventually extend to the very possibility of infinity (or types of infinity according to some) so it really is the preserve of nerds.  In real-world application, a program is an algorithm only if (even if only eventually) it stops; it need not have a middle but must have a beginning and an end.
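As a minimal sketch (the function names are invented for the example and nothing here comes from any particular system), this is how a subalgorithm sits inside a larger algorithm as what programmers would call a subroutine; both procedures stop, so each passes the “must eventually halt” test:

# The "subalgorithm" (normalize_scores) is a complete, potentially stand-alone
# procedure which here serves as a subroutine within the larger algorithm
# (rank_candidates).

def normalize_scores(scores):
    """Subalgorithm: scale a list of numbers into the range 0..1."""
    low, high = min(scores), max(scores)
    if high == low:                       # avoid division by zero
        return [0.0 for _ in scores]
    return [(s - low) / (high - low) for s in scores]

def rank_candidates(named_scores):
    """Larger algorithm: normalize the scores, then sort candidates best-first."""
    names = list(named_scores)
    normalized = normalize_scores([named_scores[n] for n in names])
    return sorted(zip(names, normalized), key=lambda pair: pair[1], reverse=True)

print(rank_candidates({"a": 3, "b": 9, "c": 6}))   # [('b', 1.0), ('c', 0.5), ('a', 0.0)]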

There is also the mysterious pseudoalgorithm, something less suspicious than it may first appear.  Pseudoalgorithms exist usually for didactic purposes and will usually interpolate (sometimes large) fragments of a real algorithm but it may be in a syntax which is not specific to a particular (or any) programming language, the purpose being illustrative and explanatory.  Intended to be read by humans rather than a machine, all a pseudoalgorithm has to achieve is clarity in imparting information, the algorithmic component there only to illustrate something conceptual rather than be literally executable.  The pseudoalgorithm model is common in universities and textbooks and can be simplified because millions of years of evolution mean humans can do their own error correction on the fly.
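As a hedged illustration (not taken from any textbook mentioned here), the contrast can be shown with a binary search: first the pseudoalgorithm as a teacher might write it, readable by a human and free to elide detail, then the same idea spelled out so a machine can execute it:

# Pseudoalgorithm (for human eyes; not executable):
#
#   while the search range is not empty:
#       look at the middle item
#       if it is the target, stop
#       otherwise discard the half which cannot contain the target
#
# The runnable version must supply every detail the human reader would
# silently fill in (indices, rounding, the empty-range test, the "not found" case).

def binary_search(items, target):
    """Return the index of target in a sorted list, or None if absent."""
    low, high = 0, len(items) - 1
    while low <= high:
        middle = (low + high) // 2
        if items[middle] == target:
            return middle
        if items[middle] < target:
            low = middle + 1
        else:
            high = middle - 1
    return None

print(binary_search([2, 3, 5, 7, 11, 13], 7))   # prints 3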

Of the algorithmic

The Netflix algorithm in action: Lindsay Lohan (with body-double) during filming of Irish Wish (2024).  The car is a Triumph TR4 (1961-1967), one of the early versions with a live rear axle, a detail probably of no significance in the plot-line.

The adjective algorithmic has also emerged as an encapsulated criticism, applied to everything from restaurant menus and coffee shop décor to choices of typefaces and background music.  An entire ecosystem (Instagram, TikTok etc) has been suggested as the reason for this multi-culture standardization in which a certain “look, sound or feel” becomes “commoditised by acclamation” as the “standard model” of whatever is being discussed.  That critique has by some been dismissed as something reflective of the exclusivity of the pattern of consumption by those who form theories about what seem not very important matters; it’s just they only go to the best coffee shops in the nicest parts of town.  In popular culture though the effect of the algorithmic is widespread, entrenched and well-understood and already the AI bots are using algorithms to write music which will be popular, needing (for now) only human performers.  Some algorithms have become well-known such as the “Netflix algorithm” which presumably doesn’t exist as a conventional algorithm might but is understood as the sets of conventions, plotlines, casts and themes which producers know will have the greatest appeal to the platform.  The idea is nothing new; for decades hopeful authors who sent manuscripts to Mills & Boon would receive one of the more gentle rejection slips, telling them their work was very good but “not a Mills & Boon book”.  To help, the letter would include a brochure which was essentially a “how to write a Mills & Boon book” guide and it included a summary of the acceptable plot lines of which there were at one point reputedly some two dozen.  The “Netflix algorithm” was referenced when Falling for Christmas, the first fruits of Lindsay Lohan’s three film deal with the platform was released in 2022.  It followed a blending of several genres (redemption, Christmas movie, happy ending etc) and the upcoming second film (Irish Wish) is of the “…always a bridesmaid, never a bride — unless, of course, your best friend gets engaged to the love of your life, you make a spontaneous wish for true love, and then magically wake up as the bride-to-be.” school; plenty of familiar elements there so it’ll be interesting to see if the algorithm was well-tuned.

Math of the elliptic curve: the Cox–Zucker machine can help.

Some algorithms have become famous and others can be said even to have attained a degree of infamy, notably those used by the search engines, social media platforms and such, the Google and TikTok algorithms much debated by those concerned by their consequences.  There is though an algorithm remembered as a footnote in the history of linguistic oddities and that is the Cox–Zucker machine, published in 1979 by Dr David Cox (b 1948) and Dr Steven Zucker (1949–2019).  The Cox–Zucker machine (which may be called the CZM in polite company) is used in arithmetic geometry and provides a solution to one of the many arcane questions which only those in the field understand but the title of the paper in which it first appeared (Intersection numbers of sections of elliptic surfaces) gives something of a hint.  Apparently it wasn’t formally dubbed the Cox–Zucker machine until 1984 but, impressed by the phonetic possibilities, the pair had been planning joint publication of something as long ago as 1970 and undergraduate humor can’t be blamed because they met as graduate students at Princeton University.  The convention in academic publishing is for authors’ surnames to appear in alphabetical order and the temptation proved irresistible.

Tuesday, February 6, 2024

Clerestory

Clerestory (pronounced kleer-stawr-ee or kleer-stohr-ee)

(1) In architecture, a portion of an interior rising above adjacent rooftops, fitted with windows admitting daylight.

(2) In church architecture, a row of windows in the upper part of the wall, dividing the nave from the aisle and set above the aisle roof (associated particularly with the nave, transept and choir of a church or cathedral).

(3) In transportation vehicles (usually busses, railroad cars and occasionally cars & vans), a raised construction, typically appended to the roof structure and fitted with (1) windows to admit light or enhance vision or (2) slits for ventilation (or a combination of the two).

1375–1425: From the late Middle English, the construct being clere (clear (in the sense of “light” or “lighted”)) + story (from storey (a level of a building)).  The word is obviously analyzed as “a story (upper level) with light from windows”.  Storey was from Middle English stori & storie, from the Anglo-Latin historia (picture), from the Latin, from the Ancient Greek ἱστορία (historía) (learning through research, narration of what is learned), from ἱστορέω (historéō) (to learn through research, to inquire), from ἵστωρ (hístōr) (the one who knows, the expert, the judge).  In the Anglo-Latin, historia was a term from architecture (in this case “interior decorating” in the modern sense) describing a picture decorating a building or that part of a building so decorated.  The less common alternative spellings are clearstory & clerstory.  Clerestory is a noun and clerestoried is an adjective; the noun plural is clerestories.

From here was picked up the transferred sense of “floor; level”.  The later use in church architecture of “an upper story of a church, perforated by windows” is thought simply to be a reference to the light coming through the windows and there is nothing to support the speculation the origin was related to a narrative (story) told by a series of stained glass windows, illuminated by sunlight.  Historians have concluded the purpose of the design was entirely functional; a way of maximizing the light in the interior space.  The related architectural design is the triforium.  The noun triforium (triforia or triforiums in the plural) (from the Medieval Latin triforium, the construct being tria (three) + for (opening) + -ium) describes the gallery of arches above the side-aisle vaulting in a church’s nave.  The -ium suffix (used most often to form nouns) was applied as (1) a nominal suffix, (2) a substantivisation of its neuter forms and (3) as an adjectival suffix.  It was associated with the formation of abstract nouns, sometimes denoting offices and groups, a linguistic practice which has long fallen from fashion.  In the New Latin, it was the standard suffix appended when forming names for chemical elements.

Clerestories which once shone: Grand Central Terminal (the official abbreviation is GCT although the popular form is "Grand Central Station" (often clipped to "Grand Central")), Midtown Manhattan, New York City, 1929 (left) and the same (now dimmed) location in a scene from the Lindsay Lohan film Just My Luck (2006) (right).  Because of more recent development in the surrounding space, the sunlight no longer enters the void through the clerestoried windows in such an eye-catching way.  In modern skyscrapers, light-shafts or atriums can extend hundreds of feet.

Tourist boat on a canal cruise, Amsterdam, the Netherlands.

Boats with clerestory windows are commonly seen on waterways like the canals of Amsterdam and are valued by tour-guides because they make excursions possible in (almost) all weathers.  Those designing passenger busses and train carriages were also early adopters of clerestory windows, initially because they were a source of “free” light, but as packaged tourism developed into “sightseeing”, tour operators recognized the potential and commissioned versions optimized for outward visibility, essentially a form of “value-adding” which made even the dreary business of bus travel from one place to the next more of a “sightseeing” experience, something probably most valued by those afforded a greater vista to observe in mountainous regions.

Greyhound Scenicruiser in original livery.

A variation of the idea was used for long-distance busses in North America, the best known of which remains the Greyhound Scenicruiser (PD-4501), featuring a raised upper deck with a clerestory windscreen.  Although the view through that was enjoyed by many passengers, the design was less about giving folk a view and more a way to maximize revenue within the length restrictions imposed by many US states.  What the upper deck did was allow an increase in passenger numbers because the space their luggage would have absorbed in a conventional (single-level) design could be re-allocated to people, the suitcases (and in some cases also freight, another revenue stream) relegated to the chassis level, which also improved weight distribution and thus stability.

A predecessor: 1930 Lancia Omicron.

Famous as it became, the Scenicruiser, 1001 of which were built by General Motors (GM) between 1954-1956, was to cause many problems for both manufacturer and operator, the first of which was caused by the decision to use twin diesel engines to provide the necessary power for the new, heavy platform.  The big gas (petrol) units available certainly would have provided that but their fuel consumption would have made their operation ruinously expensive (both the fuel burn and the time lost by needing frequently to re-fill) and the volume of gas which would have had to be carried would have both added weight and reduced freight capacity.  Bigger GM diesel units weren’t produced in the early 1950s so the twin engines were a rational choice and the advantages were real, tests confirming that even when fully loaded, the coupled power-train would be sufficient for hill climbing while on the plains, the Scenicruiser happily would cruise using just a single engine.  However, as many discovered (on land, sea and in the air), running two engines coupled together is fraught with difficulties and these never went away, the busses eventually adopting one of GM’s new generation of big-displacement diesels as part of the major re-building of the fleet in 1961, a programme which also (mostly) rectified some structural issues which had been recognized.  Despite all that, by 1975, when Greyhound retired the model, the company still had hundreds in daily service and many of those auctioned off were subsequently used by other operators; in private hands, some are still running, often as motor homes or (mostly) as static commercial displays or museum exhibits.

1951 Pegaso Z-403 (left) and 1949 Brill Continental (right).

GM’s Scenicruiser was influential and clerestory windscreens soon proliferated on North American roads, although the idea wasn’t new.  The Spanish manufacturer Pegaso (a creation of Generalissimo Francisco Franco’s (1892-1975; Caudillo of Spain 1939-1975) industrial policy) between 1951-1957 produced the Z-403, which used the same design, and before even that a bus with an almost identical profile had been sold in the US by the JG Brill Company, albeit with a conspicuous lack of success.  As early as 1930, the Italian concern Lancia offered the Omicron bus with a 2½-deck arrangement and a clerestoried upper windscreen.  The Omicron’s third deck was configured usually as a first-class compartment but at least three which operated in Italy were advertised as “smoking rooms”, the implication presumably being that the rest of the passenger compartment was smoke-free.  History doesn't record whether the bus operators were any more successful in enforcing smoking bans than was the usual Italian experience.

1968 Oldsmobile Vista Cruiser.

One quirky offering which picked up the Scenicruiser’s clerestoried windscreen was the Oldsmobile Vista Cruiser (1964-1977), a station wagon which, until the release of the third series in 1973, featured one at the leading edge of a raised roof section, the glass ending midway over the back seat, the shape meaning it functioned also as a “skylight”.  Above the rear side windows were matching clerestories, which might sound a strange thing to add above a luggage compartment but during those years a “third seat” was a popular option to install in the space, transforming the things into eight- or nine-seat vehicles; families were bigger then.  The Vista Cruiser sold well and Buick later adopted the idea for some of its range until the concept was abandoned in 1972 because of concerns about upcoming safety regulations.  Such rules were however never imposed and both manufacturers revived the idea in the 1990s for their (frankly ugly) station wagons and when the Buick was discontinued in 1996, it was the last full-sized station wagon to be made in the US, the once popular market segment cannibalized to the point of unviability by the mini-van (people-mover) and the sports utility vehicle (SUV).

The arrangement of a series of windows in a high-mounted row, borrowed from architecture, became familiar on train carriages and buses, especially those which plied scenic routes.  Usually these were added to the coachwork of buses built on existing full-sized commercial chassis which could seat 40-60 passengers but in the 1950s, there emerged the niche of the smaller group tour, either curated to suit a narrower market or created ad-hoc by hotels or operators; smaller vehicles were required and these offered the additional advantage of being able sometimes to go where big buses could not.  In places like the Alps, where those on the trip liked to look up as well as out, rows of clerestoried windows were desirable.

1959 Volkswagen Microbus Deluxe (23 Window Samba). 

The best known of these vehicles was the Volkswagen “Samba”, a variation of the Microbus, one of a range of more than a dozen models built on the platform of the Type 2, introduced in 1950 after a chance sighting by a European distributor of a VW Beetle (Type 1) chassis which had been converted by the factory into a general-purpose utility vehicle.  The company accepted the suggestion a market for such a thing existed and, in its original air-cooled, rear-engined configuration, it remained in production well into the twenty-first century.  Between 1951-1966, the Microbus was available in a “Deluxe” version which featured both a folding fabric sunroof and rows of clerestoried windows which followed the curve of the sides of the roof.  Available in 21 & 23 window versions, these are now highly collectable and such is the attraction there’s something of a cottage industry in converting Microbuses to the clerestoried specification, but it’s difficult exactly to emulate the originals, the best of which can command several times the price of a fake (a perfectly restored genuine Samba in 2017 selling at auction in the US for US$302,000).  Such was the susceptibility to rust, the survival rate wasn’t high and many led a hard life when new, popular with the tour guides who would conduct bus-loads of visitors on (slow) tours of the Alps, the sunroof & clerestory windows ideal for gazing at the peaks.  To add to the mood, a dashboard-mounted valve radio was available as an option, something still a novelty for many in the early 1950s.  The Microbus Deluxe is rarely referred to as such, being almost universally called the “Samba”, and the origin of that is uncertain.  One theory is it’s a borrowing from the Brazilian dance and musical genre associated with things lively, colorful and celebratory, the link being that, as well as the sunroof and windows, the Deluxe had more luxurious interior appointments, came usually in bright two-tone paint (other Type 2s were usually more drably finished) and featured lashings of external chrome.  It’s an attractive story but some prefer something more Germanic: Samba as the acronym for the business-like phrase Sonnendach-Ausführung mit besonderem Armaturenbrett (sunroof version with special dashboard).  However it happened, Samba was in colloquial use by at least 1952 and became semi-official in 1954 when the distributors in the Netherlands added the word to their brochures.  Production ended in July 1967 after almost 100,000 had been built.

1966 Volkswagen Samba (21 Window, left) and 1962 Volkswagen Samba (23 Window, right).  The 23 Window van is a conversion of a Microbus and experts (of which there seem to be many, such is the following these things have gained) say it's close to impossible exactly to replicate a factory original.  In theory, the approach would be to take the parts with serial numbers (tags, engine, gearbox etc) from a real Samba which has rusted into oblivion (something not uncommon) and interpolate these into the sound body of a Microbus with as close a build date as possible.  Even then, such are the detail differences that an exact replication would be a challenge.  Because the Sambas received the same running changes and updates as the rest of the Microbus range, there was much variation in the details of the specification over the years but the primary distinction is between the “21” & “23” window vans, the difference accounted for by the latter’s pair of side-corner windows to the left & right above the rear gate opening.  In 1964, when the rear doors were widened, the curved windows in the roof were eliminated because there would no longer be sufficient metal in the coachwork to guarantee structural integrity.