
Saturday, November 29, 2025

Grammatology

Grammatology (pronounced gram-uh-tol-uh-jee)

(1) Historically, the scientific study of systems of writing.

(2) In latter-day use, a critique of orthodox linguistics.

Early 1800s (in its original sense): The construct was gramma(r) + -t- + -ology; the modern (some would say post-modern) re-purposing was first used in 1967.  Dating from the mid fourteenth century, grammar was from the Middle English gramery & gramere, from the Old French gramaire (classical learning), from the unattested Vulgar Latin grammāria, an alteration of the Classical Latin grammatica, from the Ancient Greek γραμματική (grammatikḗ) (skilled in writing), from γράμμα (gramma) (line of writing), from γράφω (gráphō) (write), from the primitive Indo-European gerbh (to carve, to scratch).  It displaced the native Old English stæfcræft; a doublet of glamour, glamoury, gramarye & grimoire.  In English, grammar is used to describe the system of rules and principles for the structure of a language (or of languages in general) but in colloquial use it’s applied also to morphology (the internal structure of words) and syntax (the structure of phrases and sentences of a language).  In English, generative grammar (the body of rules producing all the sentences permissible in a given language, while excluding all those not permissible) has for centuries been shifting and it’s now something policed by the so-called “grammar Nazis”, some of whom insist on enforcing “rules” regarded by most as defunct as early as the nineteenth century.

The suffix -ology was formed from -o- (as an interconsonantal vowel) +‎ -logy.  The origin in English of the -logy suffix lies with loanwords from the Ancient Greek, usually via Latin and French, where the suffix (-λογία) is an integral part of the word loaned (eg astrology from astrologia) since the sixteenth century.  French picked up -logie from the Latin -logia, from the Ancient Greek -λογία (-logía).  Within Greek, the suffix is an -ία (-ía) abstract from λόγος (lógos) (account, explanation, narrative), and that a verbal noun from λέγω (légō) (I say, speak, converse, tell a story).  In English the suffix became extraordinarily productive, used notably to form names of sciences or disciplines of study, analogous to the names traditionally borrowed from the Latin (eg astrology from astrologia; geology from geologia) and by the late eighteenth century, the practice (despite the disapproval of the pedants) extended to terms with no connection to Greek or Latin such as those building on French or German bases (eg insectology (1766) after the French insectologie; terminology (1801) after the German Terminologie).  Within a few decades of the intrusion of modern languages, combinations emerged using English terms (eg undergroundology (1820); hatology (1837)).  In this evolution, the development may be thought similar to the latter-day proliferation of “-isms” (fascism; feminism et al).  Grammatology & grammatologist are nouns, grammatological is an adjective and grammatologically is an adverb; the noun plural is grammatologies.

Google ngram (a quantitative and not qualitative measure): Because of the way Google harvests data for their ngrams, they’re not literally a tracking of the use of a word in society but can be usefully indicative of certain trends (although one is never quite sure which trend(s)), especially over decades.  As a record of actual aggregate use, ngrams are not wholly reliable because: (1) the sub-set of texts Google uses is slanted towards the scientific & academic and (2) the technical limitations imposed by the use of OCR (optical character recognition) when handling older texts of sometimes dubious legibility (a process AI should improve).  Where numbers bounce around, this may reflect either: (1) peaks and troughs in use for some reason or (2) some quirk in the data harvested.

Grammatology in its re-purposed sense was from the French grammatologie, introduced to the world by French philosopher Jacques Derrida (1930-2004) in his book De la grammatologie (Of Grammatology (1967)).  It may be unfair to treat Derrida’s use as a “re-purposing” because although the word grammatology (literally “the study of writing”) had existed since the early nineteenth century, it was a neologism, one of an expanding class of “-ology” words (some of them coined merely for ironic or humorous effect) and there was prior to 1967 scant evidence of use, those studying languages, literature or linguistics able satisfactorily to undertake their work without much needing “grammatology”.  On the basis of the documents thus far digitized, “grammatology” was never an accepted or even commonly used term in academia and although it seems occasionally to have been used variously in fields related to “the study of writing systems” (apparently as a synonym for paleography, epigraphy, writing-system classification or orthographic description) it was only in passing.  Until the modern era, words “going viral” happened relatively infrequently and certainly slowly and, as used prior to 1967, “grammatology” was attached to no theoretical construct or school of thought and described no defined discipline, the word indicative, empirical and neutral.  If “pre-modern” grammatology could be summed up (a probably dubious exercise), it would be thought a technical term for those concerned with scripts, alphabets, symbols and the historical development of writing systems.  Tempting though it may seem, it cannot be thought of as proto-structuralism.

The novelty Derrida introduced was to argue the need for a discipline examining the history, structure and philosophical implications of writing, his particular contention being that writing is not secondary to speech, a notion at odds with centuries of Western metaphysics.  At the time, it was seen as a radical departure from orthodoxy, Derrida exploring (in the broadest imaginable way) the possibilities of writing, not simply the familiar physical inscriptions, but anything that functions as “trace”, “différance” or symbolic marking, the core argument being writing is not secondary to speech (although in the narrow technical sense it may be consequent); rather, it reveals the instability and “constructedness” of language and thereby meaning.

De la grammatologie (First edition, 1967) by Jacques Derrida.

Ambitiously, what Derrida embarked upon was to do to the study of writing something like what Karl Marx (1818-1883) claimed to have done to the theories of Hegel (Georg Wilhelm Friedrich Hegel (1770-1831)): “turn things on their head”, a process that can be classified under four themes: (1) Writing as prior to speech (as opposed to the earlier “writing is derivative of speech”).  What this meant was writing had to be considered as “originary”, implying structures of difference could precede both writing and speech.  (2) Writing (the act as opposed to the content) as a philosophical concept rather than a finite collection of technical objects to be interpreted or catalogued on the basis of their form of assembly.  (3) Grammatology becomes a critique (as opposed to the earlier descriptive tool) of science, reimagining it as a critical discipline exposing the logocentrism of Western thought.  Logocentrism describes the tendency to prioritize “logos” (in academic use a word encompassing words, speech or reason) as the ultimate foundation for truth and meaning (with speech often privileged over writing).  Logocentrism was at the core of the Western philosophical tradition which assumed language can accurately and directly express an external reality, the companion notion being rational thought represents the highest form of knowledge.  Derrida labelled this a false hierarchy that devalued writing and other non-verbal forms of communication and feeling.  (4) Writing is expanded beyond literal inscriptions.  Whereas the traditional Western view had been that writing was simply the use of an alphabet, cuneiform, hieroglyphs and such, what Derrida suggested was the concept of writing should be extended to any system of differences, traces, or marks; the condition for meaning itself.

So Derrida took grammatology from a dusty corner of the academy where it meant (for the small number of souls involved) something like “a hypothetical technical study of writing systems” and re-invented it as a philosophical discipline analysing the deeper structures that make any representation or meaning possible.  The notion of it as a tool of analysis is important because deconstruction, the word Derrida and other “celebrity philosophers” made famous (or infamous depending on one’s stance on things postmodern), is often misunderstood as something like “destruction” when really it is a form of analysis.  Had Derrida’s subversive idea been presented thirty years earlier (had the author been able to find a publisher), it’s possible it would have been ignored or dismissed by the relative few who then read such material.  However, in the post-war years there was an enormous expansion in both the number of universities and the cohorts of academics and students studying in fields which would come to be called “critical theory” so there was a receptive base for ideas overturning orthodoxy, thus the remarkable path deconstruction and postmodernism for decades tracked.

Deconstruction in art, Girl With Balloon by street artist Banksy, before, during & after a (successful) test deconstruction (left) and in its final form (right), London, October 2018.

There is an ephemeral art movement but usually it involves works which wholly are destroyed or entirely disappear.  Banksy’s Girl With Balloon belonged to a sub-category where (1) the deconstruction process was part of the art and (2) the residual elements were “the artwork”.  Banksy’s trick with this one was as the auctioneer’s hammer fell (at Stg£1m), an electric shredder concealed at the base of the frame was activated, the plan being to reduce the work “to shreds” in a pile below.  However, it’s claimed there was a technical glitch and the shredder stopped mid-shred, meaning half remained untouched and half, neatly sliced, hung from the bottom.  As a headline grabbing stunt it worked well but the alleged glitch worked better still, art experts mostly in agreement the work as “half shredded” was more valuable than had it been “wholly shredded” and certainly more than had it remained untouched in the frame.  Thus: “meaning is just another construct which emerges only through differences and deferrals”.

From a distance of sixty-odd years, in the milieu of the strands of thought which are in a sense part of a “new orthodoxy”, it can be hard to understand just what an impact Derrida and his fellow travellers (and, just as significantly, his critics) had and what an extraordinary contribution deconstruction made to the development in thought of so many fields.  Derrida in 1967 of course did not anticipate the revolutionary movement he was about to trigger, hinted at by his book starting life as a doctoral thesis entitled: De la grammatologie: Essai sur la permanence de concepts platonicien, aristotélicien et scolastique de signe écrit. (Of Grammatology: Essay on the Permanence of Platonic, Aristotelian and Scholastic Concepts of the Written Sign).  A typically indigestible title of the type beloved by academics, the clipping for wider distribution was on the same basis as Adolf Hitler’s (1889-1945; Führer (leader) and German head of government 1933-1945 & head of state 1934-1945) publisher deciding Mein Kampf (My Struggle) was snappier than Viereinhalb Jahre (des Kampfes) gegen Lüge, Dummheit und Feigheit (Four and a Half Years [of Struggle] Against Lies, Stupidity and Cowardice).  There’s a reason authors usually don’t have the final say on titles and cover art.

Derrida acknowledged linguistics in the twentieth century had become a sophisticated form of study but maintained the discipline was failing to examine its most fundamental assumptions; indeed his point was those core values couldn’t be re-evaluated because they provided the framework by which language was understood.  What Derrida identified as the superstructure which supported all was the commitment to the primacy of speech and presence and because the prevailing position in linguistics was that speech was primary, the assumption worked to shape all that followed.  It was the influence of the Swiss philosopher & semiotician Ferdinand de Saussure (1857–1913) which was profound in positioning speech as the natural, original, living form of language with writing as a secondary, derivative (and, in a sense, artificial although this was never wholly convincing) representation of speech.  What made the Saussurean position seem compelling was it sounded logical, given the consensus it was human speech which predated the development of writing, the latter thus the product of the former, and so persuasive was the thesis the hierarchy came to provide the framework for other disciplines within linguistics including phonology (the study of the way sounds function in languages) and morphology (the study of morphemes, the smallest linguistic units within a word able to carry a meaning).  What this meant was syntax was also defined by speech (writing a mere convenient means of exchange) with phonetics (the study of the physical sounds of human speech) the true source of the material of language.  Thus for generations, in academic discourse, historical linguistics was documented primarily by an analysis of changes in sound, with orthography (the methods by which a language or its sounds are represented by written symbols) treated as a mechanical by-product.

Deconstruction in fashion.  Lindsay Lohan in Theia gown, amfAR gala, New York City, February 2013 (left) and after “deconstruction by scissors” (right).

All gowns are “constructed” (some 3D printed or even “sprayed-on”) but sometimes circumstances demand they be “deconstructed”.  On the night, the shimmering nude and silver bugle-beaded fringe gown from Theia’s spring 2011 collection was much admired but there was an “unfortunate incident” (ie the fabric was torn) and, apparently using a pair of scissors, there was some ad-hoc seamstressery to transform the piece into something described as a “mullet minidress”.  That turned out to be controversial because the gown was on loan for the night but such things are just part of the cost of doing business and, with its Lohanic re-imagining, it’s now an artefact.

Derrida didn’t dispute the historic timelines; his point was that in defining linguistics based on this hierarchy, it became impossible to question the orthodoxy from within.  In a classic example of how deconstruction works, he argued the hierarchy was based not on the historical sequence of events (ie writing coming after speech) but was a culturally defined attachment to the idea of presence, voice and authentic meaning; with speech entrenched in its primacy, no discipline within linguistics was able fully to study writing because of this structural prejudice positioning writing as an auxiliary system, a mere notation of sounds encoding the pre-existing spoken language.  That didn’t mean writing couldn’t be studied (as for centuries it had been) but that it could be considered only a tool or artefact used to record speech and never a primary object of meaning.  While there were all sorts of reasons to be interested in writing, for the reductionists who needed to get to the essence of meaning, writing could only ever be thought something mechanistic and thus was philosophically uninteresting.  So, if linguistics was unable to analyse writing as (1) a structure independent of speech, (2) a fundamental element of thought processes, (3) a source of new or changed meanings or (4) a construct where cultural and philosophical assumptions are revealed, that would imply only speech could create meaning with writing a mere form of its expression.  Daringly thus, what Derrida demanded was for writing to be seen as conceptually prior to speech, even if as a physical phenomenon it came later.  In 1967, linguistics couldn’t do that while maintaining the very foundations on which it was built.

Never has there been published a "Grammatology for Dummies" but there is The Complete Idiot's Guide to Literary Theory and Criticism (2013) by Dr Steven J. Venturino.

At this point things became more technical but Derrida did provide a simplified model, explaining linguistics existed as the study of signs and not of traces, his work depending ultimately on certain distinctions: (1) signs assume stable signifieds and (2) traces imply meaning is always deferred but never present.  For orthodox linguistics to work, the assumption had to be that signs enjoy a stability of meaning within a system; this Derrida dismissed as illusory, arguing (1) meaning is just another construct which emerges only through differences and deferrals, (2) no signified is ever (or can ever fully be) “present” and (3) speech is no closer to meaning than writing.  By its own definitions in 1967, linguistics could not accommodate that because (1) its methods depended on systematic relations sufficiently stable to permit analysis, (2) it needed constant objects (definable units such as phonemes, morphemes and syntactic structures) and (3) it relied on signs which could be described with the required consistency (ie “scientifically”).  Any approach grounded in trace and difference lay beyond the boundaries of orthodox linguistics.

So the conflict would seem irreconcilable but that’s true only if viewed through the lens of a particular method; really, linguistics was empirical and grammatology was philosophical and in that were alternative rather than competing or even parallel paths.  If linguistics was a system of codification, then grammatology was a critique of the foundations of linguistics and Derrida made clear he was not attempting to reform linguistics simply because that couldn’t be done; any attempt to interpolate his ideas into the discipline would have meant it ceased to be linguistics.  He wanted a new discipline, one which rather than empirically describing and categorising language and its elements, stood back and asked what in the first place made such systems possible.  That meant it was a transcendental rather than empirical process, one studying the conditions of representation and the metaphysics implicit in the idea of signification.  Writing thus was not merely marks on a surface but a marker of a difference in being.

The twist in the tale is that although De la grammatologie was highly influential (especially after an edition appeared in English in 1976), grammatology never became a defined, institutionalised academic field in the way Derrida envisioned, at least supplementing departments of linguistics, anthropology and philosophy.  That was due less to the well-documented phenomenon of institutional inertia than to it proving impossible for any consensus to be reached about what exactly “grammatological analysis” was or what constituted “grammatological research”.  Pleasingly, it was the structuralists who could account for that by explaining grammatology was a critique of the metaphysics underlying other disciplines rather than a method for generating new empirical knowledge.  Fields, they noted, were likely organically to grow as the tools produced were picked up by others to be applied to tasks; grammatology was a toolbox for dismantling tools.

Jacques Derrida with pipe, deconstructing some tobacco.

Even if Derrida’s concepts proved sometimes too vague even for academics, the influence was profound and, whether as a reaction or something deterministic (advances in computer modelling, neurology and such), the discipline of linguistics became more rather than less scientific, the refinements in the field of generative grammar in particular seen as something of a “doubling down” of resistance to Derrida’s critique, something reflected too in anthropology which came even more to value fieldwork and political economy, philosophical critiques of writing thought less helpful.  So the specialists not only clung to their speciality but made it more specialized still.  Grammatology did however help create genuinely new movements in literary theory, the most celebrated (and subsequently derided) being deconstruction where Derrida’s ideas such as interpretation being an infinite play of differences and the meaning of texts being inherently unstable created one of the more radical schools of thought in the post-war West, introducing to study concepts such as paratext (how academics “read between and beyond the lines”), the trace (the mark of something absent, a concept that disrupts the idea of pure presence and self-contained meaning) and marginalia (used here as an abstract extension of what an author may have “written in the margins” to encompass that which may seem secondary to the main point but is actually crucial to understanding the entire structure of thought, blurring the (literal) line between what lies inside and outside a text).

Derrida for Beginners (2007) by Jim Powell (illustrated by Van Howell).  One has to start somewhere.

The movement became embedded in many English and Comparative Literature departments as well as in post-structuralism and Continental philosophy.  Modern beasts like media studies & cultural theory are (in their understood form) unthinkable without deconstruction and if grammatology didn’t become “a thing”, its core elements (différance, trace etc) for decades flourished (sometimes to the point of (published) absurdity) and although not all agree, some do argue it was Derrida’s subversion in 1967 which saw the field of semiotics emerge to “plug the gaps” left by the rigidity of traditional linguistics.  Of course, even if grammatology proved something of a cul-de-sac, Derrida’s most famous fragment: “Il n'y a pas de hors-texte” (literally “there is no outside-text”) endured to underpin deconstruction and postmodernism generally.  Intriguingly for a concept from linguistics, the phrase took on a new life in the English-speaking world where it came to be understood as “everything is text”, an interpretation which created a minor publishing industry.  In English, it’s a marvellously literalist use and while it does to an extent overlap with the author’s original intention, Derrida meant there is (1) no access to pure, unmediated presence and (2) no meaning outside interpretation and no experience outside context.  In using texte he was referring to the interplay of differences, traces, relations, and contexts that make meaning possible (ie not literally the words as they appear on a page).  What that meant was all acts were “textual” in that they must be interpreted and are intelligible only within systems of meaning; the phrase a philosophical statement about signification and mediation, not characters printed on a page.

Fiveable's diagram of what we need to know to understand literature.  Hope this helps.

However, demonstrating (in another way) the power of language, the “everything is text” movement (“cult” may once have been a better word) in English came to be understood as meaning no reality could exist beyond language; everything (literally!) is text because it is words and discourse which both construct and describe reality.  That notion might have remained in an obscure ivory tower were it not for the delicious implication that values such as right & wrong and true & false are also pieces of text with meanings able to be constructed and deconstructed.  That meant there was no stable “truth” and nothing objectively was “wrong”; everything just a construct determined by time, place and circumstances.  That Derrida never endorsed this shocking relativism was noted by some but academics and students found so intoxicating the notion of right & wrong being variables that “everything is text” took on a life of its own as a kind of selective nihilism which is, of course, quite postmodern.  Again, language was responsible because the French texte was from the Latin textus, from texō (weave) and while in French it can mean “text” (in the English sense), among philosophers it was used metaphorically to suggest “weave together”; “an interconnected structure” in the sense of the Latin textus (woven fabric); it was this meaning Derrida used.  Had the English-speaking world remained true to the original spirit of Il n'y a pas de hors-texte it would have entered the textbooks as something like “There is nothing outside the interplay of signs and contexts; there is no meaning outside systems of interpretation” and perhaps have been forgotten but “everything is text” defined and seduced a movement.  Thus, it can be argued things either were “lost in translation” or “transformed by translation” but for the neo-Derridaists there’s the satisfaction of knowing the meaning shift was an example of “grammatology in action”.

Thursday, July 24, 2025

Kamikaze

Kamikaze (pronounced kah-mi-kah-zee or kah-muh-kah-zee)

(1) A member of a World War II era special corps in the Japanese air force charged with the suicidal mission of crashing an aircraft laden with explosives into an enemy target, especially Allied Naval vessels.

(2) In later use, one of the (adapted or specifically built) airplanes used for this purpose.

(3) By extension, a person or thing that behaves in a wildly reckless or destructive manner; as a modifier, something extremely foolhardy and possibly self-defeating.

(4) Of, pertaining to, undertaken by, or characteristic of a kamikaze; a kamikaze pilot; a kamikaze attack.

(5) A cocktail made with equal parts vodka, triple sec and lime juice.

(6) In slang, disastrously to fail.

(7) In surfing, a deliberate wipeout.

1945: From the Japanese 神風 (かみかぜ) (kamikaze) (suicide flyer), the construct being kami (god (the earlier form was kamui)) + kaze (wind (the earlier form was kanzai)), usually translated as “divine wind” (“spirit wind” appearing in some early translations), a reference to the winds which, according to Japanese folklore, destroyed Kublai Khan's Mongol invasion fleet in 1281.  In Japanese military parlance, the official designation was 神風特別攻撃隊 (Shinpū Tokubetsu Kōgekitai (Divine Wind Special Attack Unit)).  Kamikaze is a noun, verb & adjective and kamikazeing & kamikazed are verbs; the noun plural is kamikazes.  When used in the original sense, an initial capital is used.

HESA Shahed 136 UAV.

The use of kamikaze to describe the Iranian delta-winged UAVs (unmanned aerial vehicles, popularly known as “drones”) being used by Russia against Ukraine reflects the use of the word which developed almost as soon as the existence of Japan’s wartime suicide bomber programme became known.  Kamikaze was the name of the aviators and their units but it was soon also applied to the aircraft used, some re-purposed from existing stocks and some rocket-powered units designed for the purpose.  In 1944-1945 they were too little, too late but they proved the effectiveness of precision targeting although not all military cultures would accept the loss-rate the Kamikaze sustained.  In the war in Ukraine, the Iranian HESA Shahed 136 (شاهد ۱۳۶, literally "Witness-136", designated Geran-2 (Герань-2, literally "Geranium-2") by the Russians) kamikaze drones have proved extraordinarily effective, being cheap enough to deploy en masse and capable of precision targeting.  They’re thus a realization of the century-old dream of the strategic bombing theorists to hit “panacea targets” at low cost while sustaining no casualties.  Early in World War II, the notion of panacea targets had been dismissed, not because as a strategy it was wrong but because the means of finding and bombing such targets didn’t exist, thus “carpet bombing” (bombing for several square miles around any target) was adopted because it was at the time the best option.  Later in the war, as techniques improved and air superiority was gained, panacea targets returned to the mission lists but the method was merely to reduce the size of the carpet.  The kamikaze drones however can be pre-programmed or remotely directed to hit a target within the tight parameters of a GPS signal.
The Russians know what to target because so many blueprints of Ukrainian infrastructure sit in Moscow’s archives and the success rate is high because, deployed in swarms because they’re so cheap, the old phrase from the 1930s can be updated for the UAV age: “The drone will always get through”.

Imperial Japan’s Kamikazes

By 1944, it was understood by the Japanese high command that the strategic gamble of simultaneously attacking the US Pacific Fleet at anchor in Pearl Harbor and the Asian territories of the European powers had failed.  Such was the wealth and industrial might of the US that within three years of the Pearl Harbor raid, the preponderance of Allied warships and military aircraft in the Pacific was overwhelming and Japan’s defeat was a matter only of time.  That couldn’t be avoided but within the high command it was thought that if the Americans understood how high the casualty rate would be if they attempted an invasion of the Japanese home islands, that invasion (and the specter of occupation) might be avoided and some sort of "negotiated settlement" might be possible, the notion of the demanded "unconditional surrender" being unthinkable.

HMS Sussex hit by Kamikaze (Mitsubishi Ki-51 (Sonia)), 26 July 1945 (left) and USS New Mexico (BB-40) hit by Kamikaze off Okinawa, 12 May 1945 (right).

Although on paper, late in the war, Japan had over 15,000 aircraft available for service, a lack of development meant most were at least obsolescent and shortages of fuel increasingly limited the extent to which they could be used in conventional operations.  From this analysis came the estimate that if used as “piloted bombs” on suicide missions, it might be possible to sink as many as 900 enemy warships and inflict perhaps 22,000 casualties and in the event of an invasion, when used at shorter range against landing craft or beachheads, it was thought an invading force would sustain over 50,000 casualties by suicide attacks alone.  Although the Kamikaze attacks didn't achieve their strategic objective, they managed to sink dozens of ships and kill some 5000 Allied personnel.  All the ships lost were smaller vessels (the largest an escort carrier) but significant damage was done to fleet carriers and cruisers and, like the (also often dismissed as strategically insignificant) German V1 & V2 attacks in Europe, resources had to be diverted from the battle plan to be re-tasked to strike the Kamikaze air-fields.  Most importantly however, so vast by 1944 was the US military machine that it was able easily to repair or replace as required.  Brought up in a different tradition, the US Navy personnel targeted by the Kamikaze dubbed the attacking pilots Baka (Japanese for “Idiot”).

A captured Japanese Yokosuka MXY-7 Ohka (Model 11), Yontan Airfield, April 1945.

Although it’s uncertain, the first Kamikaze mission may have been an attack on the carrier USS Franklin by Rear Admiral Arima (1895-1944) flying a Yokosuka D4Y Suisei (Allied codename Judy) and the early flights were undertaken using whatever airframes were available and regarded, like the pilots, as expendable.  Best remembered however, although only 850-odd were built, were the rockets designed for the purpose.  The Yokosuka MXY-7 Ohka (櫻花, (Ōka), (cherry blossom)) was a purpose-built, rocket-powered attack aircraft which was essentially a powered bomb with wings, conceptually similar to a modern “smart bomb” except that instead of the guidance being provided by on-board computers and associated electronics which were sacrificed in the attack, there was a similarly expendable human pilot.  Shockingly single-purpose in its design parameters, the version most produced could attain 406 mph (648 km/h) in level flight at relatively low altitude and 526 mph (927 km/h) while in an attack dive but the greatest operational limitation was the range was limited to 23 miles (37 km), forcing the Japanese military to use lumbering Mitsubishi G4M (Betty) bombers as “carriers” (the Ohka the so-called "parasite aircraft") with the rockets released from under-slung assemblies when within range.  As the Ohka was originally conceived (with a range of 80 miles (130 km)), as a delivery system that may have worked but such was the demand on the designers to provide the highest explosive payload, the fuel load was reduced, restricting the maximum speed to 276 mph (445 km/h), making the barely maneuverable little rockets easy prey for fighters and even surface fire.

Yokosuka MXY-7 Ohka.

During the war, Japan produced more Mitsubishi G4Ms than any other bomber and its then-remarkable range (3130 miles (5037 km)) made it a highly effective weapon early in the conflict but as the US carriers and fighters were deployed in large numbers, its vulnerabilities were exposed: the performance was no match for fighters and it was completely un-armored, without even self-sealing fuel tanks, hence the nickname “flying lighter” bestowed by flight crews.  However, by 1945 Japan had no more suitable aircraft available for the purpose so the G4M was used as a carrier and the losses were considerable, an inevitable consequence of having to come within twenty-odd miles of the US battle-fleets protected by swarms of fighters.  It had been planned to develop a variant of the much more capable Yokosuka P1Y Ginga (as the P1Y3) to perform the carrier role but late in the war, Japan’s industrial and technical resources were stretched and P1Y development was switched to night-fighter production, desperately needed to repel the US bombers attacking the home islands.  Thus the G4M (specifically the G4M2e-24J) continued to be used.

Watched by Luftwaffe chief Hermann Göring (1893–1946; leading Nazi 1922-1945, Hitler's designated successor & Reichsmarschall 1940-1945), Adolf Hitler (1889-1945; Führer (leader) and German head of government 1933-1945 & head of state 1934-1945) presents test pilot Hanna Reitsch (1912-1979) with the Iron Cross (2nd class), Berlin, March 1941 (left); she was later, uniquely for a woman, awarded the 1st-class distinction.  Conceptual sketch of the modified V1 flying bomb (single cockpit version, right).

The idea of suicide missions also appealed to some Nazis, predictably most popular among those never likely to find themselves at the controls, non-combatants often being among the most blood-thirsty of politicians.  The idea had been discussed earlier as a means of destroying the electricity power-plants clustered around Moscow but early in 1944, the intrepid test pilot Hanna Reitsch suggested to Hitler a suicide programme as the most likely means of hitting strategic targets.  Ultimately, she settled on using a V1 flying bomb (the Fieseler Fi 103R, an early cruise missile) to which a cockpit had been added, test-flying it herself and even mastering the landing, a reasonable feat given the high landing speed.  As a weapon, assuming a sufficient supply of barely-trained pilots, it would probably have been effective but Hitler declined to proceed, feeling things were not yet sufficiently desperate.  The historic moment passed although in the skies above Germany in 1945 there were dozens of what appeared to be "suicide attacks" by fighter pilots ramming their aircraft into US bombers.  The Luftwaffe was by this time so short of fuel that training had been cut to the point that new recruits were being sent into combat with only a few hours of solo flying experience, so it's believed some incidents may have been "work accidents", but the ad-hoc Kamikaze phenomenon was real.

According to statistics compiled by the WHO (World Health Organization) in 2021, globally there were an estimated 727,000 suicides and within the total: (1) among 15–29-year-olds, suicide was the third leading cause of death, (2) for 15–19-year-olds, it was the fourth leading cause and (3) for girls aged 15–19, it ranked third.  What was striking was that in middle & high income nations, suicide is the leading cause of death in the young (typically defined as those aged 15-29 or 15-34).  Because such nations are less affected by the infectious disease, armed conflict and accident mortality seen in lower income countries, it appeared there was a “mental health crisis”, one manifestation of which was the clustering of self-harm and attempted suicides, a significant number of the latter successful.  As a result of the interplay of the economic and social factors reducing mortality from other causes, intentional self-harm stands out statistically, even though suicide rates usually are not, in absolute terms, “extremely” high.  Examples quoted by the WHO included:

Republic of Korea (ROK; South Korea): Among people aged 10–39, suicide is consistently the leading cause of death and the country has one of the highest youth suicide rates in the OECD (Organisation for Economic Co-operation and Development, sometimes called the “rich countries club” although changes in patterns of development have compressed relativities and the tag is not as appropriate as once it was).

Japan (no longer styled the “Empire of Japan” although the head of state remains an emperor): Suicide is the leading cause of death among those aged 15-39 and while there was a marked decline in the total numbers after the government in the mid 1990s initiated a public health campaign, the numbers did increase in the post-COVID pandemic period.  Japan is an interesting example to study because its history has meant cultural attitudes to suicide differ from those in the West.

New Zealand (Aotearoa): New Zealand has one of the highest youth suicide rates in the developed world, especially among Māori youth and although the numbers can bounce around, for those aged 15–24, suicide is often the leading or second leading cause of death.

Finland: For those aged 15-24, suicide is always among the leading causes of mortality and in some reporting periods the leading one.  Because in Finland there are extended periods when the hours of darkness are long and the temperatures low, there have been theories these conditions may contribute to the high suicide rate (building on research into rates of depression) but the studies have been inconclusive.

Australia: Suicide is the leading cause of death for those in the cohorts 15–24 and 25–44 and a particular concern is the disproportionately high rate among indigenous youth, the incidents sometimes happening while they’re in custody.  In recent years, suicide has overtaken road accidents and cancer as the leading cause in these age groups.

Norway & Sweden: In these countries, suicide is often one of the top three causes of death among young adults and in years when mortality from disease and injury is especially low, it will typically rise to the top.

Kamikaze Energy Cans in all six flavours (left) and potential Kamikaze Energy Can customer Lindsay Lohan (right).

Ms Lohan was pictured here with a broken wrist (fractured in two places in an unfortunate fall at Milk Studios during New York Fashion Week) and a 355 ml (12 fluid oz) can of Rehab energy drink, Los Angeles, September 2006.  Some recovering from injuries find energy drinks a helpful addition to the diet.  The car is a 2005 Mercedes-Benz SL 65 (R230; 2004-2011) which earlier had featured in the tabloids after a low-speed crash.  The R230 range (2001-2011) was unusual because of the quirk of the SL 550 (2006-2011), a designation used exclusively in the North American market, the RoW (rest of the world) cars retaining the SL 500 badge even though both used the 5.5 litre (333 cubic inch) V8 (M273).

Given the concerns about suicide among the young, attention in the West has been devoted to the way the topic is handled on social media and the rise of novel applications for AI (artificial intelligence) has flagged new problems, one of the “AI companions” now wildly popular among youth (the group most prone to attempting suicide) recently recommending its creator take his own life.  That would have been an unintended consequence of (1) the instructions given to the bot and (2) the bot’s own “learning process”, the latter something the software developers would neither have anticipated nor intended.  Given the sensitivities to the way suicide is handled in the media, on the internet or in popular culture, it’s perhaps surprising there’s an “energy drink” called “Kamikaze”.  Like AI companions, the prime target for the energy drink suppliers is males aged 15-39, which happens to be the group most at risk of suicidal thoughts and most likely to attempt suicide.  Despite that, the product’s name seems not to have attracted much criticism and the manufacturer promises: “With your Kamikaze Energy Can, you'll enjoy a two-hour energy surge with no crash.”  Presumably the word “crash” was chosen with some care although, given the decline in the teaching of history at school & university level, it may be that a sizeable number of youth have no idea of the origin of “Kamikaze”.  Anyway, containing “200mg L-Citrulline, 160mg Caffeine Energy, 1000mg Beta Alanine, vitamin B3, B6 & B12, zero carbohydrates and zero sugar”, the cans are available in six flavours: Apple Fizz, Blue Raspberry, Creamy Soda, Hawaiian Splice, Mango Slushy & Rainbow Gummy.

Thursday, December 19, 2024

Pylon

Pylon (pronounced pahy-lon)

(1) A marking post or tower for guiding aviators, much used in air-racing to mark turning points in a prescribed course of flight.

(2) A relatively tall structure at the side of a gate, bridge, or avenue, marking an entrance or approach.

(3) A monumental tower forming the entrance to an ancient Egyptian temple, consisting either of a pair of tall quadrilateral masonry masses with sloping sides and a doorway between them or of one such mass pierced with a doorway.

(4) In electricity transmission, a steel tower or mast carrying high-tension lines, telephone wires, or other cables and lines (usually as power-pylon, electricity pylon or transmission tower).

(5) In architecture (1) a tall, tower-like structure (usually of steel or concrete) from which cables are strung to support other structures and (2) a lighting mast; a freestanding support for floodlights.

(6) In aeronautics, a streamlined, finlike structure used to attach engines, auxiliary fuel tanks, bombs, etc to an aircraft wing or fuselage.

(7) In modeling, as “pylon shot”, a pose in which a model stands with arms raised or extended outwards, resembling an electricity pylon.

(8) An alternative name for an obelisk.

(9) In aviation, a starting derrick for an aircraft (obsolete) and a tethering point for a dirigible (airship).

(10) In American football (gridiron), an orange marker designating one of the four corners of the field’s end zones.

(11) In the slang of artificial limb makers (1) a temporary artificial leg and (2) a rigid prosthesis for the lower leg.

(12) In literature, as "Pylon Poet" (usually in the plural as “the Pylons”), a group of British poets who during the 1930s included in their work many references to new & newish mechanical devices and other technological developments.

(13) In slang, a traffic cone.

1823: A learned borrowing from Ancient Greek πυλών (pulṓn; pylṓn) (gateway; gate tower), from πύλη (pýlē) (gate, wing of a pair of double gates; an entrance, entrance into a country; mountain pass; narrow strait of water), of unknown origin but etymologists suspect it may be a technical term (from architecture or construction) from another language.  The first use was in archaeology to describe a “gateway to an Egyptian temple”, a direct adaptation of the original Greek.  In Western architecture, it’s believed the first “modern” pylons were the tall, upright structures installed at aerodromes to guide aviators and it was the appearance of these which inspired the later use as “power pylon” (steel tower carrying high-tension wires over distance, use noted since 1923) and the word spread to any number of similar-looking structures (even those on a small scale such as traffic cones).  Until then, in engineering and architecture, tall structures used to carry cables or in some way provide support (or even be merely decorative) were described as a “tower” or “obelisk” (such use continuing).  Pylon is a noun and pylonless, pylonlike, pylonesque & pylonish are adjectives; the noun plural is pylons.  Despite the fondness in engineering for such forms to emerge, the verbs pyloned & pyloning seem never to have been coined.

The Ancient Greek πυλών (pulṓn; pylṓn) was used of the grand architecture seen in the entrances to temples; the usual word for doors (and gates) rather more modest was θύρα (thýra).  It was a feminine noun and appears in various forms depending on the grammatical case: θύρα (nominative singular; a door), θύρας (genitive singular; of a door) & θύραι (nominative plural; doors).  Etymologists believe θύρα underwent phonological changes, adapting to Greek morphology and pronunciation patterns while retaining its fundamental meaning tied to entryways or openings.  The word was from the primitive Indo-European dhur or dhwer (door; gateway) which was the source also of the Latin foris (door, entrance), the Sanskrit dvā́r (door, gate), the Old English duru (door) and the Old Norse dyrr (door).  Because of its functional role and its symbolism as a threshold (ie transition, entry, protection), the door played a prominent part in linguistic as well as architectural evolution.

Temple of Isis, first pylon, north-eastern view.

The Ancient Greek πυλών (pulṓn; pylṓn) was the classical term for an Egyptian ceremonial gateway (bekhenet) used in temples from at least the Middle Kingdom to the Roman period (circa 2040 BC–AD 395) and anthropologists have concluded the intent was to symbolize the horizon.  The basic structure of a pylon consisted of two massive towers of rubble-filled masonry tapering upwards, surmounted by a cornice and linked in the centre by an elaborate doorway.  Ancient depictions of pylons show that the deep vertical recesses visible along the facades of surviving examples were intended for the mounting of flag staffs.

An “anchor pylon” is one which forms the endpoint of a high-voltage line and differs from other pylons in that it uses horizontal insulators, necessary when interfacing with other modes of power transmission and (owing to the inflexibility of the conductors) when significantly altering the direction of the pylon chain.  In large-scale display advertising, a “pylon sign” is a tall sign supported by one or more poles and in the original industry jargon was something in what would now be called “portrait mode”, a sign in “landscape mode” being a “billboard”.  Not surprisingly, there are a number of mountains known as “Pylon Peak”.  The task of naming such geological features is part of the field of toponymy (in semantics, the lexicological study of place names, a branch of onomastics) and a specialist in such things is known as a toponymist.  The term was later borrowed by medicine where it was used of the nomenclature of anatomical regions.  In aviation, the “pylon turn” is a flight maneuver in which an aircraft banks into a circular turn around a fixed point on the ground.


The pylon pose: Lindsay Lohan demonstrates some variations.

In modeling, the “pylon shot” describes the pose in which a model stands with arms raised or extended outwards, resembling (at least vaguely) an electricity pylon, the appearance of which is anthropomorphic.  There are practical benefits for designers in that raising the arms permits a photographer to include more of a garment in the frame and this can be significant if there’s detailing which is at least partially concealed when the arms are in their usual position.  Topless models also adopt variations of the pose because the anatomical effect of raising the arms is to lift and to some extent re-shape the breasts, lending them temporarily a higher, more pleasing aspect.

The Pylons

The so-called “pylon poets” (referred to usually as “the Pylons”) were a group who dominated British poetry during the 1930s, a time when the form assumed a greater cultural and intellectual significance than it does today.  The best known (and certainly among the most prolific) of the Pylons were Louis MacNeice (1907–1963), Stephen Spender (1909–1995), WH Auden (1907-1973) and Cecil Day-Lewis (1904–1972), their names sometimes conflated as “MacSpaunday”.  It was Spender’s poem The Pylons which inspired the nickname, a nod to the group’s frequent use of images of “industrial modernity” drawn from new(ish) technology and the machinery of factories.  The intrusion of novel machinery and technology into a variety of fields is not unusual; in the age of steam such devices were used as similes when speculating about the operation of the human brain, just as the terminology of computers came to be so used once that lexicon entered the public imagination.  The method underlying the output of the Pylons was influenced by the metaphysical poetry of John Donne (circa 1571-1631), whose use of “scientific” imagery was much admired by TS Eliot (1888–1965), the work of whom was acknowledged as influential by both Auden and Spender.  However, the 1930s were the years of the Great Depression and probably their most fertile source was Marxist materialism although, of the Pylons, historians tend to regard only Day-Lewis as one of the “useful idiots”.

The Pylons (1933) by Stephen Spender.

The secret of these hills was stone, and cottages
Of that stone made,
And crumbling roads
That turned on sudden hidden villages
 
Now over these small hills, they have built the concrete
That trails black wire
Pylons, those pillars
Bare like nude giant girls that have no secret.
 
The valley with its gilt and evening look
And the green chestnut
Of customary root,
Are mocked dry like the parched bed of a brook.
 
But far above and far as sight endures
Like whips of anger
With lightning's danger
There runs the quick perspective of the future.
 
This dwarfs our emerald country by its trek
So tall with prophecy
Dreaming of cities
Where often clouds shall lean their swan-white neck.

The term “useful idiot” is from political science and so associated with Lenin (Vladimir Ilyich Ulyanov; 1870–1924; first leader of Soviet Russia 1917-1922 & the USSR 1922-1924) that it's attributed to him but there's no evidence he ever spoke or wrote the words.  It became popular during the Cold War to describe pro-communist intellectuals and apologists in the West, the (probably retrospective) association with Lenin arising perhaps because, had the useful idiots actually assisted in achieving a communist revolution there, their usefulness outlived, he'd likely have had at least some of them shot as "trouble-makers".  Although it took many Western intellectuals decades to recant their support for the Soviet Union (some never quite managed), the watershed was probably comrade Khrushchev's (1894–1971; Soviet leader 1953-1964) so-called "Secret Speech" (On the Cult of Personality and Its Consequences) to the 20th Congress of the Communist Party of the Soviet Union on 25 February 1956, in which he provided a detailed critique of the rule of comrade Stalin (1878-1953; Soviet leader 1924-1953), especially the bloody purges of the late 1930s.

Some had however already refused to deny what had become obvious to all but avid denialists and in 1949 a contribution by Spender appeared in The God that Failed, a collection of six essays in which the writers laid bare their sense of betrayal and disillusionment with communism because of the totalitarian state forged by comrade Stalin, in so many ways just another form of fascism.  Spender was associated with the intellectual wing of left-wing politics during the 1930s and was briefly a member of the Communist Party but his attraction seems to have been motivated mostly by the Soviet Union’s promises of equality and its anti-fascist stance.  He quickly became disillusioned with the Soviet state, unable to reconcile its authoritarianism with his personal beliefs in freedom and individual rights, a critical stance which differentiated him from figures like George Bernard Shaw (GBS; 1856-1950) and Sidney (1859–1947) & Beatrice Webb (1858–1943), the latter couple for some time definitely useful idiots.

The sort of sights which would have inspired Spender’s line “Bare like nude giant girls that have no secret”.

Louis MacNeice was politically engaged during the 1930s but that was hardly unusual among writers & intellectuals in that troubled decade.  Among the Pylons he seems to have been the most sceptical about the tenets of communism and the nature of comrade Stalin’s state and no historian seems ever to have listed him among the useful idiots, his views of the left as critical and nuanced as his views of the right.  What he most objected to was the tendency among idealistic & politically committed intellectuals to engage in a kind of reductionism which allowed them to present simplistic solutions to complex problems in a form little more than propaganda, a critique explored in his poem Autumn Journal (1939), which captures his doubts about political certainty and his disillusionment with such solutions.  Auden certainly wasn’t a “useful idiot” and while politically engaged and associated with several leftist intellectual circles during the 1930s, his sympathy for Marxism and anti-fascist causes was really not far removed from that shared by even some mainstream figures and a capacity for self-reflection never deserted him.  Much was made of the time he spent in Spain during the Spanish Civil War (1936-1939) but he went as an observer and a propagandist rather than a combatant and what he saw left him disillusioned with the ideological rigidity and in-fighting among leftist factions; he made no secret of his distaste for Stalinist communists.  By the early 1940s, he was distancing himself from Marxism, the process much accelerated by his re-embrace of Christianity where, at least debatably, he discharged another form of useful idiocy, his disapproval of collectivist ideologies apparently not extending to the Church of England.

Profiles of some electricity pylons.  There are literally dozens of variations, the designs dictated by factors such as the ground environment, proximity to people, voltage requirements, weight to be carried, economics, expected climatic conditions and a myriad of other specifics.

Of the Pylons, Cecil Day-Lewis (who served as Poet Laureate of the UK 1968-1972) had the most active engagement with communism and Marxist ideals and he was for a time politically aligned with the Soviet Union; it was a genuine ideological commitment.  During the 1930s, the true nature of the Soviet Union wasn’t generally known (or accepted) in the West and Day-Lewis admired and championed the Soviet Union as an experiment in social and economic equality; it wasn’t until late in the decade he realized the ideals he had embraced had been betrayed, the Great Purge and the Moscow show trials triggering his final disillusionment.  Day-Lewis later acknowledged the naivety and moral compromises of his earlier stance and came to argue poetry and art should not be subordinated to political ideology, a view formed by his recognition that the propagandistic pieces of his younger years had been exactly that.