
Saturday, January 10, 2026

Aphantasia

Aphantasia (pronounced ay-fan-tay-zhuh)

The inability voluntarily to recall or form mental images.

2015: The word (not the diagnosis) was coined by UK neurologist Dr Adam Zeman (b 1957), neuropsychologist Dr Michaela Dewar (b 1976) and Italian neurologist Sergio Della Sala (b 1955), first appearing in the paper Lives without imagery.  The construct was a- (from the Ancient Greek ἀ- (a-), used as a privative prefix meaning “not”, “without” or “lacking”) + phantasía (from the Greek φαντασία (“appearance”, “imagination”, “mental image” or “power of imagination”), from φαίνω (phaínō) (“to show”, “to make visible” or “to bring to light”)).  Literally, aphantasia can be analysed as meaning “an absence of imagination” or “an absence of mental imagery” and in modern medicine it’s defined as “the inability voluntarily to recall or form mental images”.  Even in Antiquity, there was some meaning shift in phantasía, Plato (circa 427-348 BC) using the word to refer generally to representations and appearances whereas Aristotle (384-322 BC) added a technical layer, his sense being a faculty mediating between perception (aisthēsis) and thought (noēsis).  It’s the Aristotelian adaptation (the mind’s capacity to form internal representations) which flavoured the use in modern neurology.  Aphantasia is a noun and aphantasic is a noun & adjective; the noun plural is aphantasics.

Scuola di Atene (The School of Athens, circa 1511), fresco by Raphael (Raffaello Sanzio da Urbino, 1483–1520), Apostolic Palace, Vatican Museums, Vatican City, Rome.  Plato and Aristotle are the figures featured in the centre.

In popular use, the word “aphantasia” can be misunderstood because of the paths taken in English by “phantasy”, “fantasy” and “phantasm”, all derived from the Ancient Greek φαντασία (phantasía) meaning “appearance, mental image, imagination”.  In English, this root was picked up via Latin and French but the multiple forms each evolved along distinct semantic trajectories.  The fourteenth-century phantasm came to mean “apparition, ghost, illusion” so was used of “something deceptive or unreal”, the connotation being “the supernatural; spectral”.  This appears to be the origin of the association of “phantas-” with unreality or hallucination rather than normal cognition.  In the fifteenth & sixteenth centuries, the spellings phantasy & fantasy were for a time interchangeable although divergence came with phantasy used in its technical senses of “mental imagery”; “faculty of imagination”; “internal representation”, this a nod to Aristotle’s phantasía.  Fantasy is the familiar modern form, used to suggest “a fictional invention; daydream; escapism; wish-fulfilment”, the connotations being imaginative constructions (in fiction), imaginative excess (in the sense of “unreality” or the “dissociative”) and indulgence (as in “speculative or wishful thoughts”).

While the word “aphantasia” didn’t exist until 2015, in the editions of the American Psychiatric Association's (APA) Diagnostic and Statistical Manual of Mental Disorders (DSM) published between 1952 (DSM-I) and 2013 (DSM-5), there was no discussion (or even mention) of a condition anything like “an inability voluntarily to recall or form mental images”.  That’s because despite being “a mental condition” induced by something happening (or not happening) in the brain, the phenomenon has never been classified as “a mental disorder”.  Instead it’s a cognitive trait or variation in the human condition and technically is a spectrum condition, the “pure” aphantasic being one end of the spectrum, the hyperphantasic (highly vivid, lifelike mental imagery, sometimes called a “photorealistic mind's eye”) the other.  That would of course imply the comparative adjective would be “more aphantasic” and the superlative “most aphantasic” but neither is a standard form.

The rationale for the “omission” was the DSM’s inclusion criteria, which require some evidence of clinically significant distress or functional impairment attributable to a condition.  Aphantasia, in isolation, does not reliably meet this threshold in that many individuals have for decades functioned entirely “normally” without being aware they’re aphantasic while others presumably have died of old age in similar ignorance.  That does of course raise the intriguing prospect the mental health of some patients may have been adversely affected by the syndrome only upon a clinician informing them of their status, thus making them realize what they were missing.  This, the latest edition of the DSM (DSM-5-TR (2022)) does not discuss.  The DSM does discuss imagery and perceptual phenomena in the context of other conditions (PTSD (post-traumatic stress disorder), psychotic disorders, dissociative disorders etc), but these references are to abnormal experiences, not the lifelong absence of imagery.  To the DSM’s editors, aphantasia remains a recognized phenomenon, not a diagnosis.

Given that aphantasia concerns aspects of (1) cognition, (2) inner experience and (3) mental representation, it wouldn’t seem unreasonable to expect the condition now described as aphantasia would have appeared in the DSM, even if only in passing or in a footnote.  However, in the seventy years between 1952-2022, over eight editions, there is no mention, even in DSM-5-TR (2022), the first volume released since the word was coined in 2015.  That apparently curious omission is explained by the DSM never having been a general taxonomy of mental phenomena.  Instead, it’s an (ever-shifting) codification of the classification of mental disorders, defined by (1) clinically significant distress and/or (2) functional impairment and/or (3) a predictable course, prognosis and treatment relevance.  As a general principle, the mere existence of an aphantasic state meets none of these criteria.

Crooked Hillary Clinton in orange Nina McLemore pantsuit, 2010.  If in response to the prompt "Imagine a truthful person" one sees an image of crooked Hillary Clinton (b 1947; US secretary of state 2009-2013), that is obviously wrong but it is not an instance of aphantasia because the image imagined need not be correct, it needs just to exist.

The early editions (DSM-I (1952) & DSM-II (1968)) were slanted heavily to the psychoanalytic, focusing on psychoses, neuroses and personality disorders with no systematic treatment of cognition as a modular function; the matter of mental imagery (even as abstract thought separated from an imagined image), let alone its absence, is wholly ignored.  Intriguingly, given what was to come in the field, there was no discussion of cognitive phenomenology beyond gross disturbances (ie delusions & hallucinations).  Even with the publication of the DSM-III (1980) & DSM-III-R (1987), advances in scanning and surgical techniques, cognitive psychology and neuroscience seem to have made little contribution to what the DSM’s editorial board decided to include and although DSM-III introduced operationalized diagnostic criteria (as a part of a more “medicalised” and descriptive psychiatry), the entries still were dominated by a focus on dysfunctions impairing performance, the argument presumably that it was possible (indeed, probably typical) for those with the condition to lead full, happy lives; the absence of imagery ability thus was not considered a diagnostically relevant variable.  Even in sections on (1) amnestic disorders (a class of memory loss in which patients have difficulty forming new memories (anterograde) or recalling past ones (retrograde), not caused by dementia or delirium but a consequence of brain injury, stroke, substance abuse, infections or trauma, with treatment focusing on the underlying cause and rehabilitation), (2) organic mental syndromes or (3) neuro-cognitive disturbance, there was no reference to voluntary imagery loss as a phenomenon in its own right.

Although substantial advances in cognitive neuroscience meant that by the 1990s neuropsychological deficits were better recognised, both the DSM-IV (1994) and DSM-IV-TR (2000) continued to be restricted to syndromes with behavioural or functional consequences.  In a way that was understandable because the DSM still was seen by the editors as a manual for working clinicians who were most concerned with helping those afflicted by conditions with clinical salience; the DSM has never wandered far into subjects which might be matters of interesting academic research and mental imagery continued to be mentioned only indirectly, hallucinations (percepts without stimuli) and memory deficits (encoding and retrieval) both discussed only in consequence of their effect on a patient, not as phenomena.  The first edition for the new century was DSM-5 (2013) and it did include discussions of major and mild neuro-cognitive disorders, reflecting the publication’s enhanced alignment with neurology but even then, imagery ability was not assessed or scaled: not possessing the power of imagery was not listed as a symptom, specifier or associated feature.  So there has never in the DSM been a category for benign cognitive variation and that is a product of a deliberate editorial stance rather than an omission, many known phenomena not psychiatrised unless in some way “troublesome”.

The term “aphantasia” was coined to describe individuals who lack voluntary visual mental imagery, often discovered incidentally and not necessarily associated with brain injury or psychological distress.  In 2015 the word was novel but the condition had been documented for more than a century, Sir Francis Galton (1822–1911) in a paper published in 1880 describing what would come to be called aphantasia.  That work was a statistical study on mental imagery which doubtless was academically solid but Sir Francis’s reputation later suffered because he was one of the leading lights in what was in Victorian times (1837-1901) the respectable discipline of eugenics.  Eugenics rightly became discredited so Sir Francis was to some extent retrospectively “cancelled” (something like the Stalinist concept of “un-personing”) and these days his seminal contribution to the study of behavioural genetics is acknowledged only grudgingly.

Galton in 1880 noted a wide variation in “visual imagination” (ie it was understood as a “spectrum condition”) and in the same era, in psychology publications the preferred term seems to have been “imageless thought”.  In neurology (and trauma medicine generally) there were many reports of patients losing the power of imagery after a brain injury but no agreed name was ever applied because the interest was more in the injury.  Unawareness that some people simply lacked the faculty presumably must have prevailed among the general population because, as Galton wrote: “To my astonishment, I found that the great majority of the men of science to whom I first applied, protested that mental imagery was unknown to them, and they looked on me as fanciful and fantastic in supposing that the words “mental imagery” really expressed what I believed everybody supposed them to mean. They had no more notion of its true nature than a colour-blind man who has not discerned his defect has of the nature of colour.”

His paper must have stimulated interest because one psychologist reported some subjects possessing what he called a “typographic visual type” imagination in which ideas (which most would visualize as an image of some sort) would manifest as “printed text”.  That was intriguing because, in the same way a computer in some respects doesn’t distinguish between an image file (jpeg, TIFF, webp, avif etc) which is a picture of (1) someone and one which is a picture of (2) their name in printed form, it would seem to imply at least some who are somewhere on the aphantasia spectrum retain the ability to visualize printed text, just not the object referenced.  Professor Zeman says he first became aware of the condition in 2005 when a patient reported having lost the ability to visualize following minor surgery and after the case was in 2010 documented in the medical literature in the usual way, it provoked a number of responses in which multiple people informed Zeman they had never in their lifetime been able to visualize objects.  This was the origin of Zeman and his collaborators coining “congenital aphantasia”, describing individuals who never enjoyed the ability to generate voluntary mental images.  Because it was something which came to general attention in the age of social media, great interest was triggered in the phenomenon and a number of “on-line tests” were posted, the best-known of which was the request for readers to “imagine a red apple” and rate their “mind's eye” depiction of it on a scale from 1 (photorealistic visualisation) through to 5 (no visualisation at all).  For many, this was variously (1) their first realization they were aphantasic or (2) an appreciation their own ability or inability to visualise objects was not universal.

How visualization can manifest: Lindsay Lohan and her lawyer in court, Los Angeles, December 2011.  If an aphantasic person doesn't know about aphantasia and doesn't know other people can imagine images, their life is probably little different from anyone else's; it's just their mind has adapted to handle concepts in another way.

Top right: “Normal” visualization (thought to be possessed by most of the population) refers to the ability to imagine something like a photograph of what’s being imagined.  This too is a spectrum condition in that some will be able to imagine an accurate “picture”, something like an HD (high definition) photograph, while others will “see” something less detailed, sketchy or even wholly inaccurate.  However, even if, when asked to visualize “an apple”, one instead “sees a banana”, that is not an indication of aphantasia, a condition which describes only an absence of an image.  Getting it that wrong is an indication of something amiss but it’s not aphantasia.

Bottom left: “Seeing” text in response to being prompted to visualize something was the result Galton in 1880 reported as such a surprise.  It means the brain understands the concept of what is being described; it just can’t be imagined as an image.  This is one manifestation of aphantasia but it’s not related to the “everything is text” school of post-modernism.  Jacques Derrida’s (1930-2004) fragment “Il n'y a pas de hors-texte” (literally “there is no outside-text”) is one of the frequently misunderstood phrases from the murky field of deconstruction but it has nothing to do with aphantasia (although dedicated post-modernists probably could prove a relationship).

Bottom right: The absence of any image (understood as a “blankness” which does not necessarily imply “whiteness” or “blackness” although this is the simple way to illustrate the concept), whether of text or something to some degree photorealistic, is classic aphantasia.  The absence does not mean the subject doesn’t understand the relevant object or concept; it means only that their mental processing does not involve imagery and for as long as humans have existed, many must have functioned in this way, their brains adapted to the imaginative range available to them.  What this must have meant was many became aware of what they were missing only when the publicity about the condition appeared on the internet, an interesting example of “diagnostic determinism”.

WebMD's classic Aphantasia test.

The eyes are an out-growth of the brain and WebMD explains aphantasia is caused by the brain’s visual cortex (the part of the brain that processes visual information from the eyes) “working differently than expected”, noting the often quoted estimate of it affecting 2-4% of the population may be understated because many may be unaware they are “afflicted”.  It’s a condition worthy of more study because aphantasics handle the characteristic by processing information differently from those who rely on visual images.  There may be a genetic element in aphantasia and there’s interest too among those researching “Long Covid” because the symptom of “brain fog” can manifest much as does aphantasia.

Aphantasia may have something to do with consciousness because aphantasics can have dreams (including nightmares) which can to varying degrees be visually rich.  There’s no obvious explanation for this: aphantasia is the inability voluntarily to generate visual mental imagery while awake, whereas dreaming is an involuntary perceptual experience generated during sleep; both are mediated by neural mechanisms which clearly are not identical but presumably must overlap.  The conclusions from research at this stage remain tentative but the current neuro-cognitive interpretation seems to suggest voluntary (conscious) imagery relies on top-down activation of the visual association cortex while (unconscious) dream imagery relies more on bottom-up and internally driven activation during REM (rapid eye movement) sleep.  What that would seem to imply is that in aphantasia, the former pathway is impaired (or at least inaccessible), while the latter may remain intact (or accessible).

The University of Queensland’s illustration of the phantasia spectrum.

The opposite syndrome is hyperphantasia (having extremely vivid, detailed and lifelike mental imagery) which can be a wonderful asset but can also be a curse, rather as hyperthymesia (known also as HSAM (Highly Superior Autobiographical Memory) and colloquially as “total recall”) can be disturbing.  Although it seems not to exist in the sense of “remembering everything, second-by-second”, there are certainly those who have an extraordinary recall of “events” in their life and this can have adverse consequences for mental health because one of the mind’s “defensive mechanisms” is forgetting or at least suppressing memories which are unwanted.  Like aphantasia & hyperphantasia, hyperthymesia is not listed by the DSM as a mental disorder; it is considered a rare cognitive trait or neurological phenomenon although, like the imagery conditions, it can have adverse consequences and these include disturbing “flashbacks”, increased rumination and increased rates of anxiety or obsessive tendencies.

Thursday, January 8, 2026

Foxbat

Foxbat or fox-bat (pronounced foks-bat)

(1) As Foxbat, the NATO (North Atlantic Treaty Organization) reporting name for the Soviet-era MiG-25 (Mikoyan-Gurevich MiG-25) high-altitude supersonic interceptor and reconnaissance aircraft.

(2) A common name for members of Pteropus (suborder Yinpterochiroptera, formerly Megachiroptera), a genus of megabats including some of the largest bats in the world.

Fox is from the Middle English fox, from the Old English fox (fox), from the Proto-West Germanic fuhs, from the Proto-Germanic fuhsaz (fox), from the primitive Indo-European puḱsos (the tailed one), derived possibly from puḱ- (tail).  It was cognate with the Scots fox (fox), the West Frisian foks (fox), the Fering-Öömrang North Frisian foos, the Sölring and Heligoland fos, the Dutch vos (fox), the Low German vos (fox), the German Fuchs (fox), the Icelandic fóa (fox), the Tocharian B päkā (tail, chowrie), the Russian пух (pux) (down, fluff) and the Sanskrit पुच्छ (púccha) (source of the Torwali پوش (pūš) (fox) and the Hindi पूंछ (pūñch) (tail)).

Bat in the context of the animal was a dialectal variant (akin to the dialectal Swedish natt-batta) of the Middle English bake & balke, of North Germanic origin.  The Scandinavian forms were the Old Swedish natbakka, the Old Danish nathbakkæ (literally “night-flapper”) and the Old Norse leðrblaka (literally “leather-flapper”).  The Old English word for the animal was hreremus, from hreran (to shake) and it was known also as the rattle-mouse, an old dialectal word for "bat", attested from the late sixteenth century.  A rarer form, noted from the 1540s, was flitter-mouse (the variants were flinder-mouse & flicker-mouse) in imitation of the German fledermaus (bat) from the Old High German fledaron (to flutter).  In Middle English “bat” and “old bat” were used as a (derogatory) term to describe an old woman, perhaps a suggestion of witchcraft rather than a link to bat as "a prostitute who plies her trade by night".  It’s ancient slang and one etymologist noted the French equivalent hirondelle de nuit (night swallow) was "more poetic".  To “bat the eyelids” is an Americanism from 1847, an extension of the earlier (1610s) meaning "flutter (the wings) as a hawk", a variant of bate.  Fox-bat is a noun; the noun plural is fox-bats.  When used of the MiG-25 (as "Foxbat", the NATO reporting name), it's a proper noun and thus used with an initial capital.

Fox-bat in flight.

The term fox-bat or flying fox (genus Pteropus) covers some sixty-five bat species found on tropical islands from Madagascar to Australia and north through Indonesia and mainland Asia.  Most species are primarily nocturnal and are the largest bats, some attaining a wingspan of 5 feet (1.5 m) with an overall body length of some 16 inches (400 mm).  Zoologists list fox-bats as “Old World fruit bats” (family Pteropodidae) that roost in large numbers and eat fruit and are thus a potential pest, many countries restricting their importation.  Like nearly all Old World fruit bats, flying foxes use sight rather than echolocation (a physiological process for locating distant or invisible objects (such as prey) by means of sound waves reflected back to the emitter by the objects) to navigate, despite the largely nocturnal habit of most species.  In the database maintained by the International Union for Conservation of Nature and Natural Resources (IUCN), about half of all flying fox species are listed as suffering declining populations, 15 said to be vulnerable and 11 endangered.  The fox-bats were previously classified in the suborder Megachiroptera, but most researchers now place them in the suborder Yinpterochiroptera, which also contains the superfamily Rhinolophoidea, a diverse group that includes horseshoe bats, trident bats, mouse-tailed bats and others.

MiG-25 (Mikoyan-Gurevich MiG-25).

Once the most controversial fighter in the skies, there was so much mystery surrounding the MiG-25 that US, British and NATO planners spent years spying on it with a mixture of awe and dread.  Conceived originally by Soviet designers to counter the threat posed by North American Aviation’s B-70 Valkyrie bomber, development continued even after the B-70 project, rendered redundant by advances in missile technology, was cancelled.  First flown in 1964 and entering service in 1970, nearly 1,200 were built, operated by several nations as well as the USSR.  Still able to outrun any other fighter, the only faster aircraft was the US Lockheed SR-71 Blackbird but fewer than three dozen of those were built and those were configured only for strategic reconnaissance.  When first the West became aware of the Foxbat, it caused quite a stir because, combining stunningly high speed with high altitude tolerance and a heavy weapons load, it did appear to be the long-feared platform which would render Soviet airspace immune from US penetration.  It was the threat the Foxbat was thought to pose which was influential in the direction pursued by US engineers when developing the McDonnell Douglas F-15.

A brunette-phase Lindsay Lohan in MiG-25 Foxbat T-shirt, rendered by Vovsoft as a pen drawing.

The Foxbat however never realized its apparently awesome implications.  Because the original design brief was to produce a device which could combat the fast, high-flying B-70, many of the characteristics desirable in a short-range interceptor were neglected in the quest for something which could get very high, very quickly.  At that it was breathtakingly successful but there were compromises: the fuel burn was epic and, with a very high take-off and landing speed, it could operate only from the longest runways.  Still, at what it was good at it was really good and its very presence meant the US had to plan any mission within range of a Foxbat cognizant of the threat it was thought to present.  Unbeknown to the West, at lower altitudes it presented little threat and was no dog-fighter; it was essentially a dragster built for the skies, faster in a straight line than just about anything but really not good at turning.  Its design philosophy was essentially the same as that of the Lockheed F-104 Starfighter, a US supersonic interceptor which first flew in 1954 with over 2,500 built and supplied to many air forces, the last of which wasn’t retired from active service until 2004.  An uncompromising machine built for speed, pilots dubbed it the “winged missile” and that assessment was not unrelated to it later gaining the nickname “widow-maker”; those who flew the thing described the characteristics it exhibited in low speed turns as: “banking with intent to turn”.

It wasn’t until 1976, when a Soviet defector landed a new Foxbat in Japan, that US engineers were able to examine the airframe and draw an understanding of its capabilities.  What their analysis found was that the limitations in Soviet metallurgy and manufacturing techniques had resulted in a heavy airframe, one which really couldn’t maneuver at high speeds and handled poorly at low altitudes.  The surprisingly primitive radar was of limited effectiveness in conventional combat situations against enemy fighters which, combined with the low altitude clumsiness, meant that its drawbacks tended to outweigh the advantage it had in sheer speed at altitude, something which meant less to the US since missiles had replaced the B-70 strategic bomber (which never entered production).

In its rare combat outings, those advantages did however confer the occasional benefit.  In 1971, a Soviet Foxbat operating out of Egypt used its afterburners to sustain Mach 3 for an extended duration, enabling it to outrun three pursuing Israeli F-4 Phantoms.  During the Iran-Iraq War (1980-1988), the Iraqi Air Force found them effective against old, slow machinery but sustained heavy losses when confronted with Iran’s agile F-14s; most celebrated though was probably the Foxbat’s success during the first Gulf War (1991), when one downed a US Navy F/A-18 Hornet, the type claiming both of the last two American aircraft lost in air-to-air combat.  Otherwise, the Foxbat has at low altitude proved vulnerable, the Israeli Defense Force (IDF) shooting down several in the war over Lebanon (1981) although they have of late been used, most improbably, in the Syrian Civil War, the Syrian Arab Air Force, lacking a more appropriate platform, pressing the Foxbats into a ground support role, in at least one case using air-to-air missiles to attack ground targets.  The Soviet designers took note of the operating environment when developing the Foxbat’s successor, the MiG-31 (NATO reporting name Foxhound), a design which sacrificed a little of the pure speed and climb-rate in order to produce a better all-round fighter.

Usually unrelated: 1957 Morris Minor Traveller (left) and 1960 Jaguar XK150 FHC (right).  Station wagons with wood frames (real and fake) are in the US called "woodies" but the spelling "woody" also appears in UK use.

Although for the whole of the Jaguar XK150’s production run (1957-1961) the Morris Minor Traveller (1952-1973) was also being made in factories between 20 and 60-odd miles (32-100 km) distant, so different in form and function were the two it’s rare they’re discussed in the same context.  One was powered by an engine which had five times won the Le Mans 24 hour endurance classic while the other was one of several commercially-oriented variants of a small, post-war economy car, introduced in the austere England of 1948.  The Traveller did however have charm and it was also authentic in its construction, the varnished ash genuinely structural, an exoskeleton which provided the strength while the panels behind were there just to keep out the rain.  By contrast, by the mid-1950s, the US manufacturers had abandoned the method and produced “woodies” with a combination of fibreglass (fake timber) and DI-NOC (Diurno Nocturna, from the Spanish, literally “daytime-nighttime” and translated for marketing purposes as “beautiful day & night”) appliqué, an embossed vinyl or polyolefin material with a pressure-sensitive adhesive backing produced since the 1930s and perfected by Minnesota Mining & Manufacturing (3M).  In phased releases over 1957-1958, Jaguar made available the usual three versions of its XK sports car, the DHC (drophead coupé, a style which elsewhere was usually called a cabriolet or convertible) and FHC (fixed head coupé, ie coupé), later joined by the more minimalist OTS (open two-seater, a roadster) and the line was a link between the flowing lines of the 1930s and the new world, celebrated by the E-Type (1961-1974) which created such a sensation upon debut at the 1961 Geneva Motor Show.

Minor modification: 1960 Jaguar XK150 3.4 Shooting Brake (“Foxbat”).

The Morris Minor Traveller was the last true woodie in production and is now a thing in the lower reaches of the collector market but there's one less available for fans because one was sacrificed to a project by industrial chemist and noted Jaguar enthusiast, the late Geoffrey Stevens, construction undertaken between 1975-1977.  He wanted the Jaguar XK150 shooting brake the factory never made so blended an XK150 FHC with the rear compartment of a Morris Minor Traveller of similar vintage.  Mr Stevens in 1976 dubbed his hybrid creation “Foxbat” because just as a MiG-25 landing in Japan was an event so unexpected it made headlines around the world, he suspected that in the circles in which he moved, a timber-framed XK150 shooting brake would be as much a surprise.  It has been restored as a charming monument to English eccentricity and even the usually uncompromising originality police among the Jaguar community seem fond of it.  In a nice touch (and typical of an engineer’s attention to detail), a “Foxbat” badge was hand-cut, matching the original Jaguar script.  Other than the coach-work, the XK150 is otherwise “matching-numbers” (chassis number S825106DN; engine number V7435-8).

On the Wings of a Russian Foxbat: Deep Purple bootleg, 1977.

The origin of the term “bootlegging” dates from the late eighteenth century when it was used by British customs and excise officers to describe the smugglers’ trick of hiding valuables in their large sea-boots.  Since then, it’s been applied variously including to (1) the distilling, transporting and selling of unlawful liquor, (2) unlicensed copies of software and (3) unauthorized recordings of music and film.  In music, bootleg recordings began to appear in some volume in the 1960s and originally were often from live performances.  Often created from tapes of dubious quality with little or no editing, these bootlegs generally were tolerated by the industry because they tended to circulate among fans who anyway purchased the official product and were thought of as just a form of free promotional material.  Later, when things became more organized and bootleggers began distributing replicas of official releases, the attitude changed and for decades the software industry fought ongoing battles against bootleg copies (which in some non-Western markets represented in excess of 90% of installations).

On the Wings of a Russian Foxbat, re-released (in re-mastered form with bonus tracks) in 1995 as Live in California, Long Beach Arena, 1976.

Taken from a performance by the English heavy metal band Deep Purple at the Long Beach Arena, Los Angeles on 27 February 1976, the bootleg On the Wings of a Russian Foxbat was released in 1977 and was another example of the effect on popular culture of the Soviet pilot’s defection.  The link with the event in Japan was the unexpectedness: the quality of the band’s performance was surprisingly good, their reputation at the time not good (they would break up only weeks after the Long Beach show).  Additionally, the sound quality was outstanding (certainly by the usual bootleg standards), something not then easy to achieve in large venues with a raucous audience.  Curiously, the original On the Wings of a Russian Foxbat bootleg used for the cover art a picture of unsmiling soldiers from the PLA (People’s Liberation Army) of the People’s Republic of China (then usually called “Red China” or “Communist China”); presumably the bootleggers decided the star on the caps was “sufficiently Russian”.  In 1995, re-mastered, the recording (with a few bundled “extras”) was re-issued as an “official” release, the fate of many a bootleg.  With memories of the diplomatic incident in 1976 having faded, although On the Wings of a Russian Foxbat still appeared on the cover, the album was marketed as Live in California, Long Beach Arena, 1976.

Tuesday, January 6, 2026

Inamorata

Inamorata (pronounced in-am-uh-rah-tuh)

A woman with whom one is in love; a female lover.

1645-1655: From the Italian innamorata (mistress, sweetheart), noun use of the feminine form of innamorato (the noun plural innamoratos or innamorati) (lover, boyfriend), past participle of innamorare (to inflame with love), the construct being in- (in) + amore (love), from the Latin amor.  A familiar modern variation is enamor.  Inamorata is a noun; the noun plural is inamoratas.

Words like inamorata litter English and endure in their niches, not just because poets find them helpful but because they can be used to convey subtle nuances in a way a word which appears synonymous might obscure.  One might think the matter of one’s female lover might be linguistically (and sequentially) covered by (1) girlfriend, (2) fiancé, (3) wife and (4) mistress but to limit things to those is to miss splitting a few hairs.  A man’s girlfriend is a romantic partner though not of necessity a sexual one because some religions expressly prohibit such things without benefit of marriage and there are the faithful who follow these teachings.  One can have as many girlfriends as one can manage but the expectation is they should be enjoyed one at a time.  Women can have girlfriends too but (usually) they are “friends who are female” rather than anything more except of course among lesbians where the relationship is the same as with men.  Gay men too have girlfriends who are “female friends”, some of whom may be “fag hags”, a term which now is generally a homophobic slur unless used within the LGB factions of the LGBTQQIAAOP community where it can be jocular or affectionate.

A fiancé is a woman to whom one is engaged to be married, in many jurisdictions once a matter of legal significance because an offer of marriage could be enforced under the rules of contract law.  While common law courts didn’t go as far as ordering “specific performance of the contract”, they would award damages on the basis of a “breach of promise”, provided it could be adduced that three of the four essential elements of a contract existed: (1) offer, (2) certainty of terms and (3) acceptance.  The fourth component: (4) consideration (ie payment), wasn’t mentioned because it was assumed to be implicit in the nature of the exchange; a kind of deferred payment as it were.  It was one of those rarities in common law where things operated almost wholly in favor of women in that they could sue a man who changed his mind while they were free to break off an engagement without fear of legal consequences though there could be social and familial disapprobation.  Throughout the English-speaking world, the breach of promise tort in marriage matters has almost wholly been abolished, remaining on the books in a couple of US states (not all of which lie south of the Mason-Dixon Line) but even where it exists it’s now a rare action and one likely to succeed only in exceptional circumstances or where a particularly fragrant plaintiff manages to charm a particularly sympathetic judge.

The spelling fiancé (often as fiance) is now common for all purposes.  English borrowed both the masculine (fiancé) and feminine (fiancée) from the French verb fiancer (to get engaged) in the mid nineteenth century and that both spellings were used is an indication it was one of those forms which was, as an affectation, kept deliberately foreign because English typically doesn’t use gendered endings.  Both the French forms were ultimately from the Classical Latin fidare (to trust), a form familiar in law and finance in the word fiduciary, from the Latin fīdūciārius (held in trust), from fīdūcia (trust) which, as a noun & adjective, describes relationships between individuals and entities which rely on good faith and accountability.  Pronunciation of fiancé and fiancée is identical so the use of the differentiated forms faded by the late twentieth century and even publications like Country Life and Tatler which like writing with class-identifiers seem to have updated.  Anyway, because English doesn’t have word endings that connote gender, differentiating between the male and the female betrothed would seem unfashionable in the age of gender fluidity but identities exist as they’re asserted and one form or the other might be deployed as a political statement by all sides in the gender wars.

Model Emily Ratajkowski's (b 1991) clothing label is called Inamorata, a clever allusion to her blended nickname EmRata.  This is Ms Ratajkowski showing Inamorata’s polka-dot line in three aspects.

Wife was from the Middle English wyf & wif, from the Old English wīf (woman, wife), from the Proto-West Germanic wīb, from the Proto-Germanic wībą (woman, wife) and similar forms existed as cognates in many European languages.  The wife was the woman one had married and by the early twentieth century, in almost all common law jurisdictions (except those where systems of tribal law co-existed) it was (more or less) demanded one have but one at a time.  Modern variations include the “common-law wife” and the “de facto wife”.  The common-law marriage (also known as the "sui iuris (from the Latin and literally “of one's own right”) marriage", the “informal marriage” and the “non-ceremonial marriage”) is a kind of legal quasi-fiction whereby certain circumstances can be treated as a marriage for many purposes even though no formal documents have been registered, all cases assessed on their merits.  Although most Christian churches don’t long dwell on the matter, this is essentially what marriage in many cases was before the institutional church carved out its role.  In popular culture the term is used loosely to refer to just about any un-married co-habitants regardless of whether or not the status has been acknowledged by a court.  De facto was from the Latin de facto, the construct being dē (from, by) + the ablative of factum (fact, deed, act).  It translates as “in practice, what actually is regardless of official or legal status” and is thus differentiated from de jure, the construct being dē (from) + iūre (law) which describes something’s legal status.  In general use, a common-law wife and de facto wife are often thought the same thing but the latter differs in that in some jurisdictions the parameters which define the status are codified in statute whereas a common law wife can be one declared by a court on the basis of evidence adduced.

Mistress dates from 1275–1325 and was from the Middle English maistresse, from the Old & Middle French maistresse (in Modern French maîtresse), feminine of maistre (master), the construct being maistre (master) + -esse or -ess (the suffix which denotes a female form of otherwise male nouns denoting beings or persons), the now rare derived forms including the adjective mistressed and the noun mistressship.  In an example of the patriarchal domination of language, when a woman was said to have acquired complete knowledge of or skill in something, she was said to have “mastered” the topic.  A mistress (in this context) was a woman who had a continuing, extramarital sexual relationship with one man, especially a man who, in return for an exclusive and continuing liaison, provides her with financial support.  The term (like many) has become controversial and critics (not all of them feminists) have labeled it “archaic and sexist”, suggesting the alternatives “companion” or “lover” but neither conveys exactly the state of the relationship so mistress continues to endure.  The critics have a point in that mistress is both “loaded” and “gendered” given there’s no similarly opprobrious term for adulterous men but the word is not archaic; archaic words are those now rare to the point of being no longer in general use and “mistress” has hardly suffered that fate, thought-crime hard to stamp out.

This is Ms Ratajkowski showing Inamorata’s polka-dot line in another three aspects.

Inamorata was useful because while it had a whiff of the illicit, that wasn’t always true; what it did always denote was a relationship of genuine love whatever the basis, so one’s inamorata could also be one’s girlfriend, fiancé or mistress though perhaps not one’s wife, however fond one might be of her.  An inamorata would be a particular flavor of mistress, in a way “paramour” or “leman” didn't imply.  Paramour was from the Middle English paramour, paramoure, peramour & paramur, from the Old French par amor (literally “for love's sake”), the modern pronunciation apparently an early Modern English re-adaptation of the French; a paramour was a mistress, the choice between the two perhaps influenced by the former tending to the euphemistic.  The archaic leman is now so obscure that it tends to be used only by the learned as a term of disparagement against women, in the same way a suggestion of mendaciousness is thought a genteel way to call someone a liar.  Dating from 1175-1225, it was from the Middle English lemman, a variant of leofman, from the Old English lēofmann (lover; sweetheart (and attested also as a personal name)), the construct being lief + man (beloved person).  Lief was from the Middle English leef, leve & lef, from the Old English lēof (dear), from the Proto-Germanic leubaz and was cognate with the Saterland Frisian ljo & ljoo, the West Frisian leaf, the Dutch lief, the Low German leev, the German lieb, the Swedish and Norwegian Nynorsk ljuv, the Gothic liufs, the Russian любо́вь (ljubóv) and the Polish luby.  Man was from the Middle English man, from the Old English mann (human being, person, man), from the Proto-Germanic mann (human being, man) and probably ultimately from the primitive Indo-European mon (man).  A linguistic relic, leman applied originally either to men or women and had something of a romantic range.  It could mean someone of whom one was very fond or something more although usage meant the meaning quickly drifted to the latter: someone's sweetheart or paramour.  In the narrow technical sense it could still be applied to men although it has for so long been a deliberate archaic device and limited to women, that would now just confuse.

About the concubine, while there was a tangled history, there has never been much confusion.  Dating from 1250-1300, concubine was from the Middle English concubine (a paramour, a woman who cohabits with a man without being married to him) from the Anglo-Norman concubine, from the Latin concubīna, derived from cubare (to lie down), the construct being concub- (variant stem of concumbere & concumbō (to lie together)) + -ina (the feminine suffix).  The status (a woman who cohabits with a man without benefit of marriage) existed in Hebrew, Greek, Roman and other civilizations, the position sometimes recognized in law as "wife of inferior condition, secondary wife" and there’s much evidence of long periods of tolerance by religious authorities, extended both to priests and the laity.  The concubine of a priest was sometimes called a priestess, a title wholly honorary and of no religious significance; presumably though, as a vicar's wife might fulfil some role in the parish, they might have been delegated to do this and that.

Once were inamoratas: Lindsay Lohan with former special friend Samantha Ronson, barefoot in Los Cabos, Mexico, 2008.

Under Roman civil law, the parties were the concubina (feminine) and the concubinus (masculine).  Usually, the concubine was of a lower social order but the institution, though ranking below matrimonium (marriage) was a cut above adulterium (adultery) and certainly more respectable than stuprum (illicit sexual intercourse, literally "disgrace" from stupere (to be stunned, stupefied)) and not criminally sanctioned like rapere (“sexually to violate”), the past participle raptus, used as a noun, meaning "a seizure, plundering, abduction".  In Medieval Latin it meant also "forcible violation" & "kidnapping" and a misunderstanding of the context in which the word was then used has caused problems in translation ever since.  Concubinage is, in the West, a term largely of historic interest.  It describes a relationship in which a woman engages in an ongoing conjugal relationship with a man to whom she is not or cannot be married to the full extent of the local meaning of marriage.  This may be due to differences in social rank, an existing marriage, religious prohibitions, professional restrictions, or a lack of recognition by the relevant authorities.  Historically, concubinage was often entered into voluntarily because of an economic imperative.  In the modern vernacular, wives use many words to describe their husbands’ mistress(es).  They rarely use concubine.  They might however be tempted to use courtesan which was from the French courtisane, from the Italian cortigiana, feminine of cortigiano (courtier), from corte (court), from the Latin cohors.  A courtesan was a prostitute but a high-priced one who attended only to rich or influential clients, the term originating in its use of the mistresses of kings or nobles in the court, the word mistress too vulgar to be used in such circles.

Sunday, January 4, 2026

Thunk

Thunk (pronounced thunk)

(1) Onomatopoeic slang for sounds (such as the impressive thud when the doors close on pre-modern Mercedes-Benz), representing the dull sound of the impact of a heavy object striking another and coming to an immediate standstill, with neither object being broken by the impact.

(2) In computer programming, a delayed computation (known also as a closure routine).

(3) In computing, in the Scheme programming language, a function or procedure taking no arguments.

(4) In computing, a specialized subroutine in operating systems where one software module is used to execute code in another or inject an additional calculation into another subroutine; a mapping of machine data from one system-specific form to another, usually for compatibility reasons, to allow a 16-bit program to run on a 32-bit operating system.

(5) In computing, to execute code by means of a thunk.

(6) As “get thunked” or “go thunk yourself”, an affectionate insult among the nerdiest of programmers.

(7) In colloquial use, a past tense form of think (the standard form being "thought").  Usually it's used humorously but, if erroneous, it's polite not to correct the mistake.

1876: The first documented instance as incorrect English is from 1876 but doubtlessly it had been used before and there’s speculation it may have been a dialectal form in one or more places before dying out.  There being no oral records and with nothing in writing prior to 1876, the history is unknown.  As an echoic of the sound of impact, it’s attested from 1952.  Although occasionally heard in jocular form, except in computing, thunk is non-standard English, used as a past tense or past participle of think.  The mistake is understandable given the existence of drink/drunk, sink/sunk etc so perhaps it’s a shame (like brung from bring) that it’s not a standard form except in computing.  The numerical value of thunk in Chaldean Numerology is 4; the value in Pythagorean Numerology is 2.  Thunk & thunking are nouns & verbs, thunker is a noun and thunked is a verb; the noun plural is thunks, the present participle thunking and the simple past and past participle thunked.  The adjective thunkish is non-standard but is used in engineering and programming circles.

Getting thunked

The origin of the word as used to describe a number of variations of tricks in programming is contested, the earliest explanations dating from 1961, all onomatopoeic abstractions of computer programming.  One holds a thunk is the (virtual) sound of data hitting the stack (some say hitting the accumulator).  Another suggestion is that it’s the sound of the expression being unfrozen at argument-evaluation time.  The most inventive is that it was coined during an after-midnight programming session when it was realized the type of an argument in Algol 60 could be figured out in advance with a little compile-time thought, simplifying the evaluation machinery.  In other words, it had "already been thought of"; thus it was christened a "thunk", which is “the past tense of ‘think’ at two in the morning when most good programming is done on a diet of Coca-Cola and pizza”.


Door closing on 1967 Mercedes-Benz 230 S.  Until the 1990s, the quality of even the low-end Mercedes-Benz models was outstanding and the doors closed with a satisfying thunk.

Thunking as a programming concept does seem to have been invented in 1961 as “a chunk of code which provides an address”, a way of binding parameters to their formal definitions in procedure calls.  If a procedure is called with an expression in the place of a formal parameter, the compiler generates a thunk which computes the expression and leaves the address of the result in some standard location.  Its usefulness was such it was soon generalised into: an expression, frozen with its environment for later evaluation if and when needed (that pairing being the closure), the process of unfreezing thunks called forcing.  As operating systems evolved into overlay-rich environments, the thunk became a vital stub-routine to load and jump to the correct overlay, Microsoft and IBM both defining the mapping of the 16-bit Intel environment with segment registers and 64K address limits whereas 32 & 64-bit systems had flat addressing and semi-real memory management.  Thunking permits multiple environments to run on the same computer and operating system and to achieve this, there’s the universal thunk, the generic thunk and the flat thunk, the fine distinctions of which only programmers get.  In another example of nerd humor, a person can be said to have their activities scheduled in a thunk-like manner, the idea being they need “frequently to be forced to completion”, especially if the task is boring.
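To make the “frozen expression” idea concrete, below is a minimal sketch in Python (a language chosen purely for illustration; the historical thunks were Algol 60 compiler artifacts and make_thunk is a hypothetical helper, not any standard API).  The computation is frozen together with its arguments and evaluated only when “forced”, after which the cached result is reused.

def make_thunk(expression, *args):
    # Freeze a computation (an expression plus its arguments) for later evaluation.
    evaluated = False
    result = None

    def force():
        # "Forcing" the thunk: evaluate once on first call, then reuse the cached result.
        nonlocal evaluated, result
        if not evaluated:
            result = expression(*args)
            evaluated = True
        return result

    return force

# Usage: nothing is computed until the thunk is forced.
expensive = make_thunk(lambda x: x ** 2, 12345)
print(expensive())  # 152399025 -- computed here, on the first force
print(expensive())  # 152399025 -- returned from cache, not recomputed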

Lindsay Lohan and her lawyer in court, Los Angeles, December 2011.

So it’s a bit nerdy but modern operating systems need thunking because 8, 16, 32 and 64-bit routines can need to run both concurrently and interactively on the same machine (real or virtual).  If a 32-bit application sends something containing pointers or other width-sensitive data types to a 64-bit handler, without thunking the call will fail because the precise address can’t be resolved.  Although not literally true, it’s easiest to visualise thunking as a translation layer.
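As a loose illustration of that “translation layer” notion (a hedged sketch in Python, not how any real operating system implements its thunk layer; the record layout and field names are invented for the example), the standard struct module can widen a record laid out for a 32-bit caller into the layout a 64-bit handler expects.

import struct

def thunk_32_to_64(buf32: bytes) -> bytes:
    # Translate a record of (int32 id, uint32 address) into the
    # (int64 id, uint64 address) layout a 64-bit handler expects.
    record_id, address = struct.unpack("<iI", buf32)  # unpack the 32-bit layout
    return struct.pack("<qQ", record_id, address)     # repack, widened to 64 bits

# Usage: the 32-bit side packs, the thunk widens, the 64-bit side unpacks.
buf = struct.pack("<iI", 7, 0xDEADBEEF)
print(struct.unpack("<qQ", thunk_32_to_64(buf)))  # (7, 3735928559)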

IBM OS/2 2.0 in shrink-wrap, 1992.

Thunking first entered consumer computing at scale with IBM’s OS/2 in 1987, an operating system still in surprisingly wide use and supported by IBM until early in the twenty-first century.  Interestingly, although both OS/2 and its successor eComStation have been unavailable for years, in August 2017 a private project released ArcaOS, an x86 operating system derived from OS/2 and, for personal use, it retails at US$129.00.  Like OS/2, it has some features which are truly unique, such as the ability, for the handful of souls on the planet who either need or wish to do so, simultaneously to run multiple 8, 16 and 32-bit text-mode sessions (including some internally booting different operating systems in segregated memory) in their hundreds on the one physical machine.  First done in 1992 on OS/2 2.0, it’s still quite a trick and the on-line OS/2 Museum hosts an active community, development continuing.

Saturday, January 3, 2026

Defiant

Defiant (pronounced dih-fahy-uhnt)

Characterized by defiance or a willingness to defy; boldly resistant or challenging.

1830s: From the French défiant, from the Old French, present participle of the verb défier (to challenge, defy, provoke), the construct thus def(y) + “i” + -ant.  Defy dates from the mid thirteenth century and was from the Middle English defien, from the Old French desfier, from the Vulgar Latin disfidare (renounce one's faith), the construct being dis- (away) + fidus (faithful).  The construct in French was thus des- (in the sense of negation) + fier (to trust) (from the Vulgar Latin fīdāre, from the Classical Latin fīdere (to trust)).  In the fourteenth century, the meaning shifted from “be disloyal” to “challenge”.  The suffix -ant was from the Middle English -ant & -aunt, partly from the Old French -ant, from Latin -āns; and partly (in adjectival derivations) a continuation of the use of the Middle English -ant, a variant of -and & -end, from the Old English -ende (the present participle ending).  Extensively used in the sciences (especially medicine and pathology), the agent noun was derived from the verb.  It was used to create adjectives (1) corresponding to a noun in -ance, having the sense of "exhibiting (the condition or process described by the noun)" and (2) derived from a verb, having the senses of: (2a) "doing (the verbal action)", and/or (2b) "prone/tending to do (the verbal action)".  In English, many of the words to which -ant was appended were not coined in English but borrowed from the Old French, Middle French or Modern French.  The negative adjectival forms are non-defiant & undefiant although there is a kind of middle ground described by quasi-defiant, semi-defiant & half-defiant, the latter three sometimes used of military conflicts where, for whatever reason, it’s been necessary (or at least desirable) for a force to offer a “token resistance” prior to an inevitable defeat.  The adjective over-defiant refers to a resistance or recalcitrance, the extent or duration of which is not justified by the circumstances; in such cases the comparative is “more defiant” and the superlative “most defiant”.  Defiant is a noun & adjective, defiantness is a noun and defiantly is an adverb; the noun plural is defiants.

Defiance in politics: use with caution

The commonly used synonyms include rebellious, direful, truculent, insolent, recalcitrant, refractory, contumacious & insubordinate but in diplomacy, such words must be chosen with care because what in one context may be a compliment, in another may be a slight.  This was in 1993 discovered by Paul Keating (b 1944; Prime Minister of Australia 1991-1996) who labelled Dr Mahathir bin Mohamad (b 1925; prime minister of Malaysia 1981-2003 & 2018-2020) “recalcitrant” when the latter declined to attend a summit meeting of the Asia-Pacific Economic Cooperation (APEC).  For historic reasons, Dr Mahathir was sensitive to the memories of the imperialist oppressors telling colonized people what to do and interpreted Mr Keating’s phrase as a suggestion he should be more obedient (“obedient” the most commonly used antonym of “defiant”, others including “submissive”).  Things could quickly have been resolved (Dr Mahathir being of the “forgive but not forget” school of IR (international relations)) but, unfortunately, Mr Keating was brought up in the gut-wrenching “never apologize” tradition of the right-wing of the New South Wales (NSW) Labor Party so what could have been handled as a clumsy linguistic gaffe was allowed to drag on.

Circa 1933 Chinese propaganda poster featuring a portrait of Generalissimo Chiang Kai-shek (Chiang Chung-cheng).  Set in an oval frame below flags alongside stylized Chinese lettering, the generalissimo is depicted wearing his ceremonial full-dress uniform with decorations.

The admission an opponent is being “defiant” must also sometimes be left unsaid.  Ever since Generalissimo Chiang Kai-shek (1887-1975; leader of the Republic of China (mainland) 1928-1949 & the renegade province of Taiwan 1949-1975) in 1949 fled mainland China, settling on and assuming control of the island of Taiwan, the status of the place has been contested, most dramatically in the incidents which flare up occasionally in the straits between the island and the mainland, remembered as the First (1954–1955), Second (1958) and Third (1995-1996) Taiwan Strait Crises which, although sometimes in retrospect treated as sabre rattling or what Hun Sen (b 1952; prime minister of Cambodia (in one form or another) 1985-2023) might have called “the boys letting off steam”, were at the time serious incidents, each with the potential to escalate into something worse.  Strategically, the first two crises were interesting studies in Cold War politics, the two sides at one stage exchanging information about when and where their shelling would be aimed, permitting troops to be withdrawn from the relevant areas on the day.  The better to facilitate administrative arrangements, each side’s shelling took place on alternate days, satisfying honor on both sides.  The other landmark issue was China’s seat at the United Nations (UN), held by the Republic of China (ROC) (Taiwan) between 1945-1971 and by the People’s Republic of China (PRC) (the mainland) since.

Jiefang Taiwan, xiaomie Jiangzei canyu (Liberate Taiwan, and wipe out the remnants of the bandit Chiang) by Yang Keyang (楊可楊) and Zhao Yannian (趙延年). 

A 1954 PRC propaganda poster printed as part of an anti-Taiwan campaign during the First Taiwan Strait Crisis (1954-1955), Generalissimo Chiang Kai-shek depicted as a scarecrow erected on Taiwan by the US government and military.  Note the color of the generalissimo's cracked and disfigured head (tied to a pole) and the similarity to the color of the American also shown.  The artists have included some of the accoutrements often associated with Chiang's uniforms: white gloves, boots and a ceremonial sword.  The relationship between Chiang and the leaders of the PRC who defeated his army, Chairman Mao (Mao Zedong, 1893–1976; paramount leader of the PRC 1949-1976) and Zhou Enlai (1898–1976; PRC premier 1949-1976), was interesting.  Even after decades of defiance in his renegade province, Mao and Zhou still referred to him, apparently genuinely, as “our friend”, an expression which surprised both Richard Nixon (1913-1994; US president 1969-1974) and Henry Kissinger (1923-2023; US national security advisor 1969-1973 & secretary of state 1973-1977) who met the chairman and premier during their historic mission to Peking in 1972.

A toast: Comrade Chairman Mao Zedong (left) and Generalissimo Chiang Kai-shek (right), celebrating the Japanese surrender, Chongqing, China, September 1945.  After this visit, they would never meet again.

Most people, apparently even within the PRC, casually refer to the place as “Taiwan” but state and non-governmental entities, anxious not to upset Beijing, use a variety of terms including “Chinese Taipei” (the International Olympic Committee (IOC) and the Fédération Internationale de Football Association (FIFA, the International Federation of Association Football) & its continental confederations (AFC, CAF, CONCACAF, CONMEBOL, OFC and UEFA)), “Taiwan District” (the World Bank) and “Taiwan Province of China” (the International Monetary Fund (IMF)).  Taiwan's government uses an almost declarative “Republic of China”, the name adopted for China after the fall of the Qing dynasty and used between 1912-1949, and even “Chinese Taipei” isn't without controversy, “Taipei” being the Taiwanese spelling whereas Beijing prefers “Taibei”, the spelling used in the mainland's Pinyin system.  There have been variations on those themes and there's also the mysterious “Formosa”, use of which persisted in the English-speaking world well into the twentieth century, despite the Republic of Formosa existing on the island of Taiwan for only a few months in 1895.  The origin of the name lies in the island in 1542 being named Ilha Formosa (beautiful island) by Portuguese sailors who had noticed it didn't appear on their charts.  From there, most admiralties in Europe and the English-speaking world updated their charts, use of Formosa not fading until the 1970s.

All that history is well-known, if sometimes subject to differing interpretations, but some mystery surrounds the term “renegade province”, used in recent years with such frequency that a general perception seems to have formed that it's Beijing's official (or at least preferred) description of the recalcitrant island.  It's certainly not, yet in both the popular press and specialist journals, the phrase “renegade province” is habitually used to describe Beijing's view of Taiwan.  Given that Beijing actually calls Taiwan the “Taiwan Province” (sometimes styled as “Taiwan District” but there seems no substantive difference in meaning) and has explicitly maintained it reserves the right to reclaim the territory (by military invasion if need be), it's not unreasonable to assume the phrase does reflect the politburo's view but within the PRC, “renegade province” is so rare (in Chinese or English) as to be effectively non-existent, the reason said to be that rather than a renegade, the island is thought of as a province pretending to be independent; delusional rather than defiant.  Researchers have looked into when the phrase “renegade province” was first used in English to describe Taiwan.  There may be older or more obscure material which isn't indexed or hasn't been digitized but of that which can be searched, the first reference appears to be in a US literary journal from 1973 (which, it later transpired, received secret funding from the US Central Intelligence Agency (CIA)).  It took a while to catch on but, after appearing first in the New York Times in 1982, it became a favorite during the administration of Ronald Reagan (1911-2004; US president 1981-1989) and has been part of the standard language of commentary since.  Diplomats, aware of Beijing's views on the matter, tend to avoid the phrase, maintaining the “delusional rather than defiant” line.

Picture of defiance: Official State Portrait of Vladimir Putin (2002), oil on canvas by Igor Babailov (b 1965).

The idea of a territory being a “renegade province” can be of great political, psychological (and ultimately military) significance.  The core justification used by Mr Putin (Vladimir Vladimirovich Putin; b 1952; president or prime minister of Russia since 1999) when explaining why his “special military operation” against Ukraine in 2022 was not an “invasion” or “war of aggression” (he probably concedes it may be a “state of armed conflict”) was his denial that Ukraine was a sovereign, independent state and his insistence Volodymyr Zelenskyy (b 1978; president of Ukraine since 2019) was not a legitimate president.  In other words, Ukraine is merely a region of modern Russia in something of the way it was once one of the 15 constituent SSRs (Soviet Socialist Republics) of the Soviet Union.  Although the Kremlin doesn't use the phrase, in Mr Putin's world view, Ukraine is a renegade province and he likely believes that applies also to the Baltic States (Latvia, Lithuania & Estonia) and possibly other former SSRs.  Like many, the CCP is watching events in Ukraine with great interest and, as recent “exercises” seem to suggest the People's Liberation Army (PLA) has sufficiently honed its techniques to execute either a blockade (which would be an “act of war”) or a “quarantine” (which would not), the attention of Western analysts is now focused on the hardly secret training being undertaken to perfect what's needed for the triphibious operations demanded by a full-scale invasion.  The US think-tanks which think much about this possibility have suggested “some time” in 2027 as the likely point at which the military high command would assure the CCP's central committee such a thing is possible.  What happens then will depend upon (1) the state of things in the PRC and (2) the CCP's assessment of how the long-term “strategic ambiguity” of Washington would manifest were an attempt made to finish the “unfinished business” of 1949.

Lindsay Lohan, who has lived a life of defiance.

The objectification of women's body parts has of course been a theme in Western culture since at least Antiquity but rarely can as much attention have been devoted to a single fingernail as the one photographed on Lindsay Lohan's hand in July 2010 (during her “troubled starlet” phase).  The text printed on the fingernail was sufficiently explicit not to need an academic deconstruction of its alleged meaning, given the image was taken while she was sitting in court listening to a judge sentence her for one of her many transgressions; the consensus was the text was there to send a “defiant message”, the internet's collective conclusion (which wasn't restricted to entertainment and celebrity sites) presumably reinforced by the nail being on the middle finger.  Ms Lohan admitted to finding this perplexing, tweeting on X (then known as Twitter) it was merely a manicure and had “…nothing to do w/court, it's an airbrush design from a stencil”.  So, rather than digital defiance, it was fashion.  Attributing a motif of defiance to Ms Lohan wasn't unusual during the “troubled starlet” phase, one site assessing a chronological montage of her famous mug shots before concluding with each successive shot, “Lindsay's face becomes more defiant — a young woman hardening herself against a world that had turned her into a punch-line”.

The Boulton Paul Defiant (1939-1943)

The Parthian shot was a military tactic, used by cavalry and made famous by the Parthians, an ancient people of the Persian lands (modern-day Iran, since 1979 the Islamic Republic).  While in real or feigned retreat on horseback, the Parthian archers would, at full gallop, turn their bodies backward to shoot at the pursuing enemy.  This demanded both fine equestrian skills (a soldier's hands occupied by his bow & arrows) and great confidence in one's mount, something gained only by time spent together by man & beast.  To make the achievement more admirable still, the Parthians used neither stirrups nor spurs, relying solely on pressure from their legs to guide and control their galloping mounts and, with varying degrees of success, the tactic was adopted by many mounted military formations of the era including the Scythians, Huns, Turks, Magyars and Mongols.  The Parthian Empire existed between 247 BC-224 AD.  The Royal Air Force (RAF) tried a variation of the Parthian shot with the Boulton Paul Defiant, a single-engined fighter and Battle of Britain contemporary of the better remembered Spitfire and Hurricane.  Uniquely, the Defiant had no forward-firing armament, all its firepower being concentrated in four .303 machine guns in a turret behind the pilot.  The theory behind the design dates from the 1930s when the latest multi-engined monoplane bombers were much faster than the contemporary single-engined biplane fighters then in service.  The RAF considered its new generation of heavily-armed bombers would be able to penetrate enemy airspace and defend themselves without a fighter escort and this of course implied enemy bombers would similarly be able to penetrate British airspace with some degree of impunity.

Boulton Paul Defiant.

By 1935, the concept of a turret-armed fighter had emerged.  The RAF anticipated having to defend the British Isles against massed formations of unescorted enemy bombers and, in theory, turret-armed fighters would be able to approach formations from below or from the side and coordinate their fire.  In design terms, it was a return to what often was done early in World War I, though that had been technologically deterministic, it being then quite an engineering challenge to produce reliable and safe (in the sense of not damaging the craft's own propeller) forward-firing guns.  Deployed not as intended but as a fighter used against escorted bombers, the Defiant enjoyed considerable early success, essentially because at attack range it appeared to be a Hurricane and the German fighter pilots were of course tempted to attack from above and behind, the classic hunter's tactic.  They were of course met by the Defiant's formidable battery.  However, the Luftwaffe learned quickly, unlike the RAF which for too long persisted with its pre-war formations which were neat and precise but also excellent targets.  Soon the vulnerability of the Defiant resulted in losses so heavy its deployment was unsustainable and it was withdrawn from front-line combat.  It did though subsequently prove a useful stop-gap as a night-fighter and provided the RAF with an effective means of combating night bombing until aircraft designed for the purpose entered service.

The Trump class "battleships"

In a surprise move, the Pentagon announced the impending construction of a “new battleship class”, the first of the line (USS Defiant) to be the US Navy's “largest surface combatant built since World War II [1939-1945]”.  The initial plans call for a pair to be launched with a long-term goal of as many as two dozen, construction to begin in 2030.  Intriguingly, Donald Trump (b 1946; US president 2017-2021 and since 2025) revealed that while the Department of Defense's (it's also now the Department of War) naval architects would “lead the design”, he personally would be involved “…because I'm a very aesthetic person.”  That may sound a strange imperative when designing something as starkly functional as a warship but in navies everywhere there's a long tradition of “the beautiful ship” and the design language still in use, although much modified, is recognizably what it was more than a century earlier.  The Secretary of the Navy certainly stayed on-message, announcing the USS Defiant would be “…the largest, deadliest and most versatile and best-looking warship anywhere on the world's oceans”, adding that components for the project would “be made in every state.”  It won't however be the widest because a quirk of ship design in the US Navy is that warships tend to be limited to a beam (width) of around 33 metres (108 feet), that being the limit for vessels able to pass through the Panama Canal.

Depiction of Trump class USS Defiant issued by the US Navy, December, 2025.

By comparison with the existing surface fleet, the 35,000 ton Defiant will be impressively large although, by historic standards, the largest (non-carrier) surface combatants now in service are of modest dimensions and displacement.  The largest now afloat are the 15,000 ton Zumwalt class destroyers (which really seem to be cruisers) while the 10,000 ton Ticonderoga class cruisers (which really are destroyers) are more numerous.  So, even the Defiant will seem small compared with the twentieth century Dreadnoughts (the name of which became a generic term for “biggest battleship”), the US Iowa class displacing 60,000 tons at their heaviest while the Japanese Yamato class weighed in at 72,000.  Even those behemoths would have been dwarfed by the most ambitious of the H-Class ships in Plan Z which were on German drawing boards early in World War II.  Before reality bit hard, Adolf Hitler (1889-1945; Führer (leader) and German head of government 1933-1945 & head of state 1934-1945) left physics to the engineers and wasn't too bothered by economics.  After being disappointed by proposals that the successors to the Bismarck-class ships would have their main armament increased only from eight 15-inch (380 mm) to eight 16-inch (406 mm) cannons, he ordered OKM (Oberkommando der Marine; the Naval High Command) to design bigger ships.  That directive emerged as the ambitious Plan Z which would have demanded so much steel, essentially nothing else in the Reich could have been built.  Although not one vessel in Plan Z ever left the slipway (the facilities even to lay down the keels being non-existent), such a fleet would have been impressive, the largest (the H-44) fitted with eight 20-inch (508 mm) cannons.  Even more to the Führer's liking was the concept of the H-45, equipped with eight 31.5-inch (800 mm) Gustav siege guns.  However, although he never lost faith in the key to success on the battlefield being bigger and bigger tanks, the experience of surface warfare at sea convinced Hitler the days of the big ships were over and he would even try to persuade the navy to retire all its capital ships and devote more resources to the submarines which, as late as 1945, he hoped might still prolong the war.  Had he imposed such priorities in 1937-1938 so the Kriegsmarine (German Navy) could have entered World War II with the ability permanently to have 100 submarines engaged in high-seas raiding rather than barely a dozen, the early course of the war might radically have been different.  Germany indeed entered the war without a single aircraft carrier (the only one laid down never completed), such was the confidence that the need to confront the Royal Navy either would never arise or was years away.

The US Navy in 1940 began construction of six Iowa class battleships but only four were ever launched because it had become clear the age of the aircraft carrier and submarine had arrived; the last battleship launched was the Royal Navy's HMS Vanguard which entered service in 1946.  Although the admirals remained fond of the fine cut of her silhouette on the horizon, to the Treasury (an institution in the austere, post-war years rapidly asserting its authority over the Admiralty) the thing was a white elephant, something acknowledged even by the romantic, battleship-loving Winston Churchill (1875-1965; UK prime-minister 1940-1945 & 1951-1955) who, when in November 1953 planning a trip to Bermuda for a summit meeting with Dwight Eisenhower (1890-1969; US POTUS 1953-1961), opted to fly because “it costs Stg£30,000 if we go by Vanguard, and only £3,000 by air.”  In 1959, Vanguard was sold for scrap and broken up the next year while the last of the Iowa class ships were decommissioned in 1992 after having spent many years of their lives in a non-active reserve.  Defiant is of course a most Churchillian word and after World War I (1914-1918), he was asked by a French municipality to devise the wording for its war memorial.  He proposed:

IN WAR: RESOLUTION

IN DEFEAT: DEFIANCE

IN VICTORY: MAGNANIMITY

IN PEACE: GOODWILL

At the time, old Georges Clemenceau (1841–1929; French prime minister 1906-1909 & 1917-1920) wasn't feeling much magnanimity towards the Germans, nor was he much in the mood to extend any goodwill, so Churchill's suggestion was rejected.

Depiction of Trump class USS Defiant issued by the US Navy, December, 2025.

The conventional wisdom therefore was the days of the big warships were done and the Soviet Navy's curious decision in the 1980s to lay down five (four of which were launched) Kirov class battlecruisers seemed to confirm the view.  Although the Kremlin called the ships тяжёлый атомный ракетный крейсер (heavy nuclear-powered guided missile cruisers), admiralties in the West, still a nostalgic lot, chose to revive the old name “battlecruiser”.  The battlecruiser (essentially a battleship with less armor) was a brainchild of the naval theorists of the early twentieth century; while the concept was sound (and might have proved so in practice had the theory been followed at sea), in service it was a disappointment and none were commissioned after 1920 until the Soviets revived the idea.  As recently as 2018, NATO (North Atlantic Treaty Organization) sources were sceptical any of the Russian ships would ever return to service but in 2025 the Admiral Nakhimov (ex-Kalinin) emerged from a long and expensive re-fit & modernization to serve as the world's biggest (non-carrier) warship.  Although fast and heavily armed, concern remains about her vulnerability to missiles and torpedoes.

Depiction of Trump class USS Defiant issued by the US Navy, December, 2025.

The US Navy seems confident about the protection afforded by the Trump class's systems, claiming “the battleship [the Pentagon's term] will be capable of operating independently, as part of a Carrier Strike Group, or commanding its own Surface Action Group depending on the mission and threat environment.”  In other words, unlike an aircraft carrier, the security of the vessel does not depend on a flotilla of destroyers and other smaller escort vessels.  The first of the Trump class is projected to cost between US$10-15 billion although, on the basis of experience, few will be surprised if this number “blows out”.  The Trump class will be the flagships for the Navy's “Golden Fleet” initiative (an old naval term dating from the days of the Spanish colonial empire and nothing to do with Mr Trump's fondness for the metal).  In an age in which small, cheap UAVs (unmanned aerial vehicles, usually referred to as drones) have revolutionized warfare (on land and at sea), the return of the big ships is as interesting as it was unexpected and analysts are already writing their assessments of the prospects of success.

Although the concept wasn't new, it was late in the nineteenth century that naval architects began to apply the word “class” systematically to group ships of the same design, the pioneer being the Royal Navy although other powers soon adopted the practice.  It had long been the practice for warships to be constructed on the basis of substantially replicating existing designs and some truly were “identical” to the extent a series would now be called a “class” but before the terminology became (more or less) standardized, warships usually were described by their “rate” or “type” (first-rate ship of the line, corvette, frigate etc) although, in the usual military way, there was also much informal slang including phrases such as “the Majestic battleships” or “ships of the Iron Duke type”.  The crystallization of the “class” concept was really a result of technological determinism as the methods developed in the factories which emerged during the industrial revolution spread to ship-building; steam power, hulls of iron & steel and the associated complex machinery made design & construction increasingly expensive, thus the need to amortize investment and reduce build times by ordering ships in batches with near-identical specifications.

Navies in the era were also becoming more bureaucratic (a process which never stopped and some believe is accelerating still) and Admiralties became much taken with precise accounting and doctrinal categorisation.  The pragmatic admirals however saw no need to reinvent the wheel, “class” being already well-established in engineering and taxonomy, the choice thus an obvious administrative convenience.  The “new” nomenclature wasn't heralded as a major change or innovation, the term just beginning to appear in the 1870s in Admiralty documents, construction programmes and parliamentary papers in which vessels were listed in groups including the Devastation class ironclad turret ships (laid down 1869), the Colossus class battleships (laid down 1879) and the Admiral class battleships (1880s).  In recent history texts, warships prior to this era sometimes are referred to as “ship-of-the-line class”, “three-decker class” etc but this use is retrospective.  The French Navy adopted the convention almost simultaneously (with the local spelling classe) while Imperial Germany's Kaiserliche Marine (Imperial Navy) followed in the 1890s with Klasse.  The US Navy was comparatively late to formalise the use and although “class” in this context does appear in documents in the 1890s, the standardization wasn't complete until about 1912.

As a naming convention (“King George V class”, “Iowa class” etc), the rule is the name chosen is that of either (1) the first ship laid down or (2) the lead ship commissioned.  According to Admiralty historians, this wasn't something determined by a committee or the whim of an admiral (both long naval traditions) but was adopted because it was just so obviously practical.  It certainly wasn't an original idea because the term “class” was by the late nineteenth century well established in industrial production, civil engineering and military administration; if anything the tradition-bound admirals were late-adopters, sticking to their old classificatory habits long after they had outlived their usefulness.  With ships becoming bigger and more complex, what was needed was a system (which encompassed not only the ships but also components such as guns, torpedoes, engines etc) which grouped objects according to their defined technical specification rather than their vague “type” (which by then had become most elastic) or individual instances; naval architecture had entered the “age of interchangeability”.

A docked Boomin' Beaver.

It’s good the US Navy is gaining (appropriately large) “Trump class” warships (which the president doubtless will call “battleships” although they’re more in the “battlecruiser” tradition).  Within the fleet however there are on the register many smaller vessels and the most compact is the 19BB (Barrier Boat), a specialized class of miniature tugboat used to deploy and maintain the port security booms surrounding Navy ships and installations in port.  Over the last quarter century a dozen-odd have been commissioned of which ten remain in active service.  Unlike many of the Pentagon’s good (and expensive) ideas, the Barrier Boats were a re-purposing of an existing design, their original purpose being in the logging industry where they were used to manoeuvre logs floating along inland waterways.  In that role the loggers dubbed them “log broncs” because the stubby little craft would “rear up like a rodeo bronco” when spun around by 180°.  Sailors of course have their own slang and they (apparently affectionately) call the 19BBs the “Boomin’ Beaver”, the origin of that being uncertain but it may verge on the impolite.  It’s not known if President Trump is aware of the useful little 19BB but if brought to his attention, he may be tempted to order two of them renamed “USS Joe Biden” and “USS Crooked Hillary” although, unlike those reprobates, the Boomin’ Beavers have done much good work for the nation.

The Arc de Triomphe, Paris (left), Donald Trump with model of his proposed arch, the White House, October, 2025 (centre) and a model of the arch, photographed on the president's Oval Office desk (right).  Details about the arch remain sketchy but it's assumed (1) it will be "big" and (2) there will be some gold, somewhere.

As well as big ships (and the big Donald J Trump Ballroom already under construction where the White House’s East Wing once stood), Mr Trump is also promising a “big arch”.  A part of the president’s MDCBA (Make D.C. Beautiful Again) project, the structure (nicknamed the “Triumphal Arch” and in the style of the Arc de Triomphe which stands in the centre of the Place Charles de Gaulle (formerly the Place de l’Étoile), the western terminus of the avenue des Champs-Élysées) is scheduled to be completed in time to celebrate the nation’s 250th anniversary on 4 July 2026.  Presumably, on that day, it will be revealed the official name is something like the “Donald J Trump Sestercentennial Arch” which will appear on the structure in large gold letters.  The arch is said to be “privately funded”, using money left over from what was donated to build the ballroom, a financing mechanism which has attracted some comment from those concerned about the “buying of influence”.

Adolf Hitler’s sketch of an arch (1926, left) and Hitler, Albert Speer (1905–1981; Nazi court architect 1934-1942; Nazi minister of armaments and war production 1942-1945) and others examining Speer’s model of the arch, presented 20 April 1939 upon the occasion of the Führer’s 50th birthday (right; note the pattern in the carpet).

A model of Germania.  To give some indication of the scale, within the dome of the huge meeting hall (at the top of the image), St Peter's Basilica in Rome would have fitted several times over; the diameter of the dome would have been 250 metres (820 feet).

Commissioned to honor those who fought and died for France during the French Revolutionary (1792-1802) and Napoleonic Wars (1803-1815), construction of the Arc de Triomphe (officially the Arc de Triomphe de l'Étoile) absorbed 30-odd years between 1806-1836.  As a piece of representational architecture the structure is thought perfectly proportioned for assessment by the human eye and perhaps for this reason it has been admired by many.  As early as 1926, Adolf Hitler sketched his vision of a grand arch for Berlin and, while bitter experience taught him the big warships were a bad idea because of their vulnerability to air attack, he never lost his enthusiasm for megalomania in architecture and in Albert Speer he found the ideal architect.  Noting the dimensions in Hitler's sketch, Speer responded with something in the spirit of their blueprint for Germania.  Hitler planned the rebuilding of Berlin to be complete by 1950, less than ten years after the expected victory in a war which would have made him the master of Europe from the French border to the Ural mountains (things didn't work out well for him).  While the 50 metre (164 feet) tall Arc de Triomphe presented a monumental appearance and provided a majestic terminus for the Champs-Élysées, Speer's arch would have stood 117 metres (384 feet) in height and, though obviously substantial, it would have been entirely in scale with the rest of Germania, the whole place built in a way to inspire awe simply by virtue of sheer size.