
Tuesday, January 20, 2026

Fork

Fork (pronounced fawrk)

(1) An instrument having two or more tines (popularly called prongs), for holding, lifting, etc., as an implement for handling food or any of various agricultural tools.

(2) Something resembling or suggesting this in form or conceptually.

(3) As "tuning fork", an instrument used (1) in the tuning of musical instruments and (2) by audiologists and others involved in the study or treatment of hearing.

(4) In machinery, a type of yoke; a pronged part of any device.

(5) A generalized description of the division into branches.

(6) In physical geography and cartography, by abstraction, the point or part at which a thing, as a river or a road, divides into branches; any of the branches into which a thing divides (and used by some as a convention to describe a principal tributary of a river).

(7) In horology, (in a lever escapement) the forked end of the lever engaging with the ruby pin.

(8) In bicycle & motorcycle design, the support of the front wheel axle, having the shape of a two-tined fork.

(9) In archery, the barbed head of an arrow.

(10) To pierce, raise, pitch, dig etc, with a fork.

(11) Metonymically (and analogous with the prongs of a pronged tool), to render something to resemble a fork or describe something using the shape as a metaphor.

(12) In chess, to maneuver so as to place two of an opponent's pieces under simultaneous attack by the same piece (most associated with moves involving the knight).

(13) In computer programming, to modify a program's source code to create a version sufficiently different to be considered a separate path of development.

(14) In computer programming, as "fork bomb", a program that creates a large number of self-replicating tasks or processes in a computer system in order to cause a DoS (denial of service); a brief sketch of the mechanism appears after these definitions.

(15) To turn as indicated at a fork in a road, path etc.

(16) Figuratively, a point in time when a decision is taken.

(17) In fulminology (the scientific (as opposed to the artistic or religious) study of lightning), as "forked lightning", the type of atmospheric discharge of electricity which hits the ground in a bolt.

(18) In software development, content management & data management, figuratively (by abstraction, from a physical fork), a departure from having a single source of truth (SSOT) (unintentionally as originally defined but later also applied where the variation was intentional); metonymically, any of the instances of software, data sets etc, thus created.

(19) In World War II (1939-1945) era British military jargon, the male crotch, used to indicate the genital area as a point of vulnerability in physical assault.

(20) In occupational slang, a clipping of forklift; any of the blades of a forklift (or, in plural, the set of blades), on which the goods to be raised are loaded.

(21) In saddlery, the upper front brow of a saddle bow, connected in the tree by the two saddle bars to the cantle on the other end.

(22) In slang, a gallows (obsolete).

(23) As a transitive verb, a euphemism for "fuck", one of the variations (along with f***, ***k etc) used typically to circumvent text-based filters.

(24) In underground, extractive mining, the bottom of a sump into which the water of a mine drains; to bale a shaft dry (still often spelled forcque).

(25) As the variant chork, an eating utensil made with a combination of chopstick & fork, intended for neophyte chopstick users.

(26) In literature, as "silver fork novel", a genre in nineteenth-century English literature that depicted the lives of the upper class and the aristocracy (known also as the "fashionable novel" and "drawing room fiction").
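A note on the mechanics behind senses (13) & (14): in Unix-like operating systems a new process is created with the fork() system call, which duplicates the calling process so that parent and child thereafter follow separate paths of execution; a "fork bomb" is nothing more exotic than that call issued in an unbounded loop until the machine's process table is exhausted, while sense (13) borrows the same metaphor for a code base which is copied and thereafter developed separately.  What follows is a minimal, illustrative sketch in Python (POSIX systems only and deliberately bounded to a single fork rather than the unbounded loop of a fork bomb); the variable names and printed messages are arbitrary and it is offered only as a sketch of the mechanism, not a definitive implementation.

import os
import sys

# A single, bounded fork: the call a "fork bomb" repeats without limit.
pid = os.fork()                # after this line two processes are running
if pid == 0:                   # fork() returns 0 in the child...
    print("child process, pid:", os.getpid())
    sys.exit(0)                # ...which exits cleanly here
else:                          # ...and the child's pid in the parent
    os.waitpid(pid, 0)         # the parent waits for the child to finish
    print("parent process, pid:", os.getpid())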

Pre-1000: From the Middle English forke (digging fork), from the Old English force & forca (pitchfork, forked instrument, forked weapon; forked instrument used to torture), from the Proto-West Germanic furkō (fork), from the Latin furca (pitchfork, forked stake; gallows, beam, stake, support post, yoke), of uncertain origin.  The Middle English was later reinforced by the Anglo-Norman & Old Northern French forque (it was from the Old French forche from which French gained fourche), also from the Latin.  It was cognate with the Old Frisian forke, the North Frisian forck (fork), the Dutch vork (fork), the Danish vork (fork) and the German Forke (pitchfork).  The evolved Middle English form displaced the native Old English gafol, ġeafel & ġeafle (fork) (and the apparently regionally specific forcel (pitchfork)), though the use from circa 1200 to mean "forked stake or post used as a prop when erecting a gallows" did for a while endure, probably because of the long life of the architectural plans for a structure which demanded no change or functional improvement.  The alternative spelling forcque is used in mining and describes the "bottom of a sump".  Perhaps surprisingly, dictionaries don't list forkish or forkesque as standard adjectives.  Fork is a noun & verb, forking is a noun, verb, adjective & adverb, forklike is an adjective and forked is a verb & adjective; the noun plural is forks.

Representation of the forks of the Linux operating system.  Software forks can extend, die off or merge with other forks.

The Latin furca (in its primary sense of "fork") may be from the primitive Indo-European gherk & gherg (fork) although etymologists have never traced any explanation for the addition of the -c-, something which remains mysterious even if the word was influenced by the Proto-Germanic furkaz & firkalaz (stake, stick, pole, post) which was from the primitive Indo-European perg- (pole, post).  If such a link existed, it would relate the word to the Old English forclas (plural) (bolt), the Old Saxon ferkal (lock, bolt, bar), the Old Norse forkr (pole, staff, stick), the Norwegian fork (stick, bat) and the Swedish fork (pole).  The descendants in other languages include the Sranan Tongo forku, the Dutch vork, the Japanese フォーク (fōku), the Danish korf, the Kannada ಫೋರ್ಕ್ (phōrk), the Korean 포크 (pokeu), the Maori paoka, the Tamil போர்க் (pōrk) and the Telugu ఫోర్క్ (phōrk).  In many languages, the previous form was retained for most purposes while the English fork was adopted in the context of software development.

Forks can be designed for specific applications: this is a sardine fork, the dimensions dictated by the size of the standard sardine tin.

Although visitors from Western Europe discovered the novelty of the table fork in Constantinople as early as the eleventh century, the civilizing influence from Byzantium seems not routinely to have appeared on the tables of the English nobility until the 1400s and the evidence suggests it didn't come into common use before the early seventeenth century.  The critical moment is said to have come in 1601 when the celebrated traveller and writer Thomas Coryat (or Coryate) (circa 1577–1617) returned to London from one of his tours, bringing with him the then almost unknown "table fork" which he'd seen used in Italy.  This "continental affectation" made him the subject of mirth and playwrights dubbed him "the fork-carrying traveller" while the street was earthier, the nickname "Furcifer" (from the Latin meaning "fork-bearer, rascal") soon adopted and, despite the early scepticism, there soon were many types of "specific purpose" forks (cake fork, cocktail fork, dessert fork etc).  Mr Coryat thus made one of the great contributions to the niceties of life, his other being the introduction to the English language of the word "umbrella", another influence from Italy.

Cause and effect: The fork in the road.

In Lewis Carroll's (the pen name of Charles Lutwidge Dodgson (1832–1898)) Alice's Adventures in Wonderland (1865), when Alice comes to a fork in the road, she encounters the Cheshire Cat sitting in a tree:

Alice: "Would you tell me, please, which way I ought to go from here?"

Cat: "That depends a good deal on where you want to get to."

Alice: "I don't know."

Cat: "Then it doesn't matter which way you go."

One can see the cat's point and a reductionist like Donald Rumsfeld (1932–2021: US defense secretary 1975-1977 & 2001-2006) would have ended the exchange there but the feline proved more helpful, telling Alice she'll see the Mad Hatter and the March Hare if she goes in certain directions, implying that no matter which path she chooses, she'll encounter strange characters.  That she did and the book is one of the most enjoyable flights of whimsy in English.

The idiomatic phrase "fork in the road" wasn't in use early in the seventeenth century when translators were laboring to create the King James Bible (KJV, 1611), so "…the king of Babylon stood at the parting of the way, at the head of the two ways…" appeared, whereas by 1982 when the New King James Version (NKJV, 1982) was released, that term would have been archaic so the translation was rendered as "…the king of Babylon stands at the parting of the road, at the fork of the two roads…".

Ezekiel 21:19-23; King James Version of the Bible (KJV, 1611):

Also, thou son of man, appoint thee two ways, that the sword of the king of Babylon may come: both twain shall come forth out of one land: and choose thou a place, choose it at the head of the way to the city. Appoint a way, that the sword may come to Rabbath of the Ammonites, and to Judah in Jerusalem the defenced. For the king of Babylon stood at the parting of the way, at the head of the two ways, to use divination: he made his arrows bright, he consulted with images, he looked in the liver. At his right hand was the divination for Jerusalem, to appoint captains, to open the mouth in the slaughter, to lift up the voice with shouting, to appoint battering rams against the gates, to cast a mount, and to build a fort. And it shall be unto them as a false divination in their sight, to them that have sworn oaths: but he will call to remembrance the iniquity, that they may be taken.

Ezekiel 21:19-23; New King James Version of the Bible (NKJV, 1982):

And son of man, appoint for yourself two ways for the sword of the king of Babylon to go; both of them shall go from the same land. Make a sign; put it at the head of the road to the city. Appoint a road for the sword to go to Rabbah of the Ammonites, and to Judah, into fortified Jerusalem. For the king of Babylon stands at the parting of the road, at the fork of the two roads, to use divination: he shakes the arrows, he consults the images, he looks at the liver. In his right hand is the divination for Jerusalem: to set up battering rams, to call for a slaughter, to lift the voice with shouting, to set battering rams against the gates, to heap up a siege mound, and to build a wall. And it will be to them like a false divination in the eyes of those who have sworn oaths with them; but he will bring their iniquity to remembrance, that they may be taken.

The KJV & NKJV closely are related but do in detail differ in the language used, the objective of the latter being to enhance readability while retaining the stylistic beauty and literary structure of the original.  Most obviously, the NKJV abandoned the use of archaic words and conventions of grammar (thee, thou, ye, thy, thine, doeth, speaketh etc) which can make it difficult for modern readers to understand, rather as students can struggle with Shakespeare's text, something not helped by lecturers reminding them of its beauty, a quality which often escapes the young.  The NKJV emerged from a reaction to some of the twentieth century translations which traditionalist readers thought had "descended" too far into everyday language; it was thus a compromise between greater readability and a preservation of the original tone.  Both the KJV & NKJV primarily used the Textus Receptus (received text) for the New Testament and Masoretic Text for the Old Testament and this approach differed from other modern translations (such as the New International Version (NIV, 1978) & English Standard Version (ESV, 2001)) which used a wider sub-set of manuscripts, including older ones like the Alexandrian texts (Codex Vaticanus, Sinaiticus etc).  So, the NKJV is more "traditional" than modern translations but not as old-fashioned as the KJV and helpfully, unlike the KJV which provided hardly any footnotes about textual variants, the NKJV was generous, showing where differences existed between the major manuscript traditions (Textus Receptus, Alexandrian & Byzantine), a welcome layer of transparency but importantly, both used a formal equivalence (word-for-word) approach which put a premium on direct translation over paraphrasing, the latter technique much criticized in the later translations.

Historians of food note the word seems first to have appeared in this context (of eating utensils) in an inventory of household goods from 1430 and they suggest, because French influence in culinary matters was strongest, it was probably from the Old North French forque.  It came to be applied to rivers from 1753 and to roads by 1839.  The use in bicycle design began in 1871 and this was adopted directly within twenty years when the first motorcycles appeared.  The chess move was first so-described in the 1650s while the old slang forks ("the two forefingers") was from 1812 and endures to this day as "the fork".  In the world of cryptocurrencies, fork has been adopted with fetish-like enthusiasm to refer to (1) a split in the blockchain resulting from protocol disagreements, or (2) a branch of the blockchain resulting from such a split.

Lindsay Lohan with Tiramisu and cake-fork, Terry Richardson (b 1965) photoshoot, 2012.

The verb dates from the early fourteenth century in the sense of (1) "to divide in branches, go separate ways" & (2) "disagree, be inconsistent", both derived from the noun.  The transitive meaning "raise or pitch with a fork" is from 1812, used most frequently in the forms forked & forking; the slang verb phrase "fork (something) over" is from 1839 while "fork out" (give over) is from 1831.  The now obsolete legal slang "forking" in the forensic sense of a "disagreement among witnesses" dates from the turn of the fifteenth century.  The noun forkful was an agricultural term from the 1640s while the specialized fourchette (in reference to anatomical structures, from the French fourchette (diminutive of fourche (a fork))) was from 1754.  The noun pitchfork (fork for lifting and pitching hay etc), describing the long-used implement constructed commonly with a long handle and two or three prongs, dates from the mid fourteenth century, altered (by the influence of pichen (to throw, thrust)) from the early thirteenth century Middle English pic-forken, from pik (source of pike).  The verb use meaning "to lift or throw with a pitchfork" is noted from 1837.  The spork, an eating utensil fashioned by making several long indents in the bowl to create prongs, debuted in 1909.

Dining room of Huis Doorn.

Huis Doorn (Doorn House), near Utrecht in the Netherlands, was the country house in which the exiled Kaiser Wilhelm II (1859–1941; Emperor of Germany & King of Prussia 1888-1918) would live until his death.  Confiscated by the state at the end of World War II (1939-1945), Huis Doorn is now a museum, maintained much as the former Kaiser left it.  At his place on the dining room table sits one of his special forks with three tines, the widened one to the left a blade serving as a knife, because a congenitally withered left arm made the use of a conventional utensil too difficult.

Compelled by circumstances to abdicate at the end of World War I (1914-1918), Wilhelm was granted asylum by the neutral Netherlands, the cabinet insisting his status would be that of a private German citizen; to the status-conscious former Kaiser, it remained for the rest of his life a disappointment that Wilhelmina (1880–1962; Queen of the Netherlands 1890-1948) would neither receive nor visit him.  He'd arrived in the Netherlands accompanied by a reputed 64 train carriages of imperial household goods (furnishings, art, bibelots and such) and an unknown slice of the German exchequer, so was able to purchase and adequately decorate Huis Doorn, taking up residence in May 1920.  How much of the Imperial Treasury came with him remains a matter of speculation but until his death, he maintained a household staff sufficient to ensure "a certain grandeur".  Hermann Göring (1893–1946; leading Nazi 1922-1945, Hitler's designated successor & Reichsmarschall 1940-1945) did on several occasions pay a visit but that stopped as soon as the Nazis took power in Germany in 1933; the former sovereign had out-lived any potential usefulness to the party.  Indeed, Adolf Hitler (1889-1945; Führer (leader) and German head of government 1933-1945 & head of state 1934-1945) would have preferred if the old man had had the decency quietly to drop dead because the last thing he wanted was any possibility the monarchy might be restored.  He regarded Benito Mussolini's (1883-1945; Duce (leader) & Prime-Minister of Italy 1922-1943) greatest mistake (and there were a few) as having not deposed Victor Emmanuel III (1869–1947; King of Italy 1900-1946) when he had the chance and to his dying day suspected a conspiracy between the Freemasons and the royal court was behind the Duce's downfall in 1943.  There may be something in that because Marshal Pietro Badoglio (1871–1956; Prime Minister of Italy 1943-1944), appointed by the King as Mussolini's replacement, was a confessed Freemason.

Speciale vork voor Willem II (special fork for Wilhelm II): one of Wilhelm's silver Kaisergabeln (Imperial forks).

In a coda which would have amused those who remembered Winston Churchill's (1875-1965; UK prime-minister 1940-1945 & 1951-1955) glee at hearing the chant "Hang the Kaiser!" at the end of World War I, after the Netherlands was invaded in 1940 Churchill, fearing the Nazis might murder Germany's former ruler, offered through diplomatic channels to receive Wilhelm "with dignity and consideration" if he chose to seek refuge in the UK.  The offer was declined and he remained safely in Huis Doorn until his death, the Nazis simply ignoring him because in the euphoria of victory, there was in Germany no longer a significant pro-monarchist movement.  Churchill's offer has been treated by some historians as "a humanitarian gesture" but he always had a fondness for monarchical government (his wife called him "the last man in Europe still to believe in the divine right of kings") and it's suspected he may have pondered the idea of a restoration (possibly Crown Prince Wilhelm (1882–1951)) in constitutional form.

Der Gabelschwanz Teufel: The Lockheed P-38 Lightning (1939-1945).  During World War II, the Luftwaffe's (German air force) military slang for the twin-boomed Lockheed P-38 Lightning was Der Gabelschwanz Teufel (the fork-tailed devil).

Novelty nail-art by US restaurant chain Denny's.  The manicure uses as a base a clean, white coat of lacquer, to which was added miniature plastic utensils, the index finger a fork, the middle finger a knife, the ring finger a spoon, and the pinky finger presumably a toothpick or it could be something more kinky.

The idiomatic "speak with forked tongue" to indicate duplicitous speech dates from 1885 and was an invention of US English though reputedly influenced by phrases settlers learned in their interactions with first nations peoples (then called "Red Indians").  The earlier "double tongue" (a la "two-faced") in the same sense was from the fifteenth century.  Fork as a clipping of the already truncated fork-lift (1953), from the fork-lift truck (1946), appears to have entered the vernacular circa 1994.  The adjective forked (branched or divided in two parts) was the past-participle adjective from the verb and came into use early in the fourteenth century.  It was applied to roads in the 1520s and more generally within thirty years while the use in the sixteenth and seventeenth centuries with a suggestion of "cuckold" (on the notion of "horned") is long obsolete.  Applied in many contexts (literally & figuratively), inventions (with and without hyphens) include fork-bomb, fork-buffet, fork-dinner, fork-head, rolling-fork, fork-over, fork-off & fork-up (the latter pair euphemistic substitutions for "fuck off" & "fuck-up").

Führerspork: Spork (left) from a flatware set (right) made for Adolf Hitler's 50th birthday, sold at auction in 2018 for £12,500.  The items had been discovered in England in a house once owned by a senior military officer, the assumption being they were looted in 1945 (“souveniring” or “spoils of war” in soldiers' parlance), the items all bearing the Nazi eagle, swastika and Hitler's initials.  Auction houses can be inconsistent in their descriptions of sporks and in some cases they're listed as splayds, the designs meaning sometimes it's a fine distinction.

1979 Benelli 750 Sei (left) and Benelli factory schematic of the 750 Sei’s fork (series 2a, right).

One quirk in the use of the word is the tendency of motorcyclists to refer to the front fork as "the forks".  Used on almost every motorcycle made, the fork is an assembly which connects the front axle (and thus the wheel) to the frame, usually via a pair (upper & lower) of yokes; the fork provides both the front suspension (springs or hydraulics) and makes possible the steering.  The reason the apparatus is often called "the forks" is the two most obvious components (the left & right tubes) appear to be separate when really they are two prongs connected at the top.  Thus, a motorcycle manufacturer describes the assembly (made of many components (clamp, tubes, legs, springs, dampers etc)) as "a fork" but, because of the appearance, riders often think of them as a pair of forks, thus the vernacular "the forks".  English does have other examples of such apparent aberrations such as a "pair of spectacles" which is sold as a single item but the origin of eye-glasses was in products sold as separate lenses and users would (according to need) buy one glass (what became the monocle) or a pair of glasses.  That is a different structural creation from the bra which, on the model of a "pair of glasses", would be a "pair of something" but the word is a clipping of "brassiere".  English borrowed brassiere from the French brassière, from the Old French braciere (which was originally a lining fitted inside armor which protected the arm, only later becoming a garment), from the Old French brace (arm) although by then it described a chemise (a kind of undershirt); in the US, brassiere was used from 1893 when the first bras were advertised and from there, use spread.  The three syllables were just too much to survive the onslaught of modernity and the truncated "bra" soon prevailed, being the standard form throughout the English-speaking world by the early 1930s.  Curiously, in French, a bra is a soutien-gorge which translates literally and rather un-romantically as "throat-supporter" although "chest uplifter" is a better translation.

2004 Dodge Tomahawk.

There have been variations on the classic fork and even designs which don't use a conventional front fork, most of which have been variations on the "swinging arm", a structure which either is or tends towards the horizontal.  One of the most memorable to use swinging arms was the 2004 Dodge Tomahawk, a "motorcycle" constructed around a 506 cubic inch (8.3 litre) version of the V10s used in the Dodge Viper (1991-2010 & 2013-2017) and the concept demonstrated what imaginative engineers can do if given time, money, resources and a disconnection from reality.  Designing a 500 horsepower (370 kW) motorcycle obviously takes some thought so what they did to equalize things a bit in what would otherwise be an unequal battle with physics was use four independently sprung wheels which allowed the machine to corner with a lean (up to 45° said to be possible) although no photographs seem to exist of an intrepid rider putting this projection to the test.  Rather than a fork, swinging arms were used and while this presumably enhanced high-speed stability, it also meant the turning circle was something like that of one of the smaller aircraft carriers.  There were suggestions a top speed of some 420 mph (675 km/h) was at least theoretically possible although a sense of reality did briefly intrude and this was later revised to 250 mph (400 km/h).  In the Dodge design office, presumably it was thought safe to speculate because of the improbability of finding anyone both sufficiently competent and crazy enough to explore the limits; one would find plenty of either but the characteristics rarely co-exist.  Remarkably, as many as ten replicas were sold at a reputed US$555,000 and although (mindful of the country's litigious habits) all were non-operative and described as "art deco inspired automotive sculpture" to be admired as static displays, some apparently have been converted to full functionality although there have been no reports of top speed testing.

Britney Spears (b 1981): "Video clip with fork feature", Instagram, 11 May 2025.

Unfortunately, quickly Ms Spears deleted the more revealing version of the clip but for those pondering the messaging, Spearologists (a thoughtful crew devoted to their discipline) deconstructed the content, noting it came some days after she revealed it had been four months since she'd left her house.  The silky, strapless dress and sweat-soaked, convulsing flesh were (by her standards) uncontroversial but what may have mystified non-devotees was the fork she at times held in her grasp.  Apparently, the fork was an allusion to her earlier quote: "Shit!  Now I have to find my FORK!!!", made during what was reported as a "manic meltdown" (itself interesting in that it at least suggests the existence of "non-manic" meltdowns) at a restaurant, following the abrupt departure of her former husband (2022-2024) Hesam "Sam" Asghari (b 1994).  The link between restaurant and video clip was reports that Mr Asghari was soon to be interviewed and that there would be questions about the marriage.  One of her earlier posts had included a fork stabbing a lipstick (forks smeared with lipstick a trick also used in Halloween costuming to emulate facial scratches) and the utensil in the clip was said to be "a symbol of her frustration and emotional state."  Now we know.

Großadmiral (Grand Admiral, equivalent to an admiral of the fleet (Royal Navy) or five star (fleet) admiral (US Navy)) Alfred von Tirpitz (1849–1930; State Secretary of the German Imperial Naval Office 1897-1916).

He's remembered now for (1) his role in building up the Imperial German Navy, triggering events which would play some part in the coming of World War I, (2) his distinctive twin-forked beard and (3) being the namesake for the Bismarck class battleship Tirpitz (1939-1944) which, although she hardly ever took to the high seas and fired barely a shot in anger, merely by being moored in Norwegian fjords compelled the British Admiralty to watch her with a mix of awe and dread, necessitating keeping in home waters a number of warships badly needed elsewhere.  Such was the threat his namesake battleship represented that just the mistaken belief she was steaming into the path of a convoy (PQ 17, June 1942) of merchant ships bound for the Russian port of Archangel caused the Admiralty to issue a "scatter order" (ie disperse the convoy from the escorting warships), resulting in heavy losses.  After a number of attempts, in 1944, she finally was sunk in a raid by RAF (Royal Air Force) bombers but, because some of the capsized hull remained visible above the surface, some wags in the navy insisted the air force had not "sunk the beast" but merely "lowered her to the waterline".  It wasn't until after the war the British learned the RAF's successful mission, strategically, had been unnecessary, earlier attacks (including the Admiralty's using mines placed by crews in midget submarines) having inflicted so much damage there was by 1944 no prospect of the Tirpitz again venturing far from her moorings.

Lieutenant General Nagaoka Gaishi san, Tokyo, 1920.

When Großadmiral von Tirpitz died in 1930, he and his twin-forked beard were, in the one casket, buried in Bavaria's Münchner Waldfriedhof ("woodland cemetery").  The "one body = one casket" protocol is of course the almost universal practice but there have been exceptions and one was Lieutenant General Gaishi Nagaoka (1858-1933) who served in the Imperial Japanese Army between 1878-1908, including as vice chief of the general staff during the Russo-Japanese War (1904-1905).  While serving as a military instructor, one of his students was the future Generalissimo Chiang Kai-shek (1887-1975; leader of the Republic of China (mainland) 1928-1949 & the renegade province of Taiwan 1949-1975).  After retiring from the military, he entered politics, elected in 1924 as a member of the House of Representatives (after Japan in the 1850s ended its "isolation" policy, its political and social systems were a mix of Japanese, British and US influences).  After he died in 1933, by explicit request, his impressive "handlebar" moustache carefully was removed and buried in a separate casket in Aoyama Cemetery.

Saturday, January 10, 2026

Aphantasia

Aphantasia (pronounced ay-fan-tay-zhuh)

The inability voluntarily to recall or form mental images.

2015: The word (not the diagnosis) was coined by UK neurologist Dr Adam Zeman (b 1957), neuropsychologist Dr Michaela Dewar (b 1976) and Italian neurologist Sergio Della Sala (b 1955), first appearing in the paper Lives without imagery.  The construct was a- (from the Ancient Greek ἀ- (a-), used as a privative prefix meaning "not", "without" or "lacking") + phantasía (from the Greek φαντασία ("appearance", "imagination", "mental image" or "power of imagination"), from φαίνω (phaínō) ("to show", "to make visible" or "to bring to light")).  Literally, aphantasia can be analysed as meaning "an absence of imagination" or "an absence of mental imagery" and in modern medicine it's defined as "the inability voluntarily to recall or form mental images".  Even in Antiquity, there was some meaning shift in phantasía, Plato (circa 427-348 BC) using the word to refer generally to representations and appearances whereas Aristotle (384-322 BC) added a technical layer, his sense being a faculty mediating between perception (aisthēsis) and thought (noēsis).  It's the Aristotelian adaptation (the mind's capacity to form internal representations) which flavoured the use in modern neurology.  Aphantasia is a noun and aphantasic is a noun & adjective; the noun plural is aphantasics.

Scuola di Atene (The School of Athens, circa 1511), fresco by Raphael (Raffaello Sanzio da Urbino, 1483–1520), Apostolic Palace, Vatican Museums, Vatican City, Rome.  Plato and Aristotle are the figures featured in the centre.

In popular use, the word "aphantasia" can be misunderstood because of the paths taken in English by "phantasy", "fantasy" and "phantasm", all derived from the Ancient Greek φαντασία (phantasía) meaning "appearance, mental image, imagination".  In English, this root was picked up via Latin and French but the multiple forms each evolved along distinct semantic trajectories.  The fourteenth century phantasm came to mean "apparition, ghost, illusion" so was used of "something deceptive or unreal", the connotation being "the supernatural; spectral".  This appears to be the origin of the association of "phantas-" with unreality or hallucination rather than normal cognition.  In the fifteenth & sixteenth centuries, the spellings phantasy & fantasy were for a time interchangeable although divergence came with phantasy used in its technical senses of "mental imagery"; "faculty of imagination"; "internal representation", this a nod to Aristotle's phantasía.  Fantasy is the familiar modern form, used to suggest "a fictional invention; daydream; escapism; wish-fulfilment", the connotation being "imaginative constructions (in fiction); imaginative excess (in the sense of "unreality" or the "dissociative"); indulgence (as in "speculative or wishful thoughts")".

While the word "aphantasia" didn't exist until 2015, in the editions of the American Psychiatric Association's (APA) Diagnostic and Statistical Manual of Mental Disorders (DSM) published between 1952 (DSM-I) and 2013 (DSM-5), there was not any discussion (or even mention) of a condition anything like "an inability voluntarily to recall or form mental images".  That's because despite being "a mental condition" induced by something happening (or not happening) in the brain, the phenomenon has never been classified as "a mental disorder".  Instead it's a cognitive trait or variation in the human condition and technically is a spectrum condition, the "pure" aphantasic being one end of the spectrum, the hyperphantasic (highly vivid, lifelike mental imagery, sometimes called a "photorealistic mind's eye") the other.  That would of course imply the comparative adjective would be "more aphantasic" and the superlative "most aphantasic" but neither is a standard form.

The rationale for the "omission" was the DSM's inclusion criteria, which include the requirement of some evidence of clinically significant distress or functional impairment attributable to a condition.  Aphantasia, in isolation, does not reliably meet this threshold in that many individuals have for decades functioned entirely "normally" without being aware they're aphantasic while others presumably had died of old age in similar ignorance.  That does of course raise the intriguing prospect the mental health of some patients may have been adversely affected by the syndrome only by a clinician informing them of their status, thus making them realize what they were missing.  This, the latest edition of the DSM (DSM-5-TR (2022)) does not discuss.  The DSM does discuss imagery and perceptual phenomena in the context of other conditions (PTSD (post-traumatic stress disorder), psychotic disorders, dissociative disorders etc), but these references are to abnormal experiences, not the lifelong absence of imagery.  To the DSM's editors, aphantasia remains a recognized phenomenon, not a diagnosis.

Given that aphantasia concerns aspects of (1) cognition, (2) inner experience and (3) mental representation, it wouldn't seem unreasonable to expect the condition now described as aphantasia would have appeared in the DSM, even if only in passing or in a footnote.  However, in the seventy years between 1952 and 2022, over nine editions, there was no mention, even in DSM-5-TR (2022), the first volume released since the word was in 2015 coined.  That apparently curious omission is explained by the DSM never having been a general taxonomy of mental phenomena.  Instead, it's (an ever-shifting) codification of the classification of mental disorders, defined by (1) clinically significant distress and/or (2) functional impairment and/or (3) a predictable course, prognosis and treatment relevance.  As a general principle the mere existence of an aphantasic state meets none of these criteria.

Crooked Hillary Clinton in orange Nina McLemore pantsuit, 2010.  If in response to the prompt "Imagine a truthful person" one sees an image of crooked Hillary Clinton (b 1947; US secretary of state 2009-2013), that obviously is wrong but it is not an instance of aphantasia because the image imagined need not be correct, it needs just to exist.

The early editions (DSM-I (1952) & DSM-II (1968)) heavily were slanted to the psychoanalytic, focusing on psychoses, neuroses and personality disorders with no systematic treatment of cognition as a modular function; the matter of mental imagery (even as abstract thought separated from an imagined image), let alone its absence, wholly was ignored.  Intriguingly, given what was to come in the field, there was no discussion of the cognitive phenomenology beyond gross disturbances (ie delusions & hallucinations).  Even with the publication of the DSM-III (1980) & DSM-III-R (1987), advances in scanning and surgical techniques, cognitive psychology and neuroscience seem to have made little contribution to what the DSM's editorial board decided to include and although DSM-III introduced operationalized diagnostic criteria (as a part of a more "medicalised" and descriptive psychiatry), the entries still were dominated by a focus on dysfunctions impairing performance, the argument presumably that it was possible (indeed, probably typical) for those with the condition to lead full, happy lives; the absence of imagery ability thus not considered a diagnostically relevant variable.  Even in sections on (1) amnestic disorders (a class of memory loss in which patients have difficulty forming new memories (anterograde) or recalling past ones (retrograde), not caused by dementia or delirium but a consequence of brain injury, stroke, substance abuse, infections or trauma), with treatment focusing on the underlying cause and rehabilitation, (2) organic mental syndromes or (3) neuro-cognitive disturbance, there was no reference to voluntary imagery loss as a phenomenon in its own right.

Although substantial advances in cognitive neuroscience meant by the 1990s neuropsychological deficits were better recognised, both the DSM-IV (1994) and DSM-IV-TR (2000) continued to be restricted to syndromes with behavioural or functional consequences.  In a way that was understandable because the DSM still was seen by the editors as a manual for working clinicians who were most concerned with helping those afflicted by conditions with clinical salience; the DSM has never wandered far into subjects which might be matters of interesting academic research and mental imagery continued to be mentioned only indirectly, hallucinations (percepts without stimuli) and memory deficits (encoding and retrieval) both discussed only in terms of their effect on a patient, not as phenomena in their own right.  The first edition for the new century was DSM-5 (2013) and what was discernible was that discussions of major and mild neuro-cognitive disorders were included, reflecting the publication's enhanced alignment with neurology but even then, imagery ability was not assessed or scaled: not possessing the power of imagery was not listed as a symptom, specifier, or associated feature.  So there has never in the DSM been a category for benign cognitive variation and that is a product of a deliberate editorial stance rather than an omission, many known phenomena not psychiatrised unless in some way "troublesome".

The term “aphantasia” was coined to describe individuals who lack voluntary visual mental imagery, often discovered incidentally and not necessarily associated with brain injury or psychological distress.  In 2015 the word was novel but the condition had been documented for more than a century, Sir Francis Galton (1822–1911) in a paper published in 1880 describing what would come to be called aphantasia.  That work was a statistical study on mental imagery which doubtless was academically solid but Sir Francis’s reputation later suffered because he was one of the leading lights in what was in Victorian times (1837-1901) the respectable discipline of eugenics.  Eugenics rightly became discredited so Sir Francis was to some extent retrospectively “cancelled” (something like the Stalinist concept of “un-personing”) and these days his seminal contribution to the study of behavioural genetics is acknowledged only grudgingly.

Galton in 1880 noted a wide variation in "visual imagination" (ie it was understood as a "spectrum condition") and in the same era, in psychology publications the preferred term seems to have been "imageless thought".  In neurology (and trauma medicine generally) there were many reports of patients losing the power of imagery after a brain injury but no agreed name was ever applied because the interest was more in the injury.  The unawareness that some people simply lack the faculty presumably extended to the general population because, as Galton wrote: "To my astonishment, I found that the great majority of the men of science to whom I first applied, protested that mental imagery was unknown to them, and they looked on me as fanciful and fantastic in supposing that the words "mental imagery" really expressed what I believed everybody supposed them to mean. They had no more notion of its true nature than a colour-blind man who has not discerned his defect has of the nature of colour."

His paper must have stimulated interest because one psychologist reported some subjects possessing what he called a "typographic visual type" of imagination in which ideas (which most would visualize as an image of some sort) would manifest as "printed text".  That was intriguing because, rather as a computer in some respects doesn't distinguish between an image file (jpeg, TIFF, webp, avif etc) which is a picture of someone and one which is a picture of their name in printed form, it would seem to imply at least some who are somewhere on the aphantasia spectrum retain the ability to visualize printed text, just not the object referenced.  Professor Zeman says he first became aware of the condition in 2005 when a patient reported having lost the ability to visualize following minor surgery and after the case was in 2010 documented in the medical literature in the usual way, it provoked a number of responses in which multiple people informed Zeman they had never in their lifetime been able to visualize objects.  This was the origin of Zeman and his collaborators coining "congenital aphantasia", describing individuals who never enjoyed the ability to generate voluntary mental images.  Because it was something which came to general attention in the age of social media, great interest was triggered in the phenomenon and a number of "on-line tests" were posted, the best-known of which was the request for readers to "imagine a red apple" and rate their "mind's eye" depiction of it on a scale from 1 (photorealistic visualisation) through to 5 (no visualisation at all).  For many, this was either (1) their first realization they were aphantasic or (2) an appreciation that their own ability or inability to visualise objects was not universal.

How visualization can manifest: Lindsay Lohan and her lawyer in court, Los Angeles, December 2011.  If an aphantasic person doesn't know about aphantasia and doesn't know other people can imagine images, their life is probably little different from that of those who can; it's just that their mind has adapted to handle concepts in another way.

Top right: "Normal" visualization (thought to be possessed by most of the population) refers to the ability to imagine something like a photograph of what's being imagined.  This too is a spectrum condition in that some will be able to imagine an accurate "picture", something like a HD (high definition) photograph, while others will "see" something less detailed, sketchy or even wholly inaccurate.  However, even if when asked to visualize "an apple" one instead "sees" a banana, that is not an indication of aphantasia, a condition which describes only an absence of an image.  Getting it that wrong is an indication of something amiss but it's not aphantasia.

Bottom left: "Seeing" text in response to being prompted to visualize something was the result Galton in 1880 reported as such a surprise.  It means the brain understands the concept of what is being described; it just can't be imagined as an image.  This is one manifestation of aphantasia but it's not related to the "everything is text" school of post-modernism.  Jacques Derrida's (1930-2004) fragment "Il n'y a pas de hors-texte" (literally "there is no outside-text") is one of the frequently misunderstood phrases from the murky field of deconstruction but it has nothing to do with aphantasia (although dedicated post-modernists probably could prove a relationship).

Bottom right: The absence of any image (understood as a "blankness" which does not necessarily imply "whiteness" or "blackness" although this is the simple way to illustrate the concept), whether text or to some degree photorealistic, is classic aphantasia.  The absence does not mean the subject doesn't understand the relevant object or concept; it means only that their mental processing does not involve imagery and for as long as humans have existed, many must have functioned in this way, their brains adapted to the imaginative range available to them.  What this must have meant was many became aware of what they were missing only when the publicity about the condition appeared on the internet, an interesting example of "diagnostic determinism".

WebMD's classic Aphantasia test.

The eyes are an out-growth of the brain and WebMD explains aphantasia is caused by the brain’s visual cortex (the part of the brain that processes visual information from the eyes) “working differently than expected”, noting the often quoted estimate of it affecting 2-4% of the population may be understated because many may be unaware they are “afflicted”.  It’s a condition worthy of more study because aphantasics handle the characteristic by processing information differently from those who rely on visual images.  There may be a genetic element in aphantasia and there’s interest too among those researching “Long Covid” because the symptom of “brain fog” can manifest much as does aphantasia.

Aphantasia may have something to do with consciousness because aphantasics can have dreams (including nightmares) which can to varying degrees be visually rich.  There's no obvious explanation for this but while aphantasia is the inability voluntarily to generate visual mental imagery while awake, dreaming is an involuntary perceptual experience generated during sleep; while both are mediated by neural mechanisms, these clearly are not identical but presumably must overlap.  The conclusions from research at this stage remain tentative but the current neuro-cognitive interpretation seems to suggest voluntary (conscious) imagery relies on top-down activation of the visual association cortex while involuntary (unconscious) dream imagery relies more on bottom-up and internally driven activation during REM (rapid eye movement) sleep.  What that would seem to imply is that in aphantasia, the former pathway is impaired (or at least inaccessible), while the latter may remain intact (or accessible).

The University of Queensland’s illustration of the phantasia spectrum.

The opposite syndrome is hyperphantasia (having extremely vivid, detailed, and lifelike mental imagery) which can be a wonderful asset but can also be a curse, rather as hyperthymesia (known also as HSAM (Highly Superior Autobiographical Memory) and colloquially as "total recall") can be disturbing.  Although it seems not to exist in the sense of "remembering everything, second-by-second", there are certainly those who have an extraordinary recall of "events" in their life and this can have adverse consequences for mental health because one of the mind's "defensive mechanisms" is forgetting or at least suppressing memories which are unwanted.  Like aphantasia & hyperphantasia, hyperthymesia is not listed by the DSM as a mental disorder; it is considered a rare cognitive trait or neurological phenomenon although like the imagery conditions it can have adverse consequences and these include disturbing "flashbacks", increased rumination and increased rates of anxiety or obsessive tendencies.

Wednesday, December 3, 2025

Crunning & Cromiting

Crunning (pronounced khrun-ing)

In high-performance sports training, simultaneously running and crying.

Circa 2020: the construct was cr(y) + (r)unning.

Cromiting (pronounced krom-et-ing)

In high-performance sports training, simultaneously running, crying & vomiting.

Circa 2020: the construct was cr(y) + (v)omit + (runn)ing.

The verb cry was from the thirteenth century Middle English crien, from the Old French crier (to announce publicly, proclaim, scream, shout) (from which Medieval Latin gained crīdō (to cry out, shout, publish, proclaim)). The noun is from Middle English crie, from the Old French cri & crïee.  The origin of the Old French & Middle Latin word is uncertain.  It may be of Germanic origin, from the Frankish krītan (to cry, cry out, publish), from the Proto-Germanic krītaną (to cry out, shout), from the primitive Indo-European greyd- (to shout) and thus cognate with the Saterland Frisian kriete (to cry), the Dutch krijten (to cry) & krijsen (to shriek), the Low German krieten (to cry, call out, shriek), the German kreißen (to cry loudly, wail, groan) and the Gothic kreitan (to cry, scream, call out) and related to the Latin gingrītus (the cackling of geese), the Middle Irish grith (a cry), the Welsh gryd (a scream), the Persian گریه (gerye) (to cry) and the Sanskrit क्रन्दन (krandana) (cry, lamentation).  Some etymologists however suggest a connection with the Medieval Latin quiritō (to wail, shriek), also of uncertain origin, possibly from the Latin queror (to complain), although the phonetic and semantic developments have proved elusive; the alternative Latin source is thought to be a variant of quirritare (to squeal like a pig), from quis, an onomatopoeic rendition of squeaking.  An ancient folk etymology understood it as "to call for the help of the Quirites" (the Roman policemen).  In the thirteenth century, the meaning extended to encompass "shed tears", previously described as "weeping", "to weep" etc and by the sixteenth century cry had displaced weep in the conversational vernacular, under the influence of the notion of "utter a loud, vehement, inarticulate sound".  The phrase "to cry (one's) eyes out" (weep inordinately) is documented since 1704 but weep, wept etc remained a favorite of poets and writers.

Vomit as a verb (the early fifteenth century Middle English vomiten) was an adoption from the Latin vomitus (past participle of vomitāre) and was developed from the fourteenth century noun vomit (act of expelling contents of the stomach through the mouth), from the Anglo-French vomit, from the Old French vomite, from the Latin vomitus, from vomō & vomitare (to vomit often), frequentative of vomere (to puke, spew forth, discharge), from the primitive Indo-European root wemh & weme- (to spit, vomit), source also of the Ancient Greek emein (to vomit) & emetikos (provoking sickness), the Sanskrit vamati (he vomits), the Avestan vam- (to spit), the Lithuanian vemti (to vomit) and the Old Norse væma (seasickness).  It was cognate with the Old Norse váma (nausea, malaise) and the Old English wemman (to defile).  The use of the noun to describe the matter disgorged during vomiting dates from the late fourteenth century and is in common use in the English-speaking world although Nancy Mitford (1904–1973 and the oldest of the Mitford sisters) in the slim volume Noblesse Oblige: an Enquiry into the Identifiable Characteristics of the English Aristocracy (1956) noted “vomit” was “non-U” and the “U” word was “sick”, something perhaps to bear in mind after, if not during, vomiting. 

Run was from the Middle English runnen & rennen (to run), an alteration (influenced by the past participle runne, runnen & yronne) of the Middle English rinnen (to run), from the Old English rinnan & iernan (to run) and the Old Norse rinna (to run), both from the Proto-Germanic rinnaną (to run) and related to rannijaną (to make run), from the primitive Indo-European hreyh- (to boil, churn).  It was cognate with the Scots rin (to run), the West Frisian rinne (to walk, march), the Dutch rennen (to run, race), the Alemannic German ränne (to run), the German rennen (to run, race) & rinnen (to flow), the Danish rende (to run), the Swedish ränna (to run) and the Icelandic renna (to flow).  The non-Germanic cognates include the Albanian rend (to run, run after).  The alternative spelling in Old English was ærning (act of one who or that which runs, rapid motion on foot) and that endured as a literary form until the seventeenth century.  The adjective running (that runs, capable of moving quickly) was from the fourteenth century and was from rennynge; as the present-participle adjective from the verb run, it replaced the earlier erninde, from the Old English eornende from ærning.  The meaning "rapid, hasty, done on the run" dates from circa 1300 while the sense of "continuous, carried on continually" was from the late fifteenth century.  The language is replete with phrases including "run" & "running" and run has had a most productive history: according to one source the verb alone has 645 meanings and while that definitional net may be widely cast, all agree the count is well into three figures.  The suffix –ing was from the Middle English -ing, from the Old English –ing & -ung (in the sense of the modern -ing, as a suffix forming nouns from verbs), from the Proto-West Germanic –ingu & -ungu, from the Proto-Germanic –ingō & -ungō. It was cognate with the Saterland Frisian -enge, the West Frisian –ing, the Dutch –ing, the Low German –ing & -ink, the German –ung, the Swedish -ing and the Icelandic –ing; all the cognate forms were used for the same purpose as the English -ing.

Lilly Dick (b 1999) of the Australian Women’s Rugby Sevens.

The portmanteau words crunning (simultaneously running and crying) & cromiting (simultaneously running, crying & vomiting) describe techniques used in strength and conditioning training by athletes seeking to improve endurance.  The basis of the idea is that at points where the mind usually persuades a runner or other athlete to pause or stop, the body is still capable of continuing and thus signals like crying or vomiting should be ignored in the manner of the phrase "passing through the pain barrier".  The idea is "just keep going no matter what" and that is potentially dangerous so such extreme approaches should be pursued only under professional supervision.  Earlier (circa 2015), crunning was a blend of crawl + running, a type of physical training which was certainly self-descriptive and presumably best practiced on other than hard surfaces; it seems not to have caught on.  Crunning & cromiting came to wider attention when discussed by members of the Australian Women's Rugby Sevens team which won gold at the Commonwealth Games (Birmingham, UK, July-August 2022).  When interviewed, a squad member described crunning & cromiting as "brutal" methods of training but admitted both were a vital part of the process by which they achieved the level of strength & fitness (mental & physical) which allowed them to succeed.

The perils of weed.

Although similar (in both spelling & symptoms), crunning & cromiting should not be confused with "scromiting" (a portmanteau of "screaming" and "vomiting"), a word coined in the early twenty-first century as verbal shorthand for cannabinoid hyperemesis syndrome (CHS).  Hyperemesis is extreme, persistent nausea and vomiting (in pregnancy, as hyperemesis gravidarum, a kind of acute morning sickness) and CHS presents in much the same way.  The recreational use of cannabis was hardly new but CHS was novel and the medical community initially speculated the reaction (induced only in some users) may be caused either by specific genetic differences or something added to or bred into certain strains of weed although the condition appeared to be both rare and geographically distributed.  The long-term effects are unknown except for damage to tooth enamel caused by the stomach acid in the vomit.  In October 2025, a new layer of institutional respectability was gained by the concept of scromiting when the WHO (World Health Organization) announced it had added CHS to its diagnostic manual, the first time the disorder had been granted a dedicated code.  In the US, the existence of the code meant easily it could be adopted by the US CDC (Centers for Disease Control and Prevention) and interpolated into their reporting databases, meaning physicians nationwide could identify, track and study the condition rather than listing it in the broader vomiting or gastrointestinal categories.  Although a dangerous syndrome which for generations has been suffered by a sub-set of (mostly chronic) cannabis users, causing severe nausea, repeated vomiting, abdominal pain, dehydration, weight loss and (in rare cases) heart rhythm problems, seizures, kidney failure and death, it was only after use of the drug was made lawful in many places that increasing incidences were noted.  The data suggests in the US CHS-related visits to hospital ERs (emergency rooms) have spiked by an impressive 650% since 2016 although it's not known to what extent this reflects an actual increase in use or a greater willingness of patients to present now there is no potential legal jeopardy.

One theory is that since "legalization" (the term somewhat misleading because on a strict constitutional interpretation the substance remains proscribed) commercial growers (some of which operate on an industrial scale) have been "improving the breed" to gain market share and historically high levels of THC (tetrahydrocannabinol, the cannabinoid which is the most active of the psychoactive constituents) are now common in "over the counter" weed, this increasing both the incidence and severity of scromiting.  Intriguingly, studies of the available ER data suggested a sharp elevation in cases of CHS during the COVID-19 pandemic and that seems to have established a new baseline, visits remaining high since.  The working assumption among clinicians is the combination of stress (induced by isolation and other factors) and the access to high-potency weed (THC levels well over 20% now often detected, compared with the 5% typical during the 1990s) may have contributed to the rise.  That however remains speculative and the alternative theory is heavy, long-term cannabis use overstimulates the body's cannabinoid system, triggering the opposite of the drug's usual anti-nausea effect.  Ceasing use is the obvious cure (strictly speaking a preventative) but one as yet unexplained amelioration is a long, hot shower and although it's wholly anecdotal, there does seem to be a link with warming the body's surface area because those who have experimented with "breathing in steam" report no helpful effect.

Male role model: The legendary Corey Bellemore.

An athletic pursuit probably sometimes not dissimilar to the exacting business of crunning & cromiting is the Beer Mile, conducted usually on a standard 400 m (¼ mile) track as a 1 mile (1.6 km) contest of both running & drinking speed.  Each of the four laps begins with the competitor drinking one can (12 fl oz (US) (355 ml)) of beer which is followed by a full lap, the beer-then-lap process thus performed four times in all.  The rules have been defined by the governing body which also publishes the results, including the aggregates of miles covered and beers drunk.  Now a sporting institution, it has encouraged imitators and there are a number of variations, each with its own rules.  The holder of this most illustrious world record is Canadian Corey Bellemore (b 1994), a five-time champion, who, at the Beer Mile World Classic in Portugal in July 2025, broke his own world record, re-setting the mark to 4:27.1.  That may be compared with the absolute world record for the mile, held by Morocco's Hicham El Guerrouj (b 1974) who in 1999 ran the distance in 3:43.13, his additional pace made possible by not being delayed by having to down four beers.

University of Otago Medical School.

Some variations of the beer mile simply increase the volume or strength of the beer consumed and a few of these are dubbed Chunder Mile ("chunder" being circa 1950s Australia & New Zealand slang for vomiting and of disputed origin) on the basis that vomiting is more likely the more alcohol is consumed.  For some however, even this wasn't sufficiently debauched and there were events which demanded a (cold) meat pie be enjoyed with a jug of (un-chilled) beer (a jug typically 1140 ml (38.5 fl oz (US))) at the start of each of the four laps.  Predictably, these events were most associated with orientation weeks at universities, a number still conducted as late as the 1970s and the best documented seem to have been those at the University of Otago in Dunedin, New Zealand.  Helpfully, at this time, it was the site of the country's medical school, thereby providing students with practical experience of both symptoms and treatments for the inevitable consequences.  Whether the event was invented in Dunedin isn't known but, given that the nature of males aged 17-21 probably hasn't much changed over the millennia, it wouldn't be surprising to learn similar competitions, localized to suit culinary tastes, have been contested by the drunken youth of many places in centuries past.  As it was, even in Dunedin, times were changing and in 1972, the Chunder Mile was banned "…because of the dangers of asphyxiation and ruptured esophaguses."