
Sunday, May 29, 2022

Athwart

Athwart (pronounced uh-thwawrt)

(1) From side to side; crosswise, transversely.

(2) In admiralty use, at right angles to the fore-and-aft line; across.

(3) Perversely; awry; wrongly.

1425-1475: From the Late Middle English athwert & athirt and a proclitic form of the preposition; the construct was a- (in the sense of "in the direction of, toward") + thwart.  The a- prefix was from the Old English an (on) which in Middle English meant “up, out, away”, both derived from the Proto-Germanic uz (out), from the primitive Indo-European uds (up, out); cognate with the Old Saxon ā which endures in Modern German as the prefix er-.  Thwart was from the Middle English adverb & adjective thwert (crosswise; (cooking) across the grain, transverse; counter, opposing; contrary, obstinate, stubborn), a borrowing from the Old Norse þvert (across, transverse), originally the neuter form of þverr (transverse, across), from the Proto-Germanic þwerhaz, altered or influenced by þweraną (to turn) and þerh, from the primitive Indo-European twork & twerk (to twist).  Cognates include the Old English þweorh (transverse, perverse, angry, cross), the Danish tvær, the Gothic þwaírs (angry), the West Frisian dwers (beyond, across, to the other side of), the Dutch dwars (cross-grained, contrary), the Low German dwars (cross-grained, contrary) and the German quer (crosswise; cross).  The modern English queer is related.  Although still used by poets good and bad, the word is probably otherwise obsolete for all purposes except historic admiralty documents.  Athwart is a preposition, adverb & noun, athwartship is an adjective & adverb and athwartships & athwartwise are adverbs; the noun plural is athwarts.  Forms like athwartly are definitely non-standard.

In nautical design, the term “athwart” is used to describe a direction or orientation perpendicular to the centreline of a ship or boat (ie that which runs across the vessel from side to side (port to starboard) at right angles to the fore-and-aft line).  In shipbuilding this can apply to various components and actions on a ship, such as beams, futtocks, bulkheads, or even the positioning of objects; as a general principle something can be said to be “athwart” if it sits perpendicular to the centreline but the term is most often applied to objects which span or cross the vessel’s entire width.  In naval architecture specifically, athwart was used as a noun to refer to the cross-members which sat beneath the deck-mounted gun-turrets on warships.  Although they had long been a part of the supporting structures, the term “athwart” seems first to have been used on the blueprints of HMS Dreadnought, launched in 1906 and a design thought so revolutionary it lent its name to the class of the biggest battleships: previous such vessels were immediately re-classified as “pre-dreadnoughts” and, when even bigger ships were launched, they were dubbed “super-dreadnoughts”.

Lindsay Lohan with former special friend Samantha Ronson, athwart, TV Guide's sixth annual Emmy after party, The Kress, September 2008, Hollywood, California.

Samuel Taylor Coleridge (1772-1834), Kubla Khan (1798)

In Xanadu did Kubla Khan
A stately pleasure-dome decree:
Where Alph, the sacred river, ran
Through caverns measureless to man
Down to a sunless sea.
So twice five miles of fertile ground
With walls and towers were girdled round:
And here were gardens bright with sinuous rills
Where blossomed many an incense-bearing tree;
And here were forests ancient as the hills,
Enfolding sunny spots of greenery.
But oh! that deep romantic chasm which slanted
Down the green hill athwart a cedarn cover!
A savage place! as holy and enchanted
As e'er beneath a waning moon was haunted
By woman wailing for her demon-lover!
And from this chasm, with ceaseless turmoil seething,
As if this earth in fast thick pants were breathing,
A mighty fountain momently was forced;
Amid whose swift half-intermitted burst
Huge fragments vaulted like rebounding hail,
Or chaffy grain beneath the thresher's flail:
And 'mid these dancing rocks at once and ever
It flung up momently the sacred river.
Five miles meandering with a mazy motion
Through wood and dale the sacred river ran,
Then reached the caverns measureless to man,
And sank in tumult to a lifeless ocean:
And 'mid this tumult Kubla heard from far
Ancestral voices prophesying war!
The shadow of the dome of pleasure
Floated midway on the waves:
Where was heard the mingled measure
From the fountain and the caves.
It was a miracle of rare device,
A sunny pleasure-dome with caves of ice!
A damsel with a dulcimer
In a vision once I saw:
It was an Abyssinian maid,
And on her dulcimer she played,
Singing of Mount Abora.
Could I revive within me
Her symphony and song,
To such a deep delight 'twould win me
That with music loud and long,
I would build that dome in air,
That sunny dome! those caves of ice!
And all who heard should see them there,
And all should cry, Beware! Beware!
His flashing eyes, his floating hair!
Weave a circle round him thrice,
And close your eyes with holy dread,
For he on honey-dew hath fed,
And drunk the milk of Paradise.

Tuesday, December 5, 2023

Prompt

Prompt (pronounced prompt)

(1) Something done, performed, delivered etc at once or without undue delay.

(2) Ready & quick to act as the circumstances demand (archaic).

(3) Quick or alert.

(4) Punctual.

(5) To move or induce to action; to occasion or incite (often as “prompted”).

(6) To assist by suggesting something.

(7) To remind someone of what has been forgotten; formalized in live performance (the stage, singing etc) where a “prompt” is supplied from the wings to remind a performer of a missed cue or forgotten line (the noun prompter can indicate both a person employed to deliver cues and the device used, whether printed or on a screen).

(8) In computing, the message or symbol on the screen which indicates where an entry is required, the most basic of which is the “command prompt” of text-based operating systems which stood ready to receive a structured command.

(9) In computing, in artificial intelligence (AI), machine learning (ML) and related systems, to request particular output by means of instructions, questions, examples, context, or other input.

(10) In commercial use, a time limit given for payment of an account for produce purchased, this limit varying with different goods (archaic).

(11) In futures trading, the “front” (closest or nearest).

(12) The act of prompting.

1350-1400: From the Middle English prompte (ready, eager) (adjective) & prompten (verb), from the French prompt, all forms ultimately from the Latin prōmptus (evident; manifest, at hand, ready, quick, prepared), participle of prōmō (to take or bring out or forth, produce, bring to light) and the adjectival use of the past participle of prōmere (to bring forth, deliver, set forth), the construct being prō- (forth, forward; for; on behalf of, in the interest of, for the sake of; before, in front of; instead of; about; according to; as, like; as befitting), a combining form of the preposition prō, from the Proto-Italic pro-, from the primitive Indo-European pro-, o-grade of per- + emere (to buy, obtain, take).  Synonyms can include urge, spur, remind, refresh, instigate, impel, punctual, quick, rapid, hasty & timely.  Modifiers are applied as required including over-prompt, quasi-prompt & un-prompt.  Prompt is a noun, verb & adjective, promptness & prompter are nouns, prompter & promptest are adjectives, promptly is an adverb and prompting & prompted are verbs; the noun plural is prompts.

The noun (in the phrase “in prompte”) emerged in the early fifteenth century in the sense of “readiness” and was from the Latin verb prōmptus while the more familiar meaning “hint, information suggested, act of prompting” dates from the mid-1500s.  The idea of coaching (someone) or assisting them by providing a reminder of that which clearly had been forgotten (or imperfectly learned) was first used in the early fifteenth century, the best-known use in live theatre (to assist a speaker with lines) dating from the 1670s.  The adjectival use (ready, prepared (to do something), quick to act as occasion demands) was from the thirteenth century Old French prompt and directly from the Latin prōmptus (brought forth), hence “visible, apparent, evident, at hand”, a use now obsolete.  The commercial sense of the noun prompt (“a time limit given for payment for merchandise purchased”) dates from the mid-eighteenth century and while the concept remains, the word is no longer formally used although the phrase “prompt payment requested” often remains as a reminder.  It remains unclear whether the verb was derived from the adjective or vice-versa and another oddity is that the first recorded instance of “prompting”, the gerund (the verbal noun logically derived from prompt and meaning “incitement or impulse to action”), is from 1402, a quarter of a century before the verb.

The formal use of prompt in the sense of the indicator on a screen ready to accept user input dates only from 1977 although the concept had been in use for decades and predates screens, prompts emerging as soon as user input switched from the flicking of switches to character-based entries via a keyboard or similar input device.  The first prompts were those which sat (undifferentiated) on a plotter or printer, awaiting user input.  Command prompts were familiar from the late 1970s and appeared in early versions of Apple and CP/M systems among others but it was the IBM PC which introduced them to what was then the (still small) mainstream.  When the IBM PC was released in 1981, the user interface was exclusively text-based and the PC-DOS (or MS-DOS) command prompt was (almost) the only way for users to interact with their hardware and software.  The quirky exception was that on genuine IBM machines, the BIOS (Basic Input/Output System) included a BASIC (Beginner's All-purpose Symbolic Instruction Code) interpreter so it was possible to do certain things with the hardware even if an operating system (OS) wasn’t present.  IBM’s lawyers guarded their BIOS with rare efficiency so the numerous PC clones almost all needed an OS to be useful.

While programmers, nerds, and other obsessive types understood the charm of the command prompt and took to it fondly, most users had no wish to memorize even part of the sometimes arcane command set needed and modern capitalism soon responded, menu systems becoming available which allowed users to interact with their machine while hiding the essential ugliness beneath.  In time, these were augmented by graphical environments (some of which frankly overwhelmed the OS) and ultimately, the most successful of these would evolve into OSs, some of which included the ability to run multiple command prompts which at first contained and later emulated PC/MS-DOS.  The most elaborate of these was IBM’s OS/2 2.0 (and its successors) which permitted on a single machine literally hundreds of simultaneous command prompt sessions in a mix of 8, 16 & 32-bit flavors, some of which could even be launched as a bootable virtual machine, started from a floppy-diskette image.  Technically, it was an impressive achievement but around the planet, there were only a relative handful of organizations which needed such capabilities (typically those with megalomaniacs seduced by the idea of replacing perhaps dozens of MS-DOS based PCs each housing an interface handler of some type with one machine).  That could be made to work but the aggregate need was so limited that the direction proved a cul-de-sac.

The command prompt (with long file names, left) and the PowerShell prompt (right).  Both use the classic $p$g configuration.

The prompt didn’t however go away and in one form or another most OSs include one, Microsoft’s PowerShell (introduced in 2006 on Windows and ported to cross-platform compatibility within .NET in 2016) in its default configuration presenting a prompt almost identical to that of the original IBM PC, all those years ago.  PowerShell included an enhanced list of commands but the earlier prompts were also not static and many options became available to customize the look, the list changing from release to release but a typical version included:

$Q   = (equal sign).
$$   $ (dollar sign).
$T   Current time.
$D   Current date.
$P   Current drive and path.
$V   OS version number.
$N   Current drive.
$G   > (greater-than sign).
$L   < (less-than sign).
$B   | (pipe).
$E   Escape code (ASCII code 27).
$_   Carriage return and line feed.

Few actually customized their line beyond $P$G (so they would know the active sub-directory; that became the default with which most versions of PC/MS-DOS shipped) but $t $d$_$p$g had its followers (it displayed the time and the date on the line above the prompt when in DOS).  Those for whom aesthetics mattered could even set text and background colors and there were some genuinely nostalgic types who liked to emulate the bright orange or acid green screens they remembered from the world of the mainframes.  Most pleasing though was probably bright blue on black.
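For anyone wanting to try this, a minimal sketch of both eras follows (assuming PC/MS-DOS with COMMAND.COM and any recent PowerShell; the PROMPT metacharacters and PowerShell's built-in prompt function are real, but the particular layouts shown are simply illustrative choices):

    REM In AUTOEXEC.BAT or at the command line: the classic configuration,
    REM current drive & path followed by ">".
    PROMPT $P$G

    REM The variant with time and date on the line above the prompt.
    PROMPT $T $D$_$P$G

    # In PowerShell, the prompt is whatever the built-in "prompt" function
    # returns; this roughly emulates the old $t $d$_$p$g layout.
    function prompt {
        "$(Get-Date -Format 'HH:mm:ss dd-MM-yyyy')`n$(Get-Location)> "
    }

Placed in the profile script ($PROFILE), the PowerShell version persists between sessions.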

Prompt was one of the finalists for the Oxford University Press (OUP) 2023 Word Of The Year (WotY) although it didn’t make the cut for the shortlist.  Prompt was there not because the selection committee noted either a new international interest in punctuality or Microsoft’s PowerShell convincing a new generation to start enjoying a CLI (command-line interpreter) but because of the social and technological phenomenon that is generative AI (artificial intelligence), the best-known example of which is ChatGPT.  Of course, even those who weren’t dedicated command-line jockeys have for decades been interacting with the prompts of search engines but the influence of generative AI has been extraordinary and nudging “prompt” to OUP’s WotY finals is just a footnote, the editors noting even the emergence of a new job description: prompt engineer, although given the implications of generative AI, it might be a short-lived profession.  OUP also explained the expansion of meaning was a development of a wider sense: “Something said or done to aid the memory; a reminder” and that the earlier sense “prepared, ready” was long extinct although many clearly think of ChatGPT in this way.

Prompt would have been a worthy WotY and it’ll be with us for the foreseeable future, not something guaranteed for the winner: “Rizz”.  In its explanatory note, OUP said rizz was “a popular Gen Z internet slang term”, a shortened form of the word “charisma”, used to describe someone’s ability to attract another person through style or charm, able also to be used as a verb (such as to “rizz up”, meaning to attract or chat up another person).  Rizz has about it the whiff of something which may quickly become cheugy (something once cool which became uncool by becoming too widely used by those who will never be cool) and the imprimatur of OUP’s WotY might be a nail in its coffin.  Time will tell but additionally, rizz is probably better click-bait than prompt, something to which even OUP's editors probably aren’t immune.  The other six finalists were:

Situationship: This describes a relationship (which may be sexual or romantic or neither) not thought (by the participants) formal or established (ie outside what are regarded as society’s conventions).  So, the state of relationship it describes is hardly new but it’s a clever use of language (the construct a portmanteau of situation + (relation)ship) and it seems to have existed since around 2008-2011 (the sources differ) but it’s only recently, through use on social media, dating apps and television shows, that it’s achieved critical mass.

The anyway statuesque Taylor Swift, adding to the effect in 6 inch (150 mm) heels.

Swiftie: A (devoted / enthusiastic / obsessive etc) fan of the singer Taylor Swift (b 1989).  It was once pop culture orthodoxy that the particular conjunction of technological, demographic, economic and social conditions which were unique to the Western world in the 1960s meant what was described as the “claustrophobic hothouse” which produced “Beatlemania” couldn’t again happen.  While various pop-culture figures developed fan-bases which picked up descriptors (such as the “Dead Heads” associated with the Grateful Dead), the particular fanaticism surrounding the Beatles has never quite been replicated.  The Swifties however are said to go close in devotion and their numbers are probably greater, Taylor Swift’s appeal being truly cross-cultural and international; probably only the Ayatollahs and such are unmoved.  Etymologically, “Swiftie” is a conventional affectionate diminutive and among Swifties there are factions including die-hard Swifties, hardcore Swifties and self-proclaimed Swifties.  Someone a little ashamed of their fondness would presumably be a “confessed Swiftie” but none appear to exist and her appeal seems to transcend the usual pop-music boundaries.  Her songs are said to be "infectiously catchy" (a pleonasm she'd probably not allow in her lyrics).

Beige flag: Beige flag has a range of meanings and can be a trait which, while not distasteful or shocking, is of a nature which makes one pause and perhaps reconsider one’s relationship with whoever exhibits it.  It can be something which does little more than indicate the person isn’t interesting though not actually evil, and is thus an adaptation of “red flag”, which is something to which the only rational reaction is an immediate sundering of a relationship.  So a red flag might be being a Scientologist, a Freemason or listening to country & western music whereas a beige flag might be driving a front wheel drive car; undesirable but perhaps not a deal-breaker.  Of late however, the meaning of beige flag has shifted, thus its making OUP’s list of finalists.  Now, it appears to be used to reference traits which can be thought “neutral” and it’s been further adapted to cover those situations or objects which cause one briefly to pause, before moving on and probably forgetting what they’ve just seen.  It just wasn’t interesting.

Lindsay Lohan, de-influencing.

De-influencing: De-influencing is a word which will probably annoy the pedants.  In the social media era, the word influencer has come to mean “someone who seeks to influence the consumption, lifestyle, political behavior etc of their online audience by the creation of social media content, often as a part of a marketing campaign”.  A de-influencer is “someone who attempts to discourage consumption of particular products or consumption in general using the same platforms”.  So the de-influencers are the latest in the long tradition of anti-materialists who have existed at least since Antiquity, whole schools of philosophy sometimes constructed around their thoughts.  There’s said to be a discernible increase in their presence on the socials and many are linked also to the various movements concerned with environmental issues, notably climate change.  The pedants will object because the de-influencers are of course trying to exert influence but OUP are right to note the trend and the associated word.

Heat dome: A heat dome is a persistent high-pressure weather system over a particular geographic area, which traps a mass of hot air below it.  The weather phenomenon, the physics of which have for decades been understood by climate modelers and meteorologists, suddenly entered general use in the high (northern) summer of 2023 when much of the northern hemisphere suffered from prolonged, unusually high temperatures, July measured as the hottest month ever recorded.  Under a heat dome, the atmospheric pressure aloft prevents the hot air from rising and dissipating, effectively acting as a lid or cap over the area, thus the image of a dome sitting over the land.  Heat domes also create their own feedback loop: static areas of high pressure (which already contain warm or hot air trapped under the high) will become hotter and hotter.  Hot air will rise into the atmosphere, but high pressure acts as a lid and causes the air to subside or sink; as the air sinks, it warms by compression, and the heat builds.  The ground also warms, losing moisture and making it easier to heat even more.  This is climate change in action and heat dome may well become as common an expression as “cyclone” or “hurricane”.

The UK's Royal Meteorological Service's simple illustration of the physics of a heat dome.
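To put a rough figure on the compression warming (a standard result from atmospheric physics rather than anything in OUP's note): dry air sinking adiabatically warms at the dry adiabatic lapse rate

    Γd = g / cp ≈ (9.8 m s⁻²) / (1004 J kg⁻¹ K⁻¹) ≈ 9.8 °C per kilometre of descent

so air subsiding a kilometre or two under the high arrives some 10-20 °C warmer than where it started, which is why the trapped air mass keeps heating.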

Parasocial: The adjective parasocial designates a relationship characterized by the one-sided, unreciprocated sense of intimacy felt by a viewer, fan, or follower for a well-known or prominent figure (typically a pop-culture celebrity), in which the follower or fan comes to feel something similar to knowing the celebrity as they might an actual friend.  The parasocial is really a variation of fictosexual (an identity for someone for whom the primary form of sexual attraction is to fictional characters) in that the pop-culture celebrity is also an at least partially fictional construct and the relationship is just as remote.  It’s almost irrelevant that one is flesh & blood and parasocial relationships do have certain advantages in that, never having to have actual contact, one can never be rejected.  What appears most to have interested OUP is the idea that our relationship with celebrity culture is changing to something more intimate, presumably because the medium is the cell phone (mobile), increasingly our most personally intimate possession.

When one attempts to transform a parasocial relationship into something conventional, one sometimes becomes a stalker.

Monday, August 28, 2023

Doomsday

Doomsday (pronounced doomz-dey)

(1) In Christian eschatology, the day of the Last Judgment, at the end of the world (sometimes capital letter); the end of days; the end of times.

(2) Any day of judgment or sentence (sometimes initial capital).

(3) In casual use (since the 1950s), the destruction of the world by means of nuclear weapons.

(4) As doomsday weapon(s), the device(s) causing the destruction of the world; anything capable of causing widespread or total destruction.

(5) Given to or marked by forebodings or predictions of impending calamity; especially concerned with or predicting future universal destruction.

(6) As Doomsday Clock, a symbolic warning device indicating how close humanity is to destroying the world, run since 1947 as a private venture by the members of the Bulletin of the Atomic Scientists.

Pre 1000: A compound from the Middle English domes + dai, from the Old English construct dom (judgment) + dæg (day), dōmesdæg (sometimes dōmes dæg) (Judgment Day) and related to the Old Norse domsdagr.  Dome was borrowed from the Middle French dome & domme (which survives in Modern French as dôme), from the Italian duomo, from the Latin domus (ecclesiae) (literally “house (of the church)”), a calque of the Ancient Greek οἶκος τῆς ἐκκλησίας (oîkos tês ekklēsías); doublet of domus.  Dom was from the Proto-West Germanic dōm and was cognate with the Old Frisian dōm, the Old Saxon dōm, the Old High German tuom, the Old Norse dómr and the Gothic dōms.  The Germanic source was from a stem verb originally meaning “to place, to set”, a sense-development also found in the Latin statutum and the Ancient Greek θέμις (thémis).  Dai had the alternative forms deg, deag & dœg, all from the Proto-West Germanic dag; it was cognate with the Old Frisian dei, the Old Saxon dag, the Old Dutch dag, the Old High German tag, the Old Norse dagr and the Gothic dags.

In medieval England, doomsday was expected when the world's age had reached 6,000 years from the creation, thought to have been in 5200 BC (on that arithmetic the end was due circa AD 800), and the English Benedictine monk the Venerable Bede (circa 672-735) complained of being pestered by rustici (the "uneducated and coarse-mannered, rough of speech") asking him "how many years till the sixth millennium be endeth?"  However, despite the assertions (circa 1999) of the Y2K doomsday preppers, there is no evidence to support the story of a general panic in Christian Europe in the days approaching the years 800 or 1000 AD.  The use to describe a hypothetical nuclear bomb powerful enough to wipe out human life (or all life) on earth is from 1960 but the speculation was the work of people other than physicists and the general trend since the 1960s has been towards smaller devices although paradoxically, this has been to maximize the destructive potential through an avoidance of the "surplus ballistic effect" (ie the realization by military planners that blasting rubble into smaller-sized rocks was "wasted effort and bad economics").

The Domesday Book

Domesday is a proper noun that is used to describe the documents known collectively as the Domesday Book, at the time an enormous survey (a kind of early census) ordered by William I (circa 1028-1087; styled usually as William the Conqueror, King of England 1066-1087) in 1085.  The survey enumerated all the wealth in England and determined ownership in order to assess taxes.  Domesday was the Middle English spelling of doomsday, and is pronounced as doomsday.

Original Domesday book, UK National Archives, London.

The name Domesday Book (which was Doomsday in earlier spellings) was first recorded almost a century after 1086.  An addition to the manuscript was made probably circa 1114-1119 when it was known as the Book of Winchester and between then and 1179, it acquired the name by which it has since been known.  Just to clarify its status, the Treasurer of England himself announced “This book is called by the native English Domesday, that is Day of Judgement” (Dialogus de scaccario), adding that, like the Biblical Last Judgment, the decisions of Domesday Book were unalterable because “… as from the Last Judgment, there is no further appeal.”  This point was reinforced by a clause in the Dialogue of the Exchequer (1179) which noted “just as the sentence of that strict and terrible Last Judgement cannot be evaded by any art or subterfuge, so, when a dispute arises in this realm concerning facts which are written down, and an appeal is made to the book itself, the evidence it gives cannot be set at nought or evaded with impunity.”  It was from this point that began in England the idea of the centralised written record taking precedence over local oral traditions, the same concept which would evolve as the common law.

The Domesday Book described in remarkable detail the landholdings and resources of late eleventh century England and is illustrative of both the power of the government machine by the late medieval period and its deep thirst for information.  Nothing on the scale of the survey had been undertaken in contemporary Europe and it was not matched in comprehensiveness until the population censuses of the nineteenth century although Domesday is not a full population census, the names appearing almost wholly restricted to landowners who could thus be taxed.  It was for centuries used for administrative and legal purposes and often remains the starting point for historians but of late it has been subject to increasingly detailed textual analysis and it’s certainly not error-free.

The Doomsday Clock

The Doomsday Clock is a symbol that represents the likelihood of a man-made global catastrophe.  Maintained since 1947 by the members of the Bulletin of the Atomic Scientists (BOTAS), the clock was created as a metaphor for the threat to humanity posed by nuclear weapons.  On the clock, a hypothetical global catastrophe is represented as the stroke of midnight, BOTAS’s view of the closeness of that hour expressed as the number of minutes or seconds to midnight.  Every January, BOTAS’s Science and Security Board committee meets to decide where the hands of the clock should point and in recent years, other risk factors have been considered, including disease and climate change, the committee monitoring developments in science and technology that could inflict catastrophic damage.

Lindsay Lohan and her lawyer in court, Los Angeles, December 2011.

These concerns do have a long history in philosophy and theology but the use in 1945 of nuclear fission to create atomic weapons focused the minds of many more on the possibilities, the concerns growing in the second half of the twentieth century as the bombs got bigger and proliferated extraordinarily to the point where, if all were detonated in the right place at the right time, almost everyone on Earth would have been killed several times over.  At least on paper, the threat was real and even before Hiroshima made the world suddenly aware of the matter, there had been some in apocalyptic mood: Winston Churchill's (1875-1965; UK prime-minister 1940-1945 & 1951-1955) “finest hour” speech in 1940 warned of the risk civilization might “…sink into the abyss of a new Dark Age made more sinister, and perhaps more protracted, by the lights of perverted science”.  It had been a growing theme in liberal interwar politics since the implications of technology and the industrialisation of warfare had been writ large by World War I (1914-1918).

HG Wells’ (1866–1946) last book was Mind at the End of its Tether (1945), a slim volume best remembered for the fragment “…everything was driving anyhow to anywhere at a steadily increasing velocity”, seemingly describing a world which had become more complicated, chaotic and terrifying than anything he had prophesied in his fiction.  In this it’s often contrasted with the spirit of cheerful optimism and forward-looking stoicism of the book he published a few months earlier, The Happy Turning (1945), but that may be a misreading.  Mind at the End of its Tether is a curious text, easy to read yet difficult to reduce to a theme; in his review, George Orwell (1903-1950) called it “disjointed” and it does have a quality of vagueness, some chapters hinting at despair for all humanity, others suggesting hope for the future.  It’s perhaps the publication date that tints the opinions of some.  Although released some three months after the first use of atomic bombs in August 1945, publishing has lead-times and Wells hadn’t heard of the A-bomb at the time of writing, although he had predicted such a device in The World Set Free (1914).  In writing Mind at the End of its Tether, Wells, the great seer of science, wasn’t in dark despair at news of science’s greatest achievement, nuclear fission, but was instead a dying man disappointed about the terrible twentieth century which, at the end of the nineteenth, had offered such promise.

In 1947, though the USSR had still not even tested an atomic bomb and the US enjoyed exclusive possession of the weapon, BOTAS was well aware it was only a matter of time and the clock was set at seven minutes to midnight.  Adjustments have been made a couple of dozen times since, the most optimistic days being in 1991 with the end of the Cold War when it was seventeen minutes to midnight and the most ominous right now, BOTAS in 2023 choosing 90 seconds, ten seconds worse than the 100 settled on in 2020.

The committee each year issues an explanatory note and in 2021 noted the influences on their decision.  The COVID-19 pandemic was a factor, not because it threatened to obliterate civilization but because it “…revealed just how unprepared and unwilling countries and the international system are to handle global emergencies properly.  In this time of genuine crisis, governments too often abdicated responsibility, ignored scientific advice, did not cooperate or communicate effectively, and consequently failed to protect the health and welfare of their citizens.  As a result, many hundreds of thousands of human beings died needlessly.”  COVID-19, they noted, will eventually recede but the pandemic, as it unfolded, was a vivid illustration that national governments and international organizations are unprepared to manage nuclear weapons and climate change, which currently pose existential threats to humanity, or the other dangers—including more virulent pandemics and next-generation warfare—that could threaten civilization in the near future.  In 2023, the adjustment was attributed mostly to (1) the increased risk of the use of nuclear weapons after the Russian invasion of Ukraine, (2) climate change, (3) biological threats such as COVID-19 and (4) the spread of disinformation through disruptive technology such as generative AI (artificial intelligence).

The acceleration of nuclear weapons programs by many countries was thought to have increased instability, especially in conjunction with the simultaneous development of delivery systems increasingly adaptable to the use of conventional or nuclear warheads.  The concern was expressed this may raise the probability of miscalculation in times of tension.  Governments were considered to have “…failed sufficiently to address climate change” and while fossil fuel use needs to decline precipitously if the worst effects of climate change are to be avoided, instead “…fossil fuel development and production are projected to increase”.  Political factors were also mentioned including the corrosive effects of “false and misleading information disseminated over the internet…, a wanton disregard for science and the large-scale embrace” of conspiracy theories often “driven by political figures”.  They did offer a glimmer of hope, notably the change of administration in the US to one with a more aggressive approach to climate change policy and a renewed commitment to nuclear arms control agreements but it wasn’t enough to convince them to move the hands of the clock.  It remained a hundred seconds to midnight.

The clock is not without critics, even the Wall Street Journal (WSJ) expressing disapproval since falling under the control of Rupert Murdoch (b 1931).  There is the argument that after seventy years, its usefulness has diminished because over those decades it has become "the boy who cried wolf": a depiction of humanity on the precipice of the abyss, yet life went on.  Questions have also been raised about the narrowness of the committee and whether a body which historically has had a narrow focus on atomic weapons and security is adequately qualified to assess the range of issues which should be considered.  Mission creep too is seen as a problem.  The clock began as a means of expressing the imminence of nuclear war.  Is it appropriate to use the same mechanism to warn of impending climate change which has anyway already begun and is likely accelerating?  Global thermo-nuclear war can cause a catastrophic loss of life and societal disruption within hours, whereas the climate catastrophe is projected to unfold over decades and centuries.  Would a companion calendar be a more helpful metaphor?  The criticism may miss the point, the clock not being a track of climate change but of the political will to do something to limit and ameliorate the effects (everyone having realised it can’t be stopped).

Saturday, March 26, 2022

Doom

Doom (pronounced doom)

(1) Fate or destiny, especially adverse fate; unavoidable ill fortune.

(2) Ruin; death.

(3) A judgment, decision, or sentence, especially an unfavorable one.

(4) In Christian eschatology, the Last Judgment, at the end of days.

Pre 900: From the Middle English dome & doome, from the Old English dōm (a law, statute, decree; administration of justice, judgment; justice, equity, righteousness; condemnation), from the Proto-Germanic domaz (source also of the Old Saxon and Old Frisian dom, the Old Norse domr, the Old High German tuom (judgment, decree) and the Gothic doms (discernment, distinction)), possibly from the primitive Indo-European root dhe- (to set, place, put, do) (source also of the Sanskrit dhā́man (custom or law), the Greek themis (law) and the Lithuanian domė (attention)).  It was cognate with the Old Norse dōmr (judgement), the Old High German tuom (condition) and the Gothic dōms (sentence).  A book of laws in Old English was a dombec.

In all its original forms, it seems to have been used in a neutral sense but sometimes also as "a decision determining fate or fortune, irrevocable destiny."  The modern adverse sense of "fate, ruin, destruction" began in the early fourteenth century and evolved into its general sense after circa 1600, influenced by doomsday and the finality of the Christian Judgment.  The "crack of doom", the last trump, the signal for the dissolution of all things and the finality of the Christian Judgment Day, is most memorably evoked in the Old Testament, in Ezekiel 7:7-8.

(7) Doom has come upon you, upon you who dwell in the land. The time has come! The day is near! There is panic, not joy, on the mountains.

(8) I am about to pour out my wrath on you and spend my anger against you. I will judge you according to your conduct and repay you for all your detestable practices.

Doom Paintings

Doom paintings are the vivid depictions of the Last Judgment, that moment in Christian eschatology when Christ judges souls and sends them either to Heaven or Hell.  They became popular in medieval English churches as a form of graphical advertising to an often illiterate congregation, dramatizing the difference between the rapture of heaven and the agonies of hell, the consequences of a life of virtue or of wickedness.  During the English Reformation, many doom paintings were destroyed, thought by the new order rather too lavishly Romish.

Weltgericht (Last Judgement) (circa 1435); tempera on oak triptych by German artist Stefan Lochner (circa 1410–1451).

Thursday, June 29, 2023

Phlebotomy

Phlebotomy (pronounced fluh-bot-uh-mee)

(1) The act or practice of opening a vein for letting or drawing blood as a therapeutic or diagnostic measure; the letting of blood, known in historic medicine as "a bleeding".

(2) Any surgical incision into a vein (also known as venipuncture & (less commonly) venesection).  It shouldn’t be confused with a phlebectomy (the surgical removal of a vein).

1350–1400: From the earlier flebotomye & phlebothomy, from the Middle French flebotomie, from the thirteenth century Old French flebothomie (phlébotomie in Modern French), from the Late & Medieval Latin phlebotomia, from the Ancient Greek φλεβοτόμος (phlebotómos) (a lancet used to open a vein), the construct being φλέψ (phléps) (genitive phlebos) (vein), of uncertain origin + tomē (a cutting), from the primitive Indo-European root tem- (to cut).  The form replaced the Middle English fleobotomie.  The noun phlebotomist (one who practices phlebotomy, a blood-letter) is documented only as late as the 1650s but may have been in use earlier and operated in conjunction with the verb phlebotomize.  The earlier noun and verb in English (in use by the early fifteenth century) were fleobotomier & fleobotomien.  The Latin noun phlebotomus (genitive phlebotomī) (a lancet or fleam (the instruments used for blood-letting)) was from the Ancient Greek φλεβότομος (phlebótomos) (opening veins), the construct being φλέψ (phléps) (blood vessel) + τέμνω (témnō) (to cut) + -ος (-os) (the adjectival suffix).  The alternative spelling was flebotomus.  The noun fleam (a sharp instrument for opening veins in bloodletting (and this in the pre-anesthetic age)) was from the late Old English, from the Old French flieme (flamme in Modern French), from the Medieval Latin fletoma, from the Late Latin flebotomus, from the Greek φλεβοτόμος (phlebotómos) (a lancet used to open a vein).  The doublet was phlebotome and in barracks slang, a fleam was a sword or dagger.  Phlebotomy & phlebotomist are nouns, phlebotomize is a verb and phlebotomic & phlebotomical are adjectives; the noun plural is phlebotomies.

Phlebotomy describes the process of making a puncture in a vein with a cannula for the purpose of drawing blood.  In modern medicine the preferred term is venipuncture (used also for therapy) although the title phlebotomist continues to be used for those who specialize in the task.  One of the most frequently performed procedures in clinical practice, it’s commonly undertaken also by doctors, nurses and other medical staff.  Although the origins of phlebotomy lie in the ancient tradition of blood letting, it’s now most associated with (1) the taking of blood samples for testing by pathologists and (2) those procedures carried out as “therapeutic phlebotomies” as part of the treatment regimen for certain disorders of the blood.  The inner elbow is the most often used site but in therapeutic medicine or in cases where the veins in the arms are not suitable, other locations can be used.

Bleeding the foot (circa 1840), oil on canvas following Honoré Daumier (1808-1879).

It’s an urban myth that the Hippocratic Oath includes the clause: “First, do no harm” but by any reading that is a theme of the document and while the Greek physician Hippocrates of Kos (circa 460-circa 375 BC) wouldn’t have been the first in his field to regard illness as something to be treated as a natural phenomenon rather than something supernatural, he’s remembered because of his document.  His doctrine was one which took a long time to prevail (indeed there are pockets where still it does not), holding that treatment of ailments needed to be based on science (“evidence-based” the current phrase) rather than devotion or appeals to the gods.  His influence thus endures but one of his most famous theories persisted for centuries, resulting in much lost blood for no known benefit and an unknown number of deaths.  Drawing from the notion of earlier philosophers that the basis of the universe was air, earth, water & fire, the theory was that there were four “humors” which had to be maintained in perfect balance to ensure health in body & mind, the four being flegmat (phlegm), sanguin (blood), coleric (yellow bile) & melanc (black bile), the source of the four personality types: the phlegmatic, the sanguine, the choleric & the melancholic.  Had Hippocrates and his successors left the humors in the realm of the speculative, it would now be thought some amusing fragment from Antiquity but unfortunately surgical intervention was designed to ensure balance was maintained and the mechanism of choice was bloodletting to “remove excess liquids”.

George Washington in his last illness, attended by Doctors Craik and Brown (circa 1800) engraving by unknown artist, Collection of The New-York Historical Society.

Apparently, bloodletting was practiced by the ancient Egyptians some 3000 years ago and it’s not impossible it was among the medical (or even religious) practices of older cultures.  From there it’s known to have spread to the Middle East, Rome, Greece and West & South Asia, physicians and others spilling blood in the quest to heal, and the evidence suggests it was advocated for just about any symptom.  The very idea probably sounds medieval but in the West that really was the nature of so much medicine until the nineteenth century and even well into the twentieth, there were still some reasonably orthodox physicians advocating its efficacy.  Still, in fairness to Hippocrates, he was a pioneer in what would now be called “holistic health management” which involved taking exercise, eating a balanced diet and involving the mind in art & literature.  He was an influencer in his time.  All the humors were of course good but only in balance so there could be too much of a good thing.  When there was too much, what was in excess had to go and apart from bloodletting, there was purging, catharsis & diuresis, none of which sound like fun.  Bloodletting however was the one which really caught on and was for centuries a fixture in the surgeon’s bag.

Blood self-letting: Lindsay Lohan as Carrie from the eponymous film, Halloween party, Foxwoods Resort & Casino, Connecticut, October 2013.

Actually, as the profession evolved, the surgeons emerged from the barber shops, where they would pull teeth too.  The formal discipline of the physician did evolve but physicians restricted themselves to providing the diagnosis and writing scripts from which the apothecary would mix his potions and pills, some of which proved more lethal than bloodletting.  The bloodletting technique involved draining blood from a large vein or artery (the most productive soon found to be the median cubital at the elbow) but if a certain part of the body was identified as being out-of-balance, there would be the cut.  The mechanisms to induce blood loss included cupping, leeching & scarification and with the leeches, they were actually onto something, the thirsty creatures still used today in aspects of wound repair and infection control, able often to achieve better results more quickly than any other method.  Leeches have demonstrated extraordinary success in handling the restoration of blood flow after microsurgery and reimplantation and it works because the little parasites generate substances like fibrinase, vasodilators, anticoagulants & hyaluronidase, releasing them into the wound area where they assist the healing process by providing an unrestricted blood flow.  Of course the leeches don't always effect a cure.  When in 1953 doctors were summoned to examine a barely conscious comrade Stalin (1878-1953; Soviet leader 1924-1953), after their tests they diagnosed a haemorrhagic stroke involving the left middle cerebral artery.  In an attempt to lower his blood pressure, two separate applications of eight leeches each were applied over 48 hours but it was to no avail.  Had he lived he might have had the leeches shot but they probably lived to be of further service.

A Surgeon Letting Blood from a Woman's Arm, and a Physician Examining a Urine-flask (in some descriptions named Barber-Surgeon Bleeding a Patient), eighteenth century oil on canvas, attributed to the school of Jan Josef Horemans (Flemish; 1682-1752); previously attributed to Richard Brakenburg (Dutch; 1650-1702) and to the Flemish School.

Scarification was a scraping of the skin and if the circumstances demanded more, leeches could be added.  Cupping used dome-shaped cups placed on the skin to create blisters through suction, the suction achieved through the application of heat once the cups were in place.  However it was done, it could be a messy, bloody business and in the twelfth century the Church banned the practice, calling it “abhorrent”, which had the effect of depriving priests and monks of a nice, regular source of income and wasn’t popular.  However, especially in remote villages far from the bishop’s gaze, the friars continued to wield their blades and harvest their leeches, the business of bloodletting now underground.  In the big towns and cities though, the barbers added bloodletting to their business model and it’s tempting to wonder whether package deals were offered, bundling a bleeding with a tooth-pulling or a haircut & shave.  From here it was a short step to getting into amputations, a not uncommon feature of life before there were antibiotics, and to advertise their services, the barber-surgeons would hang out white rags smeared in places with blood, the origin of the red and white striped poles some barbers still display.  To this day the distinction between surgeons and physicians remains and in England the Royal College of Physicians (the RCP, a kind of trade union) was founded by royal charter in 1518.  By the fourteenth century there were already demarcation disputes between the barber surgeons and the increasingly gentrified surgeons and a number of competing guilds and colleges were created, sometimes merging, sometimes breaking into factions until 1800 when the Royal College of Surgeons (RCS) was brought into existence.  It's said there was a time when fellows of the RCP & RCS, when speaking of each other, would only ever make reference to "the other college", the name of the institution never passing their lips.

Bloodletting tools: Late eighteenth century brass and iron “5-fingered” fleam.

Unfortunately, while doubtless lobbying to ensure the fees of their members remained high, the colleges did little to advance science and the byword among the population remained: “One thing's for sure: if illness didn't kill you, doctors would”.  It was the researchers of the nineteenth century, who first suggested and then proved germ theory, who sounded the death knell for most bloodletting, what was visible through their microscopes rendering the paradigm of the four humors obsolete.  By the twentieth century it was but a superstition.