
Monday, October 16, 2023

Sponge

Sponge (pronounced spuhnj)

(1) Any aquatic, chiefly marine animal of the phylum Porifera (also called poriferan), having a porous structure and usually a horny, siliceous or calcareous internal skeleton or framework, occurring in large, sessile (permanently attached to a substrate and not able independently to move) colonies.

(2) The light, yielding, porous, fibrous skeleton or framework of certain animals or colonies of this group, especially of the genera Spongia and Hippospongia, from which the living matter has been removed, characterized by readily absorbing water and becoming soft when wet while retaining toughness: used in bathing, in wiping or cleaning surfaces, etc.

(3) Any of various other similar substances (made typically from porous rubber or cellulose and similar in absorbency to this skeleton), used for washing or cleaning and suited especially to wiping flat, non-porous surfaces (bath sponge, car-wash sponge etc).

(4) Used loosely, any soft substance with a sponge-like appearance or structure.

(5) Used loosely, any object which rapidly absorbs something.

(6) As “sponge theory” (1) a term used in climate science which tracks the processes by which tropical forests "flip" from absorbing to emitting carbon dioxide and (2) one of the competing ideas in the configuration of the US nuclear arsenal which supports the retention of the triad (intercontinental ballistic missiles (ICBM), submarine-launched ballistic missiles (SLBM) and those delivered by strategic bombers).

(7) A person who absorbs something efficiently (usually in the context of information, education or facts).

(8) A person who persistently borrows from or lives at the expense of others; a parasite (usually described as “a sponger” or one who “sponges off” and synonymous with a “leech”).

(9) In disparaging slang, a habitual drinker of alcohol who is frequently intoxicated (one who is more mildly affected is said to be “spongy”, a synonym of “tipsy”).

(10) In metallurgy, a porous mass of metallic particles, as of platinum, obtained by the reduction of an oxide or purified compound at a temperature below the melting point; iron from the puddling furnace, in a pasty condition; iron ore, in masses, reduced but not melted or worked.

(11) In clinical medicine, a sterile surgical dressing of absorbent material, usually cotton gauze, for wiping or absorbing pus, blood, or other fluids during a surgical operation.

(12) In hospitals and other care institutions, as sponge bath, a method of hygiene whereby a patient is cleaned with a sponge (usually with soap & water) while in a chair or bed.

(13) In cooking (baking), dough raised with yeast, after it has been converted into a light, spongy mass by the agency of the yeast or leaven but before it is kneaded and formed into loaves.

(14) In cooking, a light, sweet pudding of a porous texture, made with gelatin, eggs, fruit juice or other flavoring ingredients; popular as a cake, often multi-layered with whipped cream (or similar) between.

(15) In birth control, a contraceptive made with a disposable piece of polyurethane foam permeated with a spermicide for insertion into the vagina.

(16) As “makeup sponge” or “beauty sponge”, a device for applying certain substances to the skin (most often blusher and similar products to the face).

(17) In ballistics, a mop for cleaning the bore of a cannon after a discharge, consisting of a cylinder of wood, covered with sheepskin with the wool on, or cloth with a heavy looped nap, and having a handle, or staff.

(18) In farriery, the extremity (or point) of a horseshoe, corresponding to the heel.

(19) In the slang of the nuclear industry, a worker routinely exposed to radiation.

(20) To wipe or rub with or as with a wet sponge; to moisten or clean.

(21) To remove with a (usually moistened) sponge (usually followed by off, away, etc.).

(22) To wipe out or efface with or as with a sponge (often followed by out).

(23) To take up or absorb with or as with a sponge (often followed by up).

(24) Habitually to borrow, use, or obtain by imposing on another's good nature.

(25) In ceramics, to decorate (a ceramic object) by dabbing at it with a sponge soaked with color, or any use of a sponge to render a certain texture on the surface.

(26) To take in or soak up liquid by absorption.

(27) To gather sponges (from the beach or ocean).

(28) In marine biology (in behavioral zoology, of dolphins), the description of the use of a piece of wild sponge as a tool when foraging for food.

Pre 1000: From the Middle English noun sponge, spunge & spounge, from the Old English noun sponge & spunge (absorbent and porous part of certain aquatic organisms), from the Latin spongia & spongea (a sponge; also the sea animal from which a sponge comes), from the Ancient Greek σπογγιά (spongiá), related to σπόγγος (spóngos) (sponge).  At least one etymologist called it “an old Wanderwort” while another speculated it was probably a loanword from a non-Indo-European language, borrowed independently into Greek, Latin and Armenian in a form close to “sphong-”.  From the Latin came the Old Saxon spunsia, the Middle Dutch spongie, the Old French esponge, the Spanish esponja and the Italian spugna.  In English, the word has been used of the sea animals since the 1530s and of just about any sponge-like substance since the turn of the seventeenth century; the figurative use in reference to one adept at absorbing facts or learning emerged about the same time.  The sense of “one who persistently and parasitically lives on others” has been in use since at least 1838.  The sponge-cake (light, fluffy & sweet) has been documented since 1808 but similar creations had long been known.  Sponge is a noun & verb, sponged & sponging are verbs, spongeless, spongy, spongable, spongiform & spongelike are adjectives, sponginess is a noun and spongingly is an adverb; the noun plural is sponges.

The verb emerged late in the fourteenth century as spongen (to soak up with a sponge) or (as a transitive verb) “to cleanse or wipe with a sponge”, both uses derived from the noun and presumably influenced by the Latin spongiare.  The intransitive sense “dive for sponges, gather sponges where they grow” was first documented in 1881, by observers watching harvesting in the Aegean.  The slang use meaning “deprive someone of (something) by sponging” was in use by at least the 1630s, the later intransitive sense of “live in a parasitic manner, live at the expense of others” documented in the 1670s, the more poetic phrase “live upon the sponge” (live parasitically, relying on the efforts of others) dating from the 1690s; such folk have been described as “spongers” since the 1670s.  However, in the 1620s, the original idea was that the victim was “the sponge” because they were “being squeezed”.  The noun sponge in the general sense of “an object from which something of value may be extracted” was in use by circa 1600; the later reference to “the sponger” reversed this older sense.  In what was presumably an example of military humor, the noun sponger also had a use in the army and navy, referring to the member of a cannon’s crew who wielded the pole (with a sponge attached to one end) to clean the barrel of the weapon after discharge.  It’s not clear when it came into use but it’s been documented since 1828.

The adjective spongiform (resembling a sponge, sponge-like; porous, full of holes) dates from 1774 and seems now restricted to medical science, the incurable and invariably fatal neurodegenerative disease of cattle "bovine spongiform encephalopathy" (BSE) the best known use although the public understandably prefer the more evocative "mad cow disease".  The adjective spongy (soft, elastic) came into use in the 1530s in medicine & pathology, in reference to morbid tissue seen as open or porous (not necessarily soft), and was applied after the 1590s to hard material (especially bone).  In late fourteenth century Middle English, there was spongious (sponge-like in nature), again directly from the Latin.  In idiomatic use and dating from the 1860s, to “throw in the sponge” was to concede defeat; yield or give up the contest.  The form is drawn from prize-fighting where the sponge (sitting usually in a bucket of water and used to wipe blood from the boxer’s face) is thrown into the ring by the trainer or second, indicating to the referee the fight must immediately be stopped.  The later phrase “throw in the towel” means the same thing and is of the same origin, although some older style guides insist the correct forms are “throw up the sponge” and “throw in the towel”.  To the beaten and bloodied boxer, it probably was an unnoticed technical distinction.

Sea sponges.

In zoology, sponges are any of the many aquatic (mostly sea-based) invertebrate animals of the phylum Porifera, characteristically having a porous skeleton, usually containing an intricate system of canals composed of fibrous material or siliceous or calcareous spicules.  Water passing through the pores is the delivery system the creatures use to gain nutrition.  Sponges are known to live at most depths of the sea, are sessile (permanently attached to a substrate; all but a handful not able independently to move (fully-grown sponges do not have moving parts, but the larvae are free-swimming)) and often form irregularly shaped colonies.  Sponges are now considered the most primitive extant members of the animal kingdom as they lack a nervous system and differentiated body tissues or organs, although they have great regenerative capacities, some species able to regenerate a complete adult organism from fragments as small as a single cell.  Sponges first appeared during the early Cambrian Period over half a billion years ago and may have evolved from protozoa.

Of sponges and brushes

Dior Backstage Blender (Professional Finish Fluid Foundation Sponge).

Both makeup brushes and makeup sponges can be used to apply blush or foundation and unless there’s some strong personal preference, most women probably use both, depending on the material to be applied and the look desired.  Brushes are almost always long-bristled and soft sometimes to the point of fluffiness with a rounded shape which affords both precision and the essential ability to blend at the edges.  Brushes are popular because they offer great control over placement & blending (users debating whether a long or short handle is most beneficial in this and it may be that both work equally well if one’s technique is honed).  Brushes can be used with most varieties of formulation including powders and creams.

Lindsay Lohan in court, October 2011.

This not entirely flattering application of a grey-brown shade of blusher attracted comment, the consensus being it was an attempt to create the effect of hollowed cheekbones, a look wildly popular during the 1980s-1990s and one to which her facial structure was well-suited.  However, the apparently “heavy handed” approach instead suggested bruising.  The “contoured blush look” is achieved with delicacy and Benjamin Disraeli (1804-1881; UK prime-minister 1868 & 1874-1880) might have called this: “laying it on with a trowel”.  It’s not known if Ms Lohan used a brush or a sponge but her technique may have been closer to that of the bricklayer handling his trowel.  Makeup sponges (often called “beauty blenders”) are preferred by many to brushes and are recommended by the cosmetic houses especially for applying cream or liquid products.  They’re claimed to be easier to use than a brush and for this reason are often the choice of less experienced or occasional users; they create a natural, dewy finish, blending the product seamlessly into the skin and avoiding the more defined lines which brushes can produce.  When used with a powder blush, sponges produce an airbrushed, diffused effect and are much easier to use for those applying their own make-up in front of a mirror, a situation in which the “edging” effect inherent in brush use can be hard to detect.  Professional makeup artists will use both sponges and brushes when working on others, the choice dictated by the product in use and the effect desired.

Sponge theory

The awful beauty of our weapons: Test launch of Boeing LGM-30G Minuteman III ICBM.

Ever since the US military (sometimes in competition with politicians) first formulated a set of coherent policies which set out the circumstances in which nuclear weapons would be used, there have been constant revisions to the plans.  At its peak, the nuclear arsenal contained some 30,000 weapons and the target list extended to a remarkable 10,000 sites, almost all in the Soviet Union (USSR), the People’s Republic of China (PRC), the Baltic States and countries in Eastern Europe.  Even the generals admitted there was some degree of overkill in all this but rationalized the system on the basis it was the only way to guarantee a success rate close to 100%.  That certainly fitted in with the US military’s long-established tradition of “overwhelming” rather than merely “solving” problems.

US nuclear weapons target map 1956 (de-classified in 2015).

Over the decades, different strategies were from time-to-time adopted as tensions rose and fell or in response to changes in circumstances such as arms control treaties and, most obviously, the end of the Cold War when the USSR was dissolved.  The processes which produced these changes were always the same: (1) inter-service squabbles between the army, navy & air force, (2) the struggle between the politicians and the top brass (many of whom proved politically quite adept), (3) the influence of others inside and beyond the “nuclear establishment” including the industrial concerns which designed and manufactured the things, those in think tanks & academic institutions and (4) the (usually anti-nuclear) lobby and activist community.  Many of the discussions were quite abstract, something the generals & admirals seemed to prefer, probably because one of their quoted metrics in the early 1950s was that if in a nuclear exchange there were 50 million dead Russians and only 20 million dead Americans then the US could be said to have “won the war”.  When critics pursued this to its logical conclusion and asked if that was the result even if only one Russian and two Americans were left alive, the military tended to restrict themselves to targets, megatons and abstractions, any descent to specifics like body-counts just tiresome detail.  This meant the strategies came to be summed up in short, punchy, indicative terms like “deterrence”, “avoidance of escalation” & “retaliation” although the depth was sufficient for even the “short” version prepared for the president’s use in the event of war to be an inch (25 mm) thick.  What was described varied from a threat of use, a limited strike and various forms of containment (the so-called "limited nuclear war") to, sometimes, the doomsday option: global thermo-nuclear war.  However, during the administration of Barack Obama (b 1961; US president 2009-2017) there emerged a genuine linguistic novelty: “sponge theory”.

US Air Force Boeing B-52 Stratofortress (1952-, left) and Northrop Grumman B-2 Spirit (1989-, right).

The term “sponge theory” had been used in climate science to describe a mechanism which tracks the processes by which tropical forests "flip" from absorbing to emitting carbon dioxide (à la a sponge which absorbs water which can be expelled when squeezed) but in the matter of nuclear weapons it was something different.  At the time, the debates in the White House, the Congress and even some factions within the military were about whether what had become the traditional “triad” of nuclear weapons ((1) intercontinental ballistic missiles (ICBM), (2) submarine-launched ballistic missiles (SLBM) and (3) those delivered by strategic bombers) should be maintained.  By “maintained” that of course meant periodically refurbished & replaced.  The suggestion was that the ICBMs should be retired, the argument being they were a Cold War relic, the mere presence of which threatened peace because they encouraged a "first strike" (actually by either side).  However, the counter argument was that in a sense, the US was already running a de-facto dyad because, dating from the administration of George HW Bush (George XLI, 1924-2018; US president 1989-1993), none of the big strategic bombers had been on “runway alert” (ie able to be scrambled for a sortie within minutes) and only a few were stored in hangars with their bombs loaded.  Removing the ICBMs from service, went the argument, would leave the nation dangerously reliant on the SLBMs which, in the way of such things, might at any time be rendered obsolete by advances in sensor technology and artificial intelligence (AI).  The British of course had never used ICBMs and had removed the nuclear strike capability from their bombers, thus relying on a squadron of four submarines (one of which is on patrol somewhere 24/7/365) with SLBMs, but the British system was a pure "independent nuclear deterrent", what the military calls a "boutique bomb".

Test launch of US Navy Trident-II-D5LE SLBM.

There was also the concern that land-to-submarine and air-to-submarine communications were not wholly reliable and this, added to the other arguments, won the case for the triad but, just in case, the Pentagon had formulated “sponge theory”, about their catchiest phrase since “collateral damage”.  The idea of sponge theory was that were the ICBMs retired, Moscow or Beijing would have only five strategic targets in the continental US: the three bomber bases (in the flyover states of Louisiana, Missouri & North Dakota) and the two submarine ports, in Georgia on the south Atlantic coast and in Washington state in the Pacific north-west.  A successful attack on those targets could be mounted with fewer than a dozen missiles (in theory half that number because of the multiple warheads), which would mean the retaliatory capacity of the US would be limited to the SLBMs carried by the six submarines on patrol.  Given that, a president might be reluctant to use them because of the knowledge Moscow (and increasingly Beijing) could mount a second, much more destructive attack.  However, if the 400 ICBMs remained in service, an attack on the US with any prospect of success would demand the use of close to 1000 missiles, something to which any president would be compelled to respond, and the US ICBMs would be in flight to their targets long before the incoming Russian or Chinese missiles hit.  The function of the US ICBM sites, acting as a sponge (soaking up the targeting, squeezing out the retaliation), would deter an attack.  As it was, the 400-odd Boeing LGM-30 Minuteman ICBMs remained in service in silos, also in flyover states: Montana, North Dakota and Wyoming.  After over fifty years in service, the Minuteman is due for replacement in 2030 and there’s little appetite in Washington DC or in the Pentagon to discuss any change to the triad.

Thursday, October 12, 2023

Gap

Gap (pronounced gap)

(1) A break or opening, as in a fence, wall, or military line; breach; an opening that implies a breach or defect (vacancy, deficit, absence, or lack).

(2) An empty space or interval; interruption in continuity; hiatus.

(3) A wide divergence or difference; disparity.

(4) A difference or disparity in attitudes, perceptions, character, or development, or a lack of confidence or understanding, perceived as creating a problem.

(5) A deep, sloping ravine or cleft through a mountain ridge.

(6) In regional use (in most of the English-speaking world and especially prominent in the US), a mountain pass, gorge, ravine, valley or similar geographical feature (also in some places used of a sheltered area of coast between two cliffs and often applied in locality names).

(7) In aeronautics, the distance between one supporting surface of an airplane and another above or below it.

(8) In electronics, a break in a magnetic circuit that increases the inductance and saturation point of the circuit.

(9) In various field sports (baseball, cricket, the football codes etc), those spaces between players which afford some opportunity to the opposition.

(10) In genetics, an un-sequenced region in a sequence alignment.

(11) In slang (New Zealand), suddenly to depart.

(12) To make a gap, opening, or breach in.

(13) To come open or apart; form or show a gap.

1350–1400: From the Middle English gap & gappe (an opening in a wall or hedge; a break, a breach), from the Old Norse gap (gap, empty space, chasm), akin to the Old Norse gapa (to open the mouth wide; to gape; to scream), from the Proto-Germanic gapōną, from the primitive Indo-European root ghieh (to open wide; to yawn, gape, be wide open) and related to the Middle Dutch & Dutch gapen, the German gaffen (to gape, stare), the Danish gab (an expanse, space, gap; open mouth, opening), the Swedish gap & gapa and the Old English ġeap (open space, expanse).  Synonyms for gap can include pause, interstice, break, interlude, lull but probably not lacuna (which is associated specifically with holes).  Gap is a noun & verb, gapped & gapping are verbs, gapless & gappy are adjectives; the noun plural is gaps.

Lindsay Lohan demonstrates a startled gape, MTV Movie-Awards, Gibson Amphitheatre, Universal City, California, June 2010.

The use to describe natural geographical formations (“a break or opening between mountains”, which later extended to “an unfilled space or interval, any hiatus or interruption”) emerged in the late fifteenth century and became prevalent in the US, used of deep breaks or passes in a long mountain chain (especially one through which a waterway flows) and often used in locality names.  The use as a transitive verb (to make gaps; to gap) evolved from the noun and became common in the early nineteenth century as the phrases became part of the jargon of mechanical engineering and metalworking (although in oral use the forms may long have existed).  The intransitive verb (to have gaps) is documented only since 1948.  The verb gape dates from the early thirteenth century and may be from the Old English ġeap (open space, expanse) but most etymologists seem to prefer a link with the Old Norse gapa (to open the mouth wide; to gape; to scream); it was long a favorite way of alluding to the expressions thought stereotypical of the “idle curiosity, listlessness, or ignorant wonder of bumpkins and other rustics” and is synonymous with “slack-jawed yokels”.  The adjective gappy (full of gaps; inclined to be susceptible to gaps opening) dates from 1846.  The adjectival use gap-toothed (having teeth set wide apart) has been in use since at least the 1570s, but earlier, Geoffrey Chaucer (circa 1344-1400) had used “gat-toothed” for the same purpose, gat from the Middle English noun gat (opening, passage), from the Old Norse gat and cognate with gate.

Lindsay Lohan demonstrates her admirable thigh gap, November 2013.

The “thigh gap” seems first to have been documented in 2012 but gained critical mass on the internet in 2014 when it became one of those short-lived social phenomena which produce a minor moral panic.  “Thigh gap” described the empty space between the inner thighs of a woman when standing upright with feet touching; a gap was said to be good and the lack of a gap bad.  Feminist criticism noted it was not an attribute enjoyed by a majority of mature human females and it thus constituted just another of the “beauty standards” imposed on women which were an unrealizable goal for the majority.  The pro-ana community ignored this critique and thinspiration (thinspo) bloggers quickly added annotated images and made the thigh gap an essential aspect of female physical attractiveness.

A walking, talking credibility gap: crooked Hillary Clinton (b 1947; US secretary of state 2009-2013).

In English, gap has been prolific in the creation of phrases & expressions.  The “generation gap” sounds modern and as a phrase it came into wide use only in the 1960s in reaction to the twin constructs of “teenagers” and the “counter-culture” but the concept has been documented since antiquity and refers to a disconnect between youth and those older, based on different standards of behavior, dress, artistic taste and social mores.  The term “technology gap” was created in the early 1960s and was from economics, describing the various implications of a nation’s economy gaining a competitive advantage over others by the creation or adoption of certain technologies.  However, the concept was familiar to militaries which had long sought to quantify and rectify any specific disadvantage in personnel, planning or materiel they might suffer compared to their adversaries; these instances are described in terms like “missile gap”, “air gap”, “bomber gap”, “megaton gap” et al (and when used of materiel the general term “technology deficit” is also used).  Rearmament is the usual approach but there can also be “stop gap” solutions which are temporary (often called “quick & dirty” (Q&D)) fixes which address an immediate crisis without curing the structural problem.  For a permanent (something often illusory in military matters) remedy for a deficiency, one is said to “bridge the gap”, “gap-fill” or “close the gap”.  The phrase “stop gap” in the sense of “that which fills a hiatus, an expedient in an emergency” appears to date from the 1680s and may have been first a military term referring to a need urgently to “plug a gap” in a defensive line, “gap” used by armies in this sense since the 1540s.  The use as an adjective dates from the same time in the sense of “filling a gap or pause”.  
A “credibility gap” is a discrepancy between what’s presented as reality and a perception of what reality actually is; it’s applied especially to the statements of those in authority (politicians like crooked Hillary Clinton the classic but not the only examples).  “Pay gap” & “gender gap” are companion terms used most often in labor-market economics to describe the differences in aggregate or sectoral participation and income levels between a baseline group (usually white men) and others who appear disadvantaged.

“Gap theorists” (known also as “gap creationists”) are those who claim the account of the Earth and all who inhabit the place being created in six 24 hour days (as described in the Book of Genesis in the Bible’s Old Testament) literally is true but that there was a gap of time between the two distinct creations in the first and the second verses of Genesis.  What this allows is a rationalization of modern scientific observation and analysis of physical materials which have determined the age of the planet.  This hypothesis can also be used to illustrate the use of the phrase “credibility gap”.  In Australia, gap is often used to refer to the (increasingly large) shortfall between the amount health insurance funds will pay compared with what the health industry actually charges; the difference, paid by the consumer (doctors still insist on calling them patients), is the gap (also called the “gap fee”).  In Australia, the term “the gap” has become embedded in the political lexicon to refer to the disparity in outcomes between the indigenous and non-indigenous communities in fields such as life expectancy, education, health, employment, incarceration rates etc.  By convention, it can be used only to refer to the metrics which show institutional disadvantage but not other measures where the differences are also striking (smoking rates, crime rates, prevalence of domestic violence, drug & alcohol abuse etc) and it’s thus inherently political.  Programmes have been designed and implemented with the object of “closing the gap”; the results have been mixed.

Opinion remains divided on the use of platinum-tipped spark plugs in the Mercedes-Benz M100 (6.3 & 6.9) V8.

A “spark gap” is the space between two conducting electrodes, filled usually with air (or in specialized applications some other gas) and designed to allow an electric spark to pass between the two.  One of the best known spark gaps is that in the spark (or sparking) plug which provides the point of ignition for the fuel-air mixture in internal combustion engines (ICE).  Advances in technology mean fewer today are familiar with the intricacies of spark plugs, once a familiar (and often an unwelcome) sight to many.  The gap in a spark plug is the distance between the center and ground electrode (at the tip) and the size of the gap is crucial in the efficient operation of an ICE.  The gap size, although the differences would be imperceptible to most, is not arbitrary and is determined by the interplay of the specifications of the engine and the ignition system including (1) the compression ratio (low compression units often need a larger gap to ensure a larger spark is generated), (2) the ignition system, high-energy systems usually working better with a larger gap, (3) the materials used in the plug’s construction (the most critical variable being their heat tolerance); because copper, platinum and iridium are used variously, different gaps are specified to reflect the variations in thermal conductivity and the temperature range able to be endured and (4) the application, high-performance engines or those used in competition involving sustained high-speed operation often using larger gaps to ensure a stronger and larger spark.

Kennedy, Khrushchev and the missile gap

The “missile gap” was one of the most discussed threads in the campaign run by the Democratic Party’s John Kennedy (JFK, 1917–1963; US president 1961-1963) in the 1960 US presidential election in which his opponent was the Republican Richard Nixon (1913-1994; US president 1969-1974).  The idea there was a “missile gap” was based on a combination of Soviet misinformation, a precautionary attitude by military analysts in which the statistical technique of extrapolation was applied on the basis of a “worst case scenario” and blatant empire building by the US military, notably the air force (USAF), anxious not to surrender to the navy their pre-eminence in the hierarchy of nuclear weapons delivery systems.  It’s true there was at the time a missile gap but it was massively in favor of the US which possessed several dozen inter-continental ballistic missiles (ICBM) while the USSR had either four or six, depending on the definition used.  President Dwight Eisenhower (1890-1969; US president 1953-1961), a five-star general well acquainted with the intrigues of the military top brass, was always sceptical about the claims and had arranged the spy flights which confirmed the real count but was constrained from making the information public because of the need to conceal his source of intelligence.  Kennedy may actually have known his claim was incorrect but, finding it resonated with the electorate, continued to include it in his campaigning, knowing the plausibility was enhanced in a country where people were still shocked by the USSR having in 1957 launched Sputnik I, the first ever earth-orbiting satellite.  Sputnik had appeared to expose a vast gap between the scientific capabilities of the two countries, especially in the matter of big missiles. 

President Kennedy & comrade Khrushchev at their unproductive summit meeting, Vienna, June 1961.

Fake gaps in such matters were actually nothing new.  Some years earlier, before there were ICBMs, so in any nuclear war the two sides would have had to use aircraft to drop bombs on each other (à la Hiroshima & Nagasaki in 1945), there’d been a political furore about the claim the US suffered a “bomber gap” and would thus be unable adequately to respond to any attack.  In truth, by a simple sleight of hand little different to that used by Nazi Germany in 1935 to convince worried British politicians that the Luftwaffe (the German air force) was already as strong as the Royal Air Force (RAF), Moscow had greatly inflated the numbers and stated capability of their strategic bombers, a perception concerned US politicians were anxious to believe.  The USAF would of course be the recipient of the funds needed to build the hundreds (the US would end up building thousands) of bombers needed to equip all those squadrons and their projections of Soviet strength were higher still.  If all of this building stuff to plug non-existent gaps had happened in isolation, it would have been wasteful of money and natural resources, which was bad enough, but this hardware made up the building blocks of nuclear strategy; the Cold War was not an abstract exercise where on both sides technicians with clipboards walked from silo to silo counting warheads.

Instead, the variety of weapons, their different modes of delivery (from land, sea, undersea and air), their degrees of accuracy and their vulnerability to counter-measures were constantly calculated to assess their utility as (1) deterrents to an attack, (2) counter-offensive weapons with which to respond to an attack or (3) first-strike weapons with which to stage a pre-emptive or preventative attack.  In the Pentagon, the various high commands and the burgeoning world of the think tanks, this analysis was quite an industry and it also had to factor in the impossible: working out how the Kremlin would react.  In other words, what the planners needed to do was create a nuclear force strong enough to deter an attack yet not so threatening that it would encourage one, and that only scratched the surface of the possibilities; each review (and there were many) would produce detailed study documents several inches thick.

US Navy low-level spy photograph of San Cristobal medium-range ballistic missile (MRBM) site #1, Cuba, 23 October 1962.

In October 1962, during the Cuban Missile Crisis, the somewhat slimmer nuclear war manuals synthesized from those studies were being read with more interest than usual.  It was a tense situation and had Kennedy and comrade Nikita Khrushchev (1894–1971; Soviet leader 1953-1964) not agreed to a back-channel deal, the US would probably have attacked Cuba in some manner, not knowing three divisions of the Red Army were stationed there to protect the Soviet missiles, and that would have been a state of armed conflict which could have turned into some sort of war.  As it was, under the deal, Khrushchev withdrew the missiles from Cuba in exchange for Kennedy’s commitment not to invade Cuba and to withdraw 15 obsolescent nuclear missiles from Turkey, the stipulation being the Turkish component must be kept secret.  That secrecy colored for years the understanding of the Cuban Missile Crisis and the role the US nuclear arsenal played in influencing the Kremlin.  The story was that the US stayed resolute, rattled the nuclear sabre and that was enough to force the Soviet withdrawal.  Among those not told the truth was Lyndon Johnson (LBJ, 1908–1973; US president 1963-1969), who became president after Kennedy was assassinated in 1963, and historians have attributed his attitude to negotiation during the Vietnam War to not wishing to be unfavorably compared with his predecessor who, as Dean Rusk (1909–1994; US secretary of state 1961-1969) put it, stood “eyeball to eyeball” with Khrushchev and “made him blink first”.  The existence of all those doomsday weapons would distort Soviet and US foreign policy for years to come.

Monday, August 28, 2023

Doomsday

Doomsday (pronounced doomz-dey)

(1) In Christian eschatology, the day of the Last Judgment, at the end of the world (sometimes capital letter); the end of days; the end of times.

(2) Any day of judgment or sentence (sometimes initial capital).

(3) In casual use (since the 1950s), the destruction of the world by means of nuclear weapons.

(4) As doomsday weapon(s), the device(s) causing the destruction of the world; anything capable of causing widespread or total destruction.

(5) Given to or marked by forebodings or predictions of impending calamity; especially concerned with or predicting future universal destruction.

(6) As Doomsday Clock, a symbolic warning device indicating how close humanity is to destroying the world, run since 1947 as a private venture by the members of the Bulletin of the Atomic Scientists.

Pre 1000: A compound from the Middle English domes + dai, from the Old English construct dom (judgment) + dæg (day), dōmesdæg (sometimes dōmes dæg) (Judgment Day) and related to the Old Norse domsdagr.  Dom was from the Proto-West Germanic dōm and was cognate with the Old Frisian dōm, the Old Saxon dōm, the Old High German tuom, the Old Norse dómr and the Gothic dōms.  The Germanic source was from a stem verb originally meaning “to place, to set”, a sense-development also found in the Latin statutum and the Ancient Greek θέμις (thémis).  Dai had the alternative forms deg, deag & dœg, all from the Proto-West Germanic dag; it was cognate with the Old Frisian dei, the Old Saxon dag, the Old Dutch dag, the Old High German tag, the Old Norse dagr and the Gothic dags.

In medieval England, doomsday was expected when the world's age reached 6,000 years from the creation, thought to have been in 5200 BC; the English Benedictine monk the Venerable Bede (circa 672-735) complained of being pestered by rustici (the "uneducated and coarse-mannered, rough of speech") asking him "how many years till the sixth millennium be endeth?"  However, despite the assertions (circa 1999) of the Y2K doomsday preppers, there is no evidence to support the story of a general panic in Christian Europe in the days approaching the years 800 or 1000 AD.  The use to describe a hypothetical nuclear bomb powerful enough to wipe out human life (or all life) on earth dates from 1960, although the speculation was the work of people other than physicists, and the general trend since the 1960s has been towards smaller devices; paradoxically, this has been to maximize destructive potential through avoidance of the "surplus ballistic effect" (ie the realization by military planners that blasting rubble into smaller rocks was "wasted effort and bad economics").

The Domesday Book

Domesday is a proper noun used to describe the documents known collectively as the Domesday Book, the record of an enormous survey (a kind of early census) ordered by William I (circa 1028-1087; styled usually as William the Conqueror, King of England 1066-1087) in 1085.  The survey enumerated all the wealth in England and determined ownership in order to assess taxes.  Domesday was the Middle English spelling of doomsday and is pronounced “doomsday”.

Original Domesday book, UK National Archives, London.

The name Domesday Book (which was Doomsday in earlier spellings) was first recorded almost a century after 1086.  An addition to the manuscript was made probably circa 1114-1119 when it was known as the Book of Winchester and between then and 1179, it acquired the name by which it has since been known.  Just to clarify its status, the Treasurer of England himself announced “This book is called by the native English Domesday, that is Day of Judgement” (Dialogus de scaccario), adding that, like the Biblical Last Judgment, the decisions of Domesday Book were unalterable because “… as from the Last Judgment, there is no further appeal.”  This point was reinforced by a clause in the Dialogue of the Exchequer (1179) which noted “just as the sentence of that strict and terrible Last Judgement cannot be evaded by any art or subterfuge, so, when a dispute arises in this realm concerning facts which are written down, and an appeal is made to the book itself, the evidence it gives cannot be set at nought or evaded with impunity.”  It was from this point that there began in England the idea of the centralised written record taking precedence over local oral traditions, the same concept which would evolve into the common law.

The Domesday Book described in remarkable detail the landholdings and resources of late eleventh-century England and is illustrative of both the power of the government machine in medieval England and its deep thirst for information.  Nothing on the scale of the survey had been undertaken in contemporary Europe and it was not matched in comprehensiveness until the population censuses of the nineteenth century, although Domesday is not a full population census, the names it records being almost wholly restricted to landowners who could thus be taxed.  It was for centuries used for administrative and legal purposes and often remains the starting point for historians, but of late it has been subject to increasingly detailed textual analysis and it’s certainly not error-free.

The Doomsday Clock

The Doomsday Clock is a symbol that represents the likelihood of a man-made global catastrophe.  Maintained since 1947 by the members of the Bulletin of the Atomic Scientists (BOTAS), the clock was created as a metaphor for the threat to humanity posed by nuclear weapons.  On the clock, a hypothetical global catastrophe is represented as the stroke of midnight, and BOTAS’s view of how close that hour is to being reached is expressed as the number of minutes or seconds to midnight.  Every January, BOTAS’s Science and Security Board committee meets to decide where the hands of the clock should point and in recent years other risk factors have been considered, including disease and climate change, the committee monitoring developments in science and technology that could inflict catastrophic damage.

Lindsay Lohan and her lawyer in court, Los Angeles, December 2011.

These concerns do have a long history in philosophy and theology but the use in 1945 of nuclear fission to create atomic weapons focused many more minds on the possibilities, the concerns growing in the second half of the twentieth century as the bombs got bigger and proliferated extraordinarily, to the point where, had all been detonated in the right places at the right time, almost everyone on Earth would have been killed several times over.  At least on paper the threat was real and, even before Hiroshima made the world suddenly aware of the matter, some had been in an apocalyptic mood: Winston Churchill (1875-1965; UK prime-minister 1940-1945 & 1951-1955), in his “finest hour” speech of 1940, warned of the risk civilization might “…sink into the abyss of a new Dark Age made more sinister, and perhaps more protracted, by the lights of perverted science”.  It had been a growing theme in liberal interwar politics since the implications of technology and the industrialisation of warfare had been writ large by World War I (1914-1918).

HG Wells’ (1866–1946) last book was Mind at the End of its Tether (1945), a slim volume best remembered for the fragment “…everything was driving anyhow to anywhere at a steadily increasing velocity”, seemingly describing a world which had become more complicated, chaotic and terrifying than anything he had prophesied in his fiction.  In this it’s often contrasted with the spirit of cheerful optimism and forward-looking stoicism of the book he published a few months earlier, The Happy Turning (1945), but that may be a misreading.  Mind at the End of its Tether is a curious text, easy to read yet difficult to reduce to a theme; in his review, George Orwell (1903-1950) called it “disjointed” and it does have a quality of vagueness, some chapters hinting at despair for all humanity, others suggesting hope for the future.  It’s perhaps the publication date that colors the opinions of some.  Although released some three months after the first use of atomic bombs in August 1945, publishing has lead-times and Wells hadn’t heard of the A-bomb at the time of writing, although he had predicted such a device in The World Set Free (1914).  In writing Mind at the End of its Tether, Wells, the great seer of science, wasn’t in dark despair at news of science’s greatest achievement, nuclear fission, but was instead a dying man disappointed by the terrible twentieth century which, at the end of the nineteenth, had offered such promise.

In 1947, though the USSR had still not even tested an atomic bomb and the US enjoyed exclusive possession of the weapon, BOTAS was well aware it was only a matter of time and the clock was set at seven minutes to midnight.  Adjustments have been made a couple of dozen times since, the most optimistic days being in 1991 with the end of the Cold War when it was seventeen minutes to midnight and the most ominous right now, BOTAS in 2023 choosing 90 seconds, ten seconds worse than the 100 settled on in 2020.

The committee each year issues an explanatory note and in 2021 noted the influences on its decision.  The COVID-19 pandemic was a factor, not because it threatened to obliterate civilization but because it “…revealed just how unprepared and unwilling countries and the international system are to handle global emergencies properly. In this time of genuine crisis, governments too often abdicated responsibility, ignored scientific advice, did not cooperate or communicate effectively, and consequently failed to protect the health and welfare of their citizens.  As a result, many hundreds of thousands of human beings died needlessly.”  COVID-19, they noted, will eventually recede but the pandemic, as it unfolded, was a vivid illustration that national governments and international organizations are unprepared to manage nuclear weapons and climate change, which currently pose existential threats to humanity, or the other dangers—including more virulent pandemics and next-generation warfare—that could threaten civilization in the near future.  In 2023, the adjustment was attributed mostly to (1) the increased risk of the use of nuclear weapons after the Russian invasion of Ukraine, (2) climate change, (3) biological threats such as COVID-19 and (4) the spread of disinformation through disruptive technology such as generative AI (artificial intelligence).

The acceleration of nuclear weapons programs by many countries was thought to have increased instability, especially in conjunction with the simultaneous development of delivery systems increasingly adaptable to the use of conventional or nuclear warheads, the concern being that this may raise the probability of miscalculation in times of tension.  Governments were considered to have “…failed sufficiently to address climate change” and, while fossil fuel use needs to decline precipitously if the worst effects of climate change are to be avoided, instead “…fossil fuel development and production are projected to increase”.  Political factors were also mentioned, including the corrosive effects of “false and misleading information disseminated over the internet…, a wanton disregard for science and the large-scale embrace” of conspiracy theories often “driven by political figures”.  The committee did offer a glimmer of hope, notably the change of administration in the US to one with a more aggressive approach to climate change policy and a renewed commitment to nuclear arms control agreements, but it wasn’t enough to convince them to move the hands of the clock: in 2021 it remained at one hundred seconds to midnight.

The clock is not without critics, even the Wall Street Journal (WSJ), since falling under the control of Rupert Murdoch (b 1931), expressing disapproval.  There is the argument that after seventy years its usefulness has diminished because over those decades it has become "the boy who cried wolf": a depiction of humanity on the precipice of the abyss, yet life went on.  Questions have also been raised about the narrowness of the committee and whether a body which historically has had a narrow focus on atomic weapons and security is adequately qualified to assess the range of issues which should be considered.  Mission creep too is seen as a problem.  The clock began as a means of expressing the imminence of nuclear war; is it appropriate to use the same mechanism to warn of impending climate change, which has anyway already begun and is likely accelerating?  Global thermo-nuclear war could cause a catastrophic loss of life and societal disruption within hours, whereas the climate catastrophe is projected to unfold over decades and centuries.  Would a companion calendar be a more helpful metaphor?  The criticism may miss the point, the clock being a track not of climate change itself but of the political will to limit and ameliorate its effects (everyone having realised it can’t be stopped).