
Saturday, August 16, 2025

Solastalgia

Solastalgia (pronounced sol-uh-stal-juh)

The pain or distress caused by the loss or lack of solace and the sense of desolation connected to the present state of one’s home and territory.

2003: A coining by Professor Glenn Albrecht (b 1953), the construct built from the Latin sōlācium (solace, comfort) + -algia (pain).  Sōlācium was from sōlor (to comfort, console, solace) + -ac- (a variant of -āx- (used to form adjectives expressing a tendency or inclination to the action of the root verb)) + -ium, from the Latin -um (in this context used to indicate the setting where a given activity is carried out).  The -algia suffix was from the New Latin -algia, from the Ancient Greek -αλγία (-algía), from compounds ending in the Ancient Greek ἄλγος (álgos) (pain) + the Ancient Greek -ῐ́ᾱ (-ĭ́ā).  The best-known was probably kephalalgíā (headache).  Solastalgia is a noun, solastalgic is a noun and adjective and solastalgically is an adverb; the noun plural is solastalgias.

Elements of what became modern environmentalism can be found in writings from Antiquity and there are passages in Biblical Scripture which are quoted to support the notion Christ and God Himself were greenies.  However, as a political movement, it was very much a creation of the late twentieth century although Theodore Roosevelt (TR, 1858–1919; US president 1901-1909), despite his reputation as a big game hunter, made some notable contributions.  In what proved an active retirement, Roosevelt would often remark that more than the landmark anti-trust laws or his Nobel Peace Prize, the most enduring legacy of his presidency would be the federal legislation relating to the conservation and protection of the natural environment, both land and wildlife.  While he was in the White House, new national parks and forests were created, the total area an impressive 360,000 square miles (930,000 km2), a reasonable achievement given the pressure vested interests exerted upon the Congress to prevent anything which would impinge upon “development”.

Portrait of Theodore Roosevelt (1903) by John Singer Sargent (1856–1925).

Roosevelt though was not typical and in most places the profits from industrialization & development proved more compelling than abstractions about the environment; even when the effects of climate change became obvious, it was clear only a crisis would rapidly create the conditions for change.  Events such as London’s “Great Smog” of 1952 were so dramatic that changes were made (culminating in the Clean Air Act (1956)) and the state of the air quality in San Francisco & Los Angeles was by the late 1950s so obviously deteriorating that California enacted anti-pollution laws even before there was much federal legislation, the state remaining in the vanguard to this day.  Those political phenomena for a while encouraged the thought that even though decisive action to reduce carbon emissions was improbable while climate change (once referred to as “the greenhouse effect” and later “global warming”) seemed both remote and conceptual, once the “crisis events” began to affect those living in the rich countries of the global north (ie “the white folks”), the term would morph into “climate crisis” and resource allocation would shift to address the problem.  That theory remains sound but what was under-estimated was the threshold point for the word “crisis”.  Despite the increasing frequency and severity of wildfires, soaring temperatures, polar vortexes and floods, thus far the political system is still being adjusted on the basis of gradual change: the imperative remains managing rather than rectifying the problem.  Once, television-friendly events such as (1) melting glaciers creating landslides destroying entire villages which have for centuries sat in the Swiss Alps, (2) suburbs of mansions in the hills of Los Angeles being razed to the ground by wildfires, (3) previously unprecedented floods in Europe and Asia killing hundreds and (4) heat waves routinely becoming a feature of once temperate regions would have been thought “crisis triggers” but the political system has thus far absorbed them.

Silent Spring (First edition, 1962) by Rachel Carson.

The origins of the environmental movement in its modern form are often traced to the publication in 1962 of Silent Spring by marine biologist Rachel Carson (1907–1964) although it took years for the controversy that book generated to coalesce into an embryonic “green” movement.  Silent Spring was a best-seller which (in an accessible form) introduced to the general public notions of the threat chemical pollution posed to ecology, the power of her argument being to identify the issue not as something restricted to a narrow section of agricultural concerns but as part of a systemic threat to the balance of nature and the very survival of human civilization.  There were many other influences (demographic, cultural, economic, educational etc) at this time and by the late 1960s, it was apparent concerns about pollution, over-population, pesticide use and such had created an identifiable shared language and public visibility although it was something too fragmented to be called a movement, the goals and advocated courses of action remaining disparate.  Structurally however, organizations were being formed and a convenient turning point suggesting critical mass had been achieved came in the US in April, 1970 when some 20 million participants received wide coverage in the media for Earth Day, a warning to the politicians that “the environment” might affect voting patterns.  It was in this era that the framework of US environmental legislation was built, including the Clean Air Act (1970), Clean Water Act (1972) and Endangered Species Act (1973), all passed during the administration of Richard Nixon (1913-1994; US president 1969-1974) and under Nixon, in 1970, the EPA (Environmental Protection Agency) was created, an institution of which Theodore Roosevelt would have approved.

Earth Emotions: New Words for a New World (2019) by Professor Glenn Albrecht.

When working as an academic, Glenn Albrecht was granted conventional academic titles (such as Professor of Sustainability) but his work puts him in the category of “ecophilosopher”, a concept which would have been understood by the natural scientists of Antiquity; it’s now an increasingly populated field with a niche in popular publishing.  The eco- prefix was from the French éco-, from the Latin oeco-, from the Ancient Greek οἶκος (oîkos) (house, household) and was for generations familiar in “economy” and its derivatives but is now most associated with ecology or the environment (in the ecological sense).  For better or worse, it has come to be applied to novel constructs including ecotourism (forms of “sustainable” tourism claimed to cause less environmental damage), ecofascism (literally “fascist politics with support for ecological concerns” but usually used (as a derogatory) to refer to uncompromising, aggressive or violent environmental activism, the most extreme form of which is ecoterrorism (a label used rather loosely, even of vegans who stage protests outside restaurants serving the products of the slaughter industry)) and ecofeminism (a socio-political movement combining feminism and environmentalism).

The ecophilosophers have produced many publications but Professor Albrecht has been unusual in that he has been prolific also in the coining of words, especially those which relate to or are consequent upon what he calls the “sumbiocentric” (“taking into account the centrality of the process of symbiosis in all of our deliberations on human affairs”).  Such creations in emerging or expanding fields of study are of course not unusual.  In environmentalism, new terms and words have in recent decades appeared but there’s been an element of technological determinism to some.  Although the notion humanity lives on a “ship travelling through space” had been in use since at least the mid-nineteenth century, the metaphor had been nautical and it wasn’t until “spaceships” started to be launched in the 1960s that the term was updated to the now familiar “spaceship earth”.  Neologisms, even if used in context, can be baffling but helpfully, Professor Albrecht published also a “glossary of psychoterratic terms” with pocket definitions explaining his lexicon of the “Earth’s emotions”.

Endemophilia: A “love of place”, specifically the “particular love of the locally and regionally distinctive in the people of a place”.  The mechanism for this is: “Once a person realizes that the landscape they have before them is not replicated in even a general way elsewhere in the country or on their continent or even in the world, there is ample room for a positive Earth emotion based on rarity and uniqueness.”  This is classified as a spectrum condition in that the more “a uniqueness is understood… the more it can be appreciated”.  Professor Albrecht was speaking of geology, flora & fauna but figuratively the concept can be applied to the built environment in urban areas and it doesn’t demand an interest in architecture to take pleasure from the form of (some) buildings.

Eutierria: A “feeling of total harmony with our place, and the naïve loss of ego (merging subject and ego) we often felt as children”.  Professor Albrecht cites the author Richard Louv (b 1949) who used the phrase “nature deficit disorder” in suggesting a word was needed to describe the state of harmony one could achieve if “connected to the Earth”.  Eutierria is a “positive feeling of oneness with the Earth and its life forces, where the boundaries between self and the rest of nature are obliterated, and a deep sense of peace and contentedness pervades consciousness”.

The HUCE (Harvard University Center for the Environment) in 2017 noted the phenomenon of mermosity, recording that some six months earlier New York Magazine had “published its most-read article ever, surpassing a photo spread of Lindsay Lohan.”  The topic the HUCE summarized as “Doom”, the apocalyptic visions of a world ravaged by climate change, the young especially afflicted by a crushing sense of dread.

Mermosity: “An anticipatory state of being worried about the possible passing of the familiar, and its replacement by that which does not sit comfortably in one’s sense of place.”  This is a word now with great currency because researchers have noted one aspect of the prominence in the media of (1) human-induced climate change and (2) the apparent inevitability of its adverse consequences has resulted in a pervading sense of doom among some, especially the young.  According to some psychologists, their young patients are exhibiting “mourning-like” behaviour, thinking the planet already in the throes of destruction and that they exist merely as mourners at its protracted funeral.

Meteoranxiety: The “anxiety felt in the face of the threat of the frequency and severity of extreme weather events”.  This is an example of a feedback loop in that weather events (rain, storms, heatwaves etc) now tend to be attributed by many exclusively to human-induced climate change, thus exacerbating one’s mermosity.  In the literature of psychology, behavioral economics, neuroscience, philosophy, sociology & political science there are explanations (often replete with house jargon) explaining how “perception bias” & “cognitive bias” operate and interact but such things rarely are discussed on the TikTok news feeds which these days are so influential in shaping world views.

Solastalgia: “The pain or distress caused by the loss or lack of solace and the sense of desolation connected to the present state of one’s home and territory”.  This is the “lived experience of negative environmental change” and reflects the sense of loss of what once was (or one’s imagined construct of what once was), a phenomenon Professor Albrecht describes as “the homesickness you have when you are still at home”.  Although coined to be used in the context of climate change, it can be applied more widely and the feeling will be familiar to those who notice the lack of familiar landmarks in cities as urban redevelopment changes the architecture.  In those cases, the distress can be made more troubling still because even a building one may for years frequently have seen rapidly can fade from memory to the point where it can be hard to remember its appearance, even if it stood for decades.

Google ngram: Because of the way Google harvests data for their ngrams, they’re not literally a tracking of the use of a word in society but can be usefully indicative of certain trends (although one is never quite sure which trend(s)), especially over decades.  As a record of actual aggregate use, ngrams are not wholly reliable because: (1) the sub-set of texts Google uses is slanted towards the scientific & academic and (2) the technical limitations imposed by the use of OCR (optical character recognition) when handling older texts of sometimes dubious legibility (a process AI should improve).  Where numbers bounce around, this may reflect either: (1) peaks and troughs in use for some reason or (2) some quirk in the data harvested.  Being recent, the ngram for solastalgia should be an untypically accurate indication of trends in use but it’s a quantitative and not qualitative measure: although a word very much of the climate change era, it has been used in other contexts and, as a neologism, it appears also in many dictionaries and other on-line lists.
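For the curious, such trends can be pulled programmatically; what follows is a minimal sketch assuming the unofficial JSON endpoint behind the ngram viewer's web page (Google publishes no official API, so the URL and parameter names are an assumption and may change without notice):

```python
# Sketch: fetch the yearly ngram frequencies for "solastalgia" from the
# unofficial endpoint used by the ngram viewer (not a supported API).
import requests

resp = requests.get(
    "https://books.google.com/ngrams/json",
    params={
        "content": "solastalgia",
        "year_start": 2000,
        "year_end": 2019,
        "corpus": "en-2019",   # assumed corpus label
        "smoothing": 0,        # raw yearly values, no moving average
    },
    timeout=30,
)
for series in resp.json():
    # "timeseries" holds one value per year in the requested range;
    # each value is the word's share of all 1-grams printed that year.
    for year, freq in zip(range(2000, 2020), series["timeseries"]):
        print(year, freq)
```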

Sumbiocentric: “Taking into account the centrality of the process of symbiosis in all of our deliberations on human affairs”.  The special place environmentalism has assumed in the public consciousness means the sumbiocentric is positioned as something beyond just another construction of ethics and should be thought a kind of secular, moral theology.  Ominously, one apparent implication in this would appear to be the desirability (according to some the necessity) for some sort of internationally “co-ordinated” government, a concept with a wide vista and in various forms at times advocated by figures as diverse as the polemicist playwright George Bernard Shaw (GBS; 1856-1950) and Edward Teller (1908–2003), the so-called “father of the hydrogen bomb”.

Sumbiophilia: “The love of living together”.  This would apparently be the state of things in the Symbiocene, a speculative era which would succeed the Anthropocene and be characterized by a harmonious and cooperative coexistence between humans and the rest of nature which presumably would be something of a new Jerusalem although shepherds, child care workers and others would be advised not to take literally the Biblical Scripture: “The wolf also shall dwell with the lamb, and the leopard shall lie down with the kid; and the calf and the young lion and the fatling together; and a little child shall lead them.” (Isaiah 11:6, King James Version (KJV, 1611)).  However, other than sensible precautions when around carnivorous predators, all would exist in a symbiosis (living together for mutual benefit) without the destructive practices of the Anthropocene.  In the world of Green Party wine & cheese evenings, sumbiophilia probably seems the most natural thing in the world although the party leadership would be sufficiently realistic to understand not all would agree so, when it was made compulsory, “re-education camps” would be needed to “persuade” the recalcitrant.  As used by Professor Albrecht, sumbiophilia is an ideal but one obviously counter-historical because the development of the nation state (which took millennia and was (more or less) perfected in the nationalisms which have been the dominant political paradigm since the nineteenth century) suggests what people love is not us all “living together” but groups of us “keeping the others out”.  Not for nothing are idealists thought the most dangerous creatures on Earth.

Terrafuric: “The extreme anger unleashed within those who can clearly see the self-destructive tendencies in the current forms of industrial-technological society and feel they must protest and act to change its direction”.  This is another spectrum condition, ranging from writing truculent letters to the New York Times, to members of Extinction Rebellion super-gluing themselves to the road, to assassinating the “guilty parties”, à la Luigi Mangione (b 1998).

Terranascia (“Earth creating forces”) and terraphthora (“Earth destroying forces”) are companion terms which could be used by geologists, cosmologists and others but the significance in this context is that humans are now (and have long been) among the most ecologically destructive forces known.

Hannah Arendt and Martin Heidegger (2017) by Antonia Grunenberg (b 1944).  Hannah Arendt's (1906-1975) relationship with Martin Heidegger (1889–1976) began when she was a 19-year-old student of philosophy and he her professor, married and aged 36.  Both, for different reasons, would more than once have experienced solastalgia.

Solastalgia began life in the milieu of the climate change wars but poets and others beyond the battleground have been drawn to the word, re-purposing it in abstract or figurative ways, comparing the process of literal environmental degradation with losses elsewhere.  The adaptations have included (1) Social & cultural change (loss of familiar traditions or communities), (2) Linguistic erosion (mourning the disappearance of words, dialects or the quirks in language with which one grew up, replaced often by new (and baffling) forms of slang), (3) One’s personal emotional framework (the loss of friends, partner or family members), (4) Aging (the realization of mounting decrepitude), (5) Digital displacement (a more recent phenomenon which covers a range including an inability to master new technology, grief when once enjoyed digital spaces become toxic, commercialized or abandoned and having to “upgrade” from familiar, functional software to newer versions which offer no advantages), (6) Artistic loss (one’s favourite forms of music, art or literature become unfashionable and neglected) and (7) Existential disconnection (not a new idea but now one an increasing number claim to suffer; a kind of philosophical estrangement in which one feels “the world” (in the sense the German philosopher Martin Heidegger (1889–1976) used the word) has become strange and unfamiliar).

Monday, June 30, 2025

Bunker

Bunker (pronounced buhng-ker)

(1) A large bin or receptacle; a fixed chest or box.

(2) In military use, historically a fortification set mostly below the surface of the ground with overhead protection provided by logs and earth or by concrete and fitted with above-ground embrasures through which guns may be fired.

(3) A fortification set mostly below the surface of the ground and used for a variety of purposes.

(4) In golf, an obstacle, classically a sand trap but sometimes a mound of dirt, constituting a hazard.

(5) In nautical use, to provide fuel for a vessel.

(6) In nautical use, to convey bulk cargo (except grain) from a vessel to an adjacent storehouse.

(7) In golf, to hit a ball into a bunker.

(8) To equip with or as if with bunkers.

(9) In military use, to place personnel or materiel in a bunker or bunkers (sometimes as “bunker down”).

1755–1760: From the Scottish bonkar (box, chest (also “seat” in the sense of “bench”)), of obscure origin but etymologists conclude the use related to furniture hints at a relationship with banker (bench).  Alternatively, it may be from a Scandinavian source such as the Old Swedish bunke (boards used to protect the cargo of a ship).  The meaning “receptacle for coal aboard a ship” was in use by at least 1839 (coal-burning steamships coming into general use in the 1820s).  The use to describe the obstacles on golf courses is documented from 1824 (probably from the extended sense “earthen seat” which dates from 1805) but perhaps surprisingly, the familiar sense from military use (dug-out fortification) seems not to have appeared before World War I (1914-1918) although the structures so described had for millennia existed.  “Bunkermate” was army slang for the individual with whom one shares a bunker while the now obsolete “bunkerman” (“bunkermen” the plural) referred to someone (often the man in charge) who worked at an industrial coal storage bunker.  Bunker & bunkerage are nouns, bunkering is a noun & verb, bunkered is a verb and bunkerish, bunkeresque, bunkerless & bunkerlike are adjectives; the noun plural is bunkers.

Just as ships called “coalers” were used to transport coal to and from shore-based “coal stations”, it was “oilers” which took oil to storage tanks or out to sea to refuel ships (a common naval procedure) and these STS (ship-to-ship) transfers were called “bunkering” as the black stuff was pumped, bunker-to-bunker.  That the coal used by steamships was stored on-board in compartments called “coal bunkers” led ultimately to another derived term: “bunker oil”.  When in the late nineteenth century ships began the transition from being fuelled by coal to burning oil, the receptacles of course became “oil bunkers” (among sailors nearly always clipped to “bunker”) and as refining processes evolved, the fuel specifically produced for oceangoing ships came to be called “bunker oil”.

Bunker oil is “dirty stuff”, a highly viscous, heavy fuel oil which is essentially the residue of crude oil refining; it’s that which remains after the more refined and volatile products (gasoline (petrol), kerosene, diesel etc) have been extracted.  Until late in the twentieth century, the orthodox view of economists was its use in big ships was a good thing because it was a product for which industry had little other use and, as essentially a by-product, it was relatively cheap.  It came in three flavours: (1) Bunker A: Light fuel oil (similar to a heavy diesel), (2) Bunker B: An oil of intermediate viscosity used in engines larger than marine diesels but smaller than those used in the big ships and (3) Bunker C: Heavy fuel oil used in container ships and such which use VLD (very large displacement), slow running engines with a huge reciprocating mass.  Because of its composition, Bunker C especially produced much pollution and although much of this happened at sea (unseen by most but with obvious implications), when ships reached harbor to dock, all the smoke and soot became obvious.  Over the years, the worst of the pollution from the burning of bunker oil greatly has been reduced (the work underway even before the Greta Thunberg (b 2003) era), sometimes by the simple expedient of spraying a mist of water through the smoke.

Floor-plans of the upper (Vorbunker) and lower (Führerbunker) levels of the structure now commonly referred to collectively as the Führerbunker.

History’s most infamous bunker remains the Berlin Führerbunker in which Adolf Hitler (1889-1945; Führer (leader) and German head of government 1933-1945 & head of state 1934-1945) spent much of the last few months of his life.  In the architectural sense there were a number of Führerbunkers built, one at each of the semi-permanent Führerhauptquartiere (Führer Headquarters) created for the German military campaigns and several others built where required but it’s the one in Berlin which is remembered as “the Führerbunker”.  Before 1944, when the air raids by the RAF (Royal Air Force) and USAAF (US Army Air Force) intensified, the term Führerbunker seems rarely to have been used other than by the architects and others involved in their construction and it wasn’t a designation like Führerhauptquartiere which the military and other institutions of state shifted between locations (rather as “Air Force One” is attached not to a specific airframe but to whatever aircraft in which the US president is travelling).  In subsequent historical writing, the term Führerbunker tends often to be applied to the whole, two-level complex in Berlin and although it was only the lower layer which officially was so designated, for most purposes the distinction is not significant.  In military documents, after January, 1945 the Führerbunker was referred to as Führerhauptquartiere.

Führerbunker tourist information board, Berlin, Germany.

Only an information board at the intersection of In den Ministergärten and Gertrud-Kolmar-Straße, erected by the German Government in 2006 prior to that year's FIFA (Fédération Internationale de Football Association (International Federation of Association Football)) World Cup, now marks the place on Berlin's Wilhelmstrasse 77 where once the Führerbunker was located.  The Soviet occupation forces razed the new Reich Chancellery and demolished all the bunker's above-ground structures but the subsequent GDR (Deutsche Demokratische Republik (German Democratic Republic; the old East Germany) 1949-1990) abandoned attempts completely to destroy what lay beneath.  Until after the fall of the Berlin Wall (1961-1989) the site remained unused and neglected, “re-discovered” only during excavations by property developers, the government insisting on the destruction of whatever was uncovered and, sensitive still to the spectre of “Neo-Nazi shrines”, for years the bunker’s location was never divulged, even as unremarkable buildings (an unfortunate aspect of post-unification Berlin) began to appear on the site.  Most of what would have covered the Führerbunker’s footprint is now a supermarket car park.

The first part of the complex to be built was the Vorbunker (upper bunker or forward bunker), an underground facility of reinforced concrete intended only as a temporary air-raid shelter for Hitler and his entourage in the old Reich Chancellery.  Substantially completed during 1936-1937, it was until 1943 listed in documents as the Luftschutzbunker der Reichskanzlei (Reich Chancellery Air-Raid Shelter), the Vorbunker label applied only in 1944 when the lower level (the Führerbunker proper) was appended.  In mid January, 1945, Hitler moved into the Führerbunker and, as the military situation deteriorated, his appearances above ground became less frequent until by late March he rarely saw the sky.  Finally, on 30 April, he committed suicide.

Bunker Busters

Northrop Grumman publicity shot of a B-2 Spirit from below, showing the twin bomb-bay doors through which the GBU-57s are released.

Awful as they are, there's an undeniable beauty in the engineering of some weapons and it's unfortunate humankind never collectively has resolved exclusively to devote such ingenuity to stuff other than us blowing up each other.  That’s not a new sentiment, being one philosophers and others have for millennia expressed in various ways although since the advent of nuclear weapons, concerns understandably have become heightened.  Like every form of military technology ever deployed, once the “genie is out of the bottle” the problem is there to be managed and at the dawn of the atomic age, delivering a lecture in 1936, the British chemist and physicist Francis Aston (1877–1945) (who created the mass spectrograph, winning the 1922 Nobel Prize in Chemistry for his use of it to discover and identify the isotopes in many non-radioactive elements and for his enunciation of the whole number rule) observed:

There are those about us who say that such research should be stopped by law, alleging that man's destructive powers are already large enough.  So, no doubt, the more elderly and ape-like of our ancestors objected to the innovation of cooked food and pointed out the great dangers attending the use of the newly discovered agency, fire.  Personally, I think there is no doubt that sub-atomic energy is available all around us and that one day man will release and control its almost infinite power.  We cannot prevent him from doing so and can only hope that he will not use it exclusively in blowing up his next door neighbor.

The use in June 2025 by the USAF (US Air Force) of fourteen of its Boeing GBU-57 (Guided Bomb Unit-57) Massive Ordnance Penetrator (MOP) bombs against underground targets in Iran (twelve on the Fordow Uranium Enrichment Plant and two on the Natanz nuclear facility) meant “Bunker Buster” hit the headlines.  Carried by the Northrop B-2 Spirit heavy bomber (built between 1989-2000), the GBU-57 is a 14,000 kg (30,000 lb) bomb with a casing designed to withstand the stress of penetrating through layers of reinforced concrete or thick rock.  “Bunker buster” bombs have been around for a while, the ancestors of today’s devices first built for the German military early in World War II (1939-1945) and the principle remains unchanged to this day: up-scaled armor-piercing shells.  The initial purpose was to produce a weapon with a casing strong enough to withstand the forces imposed when impacting reinforced concrete structures, the idea simple in that what was needed was a delivery system which could “bust through” whatever protective layers surrounded a target, allowing the explosive charge to do damage where needed rather than wastefully being expended on an outer skin.  The German weapons proved effective but inevitably triggered an “arms race” in that as the war progressed, the concrete layers became thicker, walls over 2 metres (6.6 feet) thick and ceilings of 5 metres (16 feet) being constructed by 1943.  Technological development continued and the idea extended to rocket-propelled bombs optimized both for armor-piercing and aerodynamic efficiency, velocity a significant “mass multiplier” which made the weapons still more effective.

USAF test-flight footage of a Northrop B-2 Spirit dropping two GBU-57 "Bunker Buster" bombs.

Concurrent with this, the British developed the first true “bunker busters”, building on the idea of the naval torpedo, one aspect of which was that, by exploding a short distance from its target, it could be highly damaging because it took advantage of one of the properties of water (quite strange stuff according to those who study it): it doesn’t compress.  What that meant was it was often the “shock wave” of the water rather than the blast itself which could breach a hull, the same principle used for the famous “bouncing bombs” of the RAF’s “Dambuster” (Operation Chastise, 17 May 1943) raids on German dams.  Because of the way water behaved, it wasn’t necessary to score the “direct hit” which had been the ideal in the early days of aerial warfare.

RAF Bomber Command archive photograph of Avro Lancaster (built between 1941-1946) in flight with Grand Slam mounted (left) and a comparison of the Tallboy & Grand Slam (right), illustrating how the latter was in most respects a scaled-up version of the former.  To carry the big Grand Slams, 32 “B1 Special” Lancasters were in 1945 built with up-rated Rolls-Royce Merlin V12 engines, the removal of the bomb doors (the Grand Slam carried externally, its dimensions exceeding internal capacity), deleted front and mid-upper gun turrets, no radar equipment and a strengthened undercarriage.  Such was the concern with weight (especially for take-off) that just about anything non-essential was removed from the B1 Specials, even three of the four fire axes and its crew door ladder.  In the US, Boeing went through a similar exercise to produce the run of “Silverplate” B-29 Superfortresses able to carry the first A-bombs used in August, 1945. 

Best known of the British devices were the so-called “earthquake bombs”, the Tallboy (12,000 lb; 5.4 ton) & Grand Slam (22,000 lb, 10 ton) which, despite the impressive bulk, were classified by the War Office as “medium capacity”.  The terms “Medium Capacity” (MC) & “High Capacity” (HC) referenced not the gross weight or physical dimensions but the ratio of explosive filler to the total weight of the construction (ie how much was explosive compared to the casing and ancillary components).  Because both had thick casings to ensure penetration deep into hardened targets (bunkers and other structures encased in rock or reinforced concrete) before exploding, the internal dimensions accordingly were reduced compared with the ratio typical of contemporary ordnance.  A High Capacity (HC) bomb (a typical “general-purpose” bomb) had a thinner casing and a much higher proportion of explosive (sometimes over 70% of total weight).  These were intended for area bombing (known also as “carpet bombing”) and caused wide blast damage whereas the Tallboy & Grand Slam were penetrative with casings optimized for aerodynamic efficiency, their supersonic travel working as a mass-multiplier.  The Tallboy’s 5,200 lb (2.3 ton) explosive load was some 43% of its gross weight while the Grand Slam’s 9,100 lb (4 ton) absorbed 41%; this may be compared with the “big” 4,000 lb (1.8 ton) HC “Blockbuster” which allocated 75% of the gross weight to its 3,000 lb (1.4 ton) charge.  Like many things in engineering (not just in military matters) the ratio represented a trade-off, the MC design prioritizing penetrative power and structural destruction over blast radius.  The novelty of the Tallboy & Grand Slam was that as earthquake bombs, their destructive potential was able to be unleashed not necessarily by achieving a direct hit on a target but by entering the ground nearby, the explosion (1) creating an underground cavity (a camouflet) and (2) transmitting a shock-wave through the target’s foundations, leading to the structure collapsing into the newly created lacuna.
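Expressed as arithmetic, the MC/HC classification is just the charge-to-weight ratio; a minimal sketch in Python, using only the figures quoted above:

```python
# Charge-to-weight ratios for the bombs discussed above: MC designs
# sacrifice explosive filler for casing strength; HC designs do the reverse.
bombs = {
    "Tallboy (MC)": (5_200, 12_000),     # explosive lb, gross lb
    "Grand Slam (MC)": (9_100, 22_000),
    "Blockbuster (HC)": (3_000, 4_000),
}

for name, (charge, gross) in bombs.items():
    print(f"{name}: {charge / gross:.0%} of gross weight is explosive")
# Tallboy (MC): 43% ... Grand Slam (MC): 41% ... Blockbuster (HC): 75%
```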

The etymology of camouflet has an interesting history in both French and military mining.  Originally it meant “a whiff of smoke in the face (from a fire or pipe)” and in figurative use it was a reference to a snub or slight insult (something unpleasant delivered directly to someone); although the origin is murky, it may have been related to the earlier French verb camoufler (to disguise; to mask) which evolved also into “camouflage”.  In the specialized military jargon of siege warfare or mining (sapping), over the seventeenth to nineteenth centuries “camouflet” referred to “an underground explosion that does not break the surface, but collapses enemy tunnels or fortifications by creating a subterranean void or shockwave”.  The use of this tactic is best remembered from the Western Front in World War I, some of the huge craters now tourist attractions.

Under watchful eyes: Grand Ayatollah Ali Khamenei (b 1939; Supreme Leader, Islamic Republic of Iran since 1989) delivering a speech, sitting in front of the official portrait of the republic’s ever-unsmiling founder, Grand Ayatollah Ruhollah Khomeini (1900-1989; Supreme Leader, Islamic Republic of Iran, 1979-1989).  Ayatollah Khamenei seemed in 1989 an improbable choice as Supreme Leader because others were better credentialed but though cautious and uncharismatic, he has proved a great survivor in a troubled region.

Since aerial bombing began to be used as a strategic weapon, of great interest has been the debate over BDA (battle damage assessment) and this issue emerged almost as soon as the bunker buster attack on Iran was announced, focused on the extent to which the MOPs had damaged the targets, the deepest of which were concealed deep inside a mountain.  BDA is a constantly evolving science and while satellites have made analysis of surface damage highly refined, it’s more difficult to understand what has happened deep underground.  Indeed, it wasn’t until the USSBS (United States Strategic Bombing Survey) teams toured Germany and Japan in 1945-1946, conducting interviews, economic analysis and site surveys that a useful (and substantially accurate) understanding emerged of the effectiveness of bombing.  What technological advances have since allowed for those with the resources is that so-called “panacea targets” (ie critical infrastructure and such once dismissed by planners because the required precision was for many reasons rarely attainable) can now accurately be targeted, the USAF able to drop a bomb within a few feet of the aiming point.  As the phrase is used by the military, the Fordow Uranium Enrichment Plant is a classic “panacea target” but whether even a technically successful strike will achieve the desired political outcome remains to be seen.

Mr Trump, in a moment of exasperation, posted on Truth Social of Iran & Israel: “We basically have two countries that have been fighting so long and so hard that they don't know what the fuck they're doing."  Actually, both know exactly WTF they're doing; it's just Mr Trump (and many others) would prefer they didn't do it.

Donald Trump (b 1946; US president 2017-2021 and since 2025) claimed “total obliteration” of the targets while Grand Ayatollah Khamenei admitted only there had been “some damage” and which is closer to the truth may one day be revealed.  Even modelling of the effects has probably been inconclusive because the deeper one goes underground, the greater the number of variables in the natural structure and the nature of the internal built environment will also influence blast behaviour.  All experts seem to agree much damage will have been done but what can’t yet be determined is what has been suffered by the facilities which sit as deep as 80 m (260 feet) inside the mountain although, as the name implies, “bunker busters” are designed for buried targets and it’s not always required for the blast directly to reach the target.  Because the shock-wave can travel through earth & rock, the effect is something like that of an earthquake and if the structure sufficiently is affected, it may be the area can be rendered geologically too unstable again to be used for its original purpose.

Within minutes of the bombing having been announced, legal academics were being interviewed (though not by Fox News) to explain why the attacks were unlawful under international law and in a sign of the times, the White House didn't bother to discuss fine legal points like the distinction between "preventive & pre-emptive strikes", preferring (like Fox News) to focus on the damage done.  However, whatever the murkiness surrounding the BDA, many analysts have concluded that even if before the attacks the Iranian authorities had not approved the creation of a nuclear weapon, this attack will have persuaded them one is essential for “regime survival”, thus the interest in both Tel Aviv and (despite denials) Washington DC in “regime change”.  The consensus seems to be Grand Ayatollah Khamenei had, prior to the strike, not ordered the creation of a nuclear weapon but that all energies were directed towards completing the preliminary steps, thus the enriching of uranium to ten times the level required for use in power generation; the ayatollah liked to keep his options open.  So, the fear of some is the attacks, even if they have (by weeks, months or years) delayed the Islamic Republic’s work on nuclear development, may prove counter-productive in that they convince the ayatollah to concur with the reasoning of every state which since 1945 has adopted an independent nuclear deterrent (IND).  That reasoning was not complex and hasn’t changed since first a prehistoric man picked up a stout stick to wave as a pre-lingual message to potential adversaries, warning them there would be consequences for aggression.  Although a theocracy, those who command power in the Islamic Republic are part of an opaque political institution and in the struggle which has for some time been conducted in anticipation of the death of the aged (and reportedly ailing) Supreme Leader, the matter of “an Iranian IND” is one of the central dynamics.  Many will be following what unfolds in Tehran and the observers will not be only in Tel Aviv and Washington DC because in the region and beyond, few things focus the mind like the thought of ayatollahs with A-Bombs.

Of the word "bust"

The Great Bust: The Depression of the Thirties (1962) by Jack Lang (left), highly qualified content provider Busty Buffy (b 1996, who has never been accused of misleading advertising, centre) and The people's champion, Mr Lang, bust of Jack Lang, painted cast plaster by an unknown artist, circa 1927, National Portrait Gallery, Canberra, Australia (right).  Remembered for a few things, Jack Lang (1876–1975; premier of the Australian state of New South Wales (NSW) 1925-1927 & 1930-1932) remains best known for having in 1932 been the first head of government in the British Empire to have been sacked by the Crown since William IV (1765–1837; King of the UK 1830-1837) in 1834 dismissed Lord Melbourne (1779–1848; prime minister of the UK 1834 & 1835-1841).

Those learning English must think it at least careless that things can both be (1) “razed to the ground” (totally to destroy something (typically a structure), usually by demolition or incineration) and (2) “raised to the sky” (physically lifted upwards).  The etymologies of “raze” and “raise” differ but they’re pronounced the same so it’s fortunate the spellings vary but in other troublesome examples of unrelated meanings, spelling and pronunciation can align, as in “bust”.  When used in ways most directly related to human anatomy: (1) “a sculptural portrayal of a person's head and shoulders” & (2) “the circumference of a woman's chest around her breasts” there is an etymological link but these uses wholly are unconnected with bust’s other senses.

Bust of Lindsay Lohan in white marble by Stable Diffusion.  Sculptures of just the neck and head came also to be called “busts”, the emphasis on the technique rather than the original definition.

Bust in the sense of “a sculpture of upper torso and head” dates from the 1690s and was from the sixteenth century French buste, from the Italian busto (upper body; torso), from the Latin bustum (funeral monument, tomb (although the original sense was “funeral pyre, place where corpses are burned”)) and it may have emerged (as a shortened form) from ambustum, neuter of ambustus (burned around), past participle of amburere (burn around, scorch), the construct being ambi- (around) + urere (to burn).  The alternative etymology traces a link to the Old Latin boro, the early form of the Classical Latin uro (to burn) and it’s thought the development in Italian was influenced by the Etruscan custom of keeping the ashes of the dead in an urn shaped like the person when alive.  Thus the use, common by the 1720s, of bust (a clipping from the French buste) being “a carving of the trunk of the human body from the chest up”.  From this came the meaning “dimension of the bosom; the measurement around a woman's body at the level of her breasts” and that evolved on the basis of a comparison with the sculptures, the base of which was described as the “bust-line”, the term still used in dress-making (and for other comparative purposes as one of the three “vital statistics” by which women are judged (bust, waist, hips), each circumference having an “ideal range”).  It’s not known when “bust” and “bust-line” came into oral use among dress-makers and related professions but it’s documented since the 1880s.  Derived forms (sometimes hyphenated) include busty (tending to bustiness, thus Busty Buffy's choice of stage-name), overbust & underbust (technical terms in women's fashion referencing specific measurements) and bustier (a tight-fitting women's top which covers (most or all of) the bust).

Benito Mussolini (1883-1945; Duce (leader) & Prime-Minister of Italy 1922-1943) standing beside his “portrait bust” (1926).

The bust was carved by Swiss sculptor Ernest Durig (1894–1962) who gained posthumous notoriety when his career as a forger was revealed with the publication of his drawings which he’d represented as being from the hand of the French sculptor Auguste Rodin (1840-1917) under whom he claimed to have studied.  Mussolini appears here in one of the subsequently much caricatured poses which were a part of his personality cult.  More than one of the Duce's counterparts in other nations was known to have made fun of some of the more outré poses and affectations, the outstretched chin, right hand braced against the hip and straddle-legged stance among the popular motifs. 

“Portrait bust” in marble (circa 1895) of Otto von Bismarck (1815-1898; chancellor of the German Empire (the "Second Reich") 1871-1890) by the German sculptor Reinhold Begas (1831-1911).

In sculpture, what had been known as the “portrait statue” came after the 1690s to be known as the “portrait bust” although both terms meant “sculpture of upper torso and head” and these proved a popular choice for military figures because the aspect enabled the inclusion of bling such as epaulettes, medals and other decorations and, being depictions of the human figure, busts came to be vested with special significance by the superstitious.  In early 1939, during construction of the new Reich Chancellery in Berlin, workmen dropped one of the busts of Otto von Bismarck by Reinhold Begas, breaking it at the neck.  For decades, the bust had sat in the old Chancellery and the building’s project manager, Albert Speer (1905–1981; Nazi court architect 1934-1942; Nazi minister of armaments and war production 1942-1945), knowing Adolf Hitler (1889-1945; Führer (leader) and German head of government 1933-1945 & head of state 1934-1945) believed the Reich Eagle toppling from the post-office building right at the beginning of World War I had been a harbinger of doom for the nation, kept the accident secret, hurriedly issuing a commission to the German sculptor Arno Breker (1900–1991) who carved an exact copy.  To give the fake the necessary patina, it was soaked for a time in strong, black tea, the porous quality of marble enabling the fluid to induce some accelerated aging.  Interestingly, in his (sometimes reliable) memoir (Erinnerungen (Memories or Reminiscences), published in English as Inside the Third Reich (1969)), even the technocratic Speer admitted of the accident: “I felt this as an evil omen”.

The other senses of bust (as a noun, verb & adjective) are diverse (and sometimes diametric opposites) and include: “to break or fail”; “to be caught doing something unlawful / illicit / disgusting etc”; “to debunk”; “dramatically or unexpectedly to succeed”; “to go broke”; “to break in (horses, girlfriends etc)”; “to assault”; the downward portion of an economic cycle (ie “boom & bust”); “the act of effecting an arrest” and “someone (especially in professional sport) who failed to perform to expectation”.  That’s quite a range and that has meant the creation of dozens of idiomatic forms, the best known of which include: “boom & bust”, “busted flush”, “dambuster”, “bunker buster”, “busted arse country”, “drug bust”, “cloud bust”, “belly-busting”, “bust one's ass (or butt)”, “bust a gut”, “bust a move”, “bust a nut”, “bust-down”, “bust loose”, “bust off”, “bust one's balls”, “bust-out”, “sod buster”, “bust the dust”, “myth-busting” and “trend-busting”.  In the sense of “breaking through”, bust was from the Middle English busten, a variant of bursten & bresten (to burst) and may be compared with the Low German basten & barsten (to burst).  Bust in the sense of “break”, “smash”, “fail”, “arrest” etc was a creation of mid-nineteenth century US English and is of uncertain inspiration but most etymologists seem to concur it was likely a modification of “burst” effected with a phonetic alteration but it’s not impossible it came directly as an imperfect echoic of Germanic speech.  The apparent contradiction of bust meaning both “fail” and “dramatically succeed” happened because the former was an allusion to “being busted” (ie broken) while the latter meaning used the notion of “busting through”.

Monday, March 3, 2025

Chair

Chair (pronounced cherr)

(1) A seat, especially if designed for one person, usually with four legs (though other designs are not uncommon) for support and a rest for the back, sometimes with rests for the arms (as distinct from a sofa, stool, bench etc).

(2) Something which serves as a chair or provides chair-like support (often used of specialized medical devices) and coined as required (chairlift, sedan chair, wheelchair etc).

(3) A seat of office or authority; a position of authority such as a judge.

(4) In academic use, a descriptor of a professorship.

(5) The person occupying a seat of office, especially the chairperson (the nominally gendered term “chairman” sometimes still used, even of female or non-defined chairs).

(6) In an orchestra, the position of a player, assigned by rank (1st chair, 2nd chair etc).

(7) In informal use, an ellipsis of electric chair (often in the phrase “Got the chair” (ie received a death sentence)).

(8) In structural engineering, the device used in reinforced-concrete construction to maintain the position of reinforcing rods or strands during the pouring operation.

(9) In glass-blowing, a glassmaker's bench having extended arms on which a blowpipe is rolled in shaping glass.

(10) In railroad construction, a metal block for supporting a rail and securing it to a crosstie or the like (mostly UK).

(11) To place or seat in a chair.

(12) To install in office.

(13) To preside over a committee, board, tribunal etc or some ad hoc gathering; to act as a chairperson.

(14) To carry someone aloft in a sitting position after a triumph or great achievement (mostly UK and performed after victories in sport).

(15) In chemistry, one of two possible conformers of cyclohexane rings (the other being boat), shaped roughly like a chair.

(16) A vehicle for one person; either a sedan chair borne upon poles, or a two-wheeled carriage drawn by one horse (also called a gig) (now rare).

(17) To award a chair to the winning poet at an eisteddfod (exclusive to Wales).

1250-1300: From the Middle English chayer, chaire, chaiere, chaere, chayre & chayere, from the Old French chaiere & chaere (chair, seat, throne), from the Latin cathedra (seat), from the Ancient Greek καθέδρα (kathédra), the construct being κατά (katá) (down) + ἕδρα (hédra) (seat).  It displaced the native stool and settle, which shifted to specific meanings.  The twelfth century modern French chaire (pulpit, throne) in the sixteenth century separated in meaning when the more commonplace furniture came to be known as a chaise (chair).  Chair is a noun & verb and chaired & chairing are verbs; the noun plural is chairs.

The figurative sense of "seat of office or authority" emerged at the turn of the fourteenth century and originally was used of professors & bishops (there once being rather more overlap between universities and the Church).  That use persisted despite the structural changes in both institutions but it wasn’t until 1816 the meaning “office of a professor” was extended from the mid-fifteenth century sense of the literal seat from which a professor conducted his lectures.  Borrowing from academic practice, the general sense of “seat of a person presiding at meeting” emerged during the 1640s and from this developed the idea of a chairman, although the earliest use of the verb form “to chair a meeting” appears as late as 1921.  Although sometimes cited as indicative of the “top-down” approach taken by second-wave feminism, it was in the 1980s that the term chairwoman (woman who leads a formal meeting) first attained general currency; it had actually been in use since 1699, a coining apparently thought needed for mere descriptive accuracy rather than an early shot in the culture wars, chairman (occupier of a chair of authority) having been in use since the 1650s and by circa 1730 it had gained the familiar meaning “member of a corporate body appointed to preside at meetings of boards or other supervisory bodies”.  By the 1970s however, the culture wars had started and the once innocuous “chairwoman” was to some controversial, as was the gender-neutral alternative “chairperson” which seems first to have appeared in 1971.  Now, most seem to have settled on “chair” which seems unobjectionable although presumably, linguistic structuralists could claim it’s a clipping of (and therefore implies) “chairman”.

Chairbox offers a range of “last shift” coffin-themed chairs, said to be ideal for those "stuck in a dead-end job, sitting on a chair in a cubicle".  The available finishes include walnut (left) and for those who enjoy being reminded of cremation, charcoal wood can be used for the seating area (right).  An indicative list price is Stg£8300 (US$10,400) for a Last Shift trimmed in velvet.

The slang use as a short form of electric chair dates from 1900 and was used to refer both to the physical device and the capital sentence.  In interior decorating, the chair-rail was a timber molding fastened to a wall at such a height as would prevent the wall being damaged by the backs of chairs.  First documented in 1822, chair rails are now made also from synthetic materials.  The noun wheelchair (also wheel-chair) dates from circa 1700, and one so confined is said sometimes to be “chair bound”.  The high-chair (an infant’s seat designed to make feeding easier) had probably been improvised for centuries but was first advertised in 1848.  The term easy chair (a chair designed especially for comfort) dates from 1707.  The armchair (also arm-chair), a "chair with rests for the elbows", although a design of long-standing, was first so-described in the 1630s and the name outlasted the contemporary alternative (elbow-chair).  The adjectival sense, in reference to “criticism of matters in which the critic takes no active part” (armchair critic, armchair general etc) dates from 1879.  In academic use, although in the English-speaking world the use of “professor” seems gradually to be changing to align with US practice, the term “chair” continues in its traditional forms: There are chairs (established professorships), named chairs (which can be ancient or more recent creations which acknowledge the individual, family or institution providing the endowment which funds the position), personal chairs (whereby the title professor (in some form) is conferred on an individual although no established position exists), honorary chairs (unpaid appointments) and even temporary chairs (which means whatever the institution from time-to-time says it means).

In universities, the term “named chair” refers usually to a professorship endowed with funds from a donor, typically bearing the name of the donor or whatever title they nominate and the institution agrees is appropriate.  On rare occasions, named chairs have been created to honor an academic figure of great distinction (usually someone with a strong connection with the institution) but more often the system exists to encourage endowments which provide financial support for the chair holder's salary, research, and other academic activities.  For a donor, it’s a matter both of legacy & philanthropy in that a named chair is one of the more subtle and potentially respectable forms of public relations and a way to contribute to teaching & research in a field of some interest or with a previous association.

Professor Michael Simons (official photograph issued by Yale University's School of Medicine).

So it can be a win-win situation but institutions do need to practice due diligence in the process of naming or making appointments to named chairs, as a long running matter at Yale University demonstrates.  In 2013, an enquiry convened by Yale found Professor Michael Simons (b 1957) guilty of sexual harassment and suspended him as Chief of Cardiology at the School of Medicine.  Five years on, the professor accused Yale of “punishing him again” for the same conduct in a gender-discriminatory effort to appease campus supporters of the #MeToo movement which had achieved national prominence.  That complaint was prompted when Professor Simons was in 2018 appointed to, and then asked to resign from, a named chair, the Robert W Berliner Professor of Medicine, endowed by an annual grant of US$500,000 from the family of renal physiologist Robert Berliner (1915-2002).  Professor Simons took his case to court and early in 2024 a federal court ruled in his favour, permitting him to move to trial, Yale’s motion seeking summary judgment in all matters denied, the judge finding it appropriate that two of his complaints (one on the basis of gender discrimination in violation of Title VII of the Civil Rights Act (1964) and one under Title IX of the Education Amendments Act (1972)) should be heard before a jury.  The trial judge noted in his judgment that there appeared to be a denial of due process in 2018 and that happened at a time when (as was not disputed), Yale was “the subject of news reports criticizing its decision to reward a sexual harasser with an endowed chair.”

What the documents presented in Federal court revealed was that Yale’s handling of the matter had, even within the institution, not been without criticism.  In 2013 the University-Wide Committee on Sexual Misconduct found the professor guilty of sexual harassment and he was suspended (but not removed) as chief of cardiology at the School of Medicine.  Internal documents subsequently leaked to the New York Times (NYT) revealed there were 18 faculty members dissatisfied with that outcome and a week after the NYT sought comment from Yale, it was announced Simons would be removed from the position entirely and in November 2014, the paper reported that Yale had also removed him from his position as director of its Cardiovascular Research Center.  Simons alleges that these two additional actions were taken in response to public reaction to the stories published by the NYT but the university disputed that, arguing the subsequent moves were pursuant to the findings of an internal “360 review” of his job performance.  In 2018, Simons was asked to relinquish the Berliner chair on the basis he would be appointed instead to another endowed chair.  In the documents Simons filed in Federal Court, this request came after “one or more persons … sympathetic to the #MeToo movement” contacted the Berliner family encouraging them to demand that the University remove Simons from the professorship, prompting Yale, “fearing a backlash from the #MeToo activists and hoping to placate them,” to begin “exploring” his removal from the chair.

School of Medicine, Yale University, New Haven, Connecticut, USA.

Later in 2018, Simons was duly appointed to another named chair, prompting faculty members, students and alumni to send an open letter to Yale’s president expressing “disgust and disappointment” at the appointment.  The president responded with a formal notice to Simons informing him he had 24 hours to resign from the chair, and Simons also alleges the president told him of “concerns” the institution had about the public criticism.  In October 2019, Simons filed suit against Yale (and a number of individuals) on seven counts: breach of contract, breach of the implied warranty of fair dealing, wrongful discharge, negligent infliction of emotional distress, breach of privacy, and discrimination on the basis of gender under both Title VII of the Civil Rights Act of 1964 and Title IX of the Education Amendments of 1972.  Three of these (wrongful discharge, negligent infliction of emotional distress and breach of privacy) were in 2020 struck out in Federal Court and this was the point at which Yale sought summary judgment on the remainder.  That motion was partially granted but the judge held the matter of gender discrimination in violation of Title VII and Title IX needed to be decided by a jury.  A trial date has not yet been set but the case will be followed with some interest: while all cases are decided on the facts presented, the outcome may be an indication of the current relative strength of “black letter law” versus “prevailing community expectations”.

Personal chair: Lindsay Lohan adorning a chair.

The Roman Catholic Church’s dogma of papal infallibility holds that a pope’s rulings on matters of faith and doctrine are infallibly correct and cannot be questioned.  When making such statements, a pope is said to be speaking ex cathedra (literally “from the chair” (of the Apostle St Peter, the first pope)).  Although ex cathedra pronouncements had been issued since medieval times, as a point of canon law the doctrine was first codified at the First Ecumenical Council of the Vatican (Vatican I; 1869–1870) in the document Pastor aeternus (shepherd forever).  Since Vatican I, the only ex cathedra decree has been Munificentissimus Deus (The most bountiful God), issued by Pius XII (1876–1958; pope 1939-1958) in 1950, in which was declared the dogma of the Assumption: that the Virgin Mary, "having completed the course of her earthly life, was assumed body and soul into heavenly glory".  Pius XII never made explicit whether the Assumption preceded or followed earthly death, a point no pope has since discussed although it would seem of some theological significance.  Prior to the solemn definition of 1870, there had been decrees issued ex cathedra.  In Ineffabilis Deus (Ineffable God (1854)), Pius IX (1792–1878; pope 1846-1878) defined the dogma of the Immaculate Conception of the Blessed Virgin Mary, an important point because of the theological necessity of Christ being born free of sin, a notion built upon by later theologians as the perpetual virginity of Mary, which asserts that Mary was "always a virgin, before, during and after the birth of Jesus Christ", explaining the biblical references to brothers of Jesus as children of Joseph from a previous marriage, cousins of Jesus, or just folk closely associated with the Holy Family.

Technically, papal infallibility may have been invoked only once since codification but since the early post-war years, pontiffs have found ways to achieve the same effect, John Paul II (1920–2005; pope 1978-2005) & Benedict XVI (1927–2022; pope 2005-2013, pope emeritus 2013-2022) both adept at using what was in effect a personal decree, a power available to one who sits at the apex of what is in constitutional terms an absolute theocracy.  Critics have called this phenomenon "creeping infallibility" and its intellectual underpinnings owe much to the tireless efforts of Benedict XVI while he was head of the Inquisition (by then called the Congregation for the Doctrine of the Faith (CDF) and since renamed the Dicastery for the Doctrine of the Faith (DDF)) during the late twentieth century.  The Holy See probably doesn't care but DDF is also the acronym, inter alia, for "drug & disease free" and (in gaming) "Doom definition file".  There's also the DDF Network, an aggregator of pornographic content.

The “chair” photo (1963) of Christine Keeler (1942-2017) by Hong Kong Chinese photographer Lewis Morley (1925-2013) (left) and Joanne Whalley-Kilmer (b 1961) in Scandal (1989, a Harvey Weinstein (b 1952) production) (centre).  The motif was reprised by Taiwanese-American photographer Yu Tsai (b 1975) in his sessions for the Lindsay Lohan Playboy photo-shoot; it was used for the cover of the magazine’s January/February 2012 issue (right).  Ms Lohan wore shoes for some of the shoot but these were still "nudes" because "shoes don't count"; everybody knows that. 

The Profumo affair was one of those fits of morality which from time to time would afflict English society in the twentieth century and was a marvellous mix of class, sex, spying & money, all things which make an already good scandal especially juicy.  The famous image of model Christine Keeler, nude and artfully positioned sitting backwards on an unexceptional (actually a knock-off) plywood chair, was taken in May 1963, during the moral panic over the disclosure the young lady simultaneously was enjoying the affection of both a member of the British cabinet and a Soviet spy.  John Profumo (1915-2006) was the UK’s Minister for War (the UK cabinet retained the position until 1964 although the US counterpart had been dis-established in 1947) who, then 46, was found to be conducting an adulterous affair with the then 19-year-old topless model at the same time she (presumably as her obviously crowded schedule permitted) fitted in trysts with a KGB agent attached to the Soviet embassy under the cover of naval attaché.  Although there are to this day differing interpretations of the scandal, there have never been any doubts this potential Cold War conduit between Moscow and Her Majesty’s Secretary of State for War represented at least a potential conflict of interest.  The fallout from the scandal ended Profumo’s political career, contributed to the fall of Harold Macmillan’s (1894–1986; UK prime-minister 1957-1963) government and was one of a number of factors in the social changes which marked English society in the 1960s.  Commendably, the former army officer's sang-froid didn't desert him: woken from his sleep to be told the scandal was about to break, he remarked: "Well, at least it was with a woman".  That line was for years quoted approvingly and it was only when his bisexuality became common knowledge that it was re-appraised.

Commercially & technically, photography then was a different business and the “chair” image was the last shot on a 12-exposure roll of film, all taken in less than five minutes at the end of a session which had been hurriedly arranged because Ms Keeler had signed a contract which included a “nudity” clause for photos to be used as “publicity stills” for a proposed film about the scandal.  As things turned out, that film was never released (not until Scandal (1989) would one appear) but the photograph was leaked to the tabloid press, becoming one of the more famous of the era, although later feminist critiques would deconstruct the issues of exploitation they claimed were inherent.  Playboy’s editors would not have been unaware of the criticism but the use of a chair to render a nude image SFW (suitable for work) remains in the SOP (standard operating procedures) manual.

Contact sheet from photoshoot, Victoria and Albert (V&A) Museum: exhibit E.2830-2016.

Before the “nude” part which concluded the session, two rolls of film had already been shot with the subject sitting in various positions (on the chair and the floor) while “wearing” a small leather jerkin.  At that point the film’s producers mentioned the “nude” clause.  Ms Keeler wasn’t enthusiastic but the producers insisted, so all except subject and photographer left the room and the last roll was shot, some of the earlier poses reprised while others were staged; the final frame, taken with the camera a little further away and the subject in what Mr Morley described as “a perfect positioning”, was the “chair” shot.

The “Keeler Chair” (left) and an Arne Jacobsen Model 3107 (right).

Both chair & the gelatin-silver print of the photograph are now in the collections of London’s Victoria and Albert (V&A) Museum (the photograph exhibit E.2-2002; the chair W.10-2013).  Although often wrongly identified as a Model 3107 (1955) by Danish modernist architect & furniture designer Arne Jacobsen (1902-1971), it’s actually an example of one of a number of inexpensive knock-offs produced in the era.  Mr Morley in 1962 bought six (at five shillings (50c) apiece) for his studio and it’s believed his were made in Denmark, although the identities of the designer and manufacturer are unknown.  Unlike a genuine 3107, the knock-off has a handle cut-out (in a shape close to a regular trapezoid) high on the back, an addition both functional and a ploy typical of those used by knock-off producers seeking to evade accusations of copyright violation.  Structurally, a genuine 3107 uses a thinner grade of plywood and a more subtle molding.  The half-dozen chairs in Mr Morley’s studio were mostly unnoticed office furniture until Ms Keeler lent one its infamy, although they did appear in others of his shoots, including his session with television personality & interviewer Sir David Frost (1939–2013), and it’s claimed the same chair was used for both.  In London’s second-hand shops it’s still common to see the knock-offs (there were many) described as “Keeler” chairs and Ms Lohan’s Playboy shoot was one of many in which the motif has been used.  The obvious choice of pose for Joanne Whalley-Kilmer’s promotional shots for the 1989 film in which she played Ms Keeler, it appeared also on the covers of the DVD & Blu-ray releases.

Old Smoky, the electric chair once used in the Tennessee prison system, Alcatraz East Crime Museum.  "Old Sparky" was once the preferred nickname but in modern use "the chair" seems to have prevailed.

"Then we'd get the chair": The Simpsons, season six.

Crooked Hillary Clinton in a pantsuit.

Although the numbers did bounce around a little, polling by politico.com found that typically about half of Republican voters believe crooked Hillary Clinton (b 1947; US secretary of state 2009-2013) should be locked up while fewer than 2% think she should “get the chair”, apparently on the basis of her being guilty of something, although some might just find her “really annoying” and take the pragmatic view a death sentence would remove at least that problem from their lives.  The term “electric chair” is most associated with the device used for executions but is also common slang for other machinery including electric wheelchairs and powered (heated, cooled or moving) seats and chairs of many types.  First used in the US during the 1890s, the electric chair was, like the guillotine, designed as a more humane (ie faster) method of execution than the then common hanging, where death could take minutes.  Now rarely used (and in some cases declared unconstitutional as a “cruel & unusual punishment”), it technically remains available in some US states, including as an option the condemned may choose in preference to lethal injection or the firing squad.  Interestingly, although during the successful 2016 campaign Donald Trump (b 1946; US president 2017-2021 and since 2025) made much of "locking up" crooked Hillary were he to be elected, once he was in the White House the usefulness of the "promise" was exhausted.  His supporters however expected a prosecution and journalists did ask whether he would order investigations into the conduct of Bill (b 1946; US president 1993-2001) and crooked Hillary.  He replied with a perfunctory shake of the head and an almost mumbled "No, they're good people" and that was the end of that.  It was an interesting insight into many aspects of Mr Trump's character and political techniques.

Electric Chair Suite (1971) screen print decalogy by Andy Warhol.

Based on a newspaper photograph (published in 1953) of the death chamber at Sing Sing Prison in New York, where US citizens Julius (1918-1953) & Ethel Rosenberg (1915-1953) were that year executed as spies, Andy Warhol (1928–1987) produced a number of versions of Electric Chair, part of the artist’s Death and Disaster series which, beginning in 1963, depicted imagery such as car crashes, suicides and urban unrest.  The series was among the many which exploited his technique of transferring a photograph in glue onto silk, a method which meant each print varied in some slight way.  His interest was two-fold: (1) what is the effect on the audience of rendering the same image with variations and (2) if truly gruesome pictures repeatedly are displayed, is the effect one of reinforcement or desensitization?  His second question was later revisited as the gratuitous repetition of disturbing images became more common once the substantially unmediated internet achieved critical mass.  The first of the Electric Chair works was created in 1964.