
Saturday, December 20, 2025

Enthrone

Enthrone (pronounced en-throhn)

(1) To put on the throne in a formal installation ceremony (sometimes called an enthronement) which variously could be synonymous with (or simultaneously performed with) a coronation or other ceremonies of investiture.

(2) Figuratively, to help a candidate to the succession of a monarchy or, by extension, to the leadership of any other major organisation (ie the role of “kingmakers”, literal and otherwise).

(3) To invest with sovereign or episcopal authority (ie a legal instrument separate from any ceremony).

(4) To honour or exalt (now rare except in literary or poetic use).

(5) Figuratively, to assign authority to or vest authority in.

Circa 1600: The construct was en- + throne and the original meaning was “to place on a throne, exalt to the seat of royalty”.  For this purpose it replaced the late fourteenth century enthronize, from the thirteenth century Old French introniser, from the Late Latin inthronizare, from the Greek enthronizein.  In the late fourteenth century the verb throne (directly from the noun) was used in the same sense.  Throne (the chair or seat occupied by a sovereign, bishop or other exalted personage on ceremonial occasions) dates from the late twelfth century and was from the Middle English trone, from the Old French trone, from the Latin thronus, from the Ancient Greek θρόνος (thrónos) (chair, high-set seat, throne).  It replaced the earlier Middle English seld (seat, throne).  In facetious use, as early as the 1920s, throne could mean “a toilet” (used usually in the phrase “on the throne”) and in theology had the special use (in the plural and capitalized) describing the third (a member of an order of angels ranked above dominions and below cherubim) of the nine orders into which the angels traditionally were divided in medieval angelology.  The en- prefix was from the Middle English en- (en-, in-), from the Old French en- (also an-), from the Latin in- (in, into).  It was also an alteration of in-, from the Middle English in-, from the Old English in- (in, into), from the Proto-Germanic in (in).  Both the Latin & Germanic forms were from the primitive Indo-European en (in, into).  The intensive use of the Old French en- & an- was due to confluence with the Frankish intensive prefix an- which was related to the Old English intensive prefix -on.  It formed a transitive verb whose meaning is to make the attached adjective (1) in, into, (2) on, onto or (3) covered.  It was used also to denote “caused” or as an intensifier.  The prefix em- was (and still is) used before certain consonants, notably the labials b and p.  Enthrone, dethrone, enthronest & enthronize are verbs, enthronement, enthronization & enthroner are nouns, enthroning is a noun & verb, enthroned is a verb & adjective; the noun plural is enthronements.  The adjective enthronable is non-standard.  The derived forms include the verbs unenthrone, reenthrone & disenthrone and although there have been many enthroners, the form enthronee has never existed.

Alhaji Ibrahim Wogorie (b 1967) being enskinned as North Sisala community chief, Ghana, July 2023.

In colonial-era West Africa the coined forms were “enskin” (thus enskinment, enskinning, enskinned) and “enstool” (thus enstoolment, enstooling, enstooled).  These words were used to refer to the ceremonies in which a tribal chief was installed in his role; the meanings thus essentially the same as enjoyed in the West by “enthrone”.  The constructs reflected a mix of indigenous political culture and English morphological adaptation during the colonial period, the elements explained by (1) the animal skins (the distinctive cheetah often mentioned in the reports of contemporary anthropologists although in some Islamic and Sahelian-influenced chieftaincies (including the Dagomba, Mamprusi, Hausa emirates), a cow or lion skin often was the symbol of authority) which often surrounded the new chief and (2) the tradition in Africa of a chief sitting on a stool.  Sometimes, the unfortunate animal’s skin would be laid over the stool (and almost always, one seems to have been laid at the chief’s feet) but in some traditions (notably in northern Ghana and parts of Nigeria) it was a mark of honor for the chief to sit on a skin spread on the ground.

Dr Mahamudu Bawumia (b 1963), enstooled as Nana Ntentankesehene (Chief of the Internet/Web), Ghana, August 2024.  Note the cheetah skin used to trim the chair.

The stool was the central symbol of chieftaincy and kingship among Akan-speaking peoples (still in present-day Ghana where “to enskin” is used generally to mean “to install as a leader of a group” and the constitution (1992) explicitly protects the institution of chieftaincy and judicial decisions routinely use “enstool” or “enskin” (depending on region)).  In Akan political culture, the most famous use was the Sika Dwa Kofi (the Golden Stool) of the Asante and it represented the embodiment of the polity and ancestors, not merely a seat (used rather like the synecdoches “the Pentagon” (for the US Department of Defense (which appears now to be headed by a cabinet officer who simultaneously is both Secretary of Defense & Secretary of War)) or “Downing Street” (for the UK prime minister or the government generally)).  Thus, to be “enstooled” is ritually to be placed into office as chief, inheriting the authority vested in the stool.  Enskin & enstool (both of which seem first to have appeared in the records of the Colonial Office in the 1880s and thus were products of the consolidation of British indirect rule in West Africa, rather than being survivals from earlier missionary English which also coined its own terms) were examples of semantic calquing (the English vocabulary reshaped to encode indigenous concepts) and, as it was under the Raj in India, it was administrative pragmatism, colonial officials needing precise (and standardized) terms that distinguished between different systems of authority.  In truth, they were also often part of classic colonial “fixes” in which the British would take existing ceremonies and add layers of ritual to afforce the idea of a chief as “their ruler” and within a couple of generations, sometimes the local population would talk of the newly elaborate ceremony as something dating back centuries; the “fix” was a form of constructed double-legitimization.

A classic colonial fix was the Bose Levu Vakaturaga (Great Council of Chiefs) in Fiji which the British administrators created in 1878.  While it's true that prior to European contact there had been meetings between turaga (tribal chiefs) to settle disputes and for other purposes, all the evidence suggests these were ad-hoc gatherings with little of the formality, pomp and circumstance the British introduced.  Still, it was a successful institution which the chiefs embraced, apparently with some enthusiasm because the cloaks and other accoutrements they adopted for the occasion became increasingly elaborate and it was a generally harmonious form of indigenous governance which enabled the British to conduct matters of administration and policy-making almost exclusively through the chiefs.  The council survived even after Fiji gained independence from Britain in 1970 until it was in 2012 abolished by the military government of Commodore Frank Bainimarama (b 1954; prime minister of Fiji 2007-2022), as part of a reform programme said to be an attempt to reduce ethnic divisions and promote a unified national identity.  The commodore's political future might have been more assured had he learned lessons from the Raj.

There was of course an element of racial hierarchy in all this and “enskin” & “enstool” denoted a “tribal chief” under British rule whereas “enthrone” might have been thought to imply some form of sovereignty because that was the linkage in Europe and that would never do.  What the colonial authorities wanted was to maintain the idea of “the stool” as a corporate symbol, the office the repository of the authority, not the individual.  The danger with using a term like “enthronement” was the population might be infected by the European notion of monarchy as a hereditary kingship with personal sovereignty; what the Europeans wanted was “a stool” and they would decide who would be enstooled, destooled or restooled. 

Prince Mangosuthu Buthelezi, Moses Mabhida Stadium, Durban, South Africa, October 2022.

English words and their connotations did continue to matter in the post-colonial world because although the colonizers might have departed, often the legacy of language remained, sometimes as an “official” language of government and administration.  In the 1990s, the office of South Africa’s Prince Mangosuthu Buthelezi (1928–2023) sent a series of letters to the world’s media outlets advising he should be styled as “Prince” and not “Chief”, on the basis of being the grandson of one Zulu king and the nephew of another.  The Zulus were once described as a “tribe” and while that reflected the use in ethnography, the appeal in the West was really that it represented a rung on the racist hierarchy of civilization, the preferred model being: white people have nations or states, Africans cluster in tribes or clans.  The colonial administrators recognized these groups had leaders and typically they used the style “chief” (from the Middle English cheef & chef, from the Old French chef & chief (leader), from the Vulgar Latin capus, from the Classical Latin caput (head), from the Proto-Italic kaput, from the primitive Indo-European káput).  As the colonial records make clear, there were “good” chiefs and “troublesome” chiefs, thus the need sometimes to arrange a replacement enstooling.

Unlike in the West where styles of address and orders of precedence were codified (indeed, somewhat fetishized), the traditions in Africa seem to have been more fluid and Mangosuthu Buthelezi didn’t rely on statute or even documented convention when requesting the change.  Instead, he explained “prince”, reflecting his Zulu royal lineage, not only was appropriate (he may have cast an envious eye at the many Nigerian princes) but was also commonly used as his style by South African media, some organs of government and certainly his own Zulu-based political party (IQembu leNkatha yeNkululeko (the IFP; Inkatha Freedom Party)).  He had in 1953 assumed the chieftainship (as Inkosi) of the Buthelezi clan, something officially recognized four years later by Pretoria although not until the early 1980s (when it was thought he might be useful as a wedge to drive into the ANC (African National Congress)) did the Apartheid-era government seem to have started referring to him as “prince”.  Despite that cynical semi-concession, there was never a formal re-designation.

Enthroned & installed: Lindsay Lohan in acrylic & rhinestone tiara during “prom queen scene” in Mean Girls (2004).

In the matter of prom queens and such, it’s correct to say there has been “an enthronement” because even in the absence of a physical throne (in the sense of “a chair”), the accession is marked by the announcement and the placing of the crown or tiara.  This differs from something like the “enthroning” of a king or queen in the UK because, constitutionally, there is no interregnum, the new assuming the title as the old took their last breath, and “enthronement” is a term applied only casually to the coronation.  Since the early twentieth century, the palace and government have contrived to make the coronation an elaborate ceremony (latterly one “made for television”) although it has no constitutional significance beyond the rituals related to the sovereign’s role as Supreme Governor of the Church of England.

Dame Sarah Mullally in the regalia of Bishop of London; in January 2026, she will take office as Archbishop of Canterbury, the formal installation in March.  No longer one of the world's more desirable jobs (essentially because it can't be done), all wish her the best of British luck.

In October 2025, the matter of enthronement (or, more correctly, non-enthronement) in the Church of England made a brief splash in some of the less explored corners of social media after it was announced the ceremony marking the accession of the next Archbishop of Canterbury would be conducted in Canterbury Cathedral in March 2026.  The announcement was unexceptional in that it was expected and for centuries Archbishops of Canterbury have come and gone (although the last one was declared gone rather sooner than expected) but what attracted some comment was the new appointee was to be “installed” rather than the once traditional “enthroned”.  The conclusion some drew was this apparent relegation was related to the next archbishop being Dame Sarah Mullally (née Bowser; b 1962) the first woman to hold the once desirable job, the previous 105 prelates having been men, the first, Saint Augustine of Canterbury in 597.

However, there is in the church no substantive legal or theological significance in the use of “installed” rather than “enthroned” and the choice reflects modern ecclesiastical practice rather than having any doctrinal or canonical effect.  A person becomes Archbishop of Canterbury through a sequence of juridical acts and these constitute the decisive legal instruments; ceremonial rites have a symbolic value but nothing more, the power of the office vested from the point at which the legal mechanisms have correctly been executed (in that, things align with the procedures used for the nation’s monarchs).  So the difference is one of tone rather than substance and the “modern” church has for decades sought to distance itself from perceptions it may harbor quasi-regal aspirations or the perpetuation of clerical grandeur and separateness; at least from Lambeth Palace, the preferred model long has been pastoral and most Church of England bishops have for some time been “installed” in their cathedrals (despite “enthronement” surviving in some press reports, a product likely either of nostalgia or “cut & paste journalism”).  That said, some Anglican provinces outside England still “enthrone” (apparently on the basis “it’s always been done that way” rather than the making of a theological or secular point).

Lambeth Palace, the Archbishop of Canterbury's official London residence.

Interestingly, Archbishops of York (“the church in the north”) have continued to be enthroned while those at Canterbury became installations.  Under canon law, the wording makes literally no difference and historians have concluded the older form is clung to for no reason other than “product differentiation”, York Minster often emphasizing its continuity with medieval ceremonial forms; it’s thus a mere cultural artefact, the two ceremonies performing the same liturgical action: seating the archbishop in the cathedra (the chair (throne) of the archbishop).  Because it’s the Archbishop of Canterbury and not York who sits as the “spiritual head of the worldwide Anglican Communion”, in York there’s probably not the same sensitivity to criticism of continuing with “Romish ways” and the whiff of “popery”.

In an indication of how little the wording matters, it’s not clear who was the last Archbishop of Canterbury who could be said to have been “enthroned” because there was never any differentiation of form in the ceremonies and the documents suggest the terms were used casually and even interchangeably.  What can be said is that Geoffrey Fisher (1887–1972; AoC-99: 1945-1961) was installed at a ceremony widely described (in the official programme, ecclesiastical commentaries and other church & secular publications) as an “enthronement” and that was the term used in the government Gazette; that’s as official an endorsement of the term as seems possible because, the Church of England being an established church, its bishops are appointed by the Crown on the advice of the prime minister although the procedure has at least since 2007 been a “legal fiction” because the church’s CNC (Crown Nominations Commission) sends the names to the prime minister who acts as a “postbox”, forwarding them to the palace for the issuing of letters patent confirming the appointment.  When Michael Ramsey (1904–1988; AoC-100: 1961-1974) was appointed, although the term “enthrone” did appear in press reports, the church’s documents almost wholly seem to have used “install” and since then, in Canterbury, it’s been installations all the way.

Pope Pius XII in triple tiara at his coronation, The Vatican, March, 1939.

So, by the early 1960s the church was responding, if cautiously, to the growing anti-monarchical sentiment in post-war ecclesiology although this does seem to have been a sentiment of greater moment to intellectuals and theologians than parishioners.  About these matters there was however a kind of ecumenical sensitivity emerging and the conciliar theology later was crystallised (if not exactly codified) in the papers of the Second Vatican Council (Vatican II, 1962-1965, published 1970).  The comparison with the practice in Rome is interesting because there are more similarities than differences although that is obscured by words like “enthronement” and “coronation” being seemingly embedded in the popular (and journalistic) imagination.  That’s perhaps understandable because for two millennia as many as 275 popes (officially the count is 267 but it’s not certain how many there have been because there have been “anti-popes” and allegedly even one woman (although that’s now largely discounted)) have sat “on the throne of Saint Peter” (retrospectively the first pope) so the tradition is long.  In Roman Catholic canon law, “enthronement” is not a juridical term; the universal term is capio sedem (taking possession of the see (ie “installation”)) and, as in England, an appointment is formalized once the legal instruments are complete, the subsequent ceremony, while an important part of the institution’s mystique, existing for the same reason as it does for the Church of England or the House of Windsor: it’s the circuses part of panem et circenses (bread and circuses).  Unlike popes who once had coronations, archbishops of Canterbury never did because they made no claim to temporal sovereignty.

Pope Paul VI in triple tiara at his coronation, The Vatican, June 1963.  It was the last papal coronation.

So, technically, modern popes are “installed as Bishop of Rome” and in recent decades the Holy See has adjusted the use of accoutrements to dispel any implication of an “enthronement”.  The last papal coronation at which a pope was crowned with the triple tiara was that of Paul VI (1897-1978; pope 1963-1978) but in “an act of humility” he removed it, placing it on the altar where (figuratively) it has since sat.  Actually, Paul VI setting aside the triple tiara as a symbolic renunciation of temporal and monarchical authority was a bit overdue because the Papal States had been lost to the Holy See with the unification of Italy in 1870 though the Church refused to acknowledge that reality; in protest, no pope for decades set foot outside the Vatican.  However, in the form of the Lateran Treaty (1929), the Holy See entered into a concordat with the Italian state whereby (1) the Vatican was recognized as a sovereign state and (2) the church was recognized as Italy’s state religion, in exchange for which the Holy See accepted the territorial and political reality.  Despite that, until 1963 the triple tiara (one tier of which was said to symbolize the pope’s temporal authority over the Papal States) appeared in the coronations of Pius XII (1876-1958; pope 1939-1958), John XXIII (1881-1963; pope 1958-1963) and Paul VI (who didn’t formally abolish the rite of papal coronation from the Ordo Rituum pro Ministerii Petrini Initio Romae Episcopi (Order of Rites for the Beginning of the Petrine Ministry of the Bishop of Rome (the liturgical book detailing the ceremonies for a pope's installation)) until 1975).

The Chair of St Augustine.  In church circles, archbishops of Canterbury are sometimes said to "occupy the Chair of St Augustine".

The Chair of St Augustine sits in Canterbury Cathedral but technically, an AoC is “twice installed”: once on the diocesan throne as Bishop of the see of Canterbury and also on the Chair of St Augustine as Primate of All England (first among the nation's bishops) and spiritual leader of the worldwide Anglican Communion.  So, there’s nothing unusual in Sarah Mullally being “installed” rather than “enthroned”, as would have been the universal terminology between the Reformation and the early twentieth century.  Linguistically, legally and theologically, the choice of words is a non-event and anyone who wishes to describe Dame Sarah as “enthroned” may do so without fear of condemnation, excommunication or a burning at the stake.  What is most likely is that of those few who notice, fewer still are likely to care.

Monday, October 20, 2025

Etching

Etching (pronounced ech-ing)

(1) The art, act or process of making designs or pictures on a metal plate, glass etc, by the corrosive action of an acid instead of by a burin.

(2) An impression, as on paper, taken from an etched plate.

(3) The design so produced.

(4) A flat (usually metal) plate bearing such a design.

1625–1635: The construct was etch + -ing.  The verb etch was from the Dutch etsen (to engrave by eating away the surface with acids), from the German ätzen (to etch), from the Old High German azzon (to cause to bite or feed), from the Proto-Germanic atjaną, causative of etaną (to eat), from the primitive Indo-European root ed- (to eat) (from these sources English gained “eat”).  The suffix –ing was from the Middle English -ing, from the Old English –ing & -ung (in the sense of the modern -ing, as a suffix forming nouns from verbs), from the Proto-West Germanic –ingu & -ungu, from the Proto-Germanic –ingō & -ungō.  It was cognate with the Saterland Frisian -enge, the West Frisian –ing, the Dutch –ing, the Low German –ing & -ink, the German –ung, the Swedish -ing and the Icelandic –ing; all the cognate forms were used for the same purpose as the English -ing.  The “etching scribe” was a needle-sharp steel tool for incising into plates in etching and the production of dry points.  Etching is a noun & verb; the noun plural is etchings.

The noun was the present participle and gerund of etch (the verbal noun from the verb etch) and was used also in the sense of “the art of engraving”; by the 1760s, it was used also to mean “a print etc, made from an etched plate” and the plates themselves.  The term etching (to cut into a surface with an acid or other corrosive substance in order to make a pattern) is most associated with the creation of printing plates for the production of artistic works but the technique was used also as a way to render decorative patterns on metal.  In modern use, it’s also a term used in the making of circuit boards.  In idiomatic use (often as “etched in the memory”), it’s used of events, ideas etc which are especially memorable (for reasons good and ill) and as a slang word meaning “to sketch; quickly to draw”.  The Etch A Sketch drawing toy was introduced in 1960 by the Ohio Art Company; a kind of miniature plotter, it was a screen with two knobs which moved a stylus horizontally & vertically, displacing an aluminum powder to produce solid lines.  To delete the creation, the user physically shook the device, which re-coated the inside of the screen with the powder, blanking the image.

Rembrandt's Jan Asselyn, Painter (1646) (left) and Faust (circa 1652).  Rembrandt (Rembrandt Harmenszoon van Rijn (1606-1669)) wasn’t the most prolific etcher but remains among the most famous and his output provides an illustrative case-study in the evolution of his mastery of the technique, his early work really quite diffident compared with his later boldness.

What came to be known as etching gained the name from the Germanic family of words meaning “eat” & “to eat”, the transferred sense an allusion to the acid which literally would “eat the metal”.  Etching is an intaglio (from the Italian intagliare (to engrave)) technique in printmaking, a term which includes methods such as hard and soft ground etching, engraving, dry-point, mezzotint and aquatint, all of which use an ink transferring process.  In this, a design is etched into a plate, the ink added over the whole surface of the plate before a scrim (historically starched cheesecloth) is used to force the ink into the etched areas and remove any excess.  Subsequently, the plate (along with dampened paper) is run through a press at high pressure, forcing the paper into etched areas containing the ink.  The earliest known signed and dated etching was created by Swiss Renaissance goldsmith Urs Graf (circa 1485-circa 1525) in 1513 and it’s from those who worked with gold that almost all forms of engraving are ultimately derived.

Lindsay Lohan, 1998, rendered in the style of etchings.

A phrase which was so beloved by comedy writers in the early-mid twentieth century that it became a cliché was “Want to come up and see my etchings?”, a euphemism for seduction.  Probably now a “stranded phrase”, the saying was based on some fragments of text in a novel by Horatio Alger Jr (1832–1899), a US author regarded as the first to formalize as genre fiction the “rags-to-riches” stories which had since the early days of the republic been the essence of the “American Dream” although it wasn’t until the twentieth century the term came into common use (often it’s now used ironically).

The practice of making etchings, woodcuts or engravings of famous paintings became popular, both artistically and commercially (which may for this purpose be much the same thing) in the late sixteenth century and developed over the next 250-odd years, an evolution tied closely to technological progress in printmaking and the materials available to artists.  The trend seems to have been accelerated by the spread in Europe (notably certain districts in Antwerp, Rome and Paris) of copperplate etching & engraving and while it may be dubious to draw conclusions from the works which have survived, the artists most reproduced clearly included Titian (Tiziano Vecellio, circa 1490-1576), Raphael (Raffaello Sanzio da Urbino, 1483–1520) and Michelangelo (Michelangelo di Lodovico Buonarroti Simoni; 1475–1564), while among the most prolific in the business of reproductive engraving were the Dutch specialist Cornelis Cort (circa 1533–circa 1578 and known in Italy where he spent his final years as Cornelio Fiammingo) and the Flemish Sadeler family, scions of which operated in many European cities.

Melencolia I (Melancholy I (1514)), engraving by the German painter & printmaker Albrecht Dürer (1471–1528).

By the seventeenth century, the practice was well established and an entrenched part of the art market, fulfilling some of the functions which would later be absorbed by photography; Peter Paul Rubens (1577-1640), among others, employed in his studio engravers whose task was to reproduce his paintings for sale, those etchings available in a variety of sizes (and thus price-points), so there’s little which is new in the structures of the modern art market.  Again, in the eighteenth century, technological determinism interplayed with public taste as techniques were refined; what emerged wasn’t exactly a production line but certainly an arrangement which made possible larger and more rapid volumes, printing houses commissioning runs (which could be in the hundreds) of engravings following (faithfully and not) British and Continental paintings, advances in mezzotint meaning a greater tonal range had become possible, mimicking the light and shade of oil paintings.  In genteel homes, various institutions and even museums, it was entirely respectable to have hanging: “prints after the Old Masters”.

But as technology giveth, so can it taketh away and it was the late eighteenth century invention of lithography and, a few decades later, photography which triggered a decline in demand, the constantly improving quality of the new mediums gradually displacing reproductive etching in the marketplace although the tradition didn’t die as artists such as Francisco Goya (1746–1828), Eugène Delacroix (1798–1863) and Édouard Manet (1832–1883) maintained (or, in a sense, “revived”) etching as a creative rather than reproductive discipline and historians regard the early-mid nineteenth century as the era in which etching became a legitimate genre in art and not merely a means of cheaply distributing representations of existing works.

Self portrait: reflection (1996), etching by Lucian Freud (1922–2011).

Under modern copyright law, as a general principle, the selling of etchings of works still in copyright is not lawful without permission from the holder of the rights and while details differ between jurisdictions, the basic rule of modern law (the Berne Convention, EU directives, national statutes) is that the sale (or even public display) of a reproduction or derivative work (the latter something which recognizably follows a copyrighted artwork without being an obvious duplicate) requires the consent of the copyright owner.  That protection tends in most jurisdictions to last 70 years after the death of the artist (the “life + 70” rule) and what matters is not that a reproduction is new (in the sense of the physical object) but that it is derivative because (1) to a high degree it emulates the composition, form(s), and expression of the original and (2) in appearance it to a high degree resembles a pre-existing, protected work.  Thus, if produced and offered for sale without permission, the object will infringe on the rights of whoever holds exclusive or delegated rights of reproduction or adaptation.  With exact or close replicas it’s not difficult for a court to determine whether rights have been violated but at the margins, such as where works are “in the style of” or “influenced by”, judgments are made on a case-by-case basis, the best publicized in recent years being those involving pop music.
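Reduced to arithmetic, the “life + 70” test described above can be sketched as below; this is a minimal illustration only, assuming the simplest case (a single, known artist and a jurisdiction applying the standard 70-year post-mortem term), and the function and constant names are invented for the example rather than drawn from any statute.

```python
from datetime import date

# Minimal sketch of the "life + 70" rule discussed above: illustrative only.
# Assumes a single known artist and a jurisdiction using the standard term;
# real determinations turn on jurisdiction, publication history and transitional rules.
COPYRIGHT_TERM_YEARS = 70

def still_in_copyright(artist_death_year: int, year: int | None = None) -> bool:
    """Return True if, on this simplified rule, a work by an artist who died
    in artist_death_year would still be protected in the given year."""
    year = year if year is not None else date.today().year
    return year <= artist_death_year + COPYRIGHT_TERM_YEARS

# Goya died in 1828, so his works are long out of copyright on this test,
# while an artist who died in 2011 (eg Lucian Freud) remains protected for decades.
print(still_in_copyright(1828))  # False
print(still_in_copyright(2011))  # True
```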

The Lovers (circa 1528), etching by Parmigianino (Girolamo Francesco Maria Mazzola, 1503-1540) of the Italian Mannerist school.

All that means is one (usually) can sell reproductions of works by those dead for at least 70 years and even works “in their style” as long as there’s no attempt to misrepresent them as the product of the original artist’s hand.  There are in copyright exceptions such as “fair dealing” & “fair use” but their range is narrow, limited to fields such as criticism, review, education or satire, and does not extend to normal commercial transactions.  Additionally, there’s a work-around in that if something is found to have been sufficiently “transformed”, it can be regarded as a new work but this exception tightly is policed and the threshold is high.  Further, in recent years, as museums and galleries have put content on-line, what has emerged is the additional complication of such an institution hanging on its walls paintings long out of copyright yet asserting copyright on its commissioned photographs of those works.  That development must have delighted lawyers working in the lucrative field but the courts came to read down the scope, most following the principles explained in an action before a US Federal District Court (Bridgeman Art Library v. Corel Corp., 36 F. Supp. 2d 191 (S.D.N.Y. 1999)), which held (1) copyright in a photograph can in many cases exist but (2) if a photograph is a “purely mechanical copy of a public-domain work” there is usually no protection, even if the taking of the image required technical skill in lighting, angle selection and such.

Morphinomanes (Morphine Addicts, 1887), etching and drypoint by French artist Albert Besnard (1849–1934).

There was still something for the lawyers because inherently there remained two separate layers of rights at play: (1) copyright (intellectual property) which belongs usually to the artist or their estate and (2) the property rights (ownership of the physical object) which are held by whoever has lawful title to the object (this may align with possession but not of necessity).  In other words a museum will likely own the paint & canvas yet not the copyright, something analogous with the discovery made by the few diligent souls who troubled themselves to read the small print they’d agreed to when installing software: in most cases one had lawful title to the physical media (diskette, CD, DVD etc) but often nothing more than a revocable licence to use a single instance of the software.  It’s possible lawfully to produce and sell etchings of Goya (the artist having had the decency to drop dead more than 70 years ago) but one may not without permission reproduce or trace from a museum’s copyrighted photo of a Goya (and institutions sometimes maintain separate conditions of use for low and high-resolution images).  Even if one paints, draws or etches by hand, depending on this and that, a court can still hold that a “substantial reproduction” of the composition, colour and such has been effected and thus that there’s been an infringement of the underlying copyright.  So, the rules in this area are (1) proceed with caution if producing art not wholly original and (2) if a young lady is asked: “Want to come up and see my etchings?”, she should proceed with caution.

Wednesday, August 20, 2025

Mansfield

Mansfield (pronounced manz-feeld)

The slang term for the protective metal structures attached to the underside of trucks and trailers, designed to protect occupants of vehicles in “under-run” crashes (the victim’s vehicle impacting, often at mid-windscreen height, with the solid frame of the truck’s tray).  Technically, a Mansfield bar is called a RUPS (Rear Underrun Protection System).

1967: The devices are known as “Mansfield” bars because interest in the system was heightened after the death of the actress Jayne Mansfield (1933-1967), killed in an under-run accident on 28 June 1967.  The origin of the surname Mansfield is habitational with origins in Mansfield, Nottinghamshire.  The early formations, recorded in the Domesday Book (1086), show the first element uniformly as the Celtic Mam- (mother or breast (Manchester had a similar linkage)) with the later addition of the Old English feld (pasture, open country, field) as the second element.  The locational sense is thus suggestive of “the field by a hill called Man”.  The etymology, one suspects, would have pleased Jayne Mansfield.

The "Mansfield crash" aftermath, 1966 Buick Electra 225, 28 June, 1967 (left) and the much re-printed photograph (right) of Sofia Loren (b 1934, left) and Jayne Mansfield (right), Romanoff's restaurant, Beverly Hills, Los Angeles, April 1957.  Ms Loren's sideways glance, one of the most famous in Hollywood's long history of such looks has been variously interpreted as "sceptical", "disapproving" and "envious", the latter a view probably restricted to men.  Ms Loren herself explained her look as one of genuine concern the pink satin gown might not prove equal to the occasion.  On the night, there were several photographers covering the event and images taken from other angles illustrate why that concern was reasonable.  There has never been any doubt Ms Mansfield's "wardrobe malfunction" was "engineered and rehearsed".   

On 28 June 1967, Jayne Mansfield was a front-seat passenger in a 1966 Buick Electra 225 four-door hardtop, en route to New Orleans where she was next day to be the subject of an interview.  While cruising along the highway at around two in the morning, the driver failed to perceive the semi-truck in front had slowed to a crawl because an anti-mosquito truck ahead was conducting fogging and blocking the lane.  The mist from the spray masked the truck's trailer and, the driver unable to react in time, the car hit at high speed, sliding under the semi-trailer, killing instantly the three front-seat occupants.  Although the myth has long circulated that she was decapitated (an idea lent some credence by the visual ambiguity of photographs published at the time), the injury was severe head trauma, an autopsy determining the immediate cause of death was a "crushed skull with avulsion of cranium and brain".  The phenomenon of the “under-run” accident happens with some frequency because of a co-incidence of dimensions in the machines using the roads.  Pre-dating motorised transport, loading docks were built at a height of around four feet (48 inches; 1.2 m) because that was the most convenient height for men of average height engaged in loading and unloading goods.  Horse-drawn carts and later trucks were built to conform to this standard so trays would always closely align with the dock.  Probably very shortly after cars and trucks began sharing roads, they started crashing into each other and, despite impact speeds and traffic volumes being relatively low, the under-run accident was noted in statistics as a particular type as early as 1927.

In the post-war years, speeds and traffic volumes rose and, coincidentally, the hood-lines (bonnet) on cars became lower, the windscreen now often somewhere around four feet high so the under-run vulnerability was exacerbated, cars now almost designed to slide under a truck to the point of the windscreen, thus turning the tray into a kind of horizontal guillotine, slicing into the passenger compartment at head-height.  That’s exactly how Jayne Mansfield died and while the Buick was an imposing 223.4 inches (5,674 mm) in length, it was much lower than the sedans of earlier generations.  As a footnote, when introduced in 1959, the Electra 225 (1959-1980) gained its name from being 225.4 inches (5,725 mm) long and while during the 1960s it would be just a little shorter, by 1970 it did again deserve the designation, by 1975 growing to 233.4 inches (5,928 mm), making it the longest four-door hardtop ever built by GM (General Motors), a record unlikely to be broken.  The use of length as a model name was unusual but others have done it, most recently the Maybach (2002-2013), a revived marque intended by Mercedes-Benz as a competitor for Rolls-Royce & Bentley.  The Maybach was an impressive piece of engineering but its very existence only devalued the Mercedes-Benz brand and was an indication the MBAs who had supplanted the engineers as the company’s dominant influence really didn’t have a clue, even about marketing which was supposed to be their forte.  The Maybachs were designated “57” & “62”, the allusion to their length (5.7 & 6.2 metres respectively).  Between 1948-2016, many Land Rovers were given model designations according to their wheelbase (with a bit of rounding up or down for convenience) in inches, thus "80", "88", "110" etc.

Rear under-run Mansfield bar.

The US authorities did react, federal regulations requiring trucks and trailers be built with under-ride guards (reflectorized metal bars hanging beneath the back-end of trailers) passed in 1953, but the standards were rudimentary and until the incident in 1967, little attention was paid despite similar accidents killing hundreds each year.  The statistics probably tended to get lost among the ever-increasing road-toll, cars of the era being death traps, seat belts and engineering to improve crashworthiness almost unknown.  Predictably, the industry did its math (which took longer in the pre-spreadsheet era) and argued, given that above a certain speed impacts would still cause fatalities, the costs of retro-fitting heavy vehicles would be disproportionate to the number of lives saved or injuries avoided or made less severe.  It's macabre math but it is part of business and the most infamous example was Ford's numbers people working out it was projected to be cheaper to pay the costs associated with people being incinerated in rear-ended Pintos than it would be to re-engineer the fuel tank.

The Mansfield bar works by preventing the nose of the car being forced under the truck, protecting the passenger compartment from impact.

After 1967, although regulations were tightened and enforcement, though patchy, became more rigorous, deaths continued and in the US there are still an average of two-hundred fatalities annually in crashes involving Mansfield bars.  There are proposals by the Federal Motor Carrier Safety Administration (FMCSA) to include Mansfield bars in truck inspections and to improve the design to something more effective, the devices having changed little since first mandated, being little more than brute-force impact barriers, and there’s interest in spring-loaded devices which would absorb more of the energy generated in a crash.  Coincidentally, the increasing preference by consumers for higher, bluff-fronted SUVs and light (a relative term, the "light" pick-up trucks popular in the US market regarded as "big" just about everywhere else, even in the Middle East, Australia and New Zealand where they're also sold) trucks has helped improve this aspect of road safety.

There’s concern too about side impacts.  Only a very small number of trucks have ever been fitted with any side impact protection and this omission also makes corner impacts especially dangerous.  The cost of retro-fitting side (and therefore corner) Mansfield bars to a country’s entire heavy transport fleet would be onerous and it may be practical to phase in any mandatory requirements only over decades.

A photograph of a parked car & truck, the juxtaposition illustrating the limits of the protection afforded, especially in cases when the truck's tray extends well beyond the rear axle-line.  The moving truck was one of two hired by Lindsay Lohan (b 1986) when in early 2012 she moved out of 419 Venice Way, Venice Beach, Los Angeles where during 2011 she lived (in the house to the right; the semi-mirrored construction sometimes called a “pigeon pair”) next door to former special friend DJ Samantha Ronson (b 1977) who inhabited the one to the left (417).  She was compelled to move after a “freemason stalker threatened to kill her”, proving the Freemasons will stop at nothing.

Truganina, Melbourne, Australia, 4 June, 2025.

Mansfield bars can reduce injuries & fatalities but if the energy in a crash is sufficient (a function of mass and speed, with the angle of contact at the point of impact determining how that energy is transferred), the consequences will still be catastrophic.  In the early morning of 4 June 2025 in Melbourne, Australia, a Mustang coupé crashed into the right-rear corner of a parked truck, the passenger (sitting in the left front seat of the RHD (right-hand drive) car) killed instantly while the driver was taken to hospital with non-life-threatening injuries.
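The point about energy is simple physics: kinetic energy grows with the square of speed, so a relatively modest increase in velocity greatly increases what a Mansfield bar (and the car's crumple zone) must dissipate.  The figures below are a minimal illustration using an assumed mass and assumed speeds, not data from this or any particular crash.

```python
# Illustration of why speed dominates crash energy: E = 1/2 * m * v^2.
# The mass and speeds are assumptions chosen for the example, not crash data.
def kinetic_energy_kj(mass_kg: float, speed_kmh: float) -> float:
    """Kinetic energy (kilojoules) of a vehicle of mass_kg travelling at speed_kmh."""
    speed_ms = speed_kmh / 3.6  # km/h to m/s
    return 0.5 * mass_kg * speed_ms ** 2 / 1000

mass = 1800  # kg, a nominal figure for a modern coupé
for speed in (60, 100, 140):  # km/h
    print(f"{speed} km/h -> {kinetic_energy_kj(mass, speed):.0f} kJ")

# Doubling the speed roughly quadruples the energy the bar and the car's
# "safety cell" must absorb, which is why very high-speed impacts defeat both.
```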

Truganina, Melbourne, Australia, 4 June, 2025.

The damage sustained by the vehicles was what would be expected in the circumstances, the truck (built on a rigid steel ladder-chassis with a steel-framed freight compartment atop) suffering relatively minor damage while the Mustang’s (built to modern safety standards with the structure outside the passenger compartment designed as a “crumple zone” intended to absorb an impact’s energy before it reaches the occupants) left-front corner substantially was destroyed.  The right-side portion of the Mansfield bar which was hit was torn off in the impact, illustrating the limitations of the technology when speeds are very high, the same reason the car’s “safety cell” was unable to prevent a fatality.

The Seven Ups (1973).

Footage of crashes conducted during testing is illustrative but Hollywood does it better.  In the movie The Seven Ups (1973, produced & directed by Philip D'Antoni (1929-2018)), a 1973 Pontiac Ventura Custom, while pursuing a 1973 Pontiac Grand Ville, crashes into a truck with an impact similar to the one in which Jayne Mansfield died; this being Hollywood, the driver emerges bruised & bloodied but intact.  In the movie, the truck is not fitted with a Mansfield bar but if the speed at the point of impact is sufficient, the physics are such that even such a device is unlikely to prevent fatalities.  A re-allocation of a name used on Pontiac’s full-sized (B-Body) line between 1960-1970, the Ventura (1971-1977) was built on the GM (General Motors) compact platform (X-Body), until then exclusive to the Chevrolet Nova (1968-1979 and badged between 1962-1968 as the Chevy II).

Monday, June 30, 2025

Bunker

Bunker (pronounced buhng-ker)

(1) A large bin or receptacle; a fixed chest or box.

(2) In military use, historically a fortification set mostly below the surface of the ground with overhead protection provided by logs and earth or by concrete and fitted with above-ground embrasures through which guns may be fired.

(3) A fortification set mostly below the surface of the ground and used for a variety of purposes.

(4) In golf, an obstacle, classically a sand trap but sometimes a mound of dirt, constituting a hazard.

(5) In nautical use, to provide fuel for a vessel.

(6) In nautical use, to convey bulk cargo (except grain) from a vessel to an adjacent storehouse.

(7) In golf, to hit a ball into a bunker.

(8) To equip with or as if with bunkers.

(9) In military use, to place personnel or materiel in a bunker or bunkers (sometimes as “bunker down”).

1755–1760: From the Scottish bonkar (box, chest and also “seat” in the sense of “bench”), of obscure origin but etymologists conclude the use related to furniture hints at a relationship with banker (bench).  Alternatively, it may be from a Scandinavian source such as the Old Swedish bunke (boards used to protect the cargo of a ship).  The meaning “receptacle for coal aboard a ship” was in use by at least 1839 (coal-burning steamships coming into general use in the 1820s).  The use to describe the obstacles on golf courses is documented from 1824 (probably from the extended sense “earthen seat” which dates from 1805) but perhaps surprisingly, the familiar sense from military use (dug-out fortification) seems not to have appeared before World War I (1914-1918) although the structures so described had for millennia existed.  “Bunkermate” was army slang for the individual with whom one shares a bunker while the now obsolete “bunkerman” (plural “bunkermen”) referred to someone (often the man in charge) who worked at an industrial coal storage bunker.  Bunker & bunkerage are nouns, bunkering is a noun & verb, bunkered is a verb and bunkerish, bunkeresque, bunkerless & bunkerlike are adjectives; the noun plural is bunkers.

Just as ships called “coalers” were used to transport coal to and from shore-based “coal stations”, it was “oilers” which took oil to storage tanks or out to sea to refuel ships (a common naval procedure) and these STS (ship-to-ship) transfers were called “bunkering” as the black stuff was pumped, bunker-to-bunker.  That the coal used by steamships was stored on-board in compartments called “coal bunkers” led ultimately to another derived term: “bunker oil”.  When in the late nineteenth century ships began the transition from being fuelled by coal to burning oil, the receptacles of course became “oil bunkers” (among sailors nearly always clipped to “bunker”) and as refining processes evolved, the fuel specifically produced for oceangoing ships came to be called “bunker oil”.

Bunker oil is “dirty stuff”, a highly viscous, heavy fuel oil which is essentially the residue of crude oil refining; it’s that which remains after the more refined and volatile products (gasoline (petrol), kerosene, diesel etc) have been extracted.  Until late in the twentieth century, the orthodox view of economists was its use in big ships was a good thing because it was a product for which industry had little other use and, as essentially a by-product, it was relatively cheap.  It came in three flavours: (1) Bunker A: Light fuel oil (similar to a heavy diesel), (2) Bunker B: An oil of intermediate viscosity used in engines larger than marine diesels but smaller than those used in the big ships and (3) Bunker C: Heavy fuel oil used in container ships and such which use VLD (very large displacement), slow-running engines with a huge reciprocating mass.  Because of its composition, Bunker C especially produced much pollution and although much of this happened at sea (unseen by most but with obvious implications), when ships reached harbor to dock, all the smoke and soot became obvious.  Over the years, the worst of the pollution from the burning of bunker oil greatly has been reduced (the work underway even before the Greta Thunberg (b 2003) era), sometimes by the simple expedient of spraying a mist of water through the smoke.

Floor-plans of the upper (Vorbunker) and lower (Führerbunker) levels of the structure now commonly referred to collectively as the Führerbunker.

History’s most infamous bunker remains the Berlin Führerbunker in which Adolf Hitler (1889-1945; Führer (leader) and German head of government 1933-1945 & head of state 1934-1945) spent much of the last few months of his life.  In the architectural sense there were a number of Führerbunkers built, one at each of the semi-permanent Führerhauptquartiere (Führer Headquarters) created for the German military campaigns and several others built where required but it’s the one in Berlin which is remembered as “the Führerbunker”.  Before 1944, when the air raids by the RAF (Royal Air Force) and USAAF (US Army Air Force) intensified, the term Führerbunker seems rarely to have been used other than by the architects and others involved in their construction and it wasn’t a designation like Führerhauptquartiere which the military and other institutions of state shifted between locations (rather as “Air Force One” is attached not to a specific airframe but to whatever aircraft in which the US president is travelling).  In subsequent historical writing, the term Führerbunker tends often to be applied to the whole, two-level complex in Berlin and although it was only the lower layer which officially was designated as that, for most purposes the distinction is not significant.  In military documents, after January 1945 the Führerbunker was referred to as a Führerhauptquartier.

Führerbunker tourist information board, Berlin, Germany.

Only an information board at the intersection of den Ministergärten and Gertrud-Kolmar-Straße, erected by the German Government in 2006 prior to that year's FIFA (Fédération Internationale de Football Association (International Federation of Association Football)) World Cup, now marks the place on Berlin's Wilhelmstrasse 77 where once the Führerbunker was located.  The Soviet occupation forces razed the new Reich Chancellery and demolished all the bunker's above-ground structures but the subsequent GDR (Deutsche Demokratische Republik (German Democratic Republic; the old East Germany) 1949-1990) abandoned attempts completely to destroy what lay beneath.  Until after the fall of the Berlin Wall (1961-1989) the site remained unused and neglected, “re-discovered” only during excavations by property developers, the government insisting on the destruction of whatever was uncovered and, sensitive still to the spectre of “Neo-Nazi shrines”, for years the bunker’s location was never divulged, even as unremarkable buildings (an unfortunate aspect of post-unification Berlin) began to appear on the site.  Most of what would have covered the Führerbunker’s footprint is now a supermarket car park.

The first part of the complex to be built was the Vorbunker (upper bunker or forward bunker), an underground facility of reinforced concrete intended only as a temporary air-raid shelter for Hitler and his entourage in the old Reich Chancellery.  Substantially completed during 1936-1937, it was until 1943 listed in documents as the Luftschutzbunker der Reichskanzlei (Reich Chancellery Air-Raid Shelter), the Vorbunker label applied only in 1944 when the lower level (the Führerbunker proper) was appended.  In mid-January 1945, Hitler moved into the Führerbunker and, as the military situation deteriorated, his appearances above ground became less frequent until by late March he rarely saw the sky.  Finally, on 30 April, he committed suicide.

Bunker Busters

Northrop Grumman publicity shot of B2-Spirit from below, showing the twin bomb-bay doors through which the GBU-57 are released.

Awful as they are, there's an undeniable beauty in the engineering of some weapons and it's unfortunate humankind never collectively has resolved exclusively to devote such ingenuity to stuff other than us blowing up each other.  That’s not a new sentiment, being one philosophers and others have for millennia expressed in various ways although since the advent of nuclear weapons, concerns understandably become heightened.  Like every form of military technology ever deployed, once the “genie is out of the bottle” the problem is there to be managed and at the dawn of the atomic age, delivering a lecture in 1936, the British chemist and physicist Francis Aston (1877–1945) (who created the mass spectrograph, winning the 1922 Nobel Prize in Chemistry for his use of it to discover and identify the isotopes in many non-radioactive elements and for his enunciation of the whole number rule) observed:

There are those about us who say that such research should be stopped by law, alleging that man's destructive powers are already large enough.  So, no doubt, the more elderly and ape-like of our ancestors objected to the innovation of cooked food and pointed out the great dangers attending the use of the newly discovered agency, fire.  Personally, I think there is no doubt that sub-atomic energy is available all around us and that one day man will release and control its almost infinite power.  We cannot prevent him from doing so and can only hope that he will not use it exclusively in blowing up his next door neighbor.

The use in June 2025 by the USAF (US Air Force) of fourteen of its Boeing GBU-57 (Guided Bomb Unit-57) Massive Ordnance Penetrator (MOP) bombs against underground targets in Iran (twelve on the Fordow Uranium Enrichment Plant and two on the Natanz nuclear facility) meant “Bunker Buster” hit the headlines.  Carried by the Northrop B-2 Spirit heavy bomber (built between 1989-2000), the GBU-57 is a 14,000 kg (30,000 lb) bomb with a casing designed to withstand the stress of penetrating through layers of reinforced concrete or thick rock.  “Bunker buster” bombs have been around for a while, the ancestors of today’s devices first built for the German military early in World War II (1939-1945) and the principle remains unchanged to this day: up-scaled armor-piercing shells.  The initial purpose was to produce a weapon with a casing strong enough to withstand the forces imposed when impacting reinforced concrete structures, the idea simple in that what was needed was a delivery system which could “bust through” whatever protective layers surrounded a target, allowing the explosive charge to do damage where needed rather than wastefully being expended on an outer skin.  The German weapons proved effective but inevitably triggered an “arms race” in that as the war progressed, the concrete layers became thicker, walls over 2 metres (6.6 feet) and ceilings of 5 metres (16 feet) being constructed by 1943.  Technological development continued and the idea extended to rocket propelled bombs optimized both for armor-piercing and aerodynamic efficiency, velocity a significant “mass multiplier” which made the weapons still more effective.

USAF test-flight footage of Northrop B2-Spirit dropping two GBU-57 "Bunker Buster" bombs.

Concurrent with this, the British developed the first true “bunker busters”, building on the idea of the naval torpedo, one aspect of which was that, in exploding a short distance from its target, it could be highly damaging because it took advantage of one of the properties of water (quite strange stuff according to those who study it): it doesn’t compress.  What that meant was it was often the “shock wave” of the water rather than the blast itself which could breach a hull, the same principle used for the famous “bouncing bombs” used for the RAF’s “Dambuster” (Operation Chastise, 17 May 1943) raids on German dams.  Because of the way water behaved, it wasn’t necessary to score the “direct hit” which had been the ideal in the early days of aerial warfare.

RAF Bomber Command archive photograph of Avro Lancaster (built between 1941-1946) in flight with Grand Slam mounted (left) and a comparison of the Tallboy & Grand Slam (right), illustrating how the latter was in most respects a scaled-up version of the former.  To carry the big Grand Slams, 32 “B1 Special” Lancasters were in 1945 built with up-rated Rolls-Royce Merlin V12 engines, the removal of the bomb doors (the Grand Slam carried externally, its dimensions exceeding internal capacity), deleted front and mid-upper gun turrets, no radar equipment and a strengthened undercarriage.  Such was the concern with weight (especially for take-off) that just about anything non-essential was removed from the B1 Specials, even three of the four fire axes and its crew door ladder.  In the US, Boeing went through a similar exercise to produce the run of “Silverplate” B-29 Superfortresses able to carry the first A-bombs used in August, 1945. 

Best known of the British devices were the so-called “earthquake bombs”, the Tallboy (12,000 lb; 5.4 ton) & Grand Slam (22,000 lb; 10 ton) which, despite the impressive bulk, were classified by the War Office as “medium capacity”.  The terms “Medium Capacity” (MC) & “High Capacity” (HC) referenced not the gross weight or physical dimensions but the ratio of explosive filler to the total weight of the construction (ie how much was explosive compared to the casing and ancillary components).  Because both had thick casings to ensure penetration deep into hardened targets (bunkers and other structures encased in rock or reinforced concrete) before exploding, the proportion of filler accordingly was reduced compared with the ratio typical of contemporary ordnance.  A High Capacity (HC) bomb (the thin-cased “blockbusters” the best-known examples) had a thinner casing and a much higher proportion of explosive (sometimes over 70% of total weight).  These were intended for area bombing (known also as “carpet bombing”) and caused wide blast damage whereas the Tallboy & Grand Slam were penetrative, their casings optimized for aerodynamic efficiency, the near-supersonic speed attained in the drop working as a mass-multiplier.  The Tallboy’s 5,200 lb (2.3 ton) explosive load was some 43% of its gross weight while the Grand Slam’s 9,100 lb (4 ton) absorbed 41%; this may be compared with the “big” 4,000 lb (1.8 ton) HC “Blockbuster” which allocated 75% of the gross weight to its 3,000 lb (1.4 ton) charge.  Like many things in engineering (not just in military matters) the ratio represented a trade-off, the MC design prioritizing penetrative power and structural destruction over blast radius.  The novelty of the Tallboy & Grand Slam was that as earthquake bombs, their destructive potential could be unleashed not necessarily by achieving a direct hit on a target but by entering the ground nearby, the explosion (1) creating an underground cavity (a camouflet) and (2) transmitting a shock-wave through the target’s foundations, leading to the structure collapsing into the newly created lacuna.
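For readers who like to check the arithmetic, the charge-to-weight ratios quoted above follow directly from the figures given; below is a minimal sketch in Python, using only the bomb names and rounded weights from the paragraph above (nothing else is assumed).

```python
# Charge-to-weight ratio: the share of a bomb's gross weight made up
# of explosive filler, the basis of the MC / HC classification.
# Figures (in lb) are those quoted in the paragraph above.
bombs = {
    "Tallboy (MC)":     (12_000, 5_200),   # (gross weight, explosive charge)
    "Grand Slam (MC)":  (22_000, 9_100),
    "Blockbuster (HC)": (4_000, 3_000),
}

for name, (gross, charge) in bombs.items():
    print(f"{name}: {charge / gross:.0%} of gross weight is explosive")

# Prints (rounded): Tallboy 43%, Grand Slam 41%, Blockbuster 75%,
# matching the ratios given in the text.
```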

The word camouflet has an interesting history in both French and military mining.  Originally it meant “a whiff of smoke in the face” (from a fire or pipe) and in figurative use it was a reference to a snub or slight insult (something unpleasant delivered directly to someone); although the origin is murky, it may have been related to the earlier French verb camoufler (to disguise; to mask) which evolved also into “camouflage”.  In the specialized military jargon of siege warfare and mining (sapping), over the seventeenth to nineteenth centuries “camouflet” referred to “an underground explosion that does not break the surface, but collapses enemy tunnels or fortifications by creating a subterranean void or shockwave”.  The use of this tactic is best remembered from the Western Front in World War I (1914-1918), some of the huge craters now tourist attractions.

Under watchful eyes: Grand Ayatollah Ali Khamenei (b 1939; Supreme Leader, Islamic Republic of Iran since 1989) delivering a speech, sitting in front of the official portrait of the republic’s ever-unsmiling founder, Grand Ayatollah Ruhollah Khomeini (1900-1989; Supreme Leader, Islamic Republic of Iran, 1979-1989).  Ayatollah Khamenei seemed in 1989 an improbable choice as Supreme Leader because others were better credentialed but though cautious and uncharismatic, he has proved a great survivor in a troubled region.

Since aerial bombing began to be used as a strategic weapon, the debate over BDA (battle damage assessment) has been of great interest and the issue emerged almost as soon as the bunker buster attack on Iran was announced, focused on the extent to which the MOPs had damaged the targets, the deepest of which were concealed deep inside a mountain.  BDA is a constantly evolving science and while satellites have made analysis of surface damage highly refined, it’s more difficult to understand what has happened deep underground.  Indeed, it wasn’t until the USSBS (United States Strategic Bombing Survey) teams toured Germany and Japan in 1945-1946, conducting interviews, economic analysis and site surveys, that a useful (and substantially accurate) understanding emerged of the effectiveness of the wartime bombing.  What technological advances have since allowed (for those with the resources) is that the so-called “panacea targets” (ie critical infrastructure and the like, once dismissed by planners because the required precision was for many reasons rarely attainable) can now accurately be targeted, the USAF able to drop a bomb within a few feet of the aiming point.  As the phrase is used by the military, the Fordow Uranium Enrichment Plant is a classic “panacea target” but whether even a technically successful strike will achieve the desired political outcome remains to be seen.

Mr Trump, in a moment of exasperation, posted on Truth Social of Iran & Israel: “We basically have two countries that have been fighting so long and so hard that they don't know what the fuck they're doing.”  Actually, both know exactly WTF they're doing; it's just that Mr Trump (and many others) would prefer they didn't do it.

Donald Trump (b 1946; US president 2017-2021 and since 2025) claimed “total obliteration” of the targets while Grand Ayatollah Khamenei admitted only there had been “some damage”; which is closer to the truth may one day be revealed.  Even modelling of the effects has probably been inconclusive because the deeper one goes underground, the greater the number of variables in the natural structure, and the nature of the internal built environment will also influence blast behaviour.  All experts seem to agree much damage will have been done but what can’t yet be determined is what has been suffered by the facilities which sit as deep as 80 m (260 feet) inside the mountain although, as the name implies, “bunker busters” are designed for buried targets and it’s not always necessary for the blast directly to reach the target.  Because the shock-wave can travel through earth & rock, the effect is something like that of an earthquake and if the structure sufficiently is affected, the area may be rendered geologically too unstable to be used again for its original purpose.

Within minutes of the bombing having been announced, legal academics were being interviewed (though not by Fox News) to explain why the attacks were unlawful under international law and in a sign of the times, the White House didn't bother to discuss fine legal points like the distinction between “preventive” & “pre-emptive” strikes, preferring (like Fox News) to focus on the damage done.  However, whatever the murkiness surrounding the BDA, many analysts have concluded that even if before the attacks the Iranian authorities had not approved the creation of a nuclear weapon, the attacks will have persuaded them one is essential for “regime survival”, thus the interest, in both Tel Aviv and (despite denials) Washington DC, in “regime change”.  The consensus seems to be Grand Ayatollah Khamenei had, prior to the strike, not ordered the creation of a nuclear weapon but that all energies were directed towards completing the preliminary steps, thus the enriching of uranium to more than ten times the level required for use in power generation; the ayatollah liked to keep his options open.  So, the fear of some is that the attacks, even if they have (by weeks, months or years) delayed the Islamic Republic’s work on nuclear development, may prove counter-productive in that they convince the ayatollah to concur with the reasoning of every state which since 1945 has adopted an independent nuclear deterrent (IND).  That reasoning is not complex and hasn’t changed since a prehistoric man first picked up a stout stick to wave as a pre-lingual message to potential adversaries, warning them there would be consequences for aggression.  Although the Islamic Republic is a theocracy, those who command power are part of an opaque political institution and in the struggle which has for some time been conducted in anticipation of the death of the aged (and reportedly ailing) Supreme Leader, the matter of “an Iranian IND” is one of the central dynamics.  Many will be following what unfolds in Tehran and the observers will not only be in Tel Aviv and Washington DC because in the region and beyond, few things focus the mind like the thought of ayatollahs with A-Bombs.

Of the word "bust"

The Great Bust: The Depression of the Thirties (1962) by Jack Lang (left), highly qualified content provider Busty Buffy (b 1996, who has never been accused of misleading advertising, centre) and The people's champion, Mr Lang, bust of Jack Lang, painted cast plaster by an unknown artist, circa 1927, National Portrait Gallery, Canberra, Australia (right).  Remembered for a few things, Jack Lang (1876–1975; premier of the Australian state of New South Wales (NSW) 1925-1927 & 1930-1932) remains best known for having in 1932 been the first head of government in the British Empire to have been sacked by the Crown since William IV (1765–1837; King of the UK 1830-1837) in 1834 dismissed Lord Melbourne (1779–1848; prime minister of the UK 1834 & 1835-1841).

Those learning English must think it at least careless that things can both be (1) “razed to the ground” (totally to destroy something (typically a structure), usually by demolition or incineration) and (2) “raised to the sky” (physically lifted upwards).  The etymologies of “raze” and “raise” differ but they’re pronounced the same so it’s fortunate the spellings vary; in other troublesome examples of unrelated meanings, however, spelling and pronunciation can align, as in “bust”.  When used in the ways most directly related to human anatomy: (1) “a sculptural portrayal of a person's head and shoulders” & (2) “the circumference of a woman's chest around her breasts”, there is an etymological link but these uses wholly are unconnected with bust’s other senses.

Bust of Lindsay Lohan in white marble by Stable Diffusion.  Sculptures of just the neck and head came also to be called “busts”, the emphasis on the technique rather than the original definition.

Bust in the sense of “a sculpture of upper torso and head” dates from the 1690s and was from the sixteenth century French buste, from the Italian busto (upper body; torso), from the Latin bustum (funeral monument, tomb (although the original sense was “funeral pyre, place where corpses are burned”)) and it may have emerged (as a shortened form) from ambustum, neuter of ambustus (burned around), past participle of amburere (burn around, scorch), the construct being ambi- (around) + urere (to burn).  The alternative etymology traces a link to the Old Latin boro, the early form of the Classical Latin uro (to burn) and it’s thought the development in Italian was influenced by the Etruscan custom of keeping the ashes of the dead in an urn shaped like the person when alive.  Thus the use, common by the 1720s, of bust (a clipping of the French buste) meaning “a carving of the trunk of the human body from the chest up”.  From this came the meaning “dimension of the bosom; the measurement around a woman's body at the level of her breasts”, which evolved on the basis of a comparison with the sculptures, the base of which was described as the “bust-line”, the term still used in dress-making (and for other comparative purposes as one of the three “vital statistics” by which women are judged (bust, waist, hips), each circumference having an “ideal range”).  It’s not known when “bust” and “bust-line” came into oral use among dress-makers and related professions but both are documented since the 1880s.  Derived forms (sometimes hyphenated) include busty (tending to bustiness, thus Busty Buffy's choice of stage-name), overbust & underbust (technical terms in women's fashion referencing specific measurements) and bustier (a tight-fitting women's top which covers (most or all of) the bust).

Benito Mussolini (1883-1945; Duce (leader) & prime minister of Italy 1922-1943) standing beside his “portrait bust” (1926).

The bust was carved by Swiss sculptor Ernest Durig (1894–1962) who gained posthumous notoriety when his career as a forger was revealed with the publication of his drawings which he’d represented as being from the hand of the French sculptor Auguste Rodin (1840-1917) under whom he claimed to have studied.  Mussolini appears here in one of the subsequently much caricatured poses which were a part of his personality cult.  More than one of the Duce's counterparts in other nations was known to have made fun of some of the more outré poses and affectations, the outstretched chin, right hand braced against the hip and straddle-legged stance among the popular motifs. 

“Portrait bust” in marble (circa 1895) of Otto von Bismarck (1815-1898; chancellor of the German Empire (the "Second Reich") 1871-1890) by the German sculptor Reinhold Begas (1831-1911).

In sculpture, what had been known as the “portrait statue” came after the 1690s to be known as the “portrait bust”, both terms meaning “sculpture of upper torso and head”, and these proved a popular choice for military figures because the aspect enabled the inclusion of bling such as epaulettes, medals and other decorations; being depictions of the human figure, busts also came to be vested with special significance by the superstitious.  In early 1939, during construction of the new Reich Chancellery in Berlin, workmen dropped one of the busts of Otto von Bismarck by Reinhold Begas, breaking it at the neck.  For decades, the bust had sat in the old Chancellery and the building’s project manager, Albert Speer (1905–1981; Nazi court architect 1934-1942; Nazi minister of armaments and war production 1942-1945), knowing Adolf Hitler (1889-1945; Führer (leader) and German head of government 1933-1945 & head of state 1934-1945) believed the Reich Eagle toppling from the post-office building right at the beginning of World War I had been a harbinger of doom for the nation, kept the accident secret, hurriedly issuing a commission to the German sculptor Arno Breker (1900–1991) who carved an exact copy.  To give the fake the necessary patina, it was soaked for a time in strong, black tea, the porous quality of marble enabling the fluid to induce some accelerated aging.  Interestingly, in his (sometimes reliable) memoir (Erinnerungen (Memories or Reminiscences), published in English as Inside the Third Reich (1969)), even the technocratic Speer admitted of the accident: “I felt this as an evil omen”.

The other senses of bust (as a noun, verb & adjective) are diverse (and sometimes diametric opposites) and include: “to break or fail”; “to be caught doing something unlawful / illicit / disgusting etc”; “to debunk”; “dramatically or unexpectedly to succeed”; “to go broke”; “to break in (horses, girlfriends etc)”; “to assault”; the downward portion of an economic cycle (ie “boom & bust”); “the act of effecting an arrest” and “someone (especially in professional sport) who failed to perform to expectation”.  That’s quite a range and it has meant the creation of dozens of idiomatic forms, the best known of which include: “boom & bust”, “busted flush”, “dambuster”, “bunker buster”, “busted arse country”, “drug bust”, “cloud bust”, “belly-busting”, “bust one's ass (or butt)”, “bust a gut”, “bust a move”, “bust a nut”, “bust-down”, “bust loose”, “bust off”, “bust one's balls”, “bust-out”, “sod buster”, “bust the dust”, “myth-busting” and “trend-busting”.  In the sense of “breaking through”, bust was from the Middle English busten, a variant of bursten & bresten (to burst) and may be compared with the Low German basten & barsten (to burst).  Bust in the sense of “break”, “smash”, “fail”, “arrest” etc was a creation of mid-nineteenth century US English and is of uncertain inspiration; most etymologists seem to concur it was likely a modification of “burst” effected with a phonetic alteration but it’s not impossible it came directly as an imperfect echoic of Germanic speech.  The apparent contradiction of bust meaning both “fail” and “dramatically succeed” happened because the former was an allusion to “being busted” (ie broken) while the latter used the notion of “busting through”.