Showing posts sorted by date for query harbinger.

Thursday, August 28, 2025

Houndstooth

Houndstooth (pronounced houns-tuth)

(1) A two-color fabric pattern of broken checks (multi-color versions using the pattern now exist and are also so described).

(2) Fabric with a houndstooth pattern; an item of clothing made with such fabric.

(3) In botany, as Cynoglossum officinale (houndstongue, houndstooth, dog's tongue, gypsy flower and, due to its smell, “rats and mice”), a herbaceous plant of the family Boraginaceae.

1936: A word based on the appearance of the design; the pattern (in architecture, decorative art, fabric etc) is ancient but the descriptive term “houndstooth” has been in use only since 1936.  The shape is sometimes referred to as dogstooth (or dog's tooth) and in French it’s the more pleasing pied-de-poule (hen's foot), preferred also by the Italians.  In 1936 there must have been pedants who insisted it should have been “hound's tooth” because that does appear in some advertisements but in commercial use, houndstooth quickly was standardized.  The name was chosen as a reference directly to a dog’s tooth, not the pattern of teeth marks left by its bite.  The construct was hounds + tooth.  Hound was from the Middle English hound, from the Old English hund, from the Proto-West Germanic hund, from the Proto-Germanic hundaz and was cognate with the West Frisian hûn, the Dutch hond, the Luxembourgish Hond, the German Hund, the German Low German Hund, the Danish hund, the Faroese hundur, the Icelandic hundur, the Norwegian Bokmål hund, the Norwegian Nynorsk hund and the Swedish hund, from the pre-Germanic untós (which may be compared with the Latvian sùnt-ene (big dog)), an enlargement of the primitive Indo-European ḱwṓ (dog).  Elsewhere, the forms included the Old Irish cú (dog), the Tocharian B ku, the Lithuanian šuõ, the Armenian շուն (šun) and the Russian сука (suka).

In England, as late as the fourteenth century, “hound” remained the word in general use to describe most domestic canines while “dog” was used of a sub-type resembling the modern mastiff and bulldog.  By the sixteenth century, dog had displaced hound as the general descriptor, the latter coming to be restricted to breeds used for hunting and, in the same era, the word dog was adopted by several continental European languages as their word for mastiff.  Dog was from the Middle English dogge (source also of the Scots dug (dog)), from the Old English dogga & docga of uncertain origin.  Interestingly, the original sense appears to have been of a “common dog” (as opposed to one well-bred), much as “cur” was later used and there’s evidence it was applied especially to stocky dogs of an unpleasing appearance.  Etymologists have pondered the origin: It may have been a pet-form diminutive with the suffix -ga (similar models being frocga (frog) & picga (pig)), appended to a base dog- or doc- (the origin and meaning of these unclear).  Another possibility is the Old English dox (dark, swarthy) (compare frog from frocga) while some have suggested a link to the Proto-West Germanic dugan (to be suitable), the origin of the Old English dugan (to be good, worthy, useful), the English dow and the German taugen; the theory is based on the idea that it could have been a child’s epithet for dogs, used in the sense of “a good or helpful animal”.  Few support that and more are persuaded there may be some relationship with docce (stock, muscle), from the Proto-West Germanic dokkā (round mass, ball, muscle, doll), from which English gained dock (stumpy tail).  In fourteenth century England, hound (from the Old English hund) was thus the general word applied to all domestic canines while dog referred to sub-types (typically those close in appearance to the modern mastiff and bulldog).  In German, the form endures as der Hund (the dog) & die Hunde (the dogs) and the houndstooth pattern is Hahnentritt.  Houndstooth is a noun; the noun plural is houndsteeth.  Strictly speaking, it may be that certain uses of the plural (such as “several houndstooth jackets”) should demand “houndstooths” but this is an ugly word which should be avoided and no sources seem to list it as standard.  The same practice seems to have been adopted for handling the plural of cars called “Statesman”, “statesmen” seeming just an absurdity.

Although the classic black & white remains the industry staple, designer colors are now not uncommon.

In modern use in English, a “hound” seems to be thought of as a certain sort of dog, usually large, with a finely honed sense of smell and used (often in packs) for hunting, and the sense development may also have been influenced by the novel The Hound of the Baskervilles (1901-1902) by the physician Sir Arthur Conan Doyle (1859–1930).  The best regarded of Conan Doyle’s four Sherlock Holmes novels, it’s set in the gloomy fog of Dartmoor in England’s West Country and is the tale of the search for a “fearsome, diabolical hound of supernatural origin”.  The author's name is an example of how conventions of use influence things.  He's long been referred to as “Sir Arthur Conan Doyle” or “Conan Doyle” which would imply the surname “Conan Doyle” but his surname was “Doyle” and he was baptized with the Christian names “Arthur Ignatius Conan”, the “Conan” from his godfather.  Some academic and literary libraries do list him as “Doyle” but he's now referred to almost universally as “Conan Doyle” and the name “Arthur Doyle” would be as un-associated with him as “George Shaw” would be with George Bernard Shaw (GBS; 1856-1950).  Conan Doyle's most famous creation was of course the detective Sherlock Holmes and he wore a houndstooth deerstalker cap.  Tooth (a hard, calcareous structure present in the mouth of many vertebrate animals, generally used for biting and chewing food) was from the Middle English tothe, toth & tooth, from the Old English tōþ (tooth), from the Proto-West Germanic tanþ, from the Proto-Germanic tanþs (tooth), from the primitive Indo-European h₃dónts (tooth) and related to tusk.

Lindsay Lohan in monochrome check jacket, Dorchester Hotel, London, June 2017 (left), Lindsay Lohan in L.A.M.B. Lambstooth Sweater, Los Angeles, April 2005 (centre) and racing driver Sir Lewis Hamilton (b 1985) in a Burberry Houndstooth ensemble, Annual FIA Prize Giving Ceremony, Baku, Azerbaijan, December 2023 (right).  The Fédération Internationale de l'Automobile (the FIA; the International Automobile Federation) is world sport's dopiest regulatory body.  Although, at a distance, a wide range of fabrics look like houndstooth, some are really just simple, symmetrical monochrome checks without the distinctive pattern and, where designers have varied the shape, other descriptors (and L.A.M.B. couldn’t resist “lambstooth”) are used, something which helps also with product differentiation.  Sir Lewis, though, sticks to the classics.  Regarded as the most fashion-conscious of the Formula One drivers of his generation, it’s clear that assiduously he studies Lohanic fashion directions.

Designers consider houndstooth part of the plaid “family”, the jagged contours of the shape the point of differentiation from most which tend towards uniform, straight lines.  Although from the archaeological record it’s clear the concept of the design has an ancient lineage, what’s now thought of as the “classic” black & white houndstooth was defined in the mid-nineteenth century when it began to be produced at scale in the Scottish lowlands, in parallel with the plaid most associated with the culture, the tartan (although in some aspects the “history & cultural traditions” of tartan were a bit of a commercial construct).  Technically, a houndstooth is a two-tone (the term monochrome often used in the industry to convey the idea of “black & white” (a la photography) rather than being etymologically accurate) plaid in four bands, two of each color (in both the weft & warp weave), woven with the simple 2:2 twill.  One of the charms of the design is that with slight variations in size and scale, different effects can be achieved and color mixes are now not uncommon although the classic black & white remains the standard.
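As a rough illustration of that last technical point (a minimal sketch in Python, assumed for the example and not anything drawn from the textile trade's own tooling), the snippet below simulates warp and weft threads coloured in alternating four-thread bands and interlaced in a 2:2 twill; the grid size and display characters are arbitrary choices.

def thread_colour(n):
    # Threads run in bands of four: four dark, then four light, repeating.
    return "dark" if (n // 4) % 2 == 0 else "light"

def visible_colour(warp, weft):
    # 2:2 twill: each warp end floats over two picks then under two,
    # the interlacing point stepping along by one thread per pick.
    warp_on_top = (warp - weft) % 4 < 2
    return thread_colour(warp) if warp_on_top else thread_colour(weft)

size = 16  # two full repeats of the 8 x 8 motif
for weft in range(size):
    print("".join("#" if visible_colour(warp, weft) == "dark" else "." for warp in range(size)))

Printed to a terminal, the grid of "#" and "." shows the jagged, four-pointed check repeating every eight threads in each direction; varying the width of the colour bands changes the scale of the check, which is how the different effects mentioned above are achieved.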

Houndstooth has received the imprimatur of more than one Princess of Wales: Catherine, Princess of Wales (b 1982, left) and Diana, Princess of Wales (1961-1997, right) in a typically daring color mix.

The history in the Lowlands is murky but it seems certain the early fabrics were woven from wool which makes sense given the importance of sheep to the economy and the early garments were utilitarian, often cloak-like outer garments for those tending the flocks.  The early term was “shepherd’s check” which became first “dogstooth” and then “houndstooth”, canine teeth something with which shepherds would have been familiar because of the threat to their animals from the predations of wild dogs.  Fabric with smaller checks could be called “puppycheck”.  Interestingly, despite its striking appearance, the houndstooth pattern remained a generic and was never adopted as a family or clan symbol, a la the tartans.  It gained a new popularity in the 1930s when photographs began to appear of members of the British royal family and various gentry wearing houndstooth jackets while hunting or riding, thus the association with wealth and privilege which so appealed to the middle class who started wearing them too.  By the time designers began to put them on the catwalks, houndstooth’s future was assured.

1969 Holden Monaro GTS 350 (left), 1972 Holden Monaro GTS 308 (centre) and 1977 Chrysler Cordoba (right).

Despite the popular perception, not all the “personal luxury” Chryslers of the era and not even all the Cordobas (1975-1983) were finished in “Rich Corinthian Leather”; except for a one-off appearance in the 1975 Imperial Brochures, the Corinthian hides were exclusive to the Cordoba.  For passenger car interiors, houndstooth (rendered usually with a synthetic material) enjoyed a late mid-century spate of popularity, used for what were called generically “cloth inserts” and the use of houndstooth trended towards vehicles marketed as “sporty” whereas for luxury cars plusher fabrics like velour were preferred.  The cloth inserts were usually paired with vinyl although in some more expensive ranges they were used with leather.

Houndstooth (left), Pepita (Shepherd's Check) (centre) and Vichy Check (right).

For decades, it’s been common to refer to the optional upholstery offered by Porsche in the 1960s as “houndstooth” but according to Recaro Automotive Seating, the German concern which supplied the fabric, the correct name is “Pepita” (known also as “Shepherd’s Check”), a design built with interconnected squares.  What has happened is that “houndstooth” has for most purposes in colloquial English become a generic term, used to describe anything “houndstoothesque” and it’s an understandable trend given that a close examination would be required to determine which pattern appears on a fabric and, unless one is well-acquainted with the differences in shape, most would be none the wiser.  Nor did Recaro use “Vichy Check” for the seats they trimmed for Porsche although that erroneous claim too is sometimes made.  Further confusing the history, when Stuttgarter Karosseriewerk Reutter (Porsche’s original supplier) started production of seats used in the Porsche 356 (1948-1965) a number of fabrics were offered including one in nylon in a similar black-and-white pattern which was neither Houndstooth nor Pepita.

1967 Porsche 911S, trimmed in Recaro Pepita.

The Reutter family founded Recaro in 1963 and in December that year the first Pepita pattern fabrics were made commercially available, used on the later Porsche 356Cs, the 911 (which briefly was called the 901) & the 912.  Porsche’s best known use of the Pepita fabric was on the Recaro Sportsitz (Sport seat), first displayed at the 1965 Frankfurt Motor Show, and the seats are a prized part of the early 911S models, the first of which were delivered in the northern summer of 1966.  At that point, the Pepita fabric became a factory option for the 911 and the last use was in the Recaro Idealsitz (Ideal seat), offered only in 1970–71 in black & white, red & beige, brown & beige and blue & green.  In a nostalgic nod, Porsche returned Pepita seats to the option list for the 911 legacy model, released in 2013 to mark the car’s 50th anniversary, although Recaro was not involved in the production.

1969 Porsche 912.  The Pepita key-fob, sun visors and dashboard trim will appeal to some.

The factory at the time didn't apply the Pepita fabric quite so liberally but the originality police seem more indulgent towards departures from specification in 912s, especially if done in a way the factory might have done it; if seen on a 911, automatically, they deduct points.  The Porsche 912 (1965-1969 & (as 912E) 1976) was essentially a four-cylinder version of the 911 with less standard equipment and the early models used a version of the air-cooled flat-four from the superseded 356 (1948-1965).  It was highly successful (initially out-selling the much more expensive, six-cylinder 911) and production ceased only because the factory’s capacity was needed for the new 914 (1969-1976) which, being mid-engined, Porsche believed was a harbinger for its future sports cars, there being little belief the rear-engine configuration would endure into the 1980s.  However, the customer always being right, things didn’t work out that way and, still in high demand, the rear-engined 911 has already entered the second quarter of the twenty-first century.  The 912E was a single-season “stop-gap model” for the US market to provide an entry-level Porsche between the end of 914 production and the introduction of the front-engined 924 (1976-1988).  Like the four-cylinder 914s and the early 924s, the 912E used a Volkswagen engine, Porsche’s old 356 unit having never been made compliant with emission control regulations.  Long something of an orphan, the 912 now has a following and while there are faithful restorations, modifications are not uncommon, many with interior appointments upgraded to include those used on the more expensive 911s.

Matching numbers, matching houndstooth: 1970 Holden HG GTS 350 Monaro in Indy Orange with black detailing (paint combo code 567-122040) and houndstooth cloth seat inserts in Indy Orange & black (trim code 1199-10Z).  This car (VIN: 81837GJ255169; Model: HG81837; Chassis: HG16214M) is one of the most prized Monaros because the specification includes a 350 cubic inch (5.7 litre) small block Chevrolet V8 (L48) with the “McKinnon block”, paired with the four-speed manual Saginaw gearbox.  Holden built 405 HG GTS 350s, 264 as manuals and 141 with the two-speed Powerglide automatic transmission.  The “McKinnon block” is a reference to the General Motors (GM) McKinnon Industries plant in St. Catharines, Ontario where the engines were built; the “American” cars exported to the UK, Australia, New Zealand and elsewhere in the Commonwealth often came from Canada because of the preferential tariff arrangements.

Very 1970s: GM's Indy Orange houndstooth fabric; in the US it was also offered in the Chevrolet Camaro.

Introduced in 1968, the Holden Monaro was the car which triggered Australia’s brief flirtation with big coupés (big in local terms; in US nomenclature the cars were “compact” size), a fad which would fade away by the mid 1970s.  It had been Ford which had first tested the market with a Falcon two-door hardtop (XM, 1964-1965 & XP, 1965-1966) but when the restyled model was released, it was again based on the US Falcon and the range no longer included a two-door hardtop, the wildly successful Mustang having rendered it unnecessary.  There was still a two-door Falcon sedan in the US range but it was thought to have limited appeal in Australia and was never offered there, so Ford didn’t have a model comparable with the Monaro until the XA Falcon Hardtop made its debut late in 1972, although by then the brief moment had passed.  While the Falcon Hardtop remained successful as a race-car, sales never met expectations, compelling the factory to produce a number of promotional "special models", unchanged in mechanical specification but usually with distinctive paint schemes and "bundled options", the latter at a notional discount.

Friday, July 11, 2025

Dixiecrat

Dixiecrat (pronounced dik-see-krat)

(1) In US political history, a member of a faction of southern Democrats stressing states' rights and opposed to the civil rights programs of the Democratic Party, especially a southern Democrat who left the party in 1948 to support candidates of the States' Rights Democratic Party.

(2) In historic US use, a member of the US Democratic Party from the southern states (especially one of the former territories of the Confederacy), holding socially conservative views, supporting racial segregation and the continued entrenchment of a white hegemony.

1948: A portmanteau word of US origin, the construct being Dixie + (Demo)crat.  Wholly unrelated to other meanings, Dixie (also as Dixieland) in this context is a reference to the southern states of the United States, especially those formerly part of the Confederacy.  The origin is contested, the most supported theory being it’s derived from the Mason-Dixon Line, a historic (if not entirely accurate) delineation between the "free" North and "slave-owning" South.  Another idea is it was picked up from any of several songs with this name, especially the minstrel song Dixie (1859) by (northerner) Daniel Decatur Emmett (1815-1904), popular as a Confederate war song although most etymologists hold this confuses cause and effect, the word long pre-dating any of the known compositions.  There’s also a suggested link to the nineteenth-century nickname of New Orleans, from the dixie, a Confederate-era ten-dollar bill on which was printed the French dix (ten) but again, it came later.  The –crat suffix was from the Ancient Greek κράτος (krátos) (power, might), as used in words of Ancient Greek origin such as democrat and aristocrat; the ultimate root was the primitive Indo-European kret (hard).  Dixiecrat is a noun and Dixiecratic is an adjective; the noun plural is Dixiecrats.  The noun Dixiecratocracy (also as dixieocracy) was a humorous coining speculating about the nature of a Dixiecrat-run government; it was built on the model of kleptocracy, plutocracy, meritocracy, gerontocracy etc.

The night old Dixie died.

Former Dixiecrat, Senator Strom Thurmond (1902-2003; senator (Republican) for South Carolina 1954-2003) lies in state, Columbia, South Carolina, June 2003.

Universally called Dixiecrats, the States' Rights Democratic Party was formed in 1948 as a dissident breakaway from the Democratic Party.  Its core platform was permanently to secure the rights of states to legislate and enforce racial segregation and exclude the federal government from intervening in these matters.  Politically and culturally, it was a continuation of the disputes and compromises which emerged in the aftermath of the US Civil War almost a century earlier.  The Dixiecrats took control of the party machine in several southern states and contested the elections of 1948 with South Carolina governor Strom Thurmond as their presidential nominee but enjoyed little support outside the deep South and by 1952 most had returned to the Democratic Party.  However, in the following decades, they achieved a much greater influence as a southern faction than ever was achieved as a separatist party.  The shift in the south towards support for the Republican Party dates from this time and by the 1980s, the Democratic Party's control of presidential elections in the South had faded and many of the Dixiecrats had joined the Republicans.

US Electoral College map, 1948.

In the 1948 presidential election, the Dixiecrats didn’t enjoy the success polls had predicted (although that was the year of the infamous “Dewey Defeats Truman” headline and the polls got much wrong), carrying only four states, all south of the Mason-Dixon line, and not even the antics of one “faithless elector” (one selected as an elector for the Democratic ticket who instead cast his vote for the Dixiecrats) were sufficient to add Tennessee to the four (South Carolina, Mississippi, Alabama, and Louisiana) won.  Nor did they in other states gain sufficient support to act as “spoilers” as Ross Perot (1930–2019) in 1992 & 1996 and Ralph Nader (b 1934) in 2000 achieved, the “narrowing of margins” in specific instances being of no immediate electoral consequence in the US system.  With that, the Dixiecrats (in the sense of the structure of the States' Rights Democratic Party) in a sense vanished but as an idea they remained for decades a potent force within the Democratic Party and their history is an illustration of why the often-quoted dictum by historian Professor Richard Hofstadter (1916–1970): “The role of third parties is to sting like a bee, then die” needs a little nuance.  What the Dixiecrats did after 1948 was not die but instead undergo a kind of “resurrection without crucifixion”, emerging to “march through the institutions” of the Democratic Party, existing as its southern faction.

That role was for generations politically significant and an example of why the “third party” experience in the US historically wasn’t directly comparable with political behaviour elsewhere in the English-speaking world, where “party discipline” tended to be “tight” with votes on the floors of parliaments almost always following party lines.  Until recent years (and this is something the “Trump phenomenon” radically has at least temporarily almost institutionalized), there was often only loose party discipline applied within the duopoly, Democrats and Republicans sometimes voting together on certain issues because the politicians were practical people who wished to be re-elected and understood what Tip O'Neill (1912–1994; (Democrat) speaker of the US House of Representatives 1977-1987) meant when he said “All politics is local”.  Structurally, that means “third parties” can operate in the US and achieve stuff (for good or evil) as the Dixiecrats and later the Republicans’ Tea Party Movement proved; it’s just that they do it as factions within the duopoly and that’s not unique, the Australian National Party (a re-branding of the old Country Party) being really a regional pressure group of political horse traders disguised as a political party.

US Electoral College map, 1924.

The 1924 Electoral College results were a harbinger of the later Dixiecrat movement and a graphical representation of terms such as "solid South" or "south of the Mason-Dixon Line".  At the time of the 1924 election, slavery in the South was still in living memory.  Although there was fracturing at the edges, the "solid south" did remain a Democratic Party stronghold until the civil rights legislation of the 1960s and it was the well-tuned political antennae of Texan Lyndon Johnson (LBJ, 1908–1973; US president 1963-1969) which picked up the implications and consequences of the reforms his skills had ushered through the Congress: "I think I've just lost us the South" he was heard to remark after signing landmark civil rights legislation into law.

In recent years, what has changed in the US is the Republicans and Democrats have become the captive organizations of a tiny number of dedicated political operatives pursuing either their own ideological agendas or (more typically) those of whoever provides the funding.  The practical implication of that is the elections which now most matter are the primaries (where candidates for the election proper are selected) and because primary contests are voted on by a relative handful, outcomes are easier to influence and control than in general elections where there are millions to nudge.  Party discipline has thus become tighter than can often be seen on the floor of the House of Commons in the UK, not because the ideological commitments of politicians within parties have coalesced but because they’re now terrified of being “primaried” if they vote against the party line.  Re-election is a powerful inducement because the money politicians make during their careers is many, many times what might be expected given their notional earnings from their salary and entitlements.  There are few easier ways to get rich, thus the incentive to “toe the party line”.  This behavioural change, mapped onto something which structurally remains unchanged, is one of the many factors which have produced a country now apparently as polarized as ever it has been.  The nature of that polarization is sometimes misunderstood because of the proliferation of “red state, blue state” maps of the US which make the contrast between the “corrupting coastlines” and “flyover states” seem so stark; each state is of course a shade of purple (some darker, some lighter) but because of the way the two parties now operate, politics as it is practiced tends to represent the extreme, radical elements which now control the machines.  So while in the last twenty-odd years there’s been much spoken about “the 1%” in the sense of the tiny number of people who own or control so much, it’s political scientists and historians who much fret over the less conspicuous “1%” able to maintain effective control of the two parties, something of even greater significance because the state has put in place some structural impediments to challenging the two-party political duopoly.

In the US, the state does not (in a strict legal or constitutional sense of the word) “own” the Republican or Democratic Parties because they are “private” organizations protected by the constitution’s First Amendment (freedom of association).  However, over the years, something biologists would recognize as “symbiosis” has evolved as the state and the parties (willingly and sometimes enthusiastically) have become entangled to the extent a structural analysis would recognize the parties as quasi-public although not quite at the status familiar elsewhere as quangos (quasi-autonomous non-government organizations).  Despite being “private concerns”, the parties routinely conduct state-regulated primaries to select candidates and in many cases these are funded by tax revenue and administered by state electoral instrumentalities.  Beyond that, it needs to be remembered that to speak of a “US national election” (as one might of a “UK general election”) is misleading because as a legal construct such events are really 50 elections run by the states with electoral laws not wholly aligned (thus the famous (or dreaded, depending on one’s position) Iowa caucuses) and in many states, it’s state law which regulates who can vote in party primaries, some permitting “open” primaries in which any lawfully enrolled voter is allowed to cast a ballot while others run “closed” events, restricting participation to registered members of the relevant party.  What that means is that in some places a citizen need not be a registered member of a party to vote in its primary.  That done, those who prevail in a primary further are advantaged because many states have laws setting parameters governing who may appear on a ballot paper and most of them provide an easier path for the Republican and Democratic Party candidates by virtue of having granted both “major party” status.  As objects, the two parties, uniquely, are embedded in the electoral apparatus and the interaction of ballot access laws, debate rules and campaign finance rules means the two function as state-sponsored actors; while not quite structurally duopolistic, they operate in a protected environment with the electoral equivalent of “high tariff barriers”.

Elon Musk (left) and Donald Trump (right), with Tesla Cybertruck (AWD Foundation Series), the White House, March, 2025.  It seemed like a good idea at the time.

Given all that, Elon Musk’s (b 1971) recent announcement he was planning to launch a “third party” (actually the US has many political parties, the “third party” tag used as a synecdoche for “not one of the majors”) might seem “courageous” and surprised many who thought the experience of his recent foray into political life might have persuaded him pursuits like EVs (electric vehicles), digging tunnels (he deserves praise for naming that SpaceX spin-off: “The Boring Company”) and travelling to Mars were more fulfilling.  However, Mr Musk believes the core of the country’s problems lies in the way its public finances are now run on the basis of the Dick Cheney (b 1941; US vice president 2001-2009) doctrine (“Deficits don’t matter”) and, having concluded neither of the major parties is prepared to change the paradigm which he believes is leading the US to a fiscal implosion, a third party is the only obvious vehicle.  In Western politics, ever since shades of “socialism” and “capitalism” defined the democratic narrative, the idea of a “third way” has been a lure for theorists and practitioners with many interpretations of what is meant but all have in common what Mr Musk seems to be suggesting: finding the middle ground and offering it to those currently voting for one or other of the majors only because “your extremists are worse than our extremists”.  Between extremes there’s much scope for positioning (which will be variable between “social” & “economic” issues) and, given his libertarian instincts, it seems predictable Mr Musk’s economic vision will be “centre-right” rather than “centre-left” but presumably he’ll flesh out the details as his venture evolves.

Mr Musk can’t be accused of creating a “third party” because he wants to become POTUS (president of the US).  As a naturalized US citizen, Mr Musk is ineligible because Article II, Section 1, Clause 5 of the constitution restricts the office to those who are a “natural born Citizen”.  Because the US Supreme Court (USSC) has never handed down a definitive ruling on the matter it’s not absolutely certain what that phrase means but the consensus among legal scholars is it refers to someone who was at birth a US citizen.  That need not necessitate being born on the soil of the US or its territories because US citizens often are born in other countries (especially to those on military or diplomatic duty) and even in international waters; indeed, there would appear to be no constitutional impediment to someone born in outer space (or, under current constitutional interpretation, on Mars) becoming POTUS provided they were at the time of birth a US citizen.  Nor does it seem an interpretation of the word “natural” could be used to exclude a US citizen conceived through the use of some sort of “technology” such as IVF (In Vitro Fertilization).

Lindsay Lohan, potential third party POTUS.

As a naturalized US citizen, Elon Musk can’t become POTUS so his new party (tentatively called the “America” Party) will have to nominate someone else and the constitution stipulates (Article II, Section 1, Clause 5): “No Person except a natural born Citizen, or a Citizen of the United States, at the time of the Adoption of this Constitution, shall be eligible to the Office of President; neither shall any Person be eligible to that Office who shall not have attained to the Age of thirty five Years, and been fourteen Years a Resident within the United States”.  The age requirement is unambiguous and in his Commentaries on the Constitution of the United States (1833), Justice Joseph Story (1779–1845; associate justice of the US Supreme Court 1812-1845) explained the residence requirement was “…not an absolute inhabitancy within the United States during the whole period; but such an inhabitancy as includes a permanent domicil in the United States.”  That means Mr Musk can consider nominating Lindsay Lohan for president.  She’d apparently flirted with the idea of running in 2020 but at that point would have been a few months too young; on all grounds she’ll be eligible for selection in 2028 and many would be attracted to the idea of Lindsay Lohan having her own nuclear weapons.

Whether or not it’s “courageous” (or even “heroic”) to build a new third party in the US, time will tell, but certainly it’s ambitious; Mr Musk, however, is also a realist and may not be planning to have a presidential candidate on the ballot in all 50 states or even contest every seat in both houses of Congress.  As he’ll have observed in a number of countries, “third parties” need neither parliamentary majorities nor executive office to achieve decisive influence over policy, some with comparatively little electoral support able to achieve “balance of power” status in legislatures provided those votes are clustered in the right places.  Additionally, because the polarized electorate has delivered such close results in the House & Senate, the math suggests a balance of power may be attainable with fewer seats than historically would have been demanded and under the US system of fixed terms, an administration cannot simply declare such a congress “unworkable” and call another election (a common tactic in the Westminster system); it must, for at least two years, work with what the people have elected, even if that includes an obstreperous third party. Still, the challenges will be onerous, even before the “dirty tricks” departments of the major parties start searching for skeletons in the closets of third party candidates (in a rare example of bipartisanship the Republicans and Democrats will probably do a bit of intelligence-sharing on that project) and the history is not encouraging.

It was the Republican Party which in the 1850s was the last “third party” to make the transition to become a “major” and not since 1996 has such a candidate in a presidential contest secured more than 5% of the national vote.  In the Electoral College, not since 1968 has a third-party candidate carried any states and 1912 was the last time a third-party nominee finished second (and 1912 was a bit of a “special case” in which the circumstances were unusually propitious for challenges to the majors).  Still, with (1) the polls recording a general disillusionment with the major parties and institutions of state and (2) Mr Musk’s wealth able to buy much advertising and “other forms” of influence, prospects for a third party may be untypically bright in the 2028 elections and 2030 mid-terms.  There are no more elections for Donald Trump (b 1946; US president 2017-2021 and since 2025) and it seems underestimated even now just what an aberration he is in the political cycle.  While his use of techniques and tactics from other fields truly has since 2016 been disruptive, what he has done is unlikely to be revolutionary because it is all so dependent on his presence and hands on the levers of power.  When he leaves office, without the “dread and awe” the implied threat of his displeasure evokes, business may return to something closer to what we still imagine “normal” to be.

Monday, June 30, 2025

Bunker

Bunker (pronounced buhng-ker)

(1) A large bin or receptacle; a fixed chest or box.

(2) In military use, historically a fortification set mostly below the surface of the ground with overhead protection provided by logs and earth or by concrete and fitted with above-ground embrasures through which guns may be fired.

(3) A fortification set mostly below the surface of the ground and used for a variety of purposes.

(4) In golf, an obstacle, classically a sand trap but sometimes a mound of dirt, constituting a hazard.

(5) In nautical use, to provide fuel for a vessel.

(6) In nautical use, to convey bulk cargo (except grain) from a vessel to an adjacent storehouse.

(7) In golf, to hit a ball into a bunker.

(8) To equip with or as if with bunkers.

(9) In military use, to place personnel or materiel in a bunker or bunkers (sometimes as “bunker down”).

1755–1760: From the Scottish bonkar (box, chest (also “seat” in the sense of “bench”)), of obscure origin but etymologists conclude the use relating to furniture hints at a relationship with banker (bench).  Alternatively, it may be from a Scandinavian source such as the Old Swedish bunke (boards used to protect the cargo of a ship).  The meaning “receptacle for coal aboard a ship” was in use by at least 1839 (coal-burning steamships coming into general use in the 1820s).  The use to describe the obstacles on golf courses is documented from 1824 (probably from the extended sense “earthen seat” which dates from 1805) but perhaps surprisingly, the familiar sense from military use (dug-out fortification) seems not to have appeared before World War I (1914-1918) although the structures so described had for millennia existed.  “Bunkermate” was army slang for the individual with whom one shares a bunker while the now obsolete “bunkerman” (“bunkermen” the plural) referred to someone (often the man in charge) who worked at an industrial coal storage bunker.  Bunker & bunkerage are nouns, bunkering is a noun & verb, bunkered is a verb and bunkerish, bunkeresque, bunkerless & bunkerlike are adjectives; the noun plural is bunkers.

Just as ships called “coalers” were used to transport coal to and from shore-based “coal stations”, it was “oilers” which took oil to storage tanks or out to sea to refuel ships (a common naval procedure) and these STS (ship-to-ship) transfers were called “bunkering” as the black stuff was pumped, bunker-to-bunker.  That the coal used by steamships was stored on-board in compartments called “coal bunkers” led ultimately to another derived term: “bunker oil”.  When in the late nineteenth century ships began the transition from being fuelled by coal to burning oil, the receptacles of course became “oil bunkers” (among sailors nearly always clipped to “bunker”) and as refining processes evolved, the fuel specifically produced for oceangoing ships came to be called “bunker oil”.

Bunker oil is “dirty stuff”, a highly viscous, heavy fuel oil which is essentially the residue of crude oil refining; it’s that which remains after the more refined and volatile products (gasoline (petrol), kerosene, diesel etc) have been extracted.  Until late in the twentieth century, the orthodox view of economists was its use in big ships was a good thing because it was a product for which industry had little other use and, as essentially a by-product, it was relatively cheap.  It came in three flavours: (1) Bunker A: Light fuel oil (similar to a heavy diesel), (2) Bunker B: An oil of intermediate viscosity used in engines larger than marine diesels but smaller than those used in the big ships and (3) Bunker C: Heavy fuel oil used in container ships and such which use VLD (very large displacement), slow-running engines with a huge reciprocating mass.  Because of its composition, Bunker C especially produced much pollution and although much of this happened at sea (unseen by most but with obvious implications), when ships reached harbor to dock, all the smoke and soot became obvious.  Over the years, the worst of the pollution from the burning of bunker oil greatly has been reduced (the work underway even before the Greta Thunberg (b 2003) era), sometimes by the simple expedient of spraying a mist of water through the smoke.

Floor-plans of the upper (Vorbunker) and lower (Führerbunker) levels of the structure now commonly referred to collectively as the Führerbunker.

History’s most infamous bunker remains the Berlin Führerbunker in which Adolf Hitler (1889-1945; Führer (leader) and German head of government 1933-1945 & head of state 1934-1945) spent much of the last few months of his life.  In the architectural sense there were a number of Führerbunkers built, one at each of the semi-permanent Führerhauptquartiere (Führer Headquarters) created for the German military campaigns and several others built where required but it’s the one in Berlin which is remembered as “the Führerbunker”.  Before 1944, when the air raids by the RAF (Royal Air Force) and USAAF (US Army Air Force) intensified, the term Führerbunker seems rarely to have been used other than by the architects and others involved in their construction and it wasn’t a designation like Führerhauptquartiere which the military and other institutions of state shifted between locations (rather as “Air Force One” is attached not to a specific airframe but to whatever aircraft in which the US president is travelling).  In subsequent historical writing, the term Führerbunker tends often to be applied to the whole, two-level complex in Berlin and although it was only the lower layer which officially was designated as that, for most purposes the distinction is not significant.  In military documents, after January 1945 the Führerbunker was referred to as Führerhauptquartiere.

Führerbunker tourist information board, Berlin, Germany.

Only an information board at the intersection of den Ministergärten and Gertrud-Kolmar-Straße, erected by the German Government in 2006 prior to that year's FIFA (Fédération Internationale de Football Association (International Federation of Association Football)) World Cup, now marks the place on Berlin's Wilhelmstrasse 77 where once the Führerbunker was located.  The Soviet occupation forces razed the new Reich Chancellery and demolished all the bunker's above-ground structures but the subsequent GDR (Deutsche Demokratische Republik (German Democratic Republic; the old East Germany) 1949-1990) abandoned attempts completely to destroy what lay beneath.  Until after the fall of the Berlin Wall (1961-1989) the site remained unused and neglected, “re-discovered” only during excavations by property developers, the government insisting on the destruction of whatever was uncovered and, sensitive still to the spectre of “Neo-Nazi shrines”, for years the bunker’s location was never divulged, even as unremarkable buildings (an unfortunate aspect of post-unification Berlin) began to appear on the site.  Most of what would have covered the Führerbunker’s footprint is now a supermarket car park.

The first part of the complex to be built was the Vorbunker (upper bunker or forward bunker), an underground facility of reinforced concrete intended only as a temporary air-raid shelter for Hitler and his entourage in the old Reich Chancellery.  Substantially completed during 1936-1937, it was until 1943 listed in documents as the Luftschutzbunker der Reichskanzlei (Reich Chancellery Air-Raid Shelter), the Vorbunker label applied only in 1944 when the lower level (the Führerbunker proper) was appended.  In mid-January 1945, Hitler moved into the Führerbunker and, as the military situation deteriorated, his appearances above ground became less frequent until by late March he rarely saw the sky.  Finally, on 30 April, he committed suicide.

Bunker Busters

Northrop Grumman publicity shot of B-2 Spirit from below, showing the twin bomb-bay doors through which the GBU-57s are released.

Awful as they are, there's an undeniable beauty in the engineering of some weapons and it's unfortunate humankind never collectively has resolved exclusively to devote such ingenuity to stuff other than blowing each other up.  That’s not a new sentiment, being one philosophers and others have for millennia expressed in various ways although since the advent of nuclear weapons, concerns understandably have become heightened.  Like every form of military technology ever deployed, once the “genie is out of the bottle” the problem is there to be managed and at the dawn of the atomic age, delivering a lecture in 1936, the British chemist and physicist Francis Aston (1877–1945) (who created the mass spectrograph, winning the 1922 Nobel Prize in Chemistry for his use of it to discover and identify the isotopes in many non-radioactive elements and for his enunciation of the whole number rule) observed:

There are those about us who say that such research should be stopped by law, alleging that man's destructive powers are already large enough.  So, no doubt, the more elderly and ape-like of our ancestors objected to the innovation of cooked food and pointed out the great dangers attending the use of the newly discovered agency, fire.  Personally, I think there is no doubt that sub-atomic energy is available all around us and that one day man will release and control its almost infinite power.  We cannot prevent him from doing so and can only hope that he will not use it exclusively in blowing up his next door neighbor.

The use in June 2025 by the USAF (US Air Force) of fourteen of its Boeing GBU-57 (Guided Bomb Unit-57) Massive Ordnance Penetrator (MOP) bombs against underground targets in Iran (twelve on the Fordow Uranium Enrichment Plant and two on the Natanz nuclear facility) meant “Bunker Buster” hit the headlines.  Carried by the Northrop B-2 Spirit heavy bomber (built between 1989-2000), the GBU-57 is a 14,000 kg (30,000 lb) bomb with a casing designed to withstand the stress of penetrating through layers of reinforced concrete or thick rock.  “Bunker buster” bombs have been around for a while, the ancestors of today’s devices first built for the German military early in World War II (1939-1945) and the principle remains unchanged to this day: up-scaled armor-piercing shells.  The initial purpose was to produce a weapon with a casing strong enough to withstand the forces imposed when impacting reinforced concrete structures, the idea simple in that what was needed was a delivery system which could “bust through” whatever protective layers surrounded a target, allowing the explosive charge to do damage where needed rather than wastefully being expended on an outer skin.  The German weapons proved effective but inevitably triggered an “arms race” in that as the war progressed, the concrete layers became thicker, walls over 2 metres (6.6 feet) thick and ceilings of 5 metres (16 feet) being constructed by 1943.  Technological development continued and the idea extended to rocket-propelled bombs optimized both for armor-piercing and aerodynamic efficiency, velocity a significant “mass multiplier” which made the weapons still more effective.

USAF test-flight footage of Northrop B-2 Spirit dropping two GBU-57 "Bunker Buster" bombs.

Concurrent with this, the British developed the first true “bunker busters”, building on the idea of the naval torpedo, one aspect of which was that, by exploding a short distance from its target, it could be highly damaging because it took advantage of one of the properties of water (quite strange stuff according to those who study it): it doesn’t compress.  What that meant was it was often the “shock wave” of the water rather than the blast itself which breached a hull, the same principle exploited by the famous “bouncing bombs” used for the RAF’s “Dambuster” (Operation Chastise, 17 May 1943) raids on German dams.  Because of the way water behaved, it wasn’t necessary to score the “direct hit” which had been the ideal in the early days of aerial warfare.

RAF Bomber Command archive photograph of Avro Lancaster (built between 1941-1946) in flight with Grand Slam mounted (left) and a comparison of the Tallboy & Grand Slam (right), illustrating how the latter was in most respects a scaled-up version of the former.  To carry the big Grand Slams, 32 “B1 Special” Lancasters were in 1945 built with up-rated Rolls-Royce Merlin V12 engines, the removal of the bomb doors (the Grand Slam carried externally, its dimensions exceeding internal capacity), deleted front and mid-upper gun turrets, no radar equipment and a strengthened undercarriage.  Such was the concern with weight (especially for take-off) that just about anything non-essential was removed from the B1 Specials, even three of the four fire axes and its crew door ladder.  In the US, Boeing went through a similar exercise to produce the run of “Silverplate” B-29 Superfortresses able to carry the first A-bombs used in August, 1945. 

Best known of the British devices were the so-called “earthquake bombs”, the Tallboy (12,000 lb; 5.4 ton) & Grand Slam (22,000 lb; 10 ton) which, despite the impressive bulk, were classified by the War Office as “medium capacity”.  The terms “Medium Capacity” (MC) & “High Capacity” (HC) referenced not the gross weight or physical dimensions but the ratio of explosive filler to the total weight of the construction (ie how much was explosive compared to the casing and ancillary components).  Because both had thick casings to ensure penetration deep into hardened targets (bunkers and other structures encased in rock or reinforced concrete) before exploding, the internal dimensions accordingly were reduced compared with the ratio typical of contemporary ordnance.  A High Capacity bomb (a typical “general-purpose” bomb) had a thinner casing and a much higher proportion of explosive (sometimes over 70% of total weight).  These were intended for area bombing (known also as “carpet bombing”) and caused wide blast damage whereas the Tallboy & Grand Slam were penetrative, with casings optimized for aerodynamic efficiency, their supersonic travel working as a mass-multiplier.  The Tallboy’s 5,200 lb (2.3 ton) explosive load was some 43% of its gross weight while the Grand Slam’s 9,100 lb (4 ton) absorbed 41%; this may be compared with the “big” 4,000 lb (1.8 ton) HC “Blockbuster” which allocated 75% of the gross weight to its 3,000 lb (1.4 ton) charge.  Like many things in engineering (not just in military matters) the ratio represented a trade-off, the MC design prioritizing penetrative power and structural destruction over blast radius.  The novelty of the Tallboy & Grand Slam was that as earthquake bombs, their destructive potential was able to be unleashed not necessarily by achieving a direct hit on a target but by entering the ground nearby, the explosion (1) creating an underground cavity (a camouflet) and (2) transmitting a shock-wave through the target’s foundations, leading to the structure collapsing into the newly created lacuna.
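For anyone wanting to check the arithmetic, the short Python sketch below (illustrative only, using just the nominal figures quoted above) expresses the MC/HC distinction as the filler-to-gross-weight ratio:

# Charge-to-weight ratios for the three bombs mentioned above (nominal figures, lb).
bombs = {
    "Tallboy (MC)": (12_000, 5_200),      # (gross weight, explosive filler)
    "Grand Slam (MC)": (22_000, 9_100),
    "Blockbuster (HC)": (4_000, 3_000),
}
for name, (gross, filler) in bombs.items():
    print(f"{name}: {filler / gross:.0%} of gross weight is explosive")

Run as written, it prints 43%, 41% and 75% respectively, matching the figures in the text and making the MC/HC trade-off (casing strength versus blast) plain.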

The etymology of camouflet has an interesting history in both French and military mining.  Originally it meant “a whiff of smoke in the face” (from a fire or pipe) and in figurative use it was a reference to a snub or slight insult (something unpleasant delivered directly to someone); although the origin is murky, it may have been related to the earlier French verb camoufler (to disguise; to mask) which evolved also into “camouflage”.  In the specialized military jargon of siege warfare or mining (sapping), over the seventeenth to nineteenth centuries “camouflet” referred to “an underground explosion that does not break the surface, but collapses enemy tunnels or fortifications by creating a subterranean void or shockwave”.  The use of this tactic is best remembered from the Western Front in World War I, some of the huge craters now tourist attractions.

Under watchful eyes: Grand Ayatollah Ali Khamenei (b 1939; Supreme Leader, Islamic Republic of Iran since 1989) delivering a speech, sitting in front of the official portrait of the republic’s ever-unsmiling founder, Grand Ayatollah Ruhollah Khomeini (1900-1989; Supreme Leader, Islamic Republic of Iran, 1979-1989).  Ayatollah Khamenei seemed in 1989 an improbable choice as Supreme Leader because others were better credentialed but though cautious and uncharismatic, he has proved a great survivor in a troubled region.

Since aerial bombing began to be used as a strategic weapon, of great interest has been the debate over the BDA (battle damage assessment) and this issue emerged almost as soon as the bunker buster attack on Iran was announced, focused on the extent to which the MOPs had damaged the targets, the deepest of which were concealed deep inside a mountain.  BDA is a constantly evolving science and while satellites have made analysis of surface damage highly refined, it’s more difficult to understand what has happened deep underground.  Indeed, it wasn’t until the USSBS (United States Strategic Bombing Survey) teams toured Germany and Japan in 1945-1946, conducting interviews, economic analysis and site surveys, that a useful (and substantially accurate) understanding emerged of the effectiveness of bombing.  What technological advances have since allowed for those with the resources is that the so-called “panacea targets” (ie critical infrastructure and such, once dismissed by planners because the required precision was for many reasons rarely attainable) can now accurately be targeted, the USAF able to drop a bomb within a few feet of the aiming point.  As the phrase is used by the military, the Fordow Uranium Enrichment Plant is a classic “panacea target” but whether even a technically successful strike will achieve the desired political outcome remains to be seen.

Mr Trump, in a moment of exasperation, posted on Truth Social of Iran & Israel: “We basically have two countries that have been fighting so long and so hard that they don't know what the fuck they're doing."  Actually, both know exactly WTF they're doing; it's just Mr Trump (and many others) would prefer they didn't do it.

Donald Trump (b 1946; US president 2017-2021 and since 2025) claimed “total obliteration” of the targets while Grand Ayatollah Khamenei admitted only there had been “some damage”; which of the two is closer to the truth may one day be revealed.  Even modelling of the effects has probably been inconclusive because the deeper one goes underground, the greater the number of variables in the natural structure and the nature of the internal built environment will also influence blast behaviour.  All experts seem to agree much damage will have been done but what can’t yet be determined is what has been suffered by the facilities which sit as deep as 80 m (260 feet) inside the mountain although, as the name implies, “bunker busters” are designed for buried targets and it’s not always required for the blast directly to reach the target.  Because the shock-wave can travel through earth & rock, the effect is something like that of an earthquake and if the structure sufficiently is affected, it may be the area is rendered geologically too unstable again to be used for its original purpose.

Within minutes of the bombing having been announced, legal academics were being interviewed (though not by Fox News) to explain why the attacks were unlawful under international law and in a sign of the times, the White House didn't bother to discuss fine legal points like the distinction between "preventive & pre-emptive strikes", preferring (like Fox News) to focus on the damage done.  However, whatever the murkiness surrounding the BDA, many analysts have concluded that even if before the attacks the Iranian authorities had not approved the creation of a nuclear weapon, this attack will have persuaded them one is essential for “regime survival”, thus the interest in both Tel Aviv and (despite denials) Washington DC in “regime change”.  The consensus seems to be Grand Ayatollah Khamenei had, prior to the strike, not ordered the creation of a nuclear weapon but that all energies were directed towards completing the preliminary steps, thus the enriching of uranium to ten times the level required for use in power generation; the ayatollah liked to keep his options open.  So, the fear of some is the attacks, even if they have (by weeks, months or years) delayed the Islamic Republic’s work on nuclear development, may prove counter-productive in that they convince the ayatollah to concur with the reasoning of every state which since 1945 has adopted an independent nuclear deterrent (IND).  That reasoning was not complex and hasn’t changed since first a prehistoric man picked up a stout stick to wave as a pre-lingual message to potential adversaries, warning them there would be consequences for aggression.  Although the Islamic Republic is a theocracy, those who command power in it are part of an opaque political institution and in the struggle which has for some time been conducted in anticipation of the death of the aged (and reportedly ailing) Supreme Leader, the matter of “an Iranian IND” is one of the central dynamics.  Many will be following what unfolds in Tehran and the observers will not be only in Tel Aviv and Washington DC because in the region and beyond, few things focus the mind like the thought of ayatollahs with A-Bombs.

Of the word "bust"

The Great Bust: The Depression of the Thirties (1962) by Jack Lang (left), highly qualified content provider Busty Buffy (b 1996, who has never been accused of misleading advertising, centre) and The people's champion, Mr Lang, bust of Jack Lang, painted cast plaster by an unknown artist, circa 1927, National Portrait Gallery, Canberra, Australia (right).  Remembered for a few things, Jack Lang (1876–1975; premier of the Australian state of New South Wales (NSW) 1925-1927 & 1930-1932) remains best known for having in 1932 been the first head of government in the British Empire to have been sacked by the Crown since William IV (1765–1837; King of the UK 1830-1837) in 1834 dismissed Lord Melbourne (1779–1848; prime minister of the UK 1834 & 1835-1841).

Those learning English must think it at least careless that things can both be (1) “razed to the ground” (totally to destroy something (typically a structure), usually by demolition or incineration) and (2) “raised to the sky” (physically lifted upwards).  The etymologies of “raze” and “raise” differ but they’re pronounced the same so it’s fortunate the spellings vary; in other troublesome examples of unrelated meanings, however, spelling and pronunciation can align, as in “bust”.  When used in the ways most directly related to human anatomy: (1) “a sculptural portrayal of a person's head and shoulders” & (2) “the circumference of a woman's chest around her breasts”, there is an etymological link but these uses wholly are unconnected with bust’s other senses.

Bust of Lindsay Lohan in white marble by Stable Diffusion.  Sculptures of just the neck and head came also to be called “busts”, the emphasis on the technique rather than the original definition.

Bust in the sense of “a sculpture of upper torso and head” dates from the 1690s and was from the sixteenth century French buste, from the Italian busto (upper body; torso), from the Latin bustum (funeral monument, tomb (although the original sense was “funeral pyre, place where corpses are burned”)) and it may have emerged (as a shortened form) from ambustum, neuter of ambustus (burned around), past participle of amburere (burn around, scorch), the construct being ambi- (around) + urere (to burn).  The alternative etymology traces a link to the Old Latin boro, the early form of the Classical Latin uro (to burn) and it’s thought the development in Italian was influenced by the Etruscan custom of keeping the ashes of the dead in an urn shaped like the person when alive.  Thus the use, common by the 1720s, of bust (a clipping of the French buste) to mean “a carving of the trunk of the human body from the chest up”.  From this came the meaning “dimension of the bosom; the measurement around a woman's body at the level of her breasts” and that evolved on the basis of a comparison with the sculptures, the base of which was described as the “bust-line”, the term still used in dress-making (and for other comparative purposes as one of the three “vital statistics” by which women are judged (bust, waist, hips), each circumference having an “ideal range”).  It’s not known when “bust” and “bust-line” came into oral use among dress-makers and related professions but the terms have been documented since the 1880s.  Derived forms (sometimes hyphenated) include busty (tending to bustiness, thus Busty Buffy's choice of stage-name), overbust & underbust (technical terms in women's fashion referencing specific measurements) and bustier (a tight-fitting women's top which covers (most or all of) the bust).

Benito Mussolini (1883-1945; Duce (leader) & Prime-Minister of Italy 1922-1943) standing beside his “portrait bust” (1926).

The bust was carved by Swiss sculptor Ernest Durig (1894–1962) who gained posthumous notoriety when his career as a forger was revealed with the publication of his drawings which he’d represented as being from the hand of the French sculptor Auguste Rodin (1840-1917) under whom he claimed to have studied.  Mussolini appears here in one of the subsequently much caricatured poses which were a part of his personality cult.  More than one of the Duce's counterparts in other nations was known to have made fun of some of the more outré poses and affectations, the outstretched chin, right hand braced against the hip and straddle-legged stance among the popular motifs. 

“Portrait bust” in marble (circa 1895) of Otto von Bismarck (1815-1898; chancellor of the German Empire (the "Second Reich") 1871-1890) by the German sculptor Reinhold Begas (1831-1911).

In sculpture, what had been known as the “portrait statue” came after the 1690s to be known as the “portrait bust” although both terms meant “sculpture of upper torso and head”, and these proved a popular choice for military figures because the aspect enabled the inclusion of bling such as epaulettes, medals and other decorations; being depictions of the human figure, busts came also to be vested with special significance by the superstitious.  In early 1939, during construction of the new Reich Chancellery in Berlin, workmen dropped one of the busts of Otto von Bismarck by Reinhold Begas, breaking it at the neck.  For decades, the bust had sat in the old Chancellery and the building’s project manager, Albert Speer (1905–1981; Nazi court architect 1934-1942; Nazi minister of armaments and war production 1942-1945), knowing Adolf Hitler (1889-1945; Führer (leader) and German head of government 1933-1945 & head of state 1934-1945) believed the Reich Eagle toppling from the post-office building right at the beginning of World War I had been a harbinger of doom for the nation, kept the accident secret, hurriedly issuing a commission to the German sculptor Arno Breker (1900–1991) who carved an exact copy.  To give the fake the necessary patina, it was soaked for a time in strong, black tea, the porous quality of marble enabling the fluid to induce some accelerated aging.  Interestingly, in his (sometimes reliable) memoir (Erinnerungen (Memories or Reminiscences), published in English as Inside the Third Reich (1969)), even the technocratic Speer admitted of the accident: “I felt this as an evil omen”.

The other senses of bust (as a noun, verb & adjective) are diverse (and sometimes diametric opposites) and include: “to break or fail”; “to be caught doing something unlawful / illicit / disgusting etc”; “to debunk”; “dramatically or unexpectedly to succeed”; “to go broke”; “to break in (horses, girlfriends etc)”; “to assault”; the downward portion of an economic cycle (ie “boom & bust”); “the act of effecting an arrest” and “someone (especially in professional sport) who failed to perform to expectation”.  That’s quite a range and it has meant the creation of dozens of idiomatic forms, the best known of which include: “boom & bust”, “busted flush”, “dambuster”, “bunker buster”, “busted arse country”, “drug bust”, “cloud bust”, “belly-busting”, “bust one's ass (or butt)”, “bust a gut”, “bust a move”, “bust a nut”, “bust-down”, “bust loose”, “bust off”, “bust one's balls”, “bust-out”, “sod buster”, “bust the dust”, “myth-busting” and “trend-busting”.  In the sense of “breaking through”, bust was from the Middle English busten, a variant of bursten & bresten (to burst) and may be compared with the Low German basten & barsten (to burst).  Bust in the sense of “break”, “smash”, “fail”, “arrest” etc was a creation of mid-nineteenth century US English and is of uncertain inspiration; most etymologists seem to concur it was likely a modification of “burst” effected with a phonetic alteration but it’s not impossible it came directly as an imperfect echoic of Germanic speech.  The apparent contradiction of bust meaning both “fail” and “dramatically succeed” happened because the former was an allusion to “being busted” (ie broken) while the latter meaning used the notion of “busting through”.