
Friday, January 6, 2023

Debunk

Debunk (pronounced dih-buhngk)

(1) To expose or excoriate (a claim, assertion, sentiment, etc.) as being pretentious, false, or exaggerated.

(2) To disparage, ridicule, lampoon.

1920–1925: An invention of US English, the construct being de- + bunk.  The de- prefix was from the Latin dē-, from the preposition dē (of, from (the Old English æf- was a similar prefix)).  It imparted the sense of (1) reversal, undoing, removing, (2) intensification and (3) from, off.  Like dis-, the de- prefix was used to form a complex verb with the sense of undoing the action of a simple one and the handy device has been most productive, English gaining such useful words as demob, degauss and, of course, the dreaded deconstruct & the lamentable decaffeinate.  It’s obviously valuable but the more fastidious guardians of English were of course moved to caution it shouldn’t be used merely because one was too indolent to find the existing antonym, although it was conceded that some coinings were necessary to convey some special sense such as “decontaminate”, needed in those situations when something like “cleanse” is inadequate.  Bunk in this context was etymologically unrelated to other forms of “bunk” and was a clipping of bunkum (pronounced buhng-kuhm) which meant (1) insincere speechmaking by a politician intended merely to please local constituents and (2) insincere talk; claptrap; humbug.  Debunk is a (transitive) verb and debunker is a noun.

Although the exact date is unclear, during sittings of the sixteenth United States Congress (1819-1821), a long, torturous debate ensued on the difficult matter of the Missouri Compromise, something which would later return to haunt the nation.  Well into discussions, Felix Walker (1753–1828; representative (Democratic-Republican (sic)) for North Carolina 1817-1823), rose and began what was apparently, even by the standards of the House of Representatives, a long, dull and irrelevant speech which, after quite some time, induced such boredom that many members walked from the chamber and others attempted to end his delivery by moving that the question be put.  Noting the reaction, Representative Walker felt compelled to explain, telling his colleagues “I’m talking for Buncombe”, referring to his constituents in Buncombe County.  Delivered phonetically, the phrase entered the political lexicon as “talking to (or for) Bunkum” and this was soon clipped to “bunk” meaning “speech of empty thoughts expressed with inflated or pretentious language”.  Later, the sense of bunk was extended to mean “anything wrong or worthless”.

Bunk in the sense of “wrong, worthless” probably gained its popularity from the phrase “history is bunk”, attributed to Henry Ford (1863–1947), famous for being founder of the Ford Motor Company and infamous for some of his more odious opinions.  His words first appeared in print in an interview, published in 1916, the context being his opposition to US involvement in the war in Europe:

"History is more or less bunk.  It is tradition.  We don’t want tradition.  We want to live in the present and the only history that is worth a tinker’s dam is the history we make today.  That’s the trouble with the world.  We’re living in books and history and tradition.  We want to get away from that and take care of today.  We’ve done too much looking back.  What we want to do and do it quick is to make just history right now."

Quite what Mr Ford meant has been much discussed over the years and the man himself did later discuss it, although there are inconsistencies in his explanations.  Historians have concluded he was expressing scepticism at the value of history as it is taught in schools and other educational institutions; his feeling being there was too much emphasis on kings & emperors, wars & empires, politics & philosophy and entirely too little on the lives of ordinary people who, in a sense, actually “made the history”.  Ironically, given his critique of what’s known as the “great man” school of history, he is regarded as one of the great men whenever histories are written of the early automobile and the development of assembly-line mass-production.

The verb “debunk” actually emerged from a work of what would now be called popular revisionist history.  In 1923, novelist William Woodward (1874-1950) published the best-selling Bunk, the blurb suggesting his purpose being to “take the bunk out of things” and debunk was soon adopted by academic historians who in the 1920s made something of an industry in writing books and papers debunking the myths and puff-pieces the propaganda of World War I (1914-1918) produced in abundance.  An obviously useful word, it was soon in vogue throughout North America and quickly made its way across the Atlantic and to the rest of the English-speaking world.  Pedants in England, rarely happy with anything new, of course objected to a short punchy word intruding where they might use a paragraph but debunk made itself at home and never left.

A more recent coining was "prebunk", used as both noun and verb.  The act of prebunking involves issuing warnings about disinformation or misinformation before dissemination and once done, the fake news is said to have been prebunked (in political warfare it's a pre-emptive strike and thus differs from something like an injunction which is preventive).  Very much a word of the era of Donald Trump (b 1946; US president 2017-2021 and since 2025) and crooked Hillary Clinton (b 1947; US secretary of state 2009-2013), "prebunk" seems not to have been used until 2017, sometime after a spokesperson for the Trump administration formalized the concept of "alternative facts".  "Alternative facts" was not something new and had been part of the language of government probably as long as there have been governments but the Trump White House was the first blatantly to admit use.  Mothers with young children are familiar with "alternative facts" such as "Santa Claus" or "the tooth fairy" and the idea worked so well under Trump it became a core part of the Biden administration's media management although, if coming from Joe Biden (b 1942; US president 2021-2025) himself, it was hard to tell where "alternative facts" ended and senility began.

Servergate, the scandal about crooked Hillary Clinton's (b 1947; US secretary of state 2009-2013) home-brew mail server was as much about the cover-up which was her attempt to debunk the facts as it was about her initial wrongdoings.  For cartoonists, crooked Hillary was the gift which kept giving.   

Conspiracy theories have probably been around as long as human societies have existed but as means of communications have expanded, their spread has both extended and accelerated, social media just the latest and most effective vector of transmission.  Debunking conspiracy theories is also a thing although in this, there’s doubtlessly an element of preaching to the converted, the already convinced dismissing the debunkers as part of the conspiracy.  However, debunking can in itself be something of a conspiracy such as the wholly unconvincing stories concocted to try to explain away the curious business surrounding crooked Hillary Clinton’s home-brew mail server.  Trying to dismiss concerns about that as the stuff of conspiracy theorists was less a debunking than a cover-up.

Lindsay Lohan in The Parent Trap (1998).

A more conventional debunking was published by Nicki Swift which detailed the truly bizarre conspiracy theories about Lindsay Lohan’s “twin sister”.  It began after the release of the 1998 film The Parent Trap in which twins Hallie Parker and Annie James meet at summer camp after being separated at birth and, having been re-united, the pair embark upon a series of adventures in an attempt to bring back together their divorced parents.  Lindsay Lohan played both parts including many scenes in which the twins appeared together and while there had been advances in technology since Hayley Mills (b 1946) undertook the role in the 1961 original, the film was thought an impressive achievement in editing and stage direction, the body-double being Erin Mackey (b 1986, about a fortnight before Lindsay Lohan).

The conspiracy theory was that Lindsay Lohan didn’t play both parts and that she actually had a co-star: her twin sister Kelsey Lohan, variations of the explanation for the now-absent twin including that she was murdered immediately prior to the film’s debut while others say she was killed in 2001 after a mysterious (and well-concealed) disappearance.  BuzzFeed included an entry about this in one of their pieces about celebrity conspiracies, documenting the story of how after Kelsey died in a car accident (which, given her “sister’s” driving habits when young, was at least plausible) the Disney corporation “covered their tracks” by saying Lindsay portrayed the twins, her family corroborating this due to their obsession with celebrity.  Whether there was an intention to suggest Disney was in some way involved in the “death” wasn’t made clear but the wording certainly hints at the implication.

Mandii Vee (b 1996), for whom the truth is out there.

The idea of the Walt Disney Company as somehow evil has been around for decades and was the undercurrent in the helpful video posted on Mandii Vee’s YouTube channel, her explanation for the scandal being that Kelsey "mysteriously died" prior to the film's release and that put Disney in a predicament because they didn't want to release a movie starring a now dead girl.  Such things have been done before and sometimes with notable commercial success but according to Mandii Vee, Disney thought it would bring “bad juju” (a noun or adjective meaning “something cursed or haunted by a dark aura”).  Disney’s solution was said to be a high-finance version of comrade Stalin’s (1878-1953) “un-personing” or the techniques of erasure George Orwell (1903-1950) detailed in Nineteen Eighty-Four (1949), paying Lindsay Lohan's parents millions in hush money to keep the secret, never speaking of the unfortunate Kelsey again and denying she ever existed.  At that point, Disney would have pulped and re-printed all the film’s promotional collateral, re-shot the credits and publicized the story that Lindsay Lohan played both roles.  Finding the idea one actor could do both at the same time improbable, Mandii Vee delved a bit into physics and pondered whether such things were technically even possible.

The Edsel

1958 Edsel Citation convertible.

It has been suggested some debunking needs to be applied to some possibly mythical aspects of the story of the doomed Edsel.  The name “Edsel” has become a byword for commercial failure, based on the sad story of the Edsel car, a brand introduced in 1958 by the Ford Motor Company and so poorly received that the whole Edsel division was shuttered within three years.  The product was said to have failed because:

(1) It was just another variation of the existing large cars sold by the corporation under the Ford and Mercury brands while the increasing public appetite was for smaller, imported models (and within a few years Ford’s own and smaller Falcon, Fairlane & Mustang).

(2) It was introduced into a market where automobile sales were in decline because of the brief but sharp recession of 1958, the mid-price sector where sat the Edsel especially affected.

(3) It had for more than two years been over-hyped as something genuinely innovative whereas it was little different from a 1958 Ford or Mercury.

(4) The build quality was patchy, as was the factory’s support for dealers.

(5) The styling was judged unattractive.  There was much clumsiness in the detailing (although almost the whole US industry was similarly afflicted in 1958) but the single most polarizing aspect was the vertical grill assembly, controversial not because it was a regression to something which had become unfashionable in the “longer, lower, wider” era where things tended to the horizontal but because of the shape which to some suggested a woman’s vulva.  Some used the words “vagina” or “genitalia” but in those more polite times some publications were reluctant to use such language in print and preferred to suggest the grill resembled a “horse collar” or “toilet seat” although the latter was (literally) a bit of a stretch and anyway already used of some of the trunk (boot) lids on Chryslers styled to excess by Virgil Exner (1909–1973); more memorable was Time magazine’s “an Oldsmobile sucking a lemon”.  Some found the debate amusing and some disturbing but few found the look attractive.  The anthropomorphic implications of the grill were toned-down when the range was restyled for 1959 and vanished completely for the short-lived 1960 season but by then the damage was done.

Too much, too soon and too little, too late: 1958 Edsel Corsair (left), 1959 Edsel Corsair (centre) and 1960 Edsel Ranger (right).

The failure is a matter of record but one figure that has often puzzled analysts is that Ford booked a loss of over US$250 million on the programme at a time when a million dollars was still a lot of money and, depending on how the conversion is done, that would in 2022 dollars equate to between US$2-3 billion.  The extent of the loss would be understandable if the Edsel had been as genuinely new as claimed but it’s difficult to see where all the money went given that all the expensive components were borrowed from the existing Ford and Mercury line-up:

(1) The engines, although some were of a unique displacement, were just variations of the existing corporate line-up used in Ford, Mercury & Lincoln models (the Mileage-maker straight-six and the Y-Block, FE (Ford-Edsel) & MEL (Mercury-Edsel-Lincoln) V8s).

(2) The platform, transmissions and suspensions were shared with Ford & Mercury, the wheelbase the only difference.

(3) No dedicated factories were built for the Edsel, the cars assembled on the same assembly lines used by Ford and Mercury.

So the costs involved in the development were for relatively less expensive items such as body panels and interior trim.  The marketing expenses were presumably high and there were costs associated with the dealer network but the suspicion has long been that the infamous quarter-billion dollar loss was Ford taking advantage of accounting rules, perhaps booking against the Edsel most of the development costs of things like the FE engine, something which would remain in production until 1976.  That the Edsel was a big failure is disputed by nobody but financially, the losses may have been both over-stated and to some extent transferred to the taxpayer.
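The "between US$2-3 billion in 2022 dollars" equivalence mentioned earlier is simple multiplication by a price index; a minimal sketch in Python, the roughly 10x consumer-price multiplier for 1960 to 2022 being an illustrative assumption rather than an official figure (different indices give different results):

```python
# Rough inflation adjustment of the reported Edsel loss.
# The ~10x multiplier for 1960 -> 2022 dollars is an assumption
# for illustration; it is not an official CPI figure.
loss_1960 = 250_000_000   # reported loss, in period US dollars
cpi_multiplier = 10.0     # assumed 1960 -> 2022 price factor
loss_2022 = loss_1960 * cpi_multiplier
print(f"${loss_2022 / 1e9:.1f} billion")  # → $2.5 billion
```

Varying the assumed multiplier between 8 and 12 spans the "US$2-3 billion" range quoted in the text.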

1960 Edsel Ranger.

However the accounting is deconstructed, there’s no dispute about the affair’s final contractual imbroglio which, remarkably, necessitated the company introducing a re-styled 1960 Edsel which was produced for only 34 days late in 1959.  Used sometimes as a case-study in contract law text-books, it appears as a cautionary tale.  Ford had intended to axe the brand after the 1959 production was completed but received advice from in-house counsel the contract with the dealer network (executed on 19 November 1956) included as an “essential term” an undertaking to provide Edsels for distribution for three seasons (which in the US don’t align exactly with calendar years, “model years” running traditionally from September to August).  The solution obviously was to cut the losses and buy-out the contracts and to that all but one dealer agreed.  What that meant was Ford faced the prospect of being sued for violating the terms of a contract it had written and the possibility a court could make an order for “specific performance”, meaning it would have to embark on a year’s production of a car selling dismally.  In the circumstances, that probably was unlikely but after the troubles of the previous few years, the last thing Ford wanted was an embarrassing court case so the decision was taken to do a minor re-styling of the 1960 Ford and offer a limited range badged as Edsels; while that sounds as cynical as it was, it was a quick & dirty way to (sort of) satisfy honor on both sides.  Thus to fulfil contractual obligations, the 1960 Edsel appeared, 2864 of which left what was by then a leisurely production-line between 15 October and 19 November 1959.  It was on that day the term of the contract expired and unilaterally it was terminated.

1960 Ford Fairlane 500.

Even that wasn’t the end of the company’s problems.  Although in recent years there had been successes such as the 1958 and 1959 Fords (which benefited from Chrysler’s quality control issues and the styling of GM cars in those years which respectively had been thought dated and polarizing), apart from the disaster which had been the Edsel, there were also the poor sales of the 1958-1960 Lincoln and the failure of the Continental brand which had been intended as a competitor not for Cadillac and Imperial but Rolls-Royce.  So hopes were high for the 1960 Ford until it occurred to someone it was 81½ inches (2045 mm) in width, meaning it would not be possible for buyers to register the things in those states which mandated 80 inches (2032 mm) as the maximum width for passenger cars.  To add insult to injury, being essentially a 1960 Ford, the 1960 Edsels were also affected.  Crony capitalism worked its magic and legal work-arounds were provided but it meant the 1960 body was a one-off, the next season’s cars coming in at exactly 80 inches.

A J.D. Vance meme with sofa (in US memes referred to usually as a "couch").

Memories of the Edsel’s grill were revived in 2024 during another debunking exercise.  In July that year, a post appeared on X (formerly known as Twitter) claiming there was a passage in J.D. Vance’s (b 1984; US vice president since 2025) book Hillbilly Elegy: A Memoir of a Family and Culture in Crisis (2016) in which the then Ohio senator (Republican) boasted of having enjoyed a sexual act with a latex glove, strategically placed between a sofa’s cushions.  It was fake news and nothing in the book even hinted at such an experience but quickly the post went viral; it once could take years for urban myths to spread around a few counties but in the social media age such things whizz around the planet in hours.  Quickly the tale was debunked but the sofa was a popular choice among the meme-makers.  It says something about US politics that so many really wanted to believe "couchgate" was true.

Monday, June 30, 2025

Bunker

Bunker (pronounced buhng-ker)

(1) A large bin or receptacle; a fixed chest or box.

(2) In military use, historically a fortification set mostly below the surface of the ground with overhead protection provided by logs and earth or by concrete and fitted with above-ground embrasures through which guns may be fired.

(3) A fortification set mostly below the surface of the ground and used for a variety of purposes.

(4) In golf, an obstacle, classically a sand trap but sometimes a mound of dirt, constituting a hazard.

(5) In nautical use, to provide fuel for a vessel.

(6) In nautical use, to convey bulk cargo (except grain) from a vessel to an adjacent storehouse.

(7) In golf, to hit a ball into a bunker.

(8) To equip with or as if with bunkers.

(9) In military use, to place personnel or materiel in a bunker or bunkers (sometimes as “bunker down”).

1755–1760: From the Scottish bonkar (box, chest (also “seat” in the sense of “bench”)), of obscure origin but etymologists conclude the use related to furniture hints at a relationship with banker (bench).  Alternatively, it may be from a Scandinavian source such as the Old Swedish bunke (boards used to protect the cargo of a ship).  The meaning “receptacle for coal aboard a ship” was in use by at least 1839 (coal-burning steamships coming into general use in the 1820s).  The use to describe the obstacles on golf courses is documented from 1824 (probably from the extended sense “earthen seat” which dates from 1805) but perhaps surprisingly, the familiar sense from military use (dug-out fortification) seems not to have appeared before World War I (1914-1918) although the structures so described had for millennia existed.  “Bunkermate” was army slang for the individual with whom one shares a bunker while the now obsolete “bunkerman” (“bunkermen” the plural) referred to someone (often the man in charge) who worked at an industrial coal storage bunker.  Bunker & bunkerage are nouns, bunkering is a noun & verb, bunkered is a verb and bunkerish, bunkeresque, bunkerless & bunkerlike are adjectives; the noun plural is bunkers.

Just as ships called “coalers” were used to transport coal to and from shore-based “coal stations”, it was “oilers” which took oil to storage tanks or out to sea to refuel ships (a common naval procedure) and these STS (ship-to-ship) transfers were called “bunkering” as the black stuff was pumped, bunker-to-bunker.  That the coal used by steamships was stored on-board in compartments called “coal bunkers” led ultimately to another derived term: “bunker oil”.  When in the late nineteenth century ships began the transition from being fuelled by coal to burning oil, the receptacles of course became “oil bunkers” (among sailors nearly always clipped to “bunker”) and as refining processes evolved, the fuel specifically produced for oceangoing ships came to be called “bunker oil”.

Bunker oil is “dirty stuff”, a highly viscous, heavy fuel oil which is essentially the residue of crude oil refining; it’s that which remains after the more refined and volatile products (gasoline (petrol), kerosene, diesel etc) have been extracted.  Until late in the twentieth century, the orthodox view of economists was its use in big ships was a good thing because it was a product for which industry had little other use and, as essentially a by-product, it was relatively cheap.  It came in three flavours: (1) Bunker A: Light fuel oil (similar to a heavy diesel), (2) Bunker B: An oil of intermediate viscosity used in engines larger than marine diesels but smaller than those used in the big ships and (3) Bunker C: Heavy fuel oil used in container ships and such which use VLD (very large displacement), slow-running engines with a huge reciprocating mass.  Because of its composition, Bunker C especially produced much pollution and although much of this happened at sea (unseen by most but with obvious implications), when ships reached harbor to dock, all the smoke and soot became obvious.  Over the years, the worst of the pollution from the burning of bunker oil greatly has been reduced (the work underway even before the Greta Thunberg (b 2003) era), sometimes by the simple expedient of spraying a mist of water through the smoke.

Floor-plans of the upper (Vorbunker) and lower (Führerbunker) levels of the structure now commonly referred to collectively as the Führerbunker.

History’s most infamous bunker remains the Berlin Führerbunker in which Adolf Hitler (1889-1945; Führer (leader) and German head of government 1933-1945 & head of state 1934-1945) spent much of the last few months of his life.  In the architectural sense there were a number of Führerbunkers built, one at each of the semi-permanent Führerhauptquartiere (Führer Headquarters) created for the German military campaigns and several others built where required but it’s the one in Berlin which is remembered as “the Führerbunker”.  Before 1944, when the air raids by the RAF (Royal Air Force) and USAAF (US Army Air Force) intensified, the term Führerbunker seems rarely to have been used other than by the architects and others involved in their construction, and it wasn’t a designation like Führerhauptquartiere which the military and other institutions of state shifted between locations (rather as “Air Force One” is attached not to a specific airframe but whatever aircraft in which the US president is travelling).  In subsequent historical writing, the term Führerbunker tends often to be applied to the whole, two-level complex in Berlin and although it was only the lower layer which officially was designated as that, for most purposes the distinction is not significant.  In military documents, after January, 1945 the Führerbunker was referred to as Führerhauptquartiere.

Führerbunker tourist information board, Berlin, Germany.

Only an information board at the intersection of den Ministergärten and Gertrud-Kolmar-Straße, erected by the German Government in 2006 prior to that year's FIFA (Fédération Internationale de Football Association (International Federation of Association Football)) World Cup now marks the place on Berlin's Wilhelmstrasse 77 where once the Führerbunker was located.  The Soviet occupation forces razed the new Reich Chancellery and demolished all the bunker's above-ground structures but the subsequent GDR (Deutsche Demokratische Republik (German Democratic Republic; the old East Germany) 1949-1990) abandoned attempts completely to destroy what lay beneath.  Until after the fall of the Berlin Wall (1961-1989) the site remained unused and neglected, “re-discovered” only during excavations by property developers, the government insisting on the destruction of whatever was uncovered and, sensitive still to the spectre of “Neo-Nazi shrines”, for years the bunker’s location was never divulged, even as unremarkable buildings (an unfortunate aspect of post-unification Berlin) began to appear on the site.  Most of what would have covered the Führerbunker’s footprint is now a supermarket car park.

The first part of the complex to be built was the Vorbunker (upper bunker or forward bunker), an underground facility of reinforced concrete intended only as a temporary air-raid shelter for Hitler and his entourage in the old Reich Chancellery.  Substantially completed during 1936-1937, it was until 1943 listed in documents as the Luftschutzbunker der Reichskanzlei (Reich Chancellery Air-Raid Shelter), the Vorbunker label applied only in 1944 when the lower level (the Führerbunker proper) was appended.  In mid January, 1945, Hitler moved into the Führerbunker and, as the military situation deteriorated, his appearances above ground became less frequent until by late March he rarely saw the sky.  Finally, on 30 April, he committed suicide.

Bunker Busters

Northrop Grumman publicity shot of B2-Spirit from below, showing the twin bomb-bay doors through which the GBU-57 are released.

Awful as they are, there's an undeniable beauty in the engineering of some weapons and it's unfortunate humankind never collectively has resolved exclusively to devote such ingenuity to stuff other than us blowing up each other.  That’s not a new sentiment, being one philosophers and others have for millennia expressed in various ways although since the advent of nuclear weapons, concerns understandably have become heightened.  Like every form of military technology ever deployed, once the “genie is out of the bottle” the problem is there to be managed and at the dawn of the atomic age, delivering a lecture in 1936, the British chemist and physicist Francis Aston (1877–1945) (who created the mass spectrograph, winning the 1922 Nobel Prize in Chemistry for his use of it to discover and identify the isotopes in many non-radioactive elements and for his enunciation of the whole number rule) observed:

There are those about us who say that such research should be stopped by law, alleging that man's destructive powers are already large enough.  So, no doubt, the more elderly and ape-like of our ancestors objected to the innovation of cooked food and pointed out the great dangers attending the use of the newly discovered agency, fire.  Personally, I think there is no doubt that sub-atomic energy is available all around us and that one day man will release and control its almost infinite power.  We cannot prevent him from doing so and can only hope that he will not use it exclusively in blowing up his next door neighbor.

The use in June 2025 by the USAF (US Air Force) of fourteen of its Boeing GBU-57 (Guided Bomb Unit-57) Massive Ordnance Penetrators (MOP) bombs against underground targets in Iran (twelve on the Fordow Uranium Enrichment Plant and two on the Natanz nuclear facility) meant “Bunker Buster” hit the headlines.  Carried by the Northrop B-2 Spirit heavy bomber (built between 1989-2000), the GBU-57 is a 14,000 kg (30,000 lb) bomb with a casing designed to withstand the stress of penetrating through layers of reinforced concrete or thick rock.  “Bunker buster” bombs have been around for a while, the ancestors of today’s devices first built for the German military early in World War II (1939-1945) and the principle remains unchanged to this day: up-scaled armor-piercing shells.  The initial purpose was to produce a weapon with a casing strong enough to withstand the forces imposed when impacting reinforced concrete structures, the idea simple in that what was needed was a delivery system which could “bust through” whatever protective layers surrounded a target, allowing the explosive charge to do damage where needed rather than wastefully being expended on an outer skin.  The German weapons proved effective but inevitably triggered an “arms race” in that as the war progressed, the concrete layers became thicker, walls over 2 metres (6.6 feet) thick and ceilings of 5 metres (16 feet) being constructed by 1943.  Technological development continued and the idea extended to rocket propelled bombs optimized both for armor-piercing and aerodynamic efficiency, velocity a significant “mass multiplier” which made the weapons still more effective.

USAF test-flight footage of Northrop B2-Spirit dropping two GBU-57 "Bunker Buster" bombs.

Concurrent with this, the British developed the first true “bunker busters”, building on the idea of the naval torpedo, one aspect of which was that, in exploding a short distance from its target, it was highly damaging because it was able to take advantage of one of the properties of water (quite strange stuff according to those who study it), which is that it doesn’t compress.  What that meant was it was often the “shock wave” of the water rather than the blast itself which could breach a hull, the same principle used for the famous “bouncing bombs” used for the RAF’s “Dambuster” (Operation Chastise, 17 May 1943) raids on German dams.  Because of the way water behaved, it wasn’t necessary to score the “direct hit” which had been the ideal in the early days of aerial warfare.

RAF Bomber Command archive photograph of Avro Lancaster (built between 1941-1946) in flight with Grand Slam mounted (left) and a comparison of the Tallboy & Grand Slam (right), illustrating how the latter was in most respects a scaled-up version of the former.  To carry the big Grand Slams, 32 “B1 Special” Lancasters were in 1945 built with up-rated Rolls-Royce Merlin V12 engines, the removal of the bomb doors (the Grand Slam carried externally, its dimensions exceeding internal capacity), deleted front and mid-upper gun turrets, no radar equipment and a strengthened undercarriage.  Such was the concern with weight (especially for take-off) that just about anything non-essential was removed from the B1 Specials, even three of the four fire axes and its crew door ladder.  In the US, Boeing went through a similar exercise to produce the run of “Silverplate” B-29 Superfortresses able to carry the first A-bombs used in August, 1945. 

Best known of the British devices were the so-called “earthquake bombs”, the Tallboy (12,000 lb; 5.4 ton) & Grand Slam (22,000 lb, 10 ton) which, despite the impressive bulk, were classified by the War Office as “medium capacity”.  The terms “Medium Capacity” (MC) & “High Capacity” (HC) referenced not the gross weight or physical dimensions but the ratio of explosive filler to the total weight of the construction (ie how much was explosive compared to the casing and ancillary components).  Because both had thick casings to ensure penetration deep into hardened targets (bunkers and other structures encased in rock or reinforced concrete) before exploding, the internal dimensions accordingly were reduced compared with the ratio typical of contemporary ordnance.  A High Capacity (HC) bomb (a typical “general-purpose” bomb) had a thinner casing and a much higher proportion of explosive (sometimes over 70% of total weight).  These were intended for area bombing (known also as “carpet bombing”) and caused wide blast damage whereas the Tallboy & Grand Slam were penetrative with casings optimized for aerodynamic efficiency, their supersonic travel working as a mass-multiplier.  The Tallboy’s 5,200 lb (2.3 ton) explosive load was some 43% of its gross weight while the Grand Slam’s 9,100 lb (4 ton) absorbed 41%; this may be compared with the “big” 4,000 lb (1.8 ton) HC “Blockbuster” which allocated 75% of the gross weight to its 3,000 lb (1.4 ton) charge.  Like many things in engineering (not just in military matters) the ratio represented a trade-off, the MC design prioritizing penetrative power and structural destruction over blast radius.
The novelty of the Tallboy & Grand Slam was that as earthquake bombs, their destructive potential could be unleashed not necessarily by achieving a direct hit on a target but by entering the ground nearby, the explosion (1) creating an underground cavity (a camouflet) and (2) transmitting a shock-wave through the target’s foundations, leading to the structure collapsing into the newly created lacuna. 

The etymology of camouflet has an interesting history in both French and military mining.  Originally it meant “a whiff of smoke in the face” (from a fire or pipe) and in figurative use it was a reference to a snub or slight insult (something unpleasant delivered directly to someone); although the origin is murky, it may have been related to the earlier French verb camoufler (to disguise; to mask) which evolved also into “camouflage”.  In the specialized military jargon of siege warfare or mining (sapping), over the seventeenth to nineteenth centuries “camouflet” referred to “an underground explosion that does not break the surface, but collapses enemy tunnels or fortifications by creating a subterranean void or shockwave”.  The use of this tactic is best remembered from the Western Front in World War I, some of the huge craters now tourist attractions.

Under watchful eyes: Grand Ayatollah Ali Khamenei (b 1939; Supreme Leader, Islamic Republic of Iran since 1989) delivering a speech, sitting in front of the official portrait of the republic’s ever-unsmiling founder, Grand Ayatollah Ruhollah Khomeini (1900-1989; Supreme Leader, Islamic Republic of Iran, 1979-1989).  Ayatollah Khamenei seemed in 1989 an improbable choice as Supreme Leader because others were better credentialed but though cautious and uncharismatic, he has proved a great survivor in a troubled region.

Since aerial bombing began to be used as a strategic weapon, of great interest has been the debate over BDA (battle damage assessment) and the issue emerged almost as soon as the bunker buster attack on Iran was announced, focused on the extent to which the MOPs had damaged the targets, the deepest of which were concealed far inside a mountain.  BDA is a constantly evolving science and while satellites have made analysis of surface damage highly refined, it’s more difficult to understand what has happened deep underground.  Indeed, it wasn’t until the USSBS (United States Strategic Bombing Survey) teams toured Germany and Japan in 1945-1946, conducting interviews, economic analysis and site surveys, that a useful (and substantially accurate) understanding emerged of the effectiveness of bombing.  What technological advances have allowed for those with the resources is that so-called “panacea targets” (ie critical infrastructure and such, once dismissed by planners because the required precision was for many reasons rarely attainable) can now accurately be targeted, the USAF able to drop a bomb within a few feet of the aiming point.  As the phrase is used by the military, the Fordow Uranium Enrichment Plant is a classic “panacea target” but whether even a technically successful strike will achieve the desired political outcome remains to be seen.

Mr Trump, in a moment of exasperation, posted on Truth Social of Iran & Israel: “We basically have two countries that have been fighting so long and so hard that they don't know what the fuck they're doing."  Actually, both know exactly WTF they're doing; it's just Mr Trump (and many others) would prefer they didn't do it.

Donald Trump (b 1946; US president 2017-2021 and since 2025) claimed “total obliteration” of the targets while Grand Ayatollah Khamenei admitted only there had been “some damage”; which is closer to the truth may one day be revealed.  Even modelling of the effects has probably been inconclusive because the deeper one goes underground, the greater the number of variables in the natural structure, and the nature of the internal built environment will also influence blast behaviour.  All experts seem to agree much damage will have been done but what can’t yet be determined is what has been suffered by the facilities which sit as deep as 80 m (260 feet) inside the mountain although, as the name implies, “bunker busters” are designed for buried targets and it’s not always required for the blast directly to reach the target.  Because the shock-wave can travel through earth & rock, the effect is something like that of an earthquake and if the structure is sufficiently affected, the area may be rendered geologically too unstable to again be used for its original purpose.

Within minutes of the bombing having been announced, legal academics were being interviewed (though not by Fox News) to explain why the attacks were unlawful under international law and in a sign of the times, the White House didn't bother to discuss fine legal points like the distinction between "preventive & pre-emptive strikes", preferring (like Fox News) to focus on the damage done.  However, whatever the murkiness surrounding the BDA, many analysts have concluded that even if before the attacks the Iranian authorities had not approved the creation of a nuclear weapon, this attack will have persuaded them one is essential for “regime survival”, thus the interest of both Tel Aviv and (despite denials) Washington DC in “regime change”.  The consensus seems to be Grand Ayatollah Khamenei had, prior to the strike, not ordered the creation of a nuclear weapon but that all energies were directed towards completing the preliminary steps, thus the enriching of uranium to ten times the level required for use in power generation; the ayatollah liked to keep his options open.  So, the fear of some is the attacks, even if they have (by weeks, months or years) delayed the Islamic Republic’s work on nuclear development, may prove counter-productive in that they convince the ayatollah to concur with the reasoning of every state which since 1945 has adopted an independent nuclear deterrent (IND).  That reasoning was not complex and hasn’t changed since a prehistoric man first picked up a stout stick to wave as a pre-lingual message to potential adversaries, warning them there would be consequences for aggression.  Although a theocracy, those who command power in the Islamic Republic are part of an opaque political institution and in the struggle which has for some time been conducted in anticipation of the death of the aged (and reportedly ailing) Supreme Leader, the matter of “an Iranian IND” is one of the central dynamics.  
Many will be following what unfolds in Tehran and the observers will not be only in Tel Aviv and Washington DC because in the region and beyond, few things focus the mind like the thought of ayatollahs with A-Bombs.

Of the word "bust"

The Great Bust: The Depression of the Thirties (1962) by Jack Lang (left), highly qualified porn star Busty Buffy (b 1996, who has never been accused of misleading advertising, centre) and The people's champion, Mr Lang, bust of Jack Lang, painted cast plaster by an unknown artist, circa 1927, National Portrait Gallery, Canberra, Australia (right).  Remembered for a few things, Jack Lang (1876–1975; premier of the Australian state of New South Wales (NSW) 1925-1927 & 1930-1932) remains best known for having in 1932 been the first head of government in the British Empire to have been sacked by the Crown since William IV (1765–1837; King of the UK 1830-1837) in 1834 dismissed Lord Melbourne (1779–1848; prime minister of the UK 1834 & 1835-1841).

Those learning English must think it at least careless that things can both be (1) “razed to the ground” (totally to destroy something (typically a structure), usually by demolition or incineration) and (2) “raised to the sky” (physically lifted upwards).  The etymologies of “raze” and “raise” differ but they’re pronounced the same so it’s fortunate the spellings vary; in other troublesome examples of unrelated meanings, spelling and pronunciation can align, as in “bust”.  When used in ways most directly related to human anatomy: (1) “a sculptural portrayal of a person's head and shoulders” & (2) “the circumference of a woman's chest around her breasts”, there is an etymological link but these uses are wholly unconnected with bust’s other senses.

Bust of Lindsay Lohan in white marble by Stable Diffusion.  Sculptures of just the neck and head came also to be called “busts”, the emphasis on the technique rather than the original definition.

Bust in the sense of “a sculpture of upper torso and head” dates from the 1690s and was from the sixteenth century French buste, from the Italian busto (upper body; torso), from the Latin bustum (funeral monument, tomb (although the original sense was “funeral pyre, place where corpses are burned”)) and it may have emerged (as a shortened form) from ambustum, neuter of ambustus (burned around), past participle of amburere (burn around, scorch), the construct being ambi- (around) + urere (to burn).  The alternative etymology traces a link to the Old Latin boro, the early form of the Classical Latin uro (to burn) and it’s thought the development in Italian was influenced by the Etruscan custom of keeping the ashes of the dead in an urn shaped like the person when alive.  Thus the use, common by the 1720s, of bust (a clipping from the French buste) to mean “a carving of the trunk of the human body from the chest up”.  From this came the meaning “dimension of the bosom; the measurement around a woman's body at the level of her breasts” and that evolved on the basis of a comparison with the sculptures, the base of which was described as the “bust-line”, the term still used in dress-making (and for other comparative purposes as one of the three “vital statistics” by which women are judged (bust, waist, hips), each circumference having an “ideal range”).  It’s not known when “bust” and “bust-line” came into oral use among dress-makers and related professions but both are documented since the 1880s.  Derived forms (sometimes hyphenated) include busty (tending to bustiness, thus Busty Buffy's choice of stage-name), overbust & underbust (technical terms in women's fashion referencing specific measurements) and bustier (a tight-fitting women's top which covers (most or all of) the bust).

Benito Mussolini (1883-1945; Duce (leader) & Prime-Minister of Italy 1922-1943) standing beside his “portrait bust” (1926).

The bust was carved by Swiss sculptor Ernest Durig (1894–1962) who gained posthumous notoriety when his career as a forger was revealed with the publication of his drawings which he’d represented as being from the hand of the French sculptor Auguste Rodin (1840-1917) under whom he claimed to have studied.  Mussolini appears here in one of the subsequently much caricatured poses which were a part of his personality cult.  More than one of the Duce's counterparts in other nations was known to have made fun of some of the more outré poses and affectations, the outstretched chin, right hand braced against the hip and straddle-legged stance among the popular motifs. 

“Portrait bust” in marble (circa 1895) of Otto von Bismarck (1815-1898; chancellor of the German Empire (the "Second Reich") 1871-1890) by the German sculptor Reinhold Begas (1831-1911).

In sculpture, what had been known as the “portrait statue” came after the 1690s to be known as the “portrait bust” although both terms meant “sculpture of upper torso and head”; these proved a popular choice for military figures because the aspect enabled the inclusion of bling such as epaulettes, medals and other decorations and, being depictions of the human figure, busts came to be vested with special significance by the superstitious.  In early 1939, during construction of the new Reich Chancellery in Berlin, workmen dropped one of the busts of Otto von Bismarck by Reinhold Begas, breaking it at the neck.  For decades, the bust had sat in the old Chancellery and the building’s project manager, Albert Speer (1905–1981; Nazi court architect 1934-1942; Nazi minister of armaments and war production 1942-1945), knowing Adolf Hitler (1889-1945; Führer (leader) and German head of government 1933-1945 & head of state 1934-1945) believed the Reich Eagle toppling from the post-office building right at the beginning of World War I had been a harbinger of doom for the nation, kept the accident secret, hurriedly issuing a commission to the German sculptor Arno Breker (1900–1991) who carved an exact copy.  To give the fake the necessary patina, it was soaked for a time in strong, black tea, the porous quality of marble enabling the fluid to induce some accelerated aging.  Interestingly, in his (sometimes reliable) memoir (Erinnerungen (Memories or Reminiscences), published in English as Inside the Third Reich (1969)), even the technocratic Speer admitted of the accident: “I felt this as an evil omen”.

The other senses of bust (as a noun, verb & adjective) are diverse (and sometimes diametric opposites) and include: “to break or fail”; “to be caught doing something unlawful / illicit / disgusting etc”; “to debunk”; “dramatically or unexpectedly to succeed”; “to go broke”; “to break in (horses, girlfriends etc)”; “to assault”; “the downward portion of an economic cycle” (ie “boom & bust”); “the act of effecting an arrest” and “someone (especially in professional sport) who failed to perform to expectation”.  That’s quite a range and it has meant the creation of dozens of idiomatic forms, the best known of which include: “boom & bust”, “busted flush”, “dambuster”, “bunker buster”, “busted arse country”, “drug bust”, “cloud bust”, “belly-busting”, “bust one's ass (or butt)”, “bust a gut”, “bust a move”, “bust a nut”, “bust-down”, “bust loose”, “bust off”, “bust one's balls”, “bust-out”, “sod buster”, “bust the dust”, “myth-busting” and “trend-busting”.  In the sense of “breaking through”, bust was from the Middle English busten, a variant of bursten & bresten (to burst) and may be compared with the Low German basten & barsten (to burst).  Bust in the sense of “break”, “smash”, “fail”, “arrest” etc was a creation of mid-nineteenth century US English and is of uncertain inspiration but most etymologists seem to concur it was likely a modification of “burst” effected with a phonetic alteration, though it’s not impossible it came directly as an imperfect echoic of Germanic speech.  The apparent contradiction of bust meaning both “fail” and “dramatically succeed” happened because the former was an allusion to “being busted” (ie broken) while the latter meaning used the notion of “busting through”.