
Tuesday, July 29, 2025

Rumble

Rumble (pronounced ruhm-buhl)

(1) A form of low-frequency noise.

(2) In video game controllers, a haptic feedback vibration.

(3) In the jargon of cardiologists, a quality of a "heart murmur".

(4) In the slang of physicians (as "stomach rumble"), borborygmus (a rumbling sound made by the movement of gas in the intestines).

(5) In slang, a street fight between or among gangs.

(6) As rumble seat (sometimes called dickie seat), a rear part of a carriage or car containing seating accommodation for servants, or space for baggage; known colloquially as the mother-in-law seat (and now also used by pram manufacturers to describe a clip-on seat suitable for lighter infants).

(7) The action of a tumbling box (used to polish stones).

(8) As rumble strip, in road-building, a pattern of variation in a road's surface designed to alert inattentive drivers to potential danger by causing a tactile vibration and audible rumbling if they veer from their lane.

(9) In slang, to find out about (someone or something); to discover the secret plans of another (mostly UK informal and used mostly in forms such as: "I've rumbled her" or "I've been rumbled").

(10) To make a deep, heavy, somewhat muffled, continuous sound, as thunder.

(11) To move or travel with such a sound.

1325-1375: From the Middle English verbs rumblen, romblen & rummelyn, frequentative forms of romen (make a deep, heavy, continuous sound (also "move with a rolling, thundering sound" & "create disorder and confusion")), equivalent to rome + -le.  It was cognate with the Dutch rommelen (to rumble), the Low German rummeln (to rumble), the German rumpeln (to be noisy), the Danish rumle (to rumble) and the Old Norse rymja (to roar or shout), all of imitative origin.  The noun form emerged in the late fourteenth century; the use describing the rear seat of a carriage dates from 1808, replacing the earlier rumbler (1801), and was finally formalized as the rumble seat in 1828, a design extended to automobiles, the last of which was produced in 1949.  The slang noun meaning "gang fight" dates from 1946 and was an element in the 1950s "moral panic" about such things.  Rumble is a noun & verb, rumbler is a noun, rumbled is a verb, rumbling is a noun, verb & adjective and rumblingly is an adverb; the noun plural is rumbles.

Opening cut from the studio trailer for Lindsay Lohan’s film Freakier Friday (Walt Disney Pictures, 2025), available on Rumble.  Founded in 2013 as a kind of “anti-YouTube”, Rumble is an online video platform which has since expanded into cloud services and web hosting.  In the vibrant US ecosystem of ideas (and such), Rumble is interesting in that while it also carries non-controversial content, it’s noted as one of the native environments of conservative users from libertarians to the “lunar right”, thus the oft-used descriptor “alt-tech”.  Rumble hosts Donald Trump’s (b 1946; US president 2017-2021 and since 2025) Truth Social media platform, which has a user base slanted towards “alt-this & that”, although to some it’s inherently evil because much of its underlying code is in Java.

The Velvet Underground and Nico

Link Wray’s (1929-2005) 1958 instrumental recording Rumble is mentioned as a seminal influence by many who were later influential in some of the most notable forks of post-war popular music including punk, heavy-metal, death-metal, glam-rock, art-rock, proto-punk, psychedelic-rock, avant-pop and the various strains of the experimental and the gothic.  Wray’s release of Rumble as a single also gained a unique distinction in that it remains the only instrumental piece ever banned from radio in the United States on purely “musical” grounds, the stations (apparently in some parts “prevailed upon” by the authorities) finding its power chords just too menacing for youth to resist.  It wasn’t that the record was thought likely to “give them ideas” in the political sense (the fear for which many things have been banned); rather, the “threatening” sound and title were deemed likely to incite juvenile delinquency and gang violence.  In 1950s youth slang, a “rumble” was a fight between gangs, thus the concern the song might be picked up as a kind of anthem and exacerbate the problems of gang culture by glorifying a phenomenon which had already been the centre of a "moral panic".  There is a science to deconstructing the relationship between musical techniques and the feelings induced in people and the consensus was the use of power chords, distortion, and feedback (then radically different from mainstream pop tunes) was “raw, dark and ominous”, even without lyrics; it’s never difficult to sell nihilism to teenagers.  Like many bans, the action heightened the record’s appeal, cementing its status as an anthem of discontented youth and, the single remaining on sale in most record stores, sales were strong.

The Velvet Underground & Nico (1967).

Lou Reed (1942-2013) said he spent days listening to Rumble before joining with John Cale (b 1942) in New York in 1964 to form The Velvet Underground.  Their debut album, The Velvet Underground & Nico, included German-born model Nico (1938-1988) and was, like their subsequent releases, a critical and commercial failure but within twenty years the view had changed, their work now regarded as among the most important and influential of the era, critics noting (with only some exaggeration): "Not many bought the Velvet Underground's records but most of those who did formed a band and headed to a garage."  The Velvet Underground’s output built on the proto heavy-metal motifs from Rumble with experimental performances and was noted especially for its controversial lyrical content including drug abuse, prostitution, sado-masochism and sexual deviancy.  However, despite this and the often nihilistic tone, in the decade since Rumble the counter-culture had changed not just pop music but also America: The Velvet Underground was never banned from radio.

Rumble seat in 1937 Packard Twelve Series 1507 2/4-passenger coupé.  The most expensive of Packard's 1937 line-up, the Twelve was powered by a 473 cubic-inch (7.7 litre) 67° V12 rated at 175 horsepower at 3,200 RPM.  It was the best year for the Packard Twelve, sales reaching 1,300 units.  The marque's other distinction in the era was that the big Packard limousines were the favorite car of comrade Stalin (1878-1953; Soviet leader 1924-1953), a fair judge of machinery.

The rumble seat was also known as a dicky (also as dickie & dickey) seat in the UK while the colloquial “mother-in-law seat” was at least trans-Atlantic and probably global.  It was an upholstered bench seat mounted at the rear of a coach, carriage or early motorcar and, as the car industry evolved and coachwork became more elaborate, increasingly the seats folded into the body.  The size varied but generally they were designed to accommodate one or two adults although the photographic evidence suggests they could be used also to seat half-a-dozen or more children (the seat-belt era being decades away).  Why it was called a dicky seat is unknown (the word dates from 1801 and most speculation is that it was in some way related to the English class system) but when fitted on horse-drawn carriages it was always understood to mean "a boot (box or receptacle covered with leather at either end of a coach, the use based on the footwear) with a seat above it for servants".  On European phaetons, a similar fixture was the “spider”, a small single seat or bench for the use of a groom or footman, the name based on the spindly supports which called to mind an arachnid’s legs.  The spider name would later be re-purposed on a similar basis to describe open vehicles and the use persists to this day, Italians and others sometimes preferring spyder.  They were sometimes also called jump-seats, the idea being they were used by servants or slaves who were required to “jump off” at their master’s command and the term “jump seat” was later used for the folding seats in long-wheelbase limousines although many coach-builders preferred “occasional seats”.

Rumble seat in 1949 Triumph 2000 Roadster.  The unusual (and doubtless welcome) split-screen was a post-war innovation, the idea recalling the twin-screen phaetons of the inter-war years.  Had they been aware of the things, many passengers in the back seats of convertibles (at highway speeds it was a bad hair day) would have longed for the return of the dual-cowl phaetons.  

The US use of “rumble seat” comes from the horse & buggy age so obviously predates any rumble from an engine’s exhaust system and it’s thought the idea of the rumble was literally the noise and vibration experienced by those compelled to sit above a live axle with 40 inch (1 metre-odd) steel rims on wooden-spoked wheels, sometimes with no suspension system.  When such an arrangement was pulled along rough, rutted roads by several galloping horses, even a short journey could be a jarring experience.  The rumble seat actually didn’t appear on many early cars because the engines lacked power so weight had to be restricted, seating typically limited to one or two; they again became a thing only as machines grew larger and bodywork was fitted.  Those in a rumble seat were exposed to the elements which could be most pleasant but not always and they enjoyed only the slightest protection afforded by the regular passenger compartment’s top & windscreen.  Ford did offer the option of a folding top with side curtains for the rumble seats on the Model A (1927-1931) but few were purchased, a similar fate suffered by those produced by third party suppliers.  US production of cars with rumble seats ended in 1939 and the last made in England was the Triumph 1800/2000 Roadster (1946-1949) but pram manufacturers have of late adopted the name to describe a seat which can be clipped onto the frame.  Their distinction between a toddler seat and a rumble seat is that the former comes with the stroller and is slightly bigger, rated to hold 50 lb (23 kg), while the latter can hold up to 35 lb (16 kg).

1935 MG NA Magnette Allingham 2/4-Seater by Whittingham & Mitchel.  Sometimes described by auction houses as a DHC (drophead coupé), this body style (despite what would come to be called 2+2 seating) really is a true roadster.  The scalloped shape of the front seats' squabs appeared also in the early (3.8 litre version; 1961-1964) Jaguar E-Types (1961-1974) but attractive as they were, few complained when they were replaced by a more prosaic but also more accommodating design.

Although most rumble (or dickie) seats were mounted in an aperture separated from the passenger compartment, in smaller vehicles the additional seat often was integrated but became usable (by people) only when the hinged cover was raised; otherwise, the rear-seat cushion was a “parcel shelf”.  The MG N-Type Magnette (1934-1936) used a 1,271 cm³ (78 cubic inch) straight-six and while the combination of that many cylinders and a small displacement sounds curious, the configuration was something of an English tradition and a product of (1) a taxation system based on cylinder bore and (2) the attractive economies of scale and production line rationalization of “adding two cylinders” to existing four-cylinder units to achieve greater, smoother power with the additional benefit of retaining the same tax-rate.  Even after the taxation system was changed, some small-capacity sixes were developed as out-growths of fours.  Despite the additional length of the engine block, many N-Type Magnettes were among the few front-engined cars to include a “frunk” (a front trunk (boot)), a small storage compartment which sat between cowl (scuttle) and engine.

Monday, June 30, 2025

Bunker

Bunker (pronounced buhng-ker)

(1) A large bin or receptacle; a fixed chest or box.

(2) In military use, historically a fortification set mostly below the surface of the ground with overhead protection provided by logs and earth or by concrete and fitted with above-ground embrasures through which guns may be fired.

(3) A fortification set mostly below the surface of the ground and used for a variety of purposes.

(4) In golf, an obstacle, classically a sand trap but sometimes a mound of dirt, constituting a hazard.

(5) In nautical use, to provide fuel for a vessel.

(6) In nautical use, to convey bulk cargo (except grain) from a vessel to an adjacent storehouse.

(7) In golf, to hit a ball into a bunker.

(8) To equip with or as if with bunkers.

(9) In military use, to place personnel or materiel in a bunker or bunkers (sometimes as “bunker down”).

1755–1760: From the Scottish bonkar (box, chest (also “seat” in the sense of “bench”)), of obscure origin, but etymologists conclude the use related to furniture hints at a relationship with banker (bench).  Alternatively, it may be from a Scandinavian source such as the Old Swedish bunke (boards used to protect the cargo of a ship).  The meaning “receptacle for coal aboard a ship” was in use by at least 1839 (coal-burning steamships coming into general use in the 1820s).  The use to describe the obstacles on golf courses is documented from 1824 (probably from the extended sense “earthen seat” which dates from 1805) but perhaps surprisingly, the familiar sense from military use (dug-out fortification) seems not to have appeared before World War I (1914-1918) although the structures so described had for millennia existed.  “Bunkermate” was army slang for the individual with whom one shares a bunker while the now obsolete “bunkerman” (“bunkermen” the plural) referred to someone (often the man in charge) who worked at an industrial coal storage bunker.  Bunker & bunkerage are nouns, bunkering is a noun & verb, bunkered is a verb and bunkerish, bunkeresque, bunkerless & bunkerlike are adjectives; the noun plural is bunkers.

Just as ships called “coalers” were used to transport coal to and from shore-based “coal stations”, it was “oilers” which took oil to storage tanks or out to sea to refuel ships (a common naval procedure) and these STS (ship-to-ship) transfers were called “bunkering” as the black stuff was pumped, bunker-to-bunker.  That the coal used by steamships was stored on-board in compartments called “coal bunkers” led ultimately to another derived term: “bunker oil”.  When in the late nineteenth century ships began the transition from being fuelled by coal to burning oil, the receptacles of course became “oil bunkers” (among sailors nearly always clipped to “bunker”) and as refining processes evolved, the fuel specifically produced for oceangoing ships came to be called “bunker oil”.

Bunker oil is “dirty stuff”, a highly viscous, heavy fuel oil which is essentially the residue of crude oil refining; it’s that which remains after the more refined and volatile products (gasoline (petrol), kerosene, diesel etc) have been extracted.  Until late in the twentieth century, the orthodox view of economists was its use in big ships was a good thing because it was a product for which industry had little other use and, as essentially a by-product, it was relatively cheap.  It came in three flavours: (1) Bunker A: Light fuel oil (similar to a heavy diesel), (2) Bunker B: An oil of intermediate viscosity used in engines larger than marine diesels but smaller than those used in the big ships and (3) Bunker C: Heavy fuel oil used in container ships and such which use VLD (very large displacement), slow-running engines with a huge reciprocating mass.  Because of its composition, Bunker C especially produced much pollution and although much of this happened at sea (unseen by most but with obvious implications), when ships reached harbor to dock, all the smoke and soot became conspicuous.  Over the years, the worst of the pollution from the burning of bunker oil greatly has been reduced (the work underway even before the Greta Thunberg (b 2003) era), sometimes by the simple expedient of spraying a mist of water through the smoke.

Floor-plans of the upper (Vorbunker) and lower (Führerbunker) levels of the structure now commonly referred to collectively as the Führerbunker.

History’s most infamous bunker remains the Berlin Führerbunker in which Adolf Hitler (1889-1945; Führer (leader) and German head of government 1933-1945 & head of state 1934-1945) spent much of the last few months of his life.  In the architectural sense there were a number of Führerbunkers built, one at each of the semi-permanent Führerhauptquartiere (Führer Headquarters) created for the German military campaigns and several others built where required but it’s the one in Berlin which is remembered as “the Führerbunker”.  Before 1944, when the air raids by the RAF (Royal Air Force) and USAAF (US Army Air Force) intensified, the term Führerbunker seems rarely to have been used other than by the architects and others involved in the structures’ construction, and it wasn’t a designation like Führerhauptquartiere, which the military and other institutions of state shifted between locations (rather as “Air Force One” is attached not to a specific airframe but to whatever aircraft in which the US president is travelling).  In subsequent historical writing, the term Führerbunker tends often to be applied to the whole two-level complex in Berlin and although it was only the lower layer which officially was designated as that, for most purposes the distinction is not significant.  In military documents, after January 1945 the Führerbunker was referred to as Führerhauptquartiere.

Führerbunker tourist information board, Berlin, Germany.

Only an information board at the intersection of den Ministergärten and Gertrud-Kolmar-Straße, erected by the German Government in 2006 prior to that year's FIFA (Fédération Internationale de Football Association (International Federation of Association Football)) World Cup, now marks the place on Berlin's Wilhelmstrasse 77 where once the Führerbunker was located.  The Soviet occupation forces razed the new Reich Chancellery and demolished all the bunker's above-ground structures but the subsequent GDR (Deutsche Demokratische Republik (German Democratic Republic; the old East Germany) 1949-1990) abandoned attempts completely to destroy what lay beneath.  Until after the fall of the Berlin Wall (1961-1989) the site remained unused and neglected, “re-discovered” only during excavations by property developers, the government insisting on the destruction of whatever was uncovered and, sensitive still to the spectre of “Neo-Nazi shrines”, for years the bunker’s location was never divulged, even as unremarkable buildings (an unfortunate aspect of post-unification Berlin) began to appear on the site.  Most of what would have covered the Führerbunker’s footprint is now a supermarket car park.

The first part of the complex to be built was the Vorbunker (upper bunker or forward bunker), an underground facility of reinforced concrete intended only as a temporary air-raid shelter for Hitler and his entourage in the old Reich Chancellery.  Substantially completed during 1936-1937, it was until 1943 listed in documents as the Luftschutzbunker der Reichskanzlei (Reich Chancellery Air-Raid Shelter), the Vorbunker label applied only in 1944 when the lower level (the Führerbunker proper) was appended.  In mid-January 1945, Hitler moved into the Führerbunker and, as the military situation deteriorated, his appearances above ground became less frequent until by late March he rarely saw the sky.  Finally, on 30 April, he committed suicide.

Bunker Busters

Northrop Grumman publicity shot of the B-2 Spirit from below, showing the twin bomb-bay doors through which the GBU-57s are released.

Awful as they are, there's an undeniable beauty in the engineering of some weapons and it's unfortunate humankind never collectively has resolved to devote such ingenuity exclusively to stuff other than blowing each other up.  That’s not a new sentiment, being one philosophers and others have for millennia expressed in various ways although since the advent of nuclear weapons, concerns understandably have become heightened.  Like every form of military technology ever deployed, once the “genie is out of the bottle” the problem is there to be managed and at the dawn of the atomic age, delivering a lecture in 1936, the British chemist and physicist Francis Aston (1877–1945) (who created the mass spectrograph, winning the 1922 Nobel Prize in Chemistry for his use of it to discover and identify the isotopes in many non-radioactive elements and for his enunciation of the whole number rule) observed:

There are those about us who say that such research should be stopped by law, alleging that man's destructive powers are already large enough.  So, no doubt, the more elderly and ape-like of our ancestors objected to the innovation of cooked food and pointed out the great dangers attending the use of the newly discovered agency, fire.  Personally, I think there is no doubt that sub-atomic energy is available all around us and that one day man will release and control its almost infinite power.  We cannot prevent him from doing so and can only hope that he will not use it exclusively in blowing up his next door neighbor.

The use in June 2025 by the USAF (US Air Force) of fourteen of its Boeing GBU-57 (Guided Bomb Unit-57) Massive Ordnance Penetrator (MOP) bombs against underground targets in Iran (twelve on the Fordow Uranium Enrichment Plant and two on the Natanz nuclear facility) meant “Bunker Buster” hit the headlines.  Carried by the Northrop B-2 Spirit heavy bomber (built between 1989-2000), the GBU-57 is a 14,000 kg (30,000 lb) bomb with a casing designed to withstand the stress of penetrating through layers of reinforced concrete or thick rock.  “Bunker buster” bombs have been around for a while, the ancestors of today’s devices first built for the German military early in World War II (1939-1945) and the principle remains unchanged to this day: up-scaled armor-piercing shells.  The initial purpose was to produce a weapon with a casing strong enough to withstand the forces imposed when impacting reinforced concrete structures, the idea simple in that what was needed was a delivery system which could “bust through” whatever protective layers surrounded a target, allowing the explosive charge to do damage where needed rather than wastefully being expended on an outer skin.  The German weapons proved effective but inevitably triggered an “arms race” in that as the war progressed, the concrete layers became thicker, walls over 2 metres (6.6 feet) thick and ceilings of 5 metres (16 feet) being constructed by 1943.  Technological development continued and the idea extended to rocket-propelled bombs optimized both for armor-piercing and aerodynamic efficiency, velocity a significant “mass multiplier” which made the weapons still more effective.
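
Why velocity works as a “mass multiplier” is just the kinetic-energy arithmetic: energy grows with the square of impact speed, so a rocket-boosted penetrator of unchanged mass arriving twice as fast delivers four times the energy.  A minimal sketch in Python (the mass and speeds below are purely illustrative assumptions, not figures for any actual weapon):

# Kinetic energy E = 0.5 * m * v**2: doubling impact velocity quadruples
# the energy delivered for the same mass.
def kinetic_energy_joules(mass_kg, velocity_ms):
    return 0.5 * mass_kg * velocity_ms ** 2

mass_kg = 5_000.0                # hypothetical penetrator mass (kg)
for v in (300.0, 600.0):         # free-fall vs rocket-boosted impact speed (m/s)
    print(f"{v:.0f} m/s -> {kinetic_energy_joules(mass_kg, v) / 1e6:.0f} MJ")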

USAF test-flight footage of a Northrop B-2 Spirit dropping two GBU-57 "Bunker Buster" bombs.

Concurrent with this, the British developed the first true “bunker busters”, building on the idea of the naval torpedo, one aspect of which was that, exploding a short distance from its target, it could still be highly damaging because it took advantage of one of the properties of water (quite strange stuff according to those who study it): it doesn’t compress.  What that meant was it was often the “shock wave” transmitted through the water rather than the blast itself which breached a hull, the same principle exploited by the famous “bouncing bombs” of the RAF’s “Dambuster” (Operation Chastise, 17 May 1943) raids on German dams.  Because of the way water behaved, it wasn’t necessary to score the “direct hit” which had been the ideal in the early days of aerial warfare.

RAF Bomber Command archive photograph of Avro Lancaster (built between 1941-1946) in flight with Grand Slam mounted (left) and a comparison of the Tallboy & Grand Slam (right), illustrating how the latter was in most respects a scaled-up version of the former.  To carry the big Grand Slams, 32 “B1 Special” Lancasters were in 1945 built with up-rated Rolls-Royce Merlin V12 engines, the removal of the bomb doors (the Grand Slam carried externally, its dimensions exceeding internal capacity), deleted front and mid-upper gun turrets, no radar equipment and a strengthened undercarriage.  Such was the concern with weight (especially for take-off) that just about anything non-essential was removed from the B1 Specials, even three of the four fire axes and its crew door ladder.  In the US, Boeing went through a similar exercise to produce the run of “Silverplate” B-29 Superfortresses able to carry the first A-bombs used in August, 1945. 

Best known of the British devices were the so-called “earthquake bombs”, the Tallboy (12,000 lb; 5.4 ton) & Grand Slam (22,000 lb, 10 ton) which, despite the impressive bulk, were classified by the War Office as “medium capacity”.  The terms “Medium Capacity” (MC) & “High Capacity” referenced not the gross weight or physical dimensions but the ratio of explosive filler to the total weight of the construction (ie how much was explosive compared to the casing and ancillary components).  Because both had thick casings to ensure penetration deep into hardened targets (bunkers and other structures encased in rock or reinforced concrete) before exploding, the internal dimensions accordingly were reduced compared with the ratio typical of contemporary ordnance.  A High Capacity (HC) bomb (a typical “general-purpose” bomb) had a thinner casing and a much higher proportion of explosive (sometimes over 70% of total weight).  These were intended for area bombing (known also as “carpet bombing”) and caused wide blast damage whereas the Tallboy & Grand Slam were penetrative with casings optimized for aerodynamic efficiency, their supersonic travel working as a mass-multiplier.  The Tallboy’s 5,200 lb (2.3 ton) explosive load was some 43% of its gross weight while the Grand Slam’s 9,100 lb (4 ton) absorbed 41%; this may be compared with the “big” 4,000 lb (1.8 ton) HC “Blockbuster” which allocated 75% of the gross weight to its 3,000 lb (1.4 ton) charge.  Like many things in engineering (not just in military matters) the ratio represented a trade-off, the MC design prioritizing penetrative power and structural destruction over blast radius.  The novelty of the Tallboy & Grand Slam was that as earthquake bombs, their destructive potential was able to be unleashed not necessarily by achieving a direct hit on a target but by entering the ground nearby, the explosion (1) creating an underground cavity (a camouflet) and (2) transmitting a shock-wave through the target’s foundations, leading to the structure collapsing into the newly created lacuna.
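
The charge-to-weight arithmetic is simple to check; a minimal Python sketch using only the figures quoted above (the rounded percentages match the 43%, 41% and 75% cited):

# Ratio of explosive filler to gross weight (both in lb) for the bombs above.
bombs = {
    "Tallboy (MC)":     (5_200, 12_000),
    "Grand Slam (MC)":  (9_100, 22_000),
    "Blockbuster (HC)": (3_000, 4_000),
}
for name, (charge_lb, gross_lb) in bombs.items():
    print(f"{name}: {charge_lb / gross_lb:.0%} of gross weight is explosive")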

The word camouflet has an interesting history in both French and military mining.  Originally it meant “a whiff of smoke in the face” (from a fire or pipe) and in figurative use it was a reference to a snub or slight insult (something unpleasant delivered directly to someone); although the origin is murky, it may have been related to the earlier French verb camoufler (to disguise; to mask) which evolved also into “camouflage”.  In the specialized military jargon of siege warfare or mining (sapping), over the seventeenth to nineteenth centuries “camouflet” referred to “an underground explosion that does not break the surface, but collapses enemy tunnels or fortifications by creating a subterranean void or shockwave”.  The use of this tactic is best remembered from the Western Front in World War I, some of the huge craters now tourist attractions.

Under watchful eyes: Grand Ayatollah Ali Khamenei (b 1939; Supreme Leader, Islamic Republic of Iran since 1989) delivering a speech, sitting in front of the official portrait of the republic’s ever-unsmiling founder, Grand Ayatollah Ruhollah Khomeini (1900-1989; Supreme Leader, Islamic Republic of Iran, 1979-1989).  Ayatollah Khamenei seemed in 1989 an improbable choice as Supreme Leader because others were better credentialed but though cautious and uncharismatic, he has proved a great survivor in a troubled region.

Since aerial bombing began to be used as a strategic weapon, of great interest has been the debate over BDA (battle damage assessment) and this issue emerged almost as soon as the bunker buster attack on Iran was announced, focused on the extent to which the MOPs had damaged the targets, the deepest of which were concealed deep inside a mountain.  BDA is a constantly evolving science and while satellites have made analysis of surface damage highly refined, it’s more difficult to understand what has happened deep underground.  Indeed, it wasn’t until the USSBS (United States Strategic Bombing Survey) teams toured Germany and Japan in 1945-1946, conducting interviews, economic analysis and site surveys, that a useful (and substantially accurate) understanding emerged of the effectiveness of bombing.  What technological advances have allowed for those with the resources is that the so-called “panacea targets” (ie critical infrastructure and such, once dismissed by planners because the required precision was for many reasons rarely attainable) can now accurately be targeted, the USAF able to drop a bomb within a few feet of the aiming point.  As the phrase is used by the military, the Fordow Uranium Enrichment Plant is a classic “panacea target” but whether even a technically successful strike will achieve the desired political outcome remains to be seen.

Mr Trump, in a moment of exasperation, posted on Truth Social of Iran & Israel: “We basically have two countries that have been fighting so long and so hard that they don't know what the fuck they're doing."  Actually, both know exactly WTF they're doing; it's just Mr Trump (and many others) would prefer they didn't do it.

Donald Trump (b 1946; US president 2017-2021 and since 2025) claimed “total obliteration” of the targets while Grand Ayatollah Khamenei admitted only there had been “some damage”; which is closer to the truth may one day be revealed.  Even modelling of the effects has probably been inconclusive because the deeper one goes underground, the greater the number of variables in the natural structure, and the nature of the internal built environment will also influence blast behaviour.  All experts seem to agree much damage will have been done but what can’t yet be determined is what has been suffered by the facilities which sit as deep as 80 m (260 feet) inside the mountain although, as the name implies, “bunker busters” are designed for buried targets and it’s not always required for the blast directly to reach the target.  Because the shock-wave can travel through earth & rock, the effect is something like that of an earthquake and if the structure sufficiently is affected, it may be the area is rendered geologically too unstable again to be used for its original purpose.

Within minutes of the bombing having been announced, legal academics were being interviewed (though not by Fox News) to explain why the attacks were unlawful under international law and in a sign of the times, the White House didn't bother to discuss fine legal points like the distinction between "preventive & pre-emptive strikes", preferring (like Fox News) to focus on the damage done.  However, whatever the murkiness surrounding the BDA, many analysts have concluded that even if before the attacks the Iranian authorities had not approved the creation of a nuclear weapon, this attack will have persuaded them one is essential for “regime survival”, thus the interest in both Tel Aviv and (despite denials) Washington DC in “regime change”.  The consensus seems to be Grand Ayatollah Khamenei had, prior to the strike, not ordered the creation of a nuclear weapon but that all energies were directed towards completing the preliminary steps, thus the enriching of uranium to ten times the level required for use in power generation; the ayatollah liked to keep his options open.  So, the fear of some is the attacks, even if they have (by weeks, months or years) delayed the Islamic Republic’s work on nuclear development, may prove counter-productive in that they convince the ayatollah to concur with the reasoning of every state which since 1945 has adopted an independent nuclear deterrent (IND).  That reasoning is not complex and hasn’t changed since first a prehistoric man picked up a stout stick to wave as a pre-lingual message to potential adversaries, warning them there would be consequences for aggression.  Although the Islamic Republic is a theocracy, those who command power are part of an opaque political institution and in the struggle which has for some time been conducted in anticipation of the death of the aged (and reportedly ailing) Supreme Leader, the matter of “an Iranian IND” is one of the central dynamics.  Many will be following what unfolds in Tehran and the observers will not be only in Tel Aviv and Washington DC because in the region and beyond, few things focus the mind like the thought of ayatollahs with A-Bombs.

Of the word "bust"

The Great Bust: The Depression of the Thirties (1962) by Jack Lang (left), highly qualified porn star Busty Buffy (b 1996, who has never been accused of misleading advertising, centre) and The people's champion, Mr Lang, bust of Jack Lang, painted cast plaster by an unknown artist, circa 1927, National Portrait Gallery, Canberra, Australia (right).  Remembered for a few things, Jack Lang (1876–1975; premier of the Australian state of New South Wales (NSW) 1925-1927 & 1930-1932) remains best known for having in 1932 been the first head of government in the British Empire to have been sacked by the Crown since William IV (1765–1837; King of the UK 1830-1837) in 1834 dismissed Lord Melbourne (1779–1848; prime minister of the UK 1834 & 1835-1841).

Those learning English must think it at least careless that things can both be (1) “razed to the ground” (totally to destroy something (typically a structure), usually by demolition or incineration) and (2) “raised to the sky” (physically lifted upwards).  The etymologies of “raze” and “raise” differ but they’re pronounced the same so it’s fortunate the spellings vary; in other troublesome examples of unrelated meanings, spelling and pronunciation can align, as in “bust”.  When used in the ways most directly related to human anatomy: (1) “a sculptural portrayal of a person's head and shoulders” & (2) “the circumference of a woman's chest around her breasts”, there is an etymological link but these uses wholly are unconnected with bust’s other senses.

Bust of Lindsay Lohan in white marble by Stable Diffusion.  Sculptures of just the neck and head came also to be called “busts”, the emphasis on the technique rather than the original definition.

Bust in the sense of “a sculpture of upper torso and head” dates from the 1690s and was from the sixteenth century French buste, from the Italian busto (upper body; torso), from the Latin bustum (funeral monument, tomb (although the original sense was “funeral pyre, place where corpses are burned”)) and it may have emerged (as a shortened form) from ambustum, neuter of ambustus (burned around), past participle of amburere (burn around, scorch), the construct being ambi- (around) + urere (to burn).  The alternative etymology traces a link to the Old Latin boro, the early form of the Classical Latin uro (to burn) and it’s thought the development in Italian was influenced by the Etruscan custom of keeping the ashes of the dead in an urn shaped like the person when alive.  Thus the use, common by the 1720s, of bust (a clipping from the French buste) meaning “a carving of the trunk of the human body from the chest up”.  From this came the meaning “dimension of the bosom; the measurement around a woman's body at the level of her breasts” and that evolved on the basis of a comparison with the sculptures, the base of which was described as the “bust-line”, the term still used in dress-making (and for other comparative purposes as one of the three “vital statistics” by which women are judged (bust, waist, hips), each circumference having an “ideal range”).  It’s not known when “bust” and “bust-line” came into oral use among dress-makers and related professions but it’s documented since the 1880s.  Derived forms (sometimes hyphenated) include busty (tending to bustiness, thus Busty Buffy's choice of stage-name), overbust & underbust (technical terms in women's fashion referencing specific measurements) and bustier (a tight-fitting women's top which covers (most or all of) the bust).

Benito Mussolini (1883-1945; Duce (leader) & Prime-Minister of Italy 1922-1943) standing beside his “portrait bust” (1926).

The bust was carved by Swiss sculptor Ernest Durig (1894–1962) who gained posthumous notoriety when his career as a forger was revealed with the publication of his drawings which he’d represented as being from the hand of the French sculptor Auguste Rodin (1840-1917) under whom he claimed to have studied.  Mussolini appears here in one of the subsequently much caricatured poses which were a part of his personality cult.  More than one of the Duce's counterparts in other nations was known to have made fun of some of the more outré poses and affectations, the outstretched chin, right hand braced against the hip and straddle-legged stance among the popular motifs. 

“Portrait bust” in marble (circa 1895) of Otto von Bismarck (1815-1898; chancellor of the German Empire (the "Second Reich") 1871-1890) by the German sculptor Reinhold Begas (1831-1911).

In sculpture, what had been known as the “portrait statue” came after the 1690s to be known as the “portrait bust” although both terms meant “sculpture of upper torso and head” and these proved a popular choice for military figures because the aspect enabled the inclusion of bling such as epaulettes, medals and other decorations and, being depictions of the human figure, busts came to be vested with special significance by the superstitious.  In early 1939, during construction of the new Reich Chancellery in Berlin, workmen dropped one of the busts of Otto von Bismarck by Reinhold Begas, breaking it at the neck.  For decades, the bust had sat in the old Chancellery and the building’s project manager, Albert Speer (1905–1981; Nazi court architect 1934-1942; Nazi minister of armaments and war production 1942-1945), knowing Adolf Hitler (1889-1945; Führer (leader) and German head of government 1933-1945 & head of state 1934-1945) believed the Reich Eagle toppling from the post-office building right at the beginning of World War I had been a harbinger of doom for the nation, kept the accident secret, hurriedly issuing a commission to the German sculptor Arno Breker (1900–1991) who carved an exact copy.  To give the fake the necessary patina, it was soaked for a time in strong, black tea, the porous quality of marble enabling the fluid to induce some accelerated aging.  Interestingly, in his (sometimes reliable) memoir (Erinnerungen (Memories or Reminiscences), published in English as Inside the Third Reich (1969)), even the technocratic Speer admitted of the accident: “I felt this as an evil omen”.

The other senses of bust (as a noun, verb & adjective) are diverse (and sometimes diametric opposites) and include: "to break or fail"; "to be caught doing something unlawful / illicit / disgusting etc"; "to debunk"; "dramatically or unexpectedly to succeed"; "to go broke"; "to break in (horses, girlfriends etc)"; "to assault"; the downward portion of an economic cycle (ie "boom & bust"); "the act of effecting an arrest" and "someone (especially in professional sport) who failed to perform to expectation".  That’s quite a range and that has meant the creation of dozens of idiomatic forms, the best known of which include: "boom & bust", "busted flush", "dambuster", "bunker buster", "busted arse country", "drug bust", "cloud bust", belly-busting, bust one's ass (or butt), bust a gut, bust a move, bust a nut, bust-down, bust loose, bust off, bust one's balls, bust-out, sod buster, bust the dust, myth-busting and trend-busting.  In the sense of "breaking through", bust was from the Middle English busten, a variant of bursten & bresten (to burst) and may be compared with the Low German basten & barsten (to burst).  Bust in the sense of "break", "smash", "fail", "arrest" etc was a creation of mid-nineteenth century US English and is of uncertain inspiration but most etymologists seem to concur it was likely a modification of "burst" effected with a phonetic alteration, though it’s not impossible it came directly as an imperfect echoic of Germanic speech.  The apparent contradiction of bust meaning both "fail" and "dramatically succeed" happened because the former was an allusion to "being busted" (ie broken) while the latter meaning used the notion of "busting through".

Monday, May 26, 2025

Quota

Quota (pronounced kwoh-tuh)

(1) The share or proportional part of a total that is required from, or is due or belongs to a particular district, state, person, group etc.

(2) A proportional part or share of a fixed total amount or quantity.

(3) The number or percentage of persons or things of a specified kind permitted (to enrol in an institution, join a club, immigrate to a country, be imported etc).

1660–1670: From the Medieval Latin, a clipping of the Latin quota pars (a percentage of yield owed to the authority as a form of taxation; in the New Latin, a quota, a proportional part or share; the share or proportion assigned to each in a division), from quotus (which?; what number?; how many?; how few?), from quot (how many?; as many as; how much?), from the Proto-Italic kwot, from the primitive Indo-European kwóti, the adverb from kwos & kwís; it was cognate with the Ancient Greek πόσος (pósos) and the Sanskrit कति (kati).  In English, until 1921 the only known uses of “quota” appear to be in the context of the Latin form, use spiking in the years after World War I (1914-1918) when “import quotas” were a quick and simple form of regulating the newly resumed international trade.  Quota is a noun; the noun plural is quotas.

Google ngram: Because of the way Google harvests data for their ngrams, they’re not literally a tracking of the use of a word in society but can be usefully indicative of certain trends (although one is never quite sure which trend(s)), especially over decades.  As a record of actual aggregate use, ngrams are not wholly reliable because: (1) the sub-set of texts Google uses is slanted towards the scientific & academic and (2) the technical limitations imposed by the use of OCR (optical character recognition) when handling older texts of sometimes dubious legibility (a process AI should improve).  Where numbers bounce around, this may reflect either: (1) peaks and troughs in use for some reason or (2) some quirk in the data harvested.

Being something imposed by those in authority, quotas attract work-arounds and imaginative techniques of avoidance & evasion.  The terms which emerged included (1) quota-hopping (the registration of a business, vehicle, vessel etc in another jurisdiction in order to benefit from its quota), (2) quota quickie (historically, a class of low-cost films commissioned to satisfy the quota requirements of the UK’s Cinematograph Films Act (1927), a protectionist scheme imposed to stimulate the moribund local industry; the system widely was rorted and achieved little before being repealed by the Films Act (1960), although modern historians of film have a fondness for the quota quickies which are a recognizable genre of cultural significance with a certain period charm) and (3) quota refugee (a refugee relocated by the office of the UNHCR (United Nations High Commissioner for Refugees) to a country other than the one in which they sought asylum, in accord with the relevant UN quotas).

South Park's Eric Cartman (left) and Token (now Tolkien) Black (right).

The writers of the animated TV series South Park (1997) (made with the technique DCAS (digital cutout animation style), a computerized implementation of the original CAS (cutout animation style) in which physical paper or cardboard objects were (by hand) moved (still images later joined or the hands edited-out if filmed); the digital process deliberately emulates the jerky, 2D (two-dimensional) effect of the original CAS) had their usual fun with the idea of a DEI (diversity, equity, inclusion) quota as “tokenism” with the creation of the character Token Black (ie the “token black character” among the substantially white ensemble).  However, in 2022, some 300 episodes into the series, the character was retconned to become “Tolkien Black”, the story-line being he was named after JRR Tolkien (1892–1973), author of the children’s fantasy stories The Hobbit (1937) & The Lord of the Rings trilogy (1954-1955).  Retconning (the full form being “retroactive continuity”) is a literary device (widely (and sometimes carelessly) used in many forms of pop culture) in which previously-established facts in a fictional work are in some way changed (to the point even of eradication or contradiction).  This is done for many reasons which can be artistic, a reaction to changing public attitudes, administrative convenience or mere commercial advantage.  What South Park’s producers did was comprehensively retrospective in that the back-catalogue was also updated, extending even to the sub-titles, something like the “unpersoning” processes under Comrade Joseph Stalin (1878-1953; Soviet leader 1924-1953) or the painstaking “correcting” of the historic record undertaken by Winston Smith in George Orwell’s (1903-1950) Nineteen Eighty-Four (1949).  Undertaken during the high-point of the BLM (Black Lives Matter) movement, the change did attract comment and most seemed to regard it as an attempt to remove a possible trigger for protest but there was also the argument there may have been concern the use of the given name “Token” might be interpreted as a comment on the sometimes inventive spellings used by African-American parents.  While the use of “Token” as a comment on “white racism” was acceptable, an allusion to the racial stereotyping implicit in the spelling would be classified as at least a microaggression and probably white racism in action.

Gracious Quotes have aggregated Lindsay Lohan’s top ten quotes.

The English word quote (pronounced kwoht) was related to quota by a connection with the Latin quot.  It is used variously: (1) to repeat or use (a passage, phrase etc.) from a book, speech or such, (2) to enclose (words) within quotation marks or (3) to state a price.  It dates from the mid-fourteenth century and was from the Middle English coten & quoten (to mark a text with chapter numbers or marginal references), from the Old French coter, from the Medieval Latin quotāre (to divide into chapters and verses), from the Latin quot (how many) and related to quis (who).  The use evolved from the sense of “to give as a reference, to cite as an authority” to, by the late seventeenth century, meaning “to copy out exact words”.  The use in commerce (“to state the price of a commodity or service”) dates from the 1860s and was a revival of the etymological meaning from the Latin, the noun in this context in use by at least 1885.

In Australian politics, there have long been “informal” quotas.  Although Roman Catholics have in recent years infiltrated the Liberal Party (in numbers which suggest a “take-over” can’t be far off), there was a time when their presence in the party was rare and Sir Neil O'Sullivan (1900–1968), who between 1949-1958 sat in several cabinets under Sir Robert Menzies (1894–1978; prime-minister of Australia 1939-1941 & 1949-1966), noted wryly that as the ministry’s “designated Roman Catholic”, he “wore the badge of his whole race”.  That was of course an “unofficial” (though for years well-enforced) quota but the concept appears to this day to persist, including in the ALP (Australian Labor Party) which, long past its “White Australia” days, is now more sensitive than some to DEI.  However, the subtleties of reconciling the ALP’s intricate factional arrangements with the need simultaneously to maintain (again unofficial) quotas preserving the delicate business of identity politics seem to have occasional unexpected consequences.  In the first cabinet of Anthony Albanese (b 1963; prime-minister of Australia since 2022), there was one “designated Jew” (Mark Alfred Dreyfus (b 1956)).  Mark Dreyfus’s middle name is “Alfred” which is of course striking but there is no known genealogical connection between him and Alfred Dreyfus (1859–1935), the French Jewish army officer at the centre of the infamous Dreyfus affair (1894-1906).  The surname Dreyfus is not uncommon among European Jews and exists most frequently in families of Alsatian origin although the Australian’s father was a Jewish refugee from Nazi Germany.  Having apparently outlived his ethnic usefulness, Dreyfus fell victim to the factional axe and was dumped from the ministry, some conspiracy theorists pondering whether the ALP might have liked the “optics” of expelling a Jew while the party’s reaction to the war in Gaza was being criticized by Muslim commentators.

Smiles all round.  Official photograph of the new ALP ministry, Canberra, Australia, June 2022. 

The cabinet also had one “designated Muslim” (Edham Nurredin “Ed” Husic (b 1970)), notable for being the first Muslim elected to federal parliament and thus the first to serve in a ministry.  That had an obviously pleasing multi-cultural symmetry but for a number of reasons the ALP achieved a remarkably successful result in the 2025 election and that complicated things because radically it changed the balance of numbers within the party’s right wing, the relativities between the New South Wales (NSW) and Victorian factions becoming significantly distorted relative to their presence in the ministry.  While the ALP is often (correctly) described as “tribal”, it’s really an aggregation of tribes, split between the right, left and some notionally non-aligned members, those alliances overlaid by each individual’s dependence on their relevant state or territory branch.  The system always existed but after the 1960s became institutionalized and it’s now difficult to imagine the ALP working without the formalized (each with its own letterhead) factional framework, for without it the results would be unpredictable; as all those who claimed the Lebanese state would be a better place were the influence of the Hezbollah to be eliminated or at least diminished are about to discover, such changes can make things worse.

However, the 2025 election delivered the ALP a substantial majority but what was of interest to the political junkies was that the breakdown in numbers made it obvious the NSW right-wing was over-represented in the ministry, compared to the Victorian right.  What that meant was that someone from NSW had to be sacrificed and that turned out to be Mr Husic, replaced as the cabinet’s designated Muslim by Dr Anne Aly from Western Australia’s Labor Left.  To many, that aspect seemed culturally insensitive.  To be replaced as designated Muslim might by Mr Husic have been accepted as just a typical ALP factional power play (a reasonable view given it was the faction which put him in the ministry in the first place) had he been replaced by a man but to be replaced by a Muslim woman must have been a humiliation and one wonders if the factional power-brokers have done their “cultural awareness training”, something the party has been anxious to impose on the rest of the country.  Mr Husic’s relegation to the less remunerative back-bench is said to have been engineered by Deputy Prime Minister Richard Marles (b 1967) of the Victorian Right Faction and his role wasn’t ignored when Mr Husic was interviewed on national television, informing the country: “I think when people look at a deputy prime minister, they expect to see a statesman, not a factional assassin.”  Given the conduct & character of some previous holders of the office, it’s not clear why Mr Husic would believe Australians would think this but, in the circumstances, his bitterness was understandable.  Somewhat optimistically, Mr Husic added: “There will be a lot of questions put to Richard about his role.  And that's something that he will have to answer and account for.”  In an act of kindness, the interviewer didn’t trouble to tell his interlocutor: (1) Those aware of Mr Marles’ role in such matters don’t need it explained and (2) those not aware don’t care.

Richard Marles (right) assessing Ed Husic’s (left) interscapular region.

When Mr Marles was interviewed, he was asked if he thought he had “blood on his hands”, the same question which more than forty years earlier had been put to Bob Hawke (1929–2019; Prime Minister of Australia 1983-1991) who had just (on the eve of a general election) assumed the ALP leadership after the “factional assassins” had pole-axed the hapless Bill Hayden (1933–2023; ALP leader 1977-1983) after the latter’s earnest but ineffectual half decade as leader of Her Majesty’s loyal opposition.  Mr Hawke, not then fully house-trained by the pre-modern ALP machine, didn’t react well but to Mr Marles it seemed water off a duck’s back and he responded: “I don't accept that, these are collective processes... they are obviously difficult processes.  But, at the end of the day we need to go through the process of choosing a ministry in the context of there being a lot of talented people who can perform the role.”  Unfortunately, Mr Marles declined to discuss the secret factional manoeuvring which led to Mr Husic being sacrificed, the speculation including Dr Aly being thought better value because she could not only be the cabinet’s designated Muslim but also boost the female numbers in the body, a matter of some sensitivity given how many women had joined the ALP caucus, many of them unexpectedly winning electorates to which they’d gained pre-selection only because the factional power-brokers considered them unwinnable.

Still, to be fair to Mr Marles, his anodyne non-answers were a master-class in composition and delivery: “There are so many people who would be able to admirably perform the role of ministers who are not ministers.  What I would say is I'm really confident about the ministry that has been chosen and the way in which it's going to perform on behalf of the Australian people.  But in the same breath, I'd also very much acknowledge the contribution that Ed Husic has made and for that matter, that Mark Dreyfus has made.  Both have made a huge contribution to this country in the time that they have served as ministers. I am grateful for that.”  Whether or not he believed his gratitude would be appreciated, Mr Marles was emphatic about his faction maintaining its Masonic-like cloak of secrecy, concluding his answer by saying: “I'm not about to go into the detail of how those processes unfold.  I've not spoken about those processes in the past obviously and I'm not about to talk about them now.”  It’s a shame politicians don’t think their parties should be as “transparent” as the standard they often attempt to impose on others because Mr Marles discussing the plotting & scheming of factional machinations would be more interesting than most of what gets recited at his press conferences.

Although the most publicized barbs exchanged by politicians are inter-party, they tend to be derivative, predictable or scripted and much more fun are the spur-of-the-moment intra-party insults.  Presumably, intra-faction stuff might be juicier still but the leaks from that juicer are better sealed which is a shame because the ALP has a solid history in such things. 

Bill Hayden, not having forgotten the part played in his earlier axing as party leader by Barrie Unsworth (b 1934; Premier of NSW 1986-1988), observed of him: “…were you the sort of person who liked the simple pleasures in life, such as tearing the wings off butterflies, then Barrie Unsworth was the man for you.”  Hayden did not escape critiquing either, the man who deposed him (Bob Hawke) describing him in the run-up to the coup as “A lying cunt with a limited future.”  Another ALP leader (Gough Whitlam (1916–2014; prime minister of Australia 1972-1975)) had a way with words, complaining to Charlie Jones (1917-2003): “You’re the transport minister, but every time you open your mouth, things go into reverse.”  Nor did Whitlam restrict his invective to individuals, once complaining of some of his colleagues: “I can only say we've just got rid of the '36 faceless men' stigma to be faced with the 12 witless men.”  The twelve were members of the ALP’s federal executive who in 1966 were poised to engineer Whitlam’s removal as deputy leader of the opposition and would have, had he not out-maneuvered them.

Sydney Daily Telegraph 22 March 1963 (left) and Liberal Party campaign pamphlet for 1963 federal election (right).

Dating from 1963, the phrase “36 faceless men” (one of whom was the token woman, the ALP having quotas even then) described the members of the ALP’s federal conference which, at the time, wrote the party platform, handing it to the politicians to execute.  The term came to public attention when a photograph appeared on a newspaper’s front page showing Whitlam and Arthur Calwell (1896-1973; ALP leader 1960-1967) standing outside the hotel where the 36 were meeting, waiting to be invited in to be told what their policies were to be.  The conservative government used to great effect the claim the ALP was ruled by “36 faceless men”.  In the 2010s, there was a revival when there were several defenestrations of prime-ministers & premiers by factional operators who did their stuff, mostly in secret, through back-channel deals and political thuggery.  In an untypically brief & succinct address, Dr Kevin Rudd (b 1957; Prime-Minister of Australia 2007-2010 & 2013) at the time summed up his feelings for his disloyal colleagues: “In recent days, Minister Crean [Simon Crean (1949–2023; ALP leader 2001-2003)] and a number of other faceless men have publicly attacked my integrity and therefore my fitness to serve as a minister in the government.... I deeply believe that if the Australian Labor Party, a party of which I have been a proud member for more than 30 years, is to have the best future for our nation, then it must change fundamentally its culture and to end the power of faceless men. Australia must be governed by the people, not by the factions.”  Otherwise mostly forgotten, Simon Crean and his followers are remembered as “Simon and the Creanites”, a coining by Peter Costello (b 1957; Treasurer of Australia, 1996-2007) who re-purposed “Creanites” from an earlier use by Paul Keating (b 1944; Prime Minister of Australia 1991-1996).