Showing posts with label Physics. Show all posts

Wednesday, August 27, 2025

Quartervent

Quartervent (pronounced kwawr-ter-vent)

A small, pivoted, framed (or semi-framed) pane in the front or rear side-windows of a car, provided to optimize ventilation.

1930s: The construct was quarter + vent.  Dating from the late thirteenth century, the noun quarter (in its numerical sense) was from the Middle English quarter, from the Anglo-Norman quarter, from the Old French quartier, from the Latin quartārius (a Roman unit of liquid measure equivalent to about 0.14 litre), from quartus (fourth).  Quartus was from the primitive Indo-European kweturtos (four) (from which the Ancient Greek gained τέταρτος (tétartos), the Sanskrit चतुर्थ (caturtha), the Proto-Balto-Slavic ketwirtas and the Proto-Germanic fedurþô).  It was cognate to quadrus (square), drawn from the sense of “four-sided”.  The Latin suffix -arius was from the earlier -ās-(i)jo-, the construct being -āso- (from the primitive Indo-European -ehso-, which may be compared with the Hittite appurtenance suffix -ašša-) + the relational adjectival suffix -yós (belonging to).  The suffix (the feminine -āria, the neuter -ārium) was a first/second-declension suffix used to form adjectives from nouns or numerals.  The nominative neuter form -ārium (when appended to nouns) formed derivative nouns denoting a “place where stuff was kept”.  The Middle English verb quarteren was derivative of the noun.  Dating from the mid fourteenth century, vent was from the Middle English verb venten (to furnish (a vessel) with a vent), a shortened form of the Old French esventer (the construct being es- + -venter), a verbal derivative of vent, from the Latin ventus (wind), in later use derivative of the English noun.  The English noun was derived partly from the French vent, partly by a shortening of the French évent (from the Old French esvent, a derivative of esventer) and partly from the English verb.  The hyphenated form quarter-vent is also used and may be preferable.  Quarter-vent is a noun; the noun plural is quarter-vents.
In use, the act of using the function provided by a quarter-vent obviously can be described with terms like “quarter-venting” or “quarter-vented” but no derived forms are recognized as standard.

1959 Cadillac Eldorado Biarritz.

Like almost all US passenger cars, the post-war Cadillacs all had quarter-vents (“vent windows” or “ventiplanes” to the Americans) and on the most expensive in the range they were controlled by an electric motor, a feature optional on the lesser models.  This was a time when the company's slogan “Standard of the World” really could be taken seriously.  In 1969, with General Motors (GM) phasing in flow-through ventilation, Cadillac deleted the quarter-vents, meaning purchasers no longer had to decide whether to pay the additional cost to have them electrically activated (a US$71.60 option on the 1968 Calais and De Ville).  GM's early implementation of flow-through ventilation was patchy, so the change was arguably premature, but by 1969 the system was perfected and became as good as the company's air-conditioning (A-C), famous since the 1950s for its icy blast.

The now close-to-extinct quarter-vents were small, pivoted, framed (or semi-framed) panes of glass installed in the front or rear side windows of a car or truck; their purpose was to provide occupants with a source of ventilation, using the air-flow of the vehicle while in motion.  The system had all the attributes of other admirable technologies (such as the pencil) in that it was cheap to produce, simple to use, reliable and effective in its intended purpose.  Although not a complex concept, GM in 1932 couldn’t resist giving the things an impressively long name, calling them “No Draft Individually Controlled Ventilation” (NDICV being one of history’s less mnemonic initializations).  GM’s marketing types must have prevailed because eventually the snappier “ventiplanes” was adopted, the same process of rationality which overtook Chrysler in 1969 when the public decided “shaker” was a punchier name for their rather sexy scoop which, attached directly to the induction system and protruding through a carefully shaped lacuna in the hood (bonnet), shook with the engine, delighting the males aged 17-39 to whom it was intended to appeal.  “Shaker” supplanted Chrysler’s original “Incredible Quivering Exposed Cold Air Grabber” (IQECAG, another dud); sometimes less is more.  Adolf Hitler (1889-1945; Führer (leader) and German head of government 1933-1945 & head of state 1934-1945) suggested a good title for his book might be Viereinhalb Jahre [des Kampfes] gegen Lüge, Dummheit und Feigheit (Four and a Half Years of Struggle against Lies, Stupidity and Cowardice) but his publisher thought that a bit ponderous and preferred the more succinct Mein Kampf: Eine Abrechnung (My Struggle: A Reckoning) and for publication even that was clipped to Mein Kampf.
Unfortunately, the revised title was the best thing about it: the style and contents were truly ghastly, the text long and repetitious, the ideas within easily able to be reduced to a few dozen pages (some suggest fewer, though the historical examples cited for context do require some space).

The baroque meets mid-century modernism: 1954 Hudson Italia by Carrozzeria Touring.  

Given how well the things worked, there’s long been some regret at their demise, a process which began in the 1960s with the development of “through-flow ventilation”, the earliest implementation of which seems to have appeared in the Hudson Italia (1954-1955), an exclusive, two-door coupé co-developed by Hudson in Detroit and the Milan-based Italian coachbuilder Carrozzeria Touring.  Although some of the styling gimmicks perhaps haven’t aged well, the package was more restrained than some extravagances of the era and fundamentally the lines were well-balanced and elegant.  Unfortunately the mechanical underpinnings were uninspiring and the trans-Atlantic production process (even though Italian unit-labor costs were lower than in the US, Touring’s methods were labor-intensive) involved two-way shipping (the platforms sent to Milan for bodies and then returned to the US), so the Italia was uncompetitively expensive: at a time when the bigger and more capable Cadillac Coupe de Ville listed at US$3,995, the Italia was offered for US$4,800 and while it certainly had exclusivity, it was a time when there was still a magic attached to the Cadillac name and of the planned run of 50, only 26 Italias were produced (including the prototype).  Of those, 21 are known still to exist and they’re a fixture at the concours d’élégance (a sort of car show for the rich, the term an un-adapted borrowing from the French, literally “competition of elegance”) and on the auction circuit where they’re exchanged between collectors for several hundred-thousand dollars per sale.  Although a commercial failure (and the Hudson name would soon disappear), the Italia does enjoy the footnote of being the first production car equipped with what came to be understood as “flow-through ventilation”, provided with a cowl air intake and extraction grooves at the top of the rear windows, the company claiming the air inside an Italia changed completely every ten minutes.
For the quarter-vent, flow-through ventilation was a death-knell although some lingered on until the effective standardization of A-C proved the final nail in the coffin.

1965 Ford Cortina GT with eyeball vents and quarter-vents.

The car which really legitimized flow-through ventilation was the first generation (1962-1966) of the Ford Cortina, produced over four generations (some claim it was five) by Ford’s UK subsidiary between 1962 and 1982.  When the revised model was displayed at the Earls Court Motor Show in October 1964, something much emphasized was the new “Aeroflow”, Ford’s name for through-flow ventilation, the system implemented with “eyeball” vents on the dashboard and extractor vents on the rear pillars.  Eyeball vents probably are the best way to do through-flow ventilation but the accountants came to work out they were more expensive to install than the alternatives so less satisfactory devices came to be used.  Other manufacturers soon phased in similar systems, many coining their own marketing trademarks including “Silent-Flow-Ventilation”, “Astro-Ventilation” and the inevitable “Flow-thru ventilation”.  For the Cortina, Ford took a “belt & braces” approach to ventilation, retaining the quarter-vents even after the “eyeballs” were added, apparently because (1) the cost of re-tooling to use a single pane for the window was actually higher than continuing to use the quarter-vents, (2) it wasn’t clear whether there would be general public acceptance of their deletion and (3) smoking rates were still high and drivers were known to like being able to flick the ash out via the quarter-vent (and, more regrettably, the butts too).  Before long, the designers found a way economically to replace the quarter-vents with “quarter-panes” or “quarter-lights” (a fixed piece of glass with no opening mechanism) so early Cortinas were built with both, although in markets where temperatures tended to be higher (notably South Africa and Australia), the hinged quarter-vents remained standard equipment.  When the Mark III Cortina (TC, 1970-1976) was released, the separate panes in any form were deleted and the side glass was a single pane.

Fluid dynamics in action: GM's Astro-Ventilation.

So logically a “quarter-vent” would describe a device with a hinge so it could be opened to provide ventilation while a “quarter-pane”, “quarter-light” or “quarter-glass” would be something in the same shape but unhinged and thus fixed.  It didn’t work out that way and the terms tended to be used interchangeably (though presumably “quarter-vent” was most applied to those with the functionality).  However, the mere existence of the fixed panes does raise the question of why they exist at all.  In the case of rear doors, they were sometimes a necessity because the shape of the door was dictated by the intrusion of the wheel arch and adding a quarter-pane was the only way to ensure the window could completely be wound down.  With the front doors, the economics were sometimes compelling, especially in cases when the opening vents were optional, but there were also instances where the door’s internal mechanisms (the door opening & window-winding hardware) were so bulky the only way to make everything fit was to reduce the size of the window.  In some cases, manufacturers “solved” the problem by making the rear side glass fixed, which lowered their costs, but it was never popular with customers.

1976 Volkswagen Passat B1 (1973-1980 (1988 in Brazil)) without quarter-vents, the front & rear quarter-panes fixed.

The proliferation of terms could have come in handy if the industry had decided to standardize and the first generation Volkswagen Passat (1973-1980) was illustrative of how they might have been used.  The early Passats were then unusual in that the four-door versions had five separate pieces of side glass and, reading from left to right, they could have been classified thus: (1) a front quarter-pane, (2) a front side-window, (3) a rear side-window, (4) a rear quarter-pane and (5) a quarter-window.  The Passat was one of those vehicles which used the quarter-panes as an engineering necessity to permit the rear side-window fully to be lowered.  However the industry didn’t standardize and in the pre-television (and certainly pre-internet) age when language tended to evolve with greater regional variation, not even quarter-glass, quarter-vent, quarter-window & quarter-pane were enough and the things were known variously also as a “fly window”, “valence window”, “triangle window” and (possibly annoying architects) “auto-transom”, the hyphen used and not.

PA Vauxhall Velox (1957-1962): 1959 (left) and 1960 (right).  The one-piece rear window was introduced as a running-change in late 1959.

Before flow-through ventilation systems and long before A-C became ubiquitous, quarter-vents were the industry standard for providing airflow to car interiors and it was common for them to be fitted on both front and rear doors; frequently, the rear units were fixed quarter-panes (the lowering of the side window thing).  A special type of fixed quarter-pane comprised those used with rear windows, originally an economic imperative because initially it was too expensive to fabricate one-piece glass to suit the “wrap-around” styles becoming popular.  Improved manufacturing techniques let the US industry by the early 1950s overcome the limitations but elsewhere, the multi-piece fittings would continue to be used for more than a decade.

1957 Mercury Turnpike Cruiser (left), details of the apparatuses above the windscreen (centre) and the Breezeway rear window lowered (right).

The 1957 Mercury Turnpike Cruiser was notable for (1) the truly memorable model name, (2) introducing the “Breezeway” rear window which could be lowered and (3) having a truly bizarre arrangement of “features” above the windscreen.  Unfortunately, the pair of “radio aerials” protruding from the pods at the top of the Mercury’s A-pillars were a mere affectation, a “jet-age” motif decorating what were actually air-intakes.

Brochure for 1957 Mercury Turnpike Cruiser promoting, inter-alia, the Breezeway retractable rear window.

A three-piece construction was however adopted as part of the engineering for the “Breezeway”, a retractable rear window introduced in 1957 on the Mercury Turnpike Cruiser.  It was at the time novel and generated a lot of publicity but the concept would have been familiar to those driving the many roadsters and other convertibles which had “zip-out” rear Perspex screens, allowing the soft-top to remain erected while the rear was open.  Combined with the car’s quarter-vents, what this did was create the same fluid dynamics as flow-through ventilation.  The way Mercury made the retractable glass work was to section the window into a flat centre section (some 80% of the total width), flanked by a pair of fixed quarter-panes.  After the run in 1957-1959, the feature was resurrected for use on certain Mercury Montclairs, Montereys and Park Lanes.

1958 (Lincoln) Continental Mark III Convertible (with Breezeway window).  The platform was unitary (ie no traditional chassis) which with modern techniques easily was achievable on the sedans and coupes but the convertible required so much additional strengthening (often achieved by welding-in angle iron) that a Mark III Convertible, fueled and with four occupants, weighed in excess of 6000 lb (2720 kg). 

Ford must have been much taken with the feature because it appeared also on the gargantuan “Mark” versions of the (Lincoln) Continentals of 1958, 1959 & 1960, dubbed respectively Mark III, IV & V, designations Ford shamelessly would begin to recycle in 1969 because the corporation wanted the new Mark III to be associated with the old, classic Continental Mark II (1956-1957) rather than the succeeding bloated trio.  The “Breezeway” Lincolns also featured a reverse-slanted rear window, something which would spread not only to the Mercurys of the 1960s but also the English Ford Anglia (105E, 1959-1968) and Consul Classic (1961-1963) although only the US cars ever had the retractable glass.  The severe roofline was used even on the convertible Continentals, made possible by them sharing the rear window mechanism used on the sedan & coupé, modified only to the extent of being retractable into a rear compartment.

1974 Lincoln Continental Town Car with mini vents.

In the 1970s Lincoln introduced the novelty of “mini-vents” which raised and lowered separately from the main side-glass.  Smoking was at the time socially acceptable (in some circles it must have appeared obligatory) and there was a lot of it about, so engineers devoting time to finding a better way for those wanting to “flick ash out the window” while running the A-C wasn’t surprising.  Those visualizing a “flick” in process might be surprised such a thing existed because if tried in a modern vehicle, its shape honed in wind tunnels and computer simulations, what would likely happen would be “blowback”.  That’s because the shape is aerodynamically efficient (with a “buffer zone” very close to the surface) and disrupting that by lowering a window shifts the inside pressure from positive to negative, the ash thus being “sucked in”.  However, on something like a 1974 Lincoln Continental (which conceptually can be imagined as one brick sitting atop two), the buffer zone can (depending on speed) extend as much as 3 feet (close to a metre) from the body.  That meant ash was flicked into the “buffer zone” and didn’t end up back in the cabin.  The vents didn’t last (another casualty of the quest for lower drag) but as late as 1985 they appeared as a US$72 extra and were known in the industry as the “smoker's option”.
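To give a rough sense of the pressure differentials behind “blowback”, a minimal sketch of the dynamic pressure q = ½ρv² at a few road speeds (the air density and the speeds chosen are standard textbook values, not figures from the article):

```python
# Illustrative sketch only: dynamic pressure q = 0.5 * rho * v^2 at a few
# road speeds, the quantity governing how hard air pushes (or sucks) at an
# opened window.  Density and speeds are textbook values, not from the text.

RHO_AIR = 1.225  # kg/m^3, sea-level standard air density

def dynamic_pressure(speed_kmh: float) -> float:
    """Return dynamic pressure (pascals) for a road speed given in km/h."""
    v = speed_kmh / 3.6  # convert km/h to m/s
    return 0.5 * RHO_AIR * v ** 2

for speed_kmh in (60, 100, 130):
    print(f"{speed_kmh} km/h -> {dynamic_pressure(speed_kmh):.0f} Pa")
```

Because q rises with the square of speed, a small shift between positive and negative cabin pressure at highway speed is easily enough to pull flicked ash back inside.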

1967 Chevrolet Camaro 327 Convertible with vent windows (left), 1969 Chevrolet Camaro ZL1 without vent windows (centre) and Lindsay Lohan (b 1986) & Jamie Lee Curtis (b 1958) in a 1969 Chevrolet Camaro Convertible during filming of the remake of Freaky Friday (2003), Los Angeles, August 2024.  Freakier Friday is slated for release in August 2025.

Through Chevrolet's COPO (Central Office Production Order) system, 69 of the 1969 Camaros were built with the ZL1, an all-aluminum version of the 427 cubic inch (7.0 litre) big-block V8.  The COPO had been established as an efficient way to coordinate the production of fleet orders (law enforcement agencies, utility companies etc) for runs of vehicles in a certain specification but the drag racing community and others worked out it could be used also as a “back-door” way to order small runs of cars with otherwise unavailable high-performance engines.  The Freakier Friday Camaro (badged as a 396 SS but several were used during filming including at least one with a roll-over bar for the stunt work) lacks the vent windows which were deleted from the range after 1967 when “Astro-Ventilation” (GM’s name for flow-through ventilation) was added.  In North American use, the devices typically are referred to as “vent windows” while a “quarter light” is a small lamp mounted (in pairs) in the lower section of the front bodywork and a “quarter-vent” is some sort of (real or fake) vent installed somewhere on the quarter panels.  As flow-through ventilation became standardized and A-C installation rates rose, Detroit abandoned the quarter-vent, which pleased the industry because it eliminated both parts and labor, lowering the cost of production (the savings absorbed as profits rather than being passed to the customers).  On the small, cheap Ford Pinto (1971-1980), removing the feature saved a reported US$2.16 per unit but, being small and cheap, A-C rarely was ordered by Pinto buyers, which was probably a good thing because, laboring under the 1970s burdens of emission controls, the weight of impact-resistant bumper bars and often an automatic transmission, a Pinto was lethargic enough without adding a power-sapping A-C compressor and plumbing.
Responding (after some years of high inflation) to dealer feedback about enquiries from Pinto customers indicating an interest in the return of vents, Ford's cost-accountants calculated the unit cost of the restoration would be some US$17.

Ford Australia’s early advertising copy for the XA Falcon range included publicity shots both with and without the optional quarter-vents (left) although all sedans & station wagons had the non-opening, rear quarter-panes, fitted so the side window completely could be lowered.  One quirk of the campaign was the first shot released (right) of the “hero model” of the range (the Falcon GT) had the driver’s side quarter-vent airbrushed out (how “Photoshop jobs” used to be done), presumably because it was thought to clutter a well-composed picture.  Unfortunately, the artist neglected to defenestrate the one on the passenger’s side.

Released in Australia in March 1972, Ford’s XA Falcon was the first in the lineage to include through-flow ventilation, the previously standard quarter-vent windows moved to the option list (as RPO (Regular Production Option) 86).  Because Australia often is a hot place and many Falcons were bought by rural customers, Ford expected a high take-up rate of RPO 86 (it was a time when A-C was expensive and rarely ordered) so the vent window hardware was stockpiled in anticipation.  However, the option didn’t prove popular but with a warehouse full of the parts, it remained available on the subsequent XB (1973-1976) and XC (1976-1979) although the take-up rate never rose, less than 1% of each range being so equipped, and when the XD (1979-1983) was introduced, there was no such option; this continued on all subsequent Falcons until Ford ceased production in Australia in 2016, by which time A-C was standard equipment.

Great moments in tabloid journalism: Sydney's Sun-Herald, Sunday 25 June, 1972.  The Sun-Herald was then part of the Fairfax group, proving Rupert Murdoch (b 1931) can't be blamed for everything.

The infrequency with which RPO 86 was ordered has been little noted by history but on one car with the option the fixtures did become an element which enabled an owner to claim the coveted “one-of-one” status.  In August 1973, near the end of the XA’s run, with no fanfare, Ford built about 250 Falcons with RPO 83, a bundle which included many of the parts intended for use on the stillborn GTHO Phase IV, cancelled (after four had been built) in 1972 after a newspaper generated one of its moral panics, this time about the “160 mph super cars” it was claimed the local manufacturers were about to unleash and sell to males aged 17-25.  Actually, none of them were quite that fast but not often has the tabloid press been too troubled by facts and the fuss spooked the politicians (it's seldom difficult to render a “minister horrified”).  Under pressure, Holden cancelled the LJ Torana V8, Ford the GTHO Phase IV and Chrysler reconfigured its E55 Charger 340 as a luxury coupé, available only with an automatic transmission and no high-performance modifications.

The “quarter-vent XA RPO 83 GT”: 1973 Ford Falcon XA GT sedan (Body Identification: 54H; Model Code: 18238) in Calypso Green (code J) with Onyx Black (code B) accents over Black Vinyl (Code B) with 351 4V V8 (Code T) and four-speed manual transmission (Code L).  It’s the only one produced with both RPO 83 (a (variably fitted) bundle of parts left-over from the aborted GTHO Phase IV project) and RPO 86 (front quarter-vent windows).  In the collector market they're referred to usually as “the RPO83 cars”.

So in 1973 Ford's warehouse still contained all the parts which were to have been fitted to the GTHO Phase IV so it could be homologated for competition and although the rules for racing had been changed to ensure there was no longer any need to produce small batches of “160 mph (257 km/h) super cars”, Ford still wanted to be able to use the heavy-duty bits and pieces in competition so quietly conjured up RPO 83 and fitted the bundle on the assembly line, most of the cars not earmarked for allocation to racing teams sold as “standard” Falcon GTs.  Actually, it’s more correct to say “bundles” because while in aggregate the number of the parts installed was sufficient to fulfil the demands of homologation, not all the RPO 83 GTs received all parts so what a buyer got really was “luck of the draw”; with nobody being charged extra for RPO 83, Ford didn’t pay too much attention to the details of the installations and many who purchased one had no idea the parts had been included, the manual choke's knob the only visually obvious clue.  Ford made no attempt to publicize the existence of RPO 83, lest the tabloids run another headline.  It’s certain 250 RPO 83 cars were built (130 four-door sedans & 120 two-door Hardtops) but some sources say the breakdown was 131 / 121 while others claim an additional nine sedans were completed.  Being a genuine RPO 83 car, the Calypso Green GT attracts a premium and while being the only RPO 83 car with quarter-vent windows is not of any great significance, it does permit the prized “one-of-one” claim; not even any of the four GTHO Phase IVs built (three of which survive) had them.  In the collector market, “one-of-one” status can be worth a lot of money (such as a one-off convertible in a run of coupés) but a Falcon’s quarter-vents are only a curiosity.

The Bathurst 1000 winning RPO83 Falcon GTs, 1973 (left) & 1974 (right).

All else being equal, what makes one RPO83 more desirable than another is whether it was factory-fitted with all of the option's notional inventory and most coveted are the ones with four-wheel disk brakes.  Because the project was focused on the annual endurance event at Bathurst's high-speed Mount Panorama circuit, the disks were as significant as an additional 50 horsepower and a few weeks before the RPO 83 run they'd already been fitted to the first batch of Landaus, which were Falcon Hardtops gorped-up (what bling used to be called) with hidden headlights, lashings of leather, faux woodgrain and a padded vinyl roof, all markers of distinction in the 1970s and, unusually, there was also a 24 hour analogue clock.  Essentially a short wheelbase, two-door LTD (which structurally was a Falcon with the wheelbase stretched 10 inches (250 mm) to 121 (3075 mm)), the Landau was not intended for racetracks but because it shared a body shell and much of the running gear with the Falcon GT Hardtops, Ford claimed Landau production counted towards homologation of the rear disks.  Fearing that might be at least a moot point, a batch was installed also on some of the RPO83 cars and duly the configuration appeared at Bathurst for the 1973 event, their presence of even greater significance because that was the year the country switched from imperial measures to metric, prompting the race organizers to lengthen the race from 500 miles (804 km) to 625 miles (1,000 km), the Bathurst 500 thus becoming the Bathurst 1000.  RPO83 Falcon GTs won the 1973 & 1974 Bathurst 1000s.

The “quarter-vent XB GT”: 1973 Ford Falcon XB GT sedan (Body Identification: 54H; Model Code: 18338) in Polar White (Code 3) with Onyx Black (code B) accents over Parchment Vinyl (Code P) with 351C 4V V8 (Code T) and four-speed manual transmission (Code L).  The only one produced with RPO 86 (front quarter-vent windows).

So with a large stock sitting in the warehouse, despite the dismally low take-up rate, the quarter-vents remained available when the XB Falcon (1973-1976) range was released and of the 1,952 XB GT sedans sold (there were also 949 two-door Hardtops) a single buyer ticked the RPO 86 box.  Again, although granting the coveted “one-of-one” status, it’s not something of great significance although the car to which the pair of vents was fitted is one of the more desirable XB GTs because it was one of the 139 XB GTs built with the combination of the “4V Big Port” 351 V8 and four-speed Top Loader manual transmission.  The first 211 XB GTs received the fully-imported 351 Clevelands, “using up” what was in stock, subsequent models switching to the locally made variant.

US Built 351C-4V in 1973 XB Falcon GT.

Ford Australia had been importing from the US the high-performance 351C-4V (4 venturi, ie four-barrel carburetor) V8 for use in the GT but when advised US production of that configuration was ending, the decision was taken to produce a local “high-performance” version of the 351 using the 351C 2V “small port” cylinder heads with “open” combustion chambers and a four-barrel carburetor; Ford Australia only ever manufactured the “small port” heads.  That means the Australian nomenclature “351C-4V” (small ports & four-barrel carburetor) differs in meaning from that used in the US, where it translated to “big ports & four-barrel carburetor”.  It sounded a retrogressive step and while there was some sacrifice in top-end power, the antipodean combo turned out to be ideal for street use because the fluid dynamics of the flow rate through the smaller ports made for better low and mid-range torque (most useful for what most drivers do most of the time) whereas the big-port heads really were optimized for full-throttle operation, something often done on race tracks but rarely on public roads… even in the Australia of the early 1970s.  Still, some did miss the responsiveness of the high-compression US-built engine, even if the difference was really apparent only above 80 mph (130 km/h).
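The small-port advantage comes down to the continuity relation v = Q/A: for the same volumetric flow demand, a smaller port passes the charge at higher velocity, which aids mixture motion and cylinder filling at low engine speeds.  A minimal sketch (the port areas and flow figure below are invented round numbers for illustration, not Ford data):

```python
# Illustrative sketch only: continuity relation v = Q / A for intake ports.
# At the same flow demand, the smaller port yields higher gas velocity,
# the effect credited with the small-port heads' better low-end torque.
# Areas and flow are assumed round numbers, not measurements of any 351C head.

def port_velocity(flow_m3_s: float, port_area_m2: float) -> float:
    """Mean gas velocity (m/s) through a port of given cross-sectional area."""
    return flow_m3_s / port_area_m2

flow = 0.05           # m^3/s, assumed demand at a modest engine speed
small_port = 0.0015   # m^2, assumed "2V"-style port area
big_port = 0.0022     # m^2, assumed "4V"-style port area

print(f"small port: {port_velocity(flow, small_port):.1f} m/s")
print(f"big port:   {port_velocity(flow, big_port):.1f} m/s")
```

At full throttle the relation reverses in practice: the big port's larger area stops it choking the flow at high engine speeds, which is why those heads suited race tracks.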

The other ceremony which happened in Australia on 11 November, 1975: Ford Australia's photo shoot, Melbourne, Victoria.

Although only 2,901 XB GTs were produced, as the “halo” model it was an important image-maker and the XB range proved successful with almost 212,000 sold over its 34-month life (over 18 months in a generally more buoyant economy, XA production had reached over 129,000).  Stylistically, the XB was an improvement over the poorly detailed XA and much was made (among Ford's claimed 2,056 changes from the XA) of the headlights' high-beam activation shifting from a foot-operated button to a steering column stalk which, thirty-odd years on from the achievement of nuclear fission, doesn’t sound like much, but motoring journalists had for years been advocating for “a headlight flasher”, having been impressed by the “safety feature” when being “flashed” on the German Autobahns by something about to pass at high speed.  More welcome still were the GT’s four-wheel disk brakes, acknowledged as being as good as any then in volume production.  The success of the XB coincided with Ford Australia’s two millionth vehicle leaving the assembly line so on Tuesday 11 November 1975, Ford’s public relations office invited journalists and camera crews to a ceremony to mark the occasion, laying on the usual catering (including free cigarettes!) to ensure a good attendance.

Ford Australia pre-release publicity shot for the XB range release (embargoed until 15 September 1973).

1973 Ford Falcon XB GT Hardtop (Body Identification: 65H; Model Code: 18318) in Yellow Blaze (Code M) with Onyx Black (code B) accents over Black Vinyl (Code B) with 351C 4V V8 (Code T) and three-speed T-Bar automatic transmission (Code B).  Because the various side windows used by the Hardtop, Ute and Panel Van derivatives were different to fit the door and roof shapes, the quarter-vents were never offered on those and RPO 86 on the Hardtops was the dreaded vinyl roof in tan.  The sunroof (RPO 10) was a rarely (168 Falcons and 244 Fairmonts) specified option.

Unfortunately, the pictures of the dutifully polished XB Fairmont (a Falcon with some gorp) sedan didn’t generate the publicity expected because the next editions of the daily newspapers (there were then a lot of those and they sold in big numbers) had a more sensational story to cover: on that Tuesday Sir John Kerr (1914–1991; governor-general of Australia 1974-1977) had dismissed from office Gough Whitlam (1916–2014; prime minister of Australia 1972-1975) and his troubled administration.  It was the first time the Crown had sacked a prime minister since William IV (1765–1837; King of the UK 1830-1837) in 1834 dismissed Lord Melbourne (1779–1848; prime minister of the UK 1834 & 1835-1841) and although in 1932 Sir Philip Game (1876–1961; governor of NSW 1930-1935) had sundered the commission of Jack Lang (1876–1975; premier of New South Wales 1925-1927 & 1930-1932), most Australians who pondered such things believed the days of meddling viceroys were done.  Sir John however proved the royal prerogative still existed (although paradoxically perhaps now only in the hands of a monarch’s representative rather than the monarch’s own) and the footnote in the history of Australian manufacturing passed almost unnoticed.

Monday, August 4, 2025

Exposome

Exposome (pronounced eks-poh-sohm)

(1) A concept describing both the environmental exposures an individual encounters throughout life and how these factors impact an individual's biology and health.

(2) The collection of environmental factors (stress, diet, climate, health-care etc) to which an individual is exposed and which can have an effect on health outcomes.

2005: The construct was expos(e) +‎ -ome, the word coined by cancer epidemiologist Dr Christopher Wild, then director of the International Agency for Research on Cancer (IARC).  Expose (in the sense of “to lay open to danger, attack, harm etc”; “to lay open to something specified”) dates from the mid-fifteenth century and was from the late Middle English exposen, from the Middle French exposer (to lay open, set forth), from the Latin expōnō (set forth), with contamination from poser (to lay, place).  The –ome suffix was an alteration of -oma, from the Ancient Greek -ωμα (-ōma).  It was only partially cognate to -some (body), from σῶμα (soma) (body), in that both share the case ending -μα (-ma), but the ω was unrelated.  The sense was of “a mass of something” and use is familiar in forms such as genome (in genetics, the complete genetic information (DNA (deoxyribonucleic acid) or RNA (ribonucleic acid)) of an organism) and phenome (the whole set of phenotypic entities in a cell, tissue, organ, organism or species).  Exposome is a noun and exposomic is an adjective; the noun plural is exposomes.

The study and assessment of the external and internal factors (chemical, physical, biological, social, climatic etc) that may influence human health is not new and evidence of interest in the topic exists in the literature of physicians and philosophers (there was sometimes overlap) from the ancient civilizations of Greece, Rome, China, Persia and India.  One of the paradoxes of modernity in medicine was that simultaneously there developed an interest in (1) interdisciplinary and holistic approaches while (2) specialization became increasingly entrenched, the latter sometimes leading to a “siloing” of research and data accumulation.  What makes the exposome a useful tool is that it expresses the interplay between genetics and environmental factors in the development of diseases with a particular focus on chronic conditions and widely the concept has been applied in many fields of medicine beyond public health.  What it does is calculate the cumulative effect of multiple exposures, allowing researchers to “scope-down” to specific or general gene-environment interactions, producing data to permit a more accurate assessment of disease risk and thus the identification of useful modes of intervention.

Dr Wild’s coining of exposome came about because some word or phrase was needed to describe his innovation: the application of a systematic approach to measuring environmental exposures to complement what was coming to be known about the human genome; in a sense it was an exercise in cause and effect, the three components being (1) the external exposome, (2) the internal exposome and (3) the biological response.  The external exposome included general factors such as air pollution, diet and socioeconomic status as well as specific external factors like chemicals and radiation.  The internal exposome included endogenous factors such as hormones, inflammation, oxidative stress and gut microbiota.  The biological response described the complex interactions between the external and internal exposome factors and their influence on an individual’s physiology and health.

At its most comprehensive (and complex), the exposome is a cumulative measure of all environmental exposures to which an individual has been subject throughout their entire life.  While that’s something that can be modelled for an “imagined person”, in a real-world instance it will probably always be only partially complete, not least because in some cases critical environmental exposures may not be known until long after their effect has been exerted; indeed, some may be revealed only by an autopsy (post mortem).  Conceptually however, the process can be illustrated by example and one illustrative of the approach is to contrast the factors affecting the same individual living in three different places.  What that approach does is emphasize certain obvious differences between places but variations in an exposome don’t depend on the samples being taken in locations thousands of miles apart.  For a variety of reasons, the same individual might record a radically different outcome if (in theory) living their entire life in one suburb compared with one adjacent, or even in one room of one dwelling compared with another perhaps only a few feet away.  Conditions can be similar across a wide geographical spread or different despite close proximity (even between people sitting within speaking distance), the phenomenon of “micro-climates” in open-plan offices being well documented.  The number of variables which can usefully be used to calculate (estimate might be a better word) an individual’s (or a group’s) exposome is probably at least in the dozens but could easily be expanded well into three figures were one to itemize influences (such as chemicals or specific types of pollutant matter) and such is the complexity of the process that the mere existence of some factors might be detrimental to some individuals yet neutral or even beneficial to others.  
At this stage, although the implications of applying AI (artificial intelligence) to the interaction of large data sets with an individual’s genetic mix have intrigued some, the exposome remains an indicative conceptual model rather than a defined process.
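The arithmetic behind “calculating the cumulative effect of multiple exposures” can be sketched in a few lines of Python.  This is a purely hypothetical toy model, not any published exposome methodology: every factor name, exposure level and weight below is invented for illustration, including the convention that negative weights mark protective factors.

```python
# A toy "exposome score": a weighted sum over whichever exposure
# factors happen to be known for an individual.  All names and
# numbers are hypothetical, chosen only to illustrate the idea that
# the same factor can be harmful (positive weight) or protective
# (negative weight).

def exposome_score(exposures, weights):
    """Cumulative score over the factors actually recorded; unknown
    factors simply contribute nothing (weight defaults to 0)."""
    return sum(weights.get(k, 0.0) * v for k, v in exposures.items())

# Hypothetical exposure levels (0 = none, 1 = high) for one individual
# living in two different cities:
nyc   = {"pm2_5": 0.6, "noise": 0.7, "walkability": 0.9, "uv": 0.4}
dubai = {"pm10": 0.8, "heat_stress": 0.9, "walkability": 0.1, "uv": 0.9}

# Hypothetical weights: positive = detrimental, negative = protective
weights = {"pm2_5": 1.0, "pm10": 0.8, "noise": 0.5,
           "heat_stress": 1.2, "uv": 0.6, "walkability": -0.7}

print("NYC:  ", round(exposome_score(nyc, weights), 2))
print("Dubai:", round(exposome_score(dubai, weights), 2))
```

Even this crude sketch captures the point made above: the score is only ever as complete as the set of recorded exposures, and expanding the factor list “well into three figures” changes nothing structurally, only the size of the two dictionaries.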

As an example, consider the same individual living variously in New York City, Dubai or Los Angeles.  In each of those places, some factors will be universal within the locality while others will vary according to which part of the place one inhabits and even at what elevation at the same address; the physical environment on a building’s ground floor greatly can vary from that which prevails on the 44th floor:

Lindsay Lohan in New York City in pastel yellow & black bouclé tweed mini-dress.  Maintaining an ideal BMI (body mass index) is a positive factor in one’s exposome. 

(1) Air Quality and Pollution: Moderate to high levels of air pollution, especially from traffic (NO₂, PM2.5). Seasonal heating (oil and gas) contributes in winter.  Subway air has unique particulate matter exposure.

(2) Climate and UV Radiation: Humid continental climate—cold winters and hot summers. Seasonal variability affects respiratory and cardiovascular stressors.

(3) Diet and Food Environment: Diverse food options—high availability of ultra-processed foods but also global cuisines. Food deserts in poorer boroughs can reduce fresh produce access.

(4) Built Environment and Urban Design: Dense, walkable, vertical urban environment. High reliance on public transport; more noise pollution and crowding stress.  Lower car ownership can reduce personal emissions exposure.

(5) Cultural and Psychosocial Stressors: High-paced lifestyle, long working hours. High density increases social stress, noise, and mental health challenges.  Diversity can be enriching or alienating, depending on context.

(6) Economic and Occupational Exposures: Highly competitive job market. Occupational exposures vary widely—white-collar vs service industries. Union protections exist in some sectors.

(7) Healthcare Access and Public Policy: Robust healthcare infrastructure, but disparities remain by borough and income. Medicaid and public hospitals provide some safety net.

Lindsay Lohan in Dubai in J.Lo flamingo pink velour tracksuit.  A healthy diet and regular exercise are factors in one's exposome. 

(1) Air Quality and Pollution: Frequently exposed to dust storms (fine desert dust), high PM10 levels, and air conditioning pollutants. Limited greenery means less natural air filtration.  Desalination plants and industrial expansion add further exposure.

(2) Climate and UV Radiation: Extreme desert heat (45°C+), intense UV exposure, little rain. Heat stress and dehydration risks are chronic, especially for outdoor workers.

(3) Diet and Food Environment: High import dependency. Abundant processed and fast foods, especially in malls. Dietary pattern skewed toward high sugar and fat content.  Cultural fasting (eg Ramadan) introduces cyclical dietary stressors.

(4) Built Environment and Urban Design: Car-centric city. Pedestrian-unfriendly in many areas due to heat and design. Heavy air conditioning use is a major indoor exposure pathway.

(5) Cultural and Psychosocial Stressors: Strict social codes and legal restrictions influence behavioral exposures. Expat life often means social disconnection and job insecurity for migrant workers.

(6) Economic and Occupational Exposures: Large migrant workforce faces occupational health risks, including long hours in extreme heat. Labor protections are inconsistent.

(7) Healthcare Access and Public Policy: Healthcare access stratified—good for citizens and wealthy expats, less so for low-wage migrants. Private sector dominates.

Lindsay Lohan in Los Angeles in 2005 Mercedes-Benz SL65 AMG Roadster (R230, 2002-2011; SL65 2005-2011).  Smoking is a factor in one’s exposome.

(1) Air Quality and Pollution: Known for smog due to vehicle emissions and topography (valley trap). Ozone levels high, especially in summer. Wildfire smoke increasingly common.

(2) Climate and UV Radiation: Mediterranean climate with mild, dry summers. High UV exposure, though moderated by coastal influence. Drought conditions affect water quality and stress.

(3) Diet and Food Environment: Strong health-food culture, organic and plant-based diets more common. Yet fast food and food deserts remain in less affluent areas.  Hispanic and Asian dietary influences prominent.

(4) Built Environment and Urban Design: Sprawling, suburban in many parts. High car dependence means more exposure to vehicle exhaust.  Outdoor activities more common in certain demographics (eg, beach culture).

(5) Cultural and Psychosocial Stressors: Cultural emphasis on appearance, wealth, and entertainment may increase psychosocial pressure.  Homelessness crisis also creates variable community stress exposures.

(6) Economic and Occupational Exposures: Gig economy widespread, leading to precarious employment. Hollywood and tech industries also introduce unique workplace stress patterns.

(7) Healthcare Access and Public Policy: California’s public health programs are progressive, but uninsured rates still high. Proximity to cutting-edge research centers can boost care quality for some.

So one's exposome is a product of what one wants or gets from life, mapped onto a risk analysis table.  In New York City, one copes with urban pollution and persistent subway dust in an increasingly variable climate marked by periods of high humidity, a dietary range determined by one's wealth, the advantage of a good (if not always pleasant) mass transit system and the possibility of a “walking distance” lifestyle, albeit it in usually crowded, fast-paced surroundings.  Employment conditions are mixed and access to quality health care is a product of one's insurance status or wealth.

In Dubai, one lives with frequent dust storms, months of intense heat and UV exposure, a dependence on food imports and the constant temptation of fast food (FSS; fat, salt, sugar).  The car-centric lifestyle has created a built environment described as “pedestrian-hostile” and there are sometimes severe legal limits on personal freedom, especially for migrant workers, who are subject to heat exposure and limited labor rights (even those which exist often are not enforced).  The health system distinctly is tiered (based on wealth) and almost exclusively privatized.

The air quality in Los Angeles greatly has improved since the 1970s but climate change has resulted in the more frequent intrusion of smoke from wildfires and the prevailing UV exposure tends to be high; the climate is not as “mild” as once it was rated.  While there are pockets in which walkability is good, Los Angeles mostly is a car-dependent culture and the coverage and frequency of mass-transit has in recent decades declined.  Although this is not unique to the city, there’s a heightened sensitivity to cultural pressures based on appearance and perceptions of lifestyle, while housing stress is increasing.  Economic pressures are being exacerbated by the growth of the gig economy and traditionally secure forms of employment are being displaced by AI (bots, robots and hybrids).  Although California’s healthcare system is sometimes described as “progressive”, on the ground, outcomes are patchy.

So each location shapes the exposome in distinctive ways and the potential exists for the process better to be modelled so public health interventions and policies can be adjusted.  Of course, some risks are global: anywhere on the planet there’s always the chance one might be murdered by the Freemasons but some things which might seem unlikely to be affected by location turn out also to be an exposome variable.  Because planet Earth (1) is roughly spherical, (2) travels through space (where concepts like up & down don’t apply) and (3) constantly is exposed to meteoroids (every day Earth receives tons of “space dust”), it would be reasonable to assume one is equally likely to be struck by a meteoroid wherever one may be.  However, according to NASA (the US National Aeronautics and Space Administration), strikes are not equally likely everywhere, some latitudes (and regions) being more prone, due to several factors:

(1) Because Earth’s rotation and orbital motion create a bias, meteoroids tend more often to approach from the direction of Earth’s orbital motion (the “apex direction”), meaning the leading hemisphere (the side facing Earth's motion, near the dawn terminator) sees more meteoroid entries than the trailing hemisphere.  On a global scale, the effect is small but is measurable with the risk increasing as one approaches the equatorial regions where rotational velocity is greatest.

(2) Because most meteoroids approach from near the plane of the Solar System (the ecliptic plane), there’s what NASA calls a “latitude distribution bias”: Earth’s equator being tilted only some 23.5° from the ecliptic, meteoroids are more likely to intersect Earth’s atmosphere near lower latitudes (the tropical & sub-tropical zones) than near the poles.  So, those wishing to lower their risk should try to live in the Arctic or Antarctic although those suffering chronic kosmikophobia (fear of cosmic phenomena) are likely already residents.

(3) Some 70% of the Earth’s surface area being seas and oceans, statistically most meteoroids land in the water rather than on land, so the lesson is clear: avoid living at sea.  The calculated probability is of course just math; because sparsely populated deserts preserve meteorites well (erosion there being low), a large number have been found in places like the Sahara and outback Australia but those numbers reflect a preservation bias and don’t necessarily confirm a higher strike rate.  The lesson from the statisticians is: don’t dismiss the notion of living in a desert because of a fear of being struck by a meteoroid.

(4) Gravitational focusing, although it does increase Earth’s meteoroid capture rates (disproportionately so for objects travelling more slowly), is a global effect so there is no known locational bias.  While there is at least one documented case of a person being struck by a meteoroid, the evidence does suggest the risk is too low to be statistically significant and should thus not be factored into the calculation of one’s exposome because one is anywhere at greater risk of being murdered by the Freemasons.

Ms Ann Hodges with bruise, Alabama, November 1954.  Painful though it would have been, she did get her 15 minutes of fame and eventually sold the fragment for US$25 so there was that.

In the narrow technical sense, many people have been struck by objects from space (an estimated 40+ tons of the stuff arrives every day) but most fragments are dust particles, too small to be noticed.  The only scientifically verified injury a person has suffered was an impressively large bruise inflicted on 30 November 1954 on Ms Ann Hodges (1920-1972) of Sylacauga, Alabama in the US by a meteorite (the part of a meteoroid which survives its fiery passage through the atmosphere to land on Earth’s surface).  Weighing 7.9 lb (3.6 kg), the intruder crashed through the roof of her house and bounced off a radio, striking her as she enjoyed a nap on the sofa.  The meteorite was named Sylacauga and, just as appropriately, the offending piece was dubbed the Hodges Fragment.  Anatomically modern humans (AMH) have been walking the planet for perhaps 300,000 years and we’ve been (more or less) behaviorally modern (BMH) for maybe a quarter of that so it’s possible many more of us have been struck.  In the absence of records, while it’s impossible to be definitive, it’s likely more have been murdered by the Freemasons than have ever been killed by stuff falling from space although, as the history of species extinction illustrates, a direct hit on someone is not a prerequisite for dire consequences.

Dashcam footage of meteorite fragment in the sky over Lexington, South Carolina.

The cosmic intruder crashed through the roof of a house on 26 June 2025 and although there were no injuries, Fox News reported the fragment left a hole in the floor “about the size of a large cherry tomato”.  Analysis determined the rock was from the asteroid belt between Mars and Jupiter and as well as producing the dramatic fireball many captured on their dashcams, it would briefly have broken the sound barrier as it entered Earth’s atmosphere.  It was also very old, dating from slightly before the formation of the Solar System’s rocky inner planets (one of which is Earth) some 4.56 billion years ago and such fragments are of interest to many branches of science because they represent a small part of the “basic building blocks” of those planets and can thus assist in understanding the processes active during the Solar System’s earliest days.  Curiously (to those not trained in such things), the cosmologists explained “such a small fragment didn’t present a threat to anyone” which seems strange given its impact left a small crater in a floor, one implication being one wouldn’t wish for such a thing to hit one’s skull.  That the impact happened in Georgia, a state adjacent to Alabama where seven decades earlier the unfortunate Ms Hodges was struck, may make some add “meteorite fragments” to their list of exposome factors south of the Mason-Dixon Line but the sample size is too small for conclusions to be drawn and the events are mere geographic coincidences.

Monday, June 30, 2025

Bunker

Bunker (pronounced buhng-ker)

(1) A large bin or receptacle; a fixed chest or box.

(2) In military use, historically a fortification set mostly below the surface of the ground with overhead protection provided by logs and earth or by concrete and fitted with above-ground embrasures through which guns may be fired.

(3) A fortification set mostly below the surface of the ground and used for a variety of purposes.

(4) In golf, an obstacle, classically a sand trap but sometimes a mound of dirt, constituting a hazard.

(5) In nautical use, to provide fuel for a vessel.

(6) In nautical use, to convey bulk cargo (except grain) from a vessel to an adjacent storehouse.

(7) In golf, to hit a ball into a bunker.

(8) To equip with or as if with bunkers.

(9) In military use, to place personnel or materiel in a bunker or bunkers (sometimes as “bunker down”).

1755–1760: From the Scottish bonkar (box, chest (also “seat” in the sense of “bench”)), of obscure origin, although etymologists conclude the use related to furniture hints at a relationship with banker (bench).  Alternatively, it may be from a Scandinavian source such as the Old Swedish bunke (boards used to protect the cargo of a ship).  The meaning “receptacle for coal aboard a ship” was in use by at least 1839 (coal-burning steamships coming into general use in the 1820s).  The use to describe the obstacles on golf courses is documented from 1824 (probably from the extended sense “earthen seat” which dates from 1805) but perhaps surprisingly, the familiar sense from military use (dug-out fortification) seems not to have appeared before World War I (1914-1918) although the structures so described had for millennia existed.  “Bunkermate” was army slang for the individual with whom one shares a bunker while the now obsolete “bunkerman” (“bunkermen” the plural) referred to someone (often the man in charge) who worked at an industrial coal storage bunker.  Bunker & bunkerage are nouns, bunkering is a noun & verb, bunkered is a verb and bunkerish, bunkeresque, bunkerless & bunkerlike are adjectives; the noun plural is bunkers.

Just as ships called “coalers” were used to transport coal to and from shore-based “coal stations”, it was “oilers” which took oil to storage tanks or out to sea to refuel ships (a common naval procedure) and these STS (ship-to-ship) transfers were called “bunkering” as the black stuff was pumped, bunker-to-bunker.  That the coal used by steamships was stored on-board in compartments called “coal bunkers” led ultimately to another derived term: “bunker oil”.  When in the late nineteenth century ships began the transition from being fuelled by coal to burning oil, the receptacles of course became “oil bunkers” (among sailors nearly always clipped to “bunker”) and as refining processes evolved, the fuel specifically produced for oceangoing ships came to be called “bunker oil”.

Bunker oil is “dirty stuff”, a highly viscous, heavy fuel oil which is essentially the residue of crude oil refining; it’s that which remains after the more refined and volatile products (gasoline (petrol), kerosene, diesel etc) have been extracted.  Until late in the twentieth century, the orthodox view of economists was its use in big ships was a good thing because it was a product for which industry had little other use and, as essentially a by-product, it was relatively cheap.  It came in three flavours: (1) Bunker A: light fuel oil (similar to a heavy diesel), (2) Bunker B: an oil of intermediate viscosity used in engines larger than marine diesels but smaller than those used in the big ships and (3) Bunker C: heavy fuel oil used in container ships and such, which use VLD (very large displacement), slow-running engines with a huge reciprocating mass.  Because of its composition, Bunker C especially produced much pollution and although much of this happened at sea (unseen by most but with obvious implications), when ships reached harbor to dock, all the smoke and soot became obvious.  Over the years, the worst of the pollution from the burning of bunker oil greatly has been reduced (the work underway even before the Greta Thunberg (b 2003) era), sometimes by the simple expedient of spraying a mist of water through the smoke.

Floor-plans of the upper (Vorbunker) and lower (Führerbunker) levels of the structure now commonly referred to collectively as the Führerbunker.

History’s most infamous bunker remains the Berlin Führerbunker in which Adolf Hitler (1889-1945; Führer (leader) and German head of government 1933-1945 & head of state 1934-1945) spent much of the last few months of his life.  In the architectural sense there were a number of Führerbunkers built, one at each of the semi-permanent Führerhauptquartiere (Führer Headquarters) created for the German military campaigns and several others built where required but it’s the one in Berlin which is remembered as “the Führerbunker”.  Before 1944 when the intensification of the air raids by the RAF (Royal Air Force) and USAAF (US Army Air Force) the term Führerbunker seems rarely to have been used other than by the architects and others involved in their construction and it wasn’t a designation like Führerhauptquartiere which the military and other institutions of state shifted between locations (rather as “Air Force One” is attached not to a specific airframe but whatever aircraft in which the US president is travelling).  In subsequent historical writing, the term Führerbunker tends often to be applied to the whole, two-level complex in Berlin and although it was only the lower layer which officially was designated as that, for most purposes the distinction is not significant.  In military documents, after January, 1945 the Führerbunker was referred to as Führerhauptquartiere.

Führerbunker tourist information board, Berlin, Germany.

Only an information board at the intersection of den Ministergärten and Gertrud-Kolmar-Straße, erected by the German Government in 2006 prior to that year's FIFA (Fédération Internationale de Football Association (International Federation of Association Football)) World Cup, now marks the place on Berlin's Wilhelmstrasse 77 where once the Führerbunker was located.  The Soviet occupation forces razed the new Reich Chancellery and demolished all the bunker's above-ground structures but the subsequent GDR (Deutsche Demokratische Republik (German Democratic Republic; the old East Germany) 1949-1990) abandoned attempts completely to destroy what lay beneath.  Until after the fall of the Berlin Wall (1961-1989) the site remained unused and neglected, “re-discovered” only during excavations by property developers, the government insisting on the destruction of whatever was uncovered and, sensitive still to the spectre of “Neo-Nazi shrines”, for years the bunker’s location was never divulged, even as unremarkable buildings (an unfortunate aspect of post-unification Berlin) began to appear on the site.  Most of what would have covered the Führerbunker’s footprint is now a supermarket car park.

The first part of the complex to be built was the Vorbunker (upper bunker or forward bunker), an underground facility of reinforced concrete intended only as a temporary air-raid shelter for Hitler and his entourage in the old Reich Chancellery.  Substantially completed during 1936-1937, it was until 1943 listed in documents as the Luftschutzbunker der Reichskanzlei (Reich Chancellery Air-Raid Shelter), the Vorbunker label applied only in 1944 when the lower level (the Führerbunker proper) was appended.  In mid January, 1945, Hitler moved into the Führerbunker and, as the military situation deteriorated, his appearances above ground became less frequent until by late March he rarely saw the sky.  Finally, on 30 April, he committed suicide.

Bunker Busters

Northrop Grumman publicity shot of B2-Spirit from below, showing the twin bomb-bay doors through which the GBU-57 are released.

Awful as they are, there's an undeniable beauty in the engineering of some weapons and it's unfortunate humankind never collectively has resolved exclusively to devote such ingenuity to stuff other than us blowing up each other.  That’s not a new sentiment, being one philosophers and others have for millennia expressed in various ways although since the advent of nuclear weapons, concerns understandably became heightened.  Like every form of military technology ever deployed, once the “genie is out of the bottle” the problem is there to be managed and at the dawn of the atomic age, delivering a lecture in 1936, the British chemist and physicist Francis Aston (1877–1945) (who created the mass spectrograph, winning the 1922 Nobel Prize in Chemistry for his use of it to discover and identify the isotopes in many non-radioactive elements and for his enunciation of the whole number rule) observed:

There are those about us who say that such research should be stopped by law, alleging that man's destructive powers are already large enough.  So, no doubt, the more elderly and ape-like of our ancestors objected to the innovation of cooked food and pointed out the great dangers attending the use of the newly discovered agency, fire.  Personally, I think there is no doubt that sub-atomic energy is available all around us and that one day man will release and control its almost infinite power.  We cannot prevent him from doing so and can only hope that he will not use it exclusively in blowing up his next door neighbor.

The use in June 2025 by the USAF (US Air Force) of fourteen of its Boeing GBU-57 (Guided Bomb Unit-57) Massive Ordnance Penetrator (MOP) bombs against underground targets in Iran (twelve on the Fordow Uranium Enrichment Plant and two on the Natanz nuclear facility) meant “Bunker Buster” hit the headlines.  Carried by the Northrop B-2 Spirit heavy bomber (built between 1989-2000), the GBU-57 is a 14,000 kg (30,000 lb) bomb with a casing designed to withstand the stress of penetrating through layers of reinforced concrete or thick rock.  “Bunker buster” bombs have been around for a while, the ancestors of today’s devices first built for the German military early in World War II (1939-1945) and the principle remains unchanged to this day: up-scaled armor-piercing shells.  The initial purpose was to produce a weapon with a casing strong enough to withstand the forces imposed when impacting reinforced concrete structures, the idea simple in that what was needed was a delivery system which could “bust through” whatever protective layers surrounded a target, allowing the explosive charge to do damage where needed rather than wastefully being expended on an outer skin.  The German weapons proved effective but inevitably triggered an “arms race” in that as the war progressed, the concrete layers became thicker, walls over 2 metres (6.6 feet) thick and ceilings of 5 metres (16 feet) being constructed by 1943.  Technological development continued and the idea extended to rocket-propelled bombs optimized both for armor-piercing and aerodynamic efficiency, velocity a significant “mass multiplier” which made the weapons still more effective.

USAF test-flight footage of Northrop B2-Spirit dropping two GBU-57 "Bunker Buster" bombs.

Concurrent with this, the British developed the first true “bunker busters”, building on the idea of the naval torpedo, one aspect of which was that, by exploding a short distance from its target, it could be highly damaging because it took advantage of one of the properties of water (quite strange stuff according to those who study it): it doesn’t compress.  What that meant was it was often the “shock wave” through the water rather than the blast itself which could breach a hull, the same principle used for the famous “bouncing bombs” of the RAF’s “Dambuster” raids (Operation Chastise, 17 May 1943) on German dams.  Because of the way water behaved, it wasn’t necessary to score the “direct hit” which had been the ideal in the early days of aerial warfare.

RAF Bomber Command archive photograph of Avro Lancaster (built between 1941-1946) in flight with Grand Slam mounted (left) and a comparison of the Tallboy & Grand Slam (right), illustrating how the latter was in most respects a scaled-up version of the former.  To carry the big Grand Slams, 32 “B1 Special” Lancasters were in 1945 built with up-rated Rolls-Royce Merlin V12 engines, the removal of the bomb doors (the Grand Slam carried externally, its dimensions exceeding internal capacity), deleted front and mid-upper gun turrets, no radar equipment and a strengthened undercarriage.  Such was the concern with weight (especially for take-off) that just about anything non-essential was removed from the B1 Specials, even three of the four fire axes and its crew door ladder.  In the US, Boeing went through a similar exercise to produce the run of “Silverplate” B-29 Superfortresses able to carry the first A-bombs used in August, 1945. 

Best known of the British devices were the so-called “earthquake bombs”, the Tallboy (12,000 lb; 5.4 ton) & Grand Slam (22,000 lb; 10 ton) which, despite the impressive bulk, were classified by the War Office as “medium capacity”.  The terms “Medium Capacity” (MC) & “High Capacity” (HC) referenced not the gross weight or physical dimensions but the ratio of explosive filler to the total weight of the construction (ie how much was explosive compared to the casing and ancillary components).  Because both had thick casings to ensure penetration deep into hardened targets (bunkers and other structures encased in rock or reinforced concrete) before exploding, the internal dimensions accordingly were reduced compared with the ratio typical of contemporary ordnance.  A High Capacity (HC) bomb (a typical “general-purpose” bomb) had a thinner casing and a much higher proportion of explosive (sometimes over 70% of total weight).  These were intended for area bombing (known also as “carpet bombing”) and caused wide blast damage whereas the Tallboy & Grand Slam were penetrative, with casings optimized for aerodynamic efficiency, their supersonic travel working as a mass-multiplier.  The Tallboy’s 5,200 lb (2.3 ton) explosive load was some 43% of its gross weight while the Grand Slam’s 9,100 lb (4 ton) absorbed 41%; this may be compared with the “big” 4,000 lb (1.8 ton) HC “Blockbuster” which allocated 75% of its gross weight to its 3,000 lb (1.4 ton) charge.  Like many things in engineering (not just in military matters) the ratio represented a trade-off, the MC design prioritizing penetrative power and structural destruction over blast radius.  
The novelty of the Tallboy & Grand Slam was that, as earthquake bombs, their destructive potential could be unleashed not necessarily by achieving a direct hit on a target but by entering the ground nearby, the explosion (1) creating an underground cavity (a camouflet) and (2) transmitting a shock-wave through the target’s foundations, leading to the structure collapsing into the newly created lacuna. 

The word camouflet has an interesting history in both French and military mining.  Originally it meant “a whiff of smoke in the face” (from a fire or pipe) and in figurative use it was a reference to a snub or slight insult (something unpleasant delivered directly to someone); the origin is murky but it may be related to the earlier French verb camoufler (to disguise; to mask) which evolved also into “camouflage”.  In the specialized military jargon of siege warfare and mining (sapping), over the seventeenth to nineteenth centuries “camouflet” referred to “an underground explosion that does not break the surface, but collapses enemy tunnels or fortifications by creating a subterranean void or shockwave”.  The use of this tactic is best remembered from the Western Front in World War I, some of the huge craters now tourist attractions.

Under watchful eyes: Grand Ayatollah Ali Khamenei (b 1939; Supreme Leader, Islamic Republic of Iran since 1989) delivering a speech, sitting in front of the official portrait of the republic’s ever-unsmiling founder, Grand Ayatollah Ruhollah Khomeini (1900-1989; Supreme Leader, Islamic Republic of Iran, 1979-1989).  Ayatollah Khamenei seemed in 1989 an improbable choice as Supreme Leader because others were better credentialed but though cautious and uncharismatic, he has proved a great survivor in a troubled region.

Since aerial bombing began to be used as a strategic weapon, the debate over BDA (battle damage assessment) has been of great interest and the issue emerged almost as soon as the bunker buster attack on Iran was announced, focused on the extent to which the MOPs had damaged the targets, the deepest of which were concealed inside a mountain.  BDA is a constantly evolving science and while satellites have made analysis of surface damage highly refined, it’s more difficult to understand what has happened deep underground.  Indeed, it wasn’t until the USSBS (United States Strategic Bombing Survey) teams toured Germany and Japan in 1945-1946, conducting interviews, economic analysis and site surveys, that a useful (and substantially accurate) understanding emerged of the effectiveness of bombing.  What technological advances have since allowed, for those with the resources, is that so-called “panacea targets” (ie critical infrastructure and the like, once dismissed by planners because the required precision was for many reasons rarely attainable) can now accurately be targeted, the USAF able to drop a bomb within a few feet of the aiming point.  As the phrase is used by the military, the Fordow Uranium Enrichment Plant is a classic “panacea target” but whether even a technically successful strike will achieve the desired political outcome remains to be seen.

Mr Trump, in a moment of exasperation, posted on Truth Social of Iran & Israel: “We basically have two countries that have been fighting so long and so hard that they don't know what the fuck they're doing.”  Actually, both know exactly WTF they're doing; it's just Mr Trump (and many others) would prefer they didn't do it.

Donald Trump (b 1946; US president 2017-2021 and since 2025) claimed “total obliteration” of the targets while Grand Ayatollah Khamenei admitted only that there had been “some damage”; which is closer to the truth may one day be revealed.  Even modelling of the effects has probably been inconclusive because the deeper one goes underground, the greater the number of variables in the natural structure, and the nature of the internal built environment will also influence blast behaviour.  All experts seem to agree much damage will have been done but what can’t yet be determined is what has been suffered by the facilities which sit as deep as 80 m (260 feet) inside the mountain although, as the name implies, “bunker busters” are designed for buried targets and it’s not always required for the blast directly to reach the target.  Because the shock-wave can travel through earth & rock, the effect is something like that of an earthquake and if the structure is sufficiently affected, the area may be rendered geologically too unstable to be used again for its original purpose.

Within minutes of the bombing having been announced, legal academics were being interviewed (though not by Fox News) to explain why the attacks were unlawful under international law and, in a sign of the times, the White House didn't bother to discuss fine legal points like the distinction between "preventive" & "pre-emptive" strikes, preferring (like Fox News) to focus on the damage done.  However, whatever the murkiness surrounding the BDA, many analysts have concluded that even if before the attacks the Iranian authorities had not approved the creation of a nuclear weapon, this attack will have persuaded them one is essential for “regime survival”, thus the interest in both Tel Aviv and (despite denials) Washington DC in “regime change”.  The consensus seems to be Grand Ayatollah Khamenei had, prior to the strike, not ordered the creation of a nuclear weapon but that all energies were directed towards completing the preliminary steps, thus the enriching of uranium to ten times the level required for use in power generation; the ayatollah liked to keep his options open.  So, the fear of some is that the attacks, even if they have (by weeks, months or years) delayed the Islamic Republic’s work on nuclear development, may prove counter-productive in that they convince the ayatollah to concur with the reasoning of every state which since 1945 has adopted an independent nuclear deterrent (IND).  That reasoning is not complex and hasn’t changed since a prehistoric man first picked up a stout stick to wave as a pre-lingual message to potential adversaries, warning them there would be consequences for aggression.  Although a theocracy, those who command power in the Islamic Republic are part of an opaque political institution and in the struggle which has for some time been conducted in anticipation of the death of the aged (and reportedly ailing) Supreme Leader, the matter of “an Iranian IND” is one of the central dynamics.  
Many will be following what unfolds in Tehran and the observers will not be only in Tel Aviv and Washington DC because in the region and beyond, few things focus the mind like the thought of ayatollahs with A-Bombs.

Of the word "bust"

The Great Bust: The Depression of the Thirties (1962) by Jack Lang (left), highly qualified content provider Busty Buffy (b 1996, who has never been accused of misleading advertising, centre) and The people's champion, Mr Lang, bust of Jack Lang, painted cast plaster by an unknown artist, circa 1927, National Portrait Gallery, Canberra, Australia (right).  Remembered for a few things, Jack Lang (1876–1975; premier of the Australian state of New South Wales (NSW) 1925-1927 & 1930-1932) remains best known for having in 1932 been the first head of government in the British Empire to have been sacked by the Crown since William IV (1765–1837; King of the UK 1830-1837) in 1834 dismissed Lord Melbourne (1779–1848; prime minister of the UK 1834 & 1835-1841).

Those learning English must think it at least careless that things can both be (1) “razed to the ground” (totally to destroy something (typically a structure), usually by demolition or incineration) and (2) “raised to the sky” (physically lifted upwards).  The etymologies of “raze” and “raise” differ but they’re pronounced the same so it’s fortunate the spellings vary; in other troublesome examples, unrelated meanings can align in both spelling and pronunciation, as in “bust”.  When used in the ways most directly related to human anatomy: (1) “a sculptural portrayal of a person's head and shoulders” & (2) “the circumference of a woman's chest around her breasts” there is an etymological link but these uses are wholly unconnected with bust’s other senses.

Bust of Lindsay Lohan in white marble by Stable Diffusion.  Sculptures of just the neck and head came also to be called “busts”, the emphasis on the technique rather than the original definition.

Bust in the sense of “a sculpture of upper torso and head” dates from the 1690s and was from the sixteenth century French buste, from the Italian busto (upper body; torso), from the Latin bustum (funeral monument, tomb (although the original sense was “funeral pyre, place where corpses are burned”)) and it may have emerged (as a shortened form) from ambustum, neuter of ambustus (burned around), past participle of amburere (burn around, scorch), the construct being ambi- (around) + urere (to burn).  The alternative etymology traces a link to the Old Latin boro, the early form of the Classical Latin uro (to burn) and it’s thought the development in Italian was influenced by the Etruscan custom of keeping the ashes of the dead in an urn shaped like the person when alive.  Thus the use, common by the 1720s, of bust (a clipping from the French buste) being “a carving of the trunk of the human body from the chest up”.  From this came the meaning “dimension of the bosom; the measurement around a woman's body at the level of her breasts” and that evolved on the basis of a comparison with the sculptures, the base of which was described as the “bust-line”, the term still used in dress-making (and for other comparative purposes as one of the three “vital statistics” by which women are judged (bust, waist, hips), each circumference having an “ideal range”).  It’s not known when “bust” and “bust-line” came into oral use among dress-makers and related professions but both are documented since the 1880s.  Derived forms (sometimes hyphenated) include busty (tending to bustiness, thus Busty Buffy's choice of stage-name), overbust & underbust (technical terms in women's fashion referencing specific measurements) and bustier (a tight-fitting women's top which covers (most or all of) the bust).

Benito Mussolini (1883-1945; Duce (leader) & prime minister of Italy 1922-1943) standing beside his “portrait bust” (1926).

The bust was carved by Swiss sculptor Ernest Durig (1894–1962) who gained posthumous notoriety when his career as a forger was revealed with the publication of his drawings which he’d represented as being from the hand of the French sculptor Auguste Rodin (1840-1917) under whom he claimed to have studied.  Mussolini appears here in one of the subsequently much caricatured poses which were a part of his personality cult.  More than one of the Duce's counterparts in other nations was known to have made fun of some of the more outré poses and affectations, the outstretched chin, right hand braced against the hip and straddle-legged stance among the popular motifs. 

“Portrait bust” in marble (circa 1895) of Otto von Bismarck (1815-1898; chancellor of the German Empire (the "Second Reich") 1871-1890) by the German sculptor Reinhold Begas (1831-1911).

In sculpture, what had been known as the “portrait statue” came after the 1690s to be known as the “portrait bust” although both terms meant “sculpture of upper torso and head”; these proved a popular choice for military figures because the aspect enabled the inclusion of bling such as epaulettes, medals and other decorations and, being depictions of the human figure, busts came to be vested with special significance by the superstitious.  In early 1939, during construction of the new Reich Chancellery in Berlin, workmen dropped one of the busts of Otto von Bismarck by Reinhold Begas, breaking it at the neck.  For decades, the bust had sat in the old Chancellery and the building’s project manager, Albert Speer (1905–1981; Nazi court architect 1934-1942; Nazi minister of armaments and war production 1942-1945), knowing Adolf Hitler (1889-1945; Führer (leader) and German head of government 1933-1945 & head of state 1934-1945) believed the Reich Eagle toppling from the post-office building right at the beginning of World War I had been a harbinger of doom for the nation, kept the accident secret, hurriedly issuing a commission to the German sculptor Arno Breker (1900–1991) who carved an exact copy.  To give the fake the necessary patina, it was soaked for a time in strong, black tea, the porous quality of marble enabling the fluid to induce some accelerated aging.  Interestingly, in his (sometimes reliable) memoir (Erinnerungen (Memories or Reminiscences), published in English as Inside the Third Reich (1969)), even the technocratic Speer admitted of the accident: “I felt this as an evil omen”.

The other senses of bust (as a noun, verb & adjective) are diverse (and sometimes diametric opposites) and include: “to break or fail”; “to be caught doing something unlawful / illicit / disgusting etc”; “to debunk”; “dramatically or unexpectedly to succeed”; “to go broke”; “to break in (horses, girlfriends etc)”; “to assault”; “the downward portion of an economic cycle (ie boom & bust)”; “the act of effecting an arrest” and “someone (especially in professional sport) who failed to perform to expectation”.  That’s quite a range and it has meant the creation of dozens of idiomatic forms, the best known of which include: “boom & bust”, “busted flush”, “dambuster”, “bunker buster”, “busted arse country”, “drug bust”, “cloud bust”, “belly-busting”, “bust one's ass (or butt)”, “bust a gut”, “bust a move”, “bust a nut”, “bust-down”, “bust loose”, “bust off”, “bust one's balls”, “bust-out”, “sod buster”, “bust the dust”, “myth-busting” and “trend-busting”.  In the sense of “breaking through”, bust was from the Middle English busten, a variant of bursten & bresten (to burst) and may be compared with the Low German basten & barsten (to burst).  Bust in the sense of “break”, “smash”, “fail”, “arrest” etc was a creation of mid-nineteenth century US English and is of uncertain inspiration but most etymologists seem to concur it was likely a modification of “burst” effected with a phonetic alteration, though it’s not impossible it came directly as an imperfect echoic of Germanic speech.  The apparent contradiction of bust meaning both “fail” and “dramatically succeed” happened because the former was an allusion to “being busted” (ie broken) while the latter used the notion of “busting through”.