Demand (pronounced dih-mand (U) or dee-mahnd (non-U))
(1) To ask for with proper authority; claim as a right.
(2) To ask for peremptorily or urgently.
(3) To call for or require as just, proper, or necessary.
(4) In law, to lay formal claim to.
(5) In law, to summon, as to court.
(6) An urgent or pressing requirement.
(7) In economics, the desire to purchase, coupled (hopefully) with the power to do so.
(8) In economics, the quantity of goods that buyers will take at a particular price.
(9) A requisition; a legal claim.
(10) A question or inquiry (archaic).
1250-1300: From the Middle English demaunden and Anglo-French demaunder, derived from the Medieval Latin dēmandāre (to demand, later to entrust), equivalent to dē + mandāre (to commission, order). The Old French was demander and, like the English, meant “to request”, whereas "to ask for as a right" emerged in the early fifteenth century from Anglo-French legal use. As used in economic theory and political economy (correlating to supply), it was first attested from 1776 in the writings of Adam Smith. The word demand as used by economists is a neutral term which references only the conjunction of (1) a consumer's desire to purchase goods or services and (2) hopefully the power to do so. However, in general use, to say that someone is "demanding" something does carry a connotation of anger, aggression or impatience. For this reason, during the 1970s, the language of those advocating the rights of women to secure safe, lawful abortion services changed from "abortion on demand" (ie the word used as an economist might) to "pro-choice". Technical fields (notably economics) coin derived forms as they're required (counterdemand, overdemand, predemand etc). Demand is a noun & verb, demanding is a verb & adjective, demandable is an adjective, demanded is a verb and demander is a noun; the noun plural is demands.
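Sense (8), the quantity buyers will take at a particular price, is often illustrated with a simple demand schedule; the sketch below uses an invented linear demand function (the intercept and slope are arbitrary illustrative assumptions, not figures from any economic source):

```python
# Illustrative linear demand: quantity demanded falls as price rises.
# The intercept (100) and slope (2) are arbitrary, chosen only for the example.

def quantity_demanded(price, intercept=100.0, slope=2.0):
    """Quantity buyers will take at a given price (never negative)."""
    return max(0.0, intercept - slope * price)

# A demand "schedule" tabulates the quantity taken at each price:
schedule = {price: quantity_demanded(price) for price in (10, 20, 30, 40, 50)}
```

At a price of 10 the schedule records 80 units; by 50 demand has fallen to zero, the inverse price-quantity relation being what economists mean by "demand".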
Video on Demand (VoD)
Directed by Tiago Mesquita with a screenplay by Mark Morgan, Among the Shadows is a thriller which straddles the genres, elements of horror and the supernatural spliced in as required. Although in production since 2015, with the shooting in London and Rome not completed until the next year, it wasn't until 2018, at the European Film Market (held in conjunction with the Internationale Filmfestspiele Berlin (Berlin International Film Festival)), that Tombstone Distribution listed it, the distribution rights acquired by Momentum, Entertainment One and VMI Worldwide. In 2019, it was released progressively on DVD and video on demand (VoD), firstly in European markets, the UK release delayed until mid-2020. In some markets, for reasons unknown, it was released with the title The Shadow Within.
Video on Demand (VoD) and streaming services are similar concepts in video content distribution but there are differences. VoD is a system which permits users to view content at any time, these days mostly through a device connected to the internet across IP (Internet Protocol), the selection made from a catalog or library of available titles. Despite some occasionally ambiguous messaging in the advertising, the content is held on centralized servers and users can choose directly to stream or download. The VoD service is now often a sub-set of what a platform offers, which includes content which may be rented, purchased or accessed through a subscription.
Streaming is a method of delivering media content in a continuous flow over IP and is very much the product of the fast connections of the twenty-first century. Packets are transmitted in real time, which enables users to start watching or listening without waiting for an entire file (or file set) to download, the attraction being it obviates the need for local storage. There's obviously definitional and functional overlap and while VoD can involve streaming, not all streaming services are technically VoD; streaming can also be used for live events, real-time broadcasts or continuous playback of media without specific on-demand access. By contrast, the core purpose of VoD is to provide access at any time: VoD is a broad concept and streaming a specific method of real-time delivery, as suited to live events as to stored content.
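The download-versus-stream distinction described above can be sketched in code; the class and method names below are purely illustrative (they belong to no real platform's API), with the "server" simulated as an in-memory byte store:

```python
# Minimal sketch of VoD delivery: whole-file download vs packet-by-packet streaming.
# CatalogServer and its methods are hypothetical names, not a real API.

CHUNK_SIZE = 4  # bytes per "packet" (tiny, purely for illustration)

class CatalogServer:
    """Centralized store holding a library of titles, as VoD platforms do."""
    def __init__(self, library):
        self.library = library  # maps title -> content bytes

    def download(self, title):
        """Download model: the entire file is delivered before playback."""
        return self.library[title]

    def stream(self, title):
        """Streaming model: content is yielded packet by packet, so
        playback can begin before the whole file has arrived."""
        data = self.library[title]
        for offset in range(0, len(data), CHUNK_SIZE):
            yield data[offset:offset + CHUNK_SIZE]

server = CatalogServer({"among_the_shadows": b"0123456789abcdef"})
first_packet = next(server.stream("among_the_shadows"))  # playback starts here
whole_file = server.download("among_the_shadows")        # wait, then play
```

Reassembling every packet yields the same bytes as the download, which is why the two models overlap in practice: the difference is when playback can begin, not what is ultimately delivered.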
The Mercedes-Benz SSKL and the Demand Supercharger
Modern rendition of the Mercedes-Benz SSKL in schematic, illustrating the drilled-out chassis rails. The title is misleading because the four or five SSKLs built were all commissioned in 1931 (although it's possible one or more used a modified chassis which had been constructed in 1929). All SSK chassis were built between 1928 and 1932 although the model remained in the factory's catalogue until 1933.
The Mercedes-Benz SSKL was one of the last of the road cars which could win top-line grand prix races. An evolution of the earlier S, SS and SSK, the SSKL (Super Sports Kurz (short) Leicht (light)) was notable for the extensive drilling of its chassis frame, to the point where it was compared to a Swiss cheese, reducing weight with no loss of strength. The SSKs and SSKLs were famous also for the banshee howl from the engine when the supercharger was running; nothing like it would be heard until the wail of the BRM V16s twenty years later. It was called a demand supercharger because, unlike some constantly-engaged forms of forced induction, it ran only on demand, in the upper gears, high in the rev-range, when the throttle was pushed wide open. Although it could safely be used for barely a minute at a time, when running, engine power jumped from 240-odd horsepower (HP) to over 300. The number of SSKLs built has been debated and the factory's records are incomplete because (1) like many competition departments, it produced and modified machines "as required" and wasn't much concerned about documenting the changes and (2) many archives were lost as a result of bomb damage during World War II (1939-1945); most historians suggest there were four or five SSKLs, all completed (or modified from earlier builds) in 1931. The SSK had enjoyed great success in competition but even in its heyday was in some ways antiquated and, although powerful, was very heavy, thus the expedient of the chassis-drilling, intended to make it competitive for another season. Lighter (which didn't solve but at least to a degree ameliorated the brake & tyre wear) and easier to handle than the SSK (although the higher speed brought its own problems, notably in braking), the SSKL enjoyed a long Indian summer and even on tighter circuits where its bulk meant it could be out-manoeuvred, sometimes it still prevailed by virtue of durability and sheer power.
Rudolf Caracciola (1901–1959) and SSKL in the wet, German Grand Prix, Nürburgring, 19 July 1931. Alfred Neubauer (1891–1980; racing manager of the Mercedes-Benz competition department 1926-1955) maintained Caracciola "...never really learned to drive but just felt it, the talent coming to him instinctively."
Sometimes too it got lucky. When the field assembled in 1931 for the Fünfter Großer Preis von Deutschland (fifth German Grand Prix) at the Nürburgring, even the factory acknowledged that at 1,600 kg (3,525 lb), the SSKLs, whatever their advantage in horsepower, stood little chance against the nimble Italian and French machines which weighed in at some 200 kg (440 lb) less. However, on the day there was heavy rain with most of the race conducted on a soaked track and the twitchy Alfa Romeos, Maseratis and the especially skittery Bugattis proved less suited to the slippery surface than the truck-like but stable SSKL, the lead built up in the rain enough to secure victory even though the margin narrowed as the surface dried and a visible racing line emerged. Time and the competition had definitely caught up by 1932 however and it was no longer possible further to lighten the chassis or increase power, so aerodynamics specialist Baron Reinhard von Koenig-Fachsenfeld (1899-1992) was called upon to design a streamlined body, the lines influenced both by his World War I (1914-1918 and then usually called the "World War") aeronautical experience and the "streamlined" racing cars which had been seen in the previous decade. At the time, the country was greatly affected by the economic depression which spread around the world after the 1929 Wall Street crash, compelling Mercedes-Benz to suspend the operations of its competitions department, so the one-off "streamliner" was a private effort (though with some tacit factory assistance) financed by the driver (who borrowed some of the money from his mechanic!).
The streamlined SSKL crosses the finish line, Avus, 1932.
The driver was Manfred von Brauchitsch (1905-2003), nephew of Major General (later Generalfeldmarschall (Field Marshal)) Walther von Brauchitsch (1881–1948; Oberbefehlshaber (Commander-in-Chief) of the OKH (Oberkommando des Heeres; the German army's high command) 1938-1941). An imposing but ineffectual head of the army, Uncle Walther also borrowed money, although rather more than was loaned by his nephew's mechanic, the field marshal's funds coming from the state exchequer, "advanced" to him by Adolf Hitler (1889-1945; Führer (leader) and German head of government 1933-1945 & head of state 1934-1945). Hitler quickly learned the easy way of keeping his mostly aristocratic generals compliant was to loan them money, give them promotions, adorn them with medals and grant them estates in the lands he'd stolen during his many invasions.
His "loans" proved good investments. Beyond his exploits on the circuits, Manfred von Brauchitsch's other footnote in the history of the Third Reich (1933-1945) is the letter sent on April Fools' Day 1936 to Uncle Walther (apparently as a courtesy between gentlemen) by Baldur von Schirach (1907-1974; head of the Hitlerjugend (Hitler Youth) 1931-1940 & Gauleiter (district party leader) and Reichsstatthalter (Governor) of Vienna 1940-1945) claiming he had given a "horse whipping" to the general's nephew because of a remark the racing driver was alleged to have made about Frau von Schirach (the daughter of Hitler's court photographer!). It does seem von Schirach did just that though it wasn't quite the honorable combat he'd claimed: in the usual Nazi manner he'd arrived at von Brauchitsch's apartment in the company of several thugs and, thus assisted, swung his leather whip. Von Brauchitsch denied ever making the remarks. Unlike the German treasury, the mechanic got his money back and that loan proved a good investment, coaxing from the SSKL a victory in its final fling. Crafted in aluminum by Vetter in Cannstatt, the body was mounted on von Brauchitsch's race-car and proved its worth at the Avusrennen (Avus race) in May 1932; with drag reduced by a quarter, the top speed increased by some 12 mph (20 km/h) and the SSKL won its last major trophy on the unique circuit which rewarded straight-line speed like no other. It was the last of the breed; subsequent grand prix cars would be pure racing machines with none of the compromises demanded for road use.
Evolution of the front-engined Mercedes-Benz grand prix car, 1928-1954
1928 Mercedes-Benz SS.
As road cars, the Mercedes-Benz W06 S (1927-1928) & SS (1928-1930) borrowed unchanged what had long been the standard German approach in many fields (foreign policy, military strategy, diplomacy, philosophy etc): robust engineering and brute force; sometimes this combination worked well, sometimes not. Eschewing refinements in chassis engineering or body construction as practiced by the Italians or French, what the S & SS did was achieved mostly with power and the reliability for which German machinery was already renowned. Although in tighter conditions often out-manoeuvred, on the faster circuits both were competitive and the toughness of their construction meant, especially on the rough surfaces then found on many road courses, they would outlast the nimble but fragile opposition.
1929 Mercedes-Benz SSK.
By the late 1920s it was obvious an easier path to higher performance than increasing power was to reduce the SS's (Super Sport) size and weight. The former was easily achieved by reducing the wheelbase, creating a two-seat sports car still suitable for road and track, the tighter dimensions and lesser bulk also reducing fuel consumption and tyre wear, both of which had plagued the big, supercharged cars. Some engine tuning and the use of lighter body components achieved the objectives and the SSK was in its era a trophy winner in sports car events and on the grand prix circuits. Confusingly, the "K" element in the name stood for kurz (short) and not kompressor (supercharger) as was applied to some other models, although all SSKs used a supercharged, 7.1 litre (433 cubic inch) straight-six.
1931 Mercedes-Benz SSKL.
The French, British and Italian competition however were also improving their machinery and by late 1930, on the racetracks, the SSK was becoming something of a relic, although it remained most desirable as a road car, demand quelled only by a very high price in what suddenly was a challenging economic climate. Without the funds to create anything new and with the big engine having reached the end of its development potential, physics made obvious to the engineers that more speed could be attained only through a reduction in mass, so not only were body components removed or lightened where possible but the chassis and sub-frames were drilled to the point where the whole apparatus was said to resemble "a Swiss cheese". The process was time-consuming but effective because cutting the SSK's 1,600 kg (3,525 lb) heft to the SSKL's more svelte 1,445 kg (3,185 lb), combined with the 300-odd HP which could be enjoyed for about a minute with the supercharger engaged, produced a Grand Prix winner which was competitive for a season longer than any had expected, one also taking victory in the 1931 Mille Miglia. Although it appeared in the press as early as 1932, the "SSKL" designation is retrospective, the factory's extant records listing the machines either as "SSK" or "SSK, model 1931". No more than five were built and none survive (rumors of a frame "somewhere in Argentina" apparently an urban myth) although some SSKs were at various times "drilled out" to emulate the look and the appeal remains, a replica cobbled together from real and fabricated parts selling at auction in 2007 for over US$2 million; this was when a million dollars was still a lot of money.
1932 Mercedes-Benz SSKL (die Gurke).
The one-off bodywork (hand-beaten from aircraft-grade sheet aluminum) was fabricated for a race held at Berlin's unique Automobil-Verkehrs- und Übungsstraße (Avus; the "Automobile traffic and training road") which featured two straights each some 6 miles (10 km) in length, thus the interest in increasing top speed, and while never given an official designation by the factory, the crowds dubbed it die Gurke (the cucumber). The streamlined SSKL won the race and was the first Mercedes-Benz grand prix car to be called a Silberpfeil (silver arrow), the name coined by radio commentator Paul Laven (1902-1979) who was broadcasting trackside for Südwestdeutsche Rundfunkdienst AG (Southwest German Broadcasting Service); he was struck by the unusual appearance although the designer had been inspired by an aircraft fuselage rather than arrows or the vegetable of popular imagination. The moniker was more flattering than the nickname Weiße Elefanten (white elephants) applied to the S & SS, which was a reference to their bulk and not a use of the phrase in its usual figurative sense. The figurative sense came from the Kingdom of Siam (modern-day Thailand) where elephants were beasts of burden, put to work hauling logs in forests or carting other heavy loads, but the rare white (albino) elephant was a sacred animal which could not be put to work. However, the owner was compelled to feed and care for the unproductive creature and the upkeep of an elephant was not cheap; they have large appetites. According to legend, if some courtier displeased the king, he could expect the present of a white elephant. A "white elephant" is thus an unwanted possession that, though a financial burden, one is "stuck with" and the term is applied to the many expensive projects governments around the world seem unable to resist commissioning.
Avus circuit. Unique in the world, it was the two long straights which determined die Gurke's emphasis on top speed. Even the gearing was raised (ie a numerically lower differential ratio) because lower engine speeds were valued more than low-speed acceleration which was needed only once a lap.
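The effect of "raising" the gearing can be shown with simple arithmetic; the tyre diameter and final-drive ratios below are invented round numbers for illustration, not the SSKL's actual specifications:

```python
import math

def engine_rpm(speed_kmh, final_drive_ratio, tyre_diameter_m=0.9):
    """Engine rpm at a given road speed in a direct (1:1) top gear:
    wheel revolutions per minute multiplied by the final-drive ratio."""
    metres_per_minute = speed_kmh * 1000 / 60
    wheel_rpm = metres_per_minute / (math.pi * tyre_diameter_m)
    return wheel_rpm * final_drive_ratio

# At the same road speed, a numerically lower (taller) final drive
# turns the engine more slowly -- the point of raising the gearing:
rpm_standard = engine_rpm(200, 3.0)  # ~3,537 rpm
rpm_raised = engine_rpm(200, 2.5)    # ~2,947 rpm
```

The cost of the taller ratio is less torque multiplication at the wheels, hence weaker low-speed acceleration; on a circuit with only one slow corner per lap, that trade was worth making.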
The size of the S & SS was exaggerated by the unrelieved expanses of white paint (Germany's designated racing color) although, despite what is sometimes claimed, Ettore Bugatti's (1881–1947) famous quip "fastest trucks in the world" was his back-handed compliment not to the German cars but to W. O. Bentley's (1888–1971) eponymous racers, which he judged brutish compared to his svelte machines. Die Gurke ended up silver only because such had been the rush to complete the build in time for the race, there was no time to apply the white paint, so it raced in a raw aluminum skin. Remarkably, in full-race configuration, die Gurke was driven to Avus on public roads, a practice which in many places was tolerated as late as the 1960s. Its job at Avus done, die Gurke was re-purposed for high-speed tyre testing (its attributes (robust, heavy and fast) ideal for the purpose) before "disappearing" during World War II. Whether it was broken up for parts or metal recycling, spirited away somewhere or destroyed in a bombing raid, nobody knows, although it's not impossible conventional bodywork at some point replaced the streamlined panels. In 2019, Mercedes-Benz unveiled what it described as an "exact replica" of die Gurke, built on an original (1931) chassis.
1934 Mercedes-Benz W25.
After building the replica Gurke, Mercedes-Benz for the first time subjected it to a wind-tunnel test, finding (broadly in line with expectations) its cd (coefficient of drag) improved by about a third, recording 0.616 against a standard SSK's 0.914. By comparison, the purpose-built W25 from 1934 delivered a 0.614, showing how effective Baron Koenig-Fachsenfeld's design had been, although by today's standards neither shape is truly "slippery". Although "pure" racing cars had for years existed, the W25 (Werknummer (works number) 25) was the one which set many elements of what would for a quarter-century in competition be the default template for most grand prix cars and its basic shape and configuration remains recognizable in the last front-engined car to win a World Championship grand prix in 1960. The W25 was made possible by generous funding from the new Nazi Party, "prestige projects" always of interest to the propaganda-minded party. With budgets which dwarfed the competition, the Mercedes-Benz and Auto Unions immediately enjoyed success and the W25 won the newly inaugurated 1935 European Championship. Ironically, the W25's most famous race was the 1935 German Grand Prix at the Nürburgring, won by the inspired Italian Tazio Nuvolari (1892–1953) in an out-dated and under-powered Alfa Romeo P3, von Brauchitsch's powerful W25 shredding a rear tyre on the final lap. However, the Auto Union's chassis design was fundamentally more farsighted; outstanding though the engine was, the W25's platform was, in many ways, eine bessere Gurke (a better cucumber) and because its limitations were inherent, the factory "sat out" most of the 1936 season to develop the W125.
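The wind-tunnel figures can be checked with back-of-envelope aerodynamics: at constant engine power and (as a simplifying assumption) unchanged frontal area, the power absorbed by drag scales with Cd × v³, so attainable top speed scales with the cube root of 1/Cd. A sketch:

```python
# Back-of-envelope check of the quoted drag coefficients.
# Assumes constant power and unchanged frontal area, so that
# required power ~ Cd * v**3 and hence v ~ (1 / Cd) ** (1/3).

CD_SSK = 0.914    # standard SSK (per the 2019 replica's wind-tunnel test)
CD_GURKE = 0.616  # streamlined SSKL, die Gurke

reduction = 1 - CD_GURKE / CD_SSK              # ~0.33: "improved by about a third"
speed_factor = (CD_SSK / CD_GURKE) ** (1 / 3)  # ~1.14: ~14% more top speed
```

A gain of that order on a car capable of something like 120-140 mph is broadly consistent with the 12 mph (20 km/h) improvement reported at Avus, so the period claims and the modern measurements hang together reasonably well.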
1937 Mercedes-Benz W125.
Along with the dramatic, mid-engined, V16 Auto Union Type C, the W125 was the most charismatic race car of the "golden age" of 1930s European circuit racing. When tuned for use on the fastest circuits, the 5.7 litre (346 cubic inch) straight-eight generated over 640 HP and in grand prix racing that number would not be exceeded until the turbocharged engines (first seen in 1977) of the 1980s. The W125 used a developed version of the W25's 3.4 (205) & 4.3 (262) straight-eights and the factory had assumed this soon would be out-performed by Auto Union's V16s, but so successful did the big-bore eight prove that the Mercedes-Benz V16 project was aborted, meaning resources didn't need to be devoted to the body and chassis engineering which would have been required to accommodate the bigger, wider and heavier unit (something which in subsequent decades would doom a Maserati V12 and Porsche's Flat-16). The W125 was the classic machine of the pre-war "big horsepower" era and if a car travelling at 100 mph (160 km/h) passed a W125 at standstill, the latter could accelerate and pass that car within a mile (1.6 km).
A W125 on the banked Nordkurve (north curve) at Avus, 1937. At Avus, the streamlined bodywork was fitted because a track which is 20 km (12 miles) in length but has only four curves puts an untypical premium on top speed. The banked turn was demolished in 1967 because increased traffic volumes meant an intersection was needed under the Funkturm (radio tower) and today only fragments of the original circuit remain, although the lovely art deco race control tower still exists and was for a time used as a restaurant. Atop it now sits a Mercedes-Benz three-pointed star rather than the swastika which flew in 1937.
1938 Mercedes-Benz W154.
On the fastest circuits the streamlined versions of the W125s were geared to attain 330 km/h (205 mph) and 306 km/h (190 mph) was often attained in racing trim. With streamlined bodywork, there was also the Rekordwagen built for straight-line speed record attempts and one set a mark of 432.7 km/h (268.9 mph), a public-road world speed record which stood until 2017. Noting the speeds and aware the cars were already too fast for circuits which had been designed for, at most, velocities sometimes 100 km/h (60 mph) less, the governing body changed the rules, limiting the displacement for supercharged machines to 3.0 litres (183 cubic inch), imagining that would slow the pace. Fast though the rule-makers were, the engineers were quicker still and it wasn't long before the V12 W154 was posting lap times on a par with the W125 although they did knock a few km/h off the top speeds. The rule change proved as ineffective in limiting speed as the earlier 750 KG formula which had spawned the W25 & W125.
1939 Mercedes-Benz W165.
An exquisite one-off, the factory built three W165s for the single purpose of contesting the 1939 Tripoli Grand Prix. Remarkable as it may now sound, there used to be grand prix events in Libya, then a part of Italy's colonial empire. Anguished at having for years watched the once dominant Alfa Romeos enjoy only the odd (though famous) victory as the German steamroller flattened all competition (something of a harbinger of the Wehrmacht's military successes in 1939-1940), the Italian authorities waited until the last moment before publishing the event's rules, stipulating the use of a voiturette (small car) with a maximum displacement of 1.5 litres (92 cubic inch). The rules were designed to suit the Alfa Romeo 158 (Alfetta) and Rome was confident the Germans would have no time to assemble such a machine. However, knowing Adolf Hitler (1889-1945; Führer (leader) and German head of government 1933-1945 & head of state 1934-1945), still resenting what happened at the Nürburgring in 1935, would not be best pleased were his Axis partner (and vassal) Benito Mussolini (1883-1945; Duce (leader) & Prime-Minister of Italy 1922-1943) to enjoy even this small victory, the factory scrambled and conjured up the V8-powered (a first for Mercedes-Benz) W165, the trio delivering a "trademark 1-2-3" finish in Tripoli. As a consolation, with Mercedes-Benz busy building inverted V12s for the Luftwaffe's Messerschmitts, Heinkels and such, an Alfa Romeo won the 1940 Tripoli Grand Prix which would prove the city's last.
1954 Mercedes-Benz W196R Stromlinienwagen (literally "streamlined car" but usually translated as "Streamliner").
A curious mix of old (drum brakes, straight-eight engine and swing axles) and new (a desmodromic valve train, fuel injection and aerodynamics developed in a wind-tunnel with the help of engineers then banned from being involved in aviation), the intricacies beneath the skin variously bemused or delighted those who later would come to be called nerds, but it was the sensuous curves which attracted most publicity. Strange though it appeared, it was within the rules and clearly helped deliver stunning speed, although the pace did expose some early frailty in road-holding (engineers have since concluded the thing was a generation ahead of tyre technology). It was one of the prettiest grand prix cars of the post-war years and the shape (sometimes called "type Monza", a reference to the Italian circuit with the long straights so suited to it) would later much appeal to pop-artist Andy Warhol (1928–1987) who used it in a number of prints.
1954 Mercedes-Benz W196R. In an indication of how progress accelerated after 1960, compare this W196R with (1) the W25 of 20 years earlier and (2) any grand prix car from 1974, 20 years later.
However, although pleasing to the eye, the W196R Stromlinienwagen was challenging even for expert drivers and it really was a machine which deserved a de Dion rear suspension rather than the swing axles (on road cars the factory was still building a handful with these as late as 1981 and its fudge of semi-trailing rear arms (the "swing axle when you're not having a swing axle") lasted even longer). Of more immediate concern to the drivers than any sudden transition to oversteer was that the aluminium skin meant they couldn't see the front wheels so, from their location in the cockpit, it was difficult to judge the position of the extremities, vital in a sport where margins can be fractions of an inch. After the cars in 1954 returned to Stuttgart having clouted empty oil drums (those and bales of hay were how circuit safety was then done) during an unsuccessful outing to the British Grand Prix at Silverstone, a conventional body was quickly crafted and although visually unremarkable, the drivers found it easier to manage and henceforth the Stromlinienwagen appeared only at Monza. There was in 1954-1955 no constructors' championship but had there been, the W196R would in both years have won and it delivered two successive world drivers' championships for Juan Manuel Fangio (1911–1995). Because of rule changes, the three victories by the W196R Stromlinienwagen remain the only ones in the Formula One World Championship (since 1950) by a car with enveloping bodywork.
Solastalgia
The pain or distress caused by the loss or lack of solace and the sense of desolation connected to the present state of one's home and territory.
2003: A coining by Professor Glenn Albrecht (b 1953), the construct built from the Latin sōlācium (solace, comfort) + -algia (pain). Sōlācium was from sōlor (to comfort, console, solace) + -ac- (a variant of -āx- (used to form adjectives expressing a tendency or inclination to the action of the root verb)) + -ium, from the Latin -um (in this context used to indicate the setting where a given activity is carried out). The -algia suffix was from the New Latin -algia, from the Ancient Greek -αλγία (-algía), from compounds ending in the Ancient Greek ἄλγος (álgos) (pain) + the Ancient Greek -ῐ́ᾱ (-ĭ́ā). The most well-known was probably kephalalgíā (headache). Solastalgia is a noun, solastalgic is a noun and adjective and solastalgically is an adverb; the noun plural is solastalgias.
Elements of what became modern environmentalism can be found in writings from Antiquity and there are passages in Biblical Scripture which are quoted to support the notion Christ and God Himself were greenies. However, as a political movement, it was very much a creation of the late twentieth century, although Theodore Roosevelt (TR, 1858–1919; US president 1901-1909), despite his reputation as a big game hunter, made some notable contributions. In what proved an active retirement, Roosevelt would often remark that more than the landmark anti-trust laws or his Nobel Peace Prize, the most enduring legacy of his presidency would be the federal legislation relating to the conservation and protection of the natural environment, both land and wildlife. While he was in the White House, new national parks and forests were created, the total area an impressive 360,000 square miles (930,000 km²), a reasonable achievement given the pressure vested interests exerted upon the Congress to prevent anything which would impinge upon "development".
Portrait of Theodore Roosevelt (1903) by John Singer Sargent (1856–1925).
Roosevelt though was not typical and in most places the profits from industrialization & development proved more compelling than abstractions about the environment; even when the effects of climate change became obvious, it was clear only a crisis would rapidly create the conditions for change. Events such as London's "Great Smog" of 1952 were so dramatic that changes were made (culminating in the Clean Air Act (1956)) and the state of the air quality in San Francisco & Los Angeles was by the late 1950s so obviously deteriorating that California enacted anti-pollution laws even before there was much federal legislation, the state remaining in the vanguard to this day. Those political phenomena for a while encouraged the thought that even though decisive action to reduce carbon emissions was improbable while climate change (once referred to as "the greenhouse effect" and later "global warming") seemed both remote and conceptual, once the "crisis events" began to affect those living in the rich countries of the global north (ie "the white folks"), the term would morph into "climate crisis" and resource allocation would shift to address the problem. That theory remains sound but what was under-estimated was the threshold point for the word "crisis". Despite the increasing frequency and severity of wildfires, soaring temperatures, polar vortexes and floods, thus far the political system is still being adjusted on the basis of gradual change: the imperative remains managing rather than rectifying the problem. Once, television-friendly events such as (1) melting glaciers creating landslides destroying entire villages which have for centuries sat in the Swiss Alps, (2) suburbs of mansions in the hills of Los Angeles being razed to the ground by wildfires, (3) unprecedented floods in Europe and Asia killing hundreds and (4) heat waves routinely becoming a feature of once temperate regions would have been thought "crisis triggers" but the political system has thus far absorbed them.
Silent Spring (First edition, 1962) by Rachel Carson.
The origins of the environment movement in its modern form are often traced to the publication in 1962 of Silent Spring by marine biologist Rachel Carson (1907–1964), although it took years for the controversy that book generated to coalesce into an embryonic "green" movement. Silent Spring was a best-seller which (in an accessible form) introduced to the general public notions of the threat chemical pollution posed to ecology, the power of her argument being to identify the issue not as something restricted to a narrow section of agricultural concerns but as part of a systemic threat to the balance of nature and the very survival of human civilization. There were many other influences (demographic, cultural, economic, educational etc) at this time and by the late 1960s, it was apparent concerns about pollution, over-population, pesticide use and such had created an identifiable shared language and public visibility, although it was something too fragmented to be called a movement, the goals and advocated courses of action remaining disparate. Structurally however, organizations were being formed and a convenient turning point suggesting critical mass had been achieved came in the US in April 1970 when some 20 million participants received wide coverage in the media for Earth Day, a warning to the politicians that "the environment" might affect voting patterns. It was in this era that the framework of US environmental legislation was built, including the Clean Air Act (1970), Clean Water Act (1972) and Endangered Species Act (1973), all passed during the administration of Richard Nixon (1913-1994; US president 1969-1974), and under Nixon, in 1970, the EPA (Environmental Protection Agency) was created, an institution of which Theodore Roosevelt would have approved.
Earth Emotions: New Words for a New
World (2019) by Professor Glenn Albrecht.
When working as an academic, Glenn Albrecht was granted
conventional academic titles (such as Professor of Sustainability) but his work
puts him in the category of “ecophilosopher”, a concept which would have been understood
by the natural scientists of Antiquity; it’s now an increasingly populated
field with a niche in popular publishing. The eco- prefix was from the French éco-,
from the Latin oeco-, from the Ancient
Greek οἶκος (oîkos) (house, household) and was for generations familiar in
“economy” and its derivatives but is now most associated with ecology or the
environment (in the ecological sense). For better or worse, it has come to be applied to novel constructs
including ecotourism (forms of “sustainable” tourism claimed to cause less
environmental damage), ecofascism (literally “fascist politics with support for
ecological concerns” but usually used (as a derogatory) to refer to
uncompromising, aggressive or violent environmental activism, the most extreme
form of which is ecoterrorism (a label used rather loosely, even of vegans who
stage protests outside restaurants serving the products of the slaughter
industry)) and ecofeminism (a socio-political movement combining feminism and
environmentalism).
The
ecophilosophers have produced many publications but Professor Albrecht has been
unusual in that he has been prolific also in the coining of words, especially
those which relate to or are consequent upon what he calls the “sumbiocentric”
(“taking into account the centrality of the process of symbiosis in all of our
deliberations on human affairs”). Such
creations in emerging or expanding fields of study are of course not
unusual. In environmentalism, new terms
and words have in recent decades appeared but there’s been an element of
technological determinism to some. Although
the notion humanity lives on a “ship travelling through space” had been in use
since at least the mid-nineteenth century, the metaphor had been nautical and
it wasn’t until “spaceships” started to be launched in the 1960s that the term was
updated to the now familiar “spaceship earth”. Neologisms, even if used in context, can be
baffling but helpfully, Professor Albrecht published also a glossary of
“psychoterratic” terms with pocket definitions explaining his lexicon of the “Earth’s emotions”.
Endemophilia:
A “love of
place”, specifically the “particular love of the locally and regionally distinctive
in the people of a place.” The mechanism for this is: “Once a person
realizes that the landscape they have before them is not replicated in even a
general way elsewhere in the country or on their continent or even in the
world, there is ample room for a positive Earth emotion based on rarity and
uniqueness.” This is
classified as a spectrum condition in that the more “a uniqueness is understood… the more it can be appreciated”. Professor Albrecht was speaking of
geology, flora & fauna but figuratively the concept can be applied to the
built environment in urban areas and it doesn’t demand an interest in
architecture to take pleasure from the form of (some) buildings.
Eutierria: A “feeling of total harmony with our place, and
the naïve loss of ego (merging subject and ego) we often felt as children”. Professor Albrecht cites the author Richard Louv (b 1949), who used the
phrase “nature deficit disorder”, in
suggesting a word was needed to describe the state of harmony one could achieve
if “connected to the Earth”. Eutierria is a “positive feeling of oneness with the Earth
and its life forces, where the boundaries between self and the rest of nature
are obliterated, and a deep sense of peace and contentedness pervades
consciousness”.
The HUCE (Harvard University Center for the
Environment) in 2017 noted the phenomenon of mermosity, recording that some six
months earlier New York Magazine had “published its
most-read article ever, surpassing a photo spread of Lindsay Lohan.” The topic, which the HUCE
summarized as “Doom”, was the apocalyptic
visions of a world ravaged by climate change, the young especially afflicted by
a crushing sense of dread.
Mermosity:
“An
anticipatory state of being worried about the possible passing of the familiar,
and its replacement by that which does not sit comfortably in one’s sense of
place.” This is a word now
with great currency because researchers have noted the prominence
in the media of (1) human-induced climate change and (2) the apparent
inevitability of its adverse consequences has resulted in a pervading sense of
doom among some, especially the young. According to some psychologists, their young patients are exhibiting “mourning-like”
behaviour, thinking the planet already in the throes of destruction and that they exist
merely as mourners at its protracted funeral.
Meteoranxiety:
The “anxiety
felt in the face of the threat of the frequency and severity of extreme weather
events”. This is an example
of a feedback loop in that weather events (rain, storms, heatwaves etc) now
tend by many to be attributed exclusively to human-induced climate change,
thus exacerbating one’s mermosity. In
the literature of psychology, behavioral economics, neuroscience, philosophy,
sociology & political science there are explanations (often replete with
house jargon) of how “perception bias” & “cognitive bias” operate
and interact but such things rarely are discussed on the TikTok news feeds
which these days are so influential in shaping world views.
Solastalgia:
“The pain or
distress caused by the loss or lack of solace and the sense of desolation
connected to the present state of one’s home and territory”. This is the “lived experience of negative environmental
change” and reflects the sense of loss of what once was (or one’s imagined
construct of what once was), a phenomenon Professor Albrecht describes as “the homesickness
you have when you are still at home”. Although coined to be used in the context of
climate change, it can be applied more widely and the feeling will be familiar
to those who notice the lack of familiar landmarks in cities as urban
redevelopment changes the architecture. In those cases, the distress can be made more troubling still because
even a building one may for years frequently have seen can rapidly fade from
memory to the point where it can be hard to remember its appearance, even if it
stood for decades.
Google
ngram: Because of the way Google harvests data for their ngrams, they’re not
literally a tracking of the use of a word in society but can be usefully
indicative of certain trends (although one is never quite sure which
trend(s)), especially over decades. As a
record of actual aggregate use, ngrams are not wholly reliable because: (1) the
sub-set of texts Google uses is slanted towards the scientific & academic
and (2) the technical limitations imposed by the use of OCR (optical character
recognition) when handling older texts of sometimes dubious legibility (a
process AI should improve). Where
numbers bounce around, this may reflect either: (1) peaks and troughs in use
for some reason or (2) some quirk in the data harvested. Being recent, the ngram for solastalgia should
be an untypically accurate indication of trends in use but it’s a quantitative
and not a qualitative measure: although a word very much of the climate change
era, it has been used in other contexts because, as a neologism, it appears also in
many dictionaries and other on-line lists.
Sumbiocentric:
“Taking into
account the centrality of the process of symbiosis in all of our deliberations
on human affairs”. The
special place environmentalism has assumed in the public consciousness means
the sumbiocentric is positioned as something beyond just another construction
of ethics and should be thought a kind of secular, moral theology. Ominously, one apparent implication of this
would appear to be the desirability (according to some the necessity) of some
sort of internationally “co-ordinated” government, a concept with a wide vista and
in various forms at times advocated by figures as diverse as the polemicist playwright
George Bernard Shaw (GBS; 1856-1950) and Edward Teller (1908–2003), the
so-called “father of the hydrogen bomb”.
Sumbiophilia:
“The love of
living together”. This would
apparently be the state of things in the symbiocene, a speculative era which
would succeed the Anthropocene and be characterized by a harmonious and
cooperative coexistence between humans and the rest of nature which presumably would
be something of a new Jerusalem although shepherds, child care workers and others
would be advised not to take literally the Biblical Scripture: “The wolf also
shall dwell with the lamb, and the leopard shall lie down with the kid; and the
calf and the young lion and the fatling together; and a little child shall lead
them.” (Isaiah 11:6, King James Version (KJV, 1611)). However, other than sensible precautions when
around carnivorous predators, all would exist in a symbiosis (living together
for mutual benefit) without the destructive practices of the Anthropocene. In the world of Green Party wine & cheese
evenings, sumbiophilia probably seems the most natural thing in the world
although the party leadership would be sufficiently realistic to understand not
all would agree so, when it was made compulsory, “re-education camps” would be needed to “persuade” the recalcitrant. As used by Professor Albrecht, sumbiophilia is an ideal but one
obviously counter-historical because the development of the nation state (which
took millennia and was (more or less) perfected in the nationalisms which have
been the dominant political paradigm since the nineteenth century) suggests what
people love is not us all “living together” but groups of us “keeping the
others out”. Not for nothing are
idealists thought the most dangerous creatures on Earth.
Terrafuric:
“The extreme
anger unleashed within those who can clearly see the self-destructive
tendencies in the current forms of industrial-technological society and feel
they must protest and act to change its direction”. This is another spectrum condition, ranging
from writing truculent letters to the New York Times, to members of Extinction
Rebellion super-gluing themselves to the road, to assassinating the “guilty
parties”, à la Luigi Mangione (b 1998).
Terranascia
(“Earth
creating forces”) and terraphthora (“Earth destroying forces”) are
companion terms which could be used by geologists, cosmologists and others but
the significance in this context is that humans are now (and have long been)
among the most ecologically destructive forces known.
Hannah
Arendt and Martin Heidegger (2017) by Antonia Grunenberg (b 1944). Hannah Arendt's (1906-1975) relationship with Martin Heidegger (1889–1976) began when she was a 19-year-old student of philosophy and he her professor, married and aged 36. Both, for different reasons, would more than once have experienced solastalgia.
Solastalgia
began life in the milieu of the climate change wars but poets and others beyond
the battleground have been drawn to the word, re-purposing it in abstract or
figurative ways, comparing the process of literal environmental degradation
with losses elsewhere. The adaptations
have included (1) Social & cultural change (loss of familiar traditions or
communities), (2) Linguistic erosion (mourning the disappearance of words,
dialects or the quirks in language with which one grew up, replaced often by
new (and baffling) forms of slang), (3) One’s personal emotional framework (the
loss of friends, partner or family members), (4) Aging (the realization of
mounting decrepitude), (5) Digital displacement (a more recent phenomenon which
covers a range including an inability to master new technology, grief when once
enjoyed digital spaces become toxic, commercialized or abandoned and having to
“upgrade” from familiar, functional software to newer versions which offer no
advantages), (6) Artistic loss (one’s favourite forms of music, art or
literature become unfashionable and neglected) and (7) Existential
disconnection (not a new idea but now one an increasing number claim to suffer;
a kind of philosophical estrangement in which one feels “the world” (in the sense the German
philosopher Martin Heidegger (1889–1976) used the word) has become strange and
unfamiliar).
(1) A large
bin or receptacle; a fixed chest or box.
(2) In
military use, historically a fortification set mostly below the surface of the
ground with overhead protection provided by logs and earth or by concrete and
fitted with above-ground embrasures through which guns may be fired.
(3) A
fortification set mostly below the surface of the ground and used for a variety
of purposes.
(4) In golf,
an obstacle, classically a sand trap but sometimes a mound of dirt,
constituting a hazard.
(5) In
nautical use, to provide fuel for a vessel.
(6) In
nautical use, to convey bulk cargo (except grain) from a vessel to an adjacent
storehouse.
(7) In
golf, to hit a ball into a bunker.
(8) To
equip with or as if with bunkers.
(9) In
military use, to place personnel or materiel in a bunker or bunkers (sometimes
as “bunker down”).
1755–1760:
From the Scottish bonkar (box, chest
(also “seat” in the sense of “bench”)), of obscure origin but etymologists
conclude the use related to furniture hints at a relationship with banker (bench). Alternatively, it may be from a Scandinavian
source such as the Old Swedish bunke (boards
used to protect the cargo of a ship). The
meaning “receptacle for coal aboard a ship” was in use by at least 1839
(coal-burning steamships coming into general use in the 1820s). The use to describe the obstacles on golf
courses is documented from 1824 (probably from the extended sense “earthen seat”
which dates from 1805) but perhaps surprisingly, the familiar sense from
military use (dug-out fortification) seems not to have appeared before World
War I (1914-1918) although the structures so described had for millennia existed. “Bunkermate” was army slang for the
individual with whom one shares a bunker while the now obsolete “bunkerman”
(plural “bunkermen”) referred to someone (often the man in charge) who
worked at an industrial coal storage bunker. Bunker & bunkerage are nouns, bunkering is a noun & verb,
bunkered is a verb and bunkerish, bunkeresque, bunkerless & bunkerlike are adjectives;
the noun plural is bunkers.
Just as
ships called “coalers” were used to transport coal to and from shore-based
“coal stations”, it was “oilers” which took oil to storage tanks or out to sea
to refuel ships (a common naval procedure) and these STS (ship-to-ship)
transfers were called “bunkering” as the black stuff was pumped,
bunker-to-bunker. That the coal used by
steamships was stored on-board in compartments called “coal bunkers” led
ultimately to another derived term: “bunker oil”. When in the late nineteenth century ships
began the transition from being fuelled by coal to burning oil, the receptacles
of course became “oil bunkers” (among sailors nearly always clipped to
“bunker”) and as refining processes evolved, the fuel specifically produced for
oceangoing ships came to be called “bunker oil”.
Bunker oil is
“dirty stuff”, a highly viscous, heavy fuel oil which is essentially the
residue of crude oil refining; it’s that which remains after the more
refined and volatile products (gasoline (petrol), kerosene, diesel etc) have
been extracted. Until late in the
twentieth century, the orthodox view of economists was its use in big ships was
a good thing because it was a product for which industry had little other use
and, as essentially a by-product, it was relatively cheap. It came in three flavours: (1) Bunker A: Light
fuel oil (similar to a heavy diesel), (2) Bunker B: An oil of intermediate
viscosity used in engines larger than marine diesels but smaller than those
used in the big ships and (3) Bunker C: Heavy fuel oil used in container
ships and such which use VLD (very large displacement), slow-running engines with a huge reciprocating
mass. Because of its composition, Bunker
C especially produced much pollution and although much of this happened at sea
(unseen by most but with obvious implications), when ships reached harbor to dock,
all the smoke and soot became obvious. Over the years, the worst of the pollution from the burning of bunker
oil greatly has been reduced (the work underway even before the Greta Thunberg
(b 2003) era), sometimes by the simple expedient of spraying a mist of water
through the smoke.
Floor-plans
of the upper (Vorbunker) and lower (Führerbunker) levels of the structure
now commonly referred to collectively as the Führerbunker.
History’s most
infamous bunker remains the Berlin Führerbunker
in which Adolf Hitler (1889-1945; Führer
(leader) and German head of government 1933-1945 & head of state 1934-1945)
spent much of the last few months of his life. In the architectural sense there were a number of Führerbunkers built, one at each of the semi-permanent Führerhauptquartiere (Führer Headquarters) created for the German
military campaigns and several others built where required but it’s the one in Berlin
which is remembered as “the Führerbunker”. Before 1944, when the air
raids by the RAF (Royal Air Force) and USAAF (US Army Air Force) intensified, the term Führerbunker seems rarely to have been
used other than by the architects and others involved in their construction and
it wasn’t a designation like Führerhauptquartiere
which the military and other institutions of state shifted between locations
(rather as “Air Force One” is attached not to a specific airframe but to whatever
aircraft in which the US president is travelling). In subsequent historical writing, the term Führerbunker tends often to be applied
to the whole, two-level complex in Berlin and although it was only the lower
layer which officially was designated as that, for most purposes the
distinction is not significant. In military
documents, after January, 1945 the Führerbunker
was referred to as Führerhauptquartiere.
Führerbunker tourist information board, Berlin, Germany.
Only an
information board at the intersection of den
Ministergärten and Gertrud-Kolmar-Straße, erected by the German Government
in 2006 prior to that year's FIFA (Fédération
Internationale de Football Association (International Federation of
Association Football)) World Cup, now marks the place on Berlin's Wilhelmstrasse
77 where once the Führerbunker was located. The Soviet occupation forces razed the new Reich Chancellery and
demolished all the bunker's above-ground structures but the subsequent GDR (Deutsche Demokratische Republik (German
Democratic Republic; the old East Germany) 1949-1990) abandoned attempts
completely to destroy what lay beneath. Until after the fall of the Berlin Wall (1961-1989) the site remained
unused and neglected, “re-discovered” only during excavations by
property developers, the government insisting on the destruction of whatever
was uncovered and, sensitive still to the spectre of “Neo-Nazi shrines”, for years the bunker’s location was never divulged, even as unremarkable buildings
(an unfortunate aspect of post-unification Berlin) began to appear on the
site. Most of what would have covered
the Führerbunker’s footprint is now a
supermarket car park.
The first
part of the complex to be built was the Vorbunker
(upper bunker or forward bunker), an underground facility of reinforced concrete
intended only as a temporary air-raid shelter for Hitler and his entourage in
the old Reich Chancellery. Substantially
completed during 1936-1937, it was until 1943 listed in documents as the Luftschutzbunker der Reichskanzlei (Reich
Chancellery Air-Raid Shelter), the Vorbunker
label applied only in 1944 when the lower level (the Führerbunker proper) was appended. In mid-January 1945, Hitler moved into the Führerbunker and, as the military
situation deteriorated, his appearances above ground became less frequent until
by late March he rarely saw the sky. Finally, on 30 April, he committed suicide.
Bunker
Busters
Northrop Grumman publicity shot of the B-2 Spirit from below, showing the twin bomb-bay doors through which the GBU-57 are released.
Awful as they are, there's an undeniable beauty in the engineering of some weapons and it's unfortunate humankind never collectively has resolved exclusively to devote such ingenuity to stuff other than us blowing up each other. That’s not
a new sentiment, being one philosophers and others have for millennia expressed
in various ways although since the advent of nuclear weapons, concerns understandably
have become heightened. Like every form of
military technology ever deployed, once the “genie is out of the bottle” the
problem is there to be managed and at the dawn of the atomic age, delivering a
lecture in 1936, the British chemist and physicist Francis Aston (1877–1945) (who
created the mass spectrograph, winning the 1922 Nobel Prize in Chemistry for
his use of it to discover and identify the isotopes in many non-radioactive
elements and for his enunciation of the whole number rule) observed:
“There are those about us who say that such
research should be stopped by law, alleging that man's destructive powers are
already large enough. So, no doubt, the
more elderly and ape-like of our ancestors objected to the innovation of cooked
food and pointed out the great dangers attending the use of the newly
discovered agency, fire. Personally, I
think there is no doubt that sub-atomic energy is available all around us and
that one day man will release and control its almost infinite power. We cannot prevent him from doing so and can
only hope that he will not use it exclusively in blowing up his next door
neighbor.”
The use in
June 2025 by the USAF (US Air Force) of fourteen of its Boeing GBU-57 (Guided Bomb
Unit-57) Massive Ordnance Penetrator (MOP) bombs against underground targets in
Iran (twelve on the Fordow Uranium Enrichment Plant and two on the Natanz nuclear
facility) meant “Bunker Buster” hit the headlines. Carried by the Northrop B-2 Spirit heavy
bomber (built between 1989-2000), the GBU-57 is a 14,000 kg (30,000 lb) bomb with
a casing designed to withstand the stress of penetrating through layers of
reinforced concrete or thick rock. “Bunker buster” bombs have been around for a while, the ancestors of
today’s devices first built for the German military early in World War II (1939-1945)
and the principle remains unchanged to this day: up-scaled armor-piercing
shells. The initial purpose was to
produce a weapon with a casing strong enough to withstand the forces imposed
when impacting reinforced concrete structures, the idea simple in that what was
needed was a delivery system which could “bust through” whatever protective
layers surrounded a target, allowing the explosive charge to do damage where
needed rather than wastefully being expended on an outer skin. The German weapons proved effective but inevitably triggered an “arms
race” in that as the war progressed, the concrete layers became thicker, walls over
2 metres (6.6 feet) and ceilings of 5 metres (16 feet) being constructed by 1943. Technological development continued and the
idea extended to rocket-propelled bombs optimized both for armor-piercing and
aerodynamic efficiency, velocity a significant “mass multiplier” which made the
weapons still more effective.
USAF test-flight footage of a Northrop B-2 Spirit dropping two GBU-57 "Bunker Buster" bombs.
Concurrent
with this, the British developed the first true “bunker busters”, building on
the idea of the naval torpedo, one aspect of which was that, by exploding a short distance
from its target, it was highly damaging because it was able to take advantage
of one of the properties of water (quite strange stuff according to those who
study it), which is that it doesn’t compress.
What that meant was it was often the “shock wave” of the water rather
than the blast itself which could breach a hull, the same principle used for
the famous “bouncing bombs” of the RAF’s “Dambuster” raids (Operation Chastise, 17 May 1943) on German
dams. Because of the way water behaved,
it wasn’t necessary to score the “direct hit” which had been the ideal in the
early days of aerial warfare.
RAF Bomber
Command archive photograph of Avro Lancaster (built between 1941-1946) in
flight with Grand Slam mounted (left) and a comparison of the Tallboy &
Grand Slam (right), illustrating how the latter was in most respects a
scaled-up version of the former. To
carry the big Grand Slams, 32 “B1 Special” Lancasters were in 1945 built with up-rated
Rolls-Royce Merlin V12 engines, the removal of the bomb doors (the Grand Slam
carried externally, its dimensions exceeding internal capacity), deleted front
and mid-upper gun turrets, no radar equipment and a strengthened undercarriage. Such was the concern with weight (especially
for take-off) that just about anything non-essential was removed from the B1
Specials, even three of the four fire axes and the crew door ladder. In the US, Boeing went through a similar exercise
to produce the run of “Silverplate” B-29 Superfortresses able to carry the first
A-bombs used in August, 1945.
Best known
of the British devices were the so-called “earthquake bombs”, the Tallboy (12,000
lb; 5.4 ton) & Grand Slam (22,000 lb; 10 ton) which, despite the impressive
bulk, were classified by the War Office as “medium capacity”. The terms “Medium Capacity” (MC) & “High
Capacity” (HC) referenced not the gross weight or physical dimensions but the ratio of
explosive filler to the total weight of the construction (ie how much was explosive
compared to the casing and ancillary components). Because both had thick casings to ensure penetration
deep into hardened targets (bunkers and other structures encased in rock or reinforced
concrete) before exploding, the internal dimensions accordingly were reduced
compared with the ratio typical of contemporary ordnance. A High Capacity (HC) bomb (a typical “general-purpose” bomb) had a thinner casing and a much higher proportion of explosive (sometimes
over 70% of total weight). These were
intended for area bombing (known also as “carpet bombing”) and caused wide
blast damage whereas the Tallboy & Grand Slam were penetrative, with casings
optimized for aerodynamic efficiency, their supersonic travel working as a mass-multiplier. The Tallboy’s
5,200 lb (2.3 ton) explosive load was some 43% of its gross weight while the
Grand Slam’s 9,100 lb (4 ton) absorbed 41%; this may be compared with the “big”
4,000 lb (1.8 ton) HC “Blockbuster” which allocated 75% of the gross weight to
its 3,000 lb (1.4 ton) charge. Like many
things in engineering (not just in military matters) the ratio represented a
trade-off, the MC design prioritizing penetrative power and structural
destruction over blast radius. The
novelty of the Tallboy & Grand Slam was that as earthquake bombs, their destructive potential was able to be unleashed not necessarily by achieving a
direct hit on a target but by entering the ground nearby, the explosion (1)
creating an underground cavity (a camouflet) and (2) transmitting a shock-wave
through the target’s foundations, leading to the structure collapsing into the
newly created lacuna.
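The quoted MC/HC percentages are simple ratios of filler weight to gross weight and the arithmetic can be verified directly; a minimal sketch (the figures taken from the text above, the variable names merely illustrative):

```python
# Verify the explosive-filler ratios quoted for the Tallboy, Grand Slam
# and Blockbuster: ratio = explosive filler (lb) / gross weight (lb).
bombs = {
    "Tallboy (MC)":     (5_200, 12_000),
    "Grand Slam (MC)":  (9_100, 22_000),
    "Blockbuster (HC)": (3_000, 4_000),
}

for name, (filler, gross) in bombs.items():
    # :.0% formats the fraction as a whole-number percentage
    print(f"{name}: {filler / gross:.0%} of gross weight is explosive")
```

Running this reproduces the figures in the text: roughly 43% for the Tallboy, 41% for the Grand Slam and 75% for the Blockbuster, the gap between the MC and HC designs making the trade-off plain.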
The
etymology of camouflet has an interesting history in both French and military
mining. Originally it meant “a whiff of
smoke in the face” (from a fire or pipe) and in figurative use it was a
reference to a snub or slight insult (something unpleasant delivered directly
to someone); although the origin is murky, it may be related to
the earlier French verb camoufler (to
disguise; to mask) which evolved also into “camouflage”. In the specialized military jargon of siege
warfare or mining (sapping), over the seventeenth to nineteenth centuries “camouflet”
referred to “an underground explosion that does not break the surface, but
collapses enemy tunnels or fortifications by creating a subterranean void or
shockwave”. The use of this tactic is
best remembered from the Western Front in World War I,
some of the huge craters now tourist attractions.
Under
watchful eyes: Grand Ayatollah Ali Khamenei (b 1939; Supreme Leader, Islamic Republic of Iran since 1989) delivering a speech, sitting in
front of the official portrait of the republic’s ever-unsmiling founder, Grand
Ayatollah Ruhollah Khomeini (1900-1989; Supreme Leader, Islamic Republic of
Iran, 1979-1989). Ayatollah Khamenei
seemed in 1989 an improbable choice as Supreme Leader because others were
better credentialed but though cautious and uncharismatic, he has proved a great
survivor in a troubled region.
Since aerial
bombing began to be used as a strategic weapon, of great interest has been the
debate over BDA (battle damage assessment) and this issue emerged almost as
soon as the bunker buster attack on Iran was announced, focused on the extent
to which the MOPs had damaged the targets, the deepest of which were concealed deep
inside a mountain. BDA is a constantly
evolving science and while satellites have made analysis of surface damage
highly refined, it’s more difficult to understand what has happened deep
underground. Indeed, it wasn’t until the
USSBS (United States Strategic Bombing Survey) teams toured Germany and Japan
in 1945-1946, conducting interviews, economic analysis and site surveys, that a
useful (and substantially accurate) understanding emerged of the effectiveness of
bombing although what technological advances have allowed (for those with the
resources) is that so-called “panacea targets” (ie critical infrastructure
and such once dismissed by planners because the required precision was for many
reasons rarely attainable) can now accurately be targeted, the USAF able to
drop a bomb within a few feet of the aiming point. As the phrase is used by the military, the Fordow
Uranium Enrichment Plant is a classic “panacea target” but whether even a technically
successful strike will achieve the desired political outcome remains to be
seen.
Mr Trump,
in a moment of exasperation, posted on Truth Social of Iran & Israel: “We basically have
two countries that have been fighting so long and so hard that they don't know
what the fuck they're doing.” Actually, both know exactly WTF they're doing; it's just Mr Trump (and
many others) would prefer they didn't do it.
Donald Trump (b 1946; US president
2017-2021 and since 2025) claimed “total obliteration” of the targets while Grand
Ayatollah Khamenei admitted only there had been “some damage”; which is closer to the truth
may one day be revealed. Even modelling
of the effects has probably been inconclusive because the deeper one goes
underground, the greater the number of variables in the natural structure, and
the nature of the internal built environment will also influence blast
behaviour. All experts seem to agree much
damage will have been done but what can’t yet be determined is what has been
suffered by the facilities which sit as deep as 80 m (260 feet) inside the
mountain although, as the name implies, “bunker busters” are designed for buried
targets and it’s not always required for the blast directly to reach the target. Because the shock-wave can travel through earth
& rock, the effect is something like that of an earthquake and if the structure
sufficiently is affected, it may be the area can be rendered geologically too
unstable again to be used for its original purpose.
Within minutes of the bombing having been announced, legal academics were being interviewed (though not by Fox News) to explain why the attacks were unlawful under international law and in a sign of the times, the White House didn't bother to discuss fine legal points like the distinction between "preventive & pre-emptive strikes", preferring (like Fox News) to focus on the damage done. However, whatever
the murkiness surrounding the BDA, many analysts have concluded that even if
before the attacks the Iranian authorities had not approved the creation of a
nuclear weapon, this attack will have persuaded them one is essential for “regime
survival”, thus the interest in both Tel Aviv and (despite denials) Washington
DC in “regime change”. The consensus
seems to be Grand Ayatollah Khamenei had, prior to the strike, not ordered the creation
of a nuclear weapon but that all energies were directed towards completing the preliminary steps, thus the enriching of uranium to ten times the level
required for use in power generation; the ayatollah liked to keep his options
open. So, the fear of some is that the attacks,
even if they have (by weeks, months or years) delayed the Islamic Republic’s
work on nuclear development, may prove counter-productive in that they convince
the ayatollah to concur with the reasoning of every state which since 1945 has
adopted an independent nuclear deterrent (IND). That reasoning is not complex and hasn't changed since a prehistoric
man first picked up a stout stick to wave as a pre-lingual message to potential adversaries,
warning them there would be consequences for aggression. Although a theocracy, those who command power
in the Islamic Republic are part of an opaque political institution and in the
struggle which has for some time been conducted in anticipation of the death of
the aged (and reportedly ailing) Supreme Leader, the matter of “an Iranian IND” is one of the central
dynamics. Many will be following what unfolds in Tehran and the observers will not be only in Tel Aviv and Washington DC because in the region and beyond, few things focus the mind like the thought of ayatollahs with A-Bombs.
Of the word "bust"
The Great Bust: The Depression of the Thirties (1962)
by Jack Lang (left); highly qualified content provider Busty Buffy (b 1996, who has
never been accused of misleading advertising, centre); and The people's champion, Mr Lang, a bust of Jack Lang in painted cast
plaster by an unknown artist, circa 1927, National Portrait Gallery, Canberra,
Australia (right). Remembered for a few things, Jack
Lang (1876–1975; premier of the Australian state of New South Wales (NSW)
1925-1927 & 1930-1932) remains best known for having in 1932 been the first
head of government in the British Empire to have been sacked by the Crown
since William IV (1765–1837; King of the UK 1830-1837) in 1834 dismissed Lord
Melbourne (1779–1848; prime minister of the UK 1834 & 1835-1841).
Those
learning English must think it at least careless that things can both be (1) “razed
to the ground” (totally to destroy something (typically a structure), usually
by demolition or incineration) and (2) “raised to the sky” (physically lifted upwards). The etymologies of “raze” and “raise” differ
but they're pronounced the same, so it's fortunate the spellings vary; in
other troublesome examples of unrelated meanings, spelling and pronunciation
can align, as in “bust”. When used in
ways most directly related to human anatomy: (1) “a sculptural portrayal of a
person's head and shoulders” & (2) “the circumference of a woman's chest
around her breasts”, there is an etymological link but these uses are wholly unconnected
with bust's other senses.
Bust of
Lindsay Lohan in white marble by Stable Diffusion. Sculptures of just the neck and head came also to be called “busts”, the
emphasis on the technique rather than the original definition.
Bust in the sense
of “a sculpture of upper torso and head” dates from the 1690s and was from the
sixteenth century French buste, from
the Italian busto (upper body;
torso), from the Latin bustum (funeral
monument, tomb (although the original sense was “funeral pyre, place where
corpses are burned”)) and it may have emerged (as a shortened form) from ambustum, neuter of ambustus (burned around), past participle of amburere (burn around, scorch), the construct being ambi- (around) + urere (to burn). The
alternative etymology traces a link to the Old Latin boro, the early form of the Classical Latin uro (to burn) and it's thought the development in Italian was
influenced by the Etruscan custom of keeping the ashes of the dead in an urn
shaped like the person when alive. Thus
the use, common by the 1720s, of bust (a clipping from the French buste) meaning “a carving of the trunk of
the human body from the chest up”. From
this came the meaning “dimension of the bosom; the measurement around a woman's
body at the level of her breasts” and that evolved on the basis of a comparison
with the sculptures, the base of which was described as the “bust-line”, the
term still used in dress-making (and for other comparative purposes as one of
the three “vital statistics” by which women are judged (bust, waist, hips),
each circumference having an “ideal range”). It's not known when “bust” and “bust-line” came into oral use among
dress-makers and related professions but the terms have been documented since the 1880s. Derived forms (sometimes hyphenated) include
busty (tending to bustiness, thus Busty Buffy's choice of stage-name), overbust
& underbust (technical terms in women's fashion referencing specific
measurements) and bustier (a tight-fitting women's top which covers (most or
all of) the bust).
Benito
Mussolini (1883-1945; Duce (leader) & Prime-Minister of Italy 1922-1943) standing
beside his “portrait bust” (1926).
The
bust was carved by Swiss sculptor Ernest Durig (1894–1962) who gained posthumous
notoriety when his career as a forger was revealed with the publication of his
drawings which he'd represented as being from the hand of the French sculptor Auguste
Rodin (1840-1917), under whom he claimed to have studied. Mussolini appears here in one of the
subsequently much caricatured poses which were a part of his personality cult. More than one of the Duce's counterparts in other nations was known to have made fun of some of the more outré poses and affectations, the outstretched chin, right hand braced against the hip and straddle-legged stance among the popular motifs.
“Portrait bust” in marble (circa 1895) of Otto von Bismarck (1815-1898; chancellor of the German Empire (the "Second Reich") 1871-1890) by the German sculptor Reinhold Begas (1831-1911).
In
sculpture, what had been known as the “portrait statue” came after the 1690s to
be known as the “portrait bust” although both terms meant “sculpture of upper
torso and head” and these proved a popular choice for military figures because
the aspect enabled the inclusion of bling such as epaulettes, medals and other
decorations and being depictions of the human figure, busts came to be vested
with special significance by the superstitious. In early 1939, during construction of the new Reich Chancellery in
Berlin, workmen dropped one of the busts of Otto von Bismarck by Reinhold Begas, breaking it at the neck. For decades, the bust had sat in the old
Chancellery and the building's project manager, Albert Speer (1905–1981; Nazi
court architect 1934-1942; Nazi minister of armaments and war production
1942-1945), knowing Adolf Hitler (1889-1945; Führer (leader) and German head of
government 1933-1945 & head of state 1934-1945) believed the Reich Eagle
toppling from the post-office building right at the beginning of World War I had been a harbinger of doom for the nation, kept the accident
secret, hurriedly issuing a commission to the German sculptor Arno Breker
(1900–1991), who carved an exact copy. To
give the fake the necessary patina, it was soaked for a time in strong, black
tea, the porous quality of marble enabling the fluid to induce some accelerated
aging. Interestingly, in his (sometimes
reliable) memoir (Erinnerungen
(Memories or Reminiscences), published in English as Inside the Third Reich (1969)), even the technocratic Speer
admitted of the accident: “I felt this as an evil omen”.
The other
senses of bust (as a noun, verb & adjective) are diverse (and sometimes
diametric opposites) and include: “to break or fail”; “to be caught doing
something unlawful / illicit / disgusting etc”; “to debunk”; “dramatically or
unexpectedly to succeed”; “to go broke”; “to break in” (horses, girlfriends etc);
“to assault”; the downward portion of an economic cycle (ie “boom & bust”);
“the act of effecting an arrest” and “someone (especially in professional sport)
who failed to perform to expectation”. That's quite a range and it has meant the creation of dozens of
idiomatic forms, the best known of which include: “boom & bust”, “busted
flush”, “dambuster”, “bunker buster”, “busted arse country”, “drug bust”, “cloud bust”, “belly-busting”, “bust
one's ass (or butt)”, “bust a gut”, “bust a move”, “bust a nut”, “bust-down”, “bust
loose”, “bust off”, “bust one's balls”, “bust-out”, “sod buster”, “bust the dust”,
“myth-busting” and “trend-busting”. In the
sense of “breaking through”, bust was from the Middle English busten, a variant of bursten & bresten (to burst) and may be compared with the Low German basten & barsten (to burst). Bust in
the sense of “break”, “smash”, “fail”, “arrest” etc was a creation of
mid-nineteenth century US English and is of uncertain inspiration but most
etymologists seem to concur it was likely a modification of “burst” effected
with a phonetic alteration, though it's not impossible it came directly as an
imperfect echo of Germanic speech. The
apparent contradiction of bust meaning both “fail” and “dramatically succeed”
happened because the former was an allusion to “being busted” (ie broken) while
the latter used the notion of “busting through”.