(1) A movement in avant-garde art, developed originally by a group of Italian artists in 1909, in which forms (often derived from the then novel cubism) were used to represent rapid movement and dynamic motion (sometimes with initial capital letter).
(2) A
style of art, literature, music, etc and a theory of art and life in which
violence, power, speed, mechanization or machines, and hostility to the past or
to traditional forms of expression were advocated or portrayed (often with initial
capital letter).
(3) As futurology,
a quasi-discipline practiced by (often self-described) futurologists who
attempt to predict future events, movements, technologies etc.
(4) In the theology of Judaism, the Jewish expectation of the messiah in the future rather than recognizing him in the person of Christ.
(5) In
the theology of Christianity, eschatological interpretations associating some
Biblical prophecies with future events yet to be fulfilled, including the
Second Coming.
1909: From the Italian futurismo (literally "futurism" and dating from circa 1909), the construct being futur(e) + -ism. Future was from the Middle English future & futur, from the Old French futur (that which is to come; the time ahead), from the Latin futūrus (going to be; yet to be) which (as a noun) was the irregular suppletive future participle of esse (to be), from the primitive Indo-European bheue- (to be, exist; grow). It was cognate with the Old English bēo (I become, I will be, I am) and displaced the native Old English tōweard and the Middle English afterhede (future; literally "afterhood") in the given sense. The technical use in grammar (of tense) dates from the 1520s. The -ism suffix was from the Ancient Greek -ισμός (-ismós) & -isma noun suffixes, often directly, sometimes through the Latin -ismus & -isma (from where English picked up -ize) and sometimes through the French -isme or the German -ismus, all ultimately from the Ancient Greek (where it tended more specifically to express a finished act or thing done). It appeared in loanwords from Greek, where it was used to form abstract nouns of action, state, condition or doctrine from verbs and, on this model, was used as a productive suffix in the formation of nouns denoting action or practice, state or condition, principles, doctrines, a usage or characteristic, devotion or adherence (criticism; barbarism; Darwinism; despotism; plagiarism; realism; witticism etc). Futurism & futurology are nouns, futurist is a noun & adjective and futuristic is an adjective; the noun plural is futurisms.
As a descriptor of the movement in art and literature, futurism (as the Italian futurismo) was adopted in 1909 by the Italian poet Filippo Tommaso Marinetti (1876-1944) and the first reference to futurist (a practitioner in the field of futurism) dates from 1911 although the word had been used as early as 1842 in Protestant theology in the sense of "one who holds that nearly the whole of the Book of Revelation refers principally to events yet to come". The secular world did begin to use futurist to describe "one who has (positive) feelings about the future" in 1846 but for the remainder of the century, use was apparently rare. The (now probably extinct) noun futurity dates from the early seventeenth century. The noun futurology was introduced by Aldous Huxley (1894-1963) in his book Science, Liberty and Peace (1946) and has (for better or worse) created a minor industry of (often self-described) futurologists. In theology, the adjective futuristic came into use in 1856 with reference to prophecy but use soon faded. In concert with futurism, by 1915 it referred in art to "avant-garde; ultra-modern" while by 1921 it was separated from the exclusive attachment to art and meant also "pertaining to the future, predicted to be in the future", the use in this context spiking rapidly after World War II (1939-1945) when technological developments in fields such as ballistics, jet aircraft, space exploration, electronics and nuclear physics stimulated interest in such progress.
Untouched: Crooked Hillary Clinton (b 1947; US secretary of state 2009-2013) & Bill Clinton (b 1946; US president 1993-2001) with cattle, 92nd Annual Hopkinton State Fair, Contoocook, New Hampshire, September 2007.
Futures, a financial instrument used in the trade of currencies and commodities, appeared first in 1880; they allow (1) speculators to bet on price movements and (2) producers and sellers to hedge against price movements, and in both cases profits (and losses) can be booked against movement up or down (the arithmetic is sketched below). Futures trading can be lucrative but is also risky, those who win gaining from those who lose, and those in the markets are usually professionals. The story behind crooked Hillary Clinton's extraordinary profits in cattle futures (not a field in which she had previously (or has subsequently) displayed interest or expertise) while "serving" as First Lady of Arkansas (1979–1981 & 1983–1992) remains murky but it can certainly be said that for an apparent "amateur" dabbling in a market played usually by experienced professionals, she was remarkably successful and while perhaps there was some luck involved, her trading record was such it's a wonder she didn't take it up as a career. While many analysts have, based on what documents are available, commented on crooked Hillary's somewhat improbable (and apparently sometimes "irregular") foray into cattle futures, there was never an "official governmental investigation" by an independent authority and thus no adverse findings have ever been published.
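The mechanics can be sketched in a few lines of Python; this is a minimal illustration only, in which the futures_pnl helper, the prices and the contract size are all hypothetical, not the terms of any real exchange or of the trades discussed above:

    # Minimal sketch of futures P&L; prices & contract size are hypothetical.
    CONTRACT_UNITS = 40_000  # units of the commodity per contract (illustrative)

    def futures_pnl(entry, settle, contracts, side):
        """Dollar P&L; side is +1 for a long position, -1 for a short."""
        return (settle - entry) * CONTRACT_UNITS * contracts * side

    # A speculator long 10 contracts gains when the price rises...
    print(futures_pnl(0.60, 0.66, 10, +1))   # ≈ +24,000
    # ...while a producer short 10 contracts loses on the futures leg,
    # offset by the higher cash price received for the physical goods (the hedge).
    print(futures_pnl(0.60, 0.66, 10, -1))   # ≈ -24,000

The two legs are mirror images, which is the sense in which those who win gain from those who lose.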
The Arrival (1913), oil on canvas by
Christopher Richard Wynne Nevinson (1889-1946), Tate Gallery.
Given what would unfold over the twentieth century, it's probably difficult to appreciate quite how optimistic the Western world was in the years leading up to World War I (1914-1918). Such had been the rapidity of the discovery of novelties and of progress in so many fields that expectations of the future were high and, beginning in Italy, futurism was a movement devoted to displaying the energy, dynamism and power of machines and the vitality and change they were bringing to society. It's also often forgotten that when the first futurist exhibition was staged in Paris in 1912, the critical establishment was unimpressed, the elaborate imagery with its opulence of color offending their sense of refinement, by then so attuned to the sparseness of the cubists.
The Hospital Train (1915),
oil on canvas by Gino Severini (1883-1966), Stedelijk Museum.
Futurism had debuted with some impact, the Paris newspaper Le Figaro in 1909 publishing the manifesto by Italian poet Filippo Tommaso Marinetti, which dismissed all that was old and celebrated change, originality and innovation in culture and society, something which should be depicted in art, music and literature. Marinetti exulted in the speed and power of the new technologies which were disrupting society: automobiles, aeroplanes and other clattering machines. Whether he found beauty in the machines or the violence and conflict they delivered was something he left his readers to decide and there were those seduced by both, but his stated goal was the repudiation of traditional values and the destruction of cultural institutions such as museums and libraries. Whether this was intended as a revolutionary roadmap or just a provocation to inspire anger and controversy is something historians have debated. Assessment of Marinetti as a poet has always been colored by his reputation as a proto-fascist and some treat as "fake mysticism" his claim that his "visions" of the future, and of the path to follow to get there, came to him in the moment of a violent car crash.
Futurismo: Uomo Nuovo (New Man, 1918), drawing
by Mario Sironi (1885-1961).
As a technique, the futurist artists borrowed much from the cubists, deploying the same fragmented and intersecting plane surfaces and outlines to render a number of simultaneous, overlaid views of an object but whereas the cubists tended to still life, portraiture and other, usually static, studies of the human form, the futurists worshiped movement, their overlays a device to depict rhythmic spatial repetitions of an object's outlines during movement. People did appear in futurist works but usually they weren't the focal point, instead appearing only in relation to some speeding or noisy machine. Some of the most prolific of the futurist artists were killed in World War I and as a political movement futurism didn't survive the conflict, the industrial war dulling the public appetite for the cult of the machine. However, the influence of its compositional techniques continued into the 1920s and contributed to art deco which, in more elegant form, would integrate the new world of machines and mass-production into motifs still in use today.
Motociclista (Motorcyclist, circa 1924), oil on canvas by Mario Sironi.
By the early twentieth century when the Futurism movement emerged, machines and mechanisms were already hundreds of years old (indeed the precursor devices pre-date Christ) but what changed was that the new generations of machines had become sexy (at least in the eyes of men), associated as they were with something beyond mere functionalism: speed and style. While planes, trains & automobiles all attracted the futurists, the motorcycle was a much-favored motif because it possessed an intimacy beyond other forms of transportation in that it was literally more an extension of the human body, the rider at speed conforming to the shape of a structure fashioned for aerodynamic efficiency with hands and feet directly attached to the vital controls: machine as extension of man.
The Modern Boy No. 100, Vol 4, Week Ending 4 January, 1930.
The Modern Boy (1928-1939) was, as the name implies, a British magazine targeted at males aged 12-18 and the content reflected the state of mind in the society of the inter-war years, the 1930s a curious decade of progress, regression, hope and despair. Although much of what filled the pages (guns, military conquest and other exploits, fast cars and motorcycles, stuff the British were doing in other peoples' countries) would today see the editors cancelled or visited by one of the many organs of the British state concerned with the suppression of such things, it was what readers (presumably with the acquiescence of their parents) wanted. Best remembered of the authors whose works appeared in The Modern Boy was Captain W.E. Johns (1893–1968), a World War I RFC
(Royal Flying Corps) pilot who created the fictional air-adventurer Biggles. The first Biggles tale appeared in 1932 in Popular Flying magazine (a British monthly distinct from the American Popular Aviation, which survives as Flying) and his stories are still sometimes re-printed (although with the blatant racism edited out). The first Biggles story had a very modern-sounding title: The White Fokker. The Modern Boy was a successful weekly which in 1938 was re-launched as Modern Boy, the reason for the change not known although dropping superfluous words (and much else) was a feature of modernism. In October 1939, a few weeks after the outbreak of World War II, publication ceased, Modern Boy like many titles a victim of restrictions by the Board of Trade on the supply of paper for civilian use.
Jockey Club Innovation Tower, Hong Kong (2013)
by Zaha Hadid (1950-2016).
If the characteristics of futurism in art were identifiable (though not always admired), in architecture it can be hard to tell where modernism ends and futurism begins. Aesthetics aside, the core purpose of modernism was of course its utilitarian value and that did tend to dictate the austerity, straight lines and crisp geometry that evolved into mid-century minimalism, so modernism, in its pure form, should probably be thought of as a style without an ulterior motive. Futurist architecture however carried an agenda which in its earliest days borrowed from the futurist artists in that it was an assault on the past but later moved on and in the twenty-first century, the futurist architects seem now to be interested above all in the possibilities offered by advances in structural engineering, functionality sacrificed if need be just to demonstrate that something new can be done. That's doubtless of great interest at awards dinners where architects give prizes to each other for this and that but has produced an international consensus that it's better to draw something new than something elegant. The critique is that while modernism once offered "less is more", with neo-futurist architecture it's now "less is bore". Art deco and mid-century modernism have aged well and it will be interesting to see how history judges the neo-futurists.
(1) A large
bin or receptacle; a fixed chest or box.
(2) In
military use, historically a fortification set mostly below the surface of the
ground with overhead protection provided by logs and earth or by concrete and
fitted with above-ground embrasures through which guns may be fired.
(3) A
fortification set mostly below the surface of the ground and used for a variety
of purposes.
(4) In golf,
an obstacle, classically a sand trap but sometimes a mound of dirt,
constituting a hazard.
(5) In
nautical use, to provide fuel for a vessel.
(6) In
nautical use, to convey bulk cargo (except grain) from a vessel to an adjacent
storehouse.
(7) In
golf, to hit a ball into a bunker.
(8) To
equip with or as if with bunkers.
(9) In
military use, to place personnel or materiel in a bunker or bunkers (sometimes
as “bunker down”).
1755–1760: From the Scottish bonkar (box, chest (also "seat" in the sense of "bench")), of obscure origin, but etymologists conclude the use related to furniture hints at a relationship with banker (bench). Alternatively, it may be from a Scandinavian source such as the Old Swedish bunke (boards used to protect the cargo of a ship). The meaning "receptacle for coal aboard a ship" was in use by at least 1839 (coal-burning steamships coming into general use in the 1820s). The use to describe the obstacles on golf courses is documented from 1824 (probably from the extended sense "earthen seat" which dates from 1805) but perhaps surprisingly, the familiar sense from military use (dug-out fortification) seems not to have appeared before World War I (1914-1918) although the structures so described had for millennia existed. "Bunkermate" was army slang for the individual with whom one shares a bunker while the now obsolete "bunkerman" ("bunkermen" the plural) referred to someone (often the man in charge) who worked at an industrial coal storage bunker. Bunker & bunkerage are nouns, bunkering is a noun & verb, bunkered is a verb and bunkerish, bunkeresque, bunkerless & bunkerlike are adjectives; the noun plural is bunkers.
Just as ships called "coalers" were used to transport coal to and from shore-based "coal stations", it was "oilers" which took oil to storage tanks or out to sea to refuel ships (a common naval procedure) and these STS (ship-to-ship) transfers were called "bunkering" as the black stuff was pumped, bunker-to-bunker. That the coal used by steamships was stored on-board in compartments called "coal bunkers" led ultimately to another derived term: "bunker oil". When in the late nineteenth century ships began the transition from being fuelled by coal to burning oil, the receptacles of course became "oil bunkers" (among sailors nearly always clipped to "bunker") and as refining processes evolved, the fuel specifically produced for oceangoing ships came to be called "bunker oil".
Bunker oil is "dirty stuff", a highly viscous, heavy fuel oil which is essentially the residue of crude oil refining; it's that which remains after the more refined and volatile products (gasoline (petrol), kerosene, diesel etc) have been extracted. Until late in the twentieth century, the orthodox view of economists was its use in big ships was a good thing because it was a product for which industry had little other use and, as essentially a by-product, it was relatively cheap. It came in three flavours: (1) Bunker A: light fuel oil (similar to a heavy diesel), (2) Bunker B: an oil of intermediate viscosity used in engines larger than marine diesels but smaller than those used in the big ships and (3) Bunker C: heavy fuel oil used in container ships and such which use VLD (very large displacement), slow-running engines with a huge reciprocating mass. Because of its composition, Bunker C especially produced much pollution and although much of this happened at sea (unseen by most but with obvious implications), when ships reached harbor to dock, all the smoke and soot became obvious. Over the years, the worst of the pollution from the burning of bunker oil greatly has been reduced (the work underway even before the Greta Thunberg (b 2003) era), sometimes by the simple expedient of spraying a mist of water through the smoke.
Floor-plans
of the upper (Vorbunker) and lower (Führerbunker) levels of the structure
now commonly referred to collectively as the Führerbunker.
History's most infamous bunker remains the Berlin Führerbunker in which Adolf Hitler (1889-1945; Führer (leader) and German head of government 1933-1945 & head of state 1934-1945) spent much of the last few months of his life. In the architectural sense there were a number of Führerbunkers built, one at each of the semi-permanent Führerhauptquartiere (Führer Headquarters) created for the German military campaigns and several others built where required, but it's the one in Berlin which is remembered as "the Führerbunker". Before 1944, when the air raids by the RAF (Royal Air Force) and USAAF (US Army Air Force) intensified, the term Führerbunker seems rarely to have been used other than by the architects and others involved in their construction and it wasn't a designation like Führerhauptquartiere which the military and other institutions of state shifted between locations (rather as "Air Force One" is attached not to a specific airframe but to whatever aircraft in which the US president is travelling). In subsequent historical writing, the term Führerbunker tends often to be applied to the whole, two-level complex in Berlin and although it was only the lower level which officially was designated as that, for most purposes the distinction is not significant. In military documents, after January 1945 the Führerbunker was referred to as Führerhauptquartiere.
Führerbunker tourist information board, Berlin, Germany.
Only an information board at the intersection of den Ministergärten and Gertrud-Kolmar-Straße, erected by the German Government in 2006 prior to that year's FIFA (Fédération Internationale de Football Association (International Federation of Association Football)) World Cup, now marks the place on Berlin's Wilhelmstrasse 77 where once the Führerbunker was located. The Soviet occupation forces razed the new Reich Chancellery and demolished all the bunker's above-ground structures but the subsequent GDR (Deutsche Demokratische Republik (German Democratic Republic; the old East Germany) 1949-1990) abandoned attempts completely to destroy what lay beneath. Until after the fall of the Berlin Wall (1961-1989) the site remained unused and neglected, "re-discovered" only during excavations by property developers, the government insisting on the destruction of whatever was uncovered and, sensitive still to the spectre of "Neo-Nazi shrines", for years the bunker's location was never divulged, even as unremarkable buildings (an unfortunate aspect of post-unification Berlin) began to appear on the site. Most of what would have covered the Führerbunker's footprint is now a supermarket car park.
The first part of the complex to be built was the Vorbunker (upper bunker or forward bunker), an underground facility of reinforced concrete intended only as a temporary air-raid shelter for Hitler and his entourage in the old Reich Chancellery. Substantially completed during 1936-1937, it was until 1943 listed in documents as the Luftschutzbunker der Reichskanzlei (Reich Chancellery Air-Raid Shelter), the Vorbunker label applied only in 1944 when the lower level (the Führerbunker proper) was appended. In mid-January 1945, Hitler moved into the Führerbunker and, as the military situation deteriorated, his appearances above ground became less frequent until by late March he rarely saw the sky. Finally, on 30 April, he committed suicide.
Bunker
Busters
Northrop Grumman publicity shot of a B-2 Spirit from below, showing the twin bomb-bay doors through which GBU-57s are released.
Awful as they are, there's an undeniable beauty in the engineering of some weapons and it's unfortunate humankind never collectively has resolved exclusively to devote such ingenuity to stuff other than us blowing up each other.
The use in June 2025 by the USAF (US Air Force) of fourteen of its Boeing GBU-57 (Guided Bomb Unit-57) Massive Ordnance Penetrator (MOP) bombs against underground targets in Iran (twelve on the Fordow Uranium Enrichment Plant and two on the Natanz nuclear facility) meant "Bunker Buster" hit the headlines. Carried by the Northrop B-2 Spirit heavy bomber (built between 1989-2000), the GBU-57 is a 14,000 kg (30,000 lb) bomb with a casing designed to withstand the stress of penetrating through layers of reinforced concrete or thick rock. "Bunker buster" bombs have been around for a while, the ancestors of today's devices first built for the German military early in World War II (1939-1945) and the principle remains unchanged to this day: up-scaled armor-piercing shells. The initial purpose was to produce a weapon with a casing strong enough to withstand the forces imposed when impacting reinforced concrete structures, the idea simple in that what was needed was a delivery system which could "bust through" whatever protective layers surrounded a target, allowing the explosive charge to do damage where needed rather than wastefully being expended on an outer skin. The German weapons proved effective but inevitably triggered an "arms race" in that as the war progressed, the concrete layers became thicker, walls over 2 metres (6.6 feet) and ceilings of 5 m (16 feet) being constructed by 1943. Technological development continued and the idea extended to rocket-propelled bombs optimized both for armor-piercing and aerodynamic efficiency, velocity a significant "mass multiplier" which made the weapons still more effective.
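To put a number on velocity as a "mass multiplier" (an illustrative calculation, not the figures for any particular weapon): kinetic energy at impact is E = ½mv², so energy rises with the square of velocity, and a penetrator arriving 40% faster strikes with roughly double the energy of the same mass at the lower speed (1.4² ≈ 1.96), which is why streamlining for terminal velocity counted for almost as much as sheer weight.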
USAF test-flight footage of a Northrop B-2 Spirit dropping two GBU-57 "Bunker Buster" bombs.
Concurrent with this, the British developed the first true "bunker busters", building on an idea from the naval torpedo: a weapon exploding a short distance from its target could be highly damaging because it was able to take advantage of one of the properties of water (quite strange stuff according to those who study it), which is that it doesn't compress. What that meant was it was often the "shock wave" of the water rather than the blast itself which breached a hull, the same principle used for the famous "bouncing bombs" of the RAF's "Dambuster" raids (Operation Chastise, 17 May 1943) on German dams. Because of the way water behaved, it wasn't necessary to score the "direct hit" which had been the ideal in the early days of aerial warfare.
RAF Bomber Command archive photograph of an Avro Lancaster (built between 1941-1946) in flight with Grand Slam mounted (left) and a comparison of the Tallboy & Grand Slam (right), illustrating how the latter was in most respects a scaled-up version of the former. To carry the big Grand Slams, 32 "B1 Special" Lancasters were in 1945 built with up-rated Rolls-Royce Merlin V12 engines, the bomb doors removed (the Grand Slam carried externally, its dimensions exceeding internal capacity), front and mid-upper gun turrets deleted, no radar equipment and a strengthened undercarriage. Such was the concern with weight (especially for take-off) that just about anything non-essential was removed from the B1 Specials, even three of the four fire axes and the crew door ladder. In the US, Boeing went through a similar exercise to produce the run of "Silverplate" B-29 Superfortresses able to carry the first A-bombs used in August 1945.
Best known of the British devices were the so-called "earthquake bombs", the Tallboy (12,000 lb; 5.4 ton) & Grand Slam (22,000 lb; 10 ton) which, despite their impressive bulk, were classified by the War Office as "medium capacity". The terms "Medium Capacity" (MC) & "High Capacity" (HC) referenced not the gross weight or physical dimensions but the ratio of explosive filler to the total weight of the construction (ie how much was explosive compared to the casing and ancillary components). Because both had thick casings to ensure penetration deep into hardened targets (bunkers and other structures encased in rock or reinforced concrete) before exploding, the internal dimensions accordingly were reduced compared with the ratio typical of contemporary ordnance. A High Capacity (HC) bomb (a typical "general-purpose" bomb) had a thinner casing and a much higher proportion of explosive (sometimes over 70% of total weight). These were intended for area bombing (known also as "carpet bombing") and caused wide blast damage whereas the Tallboy & Grand Slam were penetrative, with casings optimized for aerodynamic efficiency, their supersonic travel working as a mass-multiplier. The Tallboy's 5,200 lb (2.3 ton) explosive load was some 43% of its gross weight while the Grand Slam's 9,100 lb (4 ton) absorbed 41%; this may be compared with the "big" 4,000 lb (1.8 ton) HC "Blockbuster" which allocated 75% of its gross weight to its 3,000 lb (1.4 ton) charge, figures verified in the sketch following this paragraph. Like many things in engineering (not just in military matters) the ratio represented a trade-off, the MC design prioritizing penetrative power and structural destruction over blast radius. The novelty of the Tallboy & Grand Slam was that as earthquake bombs, their destructive potential was able to be unleashed not necessarily by achieving a direct hit on a target but by entering the ground nearby, the explosion (1) creating an underground cavity (a camouflet) and (2) transmitting a shock-wave through the target's foundations, leading to the structure collapsing into the newly created lacuna.
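Those charge-to-weight percentages are easily reproduced; a minimal Python sketch of the arithmetic, using the weights (in lb) quoted above and rounding to the nearest percent:

    # Charge-to-weight ratio: the share of a bomb's gross weight which is explosive.
    # Figures as quoted in the text; illustrative arithmetic only.
    bombs = {
        "Tallboy (MC)":     (5_200, 12_000),
        "Grand Slam (MC)":  (9_100, 22_000),
        "Blockbuster (HC)": (3_000, 4_000),
    }
    for name, (charge, gross) in bombs.items():
        print(f"{name}: {charge / gross:.0%} explosive")
    # Tallboy (MC): 43% explosive
    # Grand Slam (MC): 41% explosive
    # Blockbuster (HC): 75% explosive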
The etymology of camouflet has an interesting history in both French and military mining. Originally it meant "a whiff of smoke in the face" (from a fire or pipe) and in figurative use it was a reference to a snub or slight insult (something unpleasant delivered directly to someone); the origin is murky but it may have been related to the earlier French verb camoufler (to disguise; to mask) which evolved also into "camouflage". In the specialized military jargon of siege warfare or mining (sapping), over the seventeenth to nineteenth centuries "camouflet" referred to "an underground explosion that does not break the surface, but collapses enemy tunnels or fortifications by creating a subterranean void or shockwave". The use of this tactic is best remembered from the Western Front in World War I, some of the huge craters now tourist attractions.
Under watchful eyes: Grand Ayatollah Ali Khamenei (b 1939; Supreme Leader, Islamic Republic of Iran since 1989) delivering a speech, sitting in front of the official portrait of the republic's ever-unsmiling founder, Grand Ayatollah Ruhollah Khomeini (1900-1989; Supreme Leader, Islamic Republic of Iran 1979-1989). Ayatollah Khamenei seemed in 1989 an improbable choice as Supreme Leader because others were better credentialed but though cautious and uncharismatic, he has proved a great survivor in a troubled region.
Since aerial bombing began to be used as a strategic weapon, of great interest has been the debate over BDA (battle damage assessment) and this issue emerged almost as soon as the bunker buster attack on Iran was announced, focused on the extent to which the MOPs had damaged the targets, the most protected of which lie deep inside a mountain. BDA is a constantly evolving science and while satellites have made analysis of surface damage highly refined, it's more difficult to understand what has happened deep underground. Indeed, it wasn't until the USSBS (United States Strategic Bombing Survey) teams toured Germany and Japan in 1945-1946, conducting interviews, economic analysis and site surveys, that a useful (and substantially accurate) understanding emerged of the effectiveness of bombing, although what technological advances have since allowed (for those with the resources) is that so-called "panacea targets" (ie critical infrastructure and such, once dismissed by planners because the required precision was for many reasons rarely attainable) can now accurately be targeted, the USAF able to drop a bomb within a few feet of the aiming point. As the phrase is used by the military, the Fordow Uranium Enrichment Plant is a classic "panacea target" but whether even a technically successful strike will achieve the desired political outcome remains to be seen.
Mr Trump, in a moment of exasperation, posted on Truth Social of Iran & Israel: "We basically have two countries that have been fighting so long and so hard that they don't know what the fuck they're doing." Actually, both know exactly WTF they're doing; it's just that Mr Trump (and many others) would prefer they didn't do it.
Donald Trump (b 1946; US president 2017-2021 and since 2025) claimed "total obliteration" of the targets while Grand Ayatollah Khamenei admitted only there had been "some damage"; which is closer to the truth may one day be revealed. Even modelling of the effects has probably been inconclusive because the deeper one goes underground, the greater the number of variables in the natural structure, and the nature of the internal built environment will also influence blast behaviour. All experts seem to agree much damage will have been done but what can't yet be determined is what has been suffered by the facilities which sit as deep as 80 m (260 feet) inside the mountain although, as the name implies, "bunker busters" are designed for buried targets and it's not always required for blast directly to reach the target. Because the shock-wave can travel through earth & rock, the effect is something like that of an earthquake and if the structure sufficiently is affected, it may be the area can be rendered geologically too unstable ever again to be used for its original purpose.
Within minutes of the bombing having been announced, legal academics were being interviewed (though not by Fox News) to explain why the attacks were unlawful under international law and in a sign of the times, the White House didn't bother to discuss fine legal points like the distinction between "preventive & pre-emptive strikes", preferring (like Fox News) to focus on the damage done. However, whatever the murkiness surrounding the BDA, many analysts have concluded that even if before the attacks the Iranian authorities had not approved the creation of a nuclear weapon, this attack will have persuaded them one is essential for "regime survival", thus the interest in both Tel Aviv and (despite denials) Washington DC in "regime change". The consensus seems to be Grand Ayatollah Khamenei had, prior to the strike, not ordered the creation of a nuclear weapon but that all energies were directed towards completing the preliminary steps, thus the enriching of uranium to ten times the level required for use in power generation; the ayatollah liked to keep his options open. So, the fear of some is the attacks, even if they have (by weeks, months or years) delayed the Islamic Republic's work on nuclear development, may prove counter-productive in that they convince the ayatollah to concur with the reasoning of every state which since 1945 has adopted an independent nuclear deterrent (IND). That reasoning was not complex and hasn't changed since first a prehistoric man picked up a stout stick to wave as a pre-lingual message to potential adversaries, warning them there would be consequences for aggression. Although a theocracy, those who command power in the Islamic Republic are part of an opaque political institution and in the struggle which has for some time been conducted in anticipation of the death of the aged (and reportedly ailing) Supreme Leader, the matter of "an Iranian IND" is one of the central dynamics. Many will be following what unfolds in Tehran and the observers will not be only in Tel Aviv and Washington DC because in the region and beyond, few things focus the mind like the thought of ayatollahs with A-Bombs.
Of the word "bust"
The Great Bust: The Depression of the Thirties (1962) by Jack Lang (left), highly qualified porn star Busty Buffy (b 1996), who has never been accused of misleading advertising (centre), and The people's champion, Mr Lang, bust of Jack Lang, painted cast plaster by an unknown artist, circa 1927, National Portrait Gallery, Canberra, Australia. Remembered for a few things, Jack Lang (1876–1975; premier of the Australian state of New South Wales (NSW) 1925-1927 & 1930-1932) remains best known for having in 1932 been the first head of a government in the British Empire to have been sacked by the Crown since William IV (1765–1837; King of the UK 1830-1837) in 1834 dismissed Lord Melbourne (1779–1848; prime minister of the UK 1834 & 1835-1841).
Those learning English must think it at least careless that things can both be (1) "razed to the ground" (totally to destroy something (typically a structure), usually by demolition or incineration) and (2) "raised to the sky" (physically lifted upwards). The etymologies of "raze" and "raise" differ but they're pronounced the same, so it's fortunate the spellings vary, but in other troublesome examples of unrelated meanings, spelling and pronunciation can align, as in "bust". When used in ways most directly related to human anatomy: (1) "a sculptural portrayal of a person's head and shoulders" & (2) "the circumference of a woman's chest around her breasts", there is an etymological link but these uses wholly are unconnected with bust's other senses.
Bust of Lindsay Lohan in white marble by Stable Diffusion. Sculptures of just the neck and head came also to be called "busts", the emphasis on the technique rather than the original definition.
Bust in the sense of "a sculpture of upper torso and head" dates from the 1690s and was from the sixteenth century French buste, from the Italian busto (upper body; torso), from the Latin bustum (funeral monument, tomb (although the original sense was "funeral pyre, place where corpses are burned")) and it may have emerged (as a shortened form) from ambustum, neuter of ambustus (burned around), past participle of amburere (burn around, scorch), the construct being ambi- (around) + urere (to burn). The alternative etymology traces a link to the Old Latin boro, the early form of the Classical Latin uro (to burn) and it's thought the development in Italian was influenced by the Etruscan custom of keeping the ashes of the dead in an urn shaped like the person when alive. Thus the use, common by the 1720s, of bust (a clipping from the French buste) being "a carving of the trunk of the human body from the chest up". From this came the meaning "dimension of the bosom; the measurement around a woman's body at the level of her breasts" and that evolved on the basis of a comparison with the sculptures, the base of which was described as the "bust-line", the term still used in dress-making (and for other comparative purposes as one of the three "vital statistics" by which women are judged (bust, waist, hips), each circumference having an "ideal range"). It's not known when "bust" and "bust-line" came into oral use among dress-makers and related professions but it's documented since the 1880s. Derived forms (sometimes hyphenated) include busty (tending to bustiness, thus Busty Buffy's choice of stage-name), overbust & underbust (technical terms in women's fashion referencing specific measurements) and bustier (a tight-fitting women's top which covers (most or all of) the bust).
The other senses of bust (as a noun, verb & adjective) are diverse (and sometimes diametric opposites) and include: "to break or fail"; "to be caught doing something unlawful / illicit / disgusting etc"; "to debunk"; "dramatically or unexpectedly to succeed"; "to go broke"; "to break in" (horses, girlfriends etc); "to assault"; the downward portion of an economic cycle (ie "boom & bust"); "the act of effecting an arrest" and "someone (especially in professional sport) who failed to perform to expectation". That's quite a range and it has meant the creation of dozens of idiomatic forms, the best known of which include: "boom & bust", "busted flush", "dambuster", "bunker buster", "busted arse country", "drug bust", "cloud bust", belly-busting, bust one's ass (or butt), bust a gut, bust a move, bust a nut, bust-down, bust loose, bust off, bust one's balls, bust-out, sod buster, bust the dust, myth-busting and trend-busting. In the sense of "breaking through", bust was from the Middle English busten, a variant of bursten & bresten (to burst) and may be compared with the Low German basten & barsten (to burst). Bust in the sense of "break", "smash", "fail", "arrest" et al was a creation of mid-nineteenth century US English and is of uncertain inspiration but most etymologists seem to concur it was likely a modification of "burst" effected with a phonetic alteration, though it's not impossible it came directly as an imperfect echoic of Germanic speech. The apparent contradiction of bust meaning both "fail" and "dramatically succeed" happened because the former was an allusion to "being busted" (ie broken) while the latter used the notion of "busting through".
(1) In the army of Ancient Rome, a military formation
which numbered between 3000-6000 soldiers, made up of infantry with supporting
cavalry.
(2) A description applied to some large military and paramilitary
forces.
(3) Any great number of things or (especially) persons; a multitude; very great in number (usually postpositive).
(4) A description applied to some associations of
ex-servicemen (usually initial capital).
(5) In biology, a taxonomic rank; a group of orders
inferior to a class; in scientific classification, a term occasionally used to
express an assemblage of objects intermediate between an order and a class.
1175–1225: From the Middle English legi(o)un, from the Old French legion (squad, band, company, Roman military unit), from the Latin legiōnem & legiōn- (nominative legiō) (picked body of soldiers; a levy of troops), the construct being leg(ere) (to gather, choose, read; pick out, select), from the primitive Indo-European root leg- (to gather; to collect) + -iōn. The suffix -ion was from the Middle English -ioun, from the Old French -ion, from the Latin -iō (genitive iōnis). It was appended to a perfect passive participle to form a noun of action or process, or the result of an action or process. Legion is a noun, adjective & verb and legionry, legionnaire & legionary are nouns; the noun plural is legions.
The generalized sense of "a large number of persons" emerged circa 1300 as a consequence of the use of legion in some translations of the Bible (my name is Legion: for we are many (Mark 5:9; King James Version (KJV, 1611))). It was used to describe various European military formations from the 1590s and had been applied to some associations of ex-servicemen since the American Legion was established in 1919. The French légion d'honneur (Legion of Honor) is an order of distinction founded by Napoleon in 1802; the légion étrangère (French Foreign Legion) was originally a unit of the French army officially made up of foreign volunteers (Polish, Belgian etc) which traditionally served in colonies or on distant expeditions, although French nationals soon appeared in Foreign Legion colours "for a number of reasons". The noun legionnaire, from the French légionnaire, dates from 1818. The most famous modern association is Legionnaires' disease, caused by Legionella pneumophila, named after the lethal outbreak in July 1976 at the American Legion convention in Philadelphia's Bellevue Stratford Hotel, Legionella thus becoming the name of the bacterium. The cause of the outbreak was traced to water used in the building's air-conditioning systems.
The Bellevue Stratford and Legionella pneumophila
The origin of Legionnaires' disease was the bacterium resident in the air-conditioning cooling towers of the Bellevue Stratford Hotel in Philadelphia which in July 1976 was hosting the Bicentennial convention of the American Legion, an association of service veterans; the bacterium was subsequently named Legionella pneumophila.
The Bellevue-Stratford Hotel, 1905.
The Legionella bacterium occurs naturally and there had before been outbreaks of the pneumonia-like condition it causes, most notably in 1968, but what made the 1976 event different was the scale and severity, which attracted investigation and a review of the records, suggesting the first known case in the United States dated from 1957. Like HIV/AIDS, it was only when critical mass was reached that it became identified as something specific and there's little doubt there may have been instances of the disease for decades or even centuries prior to 1957. The increasing incidence of the condition in the late twentieth century is most associated with the growth in deployment of a particular vector of transmission: large, distributed air-conditioning systems. Until the Philadelphia outbreak, the cleaning routines required to maintain these systems weren't well understood and indeed, the 1976 event wasn't even the first time the Bellevue Stratford had been the source: two years earlier it had been the site of a meeting of the Independent Order of Odd Fellows but in that case, fewer than two-dozen were infected and only two fatalities recorded, whereas over two-hundred Legionnaires became ill and thirty-four died. Had the 1976 outbreak claimed only a handful, it's quite likely it too would have passed unnoticed.
Winter Evening, Bellevue-Stratford Hotel, circa 1910 by
Charles Cushing (b 1959).
That the 1976 outbreak was on the scale it was certainly affected the Bellevue-Stratford. Built in the Philadelphia CBD on the corner of Broad and Walnut Streets in 1904, it was enlarged in 1912 and, at the time, was among the most impressive and luxurious hotels in the world. Noted especially for a splendid ballroom and the fine fittings in its thousand-odd guest rooms, it instantly became the city's leading hotel and a centre for the cultural and social interactions of its richer citizens. Its eminence continued until, during the depression of the 1930s, it suffered the fate of many institutions associated with wealth and conspicuous consumption, its elaborate form not appropriate in a more austere age. As business suffered, the lack of revenue meant it was no longer possible to maintain the building and tarnish began to overtake the glittering structure.
The Bellevue Hotel Ballroom.
Although the ostentation of old never quite returned, in the post-war years the Bellevue-Stratford did continue to operate as a profitable hotel until international notoriety was gained in July 1976 with the outbreak of the disease which would afflict over two-hundred and, ultimately, strike down almost three dozen of the conventioneers who had been guests. Once the association with the hotel's air-conditioning became known, bookings plummeted and before the year was out, the Stratford ceased operations, although in a nod to its architectural significance, the now deserted building was in 1977 listed on the US National Register of Historic Places.
The Bellevue Hotel XIX Restaurant.
The lure of past glories was however strong and in 1978-1979, after being sold, a programme described as a restoration rather than a refurbishment was undertaken, reputedly costing a then impressive US$25 million, the press releases at the time emphasizing the attention devoted to the air-conditioning system. The guest rooms were entirely re-created, the re-configuration of the floors reducing their number to under six-hundred, and the public areas were restored to their original appearance. However, for a number of reasons, business never reached the projected volume and in not one year since re-opening did the place prove profitable, the long-delayed but inevitable closure finally happening in March 1986.
The Bellevue Hotel Lobby.
But, either because or in spite of the building being listed as a historic place, it still attracted interest and, after being bought at a knock-down price, another re-configuration was commenced, this time to convert it to the now fashionable multi-function space, a mix of retail, hotel and office space, now with the inevitable fitness centre and food court. Tellingly, the number of hotel rooms was reduced to fewer than two-hundred but even this proved a challenge for operators profitably to run and in 1996, Hyatt took over. Hyatt, although for internal reasons shuffling the property within their divisions and rebranding it to avoid any reference to the now troublesome Stratford name, benefited from the decision by the city administration to re-locate Philadelphia's convention centre from the outskirts to the centre and, like other hotels in the region, enjoyed a notable, and profitable, increase in demand. It's now called simply: The Bellevue Hotel.
The Bellevue Hotel.
Understandably, the Bellevue's page on Hyatt's website, although discussing some aspects of the building's history, such as having enjoyed a visit from every president since Theodore Roosevelt (1858–1919; US president 1901-1909) and the exquisitely intricate lighting system designed by Thomas Edison (1847-1931) himself, neglects even to allude to the two outbreaks of Legionnaires' disease in the 1970s, the sale in 1976 noted on the time-line without comment. In a nice touch, guests may check in with up to two dogs, provided they don't exceed the weight limit of 50 lb (22.7 kg) individually or 75 lb (34 kg) combined. Part of the deal includes a "Dog on Vacation" sign which will be provided at registration; it's to hang on the doorknob so staff know what's inside and there's a dog run at Seger Park, a green space about ½ mile (¾ km) from the hotel. Three days' notice is required if staying with one or two dogs and, if on a leash, they can tour the Bellevue's halls, but they're not allowed on either the ballroom level or the 19th floor where the XIX restaurant is located. A cleaning fee (US$100) is added for stays of up to six nights, with an additional deep-cleaning charge applicable for 7-30 nights.
Lindsay Lohan with some of the legion of paparazzi who, despite technical progress which has disrupted the primacy of their role as content providers in the celebrity ecosystem, remain still significant players in what is a symbiotic process.
Cadillac advertising, 1958.
Cadillac in 1958 knew their buyer profile and their agency's choice of the Bellevue Stratford as a backdrop reflected this. They knew also to whom they were talking, thus the copywriters coming up with: "Not long after a motorist takes delivery of his new Cadillac, he discovers that the car introduces him in a unique manner. Its new beauty and elegance, for instance, speak eloquently of his taste and judgment. Its new Fleetwood luxury indicates his consideration for his passengers. And its association with the world's leading citizens acknowledges his standing in the world of affairs." That's just how things were but as the small-print (bottom left of image) suggests, women did have their place as Cadillac accessories, a number of them in the photograph to look decorative in their "gowns by Nan Duskin". Lithuania-born Nan Duskin Lincoln (1893-1980) in 1926 opened her eponymous fashion store on the corner of 18th & Sansom Streets and enjoyed such success she was soon able to purchase three buildings on Walnut Street which she converted into her flagship and for years it was an internationally-renowned fashion mecca. Ms Duskin was unusual in that despite having never been educated beyond the sixth grade, she was an advocate for fashion being taught at universities and while that may not seem revolutionary in an age when it's probably possible to take a post-graduate degree in basket weaving, it was at the time a novel idea. She worked as a lecturer in design and criticism at Drexel University, the institution later establishing the Nan Duskin Laboratory of Costume Design. It was said of Ms Duskin that when she selected a wedding dress for brides-to-be, almost invariably they were delighted by the suggestion, though she would lament the young ladies were not always so successful in their choice of grooms.
Nan Duskin brochure, 1942; While fashions change, slenderness never goes out of style and these designs are classic examples of “timeless lines”. Presumably, this brochure was printed prior to the imposition of wartime restrictions which resulted in such material being restricted to single color, printed on low-quality newsprint.
Responsible for bringing to Philadelphia the work of some European couture houses sometimes then not seen even in New York, there was nobody more responsible for establishing the city as a leading centre of fashion before her semi-retirement in 1958, when she sold her three stores to the Dietrich Foundation. Unashamedly elitist and catering only to the top-end of the market (rather like Cadillac in the 1950s), her stores operated more like salons than retail outlets and while things for a while continued in that vein after 1958, the world was changing and while the "best labels" continued to be stocked, the uniqueness gradually was dissipated until it was really just another store, little different from the many which had sought to emulate the model. In 1994, Nan Duskin filed for Chapter 11 bankruptcy (which enables troubled companies to continue trading while re-structuring) but the damage was done and in 1995 the businesses were closed. Nothing lasts forever and it's tempting to draw a comparison with the way Cadillac "lost its way" during the 1980s & 1990s.