(1) In chess, a situation in which a player is limited to moves that cost pieces or have a damaging positional effect.
(2) A situation in which, whatever is done, makes things worse (applied variously to sport, politics, battlefield engagements etc).
(3) A situation in which one is forced to act when one would prefer to remain passive and thus a synonym of the German compound noun Zugpflicht (the rule that a player cannot forgo a move).
(4) In game theory, a move which changes the outcome from win to loss.
Circa 1858 (1905 in English): A modern German compound, the construct being Zug + Zwang. Zug (move) was from the Middle High German zuc & zug, from the Old High German zug, from the Proto-Germanic tugiz, an abstract noun belonging to the Proto-Germanic teuhaną, from the primitive Indo-European dewk (to pull, lead); it was cognate with the Dutch teug and the Old English tyge. Zwang (compulsion; force; constraint; obligation) was from the Middle High German twanc, from the Old High German geduang. It belongs to the verb zwingen and cognates include the Dutch dwang and the Swedish tvång. The word is best understood as "compulsion to move" or, in the jargon of chess players: "Your turn to move and whatever you do, it'll make things worse for you", thus the application to game theory, military strategy and politics where there's often a need to determine the "least worst option". Zugzwang is a noun; the noun plural is Zugzwänge. In English, derived forms such as zugzwanged, zugzwanging, zugzwangish, zugzwanger, zugzwangesque and zugzwangee are non-standard and used usually for humorous effect.
Chess and Game Theory
Endgame: Black's turn and Zugzwang! Daily Chess Musings' depiction of the elegance of zugzwang.
The first known use of Zugzwang in the German chess literature appears in 1858; the first appearance in English in 1905. However, the concept of Zugzwang had been known and written about for centuries, the classic work being Italian chess player Alessandro Salvio's (circa 1575–circa 1640) study of endgames, published in 1604, which referenced Shatranj writings from the early ninth century, some thousand years before the first known use of the term. Positions with Zugzwang are not rare in chess endgames, best known in the king-rook & king-pawn conjunctions. Positions of reciprocal Zugzwang are important in the analysis of endgames but although the concept is easily demonstrated and understood, that's true only of the "simple Zugzwang"; the so-called "sequential Zugzwang" will typically be a multi-move affair which demands an understanding of dozens of permutations of possibilities.
Rendered by Vovsoft as a cartoon character: a brunette Lindsay Lohan at the chessboard. In her youth, she was a bit of a zugzwanger.
Zugzwang describes a situation where one player is put at a disadvantage because they have to make a move although they would prefer to pass and make no move. The fact the player must move means their position will be significantly weaker than the hypothetical one in which it is the opponent's turn to move. In game theory, it specifically means that the obligation to move directly changes the outcome of the game from a win to a loss. Chess textbooks often cite as the classic Zugzwang a match in Copenhagen in 1923; on that day the German Grandmaster (the title inaugurated in 1950) Friedrich Sämisch (1896–1975) played White against the Latvian-born Dane Aron Nimzowitsch (1886–1935). Playing Black, Nimzowitsch didn't play a tactical match in the conventional sense but instead applied positional advantage, gradually limiting his opponent's options until, as the endgame was reached, White was left with no move which didn't worsen his position; whatever he chose would lead either to material loss or strategic collapse and it's said that in his notebook, Nimzowitsch concluded his entry on the match with "Zugzwang!" A noted eccentric in a discipline where idiosyncratic behaviour is not unknown, the Polish Grandmaster Savielly Tartakower (1887–1956) observed of Nimzowitsch: "He pretends to be crazy in order to drive us all crazy."
French sculptor Auguste Rodin's (1840-1917) The Thinker (1904), Musée Rodin, Paris (left) and Boris Johnson (b 1964; UK prime-minister 2019-2022) thinking about which would be his least worst option (right).
In its classic form chess is a game between two, played with fixed rules on a board with a known number of pieces (32) and squares (64). Although a count of the possible permutations in a match would yield a very big number, in chess the concept of Zugzwang is simple and understood the same way by those playing black and white; information for both sides is complete and while the concept can find an expression in both combinatorial game theory (CGT) and classical game theory (GT), the paths can be different. CGT and GT (the latter historically a tool of economic modelers and strategists in many fields) are both mathematical studies of games and of behaviour which can be imagined as "game-like" but they differ in focus, assumptions and applications. In CGT the basic model (as in chess) is of a two-player deterministic game in which the moves alternate and luck or chance is not an element. This compares with GT, in which there may be any number of players, moves may be simultaneous, the option exists not to move, information known to players may be incomplete (or asymmetric) and luck & chance exist among many variables (which can include all of Donald Rumsfeld's (1932–2021; US defense secretary 1975-1977 & 2001-2006) helpful categories: known knowns, known unknowns, unknown unknowns & (most intriguingly) unknown knowns). So, while CGT is a good device for deconstructing chess and such because such games are of finite duration and players focus exclusively on "winning" (and, if need be, switching to "avoiding defeat"), GT is a tool which can be applied to maximize advantage or utility in situations where a win/defeat dichotomy is either not sought or becomes impossible. The difference then is that CGT envisages two players seeking to solve a deterministic puzzle on a win/lose basis while GT is there to describe & analyse strategic interactions between & among rational actors, some or all of which may be operating with some degree of uncertainty.
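The CGT view of Zugzwang can be made concrete with a toy solver. The sketch below (an illustration of backward induction, not drawn from any chess engine) evaluates a simple subtraction game: each player must remove one or two stones and taking the last stone wins. Positions whose count is divisible by three are lost for the side to move; were a "pass" legal, the disadvantage would vanish, so the obligation to move is the whole of the problem.

```python
# Toy "zugzwang" demo: a two-player subtraction game solved by backward
# induction (the CGT approach). Each turn a player MUST remove 1 or 2
# stones; taking the last stone wins. There is no "pass" move.
def player_to_move_wins(n, memo={}):
    if n == 0:
        return False  # no stones left: the player to move has already lost
    if n not in memo:
        # A position is winning if any legal move leaves the opponent
        # in a losing position.
        memo[n] = any(not player_to_move_wins(n - k) for k in (1, 2) if k <= n)
    return memo[n]

# Positions where the obligation to move is itself the disadvantage:
losing_positions = [n for n in range(1, 13) if not player_to_move_wins(n)]
print(losing_positions)  # [3, 6, 9, 12]
```

Note that the solver assumes alternating moves, perfect information and no chance element, exactly the restrictions which distinguish the CGT model from classical game theory.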
Serial zugzwanger Barnaby Joyce (b 1967; thrice (between local difficulties) deputy prime minister of Australia 2016-2022), Parliament House, Canberra. More than many, Mr Joyce has had to sit and ponder what might at that moment be his “least worst” option. He has made choices good and bad.
In politics and military conflicts (a spectrum condition according to Prussian general and military theorist Carl von Clausewitz (1780–1831)), a zugzwang often is seen when parties are compelled to take their "least worst" option, even when circumstances dictate it would be better to "do nothing". However, the zugzwang can lie in the eye of the beholder and that's why the unexpected Ardennes Offensive (Wacht am Rhein (Watch on the Rhine) the German code-name, though popularly known in the West as the Battle of the Bulge (December 1944-January 1945)) was ordered by Adolf Hitler (1889-1945; Führer (leader) and German head of government 1933-1945 & head of state 1934-1945). It was the last major German strategic offensive of World War II (1939-1945) and among all but the most sycophantic of Hitler's military advisors it was thought not the "least worst" but rather a "worse than the sensible" option (although not all the generals at the time concurred on what constituted "sensible"). Under the Nazi state's Führerprinzip (leader principle), the concept was that in any institutional structure authority was vested in the designated leader and that meant ultimately Hitler's rule was a personal dictatorship (although the extent of the fragmentation wasn't understood until after the war) so while the generals could warn, counsel & advise, ultimately decisions were based on the Führer's will, thus the Ardennes Offensive.
While the operation made no strategic sense to the conventionally-schooled generals, to Hitler it was compelling because the tide of the war had forced him to pursue the only strategy left: delay what appeared an inevitable defeat in the hope the (real but still suppressed) political tensions between his opponents would sunder their alliance, allowing him to direct his resources against one front rather than three (four if the battle in the skies is considered a distinct theatre, as many historians argue). Like Charles Dickens' (1812–1870) Mr Micawber in David Copperfield (1849-1850), Hitler was hoping "something would turn up". Because of the disparity in military and economic strength between the German and Allied forces, in retrospect the Ardennes Offensive appears nonsensical but, at the time, it was a rational tactic even if the strategy of "delay" was flawed. Confronted as he was by attacks from the west, east and south, continuing to fight a defensive war would lead only to an inevitable defeat; an offensive in the east was impossible because of the strength of the Red Army and even a major battlefield victory in the south would have no strategic significance, so it was only in the west a glimmer of success seemed to beckon.
The bulge.
In the last great example of the professionalism and tactical improvisation which was a hallmark of their operations during the war, the Wehrmacht (the German military) secretly assembled a large armored force (essentially under the eyes of the Allies) and staged a surprise attack through the Ardennes, aided immeasurably by the cover of heavy, low clouds which precluded both Allied reconnaissance and the deployment of their overwhelming strength in air-power. Initially successful, the advance punched several holes in the line, the shape of which, when marked on a map, lent the campaign the name "Battle of the Bulge" but within days the weather cleared, allowing the Allies to unleash almost unopposed their overwhelming superiority in air power. This, combined with their vast military and logistical resources, doomed the Ardennes Offensive, inflicting losses from which the Wehrmacht never recovered: from mid-January on, German forces never regained the initiative, retreating on all fronts until the inevitable defeat in May. A last throw of the dice, the offensive both failed and squandered precious (and often irreplaceable) resources badly needed elsewhere. By December 1944, Hitler had been confronted with a zugzwang (of his own making) and while whatever he did would have made Germany's position worse, at least arguably the Ardennes Offensive was not even his "least worst" option.
(1) A large
bin or receptacle; a fixed chest or box.
(2) In
military use, historically a fortification set mostly below the surface of the
ground with overhead protection provided by logs and earth or by concrete and
fitted with above-ground embrasures through which guns may be fired.
(3) A
fortification set mostly below the surface of the ground and used for a variety
of purposes.
(4) In golf,
an obstacle, classically a sand trap but sometimes a mound of dirt,
constituting a hazard.
(5) In
nautical use, to provide fuel for a vessel.
(6) In
nautical use, to convey bulk cargo (except grain) from a vessel to an adjacent
storehouse.
(7) In
golf, to hit a ball into a bunker.
(8) To
equip with or as if with bunkers.
(9) In
military use, to place personnel or materiel in a bunker or bunkers (sometimes
as “bunker down”).
1755–1760: From the Scottish bonkar (box, chest (also "seat" in the sense of "bench")), of obscure origin but etymologists conclude the use related to furniture hints at a relationship with banker (bench). Alternatively, it may be from a Scandinavian source such as the Old Swedish bunke (boards used to protect the cargo of a ship). The meaning "receptacle for coal aboard a ship" was in use by at least 1839 (coal-burning steamships coming into general use in the 1820s). The use to describe the obstacles on golf courses is documented from 1824 (probably from the extended sense "earthen seat" which dates from 1805) but, perhaps surprisingly, the familiar sense from military use (dug-out fortification) seems not to have appeared before World War I (1914-1918) although the structures so described had existed for millennia. "Bunkermate" was army slang for the individual with whom one shares a bunker while the now obsolete "bunkerman" (plural "bunkermen") referred to someone (often the man in charge) who worked at an industrial coal storage bunker. Bunker & bunkerage are nouns, bunkering is a noun & verb, bunkered is a verb and bunkerish, bunkeresque, bunkerless & bunkerlike are adjectives; the noun plural is bunkers.
Just as ships called "coalers" were used to transport coal to and from shore-based "coal stations", it was "oilers" which took oil to storage tanks or out to sea to refuel ships (a common naval procedure) and these STS (ship-to-ship) transfers were called "bunkering" as the black stuff was pumped, bunker-to-bunker. That the coal used by steamships was stored on-board in compartments called "coal bunkers" led ultimately to another derived term: "bunker oil". When in the late nineteenth century ships began the transition from being fuelled by coal to burning oil, the receptacles of course became "oil bunkers" (among sailors nearly always clipped to "bunker") and, as refining processes evolved, the fuel specifically produced for oceangoing ships came to be called "bunker oil".
Bunker oil is "dirty stuff", a highly viscous, heavy fuel oil which is essentially the residue of crude oil refining; it's that which remains after the more refined and volatile products (gasoline (petrol), kerosene, diesel etc) have been extracted. Until late in the twentieth century, the orthodox view of economists was its use in big ships was a good thing because it was a product for which industry had little other use and, as essentially a by-product, it was relatively cheap. It came in three flavours: (1) Bunker A: light fuel oil (similar to a heavy diesel), (2) Bunker B: an oil of intermediate viscosity used in engines larger than marine diesels but smaller than those used in the big ships and (3) Bunker C: heavy fuel oil used in container ships and such, which use VLD (very large displacement), slow-running engines with a huge reciprocating mass. Because of its composition, Bunker C especially produced much pollution and although much of this happened at sea (unseen by most but with obvious implications), when ships reached harbor to dock, all the smoke and soot became obvious. Over the years, the worst of the pollution from the burning of bunker oil greatly has been reduced (the work underway even before the Greta Thunberg (b 2003) era), sometimes by the simple expedient of spraying a mist of water through the smoke.
Floor-plans
of the upper (Vorbunker) and lower (Führerbunker) levels of the structure
now commonly referred to collectively as the Führerbunker.
History's most infamous bunker remains the Berlin Führerbunker in which Adolf Hitler (1889-1945; Führer (leader) and German head of government 1933-1945 & head of state 1934-1945) spent much of the last few months of his life. In the architectural sense there were a number of Führerbunkers built, one at each of the semi-permanent Führerhauptquartiere (Führer Headquarters) created for the German military campaigns and several others built where required, but it's the one in Berlin which is remembered as "the Führerbunker". Before 1944, when the air raids by the RAF (Royal Air Force) and USAAF (US Army Air Force) intensified, the term Führerbunker seems rarely to have been used other than by the architects and others involved in their construction and it wasn't a designation like Führerhauptquartiere, which the military and other institutions of state shifted between locations (rather as "Air Force One" is attached not to a specific airframe but to whatever aircraft in which the US president is travelling). In subsequent historical writing, the term Führerbunker tends often to be applied to the whole, two-level complex in Berlin and although it was only the lower layer which officially was designated as that, for most purposes the distinction is not significant. In military documents, after January 1945 the Führerbunker was referred to as Führerhauptquartiere.
Führerbunker tourist information board, Berlin, Germany.
Only an information board at the intersection of den Ministergärten and Gertrud-Kolmar-Straße, erected by the German Government in 2006 prior to that year's FIFA (Fédération Internationale de Football Association (International Federation of Association Football)) World Cup, now marks the place on Berlin's Wilhelmstrasse 77 where once the Führerbunker was located. The Soviet occupation forces razed the new Reich Chancellery and demolished all the bunker's above-ground structures but the subsequent GDR (Deutsche Demokratische Republik (German Democratic Republic; the old East Germany) 1949-1990) abandoned attempts completely to destroy what lay beneath. Until after the fall of the Berlin Wall (1961-1989) the site remained unused and neglected, "re-discovered" only during excavations by property developers, the government insisting on the destruction of whatever was uncovered and, sensitive still to the spectre of "Neo-Nazi shrines", for years the bunker's location was never divulged, even as unremarkable buildings (an unfortunate aspect of post-unification Berlin) began to appear on the site. Most of what would have covered the Führerbunker's footprint is now a supermarket car park.
The first part of the complex to be built was the Vorbunker (upper bunker or forward bunker), an underground facility of reinforced concrete intended only as a temporary air-raid shelter for Hitler and his entourage in the old Reich Chancellery. Substantially completed during 1936-1937, it was until 1943 listed in documents as the Luftschutzbunker der Reichskanzlei (Reich Chancellery Air-Raid Shelter), the Vorbunker label applied only in 1944 when the lower level (the Führerbunker proper) was appended. In mid-January 1945, Hitler moved into the Führerbunker and, as the military situation deteriorated, his appearances above ground became less frequent until by late March he rarely saw the sky. Finally, on 30 April, he committed suicide.
Bunker Busters
Northrop Grumman publicity shot of B2-Spirit from below, showing the twin bomb-bay doors through which the GBU-57 are released.
Awful as they are, there's an undeniable beauty in the engineering of some weapons and it's unfortunate humankind never collectively has resolved exclusively to devote such ingenuity to stuff other than us blowing up each other.
The use in June 2025 by the USAF (US Air Force) of fourteen of its Boeing GBU-57 (Guided Bomb Unit-57) Massive Ordnance Penetrator (MOP) bombs against underground targets in Iran (twelve on the Fordow Uranium Enrichment Plant and two on the Natanz nuclear facility) meant "Bunker Buster" hit the headlines. Carried by the Northrop B-2 Spirit heavy bomber (built between 1989-2000), the GBU-57 is a 14,000 kg (30,000 lb) bomb with a casing designed to withstand the stress of penetrating through layers of reinforced concrete or thick rock. "Bunker buster" bombs have been around for a while, the ancestors of today's devices first built for the German military early in World War II (1939-1945) and the principle remains unchanged to this day: up-scaled armor-piercing shells. The initial purpose was to produce a weapon with a casing strong enough to withstand the forces imposed when impacting reinforced concrete structures, the idea simple in that what was needed was a delivery system which could "bust through" whatever protective layers surrounded a target, allowing the explosive charge to do damage where needed rather than wastefully being expended on an outer skin. The German weapons proved effective but inevitably triggered an "arms race" in that as the war progressed, the concrete layers became thicker, walls over 2 metres (6.6 feet) and ceilings of 5 metres (16 feet) being constructed by 1943. Technological development continued and the idea extended to rocket-propelled bombs optimized both for armor-piercing and aerodynamic efficiency, velocity a significant "mass multiplier" which made the weapons still more effective.
USAF test-flight footage of Northrop B2-Spirit dropping two GBU-57 "Bunker Buster" bombs.
Concurrent with this, the British developed the first true "bunker busters", building on the idea of the naval torpedo, one aspect of which was that, in exploding a short distance from its target, it was highly damaging because it was able to take advantage of one of the properties of water (quite strange stuff according to those who study it): it doesn't compress. What that meant was it was often the "shock wave" of the water rather than the blast itself which could breach a hull, the same principle used for the famous "bouncing bombs" of the RAF's "Dambuster" raids (Operation Chastise, 17 May 1943) on German dams. Because of the way water behaved, it wasn't necessary to score the "direct hit" which had been the ideal in the early days of aerial warfare.
RAF Bomber Command archive photograph of an Avro Lancaster (built between 1941-1946) in flight with Grand Slam mounted (left) and a comparison of the Tallboy & Grand Slam (right), illustrating how the latter was in most respects a scaled-up version of the former. To carry the big Grand Slams, 32 "B1 Special" Lancasters were built in 1945 with up-rated Rolls-Royce Merlin V12 engines, the bomb doors removed (the Grand Slam carried externally, its dimensions exceeding internal capacity), the front and mid-upper gun turrets deleted, no radar equipment and a strengthened undercarriage. Such was the concern with weight (especially for take-off) that just about anything non-essential was removed from the B1 Specials, even three of the four fire axes and the crew door ladder. In the US, Boeing went through a similar exercise to produce the run of "Silverplate" B-29 Superfortresses able to carry the first A-bombs used in August 1945.
Best known of the British devices were the so-called "earthquake bombs", the Tallboy (12,000 lb; 5.4 ton) & Grand Slam (22,000 lb; 10 ton) which, despite their impressive bulk, were classified by the War Office as "medium capacity". The terms "Medium Capacity" (MC) & "High Capacity" (HC) referenced not the gross weight or physical dimensions but the ratio of explosive filler to the total weight of the construction (ie how much was explosive compared to the casing and ancillary components). Because both had thick casings to ensure penetration deep into hardened targets (bunkers and other structures encased in rock or reinforced concrete) before exploding, the internal dimensions accordingly were reduced compared with the ratio typical of contemporary ordnance. An HC bomb (a typical "general-purpose" bomb) had a thinner casing and a much higher proportion of explosive (sometimes over 70% of total weight). These were intended for area bombing (known also as "carpet bombing") and caused wide blast damage whereas the Tallboy & Grand Slam were penetrative, with casings optimized for aerodynamic efficiency, their supersonic travel working as a mass-multiplier. The Tallboy's 5,200 lb (2.3 ton) explosive load was some 43% of its gross weight while the Grand Slam's 9,100 lb (4 ton) absorbed 41%; this may be compared with the "big" 4,000 lb (1.8 ton) HC "Blockbuster" which allocated 75% of the gross weight to its 3,000 lb (1.4 ton) charge. Like many things in engineering (not just in military matters) the ratio represented a trade-off, the MC design prioritizing penetrative power and structural destruction over blast radius. The novelty of the Tallboy & Grand Slam was that, as earthquake bombs, their destructive potential was able to be unleashed not necessarily by achieving a direct hit on a target but by entering the ground nearby, the explosion (1) creating an underground cavity (a camouflet) and (2) transmitting a shock-wave through the target's foundations, leading to the structure collapsing into the newly created lacuna.
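Because the MC/HC distinction is just a filler-to-gross-weight ratio, the percentages quoted above can be checked with a few lines of arithmetic (a quick sketch; weights in pounds as given in the text):

```python
# Explosive filler as a share of gross bomb weight: (filler_lb, gross_lb).
bombs = {
    "Tallboy (MC)":     (5200, 12000),
    "Grand Slam (MC)":  (9100, 22000),
    "Blockbuster (HC)": (3000, 4000),
}

ratios = {name: filler / gross for name, (filler, gross) in bombs.items()}
for name, r in ratios.items():
    print(f"{name}: {r:.0%}")
# Tallboy (MC): 43%
# Grand Slam (MC): 41%
# Blockbuster (HC): 75%
```

The numbers bear out the text: the two "medium capacity" earthquake bombs devote well under half their weight to explosive, the balance going into the thick penetrating casing.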
The etymology of camouflet has an interesting history in both French and military mining. Originally it meant "a whiff of smoke in the face" (from a fire or pipe) and in figurative use it was a reference to a snub or slight insult (something unpleasant delivered directly to someone); although the origin is murky, it may be related to the earlier French verb camoufler (to disguise; to mask) which evolved also into "camouflage". In the specialized military jargon of siege warfare or mining (sapping), over the seventeenth to nineteenth centuries "camouflet" referred to "an underground explosion that does not break the surface, but collapses enemy tunnels or fortifications by creating a subterranean void or shockwave". The use of this tactic is best remembered from the Western Front in World War I, some of the huge craters now tourist attractions.
Under watchful eyes: Grand Ayatollah Ali Khamenei (b 1939; Supreme Leader, Islamic Republic of Iran since 1989) delivering a speech, sitting in front of the official portrait of the republic's ever-unsmiling founder, Grand Ayatollah Ruhollah Khomeini (1900-1989; Supreme Leader, Islamic Republic of Iran, 1979-1989). Ayatollah Khamenei seemed in 1989 an improbable choice as Supreme Leader because others were better credentialed but, though cautious and uncharismatic, he has proved a great survivor in a troubled region.
Since aerial bombing began to be used as a strategic weapon, of great interest has been the debate over BDA (battle damage assessment) and this issue emerged almost as soon as the bunker buster attack on Iran was announced, focused on the extent to which the MOPs had damaged the targets, the deepest of which were concealed deep inside a mountain. BDA is a constantly evolving science and while satellites have made analysis of surface damage highly refined, it's more difficult to understand what has happened deep underground. Indeed, it wasn't until the USSBS (United States Strategic Bombing Survey) teams toured Germany and Japan in 1945-1946, conducting interviews, economic analysis and site surveys, that a useful (and substantially accurate) understanding emerged of the effectiveness of bombing. What technological advances have allowed for those with the resources is that the so-called "panacea targets" (ie critical infrastructure and such, once dismissed by planners because the required precision was for many reasons rarely attainable) can now accurately be targeted, the USAF able to drop a bomb within a few feet of the aiming point. As the phrase is used by the military, the Fordow Uranium Enrichment Plant is a classic "panacea target" but whether even a technically successful strike will achieve the desired political outcome remains to be seen.
Mr Trump, in a moment of exasperation, posted on Truth Social of Iran & Israel: "We basically have two countries that have been fighting so long and so hard that they don't know what the fuck they're doing." Actually, both know exactly WTF they're doing; it's just that Mr Trump (and many others) would prefer they didn't do it.
Donald Trump (b 1946; US president 2017-2021 and since 2025) claimed "total obliteration" of the targets while Grand Ayatollah Khamenei admitted only there had been "some damage"; which is closer to the truth may one day be revealed. Even modelling of the effects has probably been inconclusive because the deeper one goes underground, the greater the number of variables in the natural structure, and the nature of the internal built environment will also influence blast behaviour. All experts seem to agree much damage will have been done but what can't yet be determined is what has been suffered by the facilities which sit as deep as 80 m (260 feet) inside the mountain although, as the name implies, "bunker busters" are designed for buried targets and it's not always required for the blast directly to reach the target. Because the shock-wave can travel through earth & rock, the effect is something like that of an earthquake and if the structure sufficiently is affected, it may be the area can be rendered geologically too unstable again to be used for its original purpose.
Within minutes of the bombing having been announced, legal academics were being interviewed (though not by Fox News) to explain why the attacks were unlawful under international law and, in a sign of the times, the White House didn't bother to discuss fine legal points like the distinction between "preventive & pre-emptive strikes", preferring (like Fox News) to focus on the damage done. However, whatever the murkiness surrounding the BDA, many analysts have concluded that even if before the attacks the Iranian authorities had not approved the creation of a nuclear weapon, this attack will have persuaded them one is essential for "regime survival", thus the interest in both Tel Aviv and (despite denials) Washington DC in "regime change". The consensus seems to be Grand Ayatollah Khamenei had, prior to the strike, not ordered the creation of a nuclear weapon but that all energies were directed towards completing the preliminary steps, thus the enriching of uranium to ten times the level required for use in power generation; the ayatollah liked to keep his options open. So, the fear of some is the attacks, even if they have (by weeks, months or years) delayed the Islamic Republic's work on nuclear development, may prove counter-productive in that they convince the ayatollah to concur with the reasoning of every state which since 1945 has adopted an independent nuclear deterrent (IND). That reasoning was not complex and hasn't changed since first a prehistoric man picked up a stout stick to wave as a pre-lingual message to potential adversaries, warning them there would be consequences for aggression. Although a theocracy, those who command power in the Islamic Republic are part of an opaque political institution and in the struggle which has for some time been conducted in anticipation of the death of the aged (and reportedly ailing) Supreme Leader, the matter of "an Iranian IND" is one of the central dynamics. Many will be following what unfolds in Tehran and the observers will not be only in Tel Aviv and Washington DC because in the region and beyond, few things focus the mind like the thought of ayatollahs with A-Bombs.
Of the word "bust"

The Great Bust: The Depression of the Thirties (1962) by Jack Lang (left), highly qualified porn star Busty Buffy (b 1996, who has never been accused of misleading advertising, centre) and The people's champion, Mr Lang, a bust of Jack Lang in painted cast plaster by an unknown artist, circa 1927, National Portrait Gallery, Canberra, Australia (right). Remembered for a few things, Jack Lang (1876–1975; premier of the Australian state of New South Wales (NSW) 1925-1927 & 1930-1932) remains best known for having in 1932 been the first head of government in the British Empire to have been sacked by the Crown since William IV (1765–1837; King of the UK 1830-1837) in 1834 dismissed Lord Melbourne (1779–1848; prime minister of the UK 1834 & 1835-1841).
Those learning English must think it at least careless that things can both be (1) "razed to the ground" (totally to destroy something (typically a structure), usually by demolition or incineration) and (2) "raised to the sky" (physically lifted upwards). The etymologies of "raze" and "raise" differ but they're pronounced the same so it's fortunate the spellings vary; in other troublesome examples of unrelated meanings, spelling and pronunciation can align, as in "bust". When used in ways most directly related to human anatomy: (1) "a sculptural portrayal of a person's head and shoulders" & (2) "the circumference of a woman's chest around her breasts", there is an etymological link but these uses wholly are unconnected with bust's other senses.
Bust of Lindsay Lohan in white marble by Stable Diffusion. Sculptures of just the neck and head came also to be called "busts", the emphasis on the technique rather than the original definition.
Bust in the sense of “a sculpture of upper torso and head” dates from the 1690s and was from the sixteenth century French buste, from the Italian busto (upper body; torso), from the Latin bustum (funeral monument, tomb (although the original sense was “funeral pyre, place where corpses are burned”)) and it may have emerged (as a shortened form) from ambustum, neuter of ambustus (burned around), past participle of amburere (burn around, scorch), the construct being ambi- (around) + urere (to burn). The alternative etymology traces a link to the Old Latin boro, the early form of the Classical Latin uro (to burn), and it’s thought the development in Italian was influenced by the Etruscan custom of keeping the ashes of the dead in an urn shaped like the person when alive. Thus the use, common by the 1720s, of bust (a clipping from the French buste) meaning “a carving of the trunk of the human body from the chest up”. From this came the meaning “dimension of the bosom; the measurement around a woman's body at the level of her breasts” and that evolved on the basis of a comparison with the sculptures, the base of which was described as the “bust-line”, the term still used in dress-making (and for other comparative purposes as one of the three “vital statistics” by which women are judged (bust, waist, hips), each circumference having an “ideal range”). It’s not known when “bust” and “bust-line” came into oral use among dress-makers and related professions but it’s documented since the 1880s. Derived forms (sometimes hyphenated) include busty (tending to bustiness, thus Busty Buffy's choice of stage-name), overbust & underbust (technical terms in women's fashion referencing specific measurements) and bustier (a tight-fitting women's top which covers (most or all of) the bust).
The other senses of bust (as a noun, verb & adjective) are diverse (and sometimes diametric opposites) and include: “to break or fail”; “to be caught doing something unlawful / illicit / disgusting etc”; “to debunk”; “dramatically or unexpectedly to succeed”; “to go broke”; “to break in” (horses, girlfriends etc); “to assault”; the downward portion of an economic cycle (ie “boom & bust”); “the act of effecting an arrest” and “someone (especially in professional sport) who failed to perform to expectation”. That’s quite a range and it has meant the creation of dozens of idiomatic forms, the best known of which include: “boom & bust”, “busted flush”, “dambuster”, “bunker buster”, “busted arse country”, “drug bust”, “cloud bust”, “belly-busting”, “bust one's ass (or butt)”, “bust a gut”, “bust a move”, “bust a nut”, “bust-down”, “bust loose”, “bust off”, “bust one's balls”, “bust-out”, “sod buster”, “bust the dust”, “myth-busting” and “trend-busting”. In the sense of “breaking through”, bust was from the Middle English busten, a variant of bursten & bresten (to burst) and may be compared with the Low German basten & barsten (to burst). Bust in the sense of “break”, “smash”, “fail”, “arrest” etc was a creation of mid-nineteenth century US English and is of uncertain inspiration but most etymologists seem to concur it was likely a modification of “burst” effected with a phonetic alteration, although it’s not impossible it came directly as an imperfect echoic of Germanic speech. The apparent contradiction of bust meaning both “fail” and “dramatically succeed” happened because the former was an allusion to “being busted” (ie broken) while the latter used the notion of “busting through”.
(1) In neo-paganism
and modern witchcraft, a ceremonial bundle of herbs or a perforated object used
to sprinkle water (in spells as “witches water”), usually at the commencement
of a ritual.
(2) In neurology,
as Asperger's syndrome (less commonly Asperger syndrome), an autism-related
developmental disorder characterised by sustained impairment in social
interaction and non-verbal communication and by repetitive behaviour as well as
restricted interests and routines. The condition was named after Austrian pediatrician Hans Asperger (1906–1980).
Pre-1300: The surname Asperger was of German origin and was toponymic (derived from a geographical location or feature). The town of Asperg lies in what is now the district of Ludwigsburg, Baden-Württemberg, in south-west Germany and in German, appending the suffix “-er” can denote being “from a place”; Asperger thus deconstructs as “someone from Asperg” and in modern use would suggest ancestral ties to the town of Asperg or a similar-sounding locality. Etymologically, Asperg may be derived from older Germanic or Latin roots, possibly meaning “rough hill” or “stony mountain” (the Latin asper meaning “rough” and the German berg meaning “mountain or hill”). The term “Asperger’s syndrome” was in 1976 coined by English psychiatrist Lorna Wing (1928–2014), acknowledging the work of Austrian pediatrician Hans Asperger (1906–1980). Dr Wing was instrumental in the creation of the National Autistic Society, a charity which has operated since 1962. Asperger is a noun (capitalized if in any context used as a proper noun). Aspergerian & Aspergic are nouns; the noun plural forms being Aspergers, Aspergerians & Aspergics. In the literature, Aspergerian & Aspergic ((1) of, related to, or having qualities similar to those of Asperger's syndrome (adjective) & (2) someone with Asperger's syndrome (noun)) both appear to have been used. In general use “Asperger's” was the accepted ellipsis of Asperger's syndrome while the derogatory slang forms included Aspie, autie, aspie, sperg, sperglord & assburger, now all regarded as offensive in the same way “retard” is now proscribed.
The noun asperges described a sprinkling ritual of the Catholic Church; the name was applied also to an antiphon intoned or sung during the ceremony. It was from the Late Latin asperges, noun use of the second-person singular future indicative of aspergere (to scatter, strew upon, sprinkle), the construct being ad (to, towards, at) + spargere (to sprinkle). The use in Church Latin was a learned borrowing from the Latin aspergō (to scatter or strew something or someone; to splash over; to spot, stain, sully, asperse; besmirch; (figuratively) to bestow, bequeath something to, set apart for), the construct being ad- + spargō (strew, scatter; sprinkle; moisten). The origin lay in the phrase Asperges me, Domine, hyssopo et mundabor (Thou shalt sprinkle me, O Lord, with hyssop, and I shall be cleansed), from the 51st Psalm (in the Vulgate), sung during the rite of sprinkling a congregation with holy water. Hyssop (any of a number of aromatic bushy herbs) was from the Latin hȳsōpum, from the Ancient Greek ὕσσωπος (hússōpos), of Semitic origin, and the idea was one would be cleansed of one’s sins. In the Old English, the loan-translation of the Latin aspergere was onstregdan.
The three most recent popes demonstrate their aspergillum (also spelled aspergill) technique while performing the sprinkling rite. In the more elaborate rituals, it's often used in conjunction with a container called an aspersorium (holy water bucket). Benedict XVI (1927–2022; pope 2005-2013, pope emeritus 2013-2022, left), Francis (1936-2025; pope 2013-2025, centre) and Leo XIV (b 1955; pope since 2025, right).
In the Christian liturgy, an aspergillum was used to sprinkle holy water and the borrowing, adaptation and re-purposing of ceremonies, feast days and such from paganism was widely practiced by the early Church. In the Bible (notably chapter 14 in the Old Testament’s Book of Leviticus) there are descriptions of purification rituals involving the use of cedar wood, hyssop, and scarlet wool to create an instrument for sprinkling blood or water, and historians sometimes cite this as a “proto-aspergillum”. While it seems the earliest known use in English of “aspergillum” dates from 1649, the documentary evidence is clear the practice in the Christian liturgy was ancient and had been common since at least the tenth century. Exactly when the ritualistic practice began isn’t known but because water is so obviously something used “to cleanse”, it’s likely it was a part of religious rituals for millennia before Christianity.
The use of the “asperger” in neo-paganism & witchcraft was a continuation of the concept and is well documented in the remarkably prolific literature (some book shops have dedicated sections) devoted to modern witchcraft; the construction of the objects (a bundle of fresh herbs or a perforated object for sprinkling water) is a lineal descendent of the aspergillum of the Medieval church and that makes sense, both institutions being devoted to the process of cleansing although the targets may have differed. According to Ancient Pathways Witchcraft (which sounds like an authoritative source), although it’s the fluid which does the cleansing, the asperger is significant because it symbolizes “the transformative and cleansing properties of water…”, rinsing away “…spiritual debris that might interfere with the sanctity of rituals.” In both neo-paganism and witchcraft, the herbs used may vary and while, pragmatically, sometimes this was dictated by seasonal or geographical availability, priests and witches would also choose the composition based on some “unique essences” being better suited to “enhance the sacred water's effectiveness”. Nor were herbs always used, for, as in the rituals of the church, “an asperger might be a metal or wooden rod designed with perforations or an attached mesh”, something like a “small brush or a dedicated holy water sprinkler akin to those seen in Christian liturgy.” Again, it was the sprinkling of the water which was the critical element in the process, the devices really being delivery systems which, regardless of form, existed to transform simple water into “a divine medium of purity and transformation.” That said, their history of use did vest them with tradition, especially when certain herbs were central to a spell.
Dr Hans
Asperger at work, Children's Clinic, University of Vienna, circa 1935.
The term “Asperger’s syndrome” first appeared in a paper by English psychiatrist Lorna Wing (1928–2014) although use seems not to have entered the medical mainstream until 1981. Dr Wing (who in 1962 was one of the founders of the charitable organization the National Autistic Society) named it after Austrian pediatrician Hans Asperger (1906–1980) who first described the condition in 1944, calling it autistischen Psychopathen (autistic psychopathy). The German autistischen was an inflection of autistisch (autistic), the construct being Autist (autistic person) + -isch (an adjectival suffix).
The English word autism was from the German Autismus, used in 1913 by Swiss psychiatrist and eugenicist Eugen Bleuler (1857-1939), the first known instance dating from 1907 and attributed by Swiss psychiatrist & psychotherapist Carl Jung (1875-1961) as an alternative to his earlier “auto-erotism”, although in his book Dementia Praecox, oder Gruppe der Schizophrenien (Precocious Dementia, or Group of Schizophrenias, 1911) Bleuler differentiated the terms. The construct of the word was the Ancient Greek αὐτός (autos) (self) + -ισμός (-ismós) (a suffix used to form abstract nouns of action, state or condition, equivalent to “-ism”). Being a time of rapid advances in the relatively new discipline of psychiatry, it was a time also of linguistic innovation, Dr Bleuler in a Berlin lecture in 1908 using the term “schizophrenia”, something he’d been using in Switzerland for a year to replace “dementia praecox”, coined by German psychiatrist Emil Kraepelin (1856-1926). What Dr Bleuler in 1913 meant by “autistic” was very different from the modern understanding in that to him it was a symptom of schizophrenia, not an identifiably separate condition. In the UK, the profession picked this up and it was used to describe “a tendency to turn inward and become absorbed in one's own mental and emotional life, often at the expense of connection to the external world”, while “autistic thinking” referred to thinking patterns which were “self-absorbed, fantasy-driven, and detached from reality”, commonly seen in those suffering schizophrenia.
Looking Up was the monthly newsletter of the International Autism Association and in Volume 4, Number 4 (2006), it was reported Lindsay Lohan’s car had blocked the drop-off point for Smashbox Cares, a charity devoted to teaching surfing to autistic youngsters. Arriving at the designated spot at Malibu’s Carbon Beach, the volunteers were delayed in their attempt to disembark their charges, something of significance because routine and predictability are important to autistic people. To make up for it, Ms Lohan staged an impromptu three hour beach party for the children, appearing as a bikini-clad DJ. Apparently, it was enjoyed by all.
The modern sense of “autistic” began to emerge in the 1940s, among the first to contribute being the Austrian-American psychiatrist Leo Kanner (1894–1981) who in 1943 published a paper using the phrase “early infantile autism” to describe a distinct syndrome (which now would be understood as autism spectrum disorder). The following year, in Vienna, Dr Asperger wrote (seemingly influenced by earlier work in Russia) of his observational studies of children, listing the behaviors he associated with the disorder and, unlike some working in the field during the 1940s, Dr Asperger wasn’t wholly pessimistic about his young patients, writing in Autistic Psychopathy in Childhood (1944): “The example of autism shows particularly well how even abnormal personalities can be capable of development and adjustment. Possibilities of social integration which one would never have dreamt of may arise in the course of development.” Many of the documents associated with Dr Asperger’s work were lost (or possibly taken to the Soviet Union) in the chaotic last weeks of World War II (1939-1945) and it wasn’t until Dr Wing in the 1970s reviewed some material from the archives that his contributions began to be appreciated, although not until 1992 did “Asperger’s Syndrome” become a standard diagnosis.
DSM IV (1994). Not all in the profession approved of the reclassification of Asperger’s syndrome under the broader Autism Spectrum Disorder, believing it reduced the depth of diagnostic evaluation, flattened complexity and was disconnected from clinical reality. There was also regret about structural changes, DSM-5 eliminating the multiaxial system (Axes I–V), which some clinicians found useful for organizing information about the patient, especially Axis II (personality disorders) and Axis V (Global Assessment of Functioning).
Asperger’s Syndrome
first appeared in the American Psychiatric Association's (APA) classification
system when it was added to the fourth edition of the Diagnostic and
Statistical Manual of Mental Disorders (DSM-IV, 1994) and the utility for clinicians
was it created a sub-group of patients with autism but without a learning
disability (ie characterized by deficits in social interaction and restricted
interests, in the absence of significant language delay or cognitive impairment),
something with obvious implications for treatment. In the DSM-5 (2013), Autism Spectrum Disorder
(ASD) was re-defined as a broader category which combined Asperger syndrome,
Autistic Disorder & PDD-NOS (Pervasive Developmental Disorder Not Otherwise
Specified) into a single ASD diagnosis, the editors explaining the change as a
reflection of an enhanced understanding of the condition, the emphasis now on
it being something with varying degrees of severity and presentation rather
than distinct types.
However, although after 2013 the term no longer appeared in the DSM, it has remained in popular use, the British military historian Sir Antony Beevor (b 1946) in Ardennes 1944 (2015, an account of the so-called "Battle of the Bulge") speculating of Field Marshal Bernard Montgomery (First Viscount Montgomery of Alamein, 1887–1976) that "one might almost wonder whether [he] suffered from what today would be called high-functioning Asperger syndrome." The eleventh release of the World Health Organization’s (WHO) International Classification of Diseases (ICD-11) aligned with the DSM-5 and regards what once would have been diagnosed as Asperger’s Syndrome as a relatively mild manifestation of ASD. The diagnostic criteria for ASD focus on deficits in social communication and interaction, as well as repetitive behaviors and interests. Although no longer current, the DSM-IV’s criteria for Asperger's Disorder remain of interest because, while the label is no longer used, clinicians still need to distinguish those on the spectrum suffering some degree of learning disability from those not so affected:
DSM-IV diagnostic
criteria for Asperger’s Disorder (299.80).
A.
Qualitative impairment in social interaction, as manifested by at least two of
the following:
(1) marked
impairments in the use of multiple nonverbal behaviors such as eye-to-eye gaze,
facial expression, body postures, and gestures to regulate social interaction.
(2) failure
to develop peer relationships appropriate to developmental level.
(3) a lack
of spontaneous seeking to share enjoyment, interests, or achievements with
other people (eg by a lack of showing, bringing, or pointing out objects of
interest to other people).
(4) lack of
social or emotional reciprocity.
B.
Restricted repetitive and stereotyped patterns of behavior, interests, and
activities, as manifested by at least one of the following:
(1) encompassing
preoccupation with one or more stereotyped and restricted patterns of interest
that is abnormal either in intensity or focus.
(2) apparently
inflexible adherence to specific, non-functional routines or rituals.
(3) stereotyped
and repetitive motor mannerisms (eg hand or finger flapping or twisting, or
complex whole-body movements).
(4) persistent
preoccupation with parts of objects.
C. The
disturbance causes clinically significant impairment in social, occupational,
or other important areas of functioning.
D. There is
no clinically significant general delay in language (eg single words used by
age 2 years, communicative phrases used by age 3 years).
E. There is
no clinically significant delay in cognitive development or in the development
of age-appropriate self-help skills, adaptive behavior (other than social
interaction), and curiosity about the environment in childhood.
F. Criteria
are not met for another specific Pervasive Developmental Disorder or
Schizophrenia.
The term in the twenty-first century became controversial after revelations of some of Dr Asperger's activities during the Third Reich (Austria having been annexed by Germany in 1938), which included his clinic in Vienna sending selected children to be victims of Aktion T4 (a mass-murder programme of involuntary euthanasia targeting those with disabilities), an operation which ran at times in parallel with the programmes designed to exterminate the Jews, Gypsies, homosexuals and others. While there is no surviving documentary
evidence directly linking Dr Asperger to the selection process which decided
which children were to be killed, researchers have concluded the records
suggest his construction of what came later to be called “Asperger’s syndrome” was
actually that very process with an academic gloss. Because those Dr Asperger so categorized were the autistic children without learning difficulties, they were deemed capable of being “cured” and thus spared from the T4’s lists, unlike the “uneducable” who would never be able to be made into useful German citizens. While the surviving material makes clear Dr
Asperger was at least a “fellow traveller” with the Nazi regime, in
professional, artistic and academic circles there was nothing unusual or even
necessarily sinister about that because in a totalitarian state, people have
few other choices if they wish to avoid unpleasantness. However, it does appear Dr Asperger may have
been unusually co-operative with the regime and his pre-1945 publication record
suggests sympathy with at least some aspects of the Nazis’ racial theories and eugenics.