
Saturday, January 3, 2026

Defiant

Defiant (pronounced dih-fahy-uhnt)

Characterized by defiance or a willingness to defy; boldly resistant or challenging.

1830s: From the French défiant, from the Old French, present participle of the verb défier (to challenge, defy, provoke), the construct thus def(y) + “i” + -ant.  Defy dates from the mid thirteenth century and was from the Middle English defien, from the Old French desfier, from the Vulgar Latin disfidare (renounce one's faith), the construct being dis- (away) + fidus (faithful).  The construct in French was thus des- (in the sense of negation) + fier (to trust), from the Vulgar Latin fīdāre, from the Classical Latin fīdere (to trust, whence fidelity).  In the fourteenth century, the meaning shifted from “be disloyal” to “challenge”.  The suffix –ant was from the Middle English –ant & -aunt, partly from the Old French -ant, from Latin -āns; and partly (in adjectival derivations) a continuation of the use of the Middle English -ant, a variant of -and, -end, from the Old English -ende (the present participle ending).  Extensively used in the sciences (especially medicine and pathology), the suffix formed agent nouns derived from verbs.  It was used to create adjectives (1) corresponding to a noun in -ance, having the sense of "exhibiting (the condition or process described by the noun)" and (2) derived from a verb, having the senses of: (2a) "doing (the verbal action)", and/or (2b) "prone/tending to do (the verbal action)".  In English, many of the words to which –ant was appended were not coined in English but borrowed from the Old French, Middle French or Modern French.  The negative adjectival forms are non-defiant & undefiant although there is a kind of middle ground described by quasi-defiant, semi-defiant & half-defiant, the latter three sometimes used in military conflicts where, for whatever reason, it’s been necessary (or at least desirable) for a force to offer a “token resistance” prior to an inevitable defeat.  The adjective over-defiant refers to a resistance or recalcitrance, the extent or duration of which is not justified by the circumstances; in such cases the comparative is “more defiant” and the superlative “most defiant”.  Defiant is a noun & adjective, defiantness is a noun and defiantly is an adverb; the noun plural is defiants.

Defiance in politics: use with caution

The commonly used synonyms include rebellious, direful, truculent, insolent, recalcitrant, refractory, contumacious & insubordinate but in diplomacy, such words must be chosen with care because what in one context may be a compliment, in another may be a slight.  This was in 1993 discovered by Paul Keating (b 1944; Prime Minister of Australia 1991-1996) who labelled Dr Mahathir bin Mohamad (b 1925; prime minister of Malaysia 1981-2003 & 2018-2020) “recalcitrant” when the latter declined to attend a summit meeting of the Asia-Pacific Economic Cooperation (APEC).  For historic reasons, Dr Mahathir was sensitive to the memories of the imperialist oppressors telling colonized people what to do and interpreted Mr Keating’s phrase as a suggestion he should be more obedient (“obedient” being the most commonly used antonym of “defiant”, “submissive” another).  Things could quickly have been resolved (Dr Mahathir being of the “forgive but not forget” school of IR (international relations)) but, unfortunately, Mr Keating was brought up in the gut-wrenching “never apologize” tradition of the right-wing of the New South Wales (NSW) Labor Party so what could have been handled as a clumsy linguistic gaffe was allowed to drag on.

Circa 1933 Chinese propaganda poster featuring a portrait of Generalissimo Chiang Kai-shek (Chiang Chung-cheng).  Set in an oval frame below flags alongside stylized Chinese lettering, the generalissimo is depicted wearing his ceremonial full-dress uniform with decorations.

The admission an opponent is being “defiant” must also sometimes be left unsaid.  Ever since Generalissimo Chiang Kai-shek (1887-1975; leader of the Republic of China (mainland) 1928-1949 & the renegade province of Taiwan 1949-1975) in 1949 fled mainland China, settling on and assuming control of the island of Taiwan, the status of the place has been contested, most dramatically in the incidents which flare up occasionally in the straits between the island and the mainland, remembered as the First (1954–1955), Second (1958) and Third (1995-1996) Taiwan Strait Crises which, although sometimes in retrospect treated as sabre rattling or what Hun Sen (b 1952; prime minister of Cambodia (in one form or another) 1985-2023) might have called “the boys letting off steam”, were at the time serious incidents, each with the potential to escalate into something worse.  Strategically, the first two crises were interesting studies in Cold War politics, the two sides at one stage exchanging information about when and where their shelling would be aimed, permitting troops to be withdrawn from the relevant areas on the day.  The better to facilitate administrative arrangements, each side’s shelling took place on alternate days, satisfying honor on both sides.  The other landmark was the matter of China’s seat at the United Nations (UN), held by the Republic of China (ROC) (Taiwan) between 1945-1971 and by the People’s Republic of China (PRC) (the mainland) since.

Jiefang Taiwan, xiaomie Jiangzei canyu (Liberate Taiwan, and wipe out the remnants of the bandit Chiang) by Yang Keyang (楊可楊) and Zhao Yannian (趙延年). 

A 1954 PRC propaganda poster printed as part of the anti-Taiwan campaign during the first Taiwan Strait Crisis (1954-1955), Generalissimo Chiang Kai-shek depicted as a scarecrow erected on Taiwan by the US government and military. Note the color of the generalissimo’s cracked and disfigured head (tied to a pole) and the similarity to the color of the American also shown.  The artists have included some of the accoutrements often associated with Chiang’s uniforms: white gloves, boots and a ceremonial sword.  The relationship between Chiang and the leaders of the PRC who defeated his army, Chairman Mao (Mao Zedong, 1893–1976; paramount leader of the PRC 1949-1976) and Zhou Enlai (1898–1976; PRC premier 1949-1976), was interesting.  Even after decades of defiance in his renegade province, Mao and Zhou still referred to him, apparently genuinely, as “our friend”, an expression which surprised both Richard Nixon (1913-1994; US president 1969-1974) and Henry Kissinger (b 1923; US national security advisor 1969-1973 & secretary of state 1973-1977) who met the chairman and premier during their historic mission to Peking in 1972.

A toast: Comrade Chairman Mao Zedong (left) and  Generalissimo Chiang Kai-shek (right), celebrating the Japanese surrender, Chongqing, China, September 1945.  After this visit, they would never meet again.

Most people, apparently even within the PRC, casually refer to the place as “Taiwan” but state and non-governmental entities, anxious not to upset Beijing, use a variety of terms including “Chinese Taipei” (the International Olympic Committee (IOC) and the Fédération Internationale de Football Association (FIFA, the International Federation of Association Football) & its continental confederations (AFC, CAF, CONCACAF, CONMEBOL, OFC and UEFA)), “Taiwan District” (the World Bank) and “Taiwan Province of China” (the International Monetary Fund (IMF)).  Taiwan’s government uses an almost declarative “Republic of China”, the name adopted for China after the fall of the Qing dynasty and used between 1912-1949, and even “Chinese Taipei” isn’t without controversy, “Taipei” being the Taiwanese spelling whereas Beijing prefers “Taibei,” the spelling used in the mainland’s Pinyin system.  There have been variations on those themes and there’s also the mysterious “Formosa”, use of which persisted in the English-speaking world well into the twentieth century, despite the Republic of Formosa existing on the island of Taiwan for only a few months in 1895.  The origin of the name Formosa lies in the island in 1542 being named Ilha Formosa (beautiful island) by Portuguese sailors who had noticed it didn’t appear on their charts.  From there, most admiralties in Europe and the English-speaking world updated their charts, use of Formosa not fading until the 1970s.

All that history is well-known, if sometimes subject to differing interpretations, but some mystery surrounds the term “renegade province”, used in recent years with such frequency that a general perception seems to have formed that it’s Beijing’s official (or at least preferred) description of the recalcitrant island.  It certainly is not, yet in both the popular press and specialist journals, the phrase “renegade province” is habitually used to describe Beijing’s views of Taiwan.  Given that Beijing actually calls Taiwan the “Taiwan Province” (sometimes styled as “Taiwan District” but there seems no substantive difference in meaning) and has explicitly maintained it reserves the right to reclaim the territory (by military invasion if need be), it’s certainly not unreasonable to assume that does reflect the politburo's view but within the PRC, “renegade province” is so rare (in Chinese or English) as to be effectively non-existent, the reason said to be that rather than a renegade, the island is thought of as a province pretending to be independent; delusional rather than defiant.  Researchers have looked into when the phrase “renegade province” was first used in English to describe Taiwan.  There may be older or more obscure material which isn’t indexed or hasn’t been digitized but of that which can be searched, the first reference appears to be in a US literary journal from 1973 (which, it later transpired, received secret funding from the US Central Intelligence Agency (CIA)).  It took a while to catch on but, after appearing in the New York Times in 1982, it became a favorite during the administration of Ronald Reagan (1911-2004; US president 1981-1989) and has been part of the standard language of commentary since.  Diplomats, aware of Beijing's views on the matter, tend to avoid the phrase, maintaining the “delusional rather than defiant” line.

Picture of defiance: Official State Portrait of Vladimir Putin (2002), oil on canvas by Igor Babailov (b 1965).

The idea of a territory being a “renegade province” can be of great political, psychological (and ultimately military) significance.  The core justification used by Mr Putin (Vladimir Vladimirovich Putin; b 1952; president or prime minister of Russia since 1999) when explaining why his “special military operation” against Ukraine in 2022 was not an “invasion” or “war of aggression” (he probably concedes it may be a “state of armed conflict”) was his denial that Ukraine is a sovereign, independent state and his claim that Volodymyr Zelenskyy (b 1978, president of Ukraine since 2019) is not a legitimate president.  In other words, Ukraine is merely a region of the modern Russia in something of the way it was once one of the 15 constituent SSRs (Soviet Socialist Republics) of the Soviet Union.  Although the Kremlin doesn’t use the phrase, in Mr Putin’s world view, Ukraine is a renegade province and he likely believes that applies also to the Baltic States (Latvia, Lithuania & Estonia) and possibly other former SSRs.  Like many, the CCP is watching events in Ukraine with great interest and, as recent “exercises” seem to suggest the People’s Liberation Army (PLA) has sufficiently honed its techniques to execute either a blockade (which would be an “act of war”) or a “quarantine” (which would not), the attention of Western analysts is now focused on the hardly secret training being undertaken to perfect what’s needed for the triphibious operations demanded by a full-scale invasion.  The US think-tanks which think much about this possibility have suggested “some time” in 2027 as the likely point at which the military high command would assure the CCP’s central committee such a thing is possible.  What happens will then depend upon (1) the state of things in the PRC and (2) the CCP’s assessment of how the long-term “strategic ambiguity” of Washington would manifest were an attempt made to finish the “unfinished business” of 1949.

Lindsay Lohan, who has lived a life of defiance.

The objectification of women’s body parts has of course been a theme in Western culture since at least Antiquity but rarely can as much attention have been devoted to a single fingernail as the one photographed on Lindsay Lohan’s hand in July 2010 (during her “troubled starlet” phase).  The text printed on the fingernail was sufficiently explicit not to need an academic deconstruction of its alleged meaning, given the image was taken while she was sitting in court listening to a judge sentence her for one of her many transgressions; the consensus was the text was there to send a “defiant message”, the internet’s collective conclusion (which wasn’t restricted to entertainment and celebrity sites) presumably reinforced by the nail being on the middle finger.  Ms Lohan admitted to finding this perplexing, tweeting on X (then known as Twitter) it was merely a manicure and had “…nothing to do w/court, it's an airbrush design from a stencil.”  So, rather than digital defiance, it was fashion.  Attributing a motif of defiance to Ms Lohan wasn’t unusual during her “troubled starlet” phase, one site assessing a chronological montage of her famous mug shots before concluding that with each successive shot, “Lindsay's face becomes more defiant — a young woman hardening herself against a world that had turned her into a punch-line”.

The Boulton Paul Defiant (1939-1943)

The Parthian shot was a military tactic, used by mounted cavalry and made famous by the Parthians, an ancient people of the Persian lands (modern-day Iran, since 1979 the Islamic Republic).  While in real or feigned retreat on horseback, the Parthian archers would, at full gallop, turn their bodies backward to shoot at the pursuing enemy.  This demanded both fine equestrian skills (a soldier’s hands occupied by his bow & arrows) and great confidence in one's mount, something gained only by time spent between man & beast.  To make the achievement more admirable still, the Parthians used neither stirrups nor spurs, relying solely on pressure from their legs to guide and control their galloping mounts and, with varying degrees of success, the tactic was adopted by many mounted military formations of the era including the Scythians, Huns, Turks, Magyars, and Mongols.  The Parthian Empire existed between 247 BC and 224 AD.  The Royal Air Force (RAF) tried a variation of the Parthian shot with the Boulton Paul Defiant, a single-engined fighter and Battle of Britain contemporary of the better remembered Spitfire and Hurricane.  Uniquely, the Defiant had no forward-firing armament, all its firepower being concentrated in four .303 machine guns in a turret behind the pilot.  The theory behind the design dates from the 1930s when the latest multi-engined monoplane bombers were much faster than the single-engined biplane fighters then in service.  The RAF considered its new generation of heavily-armed bombers would be able to penetrate enemy airspace and defend themselves without a fighter escort and this of course implied enemy bombers would similarly be able to penetrate British airspace with some degree of impunity.

Boulton Paul Defiant.

By 1935, the concept of a turret-armed fighter had emerged.  The RAF anticipated having to defend the British Isles against massed formations of unescorted enemy bombers and, in theory, turret-armed fighters would be able to approach formations from below or from the side and coordinate their fire.  In design terms, it was a return to what often was done early in World War I, though that had been technologically deterministic, it being then quite an engineering challenge to produce reliable and safe (in the sense of not damaging the craft's own propeller) forward-firing guns.  Deployed not as intended, but as a fighter used against escorted bombers, the Defiant enjoyed considerable early success, essentially because at attack-range it appeared to be a Hurricane and the German fighter pilots were of course tempted to attack from above and behind, the classic hunter's tactic.  They were of course met by the Defiant's formidable battery.  However, the Luftwaffe learned quickly, unlike the RAF which for too long persisted with its pre-war formations which were neat and precise but also excellent targets.  Soon the vulnerability of the Defiant resulted in losses so heavy its deployment was unsustainable and it was withdrawn from front-line combat.  It did though subsequently prove a useful stop-gap as a night-fighter, providing the RAF with an effective means of combating night bombing until aircraft designed for the purpose entered service.

The Trump class "battleships"

In a surprise announcement, the Pentagon confirmed the impending construction of a “new battleship class”, the first of the line (USS Defiant) to be the US Navy’s “largest surface combatant built since World War II [1939-1945]”.  The initial plans call for a pair to be launched with a long-term goal of as many as two dozen, construction to begin in 2030.  Intriguingly, Donald Trump (b 1946; US president 2017-2021 and since 2025) revealed that while the Department of Defense’s (it’s also now the Department of War) naval architects would “lead the design”, he personally would be involved “…because I’m a very aesthetic person.”  That may sound a strange imperative when designing something as starkly functional as a warship but in navies everywhere there’s a long tradition of “the beautiful ship” and the design language still in use, although much modified, is recognizably what it was more than a century earlier.  The Secretary of the Navy certainly stayed on-message, announcing the USS Defiant would be “…the largest, deadliest and most versatile and best-looking warship anywhere on the world’s oceans”, adding that components for the project would “be made in every state.”  It won't however be the widest because a quirk of ship design in the US Navy is that warships tend to be limited to a beam (width) of around 33 metres (108 feet), the limit for vessels able to pass through the Panama Canal.

Depiction of Trump class USS Defiant issued by the US Navy, December, 2025.

By comparison with the existing surface fleet, the 35,000-ton Defiant will be impressively large although, by historic standards, the largest (non-carrier) surface combatants now in service are of modest dimensions and displacement.  The largest now afloat are the 15,000-ton Zumwalt class destroyers (which really seem to be cruisers) while the 10,000-ton Ticonderoga class cruisers (which really are destroyers) are more numerous.  So, even the Defiant will seem small compared with the twentieth century Dreadnoughts (which became a generic term for “biggest battleship”), the US Iowa class displacing 60,000 tons at their heaviest while the Japanese Yamato class weighed in at 72,000.  Even those behemoths would have been dwarfed by the most ambitious of the H-Class ships in Plan Z which were on German drawing boards early in World War II.  Before reality bit hard, Adolf Hitler (1889-1945; Führer (leader) and German head of government 1933-1945 & head of state 1934-1945) left physics to the engineers and wasn't too bothered by economics.  After being disappointed by proposals that the successors to the Bismarck-class ships would have their main armament increased only from eight 15-inch (380 mm) to eight 16-inch cannons, he ordered OKM (Oberkommando der Marine; the Naval High Command) to design bigger ships.  That directive emerged as the ambitious Plan Z which would have demanded so much steel that essentially nothing else in the Reich could have been built.  Although not one vessel in Plan Z ever left the slipway (the facilities even to lay down the keels non-existent), such a fleet would have been impressive, the largest (the H-44) fitted with eight 20-inch (508 mm) cannons.  Even more to the Führer’s liking was the concept of the H-45, equipped with eight 31.5-inch (800 mm) Gustav siege guns.  However, although he never lost faith in the key to success on the battlefield being bigger and bigger tanks, the experience of surface warfare at sea convinced Hitler the days of the big ships were over and he would even try to persuade the navy to retire all its capital ships and devote more resources to the submarines which, as late as 1945, he hoped might still prolong the war.  Had he imposed such priorities in 1937-1938 so the Kriegsmarine (German Navy) could have entered World War II with the ability permanently to have 100 submarines engaged in high-seas raiding rather than barely a dozen, the early course of the war might radically have been different.  Germany indeed entered the war without a single aircraft carrier (the only one laid down never completed), such was the confidence that the need to confront the Royal Navy either would never happen or was years away.

The US Navy in 1940 began construction of six Iowa class battleships but only four were ever completed because it had become clear the age of the aircraft carrier and submarine had arrived; the last battleship launched was the Royal Navy’s HMS Vanguard which entered service in 1946.  Although the admirals remained fond of the fine cut of her silhouette on the horizon, to the Treasury (an institution in the austere, post-war years rapidly asserting its authority over the Admiralty) the thing was a white elephant, something acknowledged even by the romantic, battleship-loving Winston Churchill (1875-1965; UK prime-minister 1940-1945 & 1951-1955) who, when in November 1953 planning a trip to Bermuda for a summit meeting with Dwight Eisenhower (1890-1969; US POTUS 1953-1961), opted to fly because “it costs Stg£30,000 if we go by Vanguard, and only £3,000 by air.”  In 1959, Vanguard was sold for scrap and broken up the next year while the last of the Iowa class ships were decommissioned in 1992 after having spent many years of their lives in a non-active reserve.  Defiant is of course a most Churchillian word and after World War I (1914-1918) he was asked by a French municipality to devise the wording for its war memorial.  He proposed:

IN WAR: RESOLUTION

IN DEFEAT: DEFIANCE

IN VICTORY: MAGNANIMITY

IN PEACE: GOODWILL

At the time, old Georges Clemenceau (1841–1929; French prime minister 1906-1909 & 1917-1920) wasn’t feeling much magnanimity towards the Germans and nor was he much in the mood to extend any goodwill so Churchill’s suggestion was rejected.  

Depiction of Trump class USS Defiant issued by the US Navy, December, 2025.

The conventional wisdom therefore was the days of the big warships were done and the Soviet Navy’s curious decision in the 1980s to lay down five (four of which were launched) Kirov class battlecruisers seemed to confirm the view.  Although the Kremlin called the ships тяжёлый атомный ракетный крейсер (heavy nuclear-powered guided missile cruisers), admiralties in the West, still a nostalgic lot, chose to revive the old name “battlecruiser”.  The battlecruiser (essentially a battleship with less armor) was a brainchild of the naval theorists of the early twentieth century and while the concept was sound (and in practice might have proved so had the theory been followed at sea), in service it was a disappointment and none were commissioned after 1920 until the Soviets revived the idea.  As recently as 2018, NATO (North Atlantic Treaty Organization) sources were sceptical any of the Russian ships would ever return to service but in 2025 the Admiral Nakhimov (ex-Kalinin) emerged from a long and expensive re-fit & modernization to serve as the world’s biggest warship.  Although fast and heavily armed, concern remains about her vulnerability to missiles and torpedoes.

Depiction of Trump class USS Defiant issued by the US Navy, December, 2025.

The US Navy seems confident about the protection afforded by the Trump class’s systems, claiming “the battleship [the Pentagon’s term] will be capable of operating independently, as part of a Carrier Strike Group, or commanding its own Surface Action Group depending on the mission and threat environment.”  In other words, unlike an aircraft carrier, the security of the vessel does not depend on a flotilla of destroyers and other smaller escort vessels.  The first of the Trump class is projected to cost US$10-15 billion although, on the basis of experience, few will be surprised if this number “blows out”.  The Trump class will be the flagships for the Navy’s “Golden Fleet” initiative (an old naval term dating from the days of the Spanish colonial Empire and nothing to do with Mr Trump’s fondness for the metal).  In an age in which small, cheap UAVs (unmanned aerial vehicles, usually referred to as drones) have revolutionized warfare (on land and at sea), the return of the big ships is as interesting as it was unexpected and analysts are already writing their assessments of the prospects of success.

Although the concept wasn’t new, it was late in the nineteenth century that naval architects began to apply the word “class” systematically to group ships of the same design, the pioneer being the Royal Navy although other powers soon adopted the practice.  It had long been the practice for warships to be constructed on the basis of substantially replicating existing designs and some truly were “identical” to the extent a series would now be called a “class” but before the terminology became (more or less) standardized, warships usually were described by their “Rate” or “Type” (first-rate ship of the line, corvette, frigate etc) but, in the usual military way, there was also much informal slang including phrases such as “the Majestic battleships” or “ships of the Iron Duke type”.  The crystallization of the “class” concept was really a result of technological determinism as the methods developed in the factories which emerged during the industrial revolution spread to ship-building; steam power, hulls of iron & steel and the associated complex machinery made design & construction increasingly expensive, thus the need to amortize investment and reduce build times by ordering ships in batches with near-identical specifications.

Navies in the era were also becoming more bureaucratic (a process which never stopped and some believe is accelerating still) and Admiralties became much taken with precise accounting and doctrinal categorization.  The pragmatic admirals however saw no need to reinvent the wheel, “class” already well-established in engineering and taxonomy, the choice thus an obvious administrative convenience.  The “new” nomenclature wasn’t heralded as a major change or innovation, the term just beginning to appear in the 1870s in Admiralty documents, construction programmes and parliamentary papers in which vessels were listed in groups including Devastation class ironclad turret ships (laid down 1869), Colossus class battleships (laid down 1879) and Admiral class battleships (1880s).  In recent history texts, warships prior to this era sometimes are referred to as “Ship-of-the-line class”, “Three decker class” etc but this use is retrospective.  The French Navy adopted the convention almost simultaneously (with the local spelling classe) with Imperial Germany’s Kaiserliche Marine (Imperial Navy) following in the 1890s with Klasse.  The US Navy was comparatively late to formalize the use and although “class” in this context does appear in documents in the 1890s, the standardization wasn’t complete until about 1912.

As a naming convention (“King George V class”, “Iowa class” etc), the rule is the name chosen is either (1) the first ship laid down, or (2) the lead ship commissioned.  According to Admiralty historians, this wasn’t something determined by a committee or the whim of an admiral (both long naval traditions) but was just so obviously practical.  It certainly wasn’t an original idea because the term “class” was by the late nineteenth century well established in industrial production, civil engineering, and military administration; if anything the tradition-bound admirals were late-adopters, sticking to their old classificatory habits long after they had outlived their usefulness.  With ships becoming bigger and more complex, what was needed was a system (which encompassed not only the ships but also components such as guns, torpedoes, engines etc) which grouped objects according to their defined technical specification rather than their vague “type” (which by then had become most elastic) or individual instances; naval architecture had entered the “age of interchangeability”.

A docked Boomin' Beaver.

It’s good the US Navy is gaining (appropriately large) “Trump Class” warships (which the president doubtless will call “battleships” although they’re more in the “battlecruiser” tradition).  Within the fleet however there are on the register many smaller vessels and the most compact is the 19BB (Barrier Boat), a specialized class of miniature tugboat used to deploy and maintain the port security booms surrounding Navy ships and installations in port.  Over the last quarter century there have been a dozen-odd commissioned of which ten remain in active service.  Unlike many of the Pentagon’s good (and expensive) ideas, the Barrier Boats were a re-purposing of an existing design, their original purpose being in the logging industry where they were used to manoeuvre logs floating along inland waterways.  In that role the loggers dubbed them “log broncs” because the stubby little craft would “rear up like a rodeo bronco” when spun around by 180°.  Sailors of course have their own slang and they (apparently affectionately) call the 19BBs the “Boomin’ Beaver”, the origin of that being uncertain but it may verge on the impolite.  It’s not known if President Trump is aware of the useful little 19BB but if brought to his attention, he may be tempted to order two of them renamed “USS Joe Biden” and “USS Crooked Hillary” although, unlike those reprobates, the Boomin’ Beavers have done much good work for the nation.

The Arc de Triomphe, Paris (left), Donald Trump with a model of his proposed arch, the White House, October, 2025 (centre) and a model of the arch, photographed on the president's Oval Office desk (right).  Details about the arch remain sketchy but it's assumed (1) it will be "big" and (2) there will be some gold, somewhere.

As well as big ships (and the big Donald J Trump Ballroom already under construction where the White House’s East Wing once stood), Mr Trump is also promising a “big arch”.  A part of the president’s MDCBA (Make D.C. Beautiful Again) project, the structure (nicknamed the “Triumphal Arch” and in the style of the Arc de Triomphe which stands in the centre of the Place Charles de Gaulle (formerly the Place de l’Étoile), the western terminus of the avenue des Champs-Élysées) is scheduled to be completed in time to celebrate the nation’s 250th anniversary on 4 July 2026.  Presumably, on that day, it will be revealed the official name is something like the “Donald J Trump Sestercentennial Arch” which will appear on the structure in large gold letters.  The arch is said to be “privately funded”, using money left over from what was donated to build the ballroom, a financing mechanism which has attracted some comment from those concerned about the “buying of influence”.

Adolf Hitler's (1889-1945; Führer (leader) and German head of government 1933-1945 & head of state 1934-1945) sketch of an arch (1926, left) and Hitler, Albert Speer (1905–1981; Nazi court architect 1934-1942; Nazi minister of armaments and war production 1942-1945) and others examining Speer's model of the arch, presented 20 April 1939 upon the occasion of the Führer’s 50th birthday (right; note the pattern in the carpet).

A model of Germania.  To give some indication of the scale, within the dome of the huge meeting hall (at top of image), St. Peter's Basilica in Rome would have fitted several times over; the diameter of the dome would have been 250 metres (820 feet).

Commissioned to honor those who fought and died for France during the French Revolutionary (1792-1802) and Napoleonic Wars (1803-1815), construction of the Arc de Triomphe (officially the Arc de Triomphe de l'Étoile) absorbed 30-odd years between 1806-1836.  As a piece of representational architecture the structure is thought perfectly proportioned for assessment by the human eye and perhaps for this reason it has been admired by many.  As early as 1926, Adolf Hitler sketched his vision of a grand arch for Berlin and, while bitter experience taught him the big warships were a bad idea because of their vulnerability to air attack, he never lost his enthusiasm for megalomania in architecture and in Albert Speer he found the ideal architect.  Noting the dimensions in Hitler’s sketch, Speer responded with something in the spirit of their blueprint for Germania.  Hitler planned the rebuilding of Berlin to be complete by 1950, less than ten years after the expected victory in a war which would have made him the master of Europe from the French border to the Ural mountains (things didn’t work out well for him).  While the 50 metre (164 feet) tall Arc de Triomphe presented a monumental appearance and provided a majestic terminus for the Champs Elysees, Speer’s arch would have stood 117 metres (384 feet) in height and, even though obviously substantial, it would have been entirely in scale with the rest of Germania, the whole place built in a way to inspire awe simply by virtue of sheer size.

Tuesday, January 9, 2024

Compunction

Compunction (pronounced kuhm-puhngk-shuhn)

(1) A feeling of uneasiness or anxiety of the conscience caused by regret for doing wrong or causing pain; contrition; remorse; sorrow.

(2) Any uneasiness or hesitation about the rightness of an action.

1350–1400: From the Middle English compunccion, from the Old French compunction (from which in the twelfth century Modern French gained compunction), from the Late Latin compunctionem (a pricking) & compūnctiōn- (stem of the Ecclesiastical Latin compunctiō) (remorse; a stinging or pricking (of one’s guilty conscience)), the construct being the Classical Latin compūnct(us) (past participle of compungere (to sting; severely to prick), the construct of which was com- (used as an intensive prefix) + pungere (to prick; to puncture), from a suffixed form of the primitive Indo-European root peuk- (to prick)) + -iōn- (stem of –iō, a suffix forming nouns, used especially on past participle stems).  The origin of the meaning in Latin (transferred from the element pungere (to prick; to puncture)) was the idea of “a pricking of one’s guilty conscience” which could induce some feeling of regret although, like many injuries caused by pin-pricks, recovery was often rapid.  The adjective compunctious (causing compunction, pricking the conscience) dates from the late sixteenth century.  Compunction & compunctiousness are nouns, compunctious & compunctionless are adjectives and compunctiously is an adverb; the noun plural is compunctions.

The Ecclesiastical Latin compunctiō (and compunction in other forms) appears frequently in the texts of the early Church, used in a figurative sense originally to convey a more intense sense of “contrition” or “remorse” than that familiar in modern use.  Contrition and remorse were of course things vital for the Church to foster, indeed to demand of the congregation.  The very structure of Christianity was built upon the idea that all were born in a state of guilt because the act of conception depended upon original sin and this was what made Jesus unique: the virgin birth meant Christ was born without sin although centuries of theological squabbles would ensue as the debate swirled about his nature as (1) man, (2) the son of God and (3) God.  That was too abstract for most, which was fine with the priests who preferred to focus on the guilt of their flock and their own importance as the intermediaries between God and sinner, there to arrange forgiveness, something which turned out to be a commodity and commodities are there to be sold.  Forgiveness was really the first futures market and compunction was one of the currencies although gold and other mediums of exchange would also figure.

Sorry (regretful or apologetic for one's actions) was from the Middle English sory, from the Old English sāriġ (feeling or expressing grief, sorry, grieved, sorrowful, sad, mournful, bitter), from the Proto-West Germanic sairag, from the Proto-Germanic sairagaz (sad), from the primitive Indo-European seh₂yro (hard, rough, painful).  It was cognate with the Scots sairie (sad, grieved), the Saterland Frisian seerich (sore, inflamed), the West Frisian searich (sad, sorry), the Low German serig (sick, scabby), the German dialectal sehrig (sore, sad, painful) and the Swedish sårig.  Remarkably, despite the similarities in spelling and meaning, “sorry” is etymologically unrelated to “sorrow”.  Sorrow (a state of woe; unhappiness) was from the Middle English sorow, sorwe, sorghe & sorȝe, from the Old English sorg & sorh (care, anxiety, sorrow, grief), from the Proto-West Germanic sorgu, from the Proto-Germanic surgō (which may be compared with the West Frisian soarch, the Dutch zorg, the German Sorge, and the Danish, Swedish and Norwegian sorg), from the primitive Indo-European swergh (watch over, worry; be ill, suffer) (which may be compared with the Old Irish serg (sickness), the Tocharian B sark (sickness), the Lithuanian sirgti (be sick) and the Sanskrit सूर्क्षति (sū́rkati) (worry)).

Johnny Depp & Amber Heard saying sorry in Australia and Johnny Depp deconstructing sorry in London.

Sorry indicates (1) one is regretful or apologetic for one’s thoughts or actions but it can also mean (2) one is grieved or saddened (especially by the loss of something or someone), (3) someone or something is in a sad or regrettable state or (4) someone or something is hopelessly inadequate for their intended role or purpose.  Such is human nature that expressions of sorry in the sense of an apology are among the more common exchanges and one suspects something like the 80/20 rule applies: 80% of apologies are offered by (or extracted from) 20% of the population.  So frequent are they that an art has evolved to produce phrases by which an apology can be delivered in which sorry is somehow said without actually saying sorry.  This is the compunction one feels when one is not feeling compunctious and a classic example was provided when the once (perhaps then happily) married actors Johnny Depp (b 1963) & Amber Heard (b 1986) were in 2015 caught bringing two pet dogs into Australia in violation of the country’s strict biosecurity laws.  Ms Heard pleaded guilty to falsifying quarantine documents, stating in mitigation her mistake was induced by “sleep deprivation”.  No conviction was recorded (the maximum sentence available being ten years in jail) and she was placed on an Aus$1,000 one-month good behavior bond, the couple ordered to make a “public apology” and that they did, a short video provided, the script unexceptional but the performances something like a Monty Python sketch.  However, whatever the brief performance lacked in sincerity, as free advertising for the biosecurity regime, it was invaluable.  Mr Depp later returned to the subject when promoting a film in London.

The synonyms for “sorry” (as in an apology) include regret, apologize, compunctious, contrite, penitent, regretful, remorseful & repentant (which is more a subsequent act).  Practiced in the art of the “non-apologetic” apology are politicians (some of whom have honed it to the point where it’s more a science) who have a number of ways of nuancing things.  Sometimes the excuse is that simply to say “sorry” might in subsequent legal proceedings be construed as an admission of liability, thus exposing the exchequer, and there was some basis for that concern, which has prompted some jurisdictions explicitly to write into legislation that in traffic accidents and such, simply to say “sorry” cannot be construed as such an admission.  That of course has had no apparent effect on the behaviour of politicians.  Even when there is no possibility of exposing the state to some sort of claim, politicians are still averse to anything like the word “sorry” because it’s seen as a “loss of face” and a victory for one’s opponents.

There are exceptions.  Some politicians, especially during periods of high popularity, worked out that such was the novelty, saying sorry could work quite well, especially if delivered in a manner which seemed sincere (and the right subject, in the right hands, can learn such tricks) although some who found it worked did overdo it, the repetition making it clear it was just another cynical tactic.  An example was Peter Beattie (b 1952; Premier of Queensland 1998-2007) who found the electorate responded well to a leader saying sorry but such was the low quality of the government he headed that there was often something for which to apologize and having set the precedent, he felt compelled to carry on until the sheer repetitive volume of the compunctiousness began merely to draw attention to all the incompetence.

Lindsay Lohan and her lawyer in court, Los Angeles, December 2011.

The other exception is the set-piece event.  This is where a politician apologizes on behalf of someone else (a previous government, hopefully the opposition or something as vague as the nation in some dim, distant past) while making it clear it’s nothing personally to do with them.  There has been a spate of these in recent decades, many apologizing for egregiously appalling acts by white men against ethnic minorities, indigenous populations, the disabled or other powerless groups.  Again, some of the apologies have been in the form of “personally sorry it happened”, thereby ticking the box without costing anything; people like an indigenous population apparently deserving words but not compensation.  For the rest of us, ranging from the genuinely sincere to the cynically opportunistic nihilistic psychopaths, the most obvious tool is the adverb: to say “I am so sorry” can be more effective than “I’m sorry” provided the tone of voice, inflections and the non-verbal clues are all in accord.  Sorry is recommended by many because it can so easily be made to sound sincere, something challenging with compunctious, contrite, penitent, regretful and remorseful, the longer words ideal for one politician “apologizing” to another in a form which is linguistically correct while being quite contemptuous.

Sunday, January 1, 2023

Quarantine

Quarantine (pronounced kwawr-uhn-teen or kwor-uhn-teen)

(1) In historic English common law, the period of 40 days during which a widow was entitled to remain in her deceased husband's home while any dower was collected and returned.

(2) A strict isolation imposed to prevent the spread of disease and (by extension), any rigorous measure of isolation, regardless of the reason.

(3) A period, originally 40 days (the historic understanding of the maximum known incubation period of disease) of detention or isolation imposed upon ships, persons, animals, or plants on arrival at a port or place, when suspected of carrying some infectious or contagious disease; a record system kept by port health authorities in order to monitor and prevent the spread of contagious diseases.  The origin was in measures taken in 1448 in Venice's lazaret to avoid renewed outbreaks of the bubonic plague.

(4) In historic French law, a 40-day period imposed by the king upon warring nobles during which they were forbidden from exacting revenge or continuing warfare.

(5) A place where such isolation is enforced (a lazaret).

(6) In international relations, a blockade of trade, suspension of diplomatic relations, or other action whereby one country seeks to isolate another.

(7) In computing, a place where files suspected of harboring a computer virus or other harmful code are stored in a way preventing infection of other files or machines; also the process of such isolation (a brief sketch of the idea follows this list).

(8) To withhold a portion of a welfare payment from a person or group of people (Australia).

(9) As a verb, to place someone or something in quarantine.
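
Sense (7) lends itself to a small illustration.  What follows is a minimal sketch only; the directory path, file name and helper function are invented for the example, and real anti-malware tools also record hashes and detection metadata and usually encode the stored file so it cannot accidentally be re-opened.

```python
import os
import shutil
import stat
from pathlib import Path

# Hypothetical location for this sketch; real products maintain their own stores.
QUARANTINE_DIR = Path("/var/quarantine")

def quarantine_file(suspect: Path) -> Path:
    """Move a suspect file into an isolated directory and strip its
    permissions so it can no longer be executed or casually read."""
    QUARANTINE_DIR.mkdir(parents=True, exist_ok=True)
    destination = QUARANTINE_DIR / suspect.name
    shutil.move(str(suspect), destination)   # relocate the file out of harm's way
    os.chmod(destination, stat.S_IRUSR)      # owner read-only; execute bits cleared
    return destination

if __name__ == "__main__":
    print(quarantine_file(Path("/tmp/suspicious_attachment.bin")))
```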

1600–1610: From the Middle English quarentine (the period during which a ship suspected of carrying contagious disease is kept in isolation), from the Norman quarenteine, from the French quarenteine, from the Italian quarantina, a variant of quarantena, originally from the upper Italian (Venetian) dialect as quaranta giorni (space of forty days, group of forty), from quaranta (forty), from the Medieval Latin quarentīna (period of forty days; Lent), from the Classical Latin quadrāgintā (four tens, forty) and related to quattuor (four), from the primitive Indo-European root kwetwer (four).  The difference between quarantine and isolation is one of context; while people might for many reasons be isolated, quarantine is a public health measure to deal with those exposed to or at risk of having been infected by a communicable disease, the duration of the quarantine being sufficient to ensure any risk of spreading the infection has passed.  The name is from the Venetian policy (first enforced as the 30 day edict trentino in 1377) of keeping ships from plague-stricken countries waiting off its port for forty days to ensure no latent cases remained aboard.  The extended sense of "any period of forced isolation" dates from the 1670s.  A doublet of carene and quadragene.

In the context of the Ancien Régime (pre-revolutionary France), it was a calque of the French quarantaine, following the edicts of Louis IX (and formalized by the quarantaine du Roi (1704) of Louis XIV which was a mechanism of quieting squabbling nobles).  Quarantine was introduced to international relations as a euphemism for "blockade" in 1937 because the Roosevelt administration was (1) conscious of public reaction to the effects on civilians of the Royal Navy’s blockade of Imperial Germany during World War I (1914-1918) and (2) mindful of legal advice that a “blockade” of a non-belligerent was, under international law, probably an act of war.  The use was revived by the Kennedy administration during the Cuban Missile Crisis (October 1962).  The verb meaning "put under quarantine" dates from 1804 and came quickly to be used in any sense, including figuratively (to isolate, as by authority).  Predating the use in public health, in early sixteenth century English common law, the quarentine was the period of 40 days during which a widow was entitled to remain in her dead husband's home while any dower was collected and returned.  The alternative spellings quarentine, quarantin, quaranteen, quarantain, quarantaine, quarrentine, quarantene, quarentene, quarentyne & querentyne are all obsolete (except in historic references).  While not of necessity entirely synonymous, detention, sequester, separation, seclusion, segregation, sequestration, lazaretto, segregate, confine, separate, seclude, insulate, restrict, detach & cordon are at least vaguely similar.  Quarantine is a noun & verb, quarantiner is a noun, quarantinable is an adjective and quarantined & quarantining are verbs & adjectives.

In scripture, the number 40 often occurs although Biblical scholars, always anxious to dismiss musings from numerologists, new age practitioners and crystal-wearing basket weavers, reject the notion it has any special meaning beyond the idea of a “period of trial or struggle”, memorably expressed in the phrase “forty days and forty nights”.  In the Old Testament, when God destroyed the earth in the Great Flood, he delivered rain for 40 days and 40 nights (Genesis 7:12).  After killing the Egyptian, Moses fled to Midian where he spent 40 years in the desert tending flocks (Acts 7:30) and subsequently he stood on Mount Sinai for 40 days and 40 nights (Exodus 24:18) and then interceded on Israel’s behalf for 40 days and 40 nights (Deuteronomy 9:18, 25).  In Deuteronomy 25:3, the maximum number of lashes a man could receive as punishment for a crime was set at 40.  The Israelite spies took 40 days to spy out Canaan (Numbers 13:25), the Israelites wandered for 40 years (Deuteronomy 8:2-5) and before Samson’s deliverance, Israel served the Philistines for 40 years (Judges 13:1).  Goliath taunted Saul’s army for 40 days before David arrived to slay him (1 Samuel 17:16) and when Elijah fled from Jezebel, he traveled 40 days and 40 nights to Mt. Horeb (1 Kings 19:8).  The number 40 also appears in the prophecies of Ezekiel (4:6; 29:11-13) and Jonah (3:4).  In the New Testament, the quarentyne was the desert in which Christ fasted and was tempted for 40 days and 40 nights (Matthew 4:2) and there were 40 days between Jesus’ resurrection and ascension (Acts 1:3).  Presumably, this influenced Western medicine because it was long recommended (and still is by some) that women should rest for 40 days after childbirth.

Plague, the Venetians and Quarantino

The Plague of Justinian arrived in the Byzantine capital of Constantinople in 541, brought from recently conquered Egypt across the Mediterranean by plague-ridden fleas in the fur of rats on ships bringing loot from the war.  From the imperial capital it spread across Europe, Asia, North Africa and Arabia, killing an estimated thirty to fifty million, perhaps a quarter of the inhabitants of the eastern Mediterranean.  Plague never really went away, localized outbreaks happening periodically until it returned as a pandemic some eight-hundred years later; the Black Death, which hit Europe in 1347, claimed some two-hundred million in just four years and demographically, Europe would not for centuries recover.

There was at the time little scientific understanding of contagion but it became clear it was related to proximity so officials in the Venetian-controlled port city of Ragusa (now Dubrovnik in Croatia) resolved to keep newly arrived sailors in isolation until it was apparent they were healthy.  Initially, the sailors were confined to their ships for thirty days, formalized in a 1377 Venetian law as a trentino (thirty days), which radically reduced the transmission rate and by 1448, the Venetians had increased the forced isolation to forty days (quarantine), which, given bubonic plague’s thirty-seven day cycle from infection to death, was an example of a practical scientific experiment.  The word soon entered Middle English as quarantine (already in use in common law as a measure of certain rights accruing to a widow), the origin of the modern word and practice of quarantine.  The English had many opportunities to practice quarantine.  In the three-hundred odd years between 1348 and 1665, London suffered some forty outbreaks, about once a generation (or every twenty years), the significance of this pattern something which modern epidemiologists would later understand.  Quarantine laws were introduced in the early sixteenth century and proved effective, reducing the historic medieval death-rates to about twenty percent.

Eggs à la Lohan

In self-imposed quarantine in March 2020, Lindsay Lohan was apparently inspired by a widely shared motivational poem by Kitty O’Meara (on the internet dubbed the "poet laureate of the pandemic") which included the fragment:

And the people stayed home.  And read books, and listened, and rested, and exercised, and made art, and played games, and learned new ways of being, and were still.  And listened more deeply.  Some meditated, some prayed, some danced.  Some met their shadows.  And the people began to think differently.

One of Lindsay Lohan's recommendations for a time of quarantine was to take the time to cook, posting a photograph of Eggs à la Lohan, a tasty looking omelet.  The poem also contained the words:

And the people healed.  And, in the absence of people living in ignorant, dangerous, mindless, and heartless ways, the earth began to heal.  And when the danger passed, and the people joined together again, they grieved their losses, and made new choices, and dreamed new images, and created new ways to live and heal the earth fully, as they had been healed.

Unfortunately, viewed from early 2023, it would seem Ms O'Meara's hopes quarantine might have left us kinder, gentler and more thoughtful may not have been realized.  It may be Mr Putin didn’t read poem and just ate omelet.

Tuesday, April 26, 2022

Isolation

Isolation (pronounced ahy-suh-ley-shuhn)

(1) An act or instance of isolating; the state of being isolated.

(2) In medicine, the complete separation from others of a person suffering from contagious or infectious disease; quarantine.

(3) In diplomacy, the separation, as a deliberate choice by government, of a nation from other nations by nonparticipation in or withdrawal from international relations and institutions.

(4) In psychoanalysis, a process whereby an idea or memory is divested of its emotional component.

(5) In social psychology, the failure of an individual to maintain contact with others or genuine communication where interaction with others persists.

(6) In linguistics and other fields, to consider matters without regard to context.

(7) In chemistry, obtaining an element from one of its compounds, or of a compound from a mixture.

(8) In computing, a database property that determines when and how changes made in one transaction are visible to other concurrent transactions.
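
Sense (8) can be illustrated with a minimal sketch using Python's standard sqlite3 module: a row inserted but not yet committed by one connection is invisible to a second connection until the first commits.  The table, column names and temporary database file are invented for the example, and the precise guarantees (the “isolation levels”) vary between database engines.

```python
import os
import sqlite3
import tempfile

# A throw-away, file-backed database so two separate connections can share it.
db_path = os.path.join(tempfile.mkdtemp(), "isolation_demo.db")

writer = sqlite3.connect(db_path)
reader = sqlite3.connect(db_path)

writer.execute("CREATE TABLE accounts (name TEXT, balance INTEGER)")
writer.commit()

# The insert starts a transaction on the writer but is not yet committed.
writer.execute("INSERT INTO accounts VALUES ('alice', 100)")

# A second connection cannot yet see the uncommitted row.
print(reader.execute("SELECT COUNT(*) FROM accounts").fetchall())  # [(0,)]

writer.commit()  # the change now becomes visible to other connections

print(reader.execute("SELECT COUNT(*) FROM accounts").fetchall())  # [(1,)]
```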

1830s: A compound word, isolate + -ion.  A modern English borrowing from the French isolé (placed on an island (thus away from other people)).  Isolé was from the Italian isolato, past participle of isolare, the root of which was the Latin insulātus & insulātes (made into an island), from insula (island).  From circa 1740, English at first used the French isolé (rendered as isole) which appeared also as isole'd in the 1750s, isolate the verb emerging in the 1830s; isolated the past participle.  Isolation is now the most familiar form; the suffix –ion is from the Latin -iō (genitive -iōnis), appended to a perfect passive participle to form a noun of action.  Words with similar meanings, often varying by context, include solitude, desolation, confinement, segregation, remoteness, privacy, quarantine, sequestration, aloofness, detachment, withdrawal, exile, aloneness, concealment, retreat, hiding, reclusion, monkhood, and seclusion.

Isolation, Social Phobia and Social Anxiety Disorder

As long ago as 400 BC, the Greek physician Hippocrates (circa 460–370 BC) noted there were people who sought social isolation, describing them as those who "love darkness as life" adding, in a hint at later understandings of mental illness, they tended also to "think every man observes them."  Such folk doubtless pre-dated antiquity, being always part of organized societies, but it wasn’t until the late nineteenth century, when psychiatry emerged as a distinct field, that the particular human condition came to be known as social phobia or social neurosis, then thought of as a descriptor of extremely shy patients who sought isolation by choice.

Desolate: an emo in isolation.

Despite the increasing medicalization of the spectrum of the human condition, it wasn’t until 1968, in the second edition of the American Psychiatric Association's (APA) Diagnostic and Statistical Manual of Mental Disorders (DSM-II), that social isolation was described as a specific phobia of social situations or excessive fear of being observed or scrutinized by others but at this point the definition of social phobia was very narrow.  With the release in 1980 of the DSM-III, social phobia was included as an official psychiatric diagnosis although it restricted the criteria, noting those who sought social isolation did so because of a fear of “performance situations” and did not include fears of less formal encounters such as casual conversations.  Those with such broad fears were instead to be diagnosed with “avoidant personality disorder” which, for technical reasons defined within the DSM-III, could not be co-diagnosed with social phobia, an attitude reflecting the editors’ view that phobias and neuroses needed specifically to be codified rather than acknowledging there existed in some a “general anxiety” disorder.  This neglect was addressed in the 1987 revision to the DSM-III (DSM-III-R) which changed the diagnostic criteria, making it possible to diagnose social phobia and avoidant personality disorder in the same patient.  In this revision, the term "generalized social phobia" was introduced.  DSM-IV was published in 1994 and the term “social anxiety disorder” (SAD) replaced social phobia, this reflecting how broad and generalized fears are in the condition although the diagnostic criteria differed only slightly from those in the DSM-III-R.  The DSM-IV position remains essentially current; the modifications in the DSM-5 (2013) did not substantively change the diagnosis, altering little more than the wording of the time frame although the emphasis on recognizing whether the experience of anxiety is unreasonable or excessive was shifted from patient to clinician.

For some, COVID-19 isolation was a business opportunity.

Generalized anxiety disorder (GAD) and panic disorder (PD) were formalized when DSM-III was released in 1980; GAD had for some years been a noted thread in the clinical literature, and what DSM-III did was map it onto the usual pattern of diagnostic criteria.  In practice, because of the high degree of co-morbidity with other disorders, the utility of GAD as defined was soon a regular topic of discussion at conferences and the DSM’s editors responded, the parameters of GAD refined in subsequent releases between 1987 and 1994, by which time GAD’s diagnostic criteria had emerged in their recognizably modern form.  By the time the terminology for mental disorders began in the nineteenth century to be codified, the word anxiety had for hundreds of years been used in English to describe feelings of disquiet or apprehension and in the seventeenth century there was even a school of thought that it was a pathological condition.  It was thus unsurprising that “anxiety” was so often an element in psychiatry’s early diagnostic descriptors such as “pantophobia” and “anxiety neurosis”, terms which designated paroxysmal manifestations (panic attacks) as well as “interparoxysmal phenomenology” (the apprehensive mental state).  The notion of “generalized anxiety”, although not then in itself a diagnosis, was also one of the symptoms of many conditions including the vaguely defined neurasthenia, which was probably understood by many clinicians as something similar to what would later be formalized as GAD.  As a distinct diagnostic category however, it wasn’t until the DSM-III was released in 1980 that GAD appeared, anxiety neurosis split into (1) panic disorder and (2) GAD.  When the change was made, the editors noted it was a response to comments from clinicians, something emphasized when DSM-III was revised in 1987 (DSM-III-R), in effect to acknowledge there was a class of patient naturally anxious (who might once have been called neurotic or pantophobic) quite distinct from those for whom a source of anxiety could be deduced.  Thus, the cognitive aspect of anxiety became the critical criterion but within the profession, some scepticism about the validity of GAD as a distinct diagnostic category emerged, the most common concern being the difficulty in determining clear boundaries between GAD, other anxiety-spectrum disorders and certain manifestations of depression.

The modern label aside, GAD has a long lineage and elements of the diagnosis found in case histories written by doctors over the centuries would have seemed familiar to those working in the early nineteenth century, tales of concern or apprehension about the vicissitudes of life being common.  As psychiatry in those years began to coalesce as a speciality and papers increasingly were published, it became clear the behaviour of those suffering chronic anxiety could culminate in paroxysmal attacks; thus it was that GAD and panic attacks came to be so associated.  In English, the term panophobia (sometimes as pantaphobia, pantophobia or panphobia) dates from 1871, the word from the Late Latin pantŏphŏbŏs, from the Ancient Greek παντοφόβος (all-fearing (literally “anxiety about everything”)).  It appears in the surviving works of medieval physicians and it seems clear there were plenty of “pantophobic patients” who allegedly were afraid of everything, nor was the notion a product only of the Dark Ages: Aristotle (384-322 BC), in the seventh book of his Nicomachean Ethics (circa 350 BC), wrote there were men “…by nature apt to fear everything, even the squeak of a mouse”.

The first edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM-I (1952)) comprised what now seems a modest 130 pages.  The latest edition (DSM-5-TR (2022)) has 991 pages.  The growth is said to be the result of advances in science and a measure of the increasing comprehensiveness of the manual, not an indication that madness in the Western world is increasing.  The editors of the DSM would never use the word "madness" but for non-clinicians it's a handy term which can be applied to those beyond some point on the spectrum of instability.

Between Aristotle and the publication of the first edition of the DSM in 1952, physicians (and others) pondered, treated and discussed the nature of anxiety, and theories of its origin and recommendations for treatment came and went.  The DSM (retrospectively labelled DSM-I) was by later standards a remarkably slim document but unsurprisingly, anxiety was included, discussed in the chapter called “Psychoneurotic Disorders”, the orthodoxy of the time being that anxiety was a kind of trigger perceived by the conscious part of the personality and produced by a threat from within; how the patient dealt with that threat determined the reaction(s) displayed.  There was in the profession a structural determinism to this approach, the concept of defined “reaction patterns” at the time one of the benchmarks in US psychiatry.  When DSM-II was released in 1968, the category “anxiety reaction” was diagnosed when the anxiety was diffuse and neither restricted to specific situations or objects (ie the phobic reactions) nor controlled by any specific psychological defense mechanism as was the case in dissociative, conversion or obsessive-compulsive reactions.  Anxiety reaction was characterized by anxious expectation and differentiated from normal apprehensiveness or fear.  Significantly, in DSM-II the reactions were re-named “neuroses” and it was held anxiety was the chief characteristic of the neuroses, something which could be felt directly or controlled unconsciously and expressed through various symptoms.  This had the effect that the diagnostic category “anxiety neurosis” encompassed what would later be expressed as panic attacks and GAD.

As they now stand, the core diagnostic criteria for GAD are:

A: Excessive anxiety and worry (apprehensive expectation), occurring more days than not for at least 6 months, about a number of events or activities (such as work or matters relating to educational institutions).

B: The patient finds it difficult to control the worry.

C: The anxiety and worry are associated with three (or more) of the following six symptoms:

(1) Restlessness or feeling keyed up or on edge.

(2) Being easily fatigued.

(3) Difficulty concentrating or mind going blank.

(4) Irritability.

(5) Muscle tension.

(6) Sleep disturbance (difficulty falling or staying asleep, or restless, unsatisfying sleep).

The key change was that the criteria for GAD came to require fewer symptoms.  Whereas under the DSM-IV-TR (2000) individuals needed to exhibit at least three physical and three cognitive symptoms for a diagnosis of GAD, under DSM-5 (2013) only one of each was required, so not only were the accuracy and consistency of diagnosis (by definition) improved, the obvious practical effect was better to differentiate GAD from other anxiety disorders and (importantly) the usual worries and concerns endemic to the human condition.  The final significant aspect of the evolution was that by the time of DSM-5, GAD had become effectively an exclusionary diagnosis in that it cannot be diagnosed if the anxiety is better explained by other anxiety disorders, nor can it be diagnosed where the anxiety is caused directly by stressors or trauma.
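For readers curious about how such a checklist hangs together, below is a minimal Python sketch of criteria A-C as listed above, with the symptom threshold exposed as a parameter so the difference between a “three or more” rule and the lower one-symptom threshold discussed in the preceding paragraph can be seen.  It is purely illustrative: the field names, symptom labels and function are inventions for the example, not anything drawn from the DSM or from clinical practice, and the exclusions noted above (anxiety better explained by another disorder, or caused directly by stressors or trauma) are deliberately omitted.

    from dataclasses import dataclass

    # The six criterion-C symptoms listed above, reduced to short labels.
    SYMPTOMS = {
        "restlessness",
        "easily_fatigued",
        "difficulty_concentrating",
        "irritability",
        "muscle_tension",
        "sleep_disturbance",
    }

    @dataclass
    class Presentation:
        months_of_worry: int          # duration of the excessive worry (criterion A)
        worry_hard_to_control: bool   # criterion B
        symptoms: set                 # which of the six criterion-C symptoms are present

    def meets_listed_criteria(p, min_symptoms=3):
        # Criterion A: excessive worry on more days than not for at least six months.
        a = p.months_of_worry >= 6
        # Criterion B: the worry is difficult to control.
        b = p.worry_hard_to_control
        # Criterion C: at least min_symptoms of the six listed symptoms.
        c = len(p.symptoms & SYMPTOMS) >= min_symptoms
        return a and b and c

    # Six months of hard-to-control worry with two of the six symptoms.
    example = Presentation(6, True, {"irritability", "muscle_tension"})
    print(meets_listed_criteria(example))                  # False under the "three or more" wording
    print(meets_listed_criteria(example, min_symptoms=1))  # True under a one-symptom threshold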