
Saturday, January 3, 2026

Defiant

Defiant (pronounced dih-fahy-uhnt)

Characterized by defiance or a willingness to defy; boldly resistant or challenging.

1830s: From the French défiant, from the Old French present participle of the verb défier (to challenge, defy, provoke), the construct thus def(y) + “i” + -ant.  Defy dates from the mid thirteenth century and was from the Middle English defien, from the Old French desfier, from the Vulgar Latin disfidare (renounce one's faith), the construct being dis- (away) + fidus (faithful).  The construct in French was thus des- (in the sense of negation) + fier (to trust) (from the Vulgar Latin fīdāre, from the Classical Latin fīdere (to trust)).  In the fourteenth century, the meaning shifted from “be disloyal” to “challenge”.  The suffix –ant was from the Middle English –ant & -aunt, partly from the Old French -ant, from the Latin -āns; and partly (in adjectival derivations) a continuation of the use of the Middle English -ant, a variant of -and & -end, from the Old English -ende (the present participle ending).  Extensively used in the sciences (especially medicine and pathology), the agent noun was derived from the verb.  It was used to create adjectives (1) corresponding to a noun in -ance, having the sense of "exhibiting (the condition or process described by the noun)" and (2) derived from a verb, having the senses of: (2a) "doing (the verbal action)", and/or (2b) "prone/tending to do (the verbal action)".  In English, many of the words to which –ant was appended were not coined in English but borrowed from the Old French, Middle French or Modern French.  The negative adjectival forms are non-defiant & undefiant although there is a kind of middle ground described by quasi-defiant, semi-defiant & half-defiant, the latter three sometimes used in military conflicts where, for whatever reason, it’s been necessary (or at least desirable) for a force to offer a “token resistance” prior to an inevitable defeat.  The adjective over-defiant refers to a resistance or recalcitrance, the extent or duration of which is not justified by the circumstances; the comparative is “more defiant” and the superlative “most defiant”.  Defiant is a noun & adjective, defiantness is a noun and defiantly is an adverb; the noun plural is defiants.

Defiance in politics: use with caution

The commonly used synonyms include rebellious, direful, truculent, insolent, recalcitrant, refractory, contumacious & insubordinate but in diplomacy, such words must be chosen with care because what in one context may be a compliment, in another may be a slight.  This was in 1993 discovered by Paul Keating (b 1944; Prime Minister of Australia 1991-1996) who labelled Dr Mahathir bin Mohamad (b 1925; prime minister of Malaysia 1981-2003 & 2018-2020) “recalcitrant” when the latter declined to attend a summit meeting of the Asia-Pacific Economic Cooperation (APEC).  For historic reasons, Dr Mahathir was sensitive to the memories of the imperialist oppressors telling colonized people what to do and interpreted Mr Keating’s phrase as a suggestion he should be more obedient (the most commonly used antonym of defiant being obedient, others including submissive).  Things could quickly have been resolved (Dr Mahathir being of the “forgive but not forget” school of IR (international relations)) but, unfortunately, Mr Keating was brought up in the gut-wrenching “never apologize” tradition of the right-wing of the New South Wales (NSW) Labor Party so what could have been handled as a clumsy linguistic gaffe was allowed to drag on.

Circa 1933 Chinese propaganda poster featuring a portrait of Generalissimo Chiang Kai-shek (Chiang Chung-cheng).  Set in an oval frame below flags alongside stylized Chinese lettering, the generalissimo is depicted wearing his ceremonial full-dress uniform with decorations.

The admission an opponent is being “defiant” must also sometimes be left unsaid.  Ever since Generalissimo Chiang Kai-shek (1887-1975; leader of the Republic of China (mainland) 1928-1949 & the renegade province of Taiwan 1949-1975) in 1949 fled mainland China, settling on and assuming control of the island of Taiwan, the status of the place has been contested, most dramatically in the incidents which flare up occasionally in the straits between the island and the mainland, remembered as the First (1954–1955), Second (1958) and Third (1995-1996) Taiwan Strait Crises which, although sometimes in retrospect treated as sabre rattling or what Hun Sen (b 1952; prime minister of Cambodia (in one form or another) 1985-2023) might have called “the boys letting off steam”, were at the time serious incidents, each with the potential to escalate into something worse.  Strategically, the first two crises were interesting studies in Cold War politics, the two sides at one stage exchanging information about when and where their shelling would be aimed, permitting troops to be withdrawn from the relevant areas on the day.  The better to facilitate administrative arrangements, each side’s shelling took place on alternate days, satisfying honor on both sides.  The other landmark issue was China’s seat at the United Nations (UN), held by the Republic of China (ROC) (Taiwan) between 1945-1971 and by the People’s Republic of China (PRC) (the mainland) since.

Jiefang Taiwan, xiaomie Jiangzei canyu (Liberate Taiwan, and wipe out the remnants of the bandit Chiang) by Yang Keyang (楊可楊) and Zhao Yannian (趙延年). 

A 1954 PRC propaganda poster printed as part of the anti-Taiwan campaign during the first Taiwan Strait Crisis (1954-1955), Generalissimo Chiang Kai-shek depicted as a scarecrow erected on Taiwan by the US government and military.  Note the color of the generalissimo’s cracked and disfigured head (tied to a pole) and the similarity to the color of the American also shown.  The artists have included some of the accoutrements often associated with Chiang’s uniforms: white gloves, boots and a ceremonial sword.  The relationship between Chiang and the leaders of the PRC who defeated his army, Chairman Mao (Mao Zedong, 1893–1976; paramount leader of the PRC 1949-1976) and Zhou Enlai (1898–1976; PRC premier 1949-1976), was interesting.  Even after decades of defiance in his renegade province, Mao and Zhou still referred to him, apparently genuinely, as “our friend”, an expression which surprised both Richard Nixon (1913-1994; US president 1969-1974) and Henry Kissinger (1923-2023; US national security advisor 1969-1973 & secretary of state 1973-1977) who met the chairman and premier during their historic mission to Peking in 1972.

A toast: Comrade Chairman Mao Zedong (left) and  Generalissimo Chiang Kai-shek (right), celebrating the Japanese surrender, Chongqing, China, September 1945.  After this visit, they would never meet again.

Most people, apparently even within the PRC, casually refer to the place as “Taiwan” but state and non-governmental entities, anxious not to upset Beijing, use a variety of terms including “Chinese Taipei” (the International Olympic Committee (IOC) and the Fédération Internationale de Football Association (FIFA, the International Federation of Association Football) & its continental confederations (AFC, CAF, CONCACAF, CONMEBOL, OFC and UEFA)), “Taiwan District” (the World Bank) and “Taiwan Province of China” (the International Monetary Fund (IMF)).  Taiwan’s government uses an almost declarative “Republic of China”, the name adopted for China after the fall of the Qing dynasty and used between 1912-1949, and even “Chinese Taipei” isn’t without controversy, “Taipei” being the Taiwanese spelling whereas Beijing prefers “Taibei”, the spelling used in the mainland’s Pinyin system.  There have been variations on those themes and there’s also the mysterious “Formosa”, use of which persisted in the English-speaking world well into the twentieth century, despite the Republic of Formosa existing on the island of Taiwan for only a few months in 1895.  The origin of the name Formosa lies in the island in 1542 being named Ilha Formosa (beautiful island) by Portuguese sailors who had noticed it didn’t appear on their charts.  From there, most admiralties in Europe and the English-speaking world updated their charts, use of Formosa not fading until the 1970s.

All that history is well-known, if sometimes subject to differing interpretations, but some mystery surrounds the term “renegade province”, used in recent years with such frequency that a general perception seems to have formed that it’s Beijing’s official (or at least preferred) description of the recalcitrant island.  It certainly is not, yet in both the popular press and specialist journals, the phrase “renegade province” is habitually used to describe Beijing’s view of Taiwan.  Given that Beijing actually calls Taiwan the “Taiwan Province” (sometimes styled as “Taiwan District” but there seems no substantive difference in meaning) and has explicitly maintained it reserves the right to reclaim the territory (by military invasion if need be), it’s not unreasonable to assume the phrase does reflect the politburo's view but within the PRC, “renegade province” is so rare (in Chinese or English) as to be effectively non-existent, the reason said to be that rather than a renegade, the island is thought of as a province pretending to be independent; delusional rather than defiant.  Researchers have looked into when the phrase “renegade province” was first used in English to describe Taiwan.  There may be older or more obscure material which isn’t indexed or hasn’t been digitized but of that which can be searched, the first reference appears to be in a US literary journal from 1973 (which, it later transpired, received secret funding from the US Central Intelligence Agency (CIA)).  It took a while to catch on but, after appearing in the New York Times in 1982, it became a favorite during the administration of Ronald Reagan (1911-2004; US president 1981-1989) and has been part of the standard language of commentary since.  Diplomats, aware of Beijing's views on the matter, tend to avoid the phrase, maintaining the “delusional rather than defiant” line.

Picture of defiance: Official State Portrait of Vladimir Putin (2002), oil on canvas by Igor Babailov (b 1965).

The idea of a territory being a “renegade province” can be of great political, psychological (and ultimately military) significance.  The core justification used by Mr Putin (Vladimir Vladimirovich Putin; b 1952; president or prime minister of Russia since 1999) when explaining why his “special military operation” against Ukraine in 2022 was not an “invasion” or “war of aggression” (he probably concedes it may be a “state of armed conflict”) was his denial that Ukraine was a sovereign, independent state and his claim that Volodymyr Zelenskyy (b 1978, president of Ukraine since 2019) was not a legitimate president.  In other words, Ukraine is merely a region of modern Russia in something of the way it was once one of the 15 constituent SSRs (Soviet Socialist Republics) of the Soviet Union.  Although the Kremlin doesn’t use the phrase, in Mr Putin’s world view, Ukraine is a renegade province and he likely believes that applies also to the Baltic States (Latvia, Lithuania & Estonia) and possibly other former SSRs.  Like many, the CCP is watching events in Ukraine with great interest and, as recent “exercises” seem to suggest the People’s Liberation Army (PLA) has sufficiently honed its techniques to execute either a blockade (which would be an “act of war”) or a “quarantine” (which would not), the attention of Western analysts is now focused on the hardly secret training being undertaken to perfect what’s needed for the triphibious operations demanded by a full-scale invasion.  The US think-tanks which think much about this possibility have suggested “some time” in 2027 as the likely point at which the military high command would assure the CCP’s central committee such a thing is possible.  What happens then will depend upon (1) the state of things in the PRC and (2) the CCP’s assessment of how the long-term “strategic ambiguity” of Washington would manifest were an attempt made to finish the “unfinished business” of 1949.

Lindsay Lohan, who has lived a life of defiance.

The objectification of women’s body parts has of course been a theme in Western culture since at least Antiquity but rarely can as much attention have been devoted to a single fingernail as the one photographed on Lindsay Lohan’s hand in July 2010 (during her “troubled starlet” phase).  The text printed on the fingernail was sufficiently explicit not to need an academic deconstruction of its alleged meaning, given the image was taken while she was sitting in court listening to a judge sentence her for one of her many transgressions; the consensus was the text was there to send a “defiant message”, the internet’s collective conclusion (which wasn’t restricted to entertainment and celebrity sites) presumably reinforced by the nail being on the middle finger.  Ms Lohan admitted to finding this perplexing, tweeting on X (then known as Twitter) it was merely a manicure and had “…nothing to do w/court, it's an airbrush design from a stencil.”  So, rather than digital defiance, it was fashion.  Attributing a motif of defiance to Ms Lohan wasn’t unusual during her “troubled starlet” phase, one site assessing a chronological montage of her famous mug shots before concluding that with each successive shot, “Lindsay's face becomes more defiant — a young woman hardening herself against a world that had turned her into a punch-line”.

The Boulton Paul Defiant (1939-1943)

The Parthian shot was a military tactic, used by mounted archers and made famous by the Parthians, an ancient people of the Persian lands (modern-day Iran, an Islamic Republic since 1979).  While in real or feigned retreat on horseback, the Parthian archers would, in full gallop, turn their bodies backward to shoot at the pursuing enemy.  This demanded both fine equestrian skills (a soldier’s hands occupied by his bow & arrows) and great confidence in one's mount, something gained only by time spent between man & beast.  To make the achievement more admirable still, the Parthians used neither stirrups nor spurs, relying solely on pressure from their legs to guide and control their galloping mounts and, with varying degrees of success, the tactic was adopted by many mounted military formations including the Scythians, Huns, Turks, Magyars, and Mongols.  The Parthian Empire existed between 247 BC–224 AD.  The Royal Air Force (RAF) tried a variation of the Parthian shot with the Boulton Paul Defiant, a single-engined fighter and Battle of Britain contemporary of the better remembered Spitfire and Hurricane.  Uniquely, the Defiant had no forward-firing armament, all its firepower being concentrated in four .303 machine guns in a turret behind the pilot.  The theory behind the design dates from the 1930s when the latest multi-engined monoplane bombers were much faster than the contemporary single-engined biplane fighters then in service.  The RAF considered its new generation of heavily-armed bombers would be able to penetrate enemy airspace and defend themselves without a fighter escort and this of course implied enemy bombers would similarly be able to penetrate British airspace with some degree of impunity.

Boulton Paul Defiant.

By 1935, the concept of a turret-armed fighter had emerged.  The RAF anticipated having to defend the British Isles against massed formations of unescorted enemy bombers and, in theory, turret-armed fighters would be able to approach formations from below or from the side and coordinate their fire.  In design terms, it was a return to what often was done early in World War I, though that had been technologically deterministic, it being then quite an engineering challenge to produce reliable and safe (in the sense of not damaging the craft's own propeller) forward-firing guns.  Deployed not as intended, but as a fighter used against escorted bombers, the Defiant enjoyed considerable early success, essentially because at attack-range it appeared to be a Hurricane and the German fighter pilots were of course tempted to attack from above and behind, the classic hunter's tactic.  They were of course met by the Defiant's formidable battery.  However, the Luftwaffe learned quickly, unlike the RAF which for too long persisted with its pre-war formations which were neat and precise but also excellent targets.  Soon the vulnerability of the Defiant resulted in losses so heavy its deployment was unsustainable and it was withdrawn from front-line combat.  It did though subsequently prove a useful stop-gap as a night-fighter, providing the RAF with an effective means of combating night bombing until aircraft designed for the purpose entered service.

The Trump class "battleships"

In a surprise announcement, the Pentagon revealed the impending construction of a “new battleship class”, the first of the line (USS Defiant) to be the US Navy’s “largest surface combatant built since World War II [1939-1945]”.  The initial plans call for a pair to be launched with a long-term goal of as many as two dozen, construction to begin in 2030.  Intriguingly, Donald Trump (b 1946; US president 2017-2021 and since 2025) revealed that while the Department of Defense’s (it’s also now the Department of War) naval architects would “lead the design”, he personally would be involved “…because I’m a very aesthetic person.”  That may sound a strange imperative when designing something as starkly functional as a warship but in navies everywhere there’s a long tradition of “the beautiful ship” and the design language still in use, although much modified, is recognizably what it was more than a century earlier.  The Secretary of the Navy certainly stayed on-message, announcing the USS Defiant would be “…the largest, deadliest and most versatile and best-looking warship anywhere on the world’s oceans”, adding that components for the project would “be made in every state.”  It won't however be the widest because a quirk of US Navy ship design is that warships tend to be limited to a beam (width) of around 33 metres (108 feet), that being the limit for vessels able to pass through the Panama Canal.

Depiction of Trump class USS Defiant issued by the US Navy, December, 2025.

By comparison with the existing surface fleet, the 35,000 ton Defiant will be impressively large although, by historic standards, the largest (non-carrier) surface combatants now in service are of modest dimensions and displacement.  The largest now afloat are the 15,000-ton Zumwalt class destroyers (which really seem to be cruisers) while the 10,000 ton Ticonderoga class cruisers (which really are destroyers) are more numerous.  Even so, the Defiant will seem small compared with the twentieth century Dreadnoughts (which became a generic term for “biggest battleship”), the US Iowa class displacing 60,000 tons at their heaviest while the Japanese Yamato class weighed in at 72,000.  Even those behemoths would have been dwarfed by the most ambitious of the H-Class ships in Plan-Z which were on German drawing boards early in World War II.  Before reality bit hard, Adolf Hitler (1889-1945; Führer (leader) and German head of government 1933-1945 & head of state 1934-1945) left physics to the engineers and wasn't too bothered by economics.  After being disappointed by the proposal that the successors to the Bismarck-class ships would have their main armament increased only from eight 15-inch (380 mm) to eight 16-inch cannons, he ordered OKM (Oberkommando der Marine; the Naval High Command) to design bigger ships.  That directive emerged as the ambitious Plan Z which would have demanded so much steel that essentially nothing else in the Reich could have been built.  Although not one vessel in Plan Z ever left the slipway (the facilities even to lay down the keels non-existent), such a fleet would have been impressive, the largest (the H-44) fitted with eight 20-inch (508 mm) cannons.  Even more to the Führer’s liking was the concept of the H-45, equipped with eight 31.5 inch (800 mm) Gustav siege guns.  However, although he never lost faith in the key to success on the battlefield being bigger and bigger tanks, the experience of surface warfare at sea convinced Hitler the days of the big ships were over and he would even try to persuade the navy to retire all their capital ships and devote more resources to the submarines which, as late as 1945, he hoped might still prolong the war.  Had he imposed such priorities in 1937-1938 so the Kriegsmarine (German Navy) could have entered World War II with the ability permanently to have 100 submarines engaged in high-seas raiding rather than barely a dozen, the early course of the war might radically have been different.  Germany indeed entered the war without a single aircraft carrier (the only one laid down never completed), such was the confidence that the need to confront the Royal Navy either would never happen or was years away.

The US Navy in 1940 began construction of six Iowa class battleships but only four were ever completed because it had become clear the age of the aircraft carrier and submarine had arrived; the last battleship launched was the Royal Navy’s HMS Vanguard which entered service in 1946.  Although the admirals remained fond of the fine cut of her silhouette on the horizon, to the Treasury (an institution in the austere, post-war years rapidly asserting its authority over the Admiralty) the thing was a white elephant, something acknowledged even by the romantic, battleship-loving Winston Churchill (1874-1965; UK prime-minister 1940-1945 & 1951-1955) who, when in November 1953 planning a trip to Bermuda for a summit meeting with Dwight Eisenhower (1890-1969; US POTUS 1953-1961), opted to fly because “it costs Stg£30,000 if we go by Vanguard, and only £3,000 by air.”  In 1959, Vanguard was sold for scrap and broken up the next year while the last of the Iowa class ships were decommissioned in 1992 after having spent many years of their life in a non-active reserve.  Defiant is of course a most Churchillian word and after World War I (1914-1918), he was asked by a French municipality to devise the wording for its war memorial.  He proposed:

IN WAR: RESOLUTION

IN DEFEAT: DEFIANCE

IN VICTORY: MAGNANIMITY

IN PEACE: GOODWILL

At the time, old Georges Clemenceau (1841–1929; French prime minister 1906-1909 & 1917-1920) wasn’t feeling much magnanimity towards the Germans and nor was he much in the mood to extend any goodwill so Churchill’s suggestion was rejected.  

Depiction of Trump class USS Defiant issued by the US Navy, December, 2025.

The conventional wisdom therefore was the days of the big warships were done and the Soviet Navy’s curious decision in the 1980s to lay down five (four of which were launched) Kirov class battlecruisers seemed to confirm the view.  Although the Kremlin called the ships тяжёлый атомный ракетный крейсер (heavy nuclear-powered guided missile cruisers), admiralties in the West, still a nostalgic lot, chose to revive the old name “battlecruiser”.  The battlecruiser (essentially a battleship with less armor) was a brainchild of the naval theorists of the early twentieth century and while the concept was sound (and in practice might have proved so had the theory been followed at sea), in service it was a disappointment and none were commissioned after 1920 until the Soviets revived the idea.  As recently as 2018, NATO (North Atlantic Treaty Organization) sources were sceptical any of the Russian ships would ever return to service but in 2025 the Admiral Nakhimov (ex-Kalinin) emerged from a long and expensive re-fit & modernization to serve as the world’s biggest (non-carrier) warship.  Although she is fast and heavily armed, concern remains about her vulnerability to missiles and torpedoes.

Depiction of Trump class USS Defiant issued by the US Navy, December, 2025.

The US Navy seems confident about the protection afforded by the Trump class’s systems, claiming “the battleship [the Pentagon’s term] will be capable of operating independently, as part of a Carrier Strike Group, or commanding its own Surface Action Group depending on the mission and threat environment.”  In other words, unlike an aircraft carrier, the security of the vessel does not depend on a flotilla of destroyers and other smaller escort vessels.  The first of the Trump class is projected to cost between US$10 billion and US$15 billion although, on the basis of experience, few will be surprised if this number “blows out”.  The Trump class will be the flagships for the Navy’s “Golden Fleet” initiative (an old naval term dating from the days of the Spanish colonial Empire and nothing to do with Mr Trump’s fondness for the metal).  In an age in which small, cheap UAVs (unmanned aerial vehicles, usually referred to as drones) have revolutionized warfare (on land and at sea), the return of the big ships is as interesting as it was unexpected and analysts are already writing their assessments of the prospects of success.

Although the concept wasn’t new, it was late in the nineteenth century that naval architects began to apply the word “class” systematically to group ships of the same design, the Royal Navy the pioneer but other powers soon adopting the practice.  It had long been the practice for warships to be constructed on the basis of substantially replicating existing designs and some truly were “identical” to the extent a series would now be called a “class” but before the terminology became (more or less) standardized, warships usually were described by their “Rate” or “Type” (first-rate ship of the line, corvette, frigate etc); in the usual military way, there was also much informal slang including phrases such as “the Majestic battleships” or “ships of the Iron Duke type”.  The crystallization of the “class” concept was really a result of technological determinism as the methods developed in the factories which emerged during the industrial revolution spread to ship-building; steam power, hulls of iron & steel and the associated complex machinery made design & construction increasingly expensive, thus the need to amortize investment and reduce build times by ordering ships in batches with near-identical specifications.

Navies in the era were also becoming more bureaucratic (a process which never stopped and some believe is accelerating still) and Admiralties became much taken with precise accounting and doctrinal categorisation.  The pragmatic admirals however saw no need to reinvent the wheel, “class” already well-established in engineering and taxonomy, the choice thus an obvious administrative convenience.  The “new” nomenclature wasn’t heralded as a major change or innovation, the term just beginning to appear in the 1870s in Admiralty documents, construction programmes and parliamentary papers in which vessels were listed in groups including Devastation class ironclad turret ships (laid down 1869), Colossus class battleships (laid down 1879) and Admiral class battleships (1880s).  In recent history texts, warships prior to this era sometimes are referred to as “Ship-of-the-line class”, “Three decker class” etc but this use is retrospective.  The French Navy adopted the convention almost simultaneously (with the local spelling classe) with Imperial Germany’s Kaiserliche Marine (Imperial Navy) following in the 1890s with Klasse.  The US Navy was comparatively late to formalise the use and although “class” in this context does appear in documents in the 1890s, the standardization wasn’t complete until about 1912.

As a naming convention (“King George V class”, “Iowa class” etc), the rule is the name chosen is either (1) the first ship laid down, or (2) the lead ship commissioned.  According to Admiralty historians, this wasn’t something determined by a committee or the whim of an admiral (both long naval traditions) but was just so obviously practical.  It certainly wasn’t an original idea because the term “class” was by the late nineteenth century well established in industrial production, civil engineering, and military administration; if anything the tradition-bound admirals were late-adopters, sticking to their old classificatory habits long after they had outlived their usefulness.  With ships becoming bigger and more complex, what was needed was a system (which encompassed not only the ships but also components such as guns, torpedoes, engines etc) which grouped objects according to their defined technical specification rather than their vague “type” (which by then had become most elastic) or individual instances; naval architecture had entered the “age of interchangeability”.

A docked Boomin' Beaver.

It’s good the US Navy is gaining (appropriately large) “Trump Class” warships (which the president doubtless will call “battleships” although they’re more in the “battlecruiser” tradition).  Within the fleet however there are on the register many smaller vessels and the most compact is the 19BB (Barrier Boat), a specialized class of miniature tugboat used to deploy and maintain port security booms surrounding Navy ships and installations in port.  Over the last quarter century there have been a dozen-odd commissioned of which ten remain in active service.  Unlike many of the Pentagon’s good (and expensive) ideas, the Barrier Boats were a re-purposing of an existing design, their original purpose being in the logging industry where they were used to manoeuvre logs floating along inland waterways.  In that role the loggers dubbed them “log broncs” because the stubby little craft would “rear up like a rodeo bronco” when spun around by 180°.  Sailors of course have their own slang and they (apparently affectionately) call the 19BBs the “Boomin’ Beaver”, the origin of that being uncertain but it may verge on the impolite.  It’s not known if President Trump is aware of the useful little 19BB but if brought to his attention, he may be tempted to order two of them renamed “USS Joe Biden” and “USS Crooked Hillary” although, unlike those reprobates, the Boomin’ Beavers have done much good work for the nation.

The Arc de Triomphe, Paris (left), Donald Trump with model of his proposed arch, the White House, October, 2025 (centre) and a model of the arch, photographed on the president's Oval Office desk (right).  Details about the arch remain sketchy but it's assumed (1) it will be "big" and (2) there will be some gold, somewhere.

As well as big ships (and the big Donald J Trump Ballroom already under construction where the White House’s East Wing once stood), Mr Trump is also promising a “big arch”.  A part of the president’s MDCBA (Make D.C. Beautiful Again) project, the structure (nicknamed the “Triumphal Arch” and in the style of the Arc de Triomphe which stands in the centre of the Place Charles de Gaulle (formerly the Place de l’Étoile), the western terminus of the avenue des Champs-Élysées) is scheduled to be completed in time to celebrate the nation’s 250th anniversary on 4 July 2026.  Presumably, on that day, it will be revealed the official name is something like the “Donald J Trump Sestercentennial Arch” which will appear on the structure in large gold letters.  The arch is said to be “privately funded”, using money left over from what was donated to build the ballroom, a financing mechanism which has attracted some comment from those concerned about the “buying of influence”.

Adolf Hitler's (1889-1945; Führer (leader) and German head of government 1933-1945 & head of state 1934-1945) sketch of an arch (1926, left) and Hitler, Albert Speer (1905–1981; Nazi court architect 1934-1942; Nazi minister of armaments and war production 1942-1945) and others examining Speer's model of the arch, presented 20 April 1939 upon the occasion of the Führer’s 50th birthday (right; note the pattern in the carpet).

A model of Germania.  To give some indication of the scale, within the dome of the huge meeting hall (at top of image), St. Peter's Basilica in Rome would have fitted several times over; the diameter of the dome would have been 250 metres (825 feet).

Commissioned to honor those who fought and died for France during the French Revolutionary (1792-1802) and Napoleonic Wars (1803-1815), construction of the Arc de Triomphe (officially the Arc de Triomphe de l'Étoile) absorbed 30-odd years between 1806-1836.  As a piece of representational architecture the structure is thought perfectly proportioned for assessment by the human eye and perhaps for this reason it has been admired by many.  As early as 1926, Adolf Hitler sketched his vision of a grand arch for Berlin and while bitter experience taught him the big warships were a bad idea because of their vulnerability to air attack, he never lost his enthusiasm for megalomania in architecture and in Albert Speer he found the ideal architect.  Noting the dimensions in Hitler’s sketch, Speer responded with something in the spirit of their blueprint for Germania.  Hitler planned the rebuilding of Berlin to be complete by 1950, less than ten years after the expected victory in a war which would have made him the master of Europe from the French border to the Ural mountains (things didn’t work out well for him).  While the 50 metre (163 feet) tall Arc de Triomphe presented a monumental appearance and provided a majestic terminus for the Champs-Élysées, Speer’s arch would have stood 117 metres (384 feet) in height and, though obviously substantial, it would have been entirely in scale with the rest of Germania, the whole place built in a way to inspire awe simply by virtue of sheer size.

Thursday, November 28, 2024

Cereal & Serial

Cereal (pronounced seer-ee-uhl)

(1) Any plant of the grass family yielding an edible grain (wheat, rye, oats, rice, corn, maize, sorghum, millet etc).

(2) The grain from those plants.

(3) An edible preparation of these grains, applied especially to packaged (often processed) breakfast foods.

(4) Of or relating to grain or the plants producing it.

(5) A hamlet in Alberta, Canada.

(6) As Ceres International Women's Fraternity, a women's fraternity focused on agriculture, founded on 17 August 1984 at the International Conclave of FarmHouse fraternity.

1590s: From the sixteenth century French céréale (having to do with cereal), from the Latin cereālis (of or pertaining to the Roman goddess Ceres), from the primitive Indo-European ker-es-, from the root ker- (to grow) from which Latin gained also sincerus (source of the English sincere) and crēscō (grow) (source of the English crescent).  The noun use of cereal in the modern sense of “a grass yielding edible grain and cultivated for food” emerged in 1832 and was developed from the adjective (having to do with edible grain), use of which dates from 1818, also from the French céréale (in the sense of the grains).  The familiar modern use (packaged grain-based food intended for breakfast) was a creation of US English in 1899.  If used in reference to the goddess Ceres, an initial capital should be used.  Cereal, cereology & cerealogist are nouns and ceralic is an adjective; the noun plural is cereals.

Lindsay Lohan mixing Pilk.

Cereal is often used as a modifier (cereal farming, cereal production, cereal crop, non-cereal, cereal bar, pseudocereal, cereal dust etc) and a cereologist is one who works in the field of cerealogy (the investigation, or practice, of creating crop circles).  The term “cereal killer” is used of one noted for their high consumption of breakfast cereals although some might be tempted to apply it to those posting TikTok videos extolling the virtue of adding “Pilk” (a mix of Pepsi-Cola & milk) to one’s breakfast cereal.  Pilk entered public consciousness in December 2022 when Pepsi Corporation ran a “Dirty Sodas” promotion for the concoction, featuring Lindsay Lohan.  There is some concern about the high sugar content in packaged cereals (especially those marketed towards children) but for those who want to avoid added sugar, Pepsi Corporation does sell “Pepsi Max Zero Sugar” soda and Pilk can be made using this.  Pepsi Max Zero Sugar contains carbonated water, caramel color, phosphoric acid, aspartame, acesulfame potassium, caffeine, citric acid, potassium benzoate & calcium disodium EDTA.

TikTok, adding Pilk to cereal and the decline of Western civilization.

A glass of Pilk does of course make one think of Lindsay Lohan but every mouthful of one’s breakfast cereal is something of a tribute to a goddess of Antiquity.  In 496 BC, Italy was suffering one of its periodic droughts, this one particularly severe and lingering, the Roman fields dusty and parched.  As was the practice, the priests travelled to consult the Sibylline oracle, returning to the republic’s capital to report a new goddess of agriculture had to be adopted and sacrifices needed immediately to be made to her so rain would again fall on the land.  It was Ceres who was chosen and she became the goddess of agriculture and protector of the crops while the caretakers of her temple were the overseers of the grain market (they were something like the wheat futures traders in commodity exchanges like the Chicago Board of Trade (CBOT)).  It was the will of the goddess Ceres which determined whether a harvest was prolific or sparse and to ensure abundance, the Romans ensured the first cuttings of the corn were always sacrificed to her.  It’s from the Latin adjective cereālis (of or pertaining to the Roman goddess Ceres) that English gained “cereal”.

For millennia humanity’s most widely cultivated and harvested crop, cereal is a grass cultivated for its edible grain, the best known of which are rice, barley, millet, maize, rye, oats, sorghum & wheat.  Almost all cereals are annual crops (ie yielding one harvest per planting) although some strains of rice can be grown as a perennial and an advantage of cereals is that the differential in growth rates and temperature tolerance means harvesting schedules can be spread from mid-spring until late summer.  Except for the more recent hybrids, all cereals are variations of natural varieties and the first known domestication occurred early in the Neolithic period (circa 7000–1700 BC).  Although the trend in cultivated area and specific yield tended over centuries to display a gradual rise, it was the “green revolution” (a combination of new varieties of cereals, chemical fertilizers, pest control, mechanization and precise irrigation which began to impact agriculture at scale in the mid twentieth century) which produced the extraordinary spike in global production.  This, coupled with the development of transport & distribution infrastructure (ports and bulk carriers), made possible the increase in the world population, now expected to reach around 10 billion by mid-century before declining.

Serial (pronounced seer-ee-uhl)

(1) Anything published, broadcast etc, in short installments at regular intervals (a novel appearing in successive issues of a magazine (ie serialized); a radio or TV series etc).

(2) In library & publishing jargon, a publication in any medium issued in successive parts bearing numerical or chronological designation and intended to be continued indefinitely.

(3) A work published in installments or successive parts; pertaining to such publication; pertaining to, arranged in, or consisting of a series.

(4) Occurring in a series rather than simultaneously (used widely: serial marriage, serial murderer, serial adulterer etc).

(5) Effecting or producing a series of similar actions.

(6) In IT, of or relating to the apparent or actual performance of data-processing operations one at a time (in the order of occurrence or transmission); of or relating to the transmission or processing of each part of a whole in sequence, as each bit of a byte or each byte of a computer word.

(7) In grammar, of or relating to a grammatical aspect relating to an action that is habitual and ongoing.

(8) In formal logic and mathematical logic (of a relation) connected, transitive, and asymmetric, thereby imposing an order on all the members of the domain.

(9) In engineering & mass-production (as “serial number”), a unique (to a certain product, model etc) character string (which can be numeric or alpha-numeric) which identifies each individual item in the production run.

(10) In music, of, relating to, or composed in serial technique.

(11) In modern art, a movement of the mid-twentieth century avant-garde in which objects or constituent elements were assembled in a systematic process, in accordance with the principles of modularity.

(12) In UK police jargon, a squad of officers equipped with shields and other protective items, used for crowd and riot control.

1823: From the New Latin word seriālis, from the Classical Latin seriēs (series), the construct being seri(ēs) + -al on the Latin model which was seriēs + -ālis.  It was cognate to the Italian seriale.  The Latin seriēs was from serere (to join together, bind), ultimately from the primitive Indo-European ser- (to bind, put together, to line up).  The -al suffix was from the Middle English -al, from the Latin adjectival suffix -ālis (the third-declension two-termination suffix (neuter -āle) used to form adjectives of relationship from nouns or numerals) or the French, Middle French and Old French -el & -al.  It was used to denote the sense "of or pertaining to", an adjectival suffix appended (most often to nouns) originally most frequently to words of Latin origin, but since used variously and also was used to form nouns, especially of verbal action.  The alternative form in English remains -ual (-all being obsolete).  The -ālis suffix was from the primitive Indo-European -li-, which later dissimilated into an early version of -āris and there may be some relationship with hel- (to grow); -ālis (neuter -āle) was the third-declension two-termination suffix and was suffixed to (1) nouns or numerals creating adjectives of relationship and (2) adjectives creating adjectives with an intensified meaning.  The suffix -ālis was added (usually, but not exclusively) to a noun or numeral to form an adjective of relationship to that noun.  When suffixed to an existing adjective, the effect was to intensify the adjectival meaning, and often to narrow the semantic field.  If the root word ends in -l or -lis, -āris is generally used instead although because of parallel or subsequent evolutions, both have sometimes been applied (eg līneālis & līneāris).  Serial, serializer, serialization, serialism & serialist are nouns, serializing, serialize & serialized are verbs, serializable is an adjective and serially is an adverb; the noun plural is serials.

The “serial killer” is a staple of the horror film genre.  Lindsay Lohan’s I Know Who Killed Me (2007) was not well received upon release but it has since picked up a cult following.

The adjective serial (arranged or disposed in a rank or row; forming part of a series; coming in regular succession) seems to have developed much in parallel with the French sérial although the influence of one on the other is uncertain.  The word came widely to be used in English by the mid nineteenth century because the popular author Charles Dickens (1812–1870) published his novels in instalments (serialized); sequentially, chapters would appear over time in periodicals and only once the series was complete would a book appear containing the whole work.  The first use of the noun “serial” to mean “story published in successive numbers of a periodical” was in 1845 and that came from the adjective; it was a clipping of “serial novel”.  By 1914 this had been extended to film distribution and the same idea would become a staple of radio and television production, the most profitable form of which was apparently the “mini-series”, a term first used in 1971 although the concept had been in use for some time.  Serial number (indicating position in a series) was first recorded in 1866, originally of papers, packages and such and it was extended to soldiers in 1918.  Surprisingly perhaps, given the long history of the practice, the term “serial killer” wasn’t used until 1981 although the notion of “serial events” had been used of seemingly sequential or related murders as early as the 1960s.  On that model, serial became a popular modifier (serial rapist, serial adulterer, serial bride, serial monogamist, serial pest, serial polygamy etc).

For those learning English, the existence of the homophones “cereal” & “serial” must be an annoying quirk of the language.  Because cereals are usually an annual crop, it’s reasonable if some assume the two words are related because wheat, barley and such are handled in a “serial” way, planting and harvesting recurrent annual events.  Doubtless students are told this is not the case but there is a (vague) etymological connection in that the Latin serere meant “to join together, to bind” and it was used also to mean “to sow” so there is a connection in agriculture: sowing seeds in fields.  For serial, the connection is structural (linking elements in a sequence, something demonstrated literally in the use in IT and in a more conceptual way in “serial art”) but despite the differences, both words in a way involve the fundamental act of creating order or connection.

Serial art by Swiss painter Richard Paul Lohse (1902–1988): Konkretion I (Concretion I, 1945-1946), oil on pavatex (a wood fibre board made from compressed industrial waste) (left), Zwei gleiche Themen (Two same topics, 1947), colored pencil on paper (centre) and  Konkretion III (1947), oil on pavatex.

In modern art, “serial art” was a movement of the mid-twentieth century avant-garde in which objects or constituent elements were assembled in a systematic process in accordance with the principles of modularity.  It was a concept the legacy of which was to influence (some prefer “infect”) other artistic schools rather than develop as a distinct paradigm but serial art is still practiced and remains a relevant concept in contemporary art.  The idea was of works based on repetition, sequences or variations of a theme, often following a systematic or conceptual approach; the movement was most active during the mid-twentieth century and was a notable theme in Minimalism, Donald Judd (1928-1994), Andy Warhol (1928–1987), Sol LeWitt (1928-2007) (there must have been something “serial” about 1928) and Richard Paul Lohse (1902-1988) all pioneers of the approach.  Because the techniques of the serialists were adopted by many, their style became interpolated into many strains of modern art so to now speak of it as something distinctive is difficult except in a historic context.  The embrace by artists of digital tools, algorithms, and AI (Artificial Intelligence) technologies has probably restored a new sort of “purity” to serial art because generative processes are so suited to creating series of images, sculptures or digital works that explore themes like pattern, progression, or variation, the traditional themes of chaos, order and perception represented as before.  In a way, serial art was just waiting for lossless duplication and the NFT (Non-fungible token) and more conservative critics still grumble the whole idea is little different to an architect’s blueprint which documents the structural framework without the “skin” which lends the shape its form.  They claim it's the engineering without the art.
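To make the “systematic process” concrete, here is a toy sketch in Python, purely illustrative and not modelled on any particular serialist's method (the module names and the rule are invented for the example): a fixed set of modules is run through every ordering, each permutation becoming one work in the series.

# Toy "serial" generation: a fixed set of modules, one work per permutation.
from itertools import permutations

modules = ["red", "yellow", "blue", "black"]  # the constituent elements

# The systematic rule: every ordering of the modules is rendered as one work.
series = [" | ".join(ordering) for ordering in permutations(modules)]

for number, work in enumerate(series[:6], start=1):  # show the first six works
    print(f"Work {number:02d}: {work}")
print(f"... {len(series)} works in the complete series")

Swap the rule (rotations, colour inversions, grid offsets) and the same handful of modules yields a different series, which is roughly the sense in which the serialists worked in systems rather than in single images.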

Relics of the pre-USB age; there were also 25 pin serial ports.

In IT hardware, “serial” and “parallel” refer to two different methods of transmitting data between devices or components and the distinction lies in how data bits are sent over a connection.  In serial communication, data is transmitted one bit at a time over as little as a single channel or wire which in the early days of the industry was inherently slow although in modern implementations (such as USB (Universal Serial Bus) or PCIe (Peripheral Component Interconnect Express)) high speeds are possible.  Given what was needed in the early days, serial technology was attractive because the reduction in wiring reduced cost and complexity, especially over the (relatively) long distances at which serial excelled and, with the use of line-drivers, the distances frequently were extended to hundreds of yards.  The trade-off was of course slower speed but these were simpler times.  In parallel communication, data is transmitted multiple bits at a time, each bit traveling simultaneously over its own dedicated channel, making it much faster than serial transmission.  Because more wires were demanded, the cost and complexity increased, as did the potential for interference and corruption but most parallel transmission was over short distances (25 feet (7½ metres) was “long-distance”) and the emergence of “error correcting” protocols made the mode generally reliable.  For most, it was the default method of connecting a printer and for large file sizes the difference in performance was discernible, the machines able to transmit more data in a single clock cycle due to simultaneous bit transmission.  Except for specialized applications or those dealing with legacy hardware (and in industries like small-scale manufacturing where such dedicated machines can be physically isolated from the dangers of the internet, parallel and serial ports and cables continue to render faithful service), parallel technology is effectively obsolete and serial connections are now almost universally handled by the various flavours of USB.
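For anyone who prefers to see the distinction rather than read it, the sketch below is purely illustrative (no real port, driver or protocol is being modelled and the function names are invented for the example); each list entry stands for one clock cycle, serial sending one bit per cycle over a single line and parallel sending eight bits per cycle over eight lines.

# Illustrative only: contrasting serial and parallel transfer of the same bytes.
def serial_transmit(data: bytes) -> list:
    """One bit per 'clock cycle' over a single line (least significant bit first)."""
    line = []
    for byte in data:
        for position in range(8):
            line.append((byte >> position) & 1)  # a single wire carries one bit per cycle
    return line

def parallel_transmit(data: bytes) -> list:
    """Eight bits per 'clock cycle', each bit on its own dedicated line."""
    cycles = []
    for byte in data:
        cycles.append([(byte >> position) & 1 for position in range(8)])  # eight wires at once
    return cycles

message = b"OK"
print(len(serial_transmit(message)))    # 16 cycles needed: one bit at a time
print(len(parallel_transmit(message)))  # 2 cycles needed: eight lines in step

The eight-fold difference in cycle counts is the whole story in miniature: the parallel port's advantage was simultaneity, and serial overtook it only once clock rates and encoding schemes rose far enough to make one fast lane cheaper and cleaner than eight slow ones.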

Wednesday, June 12, 2024

Reduction

Reduction (pronounced ri-duhk-shuhn)

(1) The act of reducing or the state of being reduced.

(2) The amount by which something is reduced or diminished.

(3) The form (result) produced by reducing a copy on a smaller scale (including smaller scale copies).

(4) In cell biology, as meiosis, especially the first meiotic cell division in which the chromosome number is reduced by half.

(5) In chemistry, the process or result of reducing (a reaction in which electrons are gained and valence is reduced; often by the removal of oxygen or the addition of hydrogen).

(6) In film production when using physical film stock (celluloid and such), the process of making a print of a narrower gauge from a print of a wider gauge (historically from 35 to 16 mm).

(7) In music, a simplified form, typically an arrangement for a smaller number of parts, such as an orchestral score arranged for a solo instrument.

(8) In computability theory, a transformation of one problem into another problem, such as mapping reduction or polynomial reduction.

(9) In philosophy (notably in phenomenology), a process intended to reveal the objects of consciousness as pure phenomena.

(10) In metalworking, the ratio of a material's change in thickness compared to its thickness prior to forging and/or rolling.

(11) In engineering, (usually as “reduction gear”), a means of energy transmission in which the original speed is reduced to whatever is suitable for the intended application.

(12) In surgery, a procedure to restore a fracture or dislocation to the correct alignment, usually with a closed approach but sometimes with an open approach.

(13) In mathematics, the process of converting a fraction into its decimal form or the rewriting of an expression into a simpler form.

(14) In cooking, the process of rapidly boiling a sauce to concentrate it.

(15) During the colonial period, a village or settlement of Indians in South America established and governed by Spanish Jesuit missionaries.

1475–1485: From the Middle English reduccion, from the Middle French reduction, from the Latin reductiōnem & reductiōn- (stem of reductiō (a “bringing back”)), the construct being reduct(us) (past participle of redūcere (to lead back)) + -iōn- (the noun suffix).  The construct in English was thus reduc(e) + -ion.  Reduce was from the Middle English reducen, from the Old French reduire, from the Latin redūcō (reduce), the construct being re- (back) + dūcō (lead).  The –ion suffix was from the Middle English -ioun, from the Old French -ion, from the Latin -iō (genitive -iōnis).  It was appended to a perfect passive participle to form a noun of action or process, or the result of an action or process.  Reduction, reductivism & reductionism are nouns, reductionist is a noun & adjective, reductional, reductionistic & reductive are adjectives; the noun plural is reductions.  Forms like anti-reduction, non-reduction, over-reduction, pre-reduction, post-reduction, pro-reduction, self-reduction have been created as required.

Actor Ariel Winter (b 1998), before (left) and after (right) mammaplasty (breast reduction).  Never has it satisfactorily been explained why this procedure seems to be lawful in all jurisdictions.

In philosophy & science, reductionism is an approach used to explain complex phenomena by reducing them to their simpler, more fundamental components.  It posits that understanding the parts of a system and their interactions can provide a complete explanation of the system as a whole an approach which is functional and valuable is some cases and to varying degrees inadequate in others.  The three generally recognized classes of reductionism are (1) Ontological Reductionism, the idea that reality is composed of a small number of basic entities or substances, best illustrated in biology where life processes are explained by reducing things to the molecular level.  (2) Methodological Reductionism, an approach which advocates studying systems by breaking into their constituent parts, much used in psychology where it might involve studying human behavior by examining neurological processes.  (3) Theory Reductionism which involves explaining a theory or phenomenon in one field by the principles of another, more fundamental field as when chemistry is reduced to the physics or chemical properties explained by the operation of quantum mechanics.  Reduction has been an invaluable component in many of the advances in achieved in science in the last two-hundred-odd years and some of the process and mechanics of reductionism have actually been made possible by some of those advances.  The criticism of an over-reliance on reductionism in certain fields in that its very utility can lead to the importance of higher-level structures and interactions being overlooked; there is much which can’t fully be explained by the individual parts or even their interaction.  The diametric opposite of reductionism is holism which emphasizes the importance of whole systems and their properties that emerge from the interactions between parts.  In philosophy, reductionism is the position which holds a system of any level of complexity is nothing but the sum of its parts and an account of it can thus be reduced to accounts of individual constituents.  It’s very much a theoretical model to be used as appropriate rather than an absolutist doctrine but it does hold that phenomena can be explained completely in terms of relations between other more fundamental phenomena: epiphenomena.  A reductionist is either (1) an advocate of reductionism or (2) one who practices reductionism.

Reductionism: Lindsay Lohan during her "thin phase".

The adjective reductive has a special meaning in Scots law pertaining to reduction of a decree or other legal device (ie something rescissory in its effect); dating from the sixteenth century, it’s now rarely invoked.  In the sense of “causing the physical reduction or diminution of something” it’s been in use since the seventeenth century in fields including chemistry, metallurgy, biology & economics, always to convey the idea of reducing a substance, object or some abstract quantum to a lesser, simplified or less elaborated form.  At that time, it came to be used also to mean “that can be derived from, or referred back to, something else” and although archaic by the early 1800s, its existence in historic texts can be misleading.  It wasn’t until after World War II (1939-1945) that reductive emerged as a derogatory term, used to suggest an argument, issue or explanation has been “reduced” to a level of such simplicity that so much has been lost as to rob things of meaning.  The phrase “reductio ad absurdum” (reduction to the absurd) is an un-adapted borrowing from the Latin reductiō ad absurdum and began in mathematics and formal logic, where it was a useful tool in deriving proofs.  In wider use, it has come to describe a method of disproving a statement by assuming the statement is true and, with that assumption, arriving at a blatant contradiction; the synonyms are apagoge & “proof by contradiction”.
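As a worked illustration (the textbook example, not anything drawn from the sources above), the classic proof that the square root of two is irrational shows the method in miniature; it is set here as a LaTeX proof environment (the amsthm package assumed).

% Reductio ad absurdum: assume the statement's negation, derive a contradiction.
\begin{proof}
Assume, for contradiction, that $\sqrt{2}$ is rational, so $\sqrt{2} = p/q$ with
$p, q$ integers sharing no common factor.  Squaring gives $p^2 = 2q^2$, so $p^2$
is even and hence $p$ is even; write $p = 2k$.  Then $4k^2 = 2q^2$, so $q^2 = 2k^2$
and $q$ is even too.  Both $p$ and $q$ are thus even, contradicting the assumption
that they share no common factor, so the premise is false and $\sqrt{2}$ is irrational.
\end{proof}

The shape of the argument (assume, derive the impossible, conclude the opposite) is exactly the apagoge described above.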

Single-family houses (D-Zug) built in 1922 on the principle of architectural reductionism by Heinrich Tessenow in collaboration with Austrian architect Franz Schuster (1892–1972), Moritzburger Weg 19-39 (the former Pillnitzer Weg), Gartenstadt Hellerau, Dresden, Germany.

As a noun, a reductivist is one who advocates or adheres to the principles of reductionism or reductivism.  In art & architecture (and some aspects of engineering) this can be synonymous with the label “a minimalist” (one who practices minimalism).  As an adjective, reductivist (the comparative “more reductivist”, the superlative “most reductivist”) means (1) tending to reduce to a minimum or to simplify in an extreme way and (2) belonging to the reductivism movement in art or music.  The notion of “extreme simplification” (a reduction to a minimum; the use of the fewest essentials) has always appealed to some and appalled others attracted to intricacy and complexity.  The German architect Professor Heinrich Tessenow (1876-1950) summed it up in the phrase for which he’s remembered more than his buildings: “The simplest form is not always the best, but the best is always simple”, one of those epigrams which may not reveal a universal truth but is probably a useful thing to remind students of this and that, lest they be seduced by the process and lose sight of the goal.  Tessenow was expanding on the principle of Occam's Razor (the reductionist philosophic position attributed to English Franciscan friar & theologian William of Ockham (circa 1288–1347)), written usually as Entia non sunt multiplicanda praeter necessitatem (literally “entities must not be multiplied beyond necessity”), which translates best as “the simplest solution is usually the best”.

Reductio in extrema

1960 Lotus Elite Series 1 (left) and, at the Le Mans 24 Hour endurance classic in June 1959 (right), Lotus Elite #41 leading Ferrari 250TR #14.  The Ferrari (DNF) retired after overheating, the Elite finishing eighth overall and winning the 1.5 litre GT class.

Weighing a mere 500-odd kg (1100 lb), the early versions of the exquisite Lotus Elite (1957-1963) enchanted most who drove them but the extent of the reductionism compromised the structural integrity and things sometimes broke when used under everyday conditions, which of course include potholed roads.  Introduced late in 1961, the Series 2 Elite greatly improved matters but some residual fragility was inherent in the design.  On the smooth surfaces of racing circuits however, it enjoyed an illustrious career, notable especially for success in long-distance events at the Nürburgring and Le Mans.  The combination of light weight and advanced aerodynamics meant the surprisingly powerful engine (a lightweight and robust unit which began life powering the water pumps of fire engines!) delivered outstanding performance, frugal fuel consumption and low tyre wear.  As well as claiming five class trophies in the Le Mans 24 hour race, the Elite twice won the mysterious Indice de performance (an index of thermal efficiency), a curious piece of mathematics actually intended to ensure that, regardless of other results, a French car would always win something.

Colin Chapman (1928–1982), who in 1952 founded Lotus Cars, applied reductionism even to the Tessenow mantra in his design philosophy: “Simplify, then add lightness.”  Whether at the drawing board, on the factory floor or on the racetrack, Chapman seldom deviated from his rule and while it lent his cars sparkling performance and delightful characteristics, more than one of the early models displayed an infamous fragility.  Chapman died of a heart attack, which was a good career move given the likely legal consequences of his involvement with John DeLorean (1925–2005) and the curious financial arrangements made with OPM (other people's money) during the strange episode which was the tale of the DMC DeLorean gullwing coupé.

1929 Mercedes-Benz SSKL blueprint (recreation, left) and the SSKL “streamliner”, AVUS, Berlin, May 1932 (right).

The Mercedes-Benz SSKL was one of the last of the road cars which could win top-line grand prix races.  An evolution of the earlier S, SS and SSK, the SSKL (Super Sports Kurz (short) Leicht (light)) was notable for the extensive drilling of its chassis frame to the point where it was compared to Swiss cheese, reducing weight with no loss of strength.  The SSK had enjoyed success in competition but even in its heyday was in some ways antiquated and although powerful, was very heavy, thus the expedient of the chassis-drilling, intended to make it competitive for another season.  Lighter (which didn't solve but at least to a degree ameliorated the high tyre wear) and easier to handle than the SSK (although the higher speed brought its own problems, notably in braking), the SSKL enjoyed a long Indian summer and even on tighter circuits where its bulk meant it could be out-manoeuvred, sometimes it still prevailed by virtue of sheer power.  By 1932 however the engine’s potential had been reached and no more metal could be removed from the structure without dangerously compromising safety; in engineering (and other fields), there is a point at which further reduction becomes at least counter-productive and often dangerous.  The solution was an early exercise in aerodynamics (“streamlining” then the fashionable term), an aluminium skin prepared for the 1932 race held on Berlin’s AVUS (Automobil-Versuchs- und Übungsstraße (automobile test and practice road)).  The reduction in air resistance permitted the thing to touch 255 km/h (158 mph), some 20 km/h (12 mph) more than a standard SSKL, an increase the engineers calculated would otherwise have demanded another (unobtainable) 120 horsepower.  The extra speed was most useful at the unique AVUS, which comprised two straights (each almost six miles (ten kilometres) in length) linked by two hairpin curves, one of which was later rebuilt as a dramatically banked turn.  The SSKL was the last of the breed, the factory’s subsequent Grand Prix machines all being specialized racing cars.
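To give a sense of why the streamlining was worth so much, here is a back-of-envelope sketch using assumed figures (nothing from the factory's records): at racing speeds aerodynamic drag dominates, and the power needed to overcome it rises roughly with the cube of speed, so a modest gain in top speed demands a disproportionately large increase in power if sought from the engine alone; lowering drag is the cheaper path to the same result.

# Back-of-envelope sketch: power needed to overcome aerodynamic drag
# scales with the cube of speed (P_drag = 0.5 * rho * Cd * A * v^3).
# All figures below are illustrative assumptions, not factory data.

RHO = 1.225     # air density, kg/m^3 (sea level, assumed)
CD_A = 1.0      # drag area Cd*A in m^2 for an upright 1930s racer (assumed)

def drag_power_kw(v_kmh: float, cd_a: float = CD_A) -> float:
    """Power (kW) needed to overcome aerodynamic drag at speed v_kmh."""
    v = v_kmh / 3.6                      # convert km/h to m/s
    return 0.5 * RHO * cd_a * v ** 3 / 1000.0

baseline = drag_power_kw(235)            # standard car, ~235 km/h (assumed)
same_body_at_255 = drag_power_kw(255)    # streamliner's ~255 km/h, unchanged body

# With an unchanged body, reaching 255 km/h needs roughly (255/235)**3 = 1.28
# times the drag power; streamlining instead lowers Cd*A so the existing
# engine can supply the difference.
print(f"Drag power at 235 km/h: {baseline:.0f} kW")
print(f"Drag power at 255 km/h (same body): {same_body_at_255:.0f} kW")
print(f"Ratio: {same_body_at_255 / baseline:.2f}")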

Reduction gears: Known casually as "speed reducers", reduction gears are widely used in just about every type of motor and many other mechanical devices.  What they do is allow the power of a rotating shaft to be transferred to another shaft running at a reduced speed, achieved usually by the use of gears (cogs) of different diameters; the trade-off is that as the speed of rotation falls, the torque delivered (losses aside) is correspondingly multiplied.
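A minimal sketch of the arithmetic (the figures below are purely illustrative): with an ideal gear pair, output speed falls and output torque rises by the same ratio, namely the number of teeth on the driven gear divided by the number on the driving gear.

# Minimal sketch of reduction-gear arithmetic; figures are illustrative only.
# With ideal (lossless) gearing, output speed falls and output torque rises
# by the same ratio: teeth_driven / teeth_driving.

def reduce_speed(input_rpm: float, input_torque_nm: float,
                 teeth_driving: int, teeth_driven: int) -> tuple[float, float]:
    """Return (output_rpm, output_torque_nm) for a single gear pair."""
    ratio = teeth_driven / teeth_driving          # e.g. 60/15 = 4:1 reduction
    return input_rpm / ratio, input_torque_nm * ratio

rpm_out, torque_out = reduce_speed(input_rpm=3000, input_torque_nm=10,
                                   teeth_driving=15, teeth_driven=60)
print(f"Output: {rpm_out:.0f} rpm, {torque_out:.0f} Nm")   # 750 rpm, 40 Nm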

In chemistry, a reduction is the process or result of reducing (a reaction in which electrons are gained and valence is reduced, often by the removal of oxygen or the addition of hydrogen) and as an example, if an iron atom (valence +3) gains an electron, the valence decreases to +2.  Linguistically, it’s obviously counterintuitive to imagine a “reduced” atom is one which gains rather than loses electrons but the term in this context dates from the early days of modern chemistry, when reduction (and its counterpart, “oxidation”) were coined to describe reactions in which one substance lost an oxygen atom and another gained it.  In a reaction such as that between two molecules of hydrogen (2H2) and one of oxygen (O2) combining to produce two molecules of water (2H2O), the hydrogen atoms gain oxygen and are said to have been “oxidized”, while in the original sense of the word it was the substance which gave up its oxygen (a metal ore smelted down to the bare metal, for instance) which was said to have been “reduced”.  Chemically however, in the process of gaining an oxygen atom, the hydrogen atoms have had to give up their electrons and share them with the oxygen atoms, while the oxygen atoms have gained electrons, thus the seeming paradox that the “reduced” oxygen has in fact gained something, namely electrons.
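Set out as textbook half-reactions (added here purely for illustration; the notation assumes the amsmath package), the electron bookkeeping makes the modern convention explicit: oxidation is loss of electrons, reduction is gain.

% Half-reactions for the formation of water; standard textbook convention.
\begin{align*}
\text{Oxidation:} \quad & 2\,\mathrm{H_2} \rightarrow 4\,\mathrm{H^+} + 4\,e^- \\
\text{Reduction:} \quad & \mathrm{O_2} + 4\,e^- \rightarrow 2\,\mathrm{O^{2-}} \\
\text{Overall:}   \quad & 2\,\mathrm{H_2} + \mathrm{O_2} \rightarrow 2\,\mathrm{H_2O} \\[4pt]
\text{Similarly:} \quad & \mathrm{Fe^{3+}} + e^- \rightarrow \mathrm{Fe^{2+}}
\quad \text{(the iron is reduced, valence } +3 \rightarrow +2\text{)}
\end{align*}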

Secretary of Defense, the younger (left) and the elder (right): Donald Rumsfeld with Gerald Ford (1913–2006; US president 1974-1977) and George W Bush (George XLIII, b 1946; US president 2001-2009).

Donald Rumsfeld (1932–2021; US Secretary of Defense 1975-1977 & 2001-2006) may or may not have been evil but his mind could sparkle and his marvellously reductionist principles can be helpful.  His classification of knowledge was often derided but it remains a useful framework:

(1) Known unknowns.
(2) Known knowns.
(3) Unknown unknowns.
(4) (most intriguingly) Unknown knowns.

An expert reductionist, he reminded us also there are only three possible answers to any question and, while there's a cultural reluctance to say “don’t know”, sometimes it is best:

(1) I know and I’m going to tell you.
(2) I know and I’m not going to tell you.
(3) Don’t know.

While (1) known unknowns, (2) known knowns and (3) unknown unknowns are self-explanatory, examples of (4) unknown knowns are of interest and a classic one was the first “modern” submarine, developed by the Germans during the last months of World War II (1939-1945).

German Type XXI Elektroboot (1945).

The course of the war could have been very different had OKM (Oberkommando der Marine, the Kriegsmarine's (German Navy) high command) followed the advice of the commander of the submarine arm and made available a fleet of 300 boats, rather than building a surface fleet which wasn’t large enough to be a strategic threat but was of sufficient size to absorb resources which, if devoted to submarines, could have been militarily effective.  With a fleet of 300, it would have been possible permanently to maintain around 100 at sea but at the outbreak of hostilities only 57 active boats were on the navy’s list, not all of which were suitable for operations on the high seas, so in the early days of the conflict it was rare for the Germans to have more than 12 committed to battle in the Atlantic.  Production never reached the levels necessary for the numbers to achieve critical mass but even so, in the first two or three years of the war the losses sustained by the British were considerable and the “U-Boat menace” was such a threat that much attention was devoted to counter-measures; by 1943 the Allies could consider the Battle of the Atlantic won.  The Germans’ other mistake was not building a true submarine, one capable of operating underwater (and therefore undetected) for days at a time.

It was only in 1945, when Albert Speer (1905–1981; Nazi court architect 1934-1942; Nazi minister of armaments and war production 1942-1945) and Grand Admiral Karl Dönitz (1891–1980; head of the German Navy 1943-1945, German head of state 1945) were assessing the “revolutionary” new design, that they concluded there was no reason why such craft couldn’t have been built in the late 1930s because the engineering capacity and technology existed even then (although the industrial and labor resources did not).  It was a classic case of what Donald Rumsfeld would later call an “unknown known”: the Germans in 1939 knew how to build a modern submarine but didn’t “know that they knew”.  Despite the improvements however, military analysts have concluded that such was the strength of the forces arrayed against Nazi Germany by 1945 that even such boats, deployed in numbers, could not have been enough to turn the tide of war.  Had the German navy in 1939-1940 had available a fleet of even 100 such submarines (about a third of what OKM calculated was the critical strategic size, given that at any point only a third would be at sea, the others either in transit or docked), the battle in the Atlantic would have been much more difficult for the British.

Mr Rumsfeld however neglected to mention another class of knowledge: the “lost known”, examples of which have from time to time appeared and there may be more still to be discovered.  The best known were associated with the knowledge lost after the fall of the Western Roman Empire in the fifth century, when Europe entered the early medieval period, once popularly known as the “Dark Ages”.  The lost knowns included aspects of optics such as lens grinding; the orthodoxy long was that the knowledge was not “re-discovered” or “re-invented” until perfected in Italy during the late thirteenth century, although it’s now understood that in the Islamic world lenses continued to be ground throughout the medieval period and it’s suspected it was from Arabic texts the information reached Europe.

What really was a lost known was how the Romans of Antiquity made their long-lasting opus caementicium (concrete) so famously “sticky” and resistant to the effects of salt water.  Unlike modern concrete, made using Portland cement & water, Roman concrete was a mix of volcanic ash & lime, mixed with seawater, the latter ingredient inducing a chemical reaction which created a substance both stronger and more durable.  When combined with volcanic rocks, it formed crystalline mineral structures called aluminous tobermorite (Al-tobermorite) which lent the mix great strength and resistance to cracking.  After the fall of Rome the knowledge was lost and even when a tablet was discovered listing the mix ratios, the caementicium couldn’t be replicated because the recipe spoke only of “water” and not “sea water”, presumably because that was common knowledge.  It was only modern techniques of analysis which allowed the “lost known” again to become a “known known”.