Thursday, October 14, 2021

Bubble

Bubble (pronounced buhb-uhl)

(1) A spherical globule of gas (or vacuum) contained in a liquid or solid.

(2) Anything that lacks firmness, substance, or permanence; an illusion or delusion.

(3) An inflated speculation, especially if fraudulent.

(4) The act or sound of bubbling.

(5) A spherical or nearly spherical canopy or shelter; dome.

(6) To form, produce, or release bubbles; effervesce.

(7) To flow or spout with a gurgling noise; gurgle.

(8) To speak, move, issue forth, or exist in a lively, sparkling manner; exude cheer.

(9) To seethe or stir, as with excitement; to boil.

(10) To cheat; deceive; swindle (archaic).

(11) To cry (archaic Scots).

(12) A type of skirt.

(13) In infection control management, a system of physical isolation in which uninfected subsets of a population are protected by restricting their exposure to others.

1350-1400: From the Middle English noun bobel which may have been from the Middle Dutch bubbel & bobbel and/or the Low German bubbel (bubble) and Middle Low German verb bubbele, all thought to be of echoic origin.  The related forms include the Swedish bubbla (bubble), the Danish boble (bubble) and the Dutch bobbel.  The use to describe markets, inflated in value by speculation wildly beyond any relationship to their intrinsic value, dates from the South Sea Bubble which began circa 1711 and collapsed in 1720.  While the bubble was still inflating, parliament passed The Bubble Act (1720), which required anyone seeking to float a joint-stock company first to secure a royal charter.  Interestingly, the act was supported by the South Sea Company, which failed only months after its passage.  Ever since cryptocurrencies emerged, many have been describing them as a bubble which will burst and while that has happened with particular coins (the exchange collapses are something different), the industry thus far has continued with only the occasional period of deflation.  Bubble & bubbling are nouns & verbs, bubbler is a noun, bubbled is a verb, bubbly is a noun & adjective, bubbleless & bubblelike are adjectives and bubblingly is an adverb; the noun plural is bubbles.

An artificial tulip in elisa mauve.

However although the South Sea affair was the first use of “bubble” to describe such a market condition, it wasn’t the first instance of a bubble, that distinction usually accorded the Dutch tulpenmanie (tulip mania) which blossomed during the 1630s, contract prices for some bulbs of the recently introduced and wildly fashionable tulip reaching extraordinarily high levels, the values accelerating from 1634 until a sudden collapse in 1637.  Apparently just a thing explained by a classic supply and demand curve, the tulip bubble burst with the first big harvest which demonstrated the bulbs and flowers were really quite common.  In history, there would have been many previous bubbles but it wasn’t until the economies and financial systems of early-modern Europe were operating that the technical conditions existed for them to manifest in the form and to the extent we now understand.  Interestingly, for something often regarded as the proto-speculative asset bubble and a landmark in economic history, twentieth-century revisionist historians have suggested it was more a behavioral phenomenon than anything with any great influence on the operation of financial markets or the real economy, the “economic golden age” of the Dutch Republic apparently continuing unaffected for almost a century after the bottom fell out of the tulip market.  The figurative uses have been created or emerged as required, the first reference to anything wanting firmness, substance, or permanence is from the 1590s.  The soap-bubble dates from 1800, bubble-shell is from 1847, bubble-gum was introduced in 1935 and bubble-bath appears first to have been sold in 1937.  The slang noun variation “bubbly” was first noted in 1920, an invention of US English.

Use of the word "bubble" spiked shortly after the start of the Covid-19 pandemic.  Over time, use has expanded to encompass large-scale operations like touring sporting teams and even the geographical spaces used for the 2022 Beijing Winter Olympics but the original meaning was more modest: small groups based on close friends, an extended family or co-workers.  These small bubbles weren't supposed to be too elastic and operated in conjunction with other limits imposed in various jurisdictions; a bubble might consist of a dozen people but a local authority might limit gatherings to ten in the one physical space so two could miss out, depending on the details in the local rules.  Bubble thus began as an unofficial term used to describe the cluster of people outside the household with whom one felt comfortable in an age of pandemic.

Tulips

Bubbles were however a means of risk-reduction, not a form of quarantine.  The risks in a bubble still exist, most obviously because some may belong to more than one bubble, contact thus having a multiplier effect, the greater the number of interactions, the greater the odds of infection.  Staying home and limiting physical contact with others remained preferable, the next best thing to an actual quarantine.  The more rigorously administered bubbles used for events like the Olympics are essentially exercises in perimeter control, a defined "clean" area, entry into which is restricted to those tested and found uninfected.  At the scale of something like an Olympic games, it's a massive undertaking to secure the edges but, given sufficient resource allocation, it can be done although it's probably misleading to speak of such an operation as a "bubble".  Applied to the static spaces of Olympic venues, they're really quarantine-zones.  Bubble more correctly describes touring sporting teams which move as isolated units, often through unregulated space.

The Bubble Skirt

A type of short skirt with a balloon style silhouette, the bubble dress (more accurately described as a bubble skirt because that’s the bit to which the description applies) is characterized by a voluminous skirt with the hem folded back on itself to create a “bubble” effect at the hemline.  Within the industry, it was initially called a tulip skirt, apparently because of a vague resemblance to the flower but the public preferred bubble.  It shouldn’t be confused with the modern tulip skirt and the tulip-bubble thing is just a linguistic coincidence; there’s no link with the Dutch tulipmania of the 1630s.  Stylistically, the bubble design is a borrowing from the nineteenth century bouffant gown which featured a silhouette made of a wide, full skirt resembling a hoop skirt, sometimes with a hoop or petticoat support underneath the skirt.  While bouffant gowns could be tea (mid-calf) or floor length, bubble skirts truncate the look; hemlines tend to be well above the knee.  Perhaps with a little more geometric accuracy, the design is known also as the “puffball” and, in an allusion to oriental imagery, the “harem” skirt.  Fashion designer Christian Lacroix (b 1951) became fond of the look and a variation included in his debut collection was dubbed le pouf but, in English, the idea of the “poof skirt” never caught on.

Lindsay Lohan in Catherine Malandrino silk pintuck dress with bubble skirt, LG Scarlet HDTV Launch Party, Pacific Design Center, Los Angeles, April 2008.

It must have been a memorable silhouette in the still austere post-war world, a sheath dress made voluminous with layers of organza or tulle, the result a cocoon-like dress with which Pierre Cardin (1922-2020) and Hubert de Givenchy (1927-2018) experimented in 1954 and 1958, respectively.  A year later, Yves Saint Laurent (1936-2008) for Dior added the combination of a dropped waist dress and bubble skirt; post-modernism had arrived.  For dressmakers, bubble fashion presented a structural challenge and mass-production became economically feasible only because of advances in material engineering, newly available plastics able to be molded in a way that made possible the unique inner construction and iconic drape of the fabric.  For that effect to work, bubble skirts must be made with a soft, pliable fabric and the catwalk originals were constructed from silk, as are many of the high-end articles available today but mass-market copies are usually rendered from cotton, polyester knits, satin or taffeta.

The bubble in the 1950s by Pierre Cardin (left), Givenchy (centre) & Dior (right).

The bubble skirt was never a staple of the industry in the sense that it would be missing from annual or seasonal ranges, sometimes for a decade or more and sales were never high, hardly surprising given it was not often a flattering look for women above a certain age, probably about seven or eight.  Deconstructing the style hints at why: a hemline which loops around and comes back up, created sometimes by including a tighter bottom half with the bulk of additional material above, it formed a shape not dissimilar to a pillow midway through losing its stuffing.  For that reason, models caution the look is best when combined with a sleek, fitted top to emphasize the slimness of the waistline, cinched if necessary with a belt or some sort of delineating tie.  The bubble needs to be the feature piece too, avoiding details or accessories which might otherwise distract; if one is wearing a partially un-stuffed pillow, the point needs to be made it’s being done on purpose.

The bubble is adaptable although just because something can be done doesn’t mean it should be done.  The bubble skirt has however received the Paris Hilton (b 1981) imprimatur so there’s that.

TikTok and Instagram influencer Ella Cervetto (b 2000) in Oh Polly Jessamy (an off-shoulder layered bubble hem corset mini dress) in True Red (available also in Ivory), Sydney, Australia, November 2024.

On the catwalks however, again seemingly every decade or so, the bubble returns, the industry relying on the short attention span of consumers of pop culture to induce a collective amnesia which allows many a resuscitation in tailoring to seem vaguely original.  Still, if ever a good case could be made for a take on a whimsical 1950s creation to re-appear, it was the staging of the first shows of the 2020-2021 post-pandemic world and the houses responded, Louis Vuitton, Erdem, Simone Rocha and JW Anderson all with billowy offerings; even Burberry was seen with an improbably exuberant flourish of volume.  What appeared on the post-Covid catwalk seemed less disciplined than the post-war originals, the precise constraints of intricately stitched tulle forsaken to permit a little more swish and flow, a romantic rather than decadent look.  The reception was generally polite but for those who hoped for a different interpretation, history suggests the bubble will be back in a dozen-odd years.

Wednesday, October 13, 2021

Disheveled

Disheveled (pronounced dih-shev-uhld)

(1) Hanging loosely or in disorder; unkempt.

(2) Untidy in appearance; disarranged.

1375–1425: From the Late Middle English discheveled (without dressed hair), replacing the earlier form dishevely which ran in parallel with dischevele (bare-headed), from the Old French deschevelé (bare-headed, with shaven head), past-participle adjective from descheveler (to disarrange the hair), the construct being des- (apart (the prefix indicating negation of a verb)) + -cheveler (derivative of chevel (hair; a hair) (cheveu in Modern French)), from the Latin capillus (a diminutive form from the root of caput (head)), thought perhaps cognate with the Persian کوپله‎ (kūple) (hair of the head).  The Modern French forms are déchevelé & échevelé.  As applied to the hair itself in the sense of “hanging loose and thrown about in disorder, having a disordered or neglected appearance”, use dates from the mid-fifteenth century while the general sense of “with disordered dress” emerged around the turn of the seventeenth.  The verb dishevel is interesting in that it came centuries later; a back formation from disheveled, used to mean “to loosen and throw about in disorder, cause to have a disordered or neglected appearance”, it applied first to the hair in the 1590s and later to clothing and other aspects of appearance.  Synonyms include messy, scraggly, tousled, unkempt, untidy, crumpled, slovenly and sloppy.  The alternative spelling is dishevelled.  Disheveled is a verb & adjective, dishevelment is a noun and dishevelledly is an adverb.

Instances of dishevelment can be caused by (1) prevailing wind conditions, (2) a stylist preparing an actor or model or (3) other causes.  Lindsay Lohan in Confessions of a Teenage Drama Queen (2004, left) illustrates the stylist's craft while the other states of disarray (centre & right) would have been induced by "other causes".  Stylists preparing models for static shoots sometimes use remarkably simple tricks and equipment, hair held in a wind-blown look using nothing more than strips of cardboard, bulldog clips and some strategically placed scotch tape.  It takes less time and produces a more natural result than post-production digital editing.     

Donald Trump (b 1946; US president 2017-2021) seems prone to dishevelment in conditions above 2 on the Beaufort scale.  For perfectionists, the comparative form is "more disheveled" and the superlative "most disheveled".

Lindsay Lohan and her lawyer in court, Los Angeles, January 2012.

Hair apparent: Boris Johnson (b 1964; UK prime-minister 2019-2022) was known to have "weaponized" his hair as part of his image as (1) a toff who didn't care and (2) an English eccentric.  However just as Dolly Parton (b 1946) revealed that "it takes a lot of money to look this cheap", Mr Johnson's studied untidiness took a bit of work to maintain and credit must rightly be accorded to Ms Kelly Jo Dodge MBE.

Corruption is probably a permanent part of politics although it does ebb and flow and exists in different forms in different places.  In the UK, the honours system with its intricate hierarchy and consequent determination of one’s place in the pecking order under the Order of Precedence has real world consequences such as determining whether one sits at dinners with the eldest son of a duke or finds one’s self relegated to a table with the surviving wife of a deceased baronet.  Under some prime-ministers the system was famously corrupt and while things improved in the nineteenth century, under David Lloyd George (1863–1945; UK prime-minister 1916-1922) honours were effectively for sale in a truly scandalous way.  None of his successors were anywhere near as bad although Harold Wilson’s (1916–1995; UK prime minister 1964-1970 & 1974-1976) resignation honours list attracted much comment and did his reputation no good but in recent years it’s been relatively quiet on the honours front.  That was until the resignation list of Boris Johnson was published.  It included some names unknown to all but a handful of political insiders and many others which were controversial for their own reasons but at the bottom of the list was one entry which all agreed was well deserved: Ms Kelly Jo Dodge, for 27 years the parliamentary hairdresser, was created a Member of the Most Excellent Order of the British Empire (MBE).  In those decades, she can have faced few challenges more onerous than Boris Johnson’s hair yet never once failed to make it an extraordinary example of the (actually technically difficult) “not one hair in place” style.  The citation on her award read "for parliamentary service" but insiders all knew it really was for "services to dishevelment".

Tuesday, October 12, 2021

Gap

Gap (pronounced gap)

(1) A break or opening, as in a fence, wall, or military line; breach; an opening that implies a breach or defect (vacancy, deficit, absence, or lack).

(2) An empty space or interval; interruption in continuity; hiatus.

(3) A wide divergence or difference; disparity.

(4) A difference or disparity in attitudes, perceptions, character, or development, or a lack of confidence or understanding, perceived as creating a problem.

(5) A deep, sloping ravine or cleft through a mountain ridge.

(6) In regional use (in most of the English-speaking world and especially prominent in the US), a mountain pass, gorge, ravine, valley or similar geographical feature (also in some places used of a sheltered area of coast between two cliffs and often applied in locality names).

(7) In aeronautics, the distance between one supporting surface of an airplane and another above or below it.

(8) In electronics, a break in a magnetic circuit that increases the inductance and saturation point of the circuit.

(9) In various field sports (baseball, cricket, the football codes etc), those spaces between players which afford some opportunity to the opposition.

(10) In genetics, an un-sequenced region in a sequence alignment.

(11) In slang (New Zealand), suddenly to depart.

(12) To make a gap, opening, or breach in.

(13) To come open or apart; form or show a gap.

1350–1400: From the Middle English gap & gappe (an opening in a wall or hedge; a break, a breach), from Old Norse gap (gap, empty space, chasm) akin to the Old Norse gapa (to open the mouth wide; to gape; to scream), from the Proto-Germanic gapōną, from the primitive Indo-European root ghieh (to open wide; to yawn, gape, be wide open) and related to the Middle Dutch & Dutch gapen, the German gaffen (to gape, stare), the Danish gab (an expanse, space, gap; open mouth, opening), the Swedish gap & gapa and the Old English ġeap (open space, expanse).  Synonyms for gap can include pause, interstice, break, interlude, lull but probably not lacuna (which is associated specifically with holes).  Gap is a noun & verb, gapped & gapping are verbs, gapless & gappy are adjectives; the noun plural is gaps.

Lindsay Lohan demonstrates a startled gape, MTV Movie-Awards, Gibson Amphitheatre, Universal City, California, June 2010.

The use to describe natural geographical formations (“a break or opening between mountains” which later extended to “an unfilled space or interval, any hiatus or interruption”) emerged in the late fifteenth century and became prevalent in the US, used of deep breaks or passes in a long mountain chain (especially one through which a waterway flows) and often used in locality names.  The use as a transitive verb (to make gaps; to gap) evolved from the noun and became common in the early nineteenth century as the phrases became part of the jargon of mechanical engineering and metalworking (although in oral use the forms may long have existed).  The intransitive verb (to have gaps) is documented only since 1948.  The verb gape dates from the early thirteenth century and may be from the Old English ġeap (open space, expanse) but most etymologists seem to prefer a link with the Old Norse gapa (to open the mouth wide; to gape; to scream); it was long a favorite way of alluding to the expressions thought stereotypical of “idle curiosity, listlessness, or ignorant wonder of bumpkins and other rustics” and is synonymous with “slack-jawed yokels”.  The adjective gappy (full of gaps; inclined to be susceptible to gaps opening) dates from 1846.  The adjectival use gap-toothed (having teeth set wide apart) has been in use since at least the 1570s, but earlier, Geoffrey Chaucer (circa 1344-1400) had used “gat-toothed” for the same purpose, gat from the Middle English noun gat (opening, passage) from the Old Norse gat and cognate with gate.

Lindsay Lohan demonstrates her admirable thigh gap, November 2013.

The “thigh gap” seems first to have been documented in 2012 but gained critical mass on the internet in 2014 when it became one of those short-lived social phenomena which produce a minor moral panic.  “Thigh gap” described the empty space between the inner thighs of a woman when standing upright with feet touching; a gap was said to be good and the lack of a gap bad.  Feminist criticism noted it was not an attribute enjoyed by a majority of mature human females and it thus constituted just another of the “beauty standards” imposed on women which were an unrealizable goal for the majority.  The pro-ana community ignored this critique and thinspiration (thinspo) bloggers quickly added annotated images and made the thigh gap an essential aspect of female physical attractiveness.

A walking, talking credibility gap: crooked Hillary Clinton (b 1947; US secretary of state 2009-2013).

In English, gap has been prolific in the creation of phrases & expressions.  The “generation gap” sounds modern and as a phrase it came into wide use only in the 1960s in reaction to the twin constructs of “teenagers” and the “counter-culture” but the concept has been documented since antiquity and refers to a disconnect between youth and those older, based on different standards of behavior, dress, artistic taste and social mores.  The term “technology gap” was created in the early 1960s and was from economics, describing the various implications of a nation’s economy gaining a competitive advantage over others by the creation or adoption of certain technologies.  However, the concept was familiar to militaries which had long sought to quantify and rectify any specific disadvantage in personnel, planning or materiel they might suffer compared to their adversaries; these instances are described in terms like “missile gap”, “air gap”, “bomber gap”, “megaton gap” et al (and when used of materiel the general term “technology deficit” is also used).  Rearmament is the usual approach but there can also be “stop gap” solutions which are temporary (often called “quick & dirty” (Q&D)) fixes which address an immediate crisis without curing the structural problem.  For a permanent (something often illusory in military matters) remedy for a deficiency, one is said to “bridge the gap”, “gap-fill” or “close the gap”.  The phrase “stop gap” in the sense of “that which fills a hiatus, an expedient in an emergency” appears to date from the 1680s and may have been first a military term referring to a need urgently to “plug a gap” in a defensive line, “gap” used by armies in this sense since the 1540s.  The use as an adjective dates from the same time in the sense of “filling a gap or pause”.  A “credibility gap” is a discrepancy between what’s presented as reality and a perception of what reality actually is; it’s applied especially to the statements of those in authority (politicians like crooked Hillary Clinton being the classic but not the only example).  “Pay gap” & “gender gap” are companion terms used most often in labor-market economics to describe the differences in aggregate or sectoral participation and income levels between a baseline group (usually white men) and others who appear disadvantaged.

“Gap theorists” (known also as “gap creationists”) are those who claim the account of the Earth and all who inhabit the place being created in six 24-hour days (as described in the Book of Genesis in the Bible’s Old Testament) literally is true but that there was a gap of time between the two distinct creations in the first and the second verses of Genesis.  What this allows is a rationalization of modern scientific observation and analysis of physical materials which have determined the age of the planet.  This hypothesis can also be used to illustrate the use of the phrase “credibility gap”.  In Australia, gap is often used to refer to the (increasingly large) shortfall between the amount health insurance funds will pay compared with what the health industry actually charges; the difference, paid by the consumer (doctors still insist on calling them patients), is the gap (also called the “gap fee”).  In Australia, the term “the gap” has become embedded in the political lexicon to refer to the disparity in outcomes between the indigenous and non-indigenous communities in fields such as life expectancy, education, health, employment, incarceration rates etc.  By convention, it can be used only to refer to the metrics which show institutional disadvantage but not other measures where the differences are also striking (smoking rates, crime rates, prevalence of domestic violence, drug & alcohol abuse etc) and it’s thus inherently political.  Programmes have been designed and implemented with the object of “closing the gap”; the results have been mixed.

Opinion remains divided on the use of platinum-tipped spark plugs in the Mercedes-Benz M100 (6.3 & 6.9) V8.

A “spark gap” is the space between two conducting electrodes, filled usually with air (or in specialized applications some other gas) and designed to allow an electric spark to pass between the two.  One of the best known spark gaps is that in the spark (or sparking) plug which provides the point of ignition for the fuel-air mixture in internal combustion engines (ICE).  Advances in technology mean fewer today are familiar with the intricacies of spark plugs, once a familiar (and often an unwelcome) sight to many.  The gap in a spark plug is the distance between the center and ground electrode (at the tip) and the size of the gap is crucial in the efficient operation of an ICE.  The gap size, although the differences would be imperceptible to most, is not arbitrary and is determined by the interplay of the specifications of the engine and the ignition system including (1) the compression ratio (low compression units often need a larger gap to ensure a larger spark is generated), (2) the ignition system, high-energy systems usually working better with a larger gap, (3) the materials used in the plug’s construction (the most critical variable being their heat tolerance); because copper, platinum, and iridium are used variously, different gaps are specified to reflect the variations in thermal conductivity and the temperature range able to be endured and (4) application, high performance engines or those used in competition involving sustained high-speed operation often using larger gaps to ensure a stronger and larger spark.
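Purely as a hedged illustration (the baseline figure and every adjustment below are hypothetical placeholders, not manufacturer specifications; a real gap is always set to the engine maker's published figure), the interplay of those four factors can be sketched as a baseline nudged by each consideration:

```python
# Toy heuristic mirroring the four factors described above; all figures
# are hypothetical placeholders, not specifications from any manufacturer.
BASELINE_GAP_MM = 0.8

def suggested_gap_mm(low_compression: bool, high_energy_ignition: bool,
                     electrode: str, competition_use: bool) -> float:
    """Return an illustrative spark plug gap in millimetres."""
    gap = BASELINE_GAP_MM
    if low_compression:
        gap += 0.1   # low compression: a larger gap to generate a larger spark
    if high_energy_ignition:
        gap += 0.1   # high-energy systems usually work better with a larger gap
    # electrode material: reflects differing thermal conductivity & heat tolerance
    gap += {"copper": 0.0, "platinum": 0.05, "iridium": 0.05}.get(electrode, 0.0)
    if competition_use:
        gap += 0.1   # sustained high-speed running favours a stronger, larger spark
    return round(gap, 2)

print(suggested_gap_mm(low_compression=False, high_energy_ignition=True,
                       electrode="iridium", competition_use=False))  # 0.95
```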

Kennedy, Khrushchev and the missile gap

The “missile gap” was one of the most discussed threads in the campaign run by the Democratic Party’s John Kennedy (JFK, 1917–1963; US president 1961-1963) in the 1960 US presidential election in which his opponent was the Republican Richard Nixon (1913-1994; US president 1969-1974).  The idea there was a “missile gap” was based on a combination of Soviet misinformation, a precautionary attitude by military analysts in which the statistical technique of extrapolation was applied on the basis of a “worst case scenario” and blatant empire building by the US military, notably the air force (USAF), anxious not to surrender to the navy their pre-eminence in the hierarchy of nuclear weapons delivery systems.  It’s true there was at the time a missile gap but it was massively in favor of the US which possessed several dozen inter-continental ballistic missiles (ICBM) while the USSR had either four or six, depending on the definition used.  President Dwight Eisenhower (1890-1969; US president 1953-1961), a five-star general well acquainted with the intrigues of the military top brass, was always sceptical about the claims and had arranged the spy flights which confirmed the real count but was constrained from making the information public because of the need to conceal his source of intelligence.  Kennedy may actually have known his claim was incorrect but, finding it resonated with the electorate, continued to include it in his campaigning, knowing the plausibility was enhanced in a country where people were still shocked by the USSR having in 1957 launched Sputnik I, the first ever earth-orbiting satellite.  Sputnik had appeared to expose a vast gap between the scientific capabilities of the two countries, especially in the matter of big missiles. 

President Kennedy & comrade Khrushchev at their unproductive summit meeting, Vienna, June 1961.

Fake gaps in such matters were actually nothing new.  Some years earlier, before there were ICBMs so in any nuclear war the two sides would have had to use aircraft to drop bombs on each other (à la Hiroshima & Nagasaki in 1945), there’d been a political furore about the claim the US suffered a “bomber gap” and would thus be unable adequately to respond to any attack.  In truth, by a simple sleight of hand little different to that used by Nazi Germany in 1935 to convince worried British politicians that the Luftwaffe (the German air force) was already as strong as the Royal Air Force (RAF), Moscow had greatly inflated the numbers and stated capability of their strategic bombers, a perception concerned US politicians were anxious to believe.  The USAF would of course be the recipient of the funds needed to build the hundreds (the US would end up building thousands) of bombers needed to equip all those squadrons and their projections of Soviet strength were higher still.  If all of this building stuff to plug non-existent gaps had happened in isolation it would have been wasteful of money and natural resources which was bad enough but this hardware made up the building blocks of nuclear strategy; the Cold War was not an abstract exercise where on both sides technicians with clipboards walked from silo to silo counting warheads.

Instead, the variety of weapons, their different modes of delivery (from land, sea, undersea and air), their degrees of accuracy and their vulnerability to counter-measures was constantly calculated to assess their utility as (1) deterrents to an attack, (2) counter-offensive weapons to respond to an attack or (3) first-strike weapons with which to stage a pre-emptive or preventative attack.  In the Pentagon, the various high commands and the burgeoning world of the think tanks, this analysis was quite an industry and it had to also factor in the impossible: working out how the Kremlin would react.  In other words, what the planners needed to do was create a nuclear force which was strong enough to deter an attack yet not seem to be such a threat that it would encourage an attack and that only scratched the surface of the possibilities; each review (and there were many) would produce detailed study documents several inches thick.

US Navy low-level spy photograph of San Cristobal medium-range ballistic missile (MRBM) site #1, Cuba, 23 October 1962.

In October 1962, during the Cuban Missile Crisis, the somewhat slimmer nuclear war manuals synthesized from those studies were being read with more interest than usual.  It was a tense situation and had Kennedy and comrade Nikita Khrushchev (1894–1971; Soviet leader 1953-1964) not agreed to a back-channel deal, the US would probably have attacked Cuba in some manner, not knowing three divisions of the Red Army were stationed there to protect the Soviet missiles and that would have been a state of armed conflict which could have turned into some sort of war.  As it was, under the deal, Khrushchev withdrew the missiles from Cuba in exchange for Kennedy’s commitment not to invade Cuba and to withdraw 15 obsolescent nuclear missiles from Turkey, the stipulation being the Turkish component must be kept secret.  That secrecy colored for years the understanding of the Cuban Missile Crisis and the role the US nuclear arsenal played in influencing the Kremlin.  The story was that the US stayed resolute, rattled the nuclear sabre and that was enough to force the Soviet withdrawal.  One not told the truth was Lyndon Johnson (LBJ, 1908–1973; US president 1963-1969) who became president after Kennedy was assassinated in 1963 and historians have attributed his attitude to negotiation during the Vietnam War to not wishing to be unfavorably compared to his predecessor who, as Dean Rusk (1909–1994; US secretary of state 1961-1969) put it, stood “eyeball to eyeball” with Khrushchev and “made him blink first”.  The existence of all those doomsday weapons would distort Soviet and US foreign policy for years to come.

Monday, October 11, 2021

Brink

Brink (pronounced bringk)

(1) The edge or margin of a steep place or of land bordering water.

(2) Any extreme edge; verge.

(3) A crucial or critical point, especially of a situation or state beyond which success or catastrophe occurs.

1250–1300: From the Middle English brink, from the Middle Dutch brinc from the Old Norse brink (steepness, shore, bank, grassy edge).  It was cognate with the Middle Low German brink (edge, hillside) and the Old Norse brekka (slope, hill).  Danish gained brink directly from the Old Norse but for most other languages the greater influence was the Proto-Germanic brenkon, probably from the primitive Indo-European bhreng-, a variant of bhren- (to project; edge), source also of the Lithuanian brinkti (to swell).  Brink is a noun and brinkless an adjective; the noun plural is brinks.

Brinkmanship

A coining from the early cold war, brinkmanship is forever associated with John Foster Dulles (1888-1959; US Secretary of State 1953-1959), the origin in the words he used in a 1956 interview with Time-Life’s Washington bureau chief James Shepley (1917-1988):

“The ability to get to the verge without getting into the war is the necessary art. If you cannot master it, you inevitably get into war. If you try to run away from it, if you are scared to go to the brink, you are lost.”

Secretary of State John Foster Dulles and President Dwight Eisenhower (1890-1969; US president 1953-1961), November 1955.

Even then, it was hardly a new notion of geopolitics and, as a strategy, doubtlessly as old as conflict itself and with some history in US political discourse, John Quincy Adams (1767-1848; US president 1825-1829) having adopted the imagery of “…the brink of war” as early as 1829.  Brinkmanship however was applied to, rather than invented by, Dulles.  It was the creation of President Eisenhower’s Democratic Party opponent in the 1952 & 1956 elections, Adlai Stevenson (1900-1965), who gave an interview some weeks after Dulles in which he disparaged the secretary of state for "boasting of his brinkmanship, the art of bringing us to the edge of the nuclear abyss."  Stevenson was borrowing from the then quite novel "-manship" words which had entered the vernacular and the word quickly caught on, the Cuban Missile Crisis (1962) often used as an exemplar of the policy in action although the revelations which later emerged about what actually transpired during those dramatic October days showed there were many more complexities at play.

Beyond the brink:  Foster Dulles' headstone, Arlington National Cemetery, Arlington, Virginia.

Born shortly after Stevenson’s interview was brinkmanship's illegitimate sister, the wholly unetymological brinksmanship, the added -s- a construction based on the earlier salesmanship, sportsmanship etc.  Invention of the facetious -manship formations is often attributed to the humorist Stephen Potter (1900-1969) who in 1947 published The Theory and Practice of Gamesmanship (or the Art of Winning Games without Actually Cheating) and in subsequent years added golfmanship and one-upmanship to his informal lexicon.  Gamesmanship had however been used and discussed by Ian Coster (1903-1955) in his autobiographic Friends in Aspic (1939) and he attributed it to the poet Sir Francis Meynell (1891-1975).  Coster used an amateur village cricket team to illustrate gamesmanship.  Because such teams typically contained only two or three competent fieldsmen, advantage could be gained by ensuring all were wearing identical clothing and, especially, headgear, thereby making it harder for the batsman to tell whether his shot was heading towards a good fieldsman or a dud.

Lindsay Lohan on the brink of a wardrobe malfunction, Miami, Florida, May 2011.

In the public imagination, brinkmanship remains the enduring encapsulation of the High Cold War and the Cuban Missile Crisis in particular, the events in the Caribbean summed up in the words of Dean Rusk (1909–1994; US secretary of state 1961-1969): “We're eyeball to eyeball, and I think the other fellow just blinked.”  That narrative at the time suited the White House (and the phalanx of Kennedy family hagiographers who shaped the truths & myths of Camelot) and the various parts of the nuclear weapons establishment (a diverse crew including the Air Force, the Navy, the Pentagon and the Defense Department, all with their own policy agendas to push), which together forged the influential idea of “calibrated brinkmanship”, an extension of the original position attributed to Dulles modified by the notion that it’s the superiority of one’s nuclear arsenal and a perception of willingness to use it which will allow one to prevail in a crisis.  It would be years before it would be revealed the crisis of 1962 unfolded rather differently but by then, the perception had done its damage.

Sunday, October 10, 2021

Diagonal

Diagonal (pronounced dahy-ag-uh-nl or di-ag-nl (both uses U & non-U))

(1) In mathematics, connecting two nonadjacent angles or vertices of a polygon or polyhedron, as a straight line.

(2) In mathematics, a set of entries in a square matrix running either from upper left to lower right (the main or principal diagonal) or lower left to upper right (the secondary diagonal).

(3) In number theory, as the broken diagonal, in the theory of magic squares, a set of n cells forming two parallel diagonal lines in the square.

(4) In linear algebra, as diagonal matrix, a matrix in which the entries outside the main diagonal are all zero.

(5) In geometry, extending from one edge of a solid figure to an opposite edge, as a plane (joining two nonadjacent vertices).

(6) In category theory, as diagonal morphism, a morphism from an object to the product of that object with itself, which morphism is induced by a pair of identity morphisms of the said object.

(7) Something with or assuming an oblique direction; having slanted or oblique lines or markings; having a slanted or oblique direction.

(8) In typography, a virgule (a slash), known also as a solidus and used in computing file systems variously as the forward slash & back slash (or slash & slosh), “diagonal” being the generalized term for the mark.

(9) In design, any line or pattern using diagonals; something put, set, or drawn obliquely.

(10) In fabrics, a cloth marked or woven with slanting lines or patterns.

(11) In manège, of a horse at a trot, the state in which the foreleg and the hind leg, diagonally opposite, move forward simultaneously.

(12) In zoological anatomy, of or related to the cater-corner (diagonally opposite) legs of a quadruped, whether the front left and back right or front right and back left.

(13) In chess, one of the oblique lines of squares on a chessboard (the mode in which a bishop may be moved).

1400s: From the Middle French diagonal, from the Latin diagōnālis, the construct being the Ancient Greek διαγώνιος (diagṓnios) (from angle to angle) + the Latin -ālis (the third-declension two-termination suffix (neuter -āle) used to form adjectives of relationship from nouns or numerals).  The construct of the Greek diagōnios was dia- + γωνία (gōnía) (angle; corner), from the primitive Indo-European root genu- (knee; angle).  The dia- prefix was from the Ancient Greek prefix δια- (dia-), from διά (diá) (through, across, by, over) and was most productive, the familiar forms including diadem, diacritical, diagnosis, diagram, diameter, dialect, dialogue & diatribe.  The adjective diagonal (implied in diagonally) (extending as a line from one angle to another not adjacent) dates from the early fifteenth century and was from the Old French diagonal, from the Latin diagonalis, from diagonus (slanting line), from the Ancient Greek diagōnios.  It emerged as a noun in the 1570s in the sense of “a straight line drawn from one angle to or through another not adjacent, in a plane or solid figure".  The specific technical meaning in chess describes "a line of squares running diagonally across a board", the mode in which a bishop may move.  Diagonal is a noun & adjective, diagonality is a noun and diagonally is an adverb; the noun plural is diagonals.

Defying the tyranny of the horizontal line: Lindsay Lohan’s hand-written notes made during one of her court appearances in Los Angeles, July 2010.  Even on the Reddit subs where exist the planet’s most unforgiving critics, most were so taken with the neatness of the lettering, the diagonality attracted barely a comment.

A diagonal measurement is defined usually by describing a line between the bottom left and the upper right corners (or vice versa) of a square or rectangle.  It has a nuanced value when used of computer monitors, televisions and such because it has to be read in conjunction with the aspect ratio of the device.  A 19 inch (monitor sizes usually expressed in inches although the French will always include a metric conversion) monitor in a 16:9 aspect will be very different from a 19 inch 4:3 device.  In computing, what began in typography as diagonal marks (the virgule, often called a slash or solidus (/), and the later back slash (\)) are used in file systems to separate directories & sub-directories (now familiar as folders) from file names.  Under MS/PC-DOS, OS/2 & Windows, a file called myfile.txt in a sub-directory called text in a directory called user on D: drive would be displayed in the path D:\user\text\myfile.txt (although under DOS it would be in upper case).  The Windows crowd call these diagonal marks “back slashes” while the solidus they call a “forward slash”, used for other purposes.  The Unix crew think this childish and insist a solidus is simply a “slash”, that there’s no such thing as a “forward slash” and that the back slash real people call a “slosh”.
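The underlying arithmetic is simple Pythagoras and a minimal sketch (Python, function name hypothetical) makes concrete how different two nominally “19 inch” panels can be:

```python
import math

def panel_dimensions(diagonal: float, ratio_w: float, ratio_h: float):
    """Width, height & area implied by a diagonal and an aspect ratio.

    Unit-agnostic: returns whatever unit the diagonal is given in
    (inches here, with apologies to the French)."""
    unit = diagonal / math.hypot(ratio_w, ratio_h)  # length of one ratio "step"
    width, height = ratio_w * unit, ratio_h * unit
    return width, height, width * height

for rw, rh in ((4, 3), (16, 9)):
    w, h, area = panel_dimensions(19, rw, rh)
    print(f'{rw}:{rh} -> {w:.1f}" x {h:.1f}", {area:.0f} sq in')
# 4:3  -> 15.2" x 11.4", 173 sq in
# 16:9 -> 16.6" x 9.3", 154 sq in
```

Same diagonal, yet roughly an 11% difference in glass.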

Notable moments in diagonal (canted) headlamps

The one-off, 1938 Jaguar SS100 fixed head coupé (FHC) “Grey Lady” which demonstrates the traditional placement when four lights were used.

The inclination designers for decades felt to use a diagonal arrangement for headlights began innocently enough in the pre-war years when it emulated the usual practice of placing a pair of driving lamps or fog lights inboard of the main headlamps and lower down, mounted typically on the bumper bar or its supporting brackets.  Most headlamps until the late 1930s were in separate housings, as were the auxiliary devices and even cars which integrated them into the coachwork adopted the same geometry.  This was due in part to the evolutionary nature of automobile styling which has often tried to avoid the “shock of the new” and in part to regulations, especially those which applied in the US.

Jaguar S-Type (1963-1968, left), Vanden Plas Princess R (1964-1968, centre) and Volvo 164 (1968-1975, right).

Although most would regard the technique which essentially integrated the driving lamps/fog lamps into the coachwork as just a variation on the diagonal theme, professional designers insist not; they say this is just wrapping enveloping bodywork around an existing device.  Also, the professionals prefer the term “canted headlamps” because “diagonal” has a more precise definition in mathematics.

Rover 3.5 Coupé (P5B 1967-1973, left) and Packard Coupe (1958, right)

While the US manufacturers usually re-tooled in 1957-1958 after regulations had been changed to allow quad head-lamps, the British were often fiscally challenged and needed to continue to use existing sheet metal.  A design like the Vanden Plas Princess R (and the companion Wolseley 6/99 & 6/110 (1959-1968)) had sufficient space to allow the diagonal placement but the Rover P5 (1958-1967) with its wider grille precluded the approach so the expedient solution was to go vertical.  Although obviously just “bolted on”, such was the appeal of the P5B the arrangement merely added to the charm.  It could have been much worse because less charming was the 1958 Packard Coupe, produced by Studebaker-Packard, the company an ultimately doomed marriage of corporate convenience which seemed at the time a good idea but proved anything but.  Studebaker-Packard lacked the funds to re-tool to take advantage of the rules allowing four head-lamps but without the feature their cars would have looked even more hopelessly outdated than they anyway did so cheap fibreglass “pods” were produced which looked as “tacked on” as they were.  They were the last Packards made and Studebaker’s demise followed within a decade.

1963 Zunder

The Zunder (“tinder” in German) was produced in Argentina between 1960-1963 and used the power-train from the Porsche 356.  The body was fashioned in fibreglass and it was one of the many interesting products of the post-war industry in Brazil and Argentina, the history of which is much neglected.  By the standards of the time, it was well-built but, as a niche product, was never able to achieve the critical mass necessary to ensure the company’s survival and production ceased in 1963 after some 200 had been built.

Buick Electra 225 (First generation 1959–1960, left) and (Lincoln) Continental Mark III (1958-1960, right).  The Buick adopted horizontal headlamps in 1960.

In the late 1950s, most US manufacturers did have cash to spend and the industry spirit at the time was never to do in moderation what could be done in excess although by comparison with the Lincoln, the Buick verged on the restrained.  Tellingly, the Buick sold well while the Continental was such a disaster Ford considered sending Lincoln to join Edsel on the corporate scrapheap and the nameplate was saved only because it was possible at low cost to re-purpose a prototype Ford Thunderbird as the new Continental.  Rarely has any replacement been such a transformation and the 1961 Continental would influence the design of full-sized American cars for twenty years.  It used horizontally mounted head-lamps.

1961 Chrysler 300 G.

Chrysler’s “Letter Series 300” (1955-1965) coupes and convertibles were the brightest glint in the golden age in which Detroit’s power race was played out in the big cars, an era which would be ended by the introduction of the intermediates and pony cars in the 1960s.  The 300G (1961) was visually little changed from the previous year’s 300F but the simple change to diagonal headlamps was transformative.  There were those who didn’t like the look but generally it was well received and, as a first impression, the feeling might have been that Chrysler had mastered the motif in a way the Continental Mark III proved Ford never did.

1961 DeSoto Adventurer (left), 1962 Dodge Dart (centre) and 1963 Dodge Polara (right).

However, Chrysler’s designers in the early 1960s may have decided they liked diagonal headlamps which was good but seemingly they liked them so much they thought the buyers should be offered as many permutations of the idea as could be made to work on a production line.  What’s remarkable is not that the public didn’t take to the approach but that it took the corporation so long to admit the mistake and try something more conventional.  Just to hedge their bets, while Dodge, Plymouth and DeSoto all had headlamps mounted at an obvious degree of cant, on the Chryslers the effect was so subtle one really needed to hold a spirit level to the front end to confirm there was a slant, albeit one imperceptible to the naked eye.  The one division which never went the diagonal way was Imperial but its headlamp treatment was more bizarre still.

DeSoto styling proposal (September 1958) for the 1961 range.

For DeSoto, things could have looked worse even than they did, some of the implementations of the diagonal motif which went as far as clay models or actual metal prototypes being so bizarre one wonders what external influences were being studied (or inhaled).  As it turned out, 1961 would be the end of the line for DeSoto, a nameplate which had been successful as recently as the mid 1950s.  Its demise was little to do with diagonal head-lamps (though they didn’t help) but was a product of Chrysler’s other divisions expanding their ranges up and down, encroaching on a market segment DeSoto once found so lucrative.  The phenomenon was a harbinger of the eventual fate of marques like Mercury, Pontiac, Oldsmobile and Plymouth.

Clockwise from top left: Fiat 8V (1952-1954), Gordon-Keeble GK-1 (1961-1967), Jensen C-V8 (1962-1966) and Triumph Vitesse (1962-1971).

Perhaps surprisingly, the French majors were never enamored, presumably because Citroën and Renault didn’t like to be imitative and Peugeot were too conservative.  Some of the Europeans did dabble with the idea, embracing it as an expression of modernity although the then radical treatment of the head-lamps sometimes struck a discordant note when grafted onto something where the rest of the platform was so obviously from one or two generations past.  Fiat’s exquisite 8Vs didn’t all get the diagonal look but those which did remain the most memorable of the few of the breed built.  An unqualified aesthetic success was the Gordon-Keeble, built to aviation standards and powered by a Chevrolet V8.  It deserved to succeed but foundered as much of the British industry did in the era because of a lack of capitalization and an accounting operation which didn’t match the quality of the engineering.  More successful was the Jensen C-V8 but while the distinctive front end now makes it much prized by collectors, at the time it was less admired and its very presence served only to emphasize how antiquated the rest of the styling had become.  For its replacement, Jensen turned to an Italian styling house and the Interceptor, introduced in 1966 and remembered for the vast expanse of rear glass, is now thought a classic of the era.  The one which sold best was the Triumph Vitesse, one of a number of variations built on the robust and versatile separate chassis of the Herald (1959-1971) including the Spitfire and GT6.  Somewhat the BMW M3 of its day, the Vitesse’s front end actually lived on in India (though without the lusty six cylinder engines) but curiously, the inner headlights weren’t fitted.

Gilding the lily: The Lancia Fulvia coupé (1965-1976) before & after.

The lovely, delicate lines of the Lancia Fulvia coupé were perfect and really couldn’t be improved; the unfortunate facelift with its canted lights proved the point.