Sunday, July 10, 2022

Lede

Lede (pronounced leed)

(1) In hot-metal typesetting, a deliberate misspelling of “lead”, created to avoid the possibility of “lead” being set as text rather than being understood as an instruction.

(2) In journalism, a short summary serving as an introduction to a news story, article or other copy.

(3) In journalism, the main story; the news item thought most important.

(4) A man; a person (and many related meanings, all obsolete).

Mid twentieth century:  An altered spelling of lead (in the sense used in journalism: “short introductory summary”), used in the printing trades to distinguish it from the homograph lead (the “thin strip of type metal for increasing the space between lines of type”), which was made of lead (pronounced led, rhyming with “dead”); use in this context has always been rare outside the US.  The historic plural form (Middle English) was lede but in modern use in journalism it’s always ledes.

The historic use has been obsolete since the late sixteenth century.  The Middle English lede & leode (variously: man; human being, person; lord, prince; God; sir; group, kind; race; a people, nation; human race; land, real property, the subjects of a lord or sovereign; persons collectively (and other forms)) were connected to three closely related words: (1) the Old English lēod (man; chief, leader; (poetic) prince; a people, people group; nation), (2) the Old English lēoda (man; person; native of a country (and related to lēod)) and (3) the Old English lēode (men; people; the people of a country (which was originally the plural of lēod)).  Lēod was derived from the Proto-West Germanic liud & liudi, from the Proto-Germanic liudiz (man; person; men; people), from the primitive Indo-European hléwdis (man, people), from hlewd- (to grow; people).  The bafflingly large array of meanings of lede in Middle English is perhaps best encapsulated in the phrase “in all lede” (all the world).  It was related also to the Northumbrian dialectal form lioda (men, people) and was cognate with the German Leute (nation, people) and the Old High German liut (person, people), from the primitive Indo-European root leudh- (people), the source also of the Old Church Slavonic ljudu and the Lithuanian liaudis (nation, people).  Care must be taken if using lede in Sweden.  In the Swedish, lede was from the nominal use (masculine inflection) of the adjective led (evil), used in the synonym den lede frestaren (the evil tempter), understood as “the evil one, the loathsome or disgusting one; the devil, Satan”.

In modern use, lede was a deliberate misspelling of lead, one of a number of words created so they could be used in the instructions given to printers while avoiding the risk they might appear in the text being set.  Lede could thus be used to indicate which paragraphs constitute the lede while avoiding any confusion with the word “lead” which might be part of the text to appear in an article.  The other created forms were “dek” (subhead or sub-heading (modified from “deck”)), “hed” (headline (modified from “head”)) and “kum” (used as “lede to kum” or “lede 2 kum” (a placeholder for “lead to come”)).  The technical terms were of some significance when the hot-metal typesetting process was used for printing.

It’s not clear when lede came into use in the printing trades; the dictionaries which maintain a listing are not consistent and the origin is suggested to date from anywhere between 1950-1976.  The difficulty in determining the date of origin is that the documents on which the words (lede, dek, hed, kum) appeared were ephemeral, created to be used on the day and discarded.  Lede moved from the print factory floor to the newsroom to become part of the jargon of journalism, used to describe the main idea in the first few lines of a story, a device to entice a reader to persist (the idea of click-bait nothing new).  In much of the English-speaking world, the word in this context is spelled “lead” but lede remains common in the US.  In either form, it’s entered the jargon, the phrase “to bury the lede” meaning “unwittingly to neglect to emphasize the most important part of the story” (just about an editor’s most damning critique) and “if it bleeds, it ledes” meaning a dramatic story, preferably with explicit images, will be on the front page or be the first item of a news bulletin.

Beyond journalism, lede is sometimes used.  An auction site might for example have a “lede photograph”, the one thought the most tempting and thus likely to attract the biggest audience (ie more click-bait).  In general use, it’s probably not helpful and likely only to confuse.  Henry Fowler (1858-1933) would doubtless have included its general use among what in his Dictionary of Modern English Usage (1926) he derided as a “pride of knowledge”: “...a very un-amiable characteristic and the display of it should sedulously be avoided” and to illustrate his point he said such vanities included “didacticism, French words, Gallicisms, irrelevant allusions, literary critics’ words, novelty hunting, popularized technicalities, quotations, superiority & word patronage.”  One suspects that outside of the print-shop, Henry Fowler would have had little tolerance for lede.

Others agree, suggesting the old sense from Middle English is now used only as a literary or poetic device and it’s otherwise the sort of conscious archaism of which Henry Fowler disapproved, a thing used only to impress others; beyond that, the modern form should never travel beyond the places where, as a form of jargon, it makes sense.  Still, there were word nerds who thought it might have a future.  The US author William Safire (1929–2009) seemed to find hope in its prospects and even coined “misledeing” which might be tempting for editors searching for novel words with which to berate neophyte writers.

Emerge

Emerge (pronounced ih-murj)

(1) To come forth into view or notice, as from concealment or obscurity.

(2) To rise or come forth from or as if from water or other liquid.

(3) To come up or arise, as a question or difficulty.

(4) To come into existence; develop.

(5) To rise, as from an inferior or unfortunate state or condition.

1560s: From the Middle French émerger, from the Latin ēmergere (bring forth, bring to light (intransitively "arise out or up, come forth, come up, come out, rise")), an assimilated form of the construct ē- (a variant of the prefix ex- (out, forth)) + mergere (to dive, dip, sink).  The notion was of rising from a liquid by virtue of buoyancy.  The verb re-emerge (also as reemerge) (to emerge again or anew) dates from 1775 and other forms of this evolved: re-emerged (1778), re-emerging (1788) & re-emergence (1801).  The noun emergence dates from the 1640s, initially in the sense of "an unforeseen occurrence", from the French émergence, from the Middle French émerger, from the Latin ēmergere; the modern meaning (process of coming forth) emerged in 1704, the meanings co-existing until, by circa 1735, the new had entirely replaced the old.  The noun emersion (reappearance, act of emerging) dates from the 1630s; it was a noun of action from the past-participle stem of the Latin ēmergere and originally was used of eclipses and occultations.  The adjective emergent was actually quite old, noted from the late fourteenth century in the sense of “rising from what surrounds it, coming into view", from the Latin emergentem (nominative emergens), present participle of ēmergere.  The present participle is emerging and the past participle emerged.

Emerge, emanate or issue all mean “to come forth” but differ in nuance.  Emerge is used of coming forth from a place shut off from view, or from concealment, or the like, into sight and notice, in the sense the sun emerges from behind the clouds.  Emanate is used of intangible things, as light or ideas, spreading from a source and both gossip and rumors can emanate from sources reliable or otherwise.  Issue is often used of a number of persons, a mass of matter, or a volume of sound or the like, coming forth through any outlet or outlets; smoke is said to issue from a chimney, something, if it be white, associated with the emergence of a new Roman Catholic Pope.

Emerging from the magic circle

With Boris Johnson’s curious, slow-motion resignation as prime-minister of the UK, attention begins to turn to the way the Conservative and Unionist (Tory) Party elects its next leader to become the latest custodian of one of the world’s nuclear arsenals.  These days, it takes longer than once it did for the Tories to do this for it’s now an exhaustive process consisting of (1) a multi-round contest in which all members of parliament (MPs) vote for whichever of their colleagues have been nominated, the unfortunate chap (and these days some Tory chaps are women) receiving the fewest votes dropping out so the next round may proceed, the field thus whittled down to two, at which point (2) that pair is submitted to the party membership at large to make their choice.  The winner will assume the party leadership and, with the Tories enjoying in the House of Commons the solid majority Mr Johnson secured at the last election, will go to the palace to kiss hands and be invited to become Her Majesty’s Prime-Minister of the United Kingdom & First Lord of the Treasury.
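
Mechanically, stage (1) of the contest is just an elimination loop and can be sketched in a few lines of code.  The sketch below is purely illustrative (the candidate names, the 358-MP head-count and the random ballots are all invented, and the party’s actual rules on nomination thresholds and tie-breaks are omitted):

from collections import Counter
import random

def whittle_to_two(candidates, mps):
    """Run elimination rounds: each MP casts one ballot per round for a
    remaining candidate; the lowest-polling candidate drops out until
    only two remain (ties here broken arbitrarily)."""
    remaining = list(candidates)
    while len(remaining) > 2:
        tally = Counter(vote(remaining) for vote in mps)  # one ballot per MP
        loser = min(remaining, key=lambda c: tally[c])    # fewest votes drops out
        remaining.remove(loser)
    return remaining  # the pair put to the party membership at large

# Invented example: 358 hypothetical MPs voting at random among five
# hypothetical contenders; real MPs are presumably less capricious.
contenders = ["A", "B", "C", "D", "E"]
mps = [lambda remaining: random.choice(remaining)] * 358
print(whittle_to_two(contenders, mps))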

This democratic (and leisurely) process is relatively new to the Tory party, its leaders elected by a formal vote only since 1965 and even then, until 2001, it was only MPs who voted.  Prior to that, Tory leaders were said to “emerge” from what was known as a “magic circle” and although never as mysterious as some suggested, it was an opaque process, conducted by party grandees.  The classic example was in 1957 when the choice was between Harold Macmillan (1894-1986; UK prime-minister 1957-1963) and Rab Butler (1902-1982).  To his office in the House of Lords, the lisping (fifth) Lord Salisbury (1893-1972) summoned those he thought good chaps (women at this point hadn’t yet become chaps) and asked “Hawold or Wab?”  Hawold prevailed.

The change in process in 1965 came about at the insistence of Sir Alec Douglas-Home (1903-1995 and the fourteenth Earl of Home before disclaiming his peerage in 1963 to become prime-minister (1963-1964)).  Since 1957, the country had changed and there was much criticism of the murky manner by which Sir Alec had become party leader, a clamour emerging, even within the party, both to modernize and to appear more transparently democratic.  From this point, unleashed were the forces which would in 1975 see Margaret Thatcher (1925-2013; UK prime-minister 1979-1990) elected leader but the first beneficiary of the wind of change was Edward "Ted" Heath (1916-2005; UK prime-minister 1970-1974), a grammar school boy who replaced a fourteenth earl.  Notably, to appear more modern, Heath in 1965 didn't repair (as he had with Macmillan when he emerged in 1957) to the Turf Club for a celebratory meal of oysters, game pie and champagne which “…might have made people think a reactionary regime had been installed”.  It can be hard now to understand quite what a change Heath's accession in 1965 flagged; the Tory Party previously had leaders from the middle class but never the lower middle class.  The significance of what emerged in 1965 was less the new leader than a changed Tory Party in a changed country.

So Tory leaders no longer emerge from a magic circle but a remarkable number are emerging to offer themselves as willing to be considered, so clearly it’s still thought a desirable job although not what it was.  Macmillan once marveled that a predecessor, William Ewart Gladstone (1809–1898; four times UK prime-minister 1868-1894), prime-minister at a time when the British Empire spanned the globe, managed every year to spend four months at his country house, an arrangement one suspects Mr Johnson would have found most tolerable.  Ideologically, the indications are there will be little to choose between those on offer, the extent of the variation probably something like that once described by Georges Clemenceau (1841–1929; Prime Minister of France 1906-1909 & 1917-1920) as the difference between "a politician who would murder their own mother and one prepared to murder only someone else's mother".

Mr Johnson will no doubt reflect on his time in Downing Street and perhaps conclude a few things might have been done differently but, after all, he has been prime-minister, one of the few to make it to the top of the greasy pole.  When the office beckoned Lord Melbourne (1779-1848; UK prime-minister 1834 & 1835-1841), he was disinclined to accept, fearing it would be “…a damned bore” but his secretary persuaded him, saying “…no Greek or Roman ever held the office and if it lasts but three months it’ll still be worthwhile to have been Prime Minister of England”.  Having apparently appointed himself Prime-Minister Emeritus, as he sits in No 10, plotting and scheming ways to remain, Mr Johnson can at least remember and be glad.

For a brief, shining moment (2019-2021), the world had three leaders with nuclear weapons and outstanding haircuts.

Alexander Boris de Pfeffel Johnson (b 1964; Prime Minister of the United Kingdom 2019-2022 (probably)) (left).

Donald John Trump (b 1946; President of the United States 2017-2021) (centre).

Kim Jong-un (b circa 1983; Supreme Leader of DPRK (Democratic People’s Republic of Korea (North Korea)) since 2011) (right).

Saturday, July 9, 2022

Assassin

Assassin (pronounced uh-sas-in)

A murderer, especially one who kills a politically prominent person for reasons of fanaticism or profit.

One of an order of devout Muslims, active in Persia and Syria circa 1090-1272, the prime object of whom was to assassinate Crusaders (should be used with an initial capital letter).

1525–1535: An English borrowing via French and Italian, from the Medieval Latin assassīnus (assassinī in the plural), from the Arabic ḥashshāshīn (plural of ḥashshāsh; eaters of hashish), the Arabic being حشّاشين (ḥashshāshīyīn, also rendered Hashishin or Hashashiyyin).  It shares its etymological roots with the Arabic hashish (from the Arabic حشيش (ḥashīsh)) and in the region is most associated with a group of Nizari Shia Persians who worked against various Arab and Persian targets.

The Hashishiyyin were an Ismaili Muslim sect at the time of the Crusades, under the leadership of Hasan ibn al-Sabbah (known as shaykh al-jabal or the "Old Man of the Mountains") although the name was widely applied to a number of secret sects operating in Persia and Syria circa 1090-1272.  The word was known in Anglo-Latin from the mid-thirteenth century and variations in spelling were not unusual although hashishiyy (hashishiyyin in the plural) appears to have been the most frequently used.  The plural suffix “-in” was a mistake by Medieval translators who assumed it was part of the Bedouin word.

Whether in personal, political or family relations, assassination is one of the oldest and, done properly, one of the most effective tools known to man.  The earliest known use of the verb "to assassinate" in printed English was by Matthew Sutcliffe (circa 1548-1629) in A Briefe Replie to a Certaine Odious and Slanderous Libel, Lately Published by a Seditious Jesuite (1600), borrowed by William Shakespeare (circa 1564-1616) for Macbeth (1605).  Among the realists, it’s long been advocated, Sun Tzu in the still-read The Art of War (circa 500 BC) arguing the utilitarian principle: that a single assassination could be both more effective and less destructive than other methods of dispute resolution, something with which Niccolò Machiavelli (1469–1527), in his political treatise Il Principe (The Prince, written circa 1513 & published 1532), concurred.  As a purely military matter, it’s long been understood that the well-targeted assassination of a single leader can be much more effective than a battlefield encounter whatever the extent of the victory; the “cut the head off the snake” principle.

Modern history

The assassination in July 2022 of Abe Shinzō san (安倍 晋三; Shinzo Abe, 1954-2022, prime minister of Japan 2006-2007 & 2012-2020) came as a surprise because, as a part of political conflict, assassination had all but vanished from Japan.  That’s not something which can be said of many countries in the modern era, the death toll in Asia, Africa, the Middle East and South & Central America long, the methods of dispatch sometimes gruesome.  Russia’s annals too are blood-soaked although it’s of note perhaps that an extraordinary number of the killings were ordered by one head of government.  The toll of US presidents is famous and also documented are some two-dozen planned or attempted assassinations.  Even one (as far as is known) prime-minister of the UK has been assassinated, Spencer Perceval (1762–1812; Prime-Minister of the UK 1809-1812) shot dead (apparently by a deranged lone assassin) on 11 May 1812, his other claim to fame being that, uniquely among British premiers, he also served as solicitor-general and attorney-general.  Conspiracy theorists note also the death of Pope John-Paul I (1912–1978; pope Aug-Sep 1978).

Ultranationalist activist Otoya Yamaguchi (1943-1960), about to stab Socialist Party leader Inejiro Asanuma san (1898-1960) with his yoroi-dōshi (a short sword, fashioned with particularly thick metal, suitable for piercing armor and for use in close combat), Hibiya Public Hall, Tokyo, 12 October 1960.  The assassin committed suicide while in custody.

Historically however, political assassinations in Japan were not unknown, documented since the fifth century, the toll including two emperors.  In the centuries which unfolded until the modern era, by European standards, assassinations were not common but the traditions of the Samurai, a military caste which underpinned a feudal society organized as a succession of shogunates (a hereditary military dictatorship (1192–1867)), meant that violence was seen sometimes as the only honorable solution when many political disputes had their origin in inter and intra-family conflict.  Tellingly, even after firearms came into use, most assassinations continued to be committed with swords or other bladed weapons, a tradition carried on when the politician Asanuma Inejirō san was killed on live television in 1960.

Most remembered however is the cluster of deaths which political figures in Japan suffered during the dark decade of the 1930s.  It was a troubled time and although Hara Takashi san (1856-1921; Prime Minister of Japan 1918-1921) had in 1921 been murdered by a right-wing malcontent (who received a sentence of only three years), it had seemed at the time an aberration and few expected the next decade to assume the direction it followed.  However in an era in which the most fundamental aspects of the nation came to be contested by the politicians, the imperial courtiers, the navy and the army (two institutions with different priorities and intentions), all claiming to be acting in the name of the emperor, conflict was inevitable; the only thing uncertain was how it would be resolved.

Hamaguchi Osachi san (1870–1931; Prime Minister of Japan 1929-1931) was so devoted to the nation that when appointed head of the government’s Tobacco Monopoly Bureau, he took up smoking despite his doctors’ warnings it would harm his fragile health.  His devotion was praised but he was overtaken by events, the Depression crushing the economy and his advocacy of peace and adherence to the naval treaty which limited Japan’s ability to project power made him a target for the resurgent nationalists.  In November 1930 he was shot while in Tokyo Railway Station, surviving a few months before succumbing to his wounds, an attack which inspired others.  In 1932 the nation learned of the Ketsumeidan Jiken (the "League of Blood" or "Blood-Pledge Corps Incident"), a nationalist conspiracy to assassinate liberal politicians and the wealthy donors who supported them.  A list of twenty-two intended victims was later discovered but the group succeeded only in killing one former politician and one businessman.

The death of Inukai Tsuyoshi san (1855–1932; Prime Minister of Japan 1931-1932) was an indication of what was to follow.  A skilled politician and something of a technocrat, he’d stabilized the economy but he abhorred war as a ghastly business and opposed the army’s ideas of adventures in China, something increasingly out of step with those gathering around his government.  In May 1932, after visiting the Yasukuni Shrine to pay homage to the Meiji era’s first minister of war (assassinated in 1869), nine navy officers went to the prime-minister’s office and shot him dead.  Deed done, the nine handed themselves to the police.  At their trial, there was much sympathy and they received only light sentences (later commuted) although some fellow officers feared they might be harshly treated and sent to the government a package containing their nine amputated fingers with offers to take the place of the accused were they sentenced to death.  In the way the Japanese remember such things, it came to be known as “the May 15 incident”.

Nor was the military spared.  Yoshinori Shirakawa san (1869–1932) and Tetsuzan Nagata san (1884–1935), both generals in the Imperial Japanese Army, were assassinated, the latter one of the better-known victims of the Aizawa Incident of August 1935, a messy business in which two of the three army factions then existing resolved their dispute with murder.  Such was the scandal that the minister of the army was also a victim but he got off lightly, being ordered to resign “until the fuss dies down” and returning briefly to serve as prime-minister in 1937 before dying of natural causes some four years later.

All of the pressures which had been building to create the political hothouse that was mid-1930s Japan were realized in Ni Ni-Roku Jiken (the February 26 incident), an attempted military coup d'état in which fanatical young officers sought to purge the government and military high command of factional rivals and ideological opponents (along with, as is inevitable in these things, settling a few personal scores).  Two victims were Viscount Takahashi Korekiyo san (1854–1936; Prime Minister 1921-1922) and Viscount Saitō Makoto san (1858–1936; admiral in the Imperial Japanese Navy & prime-minister 1932-1934 (and the last former Japanese prime-minister to be assassinated until Shinzo Abe san in 2022)).  As a coup, it was a well-drilled operation, separate squads sent out at 2am to execute their designated victims although, in Japanese tradition, they tried not to offend, one assassin recorded as apologizing to terrified household staff for “the annoyance I have caused”.  Of the seven targets the rebels identified, only three were killed but the coup failed not for want of blood spilled but because the conspirators made the same mistake as the Valkyrie plotters (who sought in 1944 to overthrow Germany’s Nazi regime); they didn’t secure control of the institutions which were the vital organs of state and notably, did not seize the Imperial Palace and thus place themselves between the Emperor and his troops, something they could have learned from Hernán Cortés (1485–1547) who made clear to his Spanish Conquistadors that the capture of Moctezuma (Montezuma, circa 1466-1520; Emperor of the Aztec Empire circa 1502-1520) was their object.  As it was, the commander in chief ordered the army to suppress the rebellion and within hours it was over.

However, the coup had profound consequences.  If Japan’s path to war had not been guaranteed before the insurrection, after it the impetus assumed its own inertia and the dynamic shifted from one of militarists against pacifists to agonizing appraisals of whether the first thrust of any attack would be to the north, against the USSR, or to the south, into the Pacific.  The emperor had displayed a decisiveness he’d not re-discover until two atomic bombs had been dropped on his country but, seemingly convinced there was no guarantee the army would put down a second coup, his policy became one of conciliating the military which was anyway the great beneficiary of the February 26 incident; unified after the rebels were purged, it quickly asserted control over the government, weakened by the death of its prominent liberals and the reluctance of others to challenge the army, assassination a salutary lesson.

Assassins both:  David Low’s (1891-1963) Rendezvous, Evening Standard, 20 September 1939. 

The Molotov–Ribbentrop Pact (usually styled as the Nazi-Soviet Pact), was a treaty of non-aggression between the USSR and Nazi Germany and signed in Moscow on 23 August 1939.  A political sensation when it was announced, it wouldn't be until the first Nuremberg Trial (1945-1946) that the Western powers became aware of the details of the suspected secret protocol under which the signatories partitioned Poland between them.   Low's cartoon was published shortly after Stalin (on 17 September) invaded from the east, having delayed military action until German success was clear.

It satirizes the cynicism of the Molotov-Ribbentrop Pact, Hitler and Stalin bowing politely, words revealing their true feelings.  After returning to Berlin from the signing ceremony, von Ribbentrop reported the happy atmosphere to Hitler as "…like being among comrades" but if he was fooled, comrade Stalin remained the realist.  When Ribbentrop proposed a rather effusive communiqué of friendship and a 25 year pact, the Soviet leader suggested that after so many years of "...us tipping buckets of shit over each-other", a ten year agreement announced in more business-like terms might seem to the peoples of both nations, rather more plausible.  It was one of a few occasions on which comrade Stalin implicitly admitted even a dictator needs to take note of public opinion.  His realism served him less well when he assumed no rational man fighting a war on two fronts against a formidable enemy would by choice open another front of 3000-odd kilometres (1850 miles) against an army which could raise 500 divisions.  Other realists would later use the same sort of cold calculation and conclude that however loud the clatter from the sabre rattling, Mr Putin would never invade Ukraine.

Inflation

Inflation (pronounced in-fley-shuhn)

(1) In economics, a persistent, substantial rise in the general level of prices, often related to an increase in the money supply, resulting in the loss of value of currency.

(2) Of or pertaining to the act of inflating or the state of being inflated.

(3) In clinical medicine, the act of distending an organ or body part with a fluid or gas.

(4) In the study of the metrics of educational standards, an undue improvement in academic grades, unjustified by or unrelated to merit.

(5) In theoretical cosmology, an extremely rapid expansion in the size of the universe, said to have happened almost immediately after the big bang.

1300-1350: From the Middle English inflacioun & inflacion, from the Old French inflation (swelling), from the Latin inflationem (nominative īnflātiō) (expansion; a puffing up, a blowing into; flatulence), noun of action from the past participle stem of inflare (blow into, puff up) and thus related to īnflātus, the perfect passive participle of īnflō (blow into, expand).  The construct of inflare (figuratively “inspire, encourage”) was in- (into) (from the primitive Indo-European root en (in)) + flare (to blow) (from the primitive Indo-European root bhle- (to blow)).  The meaning "action of inflating with air or gas" dates from circa 1600 while the monetary sense of "a sustained increase in prices" replaced the original meaning (an increase in the amount of money in circulation), first recorded in US use in 1838.  The derived noun hyperinflation dates from 1925 when it was first used to describe the period of high inflation in Weimar Germany; earlier, surgeons had used the word when describing certain aspects of lung diseases.  The adjective inflationary was first used in 1916 as a historic reference to the factors which caused a rapid or sustained increase in prices.

The early meaning related to flatulence, the sense of a “swelling caused by a gathering of wind in the body”, before being adopted as a technical term by clinicians treating lung conditions.  The figurative use, as in "outbursts of pride", was drawn directly from the Latin inflationem (nominative inflatio), a noun of action from the past participle stem of inflare (blow into; puff up).  The now most common use beyond the tyre business, that of economists to describe statistically significant movement in prices, is derived from an earlier adoption by state treasuries to measure the volume of money in circulation, first recorded in 1838 in the US; the money supply is now counted with a number of definitions (M1, M3 etc).  The first papers in cosmological inflation theory were published in 1981 by theoretical physicist Alan Guth (b 1947), who developed the idea while at Cornell in 1979.
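
As a reminder of how the treasuries’ older sense (the volume of money in circulation) connects to the economists’ modern sense (rising prices), the standard textbook identity (not part of the original etymology) is the equation of exchange:

MV = PQ

where M is the money supply, V the velocity of circulation, P the price level and Q real output; if V and Q are roughly stable, sustained growth in M eventually appears as growth in P.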

Cosmic Inflation

Cosmic inflation is a theory of exponential expansion of space in the early universe.  This inflationary period is speculated to have begun an indescribably short time after the start of the big bang and to have been about as brief.  Even now, space continues to expand, though at a far less rapid rate, so the big bang is not just a past event but, after fourteen billion-odd years, still happening.
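
For readers who want the notation, the expansion is conventionally described by the growth of the scale factor a(t); a minimal sketch in standard cosmological notation, assuming (as the simplest models do) a Hubble parameter H roughly constant during the inflationary epoch:

a(t) \propto e^{Ht}, \qquad N \equiv \ln\frac{a_{\mathrm{end}}}{a_{\mathrm{start}}} = H\,(t_{\mathrm{end}} - t_{\mathrm{start}})

where N is the number of “e-folds”; figures of order N ≈ 60 are commonly quoted as sufficient to account for the observed flatness and uniformity of the universe.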

Definitely not to scale.

One implication of the scale of the expansion of space is the speculation that things, some of which may have been matter, may have travelled faster than the speed of light, suggesting the speed of light came into existence neither prior to nor at the start of the big bang but after, possibly within a fraction of a second although, within the discipline, other models have been built.  The breaking of the Einsteinian speed limit may suggest conditions in that first fraction of a second of existence were so extreme the laws of physics may not merely have been different but may not have existed or have been even possible.  If that's true, it may be nonsensical to describe them as laws.  Matter, energy and time also may have come into existence later than the start of the big bang.

The theory has produced derivatives.  One notion is, even though it’s possible always to imagine an equation which can express any duration, time may not be divisible beyond a certain point; another that there can never exist a present, only a past or a future.  Perhaps most weird is the idea the (often labeled chaotic but actually unknown) conditions of the very early big bang could have progressed instead to expanding space but without matter, energy or time.  Among nihilists, there’s discussion about whether such a universe could be said to contain nothing, although an even more interesting question is whether a genuine state (non-state?) of nothing is possible even in theory.

Price Inflation

In economics, inflation is suddenly of interest in the West because the rate has spiked.  The memories are bad because the inflation associated with the 1970s & 1980s was suppressed finally only when central banks and some political realists good at managing expectations combined to engineer recessions and the consequent unemployment.  After that, in advanced economies, as inflation faded from memory to history, there tended to be more academic interest in the possibility deflation might emerge as a problem.  As the Bank of Japan discovered, high inflation was a nasty thing but experience and the textbooks at least provided case-studies of how it could be tamed whereas deflation, once established and remaining subject to the conditions which led to its existence, could verge on the insoluble.

In most of the West however, deflationary pressures tended to be sectoral components of the whole, the re-invented means of production and distribution in the Far East exporting unprecedented efficiencies to the West, the falling prices serving only to stimulate demand because they happened in isolation of other forces.  Yet the neo-liberal model which began to prevail after the political and economic construct of the post-World War II settlement began to unravel was based on a contradictory implementation of supply-side economics: restricting the money supply while simultaneously driving up asset prices.  That was always going to have consequences (and there were a few), one of which was the GFC (global financial crisis (2008-circa 2011)) which happened essentially because the rich had run out of customers with the capacity to service loans and had begun lending money to those who were never going to be able to pay it back.  Such lending has always happened but at scale, it can threaten entire financial infrastructures.  Whether that was actually the case in 2008 remains a matter of debate but such was the uncertainty at the time (much based on a widespread unwillingness of many to reveal their true positions) that everyone’s worst-case scenarios became their default assumption and the dynamics which have always driven markets in crisis (fear and stampede) spread.

What was clear in the wake of the failure of Lehman Brothers (1847-2008) was that much money had simply ceased to exist, a phenomenon discussed by a most interested Karl Marx (1818-1883) in Das Kapital (1867–1883) and while losses were widespread, of particular significance were those suffered by the rich because it was these which were restored (and more) by what came to be called quantitative easing (QE), actually a number of mechanisms but essentially an increase in the money supply.  The textbooks had always mentioned the inflationary consequences of this but that had been based on the assumption that the supply would spread wide.  The reason the central bankers had little fear of inducing inflation (as measured by the equations which have been honed carefully since the 1970s so as not to frighten the horses) was that the money created was given almost exclusively to the rich, a device under which not only were the GFC losses made good but the QE system (by popular demand) was maintained, the wealth of the rich increasing extraordinarily.  It proved trickle-down economics did work (at least as intended, a trickle being a measure of a very small flow), the inequalities of wealth in society now existing to an extent not seen in more than a century.

Salvator Mundi (circa 1500) by Leonardo da Vinci.  Claimed to be the artist's last known painting, in 2017 it sold at auction for US$450.3 million, still a record and more than double that achieved by the next most expensive, Picasso’s Les femmes d’Alger (Version ‘O’), which made US$179.4 million in 2015.

Post GFC inflation did happen but it was sectorally specific, mansions and vintage Ferraris which once changed hands for a few million suddenly selling for tens of millions while a Leonardo of not entirely certain provenance managed not far from half a billion.  The generalized inflationary effect in the broad economy was subdued because (1) the share of the money supply held by the non-rich had been subject only to modest increases and (2) the pre-existing deflationary pressures which had for so long been helpful continued to operate.  By contrast, what governments were compelled (for their own survival) to do as the measures taken during the COVID-19 pandemic affected economic activity had the effect of increasing the money supply in the hands of those not rich and, combined with (1) low interest rates which set the cost of money at close to zero, (2) pandemic-induced stresses in labour markets and supply and distribution chains and (3) the effects of Russia’s invasion of Ukraine, created what is now called a “perfect storm”.  The inflation rate was already trending up even before the invasion but the war has proved an accelerant.  In these circumstances, all that can be predicted is that the text-book reaction of central banks (raising interest rates) (1) will probably be an unavoidable over-reaction to deal with those factors which can be influenced by monetary policy and (2) will not affect the geopolitical factors which are the vectors through which inflation is being exported to the West.  Central banks really have no choice other than to use the tools at their disposal and see what happens but the problem remains that while those tools are effective (if brutish) devices for dealing with demand-inflation, their utility in handling supply-inflation is limited.

First world problem: It’s now hard to find a Ferrari 250 GTO for less than US$70 million.

Friday, July 8, 2022

Heckblende

Heckblende (pronounced hek-blend or hek-blend-ah (German))

A moulded piece of reflective plastic permanently mounted between a car’s tail lamp (or tail light) assemblies and designed to make them appear a contiguous entity.

1980s: A German compound noun, the construct being Heck (rear; back) + Blende (cover).  As a surname, Heck (most common in southern Germany and the Rhineland) came from the Middle High German hecke or hegge (hedge), the origin probably as a topographic name for someone who lived near a hedge.  The link with hedges as a means of dividing properties led in the Middle Low German to heck meaning “wooden fencing” under the influence of the Old Saxon hekki, from the Proto-West Germanic hakkju.  In nautical slang "heck" came to refer to the “back of a ship” because the position of the helmsman in the stern was enclosed by such a fence and from here it evolved in modern German generally to refer to "back or rear".  The Modern German Blende was from blenden (deceive), from the Middle High German blenden, from the Old High German blenten, from the Proto-Germanic blandijaną, from the primitive Indo-European blend- and was cognate with the Dutch blenden and the Old English blendan.  Because all German nouns are capitalized, Heckblende is correct but in English, heckblende is the usual spelling.

The German blende translates as “cover” so the construct Heck + Blende (one of their shorter compounds) happily deconstructs as “back cover” and that obviously describes the plastic mouldings used to cover the space between a car’s left and right-side tail lamps.  Blenden however can (as a transitive or intransitive) translate as (1) “to dazzle; to blind” in the sense of confusing someone’s sight by means of excessive brightness, (2) (figuratively and usually as an intransitive) “to show off; to pose” (try to make an impression on someone by behaving affectedly or overstating one’s achievements) and (3) “to dazzle” in the sense of deception, this from the 1680s German Blende (an ore of zinc and other metals), a back-formation from blenden (in the sense of "to blind, to deceive"), so called because the substance resembles lead but yields none.  It should not be confused with the English construct hornblende (using the English “blende” in the sense of “mix”), a dark-green to black mineral of the amphibole group (calcium magnesium iron and hydroxyl aluminosilicate).

A heckblende thus (1) literally is a cover and (2) is there to deceive a viewer by purporting to be part of the rear lighting rather than something merely decorative (sic).  If a similar looking assembly is illuminated and thus part of the lighting system, then it's not a heckblende but part of a full-width tail lamp. 

1934 Auburn Boat-tail Speedster.

On cars, the design of tail lamps started modestly enough and few were in use before 1914, often a small, oil-lit single lens the only fitting.  Electric lamps were standardized by the 1920s and early legislation passed in many jurisdictions specified the need for red illumination to the rear (later also to indicate braking) but about the only detail specified was a minimum luminosity; shape, size and placement were left to manufacturers.  Before the late 1940s, most tail lamps were purely functional with little attempt to make them design motifs although during the art deco era, there were some notably elegant flourishes but despite that, they remained generally an afterthought and on lower-priced models, a second tail lamp was sometimes optional, the standard of a left and right-side unit not universal until the 1950s.

A tale of the tails of two economies:  1959 MGA Twin-Cam FHC & 1959 Daimler Majestic (upper) and 1959 Chevrolet Impala (batwing) flattop & 1959 DeSoto Adventurer convertible (lower).

It was in the 1950s the shape of tail lamps became increasingly stylized.  With modern plastics freeing designers from the constraints the use of glass had imposed and the experience gained during the Second World War in the mass-production of molded Perspex, new possibilities were explored.  In the UK and Europe, there was little extravagance, manufacturers usually content to take advantage of new materials and techniques mostly to fashion what were little more than larger, more rounded versions of what had gone before, the amber lenses adopted as turn indicators to replace the mechanically operated semaphore signals often little more than a duplication of the red lamp or an unimaginatively-added appendage.

1961 Chrysler Turboflite show car.

Across the Atlantic, US designers were more ambitious but one idea which seems not to have been pursued was the full-width tail lamp and that must have been by choice because it would have presented no challenges in engineering.  Instead, as the jet age became the space age, the dominant themes were aeronautical or recalled the mechanism of rocketry, tail lamps styled to resemble the exhausts of jet-engines or space ships, the inspiration as often from SF (science fiction) as the runway.  Pursuing that theme, much of the industry succumbed to the famous fin fetish, the tails of their macropterous creations emphasizing the vertical more than the horizontal.  Surprisingly though, despite having produced literally dozens of one-off “concept” and “dream” cars over the decade, it seems it wasn’t until 1961 when Chrysler sent their Turboflite around the show circuit that something with a genuine full-width tail lamp was shown.

1936 Tatra T87 (left), 1961 Tatra T603A prototype (centre) & 1963 Tatra T-603-X5 (right).

That same year, in Czechoslovakia, the Warsaw Pact’s improbable Bohemian home of the avant garde, Tatra’s engineers considered full-width tail lamps for their revised 603A.  As indicated by the specification used since before the war (rear-engined with an air-cooled, 2.5 litre (155 cubic inch) all-aluminum V8), Tatra paid little attention to overseas trends and were influenced more by dynamometers and wind tunnels.  However, the tail lamps didn’t make it to volume production although the 603A prototype did survive to be displayed in Tatra’s Prague museum.  Tatra’s designs, monuments to mid-century modernism, remain intriguing.

1967 Imperial LeBaron four door Hardtop.

If the idea didn’t impress behind the iron curtain, it certainly caught on in the West, full-width assemblies used by many US manufacturers over the decades including Mercury, Imperial, Dodge, Shelby, Ford, Chrysler & Lincoln.  Some genuinely were full-width lamps in that the entire panel was illuminated, a few from the Ford corporation even with the novelty of sequential turn-signals (outlawed in the early 1970s, bureaucrats seemingly always on the search for something to ban).  Most however were what would come to be called heckblendes, intended only to create an illusion.

Clockwise from top left: 1974 ZG Fairlane (AU), 1977 Thunderbird (US), 1966 Zodiac Mark IV (UK), 1970 Thunderbird (US), 1973 Landau (AU) & 1970 Torino (US).

Whether heckblendes or actually wired assemblies, Ford became especially fond of the idea which in 1966 made an Atlantic crossing, appearing on the Mark IV Zodiac, a car packed with advanced ideas but so badly executed it tarnished the name and when it (and the lower-priced Zephyr which made do without the heckblende) was replaced, the Zephyr & Zodiac names were banished from Europe, never to return.  Ford’s southern hemisphere colonial outpost picked up the style (typically several years later), Ford Australia using heckblendes on the ZF & ZG Fairlanes (1972-1976) and the P5 LTD & Landau (1973-1976).  The Fairlane’s heckblendes weren’t reprised when the restyled ZH (1976-1979) model was released but, presumably having spent so much of the budget on new tail lamps, the problem of needing a new front end was solved simply by adapting that of the 1968 Mercury Marquis (the name shamelessly borrowed too), colonies often run with hand-me-downs.

1968 HK Holdens left to right: Belmont, Kingswood, Premier & Monaro GTS.  By their heckblende (or its absence), they shall be known.

In Australia, the local subsidiary of General Motors (GM) applied a double fake.  The "heckblende" on the HK Monaro GTS (1968-1969), as a piece of cost-cutting, was actually red-painted metal rather than reflective plastic and unfortunately prone to deterioration under the harsh southern sun; it was a fake version of a fake tail lamp.  Cleverly though, the fake apparatus was used as an indicator of one's place in the hierarchy, the basic Belmont with just tail lamps, the (slightly) better-appointed Kingswood with extensions, the up-market Premier with extended extensions and the Monaro GTS with the full-width part.  Probably the Belmont and Premier were aesthetically the most successful.  Exactly the same idea was recycled for the VH Commodore (1981-1984), the SL/E (effectively the Premier's replacement) model's tail lamp assemblies gaining stubby extensions.

Left to right, 1967 HR Premier, 1969 HT Brougham & 1971 HQ Premier.  

The idea of a full-width decorative panel wasn’t new, Holden having used such a fitting on earlier Premiers.  Known as the “boot appliqué strip”, it began small on the EJ (1962-1963), EH (1963-1965) & HD (1965-1966) before becoming large and garish on the HR (1966-1968) but (although not then known as bling) that must have been thought a bit much because it was toned down and halved in height when applied to the elongated and tarted-up Brougham (1968-1971 and intended to appeal to the bourgeoisie) and barely perceptible when used on the HQ Premier (1971-1974).  Holden didn’t however forget the heckblende and a quite large slab appeared on the VT Commodore (1997-2000) although it wasn’t retained on the revised VX (2000-2002); whether in this the substantial rise in the oil price (and thus the cost of plastic) was a factor isn’t known.

Left to right: 1973 Porsche 914 2.0, 1983 BMW 323i (E30) & 1988 Mercedes-Benz 300E (W124).

Although, beginning with the 914 in 1973, Porsche was an early European adopter of the heckblende and has used it frequently since, it was the 1980s which were the halcyon days of after-market plastic, owners of smaller BMWs and Mercedes-Benz seemingly the most easily tempted.  The additions were always unnecessary and the only useful way they can be catalogued is to say some were worse than others.  The fad predictably spread to the east (near, middle & far) and the results there were just as ghastly although the popularity of the things must have been helpful as a form of economic stimulus, such was the volume in which they were churned out.  Among males aged 17-39, few things have proved as enduringly infectious as a love of gluing or bolting to cars pieces of plastic which convey their owner's appalling taste.

2019 Mercedes-Benz EQC 400 with taillight bar.

Fewer manufacturers now use heckblendes as original equipment and, among those who did, the terminology varied, nomenclature including "decor panels", "valances" or "tail section appliqués".  However, although the heckblende may (hopefully) be headed for extinction, full-width tail lamps still entice stylists and modern techniques of design and production, combined with what LEDs & OLEDs have made possible, mean the look is again a popular feature, the preferred term now “taillight bar”.

Scrunchie

Scrunchie (pronounced skruhn-chee)

An elastic band covered with gathered fabric, used to fasten the hair, as in a ponytail (usually as scrunchie).

1987: The construct was scrunch + -ie.  Scrunch has existed since the early nineteenth century, either an intensive form of crunch, ultimately derived from the onomatopoeia of a crumpling sound, or a blend of squeeze + crunch.  The suffix -ie was a variant spelling of -ee, -ey & -y and was used to form diminutive or affectionate forms of nouns or names.  It was used also (sometimes in a derogatory sense) to form colloquial nouns signifying the person associated with the suffixed noun or verb (eg bike: bikie, surf: surfie, hood: hoodie etc).  Scrunchie is now used almost exclusively to describe the hair accessory.  Historically, the older form (scrunchy) was used for every other purpose but since the emergence of the new spelling there’s now some overlap.  Never fond of adopting an English coining, the French use the term élastique à cheveux (hair elastic).  The alternative spelling is scrunchy (used both in error and in commerce).  Scrunchie is a noun; the noun plural is scrunchies.  The adjectives scrunchier & scrunchiest are respectively the comparative & superlative forms of scrunchy; as applied to scrunchies, they would be used to describe a scrunchie's relative degree of scrunchiness (a non-standard noun).

Mean Girls (2004) themed scrunchies are available on-line.

It's not unlikely that in times past, women took elastic bands (or something with similar properties) and added a decorative covering to create their own stylized hair-ties but as a defined product, the scrunchie appears first to have been offered commercially circa 1963-1964; the design was not patented until 1987 when night club singer Rommy Hunt Revson (1944–2022) named the hair accessory the Scunci (after her pet toy poodle), the name scrunchie an organic evolution because the fabric "scrunched up".  They were very popular in the 1990s although some factions in the fashion industry disparaged them, especially offended (some people do take this stuff seriously) if seeing them worn on the wrist or ankle.  The scrunchie is a classic case-study in the way such products wax and wane in popularity, something wholly unrelated to their functionality or movements in price; while the elasticity of scrunchies has remained constant, the elasticity of demand is determined not by supply or price but by perceptions of fashionability.  When such a low-cost item becomes suddenly fashionable, influenced by appearances in pop-culture, use extends to a lower-income demographic which devalues the appeal; the fashion critics then declare the thing "a fashion faux pas" and use declines among all but those oblivious of or indifferent to such rulings.  In the way of such things however, products often don't go away but lurk on the edges of respectability, the comebacks variously ironic, part of a retro trend or something engineered by industry, the tactics now including the use of the millions of influencers available for hire.
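
For anyone wanting the textbook quantity behind the pun, price elasticity of demand is conventionally defined (standard economics notation, nothing specific to the scrunchie trade) as:

E_d = \frac{\Delta Q / Q}{\Delta P / P}

the percentage change in quantity demanded per percentage change in price; the joke being that for scrunchies, the movements in Q are driven less by P than by perceived fashionability.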

The Dinner Scrunchie

A more recent evolution is the enlarged version called the Dinner Scrunchie, so named, the brand suggests, because it's "elevated enough to wear to dinner".  They're available from MyKitsch in black and designer colors and, covered with semi-sheer chiffon, they're eight inches (200mm) in diameter, about the size of a small dinner plate.  Breakfast and lunch scrunchies seem not to be a thing but those gaps in the market are catered to by the Brunch Scrunchie which, while larger than most, is smaller and cheaper than the dinner version; it appears to be about the size of a bread & butter plate.

Rita Ora (b 1990) in fluoro scrunchie, New York City, May 2014.

The most obvious novelty of the bigger scrunchies is of course the large size and because that translates into greater surface area, in the minds of many, thoughts turn immediately to advertising space.  There are possibilities but because of the inherent scrunchiness, they're really not suitable for text displays except perhaps for something simple like X (formerly known as Twitter) although Elon Musk (b 1971) probably thinks whatever else X may require, it's not brand awareness or recognition.  Where a message can be conveyed with something as simple as a color combination (such as the LGBTQQIAAOP's rainbow flag), the scrunchie can be a good billboard.