Saturday, October 12, 2024

Ekpyrosis

Ekpyrosis (pronounced eck-pyh-row-sys)

(1) In modern cosmology, a speculative theory proposing the known universe originated in the collision of two other three-dimensional universes traveling in a hidden fourth dimension. This scenario does not require a singularity at the moment of the Big Bang.

(2) In the philosophy of the Stoic school in Antiquity, the idea that all existence is cyclical in nature and that the universe is the result of a recurring conflagration in which all is destroyed and reborn in the same process.

1590s (in English): From the Ancient Greek ἐκπύρωσις (ekpúrōsis) (conflagration, cyclically recurring conflagration in which the universe is destroyed and reborn according to some factions in Stoic philosophy), the construct being the Ancient Greek ἐκ (ek) (out of; from) + πύρωσις (pyrōsis), from πῦρ (pyr) (fire) + -ωσις (-ōsis) (the suffix forming nouns of process or condition).  While there’s no direct relationship between the modern “big bang theory” and the Stoics’ notion of periodic cosmic conflagration (the idea the universe is periodically destroyed by fire and then recreated), the conceptual similarity is obvious.  The Stoic philosophy reflected the general Greek (and indeed Roman) view of fire representing both destruction and renewal.  In English, ekpyrosis first appeared in late sixteenth-century translations or descriptions of ancient Stoic philosophy, particularly in relation to their cosmological theories, and it came to be used either as the Stoics applied it or in some analogous way.  It was one of a number of words which during the Renaissance came to the attention of scholars in the West, a period which saw a revival of interest in ancient Greek and Roman thought, art & architecture and for centuries many of the somewhat idealized descriptions and visions of the epoch were those constructed (sometimes rather imaginatively) during the Renaissance.  The alternative spelling was ecpyrosis.  Ekpyrosis is a noun and ekpyrotic is an adjective; the noun plural is ekpyroses.

In Stoic philosophy, ekpyrosis was described sometimes as a recurring, unitary process (the periodic destruction & rebirth of the universe in a single conflagration) and sometimes as the final stage of one existence (destruction) which was the source of a palingenesis (the subsequent rebirth).  Palingenesis was almost certainly a variant of palingenesia (rebirth; regeneration) with the appending of the suffix -genesis (used to suggest “origin; production”).  Palingenesia was a learned borrowing from the Late Latin palingenesia (rebirth; regeneration), from the Koine Greek παλιγγενεσία (palingenesía) (rebirth), the construct being the Ancient Greek πᾰ́λῐν (pálin) (again, anew, once more), ultimately from the primitive Indo-European kwel (to turn (end-over-end); to revolve around; to dwell; a sojourn) + γένεσις (genesis) (creation; manner of birth; origin, source).  The construct of the suffix was from the primitive Indo-European ǵenh- (to beget; to give birth; to produce) + -ῐ́ᾱ (-íā) (the suffix used to form feminine abstract nouns).

Lindsay Lohan and her lawyer in court, Los Angeles, December, 2011.

In biology, the word was in the nineteenth century adopted to describe “an apparent repetition, during the development of a single embryo, of changes that occurred previously in the evolution of its species” and came directly from the German Palingenesis (the first papers published in Berlin).  In geology & vulcanology, it was used to mean “regeneration of magma by the melting of metamorphic rocks” and came from the Swedish palingenes (which, like the German, came from the Greek).  In the study of history, palingenesis could be used to describe (often rather loosely) the recurrence of historical events in the same order, the implication being that this was the natural pattern of history which would emerge if assessed over a sufficiently long time.  When such things used to be part of respectable philosophy, it was used to mean “a spiritual rebirth through the transmigration of the soul”, a notion which exists in some theological traditions and it has an inevitable attraction for the new-age set.

The Death of Seneca (1773), oil on canvas by Jacques-Louis David (1748–1825), Petit Palais, Musée Des Beaux-Arts, De La Ville De Paris, France.  Lucius Annaeus Seneca (Seneca the Younger, (circa 4 BC–65 AD)) was one of the best known of the Roman Stoics and the painting is a classic example of the modern understanding of stoicism, Seneca calmly accepting being compelled to commit suicide, condemned after being implicated in a conspiracy to assassinate Nero (37-68; Roman emperor 54-68).  The consensus among historians seems to be Seneca was likely “aware of but not involved in” the plot (a la a number of the Third Reich's generals & field marshals who preferred to await the outcome of the July 1944 plot to assassinate Adolf Hitler (1889-1945; Führer (leader) and German head of government 1933-1945 & head of state 1934-1945) before committing themselves to the cause).  There are many paintings depicting the death of Seneca, most showing him affecting the same air of “resigned acceptance” to his fate.

The Stoics were a group of philosophers whose school of thought was for centuries among the most influential in Antiquity.  Although the word “stoic” is now most often used to refer to someone indifferent to pleasure or pain and who is able gracefully to handle the vicissitudes of life, that’s as misleading as suggesting the Ancient Epicureans were interested only in feasting.  What Stoicism emphasized was living a virtuous life, humans, like every part of the universe, being created and governed by Logos, and it was thus essential at all times to remain in harmony with the universe.  Interestingly, although the notion of ekpyrosis was one of the distinctive tenets of the school, there was a Stoic faction which thought devoting much energy to such speculation was something of a waste and that they should devote themselves instead to the best way to live, harmony with Logos the key to avoiding suffering.  Their ideas live on in notions like “virtue is its own reward”, virtue ultimately more rewarding than indulgence or worldly goods which are mere transitory vanities.

While the speculative theory of an ekpyrotic universe in modern cosmology and the ancient Stoic idea of ekpyrosis both revolve around a cyclical process of destruction and renewal, they differ significantly in detail and the phenomena they describe.  Most significantly, in modern cosmology there’s no conception of this having an underlying motivation, something of great matter in Antiquity.  The modern theory is an alternative to what is now the orthodoxy of the Big Bang theory; it contends the universe did not with a “big bang” (originally a term of derision but later adopted by all) begin from a singular point of infinite density but rather emerged from the collision of two large, parallel branes (membranes) in higher-dimensional space.  In the mysterious brane cosmology, the universe is imagined as a three-dimensional “brane” within a higher-dimensional space (which tends to be called the “bulk”).  It’s the great, cataclysmic collision of two branes which triggers each defining event in the endless cycle of cosmic evolution.  In common with the Stoics, the process is described as cyclical and after each collision, the universe undergoes a long period of contraction, followed by another collision that causes a new expansion.  Thus, elements are shared with the “Big Bang” & “Big Crunch” cycles but the critical variations are (1) there’s no conception of a singularity and (2) although this isn’t entirely clear according to some, time never actually has to “begin”, which critics have called a bit of a “fudge” because it avoids the implications of physical laws breaking down (inherent in the Big Bang’s singularity) and assumes cosmic events occur smoothly (in the sense of physics rather than violence) during brane collisions.

Bust of Marcus Aurelius (121–180; Roman emperor 161-180), Musée Saint-Raymond, Toulouse, France.

Something in the vein of the “philosopher kings” many imagine they’d like to live under (until finding the actual experience less pleasant than they’d hoped), Marcus Aurelius was a Stoic philosopher who has always been admired for his brevity of expression, the stoic world-view encapsulated in his phrases such as “Waste no more time arguing about what a good man should be.  Be one.”, “The happiness of your life depends upon the quality of your thoughts.” and “Our life is what our thoughts make it.”  Marcus Aurelius was the last emperor of the Pax Romana (Roman peace, 27 BC-180 AD), a golden age of Roman imperial power and prosperity.

To the Stoics of Antiquity, ekpyrosis described the periodic destruction of the universe by a great cosmic fire, followed by its rebirth, fire in the Classical epoch a common symbol both of destruction and creation; the Stoic universe was a deterministic place.  In the metaphysics of the ancients, the notion of fire as the central event was not unreasonable because people for millennia had been watching conflagrations which seemed so destructive yet after which life emerged, endured and flourished, the idea being that the conflagration which wrote finis to all was also the primordial fire from which all that was new would be born.  More to the point however, it would be re-born, the Stoics’ idea always being that the universe would re-emerge exactly as it had been before.  The notion of eternal recurrence doesn’t actually depend on the new being the same as the old but clearly, the Greeks liked things the way they were and didn’t want anything to change.  That too was deterministic because it was Logos which didn’t want anything to change.  The Stoics knew all that had been, all that is and all that would be were governed by Logos (rational principle or divine reason) and it was this which ensured the balance, order and harmony of the universe, destruction and re-birth just parts of that.  Logos had motivation and that was to maintain the rational, natural order but in modern cosmology there’s no motivation in the laws of physics, stuff just happens by virtue of their operation.

Friday, October 11, 2024

Floppy

Floppy (pronounced flop-ee)

(1) A tendency to flop.

(2) Limp, flexible, not hard, firm, or rigid; flexible; hanging loosely.

(3) In IT, a clipping of “floppy diskette”.

(4) In historic military slang (Apartheid-era South Africa & Rhodesia (now Zimbabwe)), an insurgent in the Rhodesian Bush War (the “Second Chimurenga” (from the Shona chimurenga (revolution)) 1964-1979), the use being a reference to the way they were (in sardonic military humor) said to “flop” when shot.

(5) In informal use, a publication with covers made with a paper stock little heavier and more rigid than that used for the pages; used mostly for comic books.

(6) In slang, a habitué of a flop-house (a cheap hotel, often used as permanent or semi-permanent accommodation by the poor or itinerant who would go there to “flop down” for a night) (archaic).

(7) In slang, as “floppy cats”, the breeders’ informal term for the ragdoll breed of cat, so named for their propensity to “go limp” when picked up (apparently because of a genetic mutation).

1855-1860: The construct was flop + -y.  Flop dates from 1595–1605 and was a variant of the verb “flap” (with the implication of a duller, heavier sound).  Flop has over the centuries gained many uses in slang and idiomatic form but in this context it meant “loosely to swing; to flap about”.  The sense of “fall or drop heavily” was in use by the mid-1830s and it was used to mean “totally to fail” in 1919 in the wake of the end of World War I (1914-1918), the conflict which wrote finis also to the centuries of dynastic rule of the Romanovs in Russia, the Habsburgs in Austria-Hungary and the Ottomans in Constantinople, although in the 1890s it was recorded as meaning “some degree of failure”.  The comparative is floppier, the superlative floppiest.  Floppy is a noun & adjective, floppiness is a noun, flopped is a noun & verb, flopping is a verb, floppier & floppiest are adjectives and floppily is an adverb; the noun plural is floppies.  The adjective floppish is non-standard and used in the entertainment & publishing industries to refer to something which hasn’t exactly “flopped” (failed) but which has not fulfilled the commercial expectations.

Lindsay Lohan in "floppy-brim" hat, on-set during filming of Liz & Dick (2012).  In fashion, many "floppy-brim" hats actually have a stiff brim, formed in a permanently "floppy" shape.  The true "floppy hats" are those worn while playing sport or as beachwear etc.

The word is used as a modifier in pediatric medicine (floppy baby syndrome; floppy infant syndrome) and as “floppy-wristed” (synonymous with “limp-wristed”) was used as a gay slur.  “Flippy-floppy” was IT slang for “floppy diskette” and unrelated to the previous use of “flip-flop” or “flippy-floppy” which, dating from the 1880s, was used to mean “a complete reversal of direction or change of position” and used in politics to suggest inconsistency.  In the febrile world of modern US politics, to be labelled a “flip-flopper” can be damaging because it carries with it the implication what one says can’t be relied upon and campaign “promises” might thus not be honored.  Whether that differs much from the politicians’ usual behaviour can be debated but still, few enjoy being accused of flip-floppery (definitely a non-standard noun).  The classic rejoinder to being called a flip-flopper is the quote: “When the facts change, I change my mind. What do you do, sir?”  That’s often attributed to the English economist and philosopher Lord Keynes (John Maynard Keynes, 1883-1946) but it was said originally by US economist Paul Samuelson (1915–2009), the 1970 Nobel laureate in Economics.  In the popular imagination Keynes is often the “go to” economist for quote attribution in the way William Shakespeare (1564–1616) is a “go to” author and Winston Churchill (1874-1965; UK prime-minister 1940-1945 & 1951-1955) a “go to” politician, both credited with things they never said but might have said.  In phraseology, the quality of “Shakespearian” or “Churchillian” is not exactly definable but certainly recognizable.  In the jargon of early twentieth century electronics, a “flip-flop” was a reference to switching circuits that alternate between two states.

Childless cat lady Taylor Swift with her “floppy cat”, Benjamin Button (as stole).  Time magazine cover, 25 December 2023, announcing Ms Swift as their 2023 Person of the Year.  "Floppy cat" is the breeders' informal term for the ragdoll breed, an allusion to their tendency to “go limp” when picked up, a behavior believed caused by a genetic mutation.

The other use of flop in IT is the initialism FLOP (floating point operations per second).  Floating-point (FP) arithmetic is a way of handling big real numbers using an integer with a fixed precision, scaled by an integer exponent of a fixed base; FP doesn’t really make possible what would not in theory be achievable using real numbers but it does make such calculations faster and practical, and the concept became familiar in the 1980s when Intel made available FPUs (floating point units, also known as math co-processors) which could supplement the CPUs (central processing units) of their x86 family.  The 8087 FPU worked with the 8086 CPU and others followed (80286/80287, 80386/80387, i486/i487 etc) until eventually the FPU for the Pentium range was integrated into the CPU, the early implementation something of a debacle still used as a case study in a number of fields including management and public relations.
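As a sketch only (Python used purely for illustration; the function and sample values are hypothetical and the snippet ignores the finer IEEE 754 detail), the idea of a value stored as a fixed-precision significand scaled by an integer exponent of base 2 can be shown in a few lines:

```python
# Illustrative only: decompose a float into significand and base-2 exponent,
# the "integer scaled by an exponent of a fixed base" idea described above.
import math

def decompose(x: float) -> tuple[float, int]:
    """Return (m, e) such that x == m * 2**e, with 0.5 <= |m| < 1 for x != 0."""
    return math.frexp(x)

for value in (0.1, 6.25, -1440000.0):
    m, e = decompose(value)
    # math.ldexp reverses the decomposition, confirming the scaling round-trips
    print(f"{value!r} = {m!r} * 2**{e}  ->  {math.ldexp(m, e)!r}")
```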

FLOPS are an expression of specific performance and are used to measure those computations requiring floating-point calculations (typically in math-intensive work) and for purposes of “benchmarking” or determining “real-world” performance under those conditions, it’s a more informative number than the traditional rating of instructions per second (IPS).  FLOPS became something of a cult in the 1990s when the supercomputers of the era first breached the trillion FLOP mark and as speeds rose, the appropriate terms were created (a short conversion sketch in code follows the list):

kiloFLOPS: (kFLOPS, 10³)
megaFLOPS: (MFLOPS, 10⁶)
gigaFLOPS: (GFLOPS, 10⁹)
teraFLOPS: (TFLOPS, 10¹²)
petaFLOPS: (PFLOPS, 10¹⁵)
exaFLOPS: (EFLOPS, 10¹⁸)
zettaFLOPS: (ZFLOPS, 10²¹)
yottaFLOPS: (YFLOPS, 10²⁴)
ronnaFLOPS: (RFLOPS, 10²⁷)
quettaFLOPS: (QFLOPS, 10³⁰)
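The sketch mentioned above is minimal and illustrative only; the function name and the sample figures are invented for the example and simply map a raw FLOP-per-second count onto the prefixes in the list:

```python
# Illustrative only: express a raw FLOP/s figure using the prefixes listed above.
PREFIXES = [
    ("QFLOPS", 1e30), ("RFLOPS", 1e27), ("YFLOPS", 1e24), ("ZFLOPS", 1e21),
    ("EFLOPS", 1e18), ("PFLOPS", 1e15), ("TFLOPS", 1e12), ("GFLOPS", 1e9),
    ("MFLOPS", 1e6), ("kFLOPS", 1e3),
]

def humanise(flops: float) -> str:
    """Scale a FLOP/s count to the largest applicable prefix."""
    for label, scale in PREFIXES:
        if flops >= scale:
            return f"{flops / scale:.2f} {label}"
    return f"{flops:.0f} FLOPS"

print(humanise(1.2e12))   # "1.20 TFLOPS" -- roughly the 1990s "trillion FLOP" mark
print(humanise(2.5e18))   # "2.50 EFLOPS"
```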

In the mysterious world of quantum computing, FLOPs are not directly applicable because the architecture and methods of operation differ fundamentally from those of classical computers.  Rather than FLOPs, the performance of quantum computers tends to be measured in qubits (quantum bits) and quantum gates (the operations that manipulate qubits).  The architectural difference is profound and explained with the concepts of superposition and entanglement:  Because a qubit simultaneously can represent both “0” & “1” (superposition) and these can be entangled (a relationship in which distance is, at least in theory, irrelevant), under such parallelism, performance cannot easily be reduced to simple arithmetic or floating-point operations which remain the domain of classical computers which operate using the binary distinction between “0” (off) and “1” (on).
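As a toy illustration only (a NumPy state-vector calculation, an assumption of convenience rather than a representation of how quantum hardware actually works), the arithmetic of a single qubit being put into an equal superposition by a Hadamard gate looks like this:

```python
# Illustrative only: a qubit as a 2-element state vector and a Hadamard gate.
import numpy as np

ket0 = np.array([1.0, 0.0])                            # the |0> state
hadamard = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)

superposed = hadamard @ ket0                           # apply the gate
probabilities = np.abs(superposed) ** 2                # Born rule: |amplitude|^2

print(superposed)      # [0.7071... 0.7071...] -- simultaneously "0" and "1"
print(probabilities)   # [0.5 0.5] -- equal chance of measuring either
```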

Evolution of the floppy diskette: 8 inch (left), 5¼ inch (centre) & 3½ inch (right).  The track of the floppy for the past half-century has been emblematic of the IT industry in toto: smaller, higher capacity and cheaper.  Genuinely, one of the parameters for the 3½ inch design was that it fit into a man's shirt pocket.

In IT, the term “floppy diskette”, describing a device which used the WORM (write once, read many, ie "read only" after being written) principle, first appeared in 1971 (soon doubtless clipped to “floppy” although the first known use of this dates from 1974).  The first floppy diskettes were in an 8 inch (203 mm) format which may sound profligate for something with a capacity of 80 kB (kilobyte) but the 10-20 MB (megabyte) hard drives of the time were typically the same diameter as the aperture of a domestic front-loading washing machine so genuinely they deserved the diminutive suffix (-ette, from the Middle English -ette, a borrowing from the Old French -ette, from the Latin -itta, the feminine form of -ittus.  It was used to form nouns meaning a smaller form of something).  They were an advance also in convenience because until they became available, the usual way to transfer files between devices was to hard-wire them together.  Introduced by IBM in 1971, the capacity was two years later raised to 256 kB and by 1977 to a heady 1.2 MB (megabyte) with the advent of a double-sided, double-density format.  However, even then it was obvious the future was physically smaller media and in 1978 the 5¼ inch (133 mm) floppy debuted, initially with a formatted capacity of 360 kB but by 1982 this too had been raised to 1.2 MB using the technological advance of a HD (high density) format and it was the 5¼ inch floppy which would become the first widely adopted industry “standard” for both home and business use, creating the neologism “sneakernet”, the construct being sneaker + net(work), the image being of IT nerds in their jeans and sneakers walking between various (unconnected) computers and exchanging files via diskette.  Until well into the twenty-first century the practice was far from functionally extinct and it persists even today with the use of USB sticks.

Kim Jong-un (Kim III, b 1982; Supreme Leader of DPRK (North Korea) since 2011) with 3½ inch floppy diskette (believed to be a HD (1.44 MB)).

The meme-makers use the floppy because it has become a symbol of technological bankruptcy. In OS (operating system) GUIs (graphical user interface) however, it does endure as the "save" icon and all the evidence to date does suggest that symbolic objects like icons do tend to outlive their source, thus the ongoing use in IT of analogue, rotary dial phones in iconography and the sound of a camera's physical shutter in smart phones.  Decades from now, we may still see representations of floppy diskettes.

The last of the mainstream floppy diskettes was the 3½ inch (89 mm) unit, introduced in 1983 in double density form with a capacity of 720 kB (although in one of their quixotic moves IBM used a unique 360 kB version for their JX range aimed at the educational market) but the classic 3½ was the HD 1.44 MB unit, released in 1986.  That really was the end of the line for the format because although in 1987 a 2.88 MB version was made available, few computer manufacturers offered the gesture of adding support at the BIOS (basic input output system) so adoption was infinitesimal.  The 3½ inch diskette continued in wide use and there was even the DMF (Distribution Media Format) with a 1.7 MB capacity which attracted companies like Microsoft, not because they wanted more space but as an attempt to counter software piracy; within hours of Microsoft Office appearing in shrink-wrap, copying cracks appeared on the bulletin boards (where nerds did stuff before the www (world wide web)).  It was clear the floppy diskette was heading for extinction although slightly larger versions with capacities as high as 750 MB did appear but, expensive and needing different drive hardware, they were only ever a niche product seen mostly inside corporations.  By the time the CD-ROM (Compact Disc-Read-Only Memory) reached critical mass in the mid-late 1990s the once ubiquitous diskette began rapidly to fade from use, the release in the next decade of USB sticks (pen drives) a final nail in the coffin for most.

In the mid 1990s, installing OS/2 Warp 4.0 (Merlin) with the optional packs and a service pack could require a user to insert and swap up to 47 diskettes.  It could take hours, assuming one didn't suffer the dreaded "floppy failure".

That was something which pleased everyone except the floppy diskette manufacturers who had in the early 1990s experienced a remarkable boom in demand for their product when Microsoft Windows 3.1 (7 diskettes) and IBM’s OS/2 2.0 (21 diskettes) were released.  Not only was the CD-ROM a cheaper solution than multiple diskettes (a remarkably labor-intensive business for software distributors) but it was also much more reliable, tales of an installation process failing on the “final diskette” being legion and while some doubtless were apocryphal, "floppy failure" was far from unknown.  By the time OS/2 Warp 3.0 was released in 1994, it required a minimum of 23 floppy diskettes and version 4.0 shipped with a hefty 30 for a base installation.  Few mourned the floppy diskette and users quickly learned to love the CD-ROM.

What lay inside a 3½ inch floppy diskette.

Unlike optical discs (CD-ROM, DVD (Digital Versatile Disc) & Blu-Ray) which were written and read with the light of a laser, floppy diskettes were read with magnetic heads.  Inside the vinyl sleeve was a woven liner impregnated with a lubricant, this to reduce friction on the spinning media and help keep the surfaces clean.

Curiously though, niches remained where the floppy lived on and it was only in 2019 the USAF (US Air Force) finally retired the use of floppy diskettes which since the 1970s had been the standard method for maintaining and distributing the data related to the nation’s nuclear weapons deployment.  The attractions of the system for the military were (1) it worked, (2) it was cheap and (3) it was impervious to outside tampering.  Global thermo-nuclear war being a serious business, the USAF wanted something secure and knew that once data was on a device in some way connected to the outside world there was no way it could be guaranteed to be secure from those with malign intent (ayatollahs, the Secret Society of the Les Clefs d'Or, the CCP (Chinese Communist Party), the Freemasons, those in the Kremlin or Pyongyang et al) whereas a diskette locked in a briefcase or a safe was, paradoxically, the state of the art in twenty-first century security, the same philosophy which has seen some diplomatic posts in certain countries revert to typewriters & carbon paper for the preparation of certain documents.  In 2019 however, the USAF announced that after much development, the floppies had been retired and replaced with what the Pentagon described as a “highly-secure solid-state digital storage solution” which works with the Strategic Automated Command and Control System (SACCS).

It can still be done: Although no longer included in PCs & laptops, USB floppy diskette drives remain available (although support for Windows 11 systems is said to be "inconsistent").  Even 5¼ inch units have been built.

It thus came as a surprise in 2024 to learn Japan, the nation which had invented motorcycles which didn’t leak oil (the British thought they’d proved that couldn’t be done) and the QR (quick response) code, finally was abandoning the floppy diskette.  Remarkably, even in 2024, the government of Japan still routinely asked corporations and citizens to submit documents on floppies, over 1000 statutes and regulations mandating the format.  The official in charge of updating things (in 2021 he’d “declared war” on floppy diskettes) in July 2024 announced “We have won the war on floppy disks!” which must have been satisfying because he’d earlier been forced to admit defeat in his attempt to defenestrate the country’s facsimile (fax) machines, the “pushback” just too great to overcome.  The news created some interest on Japanese social media, one tweet on X (formerly known as Twitter) damning the modest but enduring floppy as a “symbol of an anachronistic administration”, presumably as much a jab at the “tired old men” of the ruling LDP (Liberal Democratic Party) as the devices.  There may however have been an element of technological determinism in the reform because Sony, the last manufacturer of the floppy, ended production of them in 2011 so while many remain extant, the world’s supply is dwindling.  In some ways so modern and innovative, in other ways Japanese technology sometimes remains frozen, many businesses still demanding official documents be endorsed using carved personal stamps called the 印鑑 (inkan) or 判子 (hanko); despite the government's efforts to phase them out, their retirement is said to be proceeding at a “glacial pace”.  The other controversial aspect of the hanko is that the most prized are carved from ivory and it’s believed a significant part of the demand for black-market ivory comes from the hanko makers, most apparently passing through Hong Kong, for generations a home to “sanctions busters”.

Thursday, October 10, 2024

Malevolent, malicious & malignant

Malevolent (pronounced muh-lev-uh-luhnt)

(1) Wishing evil or harm to another or others; showing ill will; ill-disposed; malicious.

(2) Evil; harmful; injurious.

(3) In astrology, a force evil or malign in influence.

1500–1510:  From the Middle English malevolent (suggested by Middle English malevolence (analyzed of late as “male violence”)), from the Old French malivolent and the Latin malevolentem, the construct being male (badly, ill, wrongly) + volens (wanting, willing, wishing), the present participle of velle (to want, wish for, desire).  The most commonly used form in Latin appears to have been malevolēns (ill-disposed, spiteful).  Upon entering English in the sixteenth century, the word retained this sense of ill will or harmful intent.  The adjective malevolent (having an evil disposition toward another or others, wishing evil to others) dates from the early sixteenth century while the noun malevolence (the character of being ill-disposed toward another or others; ill-will, malice, personal hatred) was in use by the mid-fifteenth, from the Old French malevolence and directly from the Latin malevolentia (ill-will, dislike, hatred), from malevolentem (nominative malevolens) (ill-disposed, wishing ill, spiteful, envious).  The antonym is benevolent and the usual negative forms are unmalevolent & non-malevolent.  Malevolent is an adjective, malevolence is a noun and malevolently is an adverb; the noun plural is malevolences.

The writings of Russian-American author & mystic Helena Petrovna Blavatsky (often styled Madame Blavatsky (1831-1891; co-founder of the Theosophical Society (1875)) were in the nineteenth century influential in non-mainstream theology and philosophy circles.  Her work included exploring how "the horrifying principles and malignant influence of the Society of Jesus [the Jesuit Order, a Roman Catholic cult] are brought out in the open for all to see, hitherto secret ciphers of the so-called higher Masonic degrees revealed, examples of Jesuit cryptography exposed, and a High Mason’s critical strictures upon Masonry itself articulated."  In July 1773, Clement XIV (1705–1774; pope 1769-1774), acting on a request from many governments disturbed by the Jesuits’ plotting and scheming, issued the brief Dominus ac Redemptor (Lord and Redeemer) which dissolved the cult.  However, the Jesuits went underground and conducted a masonic-like infiltration of the Church which culminated in the pressure exerted on Pius VII (1742–1823; pope 1800-1823) who in 1814 issued the papal bull Sollicitudo omnium ecclesiarum (The care of all Churches) allowing the order to be re-established and resume its Masonic ways.

Malicious (pronounced muh-lish-uhs)

(1) Full of, characterized by, or showing malice; intentionally harmful; spiteful.

(2) In common law jurisdictions, vicious, wanton, or mischievous in motivation or purpose (often in statute as an “aggravating circumstance”).

(3) In common law jurisdictions as malicious prosecution, an intentional tort which arises from a party (1) intentionally and maliciously instituting or pursuing (or causing to be instituted or pursued) a legal action (civil or criminal) that is (2) brought without probable cause and (3) dismissed in favor of the other party.  It belongs sometimes to the class of actions called “abuse of process”.

(4) In common law jurisdictions as “malicious mischief”, the willful, wanton, or reckless destruction of the personal property of another occasioned by actual ill will or resentment toward the owner or possessor of such property.

1175–1225: From the Middle English malicious (which may have existed in the Old English as malicius but this is contested), from the Old French malicios (showing ill will, spiteful, wicked (which persists in Modern French as malicieux)) from the Latin malitiōsus (wicked, malicious), the construct being maliti(a) (badness; ill will; spite), from malus (bad; evil) + -osus.  In Latin, the -ōsus suffix was added to a noun to form an adjective indicating an abundance of that noun.  The Middle English form displaced the earlier native Middle English ivelwilled & ivelwilly (malicious), both related to the Old English yfelwillende (literally “evil-willing”).  In early fourteenth century Anglo-French legal language, it meant “characterized by malice prepense”, essentially little different from the sense “malicious” today enjoys in statute in common law jurisdictions.  The adverb maliciously (in a spiteful manner, with enmity or ill-will) emerged in the late fourteenth century while the noun maliciousness (extreme enmity or disposition to injure; actions prompted by hatred) was in use a few decades later.  The spelling malitious is obsolete.  The usual negative forms are non-malicious & unmalicious but lexicographers note also the use of semi-malicious & quasi-malicious, forms adopted presumably when some nuance of the evil done seems helpful.  At the other end of the scale of maliciousness, the comparative is more malicious and the superlative most malicious.  Malicious is an adjective, maliciousness is a noun and maliciously is an adverb.

Malignant (pronounced muh-lig-nuhnt)

(1) Disposed to cause harm, suffering, or distress deliberately; feeling or showing ill will or hatred.

(2) Very dangerous or harmful in influence or effect.

(3) In pathology, tending to produce death.

(4) In medicine (usually of cells or a tumor), characterized by uncontrolled growth; cancerous, invasive, or metastatic.

1540s: From the Middle French malignant, from the Late Latin malignantem (nominative malignans) (acting from malice), stem of malignāns, present participle of malignāre (to act maliciously; to behave with malign intent) and malignō (to malign, viciously to act).  The English malign (evil or malignant in disposition, nature, intent or influence) was from the Middle English maligne, from the Old French maligne, from the Latin malignus, the construct being malus (bad) + -gnus (born), from gignere (to bear, beget) from the primitive Indo-European root gene- (give birth, beget).  In medicine (of tumors and such), the antonym is “benign” but non-malignant & unmalignant both exist as does semi-malignant which sounds strange to non-clinical ears but which is used apparently with the sense of “not very malignant”, presumably something of a comfort to a patient.  The most common distinction in medicine seems to be between “malignant” and “benign” and this provided the author Evelyn Waugh (1903-1966) with one of his better jabs.  Learning that the notoriously obnoxious Randolph Churchill (1911-1968) had been operated on after a tumor was found, when told it had been removed and sent for an analysis which proved it “benign”, he observed: “What a miracle that modern medicine could find the only part of Randolph that is not malignant and then remove it.”  Malignant is an adjective, malignancy & malignance are nouns, malignantly is an adverb; the noun plural is malignancies.

Lindsay Lohan and her lawyer in court, Los Angeles, December, 2011.

The word entered the medical jargon in the 1560s but the earlier use was as a theological slur, the Church describing as malignant “those damnable followers of the antichrist” in the ecclesiam malignantum (best translated as “Church of the Wicked”), a concept found in many writings in early Christian thought, particularly among certain groups that emphasized the contrast between the true, faithful Church and those who they believed were corrupt or evil within the broader Christian community.  The theme continues to this day and can be identified as the source of many schisms and internecine conflicts within and between many religions.  The term existed in a number of Latin Christian writings, often linked to Augustinian theology.  Saint Augustine of Hippo (354–430), in his work attacking the Donatists (a Christian sect which in the fourth century forced a schism in the Church of Carthage), referenced the ecclesia malignantium to describe those within the Church who were corrupt or sinful, in contrast to the ecclesia sancta (the holy Church).  It was Augustine who constructed the influential doctrine that while within the Church there could be both saints and sinners, ultimately the Church itself remained holy, an interesting proto-structuralism upon which churches of many denominations to this day fall back in their handling of clerical scandals.

The ecclesia malignantium was used metaphorically to contrast the “true” Church (those who genuinely followed Christ) with those who may have been Christian in name but acted in ways that were contrary to Christian teachings, thus aligning themselves with evil or wickedness.  In the secular world, the model is not unfamiliar, a modern example being those in the US Republican Party not judged sufficiently “pure” by the right-wing fanatics being labeled “RINOs” (Republicans in Name Only), an idea Saint Augustine would have recognized.  So, faith and politics can both be binary exercises, those judged heretical, schismatic, or in some way morally corrupt being a malignant presence in the community and needing to be excised as swiftly as the surgeon’s scalpel slices out a malignant tumor.  During the sixteenth century Protestant Reformation in Europe, the language was re-purposed, by the 1540s used by protestant theologians and activists to condemn as heretics the pope and the Church in Rome.  By the 1590s, malignant was in use to mean (of persons) “disposed to inflict suffering or cause distress” whereas in the early fourteenth century “malign” was used as an adjective and the now extinct malignous meant “poisonous, noxious”.  The noun malignancy dates from circa 1600 and by mid century had come to mean “state of extreme malevolence, bitter enmity”, the particular use in medicine (of diseases, growths, tumors etc with a virulence and tendency to get worse) appearing in the medical literature from the 1680s.  In English history, borrowing from the turbulent priests, both the followers of Oliver Cromwell (1599–1658; Lord Protector of the Commonwealth 1653-1658) and the royalist forces would label each other “malignants”.

In English, “mal-” words are familiar.  The mal- prefix was from the Old French mal- (bad; badly) from the Latin adverb male, from malus (bad, wicked).  In English the prefix was applied to create words variously with some denotation of the negative including (1) bad, badly (malinfluence), (2) unhealthy; harmful (malware), (3) unpleasant (malodorous), (4) incorrect (malformed), (5) incomplete (maldescent) & (6) deficiently (malnourished).  Malevolent, malicious & malignant are from a different lineage but all are in some way negative in nature and there are differences between them:  Malevolent means “having or showing a desire to cause harm to others” and carries the connotation of “a deep-rooted ill will or hatred”.  Malicious means “intending to do harm, typically without justification” and connotes something of an emphasis on a “spiteful or cruel intent”.  Malignant means “harmful, dangerous, or likely to cause death” and while historically it was used to refer to “extreme malevolence”, the use in medicine has in the modern age tended to make that use almost exclusive although it can still be used of anything (or anyone) actively harmful or evil.  So in use, the modern tendency is for malevolent to be used of “ill will or hatred”, malicious of “an intent to cause harm” and malignant of “something that is dangerously harmful, often in a physical or medical context”.  The related "malign" seems mostly to be used of intent and harmful speech.  Which to use hangs also on intent; if someone is murdered by the Freemasons, it’s not unreasonable to suppose the intent was malicious and the act malevolent but had they been eaten by a shark while swimming, neither word should be invoked because that’s just a thing sharks do.

Wednesday, October 9, 2024

Decker

Decker (pronounced dek-er)

(1) Something (typically a bus, ship, aircraft, bed, sandwich et al) having a specified number of decks, floors, levels, layers and such, used usually in combination with a numerical or other expression indicating the number in the construction: double decker, triple decker, upper decker, five decker etc (sometimes hyphenated).

(2) As “table decker” an employee who “decks” (ie sets or adorns) a table used for entertaining (used also as a “coverer”) (archaic).  The idea lives on in the verb “bedeck” (to adorn).

(3) In boxing slang, a fighter with a famously powerful punch, able to “deck” an opponent (ie knock them to the canvas with a single punch).

(4) In historic naval slang, as “quarter-decker”, a label applied to officers known more for their attention to matters of etiquette or trivial regulations than competent seamanship or ability in battle.  It was an allusion to a warship’s “quarter deck” (the part of the spar-deck of a man-of-war (warship) between the poop deck and main-mast (and originally (dating from the 1620s), a smaller deck above the half-deck, covering about a quarter of the vessel’s LOA (length overall)).  In many navies, the quarter-deck was reserved as “a promenade for officers only”.

1785–1795: The construct was deck + -er.  Deck in this context was from the Middle English dekke (covering extending from side to side over part of a ship), from a nautical use of the Middle Dutch decke & dec (roof, covering), from the Middle Dutch decken, from the Proto-Germanic thakam (source also of the noun “thatch”), from the primitive Indo-European root steg- & teg- (to cover) and the Old Dutch thecken, from the Proto-West Germanic þakkjan, from the Proto-Germanic þakjaną and related to the German Decke (covering, blanket).  The –er suffix was from the Middle English –er & -ere, from the Old English -ere, from the Proto-Germanic -ārijaz, thought most likely to have been borrowed from the Latin –ārius where, as a suffix, it was used to form adjectives from nouns or numerals.  In English, the –er suffix, when added to a verb, created an agent noun: the person or thing doing the action indicated by the root verb.  The use in English was reinforced by the synonymous but unrelated Old French –or & -eor (the Anglo-Norman variant -our), from the Latin -ātor & -tor, from the primitive Indo-European -tōr.  When appended to a noun, it created the noun denoting an occupation or describing the person whose occupation is the noun.  The noun double-decker was first used in 1835 of ships with two decks above the water line and this extended to land transport (trains) in 1867.

Flight deck of the US Navy's Nimitz-class aircraft carrier USS Carl Vinson (CVN 70).

The reason ships, trains, buses, aircraft and such have "decks" while buildings have "floors” or “stories (or storeys)” is traceable to nautical history and the nomenclature used in shipbuilding.  English picked up “deck” from the Middle Dutch decke & dec (roof, covering) where the use had been influenced by the Old Norse þekja (to cover) and in early shipbuilding, a “deck” was the structure which covered the hull of the ship, providing both a horizontal “working surface” and enclosing the vessel, creating a space for stores, cargo or accommodation which was protected from the elements.  In that sense the first nautical decks acted as a “roof”.  As ships became larger, the nautical architects began to include multiple decks, analogous with the floors of buildings in that they fulfilled a similar function, providing segregated layers (ie the storeys in buildings) used for cannons, crew quarters, storage and such.  As the terminology of shipbuilding became standardized, each deck came to have a specific name depending on its purpose or position (main deck, flight deck, poop deck, gun deck etc).

Ford Mustang convertible (1965–1973) replacement floor pan (complete, part number 3648B) by Moonlight Drive Sheet Metal.

Until the nineteenth century, although the vehicles used on land became larger, they tended to get longer rather than higher but the advent of steam propulsion made possible trains which ran on railways and these could pull carriages carrying freight or passengers.  The first “double decker” versions appeared in France in 1867 and, described as voitures à impériale (imperial cars), were used on the Chemin de Fer de l'Ouest (Western Railway), the upper deck roofless and thus an “open-air experience”.  Rapidly, the idea spread and double-deck carriages became common for both long-distance and commuter services.  An outlier in the terminology is car design; cars have a floor (sometimes called the “floor pan”) rather than a deck, presumably because there’s only ever one.  In the narrow technical sense there have been cars with “two floors” but they were better understood as a “double-skinned” single floor and they were used for armor or to provide a space for something specialized such as hydrogen fuel-cells, the technique often called “sandwich construction”.

Boeing 314 Clipper flying boat cutaway (left) and front schematics of Boeing 747-300 (right).  Re-using some of an earlier design for a bomber which failed to meet the military’s performance criteria, between 1938-1941, Boeing built twelve 314 Clippers, long-range flying boats with the range to cross both the Atlantic and Pacific oceans.  Although used by the military during World War II, most of their service was with the two commercial operators Pan Am (originally Pan American Airways) and BOAC (British Overseas Airways Corporation).  Very much a machine of the pre-war age, the last Clippers were retired from service between 1946-1948, the advances in aviation and ground infrastructure built during war-time rendering them obsolete and prohibitively expensive to maintain.

Because train designers adopted the nautical terminology, it naturally came to be used also in buses and aircraft, the term “flight deck” (where the pilot(s) sat) common even before multiple decks appeared on flying boats and other long-distance airframes.  The famous “bubble” of the Boeing 747 (1968-2023) remains one of the best known decks and although most associated with the glamour of first-class international travel, was designed originally as a freight compartment.  The multi-deck evolution continued and the Airbus A380 (2005-2021) was the first “double decker” with two passenger decks extending the full length of the fuselage, cargo & baggage carried in the space beneath, hence the frequent description of the thing as a “triple decker”.

Lindsay Lohan contemplating a three decker sandwich, now usually called a “club sandwich”.  Many menus do specify the number of decks in the clubs.

Deck widely was used of any raised flat surface which people could walk or stand upon (balcony, porch, patio, flat rooftop etc) and came to be used of the floor-like covering of the horizontal sections or compartments of a ship, a use later extended to land transport (trains, buses etc) and in the twentieth century, to aircraft.  A pack or set of playing cards can be called a deck as, less commonly, can the dealt cards which constitute the “hand” of each player and the notion was extended to sets of just about anything vaguely similar (such as a collection of photographic slides).  Because slides tended to be called a “deck” only when in their magazine, this influenced the later use in IT when certain objects digitally were assembled for storage or use and in audio and video use when cartridges or cassettes were loaded into “tape decks”.  In print journalism, a deck is a headline consisting of one or more full lines of text (applied especially to a sub-headline).  The slang use in the trade of illicit narcotics to describe the folded paper used for distributing drugs was a US regionalism.  There are dozens of idiomatic and other uses of deck, the best known including “all hands on deck”, “swab the decks”, “hit the deck”, “clear the decks”, “deck-chair”, “deckhand”, “deck shoes”, “flight deck”, “gun deck”, “observation deck”, “play with a full deck”, “promenade deck”, “re-arranging the deck chairs on the Titanic”, “decked out”, “stack the deck”, “sun deck”, “top deck” & “to deck someone”.

Schematic of the Royal Navy’s HMS Victory, a 104-gun first-rate ship of the line, laid down in 1759 and launched in 1765, most famous as Admiral Lord Nelson’s (1758-1805) flagship at the Battle of Trafalgar on 21 October 1805; it was on her that Nelson was killed in battle.  Uniquely, after 246 years on the active list, she is the world's oldest naval vessel still in commission.  Although the term wasn’t in use until the 1830s, Victory was a “five decker” configured thus:

Orlop Deck: The lowest deck, mainly used for storage and ship's equipment.
Lower Gun Deck: The deck housing the heaviest cannons.
Middle Gun Deck: This deck contained another set of guns, slightly lighter than those on the lower gun deck.
Upper Gun Deck: The third level of guns, with even lighter cannons.
Quarterdeck and Forecastle: The uppermost decks, where the captain and officers usually directed the ship during battle.

The early meanings in English evolved from “covering” to “platform of a ship” because of the visual similarity and it’s thought the idea of a deck being a “pack of cards” (noted in the 1590s) was based on them being stacked like the decks of a multi-deck man-of-war (warship).  The tape-deck was first so described in 1949 and was a reference to the flat surface of the old reel-to-reel tape recorders.  The first deck chairs were advertised in 1844, an allusion to the use of such things on the decks of passenger ocean liners and deck shoes were those with sturdy rubber soles suitable for use on slippery surfaces; the modern “boat shoes” are a descendant.  The old admiralty phrase “clear the decks” dated from the days of the tall-masted warships (the best known of which was the big “ship-of-the-line”) and was a reference to the need to remove from the main deck the wreckage resulting from an attack (dislodged masts, sails, spars etc) to enable the battle to be rejoined without the obstructions.  Being made of wood, the ships were hard to sink but highly susceptible to damage, especially to the rigging which, upon fragmentation, tended to fall to the deck.  It may have been an adaptation of the French army slang débarrasser le pont (clear the bridge).

Ford 302 cubic inch (4.9 litre) Windsor V8 with the standard deck (left) and the raised deck 351 (5.8) (right).  In production in various displacements between 1961-2000, the 221 (3.6), 255 (4.2), 260 (4.3), 289 (4.7) & 302 (4.9) all used what came retrospectively to be called the “standard deck” while the 351 (5.8) was the sole “raised deck” version.

For decades, it was common for US manufacturers to increase the displacement of their V8 engines by means of creating a “raised deck” version, the process involving raising the height of the engine block's deck surface (the surface where the cylinder heads bolt on).  What this allowed was the use of longer connecting rods while retaining the original heads and pistons which, in combination with a “longer stroke crankshaft”, increases the displacement (the aggregate volume of all cylinders).  The industry slang for such things was “decker” and the technique was used with other block configurations but is best known from the use in the 1960s & 1970s for V8s because it’s those which tend to be fetishized.  The path to greater displacement lay either in lengthening the stroke or increasing the bore (or a combination of the two) and while there were general engineering principles (longer stroke=emphasis on more torque at the cost of reducing maximum engine speed and bigger bore=more power and higher engine speeds), there were limitations in how much a bore could safely be increased, including the available metal.  A bigger bore (ie increasing the internal diameter of the cylinder) reduces the thickness of the cylinder walls and if they become too thin, there can be problems with cooling, durability or even the structural integrity of the block.  The piston size also increases which means the weight increases and thus so too does the reciprocating mass, increasing friction and wear and with the potential to compromise reliability, especially at high engine speeds.
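A back-of-envelope sketch of the arithmetic (the bore & stroke figures used are the commonly quoted ones for the Windsor 302 and the raised-deck 351, included here for illustration rather than as definitive specifications):

```python
# Illustrative only: displacement = cylinder cross-section x stroke x cylinder count.
import math

def displacement_ci(bore_in: float, stroke_in: float, cylinders: int = 8) -> float:
    """Total swept volume in cubic inches for a given bore, stroke and cylinder count."""
    return math.pi * (bore_in / 2) ** 2 * stroke_in * cylinders

# Same 4.00 inch bore; only the stroke (and hence the deck height) changes.
print(round(displacement_ci(4.00, 3.00), 1))   # ~301.6 -> sold as the 302
print(round(displacement_ci(4.00, 3.50), 1))   # ~351.9 -> sold as the 351
```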

Increasing the stroke will usually enhance the torque output, something of greater benefit to most drivers, most of the time, than the “top end power” most characteristic of the “big bore” approach.  In street use, most engines spend most time at low or mid-range speed and it’s here a longer stroke tends to produce more torque so it has been a popular approach and the advantage for manufacturers is that creating a “decker” almost always is easier, faster and cheaper than arranging one which will tolerate a bigger bore, something which can demand a new block casting and sometimes changes to the physical assembly line.  With a raised deck, there can be the need to use different intake and exhaust manifolds and some other peripheral components but it’s still usually a cheaper solution than a new block casting.  Ford’s “thinwall” Windsor V8 was one of the longest-serving deckers (although the raised-deck version didn’t see out the platform’s life, the 351 (introduced in 1969) retired in 1997).  Confusingly, during the Windsor era, Ford also produced other 351s which belonged to a different engine family.  Ford didn’t acknowledge the biggest Windsor's raised deck in its designation but when Chrysler released a decker version of the “B Series” big-block V8 (1958-1978), it was designated “RB” (Raised B) and produced between 1959-1979.

1964 AEC Routemaster double decker Bus RM1941 (ALD941B) (left), two sightseeing AEC Routemasters in Christchurch, New Zealand (centre) and one of the "new" Routemasters, London 2023 (right).

London’s red, double-decker buses are one of the symbols most associated with the city and a fixture in literature, art and films needing something with which to capture the verisimilitude.  The classic example of the breed was the long-running AEC Routemaster, designed by the London Transport Board and built by the Associated Equipment Company (AEC) and Park Royal Vehicles.  The Routemaster entered service in 1956 and remained in production until 1968, changed over those years in many details but visually there was such continuity that it takes an expert (and buses are a thing so experts there are) to pick the model year.  They remained in regular service until 2005 although some were retained as “nostalgia pieces” on designated “tourist” routes until COVID-19 finally saw their retirement; since then, many have been repurposed for service around the world on sightseeing duties and other tourist projects.

Boris Johnson (b 1964; UK prime-minister 2019-2022) will leave an extraordinary political legacy which in time may come to be remembered more fondly than it now may appear but one of his most enduring achievements is likely to be the “New Routemaster” which had the typically bureaucratic project name “New Bus for London” but came to be known generally as the “Boris Bus”, the honor accorded by virtue of him championing the idea while serving as Mayor of London (2008-2016).  In truth, the original Routemaster, whatever its period charm, was antiquated years before it was withdrawn from service and although the doorless design made ingress and egress convenient, it was also dangerous and apparently a dozen passenger fatalities annually was not uncommon.  The Boris Bus entered service in 2012 and by 2024 almost 1200 were in service.

1930 Lancia Omicron with 2½ deck coachwork and a clerestoried upper windscreen (left) and a “three decker” bus in Pakistan (right).

The Lancia Omicron was a bus chassis produced between 1927-1936; over 600 were built in different wheelbase lengths with both two and three-axle configurations.  Most used Lancia's long-serving, six-cylinder commercial engine but, as early as 1933, some had been equipped with diesel engines which were tested in North Africa where they proved durable and, in the Sudan, Ethiopia, Libya and Algeria, once petrol-powered Omicron chassis were being re-powered with diesel power-plants from a variety of manufacturers as late as the 1960s.  Typically for bus use, coachbuilders fabricated many different styles of body but, in addition to the usual single and double deck arrangements, the Omicron is noted for a number of two and a half deck models, the third deck configured usually as a first-class compartment but in at least three which operated in Italy, they were advertised as “smoking rooms”, the implication presumably that the rest of the passenger compartment was smoke-free.  History doesn't record if the bus operators were any more enthusiastic about or successful in enforcing smoking bans than the usual Italian experience.  For a variety of reasons, buses with more than 2.something decks were rare and the Lancias and Alfa Romeos which first emerged in the 1920s were unusual.  However, the famously imaginative and inventive world of Pakistani commerce has produced a genuine “three decker” bus, marketed as the “limousine bus”.  What the designer did was take a long-distance, double decker coach and use the space usually allocated to a luggage compartment to configure as the interior of a long wheelbase (LWB) limousine, thereby creating a “first class” section, the four rows of seating accessible via six car-like (ie limousine) doors.