
Wednesday, January 25, 2023

Schizophrenia

Schizophrenia (pronounced skit-suh-free-nee-uh or skit-suh-freen-yuh)

(1) In psychiatry (formerly also called dementia praecox), a severe mental disorder characterized by some, but not necessarily all, of the following features: withdrawal from reality, illogical patterns of thinking, delusions and hallucinations, together with varying degrees of emotional blunting, intellectual deterioration, social isolation, and disorganized speech and behavior.

(2) A state characterized by the coexistence of contradictory or incompatible elements; in informal use, behavior that appears to be motivated by contradictory or conflicting principles.

(3) In informal use, used to suggest a split personality, identity or other specific forms of dualism.  In popular usage, the term is often confused with dissociative identity disorder (also known as multiple personality disorder).

1908: From the German Schizophrenie, from the New Latin schizophrenia, coined by the Swiss psychiatrist Eugen Bleuler (1857-1939) as an umbrella term covering a range of more or less severe mental disorders involving a breakdown of the relation between thought, emotion, and action, literally "a splitting of the mind", the construct being the Ancient Greek σχίζω (skhizein or skhízō) (to split), from the primitive Indo-European root skei- (to cut, split apart), + φρήν (phrḗn) (genitive phrenos) (mind, heart, diaphragm) + -ia (the suffix from the Latin -ia and Ancient Greek -ία (-ía) & -εια (-eia), which forms abstract nouns of feminine gender).  It's from phrḗn that English gained phrenes (wits, sanity) and hence phreno-.

The adjective schizophrenic (characteristic of or suffering from schizophrenia) dates in the medical literature from 1912 (in English translations of Bleuler's publications) and was immediately adopted also as a noun (schizophrenic patient).  That survived but another noun formation in English was schizophrene, which emerged in 1925, the form presumably a tribute to Dr Bleuler's original work having been written in German.  As such things became more publicized during the post-war years (and were picked up in popular culture, including film and novels), the transferred adjectival sense of "contradictory, inconsistent" emerged in the mid 1950s, applied to anything from the behavior of race horses and motor-cycles to the nature of musical composition.  The jargon of psychology also produced schizophrenogenic (tending to spark or inspire schizophrenia).  The adjective schizoid (resembling schizophrenia; tending sometimes to less severe forms of schizophrenia) dates from 1925, from the 1921 German coining schizoid, the construct being schiz(ophrenia) + -oid.  The suffix -oid was from a Latinized form of the Ancient Greek -ειδής (-eidḗs) & -οειδής (-oeidḗs) (the “ο” being the last vowel of the stem to which the suffix is attached), from εἶδος (eîdos) (form, shape, likeness).  It was used (1) to denote resembling; having the likeness of (usually including the concept of not being the same despite the likeness, but counter-examples exist), (2) to mean of, pertaining to, or related to and (3) when added to nouns to create derogatory terms, typically referring to a particular ideology or group of people (by means of analogy to psychological classifications such as schizoid).  Schizophrenia is a noun, schizophrenic & schizoid are nouns & adjectives and schizophrenically is an adverb; the noun plural is schizophrenics.

Madness

Within the profession of psychiatry, schizophrenia has a long (and technical) definitional history although, in essence, it’s always been thought a severe and chronic mental disorder characterized by disturbances in thought, perception and behavior.  Lacking any physical or laboratory test, it can be difficult to diagnose because schizophrenia involves a range of cognitive, behavioral, and emotional symptoms.  In the fifth edition of the American Psychiatric Association's (APA) Diagnostic and Statistical Manual of Mental Disorders (DSM-5 (2013)), the lifetime prevalence of schizophrenia is noted as 0.3%-0.7%, the psychotic manifestations typically emerging between the mid-teens and mid-thirties, with the peak age of onset of the first psychotic episode in the early to mid-twenties for males and the late twenties for females.  The DSM-5 editors also made changes to the criteria for a diagnosis of schizophrenia, the most significant amendments since DSM-III (1980).

Madness and Modernism: Insanity in the Light of Modern Art, Literature, and Thought (1992), by US clinical psychologist Louis A Sass (b 1949), was an exploration of why mystery continues to shroud schizophrenia, which, despite advances in biological psychiatry and neuroscience, appears little changed in the quarter-century since.  Sass quoted approvingly a description of schizophrenia as "a condition of obscure origins and no established etiology, pathogenesis and pathology…" without "…even any clear disease marker or laboratory test by which it can readily be identified."

However, in a departure from most writings on mental illness, Sass explored the "striking similarities" between the seemingly bizarre universe of schizophrenic experiences and the sensibilities and structures of consciousness revealed in the works of modernist artists and writers such as Kafka, Valery, Beckett, Robbe-Grillet, de Chirico and Dali.  Applying the techniques of psychology to modernism, he traced similar cognitive configurations reflected in schizophrenia and modern art & literature, finding both artist and schizophrenic characterized by a pronounced thrust to deconstruct the world and subjectively to reconstruct human experience without reference to objective reality.  Layers of reality, real and constructed, co-exist and interact, frequently fusing into each other, producing an acute self-awareness Sass called "hyperreflexivity", as well as a profound sense of alienation from the empirical world.  Sass allowed his analysis to reach its logical conclusion, that there is a tenuous, though clearly discernible, connection between modern culture and madness, speculating that insanity might be “…a disease of certain highly advanced forms of cultural organization, perhaps a part of the price we pay for civilization?"  His thesis wasn’t without critics although most acknowledged Madness and Modernism was a literary classic.

Duncan's Ritual Of Freemasonry (2021 edition) by Malcolm C Duncan, Lushena Books, 288 pp, ISBN-10: 1631829904.

As the DSM makes clear, not all schizophrenics are the same.  In 2011, Lindsay Lohan was granted a two-year restraining order against alleged stalker David Cocordan.  The order was issued some days after she filed a complaint with police who, after investigation by their Threat Management Department, advised the court Mr Cocordan (who at the time had been using at least five aliases) “suffered from schizophrenia”, was “off his medication” and had a “significant psychiatric history of acting on his delusional beliefs.”  That was worrying enough but Ms Lohan may have revealed her real concerns in an earlier post on Twitter in which she included a picture of David Cocordan, claiming he was "the freemason stalker that has been threatening to kill me- while he is TRESPASSING!"  Being stalked by a schizophrenic is bad enough but the thought of being hunted by a schizophrenic Freemason is truly frightening.  Apparently an unexplored matter in the annals of psychiatry, it seems the question of just how schizophrenia might particularly manifest in Freemasons awaits research, so there may be a PhD there for someone.

Thursday, September 14, 2023

Defenestration

Defenestration (pronounced dee-fen-uh-strey-shuhn)

(1) The act of throwing a person out of a window.

(2) In casual, often humorous use, to throw anything out of a window.

(3) In politics, a sardonic term for an act which deposes a leader.

(4) In nerd humor, the act of removing the Microsoft Windows operating system from a computer in order to install an alternative.

1618: From the New Latin dēfenestrātiō, the construct being dē (from; out) + fenestra (window) + -ātiō (the suffix indicating an action or process).  It was borrowed also by the Middle French défenestrer (which persists in Modern French) & défenestration.  The German form is Fenstersturz; the verb defenestrate formed later.  The related forms are defenestrate (1915) & defenestrated (1620).  Derived terms (which seem only ever used sardonically) include autodefenestration (the act of hurling oneself from a window), dedefenestration (the act of hurling someone back through the window from which recently they were defenestrated) and redefenestration (hurling someone from a window for a second time, possibly just after their dedefenestration).  Use of these coinings is obviously limited.

The de- prefix was from the Latin dē-, from the preposition dē (of; from).  It was used in the sense of “reversal, undoing, removing”; the similar prefix in Old English was æf-.  The –ation suffix is from the Middle English –acioun & -acion, from the Old French -acion & -ation, from the Latin -ātiō, an alternative form of -tiō (from which Modern English gained -tion).  It was used variously to create the forms describing (1) an action or process, (2) the result of an action or process or (3) a state or quality.  Fenestra is of unknown origin.  Some etymologists link fenestra with the Greek verb phainein (to show) while others suggest an Etruscan borrowing, based on the suffix -(s)tra, as in the Latin loan-words aplustre (the carved stern of a ship with its ornaments), genista (the plant broom) or lanista (trainer of gladiators).  Fenestration dates from 1870 in the anatomical sense, a noun of action from the Latin fenestrare, from fenestra (window, opening for light).  The now rare but once familiar meaning "arrangement of windows" dates from 1846 and described a certain design element in architecture.  The related form is fenestrated.

Second Defenestration of Prague (circa 1618), woodcut by Matthäus Merian der Ältere (1593–1650).

Although it was already known in Middle French, defenestrate entered English to lament (or celebrate, depending on one’s view of such things) the Defenestration of Prague in 1618, when two Roman Catholic regents of Ferdinand II, representing the Holy Roman Emperor in the Bohemian national assembly, were tossed (along with their secretary) from a third floor window of Hradshin Castle by Protestant radicals who accused them of suppressing their rights.  All three survived, landing either in a moat or a rubbish heap depending on one’s choice of history book, and thus began the Thirty Years’ War.  The artist called his work the "Second Defenestration" because he was of the school which attaches no significance to the 1483 event most historians now regard as the second of three.

The defenestration of 1618 that triggered the Thirty Years’ War wasn’t the first; indeed, it was said at the time to have been done in "…good Bohemian style" by those who recalled earlier defenestrations although, in fairness, the practice wasn’t exclusively Bohemian, being noted in the Bible and not uncommon in Medieval and early modern times, lynching and mob violence a cross-cultural political language for centuries.  The first governmental defenestration occurred in 1419, the second in 1483 and the third in 1618, although the term "Defenestration of Prague" is applied exclusively to the last.  The first and last are remembered because they triggered long wars of religion in Bohemia and beyond, the Hussite Wars (1419-1435) associated with the first and the Thirty Years’ War (1618-1648) with the third.  The neglected second ushered in the religious peace of Kutná Hora which lasted decades, clearly not something to remember.  The 1618 event is thus the third defenestration of Prague.

The word has become popular as a vivid descriptor of political back-stabbing and is best understood sequentially, the churn-rate of recent Australian prime-ministers a good example: (1) Julia Gillard (b 1961) defenestrated Kevin Rudd (b 1957), (2) Kevin Rudd defenestrated Julia Gillard, (3) Malcolm Turnbull (b 1954) defenestrated Tony Abbott (b 1957), (4) Peter Dutton (b 1970) defenestrated Malcolm Turnbull (although that didn’t work out quite as planned, Mr Dutton turning out to be the hapless proxy for Scott Morrison (b 1968)).  Given the recent history it's surprising no one has bothered to coin the adjective defenestrative to describe Australian politics although, given more defenestrations are likely to come, that may yet happen.  Mr Dutton has never denied being a Freemason.

Some great moments in defenestration

King John of England (1166-1216) killed his nephew, Arthur of Brittany (1187-1203), by defenestration from the castle at Rouen, France, in 1203 (the method contested though not the death).

In 1378, the crafts and their leader Wouter van der Leyden occupied the Leuven city hall and seized the Leuven government.  In an attempt to regain absolute control, the patricians had Wouter van der Leyden assassinated in Brussels.  Seeking revenge, the crafts handed the patricians over to a furious crowd, which stormed the city hall and threw them out of the window.  At least 15 patricians were killed during this defenestration of Leuven.

In 1383, Bishop Dom Martinho was defenestrated by the citizens of Lisbon, having been suspected of conspiring with the enemy when Lisbon was besieged by the Castilians.

In 1419, a Hussite mob defenestrated a judge, the burgomaster, and some thirteen members of the town council of the New Town of Prague.  (First defenestration of Prague).

Death of Jezebel (1866) by Gustave Doré (1832–1883).

In the Bible, Jezebel was defenestrated at Jezreel by her own servants at the urging of Jehu (2 Kings 9:33).  Jezebel is used today as one of the many ways to heap opprobrium upon women although it now suggests loose virtue, rather than the heresy or doctrinal sloppiness mentioned in the Bible.

Jezebel encouraged the worship of Baal and Asherah, as well as purging the prophets of Yahweh from Israel.  This so damaged the house of Omri that the dynasty fell.  Ever since, the Jews have damned Jezebel as power-hungry, violent and whorish.  However, she was one of the few women of power in the Bible and there is something of a scriptural dislike of powerful women, an influence which seems still to linger among the secular.

In the Book of Revelation (2:20-23), Jezebel's name is linked with false prophets:

20 Nevertheless, I have this against you: You tolerate that woman Jezebel, who calls herself a prophet. By her teaching she misleads my servants into sexual immorality and the eating of food sacrificed to idols.

21 I have given her time to repent of her immorality, but she is unwilling.

22 So I will cast her on a bed of suffering, and I will make those who commit adultery with her suffer intensely, unless they repent of her ways.

23 I will strike her children dead. Then all the churches will know that I am he who searches hearts and minds, and I will repay each of you according to your deeds.

Lorenzo de' Medici (circa 1534) by Giorgio Vasari (1511-1574).

On 26 April 1478, after the failure of the "Pazzi conspiracy" to murder the ruler of Florence, Lorenzo di Piero de' Medici (Lorenzo the Magnificent, 1449–1492), Jacopo de' Pazzi (1423-1478) was defenestrated.

In 1483, Prague's Old-Town portreeve and the bodies of seven murdered New-Town aldermen were defenestrated.  (Second defenestration of Prague).

On 16 May 1562, Adham Khan (1531-1562), the Mughal emperor Akbar the Great’s (1542-1605) general and foster brother, was defenestrated (twice!) for murdering a rival general, Ataga Khan (d 1562).  Akbar was woken in the tumult after the murder; he struck Adham Khan down personally with his fist and immediately ordered his defenestration.  The first time, his legs were broken but he remained alive, so Akbar ordered his defenestration a second time, killing him.  Adham Khan had wrongly counted on the influence of his mother and Akbar's wet nurse, Maham Anga (d 1562), to save him, she being almost an unofficial regent in the days of Akbar's youth.  Akbar personally informed Maham Anga of her son's death, to which, famously, she commented, “You have done well”.  After forty days and forty nights, she died of acute depression.

On the morning of 1 December 1640, in Lisbon, a group of supporters of the Duke of Braganza party found Miguel de Vasconcelos (1590-1640), the hated Portuguese Secretary of State of the Habsburg Philip III (1605-1665), hidden in a closet; they killed and defenestrated him, his corpse left to public outrage.

On 11 June 1903, a group of Serbian army officers murdered and defenestrated King Alexander (1876-1903) and Queen Draga (1866-1903).

Poster of Benito Mussolini (1883-1946, Duce of Italy, 1922-1943), Ethiopia, 1936.

In 1922, Italian politician and writer Gabriele d'Annunzio (1863-1938) was temporarily crippled after falling from a window, possibly pushed by a follower of Benito Mussolini.  The Duce might almost have been grateful had he suffered the illustrious fate of defenestration, the end of not a few kings and princes.   Instead, Italian communist partisans found him hiding in the back of a truck with his mistress Clara Petacci (1912-1945), attempting to flee to neutral Switzerland.  Taken to a village near Lake Como, on 28 April 1945, both were summarily executed by firing squad, their bodies hung upside down outside a petrol station where the corpses were abused by the mob.  When Hitler saw the photographs, he quickly summoned Otto Günsche (1917–2003), his personal SS adjutant, repeating his instruction that nothing must remain of him after his suicide.

On 10 March 1948, the Czechoslovakian minister of foreign affairs Jan Masaryk (1886-1948) was found dead, in his pajamas, in the courtyard of the Foreign Ministry below his bathroom window.  The initial (KGB) investigation stated that he had committed suicide by jumping out of the window; a 2004 police investigation concluded that he was defenestrated by the KGB.

In 1968, the son of China's future paramount leader Deng Xiaoping (1904-1997), Deng Pufang (b 1944), was thrown from a window by Red Guards during the Cultural Revolution.

In 1977, as a result of political backlash against his album Zombie, musician Fela Kuti's (1938-1997) mother (Chief Funmilayo Ransome-Kuti, 1900-1978) was thrown from a window during a military raid on his compound.  In addition, the commanding officer defecated on her head, while the soldiers burned down the compound, destroying his musical equipment, studio and master tapes.  Adding insult to injury, they later jailed him for being a subversive.

On 2 March 2007, Russian investigative journalist Ivan Safronov (1956-2007), who was researching the Kremlin's covert arms deals, fell to his death from a fifth floor window.  There was an investigation and the death was ruled a suicide, a cause of death which of late has become uncommonly common in Russia, people these days often falling from windows high above the ground.

Dominion Centre, Toronto.

On 9 July 1993, in a case of self-defenestration, Toronto attorney Garry Hoy (1955-1993) fell from a window after a playful attempt to prove to a group of new legal interns that the windows of Toronto’s Dominion Centre were unbreakable.  The glass sustained the manufacturer’s claim but, intact, popped out of the frame, the unfortunate lawyer plunging to his death.  Mr Hoy actually also held an engineering degree and is said to have many times performed the amusing stunt.  Unfortunately he didn’t live to explain to the interns how the accumulation of stresses from his many impacts may have contributed to the structural failure.

Thursday, December 8, 2022

Atrophy & Hypertrophy

Atrophy (pronounced a-truh-fee)

(1) In pathology, a wasting away of the body or of an organ or part, or a failure to grow to normal size as the result of disease, defective nutrition, nerve damage or hormonal changes.

(2) Degeneration, decline, or decrease, as from disuse.

1590–1600: From the earlier Middle French atrophie and Late Latin atrophia, from the Ancient Greek ἀτροφία (atrophía) (a wasting away) (derived from trephein (to feed)), from ἄτροφος (átrophos) (ill-fed, un-nourished), the construct being ἀ- (a-) (not) + τροφή (trophḗ) (nourishment) from τρέφω (tréphō) (I fatten).  Atrophic is the most familiar adjectival form.  The a- prefix, a proclitic form of preposition, is from the Ancient Greek ἀ- (not, without) and is used to form taxonomic names indicating a lack of some feature that might be expected.  In Middle English a- (up, out, away) was from the Old English ā- (originally ar- & or-) from the Proto-Germanic uz- (out-), from the primitive Indo-European uds- (up, out); it was cognate with the Old Saxon ā- and the German er-.  The suffix –ia is from Classical Latin from the Ancient Greek -ία (-ía) & -εια (-eia) and was used to form abstract nouns of feminine gender.  It creates names of countries, diseases, flowers, and (rarely) collections of things such as militaria & deletia.  The spelling atrophia is obsolete.  Atrophy is a noun and atrophic is an adjective; the noun plural is atrophies.  The noun atrophication is non-standard and seems to be restricted to the pro-ana community, usually as "self-atrophication".

Self-atrophication: Lindsay Lohan during "thin phase".

In pathology there are specific classes of atrophy including encephalatrophy (atrophy of the brain), fibroatrophy (atrophy of fibres), dermatrophy (atrophy (a thinning) of the skin), lipoatrophy (the loss of subcutaneous fatty tissue), scleroatrophy (any of various atrophic conditions characterized by a hardening of tissue, including atrophic fibrosis of the skin, hypoplasia of the nails, and palmoplantar keratoderma), hemiatrophy (atrophy that affects only one half of the body) and the mysterious pseudoatrophy (where tissue (typically muscle) appears to have reduced in size or wasted away but the perceived reduction is really caused by factors such as swelling, inflammation, or the presence of fluid in surrounding tissues that make the muscle look smaller than it actually is).  In genetics, atrogene describes any gene which has some influence on atrophy of muscle tissue and in biochemistry atrophins are any of a class of proteins found in nervous tissue.

Hypertrophy (pronounced hahy-pur-truh-fee)

(1) In physiology, the abnormal (but usually non-tumorous) enlargement of an organ or a tissue as a result of an increase in the size rather than the number of constituent cells.

(2) In bodybuilding, an increase in muscle size through increased size of individual muscle cells (responding to exercise such as weightlifting); it is distinct from muscle hyperplasia (the formation of new muscle cells).

(3) Excessive growth or accumulation of any kind.

1825–1835: A compound word, the construct being hyper- + -trophy, from the French hypertrophie.  Hyper- is from the Ancient Greek ὑπέρ (hupér) (over, excessive), from the primitive Indo-European upér (over, above) (from which English gained over), from upo (under, below) (source of the English up).  It was cognate with the Latin super- and is a common prefix appearing in loanwords from Greek, where it meant “over,” usually implying excess or exaggeration (eg hyperbole); on this model it is used, especially as opposed to hypo-, in the formation of compound words.  Hypertrophy is a noun & verb, hypertrophic, hypertrophical & hypertrophous are adjectives and hypertrophically is an adverb; the noun plural is hypertrophies.  For those for whom the successively multi-syllabic is a fetish or who compile scripts for speech therapy practice, there's hypertrophic pulmonary osteoarthropathy (in pathology, a paraneoplastic condition characterised by clubbing and periostitis of the long bones of the arms and legs).

Two “Chernobyl mutant catfish” (left) and one with a human for scale (right).  They're now found in the cooling ponds of the Chernobyl nuclear reactor which in 1986 suffered a meltdown and subsequent explosion.

Their huge size is not a radiation-induced mutation but a result of the absence of a predator since humans were removed from their environment.  As far as is known, radiation-induced mutations very rarely lead to a generalized hypertrophy and it's much more common for an affected specimen to grow “less efficiently” and, for a variety of reasons, to be less capable of catching food and thus not live as long as is typical for the species.  The “Chernobyl mutant catfish” which gained their infamy in the early days of the internet were no more the result of nuclear contamination than the “Fukushima mutant wolffish” of the social media age.  Genetically, there’s nothing novel about the size of the bulky inhabitants of the Chernobyl ponds.  While it’s not typical, the wels catfish (Silurus glanis) can weigh over 350 kg (770 lb), examples recorded both in scientific research and by fishers, although up to 150 kg (330 lb) is more common for a large example.  So, to describe the big fish as “hypertrophic” is to use the word in a loose way because the growth registered is “extreme” rather than excessive, something made possible by them living without predators, in a suitable habitat with ample food.  With humans no longer killing them, the catfish have become both apex-level predators and scavengers, enjoying fish, amphibians, worms, birds and even small mammals; they appear to eat just about anything (dead or alive) which fits into their large mouths.  They can live for many decades and scientists are monitoring their progress in what is a close to unique experiment.  Thus far, despite some having tested as much as 16 times more radioactive than normal, the indications are they’re continuing happily to feed, reproduce and grow but because the hypertrophied state is a function of genetics interacting with an ideal environment, it’s thought unlikely they’ll ever exceed the recorded maximum size.

Peter Dutton (b 1970; leader of the opposition and leader of the Australian Liberal Party since May 2022) announces the Liberal Party's new policy advocating the construction of multiple nuclear power-plants in Australia.  The prosthetic used in the digitally-altered image (right) was a discarded proposal for the depiction of Lord Voldemort in the first film version of JK Rowling's (b 1965) series of Harry Potter children's fantasy novels; it used a Janus-like two-faced head.  It's an urban myth Peter Dutton auditioned for the part when the first film was being cast but was rejected as being "too scary".  If ever there's another film, the producers might reconsider and should his career in politics end (God forbid), he could bring to Voldemort the sense of menacing evil the character has never quite achieved.  Interestingly, despite many opportunities, Mr Dutton has never denied being a Freemason.

Friday, June 9, 2023

Sinister

Sinister (pronounced sin-uh-ster)

(1) Threatening or portending evil, harm, or trouble; ominous.

(2) Bad, evil, base, or wicked; fell; treacherous, especially in some mysterious way.

(3) Unfortunate; disastrous; unfavorable.

(4) Of or on the left side; left-handed (mostly archaic except as a literary device).

(5) In heraldry, noting the side of an escutcheon or achievement of arms that is to the left of the bearer (as opposed to dexter), ie from the bearer's point of view and therefore on the spectator's right.

(6) Wrong, as springing from indirection or obliquity; perverse; dishonest (obsolete).

1375-1425: From the Middle English sinistre (unlucky), from the Old French senestre & sinistre (left), from the Latin sinistra (left hand), from the Proto-Italic senisteros, of unknown origin but possibly from a euphemism from the same primitive Indo-European root as the Sanskrit सनीयान् (sanīyān) (more useful, more advantageous) with the contrastive or comparative suffix -ter (as in the Latin dexter (on the right-hand)) familiar in the modern “dexterity”.  Sinister is an adjective, sinisterly is an adverb and sinisterness is a noun.  Unlike some other adjectives applied to people, such as exquisite, sinister never evolved into a self-definitional noun.  The alternative spelling sinistre is long obsolete but those for whom historical authenticity matters should note sinister was once accented on the middle syllable by poets including Shakespeare, Milton and Dryden.

Evolution of Sinister

The now universal meanings (evil et al) emerged in the late fifteenth century, a sense inherited from the fourteenth century Old French senestre & sinistre (contrary, false; unfavorable; to the left), picked up directly from the Classical Latin sinistra (left, on the left side (opposite of dexter)).  Sinister had been used in heraldry from the 1560s to indicate "left, to the left"; in heraldry the left side indicated illegitimacy, and in that use the word preserves the literal sense from Latin of "on or from the left side".  In botany, the adjective sinistrorse was created in 1856, a word to describe the direction of spiral structures in nature, from the Latin sinistrorsus (toward the left side), the construct being sinist- (left) + versus (turned), past participle of vertere (to turn), from the primitive Indo-European root wer- (to turn, bend).  In the scientific literature it was paired with dextrorse but a broader adoption was doomed by confusion; it was never agreed what was the correct point of view from which to reckon the leftward or rightward spiraling.

Peter Dutton (b 1970; Liberal-National Party MP for Dickson (Queensland, Australia) since 2001).  Sometimes, a sinister look is just a matter of chance, there being nothing sinister about Mr Dutton (although he has never denied being a Freemason).

The former Research Institute For Experimental Medicine, Berlin, Germany.  Built for the purpose of housing live animal testing facilities, and until 2003 known as the Central Animal Laboratories of the Free University of Berlin, its common name among Berliners (long known for their sardonic humor) is the Mäusebunker (Mouse Bunker).

Those working in visual media, photographers, cinematographers and painters, use tricks of lighting and angle to convey a sense of the sinister; even buildings and landscapes, carefully framed, can evoke the feeling.  Although it can be because of a structure's historical associations, when a building is described as "sinister" it's usually a thing of subjective perception, induced often by a dark, eerie, or foreboding appearance.  There are a number of elements which may be involved:

(1) The architectural style, lighting and choice of materials can contribute to a perception of sinisterness, buildings in the Gothic or Brutalist traditions, with their sharp angles, heaviness and use of slabs of dark stone, noted for this.

(2) A notorious historical context can impart a meaning which transcends the actual architecture.  Buildings known to be associated with dark historical events or periods in history can gain a reputation as being sinister.  Places once the sites of suffering, torture or death gain this reputation, such as the Lubyanka Building in Moscow.  Although an inoffensive Neo-Baroque structure in yellow brick, for most of its life it's been the home of one branch or other of the Russian internal security agency, most famously the KGB.

(3) The surrounding environment can make an otherwise charming building seem mysterious and threatening.  Some will find a building standing alone in an isolated area or surrounded by dark and overgrown vegetation can provoke feelings of unease.

Lindsay Lohan as she would appear if left-handed, signing photographs with Sharpie (as recommended by Pippa Middleton), Rachael Ray Show, New York City, January 2019 (digitally mirrored image).  As a general principle, pink is not associated with sinisterness (although crooked Hillary Clinton in pink pantsuit would be most sinister).

A quirk from antiquity is that in matters of religion, to Romans sinister meant auspicious whereas for Greeks it meant inauspicious.  The curious duality arose because the Latin word was used in augury in the sense of "unlucky; unfavorable", a natural inheritance because omens, especially the flight of birds seen on the left hand, were regarded as portending misfortune, thus sinister acquired a sense of "harmful, adverse".  This was from the influence of Greek, reflecting the early Hellenic practice of facing north when observing omens.  In genuine Roman auspices, the augurs faced south so the left was thought good and favorable.  The Romans were a superstitious lot but seem to have managed this strange dichotomy of meaning without difficulty, sinister suggesting something bad except in the temple, where it meant something good.

The salute associated with the Nazi regime (1933-1945) has long been regarded as something sinister or worse.  The pantsuit wasn't thought sinister until it became emblematic of crooked Hillary Clinton.

Monday, August 14, 2023

Obsolete & Obsolescent

Obsolete (pronounced ob-suh-leet)

(1) No longer in general use; fallen into disuse; that is no longer practiced or used, out of date, gone out of use, of a discarded type; outmoded.

(2) Of a linguistic form, no longer in use, especially if out of use for at least the past century.

(3) Effaced by wearing down or away (rare).

(4) In biology, imperfectly developed or rudimentary in comparison with the corresponding character in other individuals, as of a different sex or of a related species; of parts or organs, vestigial; rudimentary.

(5) To make obsolete by replacing with something newer or better; to antiquate (rare).

1570–1580: From the Latin obsolētus (grown old; worn out), past participle of obsolēscere (to fall into disuse, be forgotten about, become tarnished), the construct assumed to be ob- (opposite to) (from the Latin ob- (facing), a combining prefix found in verbs of Latin origin) + sol(ēre) (to be used to; to be accustomed to) + -ēscere (–esce) (the inchoative suffix, a form of -ēscō (I become), used to form verbs from nouns, following the pattern of verbs derived from Latin verbs ending in –ēscō).  Obsoletely is an adverb, obsoleteness is a noun and the verbs (used with object) are obsoleted & obsoleting.  Although it does exist, except when it’s essential to convey a technical distinction, the noun obsoleteness is hardly ever used, obsolescence standing as the noun form for both obsolete and obsolescent.  The verb obsolesce (fall into disuse, grow obsolete) dates from 1801 and is as rare now as it was then.

Although not always exactly synonymous, in general use archaic and obsolete are often used interchangeably.  However, dictionaries maintain a distinction: words (and meanings) not in widespread use since English began to assume its recognizably modern form in the mid-1700s are labeled “obsolete”.  Words and meanings which, while from Modern English, have long fallen from use are labeled “archaic” and those now seen only very infrequently (and then often in specialized, technical applications) are labeled “rare”.

Obsolescent (pronounced ob-suh-les-uhnt)

(1) Becoming obsolete; passing out of use (as a word or meaning).

(2) Becoming outdated or outmoded, as applied to machinery, weapons systems, electronics, legislation etc.

(3) In biology, gradually disappearing or imperfectly developed, as vestigial organs.

1745–1755: From the Latin obsolēscentum, from obsolēscēns, present participle of obsolēscere (to fall into disuse); the third-person plural future active indicative of obsolēscō (degrade, soil, sully, stain, defile).  Obsolescently is an adverb and obsolescence a noun.  Because things that are obsolescent are becoming obsolete, the sometimes heard phrase “becoming obsolescent” is redundant.  The sense "state or process of gradually falling into disuse; becoming obsolete" entered general use in 1809 and although most associated with critiques by certain economists in the 1950s, the phrase “planned obsolescence” was coined in 1932, the 1950s use a revival.

Things that are obsolete are those no longer in general use because (1) they have been replaced or (2) the activity for which they were designed is no longer undertaken.  Things that are considered obsolescent are still to some extent in use but, for whatever combination of reasons, are fading from general use and tending towards becoming obsolete.  For example, the Windows XP operating system (released in 2001) is not obsolete because some still use it, but it is obsolescent because, presumably, it will in the years ahead fall from use.

Ex-Royal Air Force (RAF) Hawker Hunter in Air Force of Zimbabwe (AFZ) livery; between 1963 and 2002, twenty-six Hunters were at different times operated by the AFZ.  Declared obsolete as an interceptor by the RAF in 1963, some Hunters were re-deployed to tactical reconnaissance, ground-attack and close air support roles before being retired from front-line service in 1970.  Some were retained as trainers while many were sold to foreign air forces including India, Pakistan and Rhodesia (Zimbabwe since 1980).

Despite the apparent simplicity of the definition, in use obsolescent is highly nuanced and much influenced by context.  It’s long been a favorite word in senior military circles; although notorious hoarders, generals and admirals are usually anxious to label equipment as obsolescent if there’s a whiff of hope the money might be forthcoming to replace it with something new.  One often unexplored aspect of the international arms trade is that of used equipment, often declared obsolescent by the military in one state and purchased by that of another, a transaction often useful to both parties.  The threat profile against which a military prepares varies between nations and equipment which genuinely has been rendered obsolescent for one country may be a valuable addition to the matériel of others and go on to enjoy an operational life of decades.  Well into the twenty-first century, WWII & Cold War-era aircraft, warships, tanks and other weapon-systems declared obsolescent and on-sold (and in some cases given as foreign aid or specific military support) by big-budget militaries remain a prominent part of the inventories of many smaller nations.  That’s one context; another hinges on the specific tasking of matériel: an aircraft declared obsolescent as a bomber could long go on to fulfil a valuable role as a transport or tug.

In software, obsolescence is so vague a concept the conventional definition really isn’t helpful.  Many software users suffer severe cases of versionitis (a syndrome in which they suffer a sometimes visceral reaction to using anything but the latest version of something) so obsolescence to them seems an almost constant curse.  The condition tends gradually to diminish in severity and in many cases the symptoms actually invert: after sufficient ghastly experiences with new versions, versionitis begins instead to manifest as a morbid fear of ever upgrading anything.  Around the planet, obsolescent and obsolete software has for decades proliferated and there’s little doubt this will continue, the Y2K bug (which prompted much rectification work on the ancient code riddling the world of the mainframes and other places) unlikely to be the last great panic (one is said to be next due in 2029).  The manufacturers too have layers to their declarations of the obsolete.  In 2001, Microsoft advised all legacy versions of MS-DOS (the brutish and now forty-year-old file-loader) were obsolete but, with a change of release number, continued to offer what's functionally the same MS-DOS for anyone needing a small operating system with minimal demands on memory size & CPU specification, mostly those who use embedded controllers, a real attraction being the ability easily to address just about any compatible hardware, a convenience more modern OSs have long restricted.  DOS does still have attractions for many, the long-ago derided 640 kb actually a generous memory space for many of the internal processes of machines, and it's an operating system with no known bugs.

XTree’s original default color scheme; things were different in the 1980s.

Also, obsolescent, obsolete or not, sometimes the old ways are the best.  In 1985, Underware Systems (later the now defunct Executive Systems (EIS)) released a product called XTree, the first commercially available software which provided users with a visual depiction of the file system, arranged using a root-branch tree metaphor.  Within that display, it was possible to do most file-handling such as copying, moving, re-naming, deleting and so on.  Version 1.0 was issued as a single, 35 kb executable file, supplied usually on a 5.25" floppy diskette and although it didn’t do anything which couldn’t (eventually) be achieved using just DOS, XTree made it easy and fast; reviewers, never the most easily impressed bunch, were effusive in their praise.  Millions agreed and bought the product which went through a number of upgrades until, by 1993, XTreeGold 3.0 had grown to a feature-packed three megabytes but, and it was a crucial part of the charm, the user interface didn’t change and anyone migrating from v1 to v3 could carry on as before, using or ignoring the new functions as they chose.
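
For readers who never saw the old character-mode screen, the root-branch metaphor is simple enough to sketch.  What follows is a minimal, hypothetical illustration in Python (nothing to do with any code XTree or its successors actually shipped): it walks a directory and prints the sub-directories as an indented tree, roughly the shape the XTree display drew down its left-hand panel.

# Minimal sketch of the "root-branch tree" idea XTree popularized.
# Illustrative only; it mimics the look of the display, not the program.
import os
import sys

def print_tree(path: str, prefix: str = "") -> None:
    """Recursively print the sub-directories of 'path' as an indented tree."""
    try:
        entries = sorted(
            name for name in os.listdir(path)
            if os.path.isdir(os.path.join(path, name))
        )
    except OSError:
        return  # skip branches which cannot be read
    for i, name in enumerate(entries):
        last = (i == len(entries) - 1)
        print(f"{prefix}{'└──' if last else '├──'} {name}")
        print_tree(os.path.join(path, name), prefix + ("    " if last else "│   "))

if __name__ == "__main__":
    root = sys.argv[1] if len(sys.argv) > 1 else "."
    print(root)
    print_tree(root)

Run against a drive root it produces the familiar cascade of branches; what made XTree special was not the drawing but the way every file operation (copy, move, rename, delete, tag) was wrapped around that one display.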

However, with the release in 1990 of Microsoft’s Windows 3.0, the universe shifted and while it was still an unstable environment, it was obvious things would improve and EIS, now called the XTree Company, devoted huge resources to producing a Windows version of their eponymous product, making the crucial decision that when adopting the Windows-style graphical user interface (GUI), the XTree keyboard shortcuts would be abandoned.  This meant the user interface was something that looked not greatly different to the Windows in-built file manager and bore no resemblance to the even then quirky but marvelously lucid one which had served so well.  XTree for Windows was a critical and financial disaster and in 1993 the company was sold to rival Central Point Software, themselves soon to have their own problems, swallowed a year later by Symantec which, in a series of strategic acquisitions, soon assumed an almost hegemonic control of the market for Windows utilities.  Elements of XTree were interpolated into other Symantec products but as a separate line, it was allowed to die.  In 1998, Symantec officially deleted the product but the announcement was barely noted by the millions of users who continued to use the text-based XTree which ran happily under newer versions of Windows although, being a real-mode program and thus living in a small memory space, as disks grew and file counts rose, walls were sometimes hit, some work-arounds possible but kludgy.  The attraction of the unique XTree was however undiminished and an independent developer built ZTree, using the classic interface but coded to run on both IBM’s OS/2 and the later flavors of Windows.  Without the constraints of the old real-mode memory architecture, ZTree could handle long file and directory names, megalomaniacs now able to log an unlimited number of disks and files, all while using the same, lightning-fast interface.  The idea spread to UNIX where ytree, XTC, linuXtree and (most notably) UnixTree were made available.

ZTree, for those who can remember how things used to be done.

ZTree remains a brute-force favorite for many techs.  Most don’t often need to do those tasks at which it excels but, when those big-scale needs arise, as a file handler, ZTree still can do what nothing else can.  It’ll also do what’s now small-scale stuff; anyone still running XTree 1.0 under MS-DOS 2.11 on their 8088 could walk to some multi-core 64-bit monster with 64 GB RAM running Windows 11 and happily use ZTree.  ZTree is one of the industry’s longest-running user interfaces.

The Centennial Light, Livermore-Pleasanton Fire Department, Livermore, California.  Illuminated almost continuously since 1901, it’s said to be the world's longest-lasting light bulb.  The light bulb business became associated with the idea of planned obsolescence after the revelation of the existence of a cartel of manufacturers which had conspired to more than halve the service life of bulbs in order to stimulate sales.

As early as 1924, executives in US industry had been discussing the idea of integrating planned obsolescence into their systems of production and distribution although it was then referred to with other phrases.  The idea essentially was that in the industrial age, modern mercantile capitalism was so efficient in its ability to produce goods that it would tend to over-produce, beyond the ability to stimulate demand.  The result would be a glut, a collapse in prices and a recession or depression which affected the whole society, a contributing factor to what even then was known as the boom & bust economy.  One approach was that of the planned economy whereby government would regulate production and maintain employment and wages at the levels required to maintain some degree of equilibrium between supply and demand but such socialistic notions were anathematic to industrialists.  Their preference was to reduce the lifespan of goods to the point which matched the productive capacity and product-cycles of industry, thereby ensuring a constant churn.  Then, as now, there were those for and against, the salesmen delighted, the engineers appalled.

The actual phrase seems first to have been used in the pamphlet Ending the Depression Through Planned Obsolescence, published in 1932 by US real estate broker (and confessed Freemason) Bernard London (b circa 1873) but it wasn’t popularized until the 1950s.  Then, it began as a casual description of the techniques used in advertising to stimulate demand and thus without the negative connotations which would attach when it became part of the critique of materialism, consumerism and the consequential environmental destruction.  There had been earlier ideas about the need for a hyper-consumptive culture to service a system designed inherently to increase production and thus create endless economic growth: one post-war industrialist noted the way to “avoid gluts was to create a nation of gluttons” and exporting this model underlies the early US enthusiasm for globalism.  As some of the implications of that became apparent, globalization clearly not the Americanization promised, enthusiasm became more restrained.

Betamax and VHS: from dominant to obsolescent to obsolete; the DVD may follow.

Although the trend began in the United States in the late 1950s, it was in the 1970s that the churn rate in consumer electronics began to accelerate, something accounted for partly by the reducing costs as mass-production in the Far East ramped up but also by the increasing rapidity with which technologies came and went.  The classic example of the era was the so-called videotape format war which began in the mid 1970s after the Betamax (usually clipped to Beta) and Video Home System (VHS) formats were introduced within a year of each other.  Both were systems by which analog recordings of video and audio content could be distributed on magnetic tape loaded into players as a cassette (the players, regardless of format, soon known universally as video cassette recorders (VCRs)).  The nerds soon pronounced Betamax the superior format because of its superior quality of playback and commercial operators agreed, Betamax quickly adopted as the default standard in television studios.  Consumers however came to prefer VHS because, on most of the screens on which most played their tapes, the difference between the two was marginal, while the VHS format permitted longer recording times (an important thing in the era) and the hardware was soon available at sometimes half the cost of Betamax units.

It was essentially the same story which unfolded a generation later in the bus and operating system wars; the early advantages of OS/2 over Windows and Micro Channel Architecture (MCA) over ISA/EISA were both real and understood but few were prepared to pay the steep additional cost for advantages which seemed so slight and at the same time brought problems of their own.  Quite when Betamax became obsolescent varied between markets but except for a handful of specialists, by the late 1980s it was obsolete and the flow of new content had almost evaporated.  VHS prevailed but its dominance was short-lived, the Digital Versatile Disc (DVD), released in 1997, within half a decade becoming the preferred format throughout the Western world although in some other markets, the thriving secondary market suggests even today the use of VCRs is not uncommon.  DVD sales though peaked in 2006 and have since dropped by some 80%, their market-share cannibalized not by the newer Blu-Ray format (which never achieved critical mass) but by the various methods (downloads & streaming) which meant many users were able wholly to abandon removable media.  Despite that, the industry seems still to think the DVD has a niche and it may for some time resist obsolescence because demand still exists for content on a physical object at a level it remains profitable to service.  Opinions differ about the long-term.  History suggests that as the “DVD generation” dies off, the format will fade away because those used to entirely weightless content, available any time in any place, won’t want the hassle but, as the unexpected revival of vinyl records as a lucrative niche proved, obsolete technology can have its own charm, which is why a small industry now exists to retro-fit manual gearboxes into modern Ferraris, replacing technically superior automatic transmissions.