
Wednesday, April 3, 2024

Rational

Rational (pronounced rash-nl (U) or rash-uh-nl (non-U))

(1) Agreeable to reason; reasonable; sensible.

(2) Having or exercising reason, sound judgment, or good sense.

(3) Of a person or their personal characteristics, being in or characterized by full possession of one's reason; sane; lucid; healthy or balanced intellectually; exhibiting reasonableness.

(4) Endowed with the faculty of reason; capable of reasoning.

(5) Of or relating to, or constituting reasoning powers.

(6) Proceeding or derived from reason or based on reasoning.

(7) Logically sound; not self-contradictory or otherwise absurd.

(8) In mathematics, capable of being expressed exactly by a ratio of two integers or (of a function) capable of being expressed exactly by a ratio of two polynomials.

(9) In chemistry, expressing the type, structure, relations, and reactions of a compound; graphic; said of formulae.

(10) In physics, expressing a physical object.

(11) In the philosophy of science, based on scientific knowledge or theory rather than practical observation.

(12) The breastplate worn by Israelite high priests (historic references only).

1350-1400: From the Old French rationel & rational, from the Middle English racional, from the Late Latin ratiōnālis (of or belonging to reason, rational, reasonable; having a ratio), the construct being ratiōn (stem of ratiō (reason; calculation)) + -ālis.  The -ālis suffix was from the primitive Indo-European -li-, which later dissimilated into an early version of -āris, and there may be some relationship with hel- (to grow); -ālis (neuter -āle) was the third-declension two-termination suffix and was suffixed to (1) nouns or numerals, creating adjectives of relationship, and (2) adjectives, creating adjectives with an intensified meaning.  The suffix -ālis was added (usually, but not exclusively) to a noun or numeral to form an adjective of relationship to that noun.  When suffixed to an existing adjective, the effect was to intensify the adjectival meaning, and often to narrow the semantic field.  If the root word ends in -l or -lis, -āris is generally used instead although, because of parallel or subsequent evolutions, both have sometimes been applied (eg līneālis & līneāris).  The use to describe the breastplate worn by Israelite high priests was from the Old French rational, from the Medieval Latin ratiōnāle (a pontifical stole, a pallium, an ornament worn over the chasuble), neuter of the Latin rationalis (rational).  The spelling rationall is obsolete.  Rational is a noun & adjective, rationalizing is a noun & verb, rationalize & rationalized are verbs, rationalism, rationalness & rationalizer are nouns and rationally is an adverb; the noun plural is rationals.  The rarely used adjective hyperrational means literally “extremely rational” and can be used positively or neutrally but it’s applied also negatively, usually as a critique of “economic rationality”.

Rational & irrational numbers illustrated by Math Monks.

In something of a departure from the usual practice in English, “antirational”, “arational”, “nonrational” & “irrational” (hyphenated forms of each exist) are not necessarily synonymous.  Antirational describes something or someone that is, or acts in a way, contrary to the rational, while arational (often in the form arationality) is a technical term used in philosophy in the sense of “not within the domain of what can be understood or analyzed by reason; not rational, outside the competence of the rules of reason”, applied to matters of faith (religious & secular).  Nonrational (used usually in the hyphenated form) is literally simply the antonym of rational (in most senses) but now appears most often in the language of economics, where it’s used of decisions made by actors (individual, collective & corporate) which are contrary to economic self-interest.  Irrational can be used as another antonym but it’s also a “loaded” adjective which carries an association with madness (now called mental illness) while in mathematics (especially the mysterious world of number theory) it’s the specific antonym of the “rational number” and means a “real number unable to be written as the ratio of two integers”, a concept dating from the 1560s.
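The mathematical sense (a number expressible exactly as a ratio of two integers) can be demonstrated in a few lines of Python; this is an illustrative sketch, not from the original text, using the standard library's fractions module, which stores such ratios exactly, unlike binary floating-point numbers, which cannot represent even so simple a rational as one tenth.

```python
# Rational numbers as exact ratios of integers, via Python's fractions module.
from fractions import Fraction

third = Fraction(1, 3)            # rational: exactly 1/3
total = third + third + third     # exact arithmetic: 1/3 + 1/3 + 1/3
print(total)                      # 1

# Binary floats only approximate most decimal fractions, so the same
# sum done in floating point fails an equality test:
print(0.1 + 0.1 + 0.1 == 0.3)                   # False
print(Fraction(1, 10) * 3 == Fraction(3, 10))   # True: rationals stay exact
```

Irrational numbers such as the square root of two, being inexpressible as any such ratio, have no exact representation in this scheme at all.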

The adjective rational emerged in the mid-1400s and was a variant of the late fourteenth century racional (“pertaining to or springing from reason” and, of persons, “endowed with reason; having the power of reasoning”), from the Old French racionel and directly from the Latin rationalis (of or belonging to reason, reasonable), from ratio (genitive rationis) (reckoning, calculation, reason).  By the 1560s it was picked up in mathematics to mean “expressible in finite terms” before becoming more precisely defined.  The meaning “conformable to the precepts of practical reason” dates from the 1630s.  The adverb rationally was from the same source as ratio and ration; the sense in rational is aligned with that in the related noun reason, which was deformed in French.  By the 1620s the noun rationality was used in the sense of “quality of having reason” and by mid-century that had extended to “fact of being agreeable to reason”, from the French rationalité and directly from the Late Latin rationalitas (reasonableness, rationality) (the source also of the Spanish racionalidad and the Italian razionalità), from the Latin rationalis (of or belonging to reason, reasonable).  As late as the early fifteenth century racionabilite (the faculty of reason) was in Middle English, from the Latin rationabilitas.

Rational AG's iCombi Pro range: Gas or Electric.

By the 1820s, the noun rationalization was in use in the sense of “a rendering rational, act of subjection to rational tests or principles”, the specific modern sense in psychology, in reference to the subconscious (justifying behavior to make it seem rational or socially acceptable), adopted by the profession early in the twentieth century.  The verb rationalize (explain in a rational way, make conformable to reason) dates from the mid eighteenth century although the sense familiar in psychology (to give an explanation that conceals true motives) came into use only in the 1920s, on the notion of “cause to appear reasonable or socially acceptable”, although decades earlier it had been used with the intransitive sense of “think for oneself, employ one's reason as the supreme test”.  The use in psychology endured but “rationalize” also came into use in applied economics with the meaning “to reorganize an industry or other commercial concern to eliminate wasteful processes”.  That seems to have come from US use although the first recorded entry was in the Oxford English Dictionary’s (OED) supplementary edition of 1927.  In this context, it became a “vogue word” of the inter-war years on both sides of the Atlantic although it fell from favour after 1945 as the vogue shifted to “integrate”, “tailor”, “streamline” and that favourite of 1970s management consultants: the “agonizing reappraisal”.  However, in the 1980s & 1990s, “rationalize” gained a new popularity in economics and (especially) the boom industry of financial journalism, presumably because the “economic rationalists” coalesced during the Reagan-Thatcher era as the dominant faction in political economy.

Many have their own favourite aspect of Sigmund Freud’s (1856-1939) theories but one concept which infuses much of his work is the tussle in the human psyche between the rational and irrational.  Freud’s structural model consisted of three major components: id, ego & superego, the elements interacting and conflicting to shape behavior and personality.  The id was the primitive & instinctual part containing sexual and aggressive drives; operating on the pleasure principle, it seeks immediate gratification and pleasure.  Present even before birth, it’s the source of our most basic desires and in its purest processes is wholly irrational, focused on wants and not the consequences of actions.

Concept of the id, ego & superego by the Psych-Mental Health Hub.

The rational was introduced by the ego, something which developed from the id and became the decision-making part of the mind, balancing the demands of the id and the constraints of reality.  As Freud noted, implicit in this interaction was that the ego repressed the id, which obviously was desirable because that’s what enables a civilized society to function, but the price to be paid was what he called “surplus repression”.  That was a central idea in Freud's later psychoanalytic theory, exploring the consequences of the repression of innate, instinctual drives beyond that which was necessary for the functioning of society and the individual: the rational took its pound of flesh.  Discussed in Civilization and its Discontents (1930), “primary repression” was essential to allow the individual to adapt to societal norms and function in a civilized society while “surplus repression” was the operation of these forces beyond what is required for that adaptation.  Freud identified this as a source of psychological distress and neurosis.

Lindsay Lohan’s early century lifestyle made her a popular choice as a case-study for students in Psychology 101 classes studying the interaction of the rational and irrational process in the mind.  Most undergraduates probably enjoyed writing these essays more than had they been asked to analyse Richard Nixon (1913-1994; US president 1969-1974), America’s other great exemplar of the struggle.

It was the ego which mediated between the id, the superego and the external world, making possible realistic and socially acceptable decisions, essentially by making individuals consider the consequences of their actions.  The superego developed last and built a construct of morality, ethical standards & values internalized from parents, the education system, society and cultural norms; operating on the “morality principle”, the superego was one of the “nurture” parts of the “nature vs nurture” equation which would for decades be such an important part of research in psychology.

Thursday, April 4, 2024

Rationale

Rationale (pronounced rash-uh-nal)

(1) The fundamental reason or reasons serving to account for something.

(2) A statement of reasons.

(3) A reasoned exposition of principles, especially one defining the fundamental reasons for a course of action or belief; a justification for action.

(4) A liturgical vestment worn by some Christian bishops of various denominations (now rare), the origin of which is the breastplate worn by Israelite high priests (a translation of λογεῖον (logeîon) or λόγιον (logion) (oracle) in the Septuagint version of Exodus 28).  The French spelling (rational) of the Latin ratiōnāle was used in Biblical translations.

(5) In engineering, a design rationale is the explicit documentation of the reasons behind decisions made when designing a system; it was once used of what now would be described as a set of parameters.

1650-1660: From the Late Latin ratiōnāle (exposition of principles), nominative singular neuter of ratiōnālis (rational, of reason).  After some early inventiveness, the modern sense “fundamental reason, the rational basis or motive of anything” became standardised during the 1680s.  In the nature of such things, many rationales are constructed ex post facto.  Rationale is a noun; the noun plural is rationales or rationalia.

Prince Metternich & Dr Rudd: illustrating rationale & rational

Portrait of Prince Metternich (1822), miniature on card by Friedrich Lieder (1780-1859).

Rationale and rational are sometimes confused.  A rationale is a process variously of explanation, reason or justification of something that need not be at all rational (although many fashioned ex post facto are re-formulated thus).  To be rational, something must make sense and be capable of being understood by the orthodox, accepted methods of the time.  That something may subsequently be shown to be irrational does not mean it did not at some time appear rational; one can construct a rationale for even something irrational.  To construct a post-Napoleonic Europe, Prince Metternich (Prince Klemens of Metternich-Winneburg zu Beilstein (1773–1859); foreign minister of the Austrian Empire 1809-1848 & chancellor 1821-1848) built a rationale for the Congress of Vienna (1814-1815) that was well understood.  It was a vision of a Europe, divided between the great powers, in which was maintained a perpetual balance of power which would ensure peace.  That in the two centuries since, the Congress has attracted much criticism, largely for imposing a stultifying air of reaction on the continent, does not render the structure irrational nor detract from Metternich’s rationale.  Some historians have come to regard the congress more fondly and while it’s not true the consequence was a century of peace in Europe, it created a framework which meant a good number of decades in that time were notably less blood-soaked than what came before and certainly what followed.

Dr Rudd at the ceremony at which his DPhil was conferred, University of Oxford, September 2022.

By 2009, Kevin Rudd (b 1957; Prime Minister of Australia 2007-2010 & June-September 2013), having realised being prime-minister was a squandering of intellectual talent, embarked on a re-design of relationships in the Asia-Pacific, structured in a way to suit what was self-evidently obvious: he should assume regional leadership.  These things do happen when folk get carried away.  Not discouraged by the restrained enthusiasm for his good idea, Mr Rudd penned one of his wordy rationales which, to him, must have sounded rational, but less impressed was just about everybody else in the region, including his own cabinet, and it’s difficult to recall any hint of interest from other countries.  Mr Rudd quibbled a bit, claiming his use of the word community was just diplomatic shorthand and he wasn’t suggesting anything like what the EU ever was or had become, just a better way of discussing problems.  Anyway, for a while it gave him a chance to use phrases like “ongoing and continuing discussions” and “regional and sub-regional architecture” so there was that.  By 2010 the idea had been allowed quietly to die and he had more pressing problems.

Attaining the premiership was Rudd’s mistake.  Had he never achieved the position he’d probably be spoken of as “the best prime-minister Australia never had” but instead he’s among those (and of late there have been a few) remembered in the way the Roman historian Tacitus (circa 56–circa 120), in the first volume of his Histories (circa 100), wrote of Galba (3 BC–AD 69; Roman Emperor 68-69): "...omnium consensu capax imperii nisi imperasset" (everyone would have agreed he was qualified for governing if he had not held the office).  His background was as a senior public servant who provided advice to others so they could make decisions and he enjoyed a solid career which was clearly well-suited to his skills.  Unfortunately, when occupying the highest political office in the land, he proved indecisive and too often inclined to refer to committees matters which he should have insisted came to cabinet with the necessary documents.  His other character flaw was he seemed unable to understand there was a difference between “leadership” and “command”, unable to realise there was a difference between the structured hierarchy of the public service and the swirling clatter of politics.  His career in The Lodge (the prime-minister’s official residence in Canberra) can be recalled as the Italian historian and politician Francesco Guicciardini (1483–1540) noted of Pope Clement VII (1478–1534; pope 1523-1534): “knowledgeable and effective as a subordinate, he fell victim when in charge to timidity, perplexity and habitual irresolution.”  With that, the Italian writer Piero Vettori (1499–1585) concurred, writing: “From a great and renowned cardinal, he was transformed into a little and despised pope”, a sentiment familiar in the phrase repeated in militaries around the world (outstanding major; average colonel; lousy general) to describe that truism in organizational behaviour: “Everyone gets promoted to their own level of incompetence”.

That aphorism was from The Peter Principle (1969), written by Raymond Hull (1919–1985) and based on the research of Laurence Peter (1919–1990), the idea being someone who proves successful in one role will be promoted and, if competent there, they will be promoted again.  However, should they fail within the hierarchy, that is the point of their incompetence, the implication being that the tendency is, as time passes, for more and more positions within a corporation to be filled by the incompetent.  The exceptions of course are (1) those competent souls who for whatever reason decline promotion and (2) the habitually successful who will in theory continue to be promoted until they reach the top and, if they prove competent there, this results in the paradox of the typical corporation being run by someone competent but staffed substantially by the incompetent.  In politics, reaching the top means becoming prime-minister, president or some similar office and as Winston Churchill (1874-1965; UK prime-minister 1940-1945 & 1951-1955) described it: "...if he trips he must be sustained. If he makes mistakes they must be covered. If he sleeps he must not wantonly be disturbed. If he is no good he must be poleaxed."  In one of the more amusing recent episodes in politics, the Australian Labor Party (ALP) decided Dr Rudd had been promoted to the relevant point and poleaxed him, a back-stabbing which remains one of the best organized and executed seen in years.  Subsequently, the party concluded his replacement was even more of a dud and restored Dr Rudd to the job, a second coming which lasted but a few months but that was long enough for him to revenge himself upon the hatchet men responsible for his downfall so there was that.

Still, after his political career (which can be thought a success because he did reach the top of the “greasy pole” and delivered the ALP a handsome election victory, although their gratitude was short-lived (a general tendency in democracies noted (sometimes gleefully) by many political scientists)) he has been busy, even if the secretary-generalship of the United Nations (UN) (an office which is an irresistible lure for a certain type) proved elusive.  Recently he became Dr Rudd, awarded a Doctorate of Philosophy (DPhil) by the University of Oxford.  His 420 page thesis, written over four years, explores the world view of Xi Jinping (b 1953; general secretary of the Chinese Communist Party (CCP) and paramount leader of the People's Republic of China (PRC) since 2013) and the relationship of his ideology to both the direction taken by the CCP and the links with the thoughts (and their consequences) of Chairman Mao (Mao Zedong 1893–1976; chairman of the CCP 1949-1976).

Dr Rudd says his thesis argues “there has been a significant change in China’s ideological worldview under Xi Jinping compared with previous ideological orthodoxies under Deng Xiaoping, Jiang Zemin and Hu Jintao [and summarises] Xi’s worldview as a new form of ‘Marxist-Leninist Nationalism’”.  Dr Rudd says he preferred “Marxist Nationalism” because “the term contains within it three core propositions”: (1) “Xi’s Leninism has taken both the party and Chinese politics in general to the left” (and he defines “left” for these purposes as “…the reassertion of the power of the party over all public policy as well as elevating the position of the individual leader against the rest of collective leadership”), (2) “Xi’s notion of Marxism has similarly taken the centre of gravity of Chinese economic thought to the left” (“left” in this aspect defined as “…a new priority for party-state intervention in the economy, state-owned enterprises over the private sector and a new ideology of greater income equality”) and (3) “Xi has also taken Chinese nationalism to the right” (“right” here meaning “a new assertion of Chinese national power as reflected in a new array of nationalist ‘banner terms’ that are now used in the party’s wider ideological discourse.”)  Dr Rudd views these three forces as “…part of a wider reification of the overall role of ideology under Xi Jinping. This has been seen in the fresh application of Marxist Leninist concepts of dialectical materialism, historical materialism, the primary stage of socialism, contradiction and struggle across the range of China’s current domestic and international challenges. The role of nationalism has also been enhanced within Xi’s new ideological framework. This hybrid form of Marxist Nationalist ideology is also being increasingly codified within the unfolding canon of Xi Jinping thought.”

Finally, the thesis argues there is a high degree of correlation between these ideological changes on the one hand and changes in the real world of Chinese politics, economic policy and a more assertive foreign policy on the other, including a different approach to Chinese multilateral policy as observed by diplomatic practitioners at the UN in New York.  The thesis concludes these changes in Xi Jinping’s ideological worldview and their impact on Chinese politics and public policy are best explained by a theoretical framework that integrates Authoritarian Resilience Theory, the realist and constructivist insights of the English School of International Relations Theory, and Foreign Policy Analysis.  Clearly, Dr Rudd thinks the CCP has come a long way since comrade Stalin (1878-1953; Soviet leader 1924-1953) casually dismissed Maoist theory as “ideologically primitive”.

Since March 2023, Dr Rudd has served as Australian Ambassador to the United States, the announcement of the appointment attracting some speculation there may be a secret protocol to the contract, providing for him to report to the prime-minister rather than the foreign minister.  It was mischievous speculation and there has been little but praise for the solid work he has been doing in the Washington embassy.  Dr Rudd’s role attracted headlines in March 2024 when an interview with Donald Trump (b 1946; US president 2017-2021) was broadcast in which the former president was acquainted (apparently for the first time) with some uncomplimentary assessments Dr Rudd had made of him, including describing him as “the most destructive president in history” and “a traitor to the West”.

Having doubtless heard and ignored worse over the years, Mr Trump seemed little concerned but did respond in his usual style, observing he didn’t know much about Dr Rudd except he’d heard he was “a little bit nasty” and “not the brightest bulb”, adding “he’d not be there long” if hostile to a second Trump presidency.  Trumpologists analysing these thoughts suggested the mildness of the reaction indicated the matter was unlikely to be pursued were he to return to the Oval Office, noting his habit of tending to ignore or forget about anything except actual threats to his immediate self-interest.  After taking office in 2017, when asked if he would pursue the legal action he’d during the campaign threatened against Bill (b 1946; US president 1993-2001) & crooked Hillary Clinton (b 1947; US secretary of state 2009-2013) (mostly on the basis of crooked Hillary’s crooked crookedness), he quickly brushed it off, saying: “No, they’re good people”, and moved on.  It’s thought Dr Rudd won't end up in the diplomatic deep-freeze, the most severe version of which is for a host nation to declare a diplomat "persona non grata" (the construct being the Latin persōna (person) + nōn (not) + grāta (from grātus (acceptable)), the consequence of which is expulsion from the territory, and the worst fate he may suffer is not receiving an invitation to a round of golf (something unlikely much to upset him).  Others, however, should be worried: in a second Trump White House, there will be vengeance.

Like "diplomatic toothache" and "null & void", the phrase "persona non grata" has become part of general language, the utility being that in few words it describes what would otherwise take many more.  Impressionistically, it would seem "troubled starlets" are more often than most declared "persona non grata".

Friday, January 19, 2024

Teleology

Teleology (pronounced tel-ee-ol-uh-jee or tee-lee-ol-uh-jee)

(1) In philosophy, the study of final causes; the doctrine that final causes exist; the belief that certain phenomena are best explained in terms of purpose rather than cause (a moral theory that maintains that the rightness or wrongness of actions solely depends on their consequences is called a teleological theory).

(2) The study of the evidences of design or purpose in nature; such design or purpose; in the cult of intelligent design, the doctrine that there is evidence of purpose or design in the universe, and especially that this provides proof of the existence of a Designer.

(3) The belief that purpose and design are a part of or are apparent in nature.

(4) In the cult of vitalism, the doctrine that phenomena are guided not only by mechanical forces but that they also move toward certain goals of self-realization.

(5) In biology, the belief that natural phenomena have a predetermined purpose and are not determined by mechanical laws.

1728: From the New Latin teleologia, a construct of the Ancient Greek τέλος (télos) (purpose; end, goal, result), genitive τέλεος (téleos) (end; entire, perfect, complete), + λόγος (lógos) (word, speech, discourse).  Teleology is a noun, teleological & teleologic are adjectives, teleologism & teleologist are nouns and teleologically is an adverb; the noun plural is teleologies.

Christian von Wolff (circa 1740), mezzotint by Johann Jacob Haid (1704-1767).

Although teleological concepts had been discussed in the West (and likely elsewhere too) since at least antiquity, the word teleology appears first in Philosophia rationalis, sive logica (Rational philosophy, or logic), a work published in 1728 by the German philosopher Baron Christian von Wolff (1679-1754), an author whose writings cover an extraordinary range in formal philosophy, metaphysics, ethics and mathematics.  He used the word to mean something like "the study of things in terms of purpose and final cause" and were it not for the way in which Immanuel Kant’s (1724-1804) work has tended to be the intellectual steamroller which has flattened the history of German Enlightenment rationality, he’d probably now be better remembered beyond the profession.

Teleological Ethical Theories

Ethical Egoism posits that an action is good if it produces or is likely to produce results that maximize the person’s self-interest as defined by him, even at the expense of others.  It is based on the notion that it is always moral to promote one’s own good, although at times avoiding one's personal interest could be a moral action too.  This makes ethical egoism different from psychological egoism, which holds that people are self-centred and self-motivated and perform actions only with the intention of maximizing their personal interest without helping others, thereby denying the reality of true altruism.  Utilitarian theory holds that an action is good if it results in maximum satisfaction for the large number of people likely to be affected by it.  Eudaimonism is a teleological theory which holds an action is good if it results in the fulfilment of goals along with the welfare of human beings; in other words, actions are said to be fruitful if they promote or tend to promote the fulfilment of goals constitutive of human nature and its happiness.

Lindsay Lohan and her lawyer in court, Los Angeles, December 2011.

At the first of the Nuremberg trials, convened to try the two-dozen odd senior surviving Nazis, one of the criticisms of the conceptual model adopted by the US prosecution team under Justice Robert Jackson (1892–1954; US Supreme Court Justice 1941-1954; Chief US Prosecutor at the Nuremberg (IMT) trials of Nazi war criminals 1945-1946) was that it was teleological: "the final crimes being implicit in the very origins of the regime".  His approach was of benefit to historians and added to the drama (and sometimes the tedium) of the event but was viewed by the British team, all highly experienced trial lawyers, as a needless diversion from the core business of simply winning the cases.

As a concept in philosophy, teleology can be applied practically or in the abstract to the study of purpose or design in natural phenomena.  Because the idea of teleology is there is (or can be) some inherent purpose or goal in the development and existence of things and events, it implies they are (at least sometimes) directed toward realizing that purpose.  As a tool of philosophers it can be helpful because usually it's contrasted with a mechanistic world view in which everything in the universe exists (or is perceived) as a series of cause-and-effect interactions with no inherent purpose.  Teleology is thus ultimately one extreme of a spectrum onto which observations and theories can be mapped, shifting around as need be.  Well and good, but teleological arguments do seem to exert a powerful attraction on those with some point to make, notably among those who like to assert the existence of a purposeful creator or designer of the universe, and these people are inclined to conflate elegance of argument with proof.  Ultimately, the application of teleology can provide a framework of arguments for conclusions which, however audacious and compelling, remain wholly speculative, and there is the suspicion that the internal logic which a teleological map can lend does lead some to be convinced by what are just arguments.  Advances in knowledge have in some fields diminished the appeal.  There was a time when, in the biological sciences, teleology was associated with the notion of vitalism, which held that living organisms possess a purpose or life force guiding their development and their role on the planet (and presumably the universe).  More recently however, the functionality and complexity of life has come to be understood through evolutionary processes, genetics and natural selection, there being neither the need nor any apparent evidence for a predetermined purpose.

Saturday, October 2, 2021

Pareidolia

Pareidolia (pronounced par-ei-do-lia)

In psychiatry and psychology, the perception of a recognizable image or meaningful pattern where none exists or is intended.

1994: From the Ancient Greek παρά (pará) (alongside, concurrent) + εἴδωλον (eídōlon) (image, phantom).  The word was coined by UFO debunker Steven Goldstein in the 22 June 1994 edition of Skeptical Inquirer magazine, a publication devoted to rational, evidence-based explanations of the paranormal, magic, UFOs, conspiracies and the many crackpot notions spread by new-agers, spiritualists and other nutjobs.

Pareidolia is a form of apophenia where the mind will attempt to find connections in random events, thoughts or patterns where none actually exist. Pareidolia concentrates the visual and audio aspects of the brain in constructing a perception from a vague stimulus.  Pre-dating the actual word, in some circles in both psychology and psychiatry, it was for some decades popular to attempt to induce a form of pareidolia in a patient to be able to assess them better, most famously with the Rorschach Ink Blots.

Technically, there are two forms of pareidolia: the first, the mechanistic, where man-made objects, by mere coincidence, have a resemblance to something else.  The second, the matrixed, is where natural phenomena such as rock formations, clouds or the surfaces of planets include shapes which can be interpreted as something human, animal or supernatural and, instead of being regarded as coincidental and amusing, are treated as having some inherent meaning or being evidence of some theory otherwise unsupported by any evidence.

The phenomenon of pareidolia manifests most frequently as the identification of the human face in various structures and, given the wealth of behavioral evidence of diminished orientation towards faces as well as the presence of face-perception impairments in autism spectrum disorder (ASD), interest was taken in the possibility of a relationship between the two.  It was not something in which observational studies offered obvious potential; even the design of experiments was challenging and the legacy of ASD research seemed no guide, the underlying mechanisms of the deficits in ASD, although habitually described as “unclear”, better called “unknown”.  In ASD research, face-like object stimuli which had been shown to evoke pareidolia in TD individuals (typically developing, according to criteria defined in the fifth edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM-5; 2013)) were used to test the effect of a global face-like configuration on orientation and perceptual processes in young children with ASD and age-matched TD controls.  That had demonstrated TD children were more likely to look first towards upright face-like objects than children with ASD, suggesting a global face-like configuration elicits a stronger orientation bias in TD children as compared to children with ASD.

However, once focused on the stimuli, both groups spent more time exploring the upright face-like object, suggesting both perceived it as a face.  The conclusion was that the result agreed with earlier work in the field of abnormal social orienting in ASD.  Whether variations on the approaches in ASD research would be useful in the study of pareidolia was of interest because face detection is an automatic, rapid and subconscious process, considered a core component of the social perceptual system subtending social behaviors.  That faces can (as the illusory detection called pareidolia) be perceived in non-face stimuli such as toast, clouds or landscapes by some, while many on the ASD spectrum suffer difficulties in the perception of the real thing, did at least hint at the possibility of a link or even perhaps the need to revise the parameters of ASD.

Detecting faces in non-face stimuli may have a strong adaptive value given that from an evolutionary point of view, the cost of erroneously detecting a face in non-face stimuli might be less than failing to detect another’s face in the environment.  Pareidolia may thus be just another spectrum condition in that the perception of pareidolic faces or other shapes in a variety of surfaces or spaces may vary little between people, the difference being more the individual’s reaction and the reporting of the event(s).

Friday, January 26, 2024

Brand

Brand (pronounced brand)

(1) The kind, grade, or make of a product or service, as indicated by a stamp, trademark, or such.

(2) A mark made by burning or otherwise, to indicate kind, grade, make, ownership (of both objects and certain animals) etc.

(3) A mark formerly put upon slaves or criminals, made on the skin with a hot iron.

(4) Any mark of disgrace; stigma.

(5) A kind or variety of something distinguished by some distinctive characteristic.

(6) A set of distinctive characteristics that establish a recognizable image or identity for a person or thing.

(7) A conflagration; a flame.  A burning or partly burned piece of wood (now rare except regionally although the idea of brand as “a flaming torch” still exists as a poetic device).  In the north of England & Scotland, a brand is a torch used for signalling. 

(8) A sword (archaic except as a literary or poetic device).

(9) In botany, a fungal disease of garden plants characterized by brown spots on the leaves, caused by the rust fungus Puccinia arenariae

(10) A male given name (the feminine name Brenda was of Scottish origin and was from the Old Norse brandr (literally “sword” or “torch”)).

(11) To label or mark with or as if with a brand.

(12) To mark with disgrace or infamy; to stigmatize.

(13) Indelibly to impress (usually in the form “branded upon one’s mind”)

(14) To give a brand name to (in commerce including the recent “personal brand”).

Pre 950: From the Middle English, from the Old English brond & brand (fire, flame, destruction by fire; firebrand, piece of burning wood, torch (and poetically “sword” or “long blade”)), from the Old High German brant, the ultimate source the primitive Indo-European bhrenu- (to bubble forth; brew; spew forth; burn).  It was cognate with the Scots brand, the Dutch & German Brand, the Old Norse brandr, the Swedish brand (blaze, fire), the Icelandic brandur and the French brand of Germanic origin.  The Proto-Slavic gorěti (to burn) was a distant relation.  Brand is a noun & verb, brander is a noun, brandless is an adjective, branded is a verb and branding is a noun & verb; the noun plural is brands.  Forms (hyphenated and not) like de-brand, non-brand, mis-brand & re-brand are created as required and, unusually for English, the form brander seems never to have been accompanied by the expected companion “brandee”.

Some work tirelessly on their “personal brand”, a term which has proliferated since social media gained critical mass.  Lindsay Lohan’s existence at some point probably transcended the notion of a personal brand and became an institution; the details no longer matter.

The verb brand dates from the turn of the fifteenth century in the sense of “to impress or burn a mark upon with a hot iron, cauterize; stigmatize” and originally described the marks imposed on criminals or cauterized wounds, the use developed from the noun.  The figurative use (often derogatory) of “fix a character of infamy upon” emerged in the mid-fifteenth century, based on the notion of the association with criminality.  The use to refer to a physical branding as a mark of ownership or quality dates from the 1580s and from this developed the familiar modern commercial (including “personal brands”) sense of “brand identity”, “brand recognition”, “brand-name” etc.  Property rights can also attach to brands, the idea of “brand-equity”.

Although it’s unknown just when the term “branding iron” (the (almost always) iron instrument which, when heated, burned brands into timber, animal hides etc) was first used (it was an ancient device), the earliest known citation dates only from 1828.  The “mark made by a hot iron” was older and in use since at least the 1550s, noted especially of casks and barrels, the marks indicating variously the maker, the type of contents, the date (of laying down etc) or the claimed quality.  By the early-mid nineteenth century the meaning had broadened to emphasise “a particular make of goods”, divorced from a particular single item, and the term “brand-name” appears first to have been used in 1889, something significant in the development of the valuable commodity of “brand-loyalty” although that seems not to have been an acknowledged concept in marketing until 1961.  The idea of “brand new” is based on the (not always accurate) notion a brand was the last thing to be applied to a product before it left the factory.

BMC ADO16 brands, clockwise from top left: Wolseley 1300, Riley Kestrel 1300, MG 1300, Austin 1300 GT, Morris 1100 and Vanden Plas Princess 1300.  The British Motor Corporation's (BMC) ADO16 (Austin Drawing Office design 16) was produced between 1962-1974 and was a great success domestically and in many export markets, more than two million sold in 1.1 & 1.3 litre form.  The Austin & Morris brands made up the bulk of the production but versions by Wolseley, Riley, MG & Vanden Plas were at various times available.  All were almost identical mechanically, with the brand differentiation restricted to the interior trim and the frontal panels.  This was the high (or low) point of the UK industry's “badge engineering”.  The abbreviation ADO is still sometimes said to stand for “Amalgamated Drawing Office”, a reference to the 1952 creation of BMC when the Austin & Morris design & engineering resources were pooled.  Like many such events subsequently, the amalgamation was more a “takeover” than a “merger” and the adoption of “Austin Drawing Office” reflected the priorities and loyalties of Leonard Lord (later Lord Lambury, 1896–1967), the former chairman of Austin who was appointed to head the conglomerate.  “Amalgamated Drawing Office” appears to be a creation of the internet age, the mistake still circulating.

Since the beginnings of mass-production made possible by powered industrial processes and the ability to distribute manufactured stuff world-wide, brand-names have become (1) more prevalent and (2) not of necessity as distinctive as once they were.  Historically, in commerce, a brand was an indication of something unique but as corporations became conglomerates they tended to accumulate brands (sometimes with no other purpose than ceasing production in order to eliminate competition) and over time it was often tempting to reduce costs by ceasing separate development and simply applying a brand to an existing line, hoping the brand loyalty would be sufficient to overlook the cynicism.  The British car manufacturers in the 1950s used the idea to maintain brand presence without the expense of developing unique products and, while originally some brand identity was maintained with the use of unique mechanical components or coachwork on a common platform, by the late 1960s the system had descended to what came to be called “badge engineering”, essentially identical products sold under various brand-names, the differences restricted to minor variations in trim and, of course, the badge.

Australia Day vs Invasion Day: The case for a re-brand

Although it came to be known as “Australia’s national day” and in some form or other had been celebrated or at least marked since the early nineteenth century, as a large-scale celebration (with much flag waving) it has been a thing only since the 1988 bi-centennial of white settlement.  What the day commemorated was the arrival in 1788 in what is now Sydney of the so-called “First Fleet” of British settlers, the raising of the Union Flag the first event of legal significance in what ultimately became the claiming of the continental land-mass by the British crown.  Had that land been uninhabited, things good and bad would anyway have happened but in 1788, what became the Commonwealth of Australia was home to the descendants of peoples who had been in continuous occupation since first arriving up to 50,000 years earlier (claims the history extends a further 10,000 years remain unsupported by archaeological evidence); conflict was inevitable and conflict there was, the colonial project a violent and bloody business, something the contemporary records make clear was well understood at the time but which really entered modern consciousness only in recent decades.

What the colonial authorities did was invoke the legal principle of terra nullius (from the Latin terra nūllīus (literally “nobody's land”)) which does not mean “land inhabited by nobody” but “land not owned by anyone”.  The rationale for that was the view the local population had no concept of land “ownership” and certainly no “records” or “title deeds” as they would be understood in English law.  Given that, not only did the various tribes not own the land but they had no system under which they could own land; thus the place could be declared terra nullius.  Of late, some have devoted much energy to justifying all that on the basis of “prevailing standards” and “accepted law” but even at the time there were those in London who were appalled at what was clearly theft on a grand scale, understanding that even if the indigenous population didn’t understand their connection to the land and seas as “ownership” as the concept was understood in the West, what was undeniable by the 1830s, when the doctrine of terra nullius was formally interpolated into colonial law, was that those tribes understood what “belonged” to them and what “belonged” to other tribes.  That’s not to suggest it was a wholly peaceful culture, just that borders existed and were understood, even if sometimes transgressed.  Thus the notion that 26 January should better be understood as “Invasion Day” and that, rather than a celebration of a blood-soaked expropriation of a continent, what is more appropriate is a treaty between the colonial power (and few doubt that is now the Australian government) and the descendants of the conquered tribes, now classified as “first nations”.
Although the High Court of Australia in 1992 overturned the doctrine of terra nullius when it was recognized that in certain circumstances the indigenous peoples could enjoy concurrent property rights to land with which they could demonstrate a continuing connection, this did not dilute national sovereignty nor in any way construct the legal framework for a treaty (or treaties).

The recognition that white settlement was an inherently racist project based on theft is said by some to be a recent revelation but there are documents of the colonial era (in Australia and elsewhere in the European colonial empires) which suggest there were many who operated on a “we stole it fair and square” basis and many at the time probably would not have demurred from the view 26 January 1788 was “Invasion Day” and that while it took a long time, ultimately that invasion succeeded.  Of course, elsewhere in the British Empire, other invasions also proved (militarily) successful but usually these conflicts culminated in a treaty, however imperfect may have been the process and certainly the consequences.  In Australia, it does seem there is now a recognition that wrong was done and a treaty is the way to offer redress.  That of course is a challenging path because, (1) as the term “first nations” implies, there may need to be dozens (or even hundreds according to the count of some anthropologists) of treaties and (2) the result will need to preserve the indivisible sovereignty of the Commonwealth of Australia, something which will be unpalatable to the most uncompromising of the activists because it means that whatever the outcome, it will still be mapped onto the colonial model.

As the recent, decisive defeat of a referendum (which would have created a constitutionally entrenched Indigenous advisory body) confirmed, anything involving these matters is contentious and while there are a number of model frameworks which could be the basis for negotiating treaties, the negotiating positions which will emerge as “the problems” are those of the most extreme 1% (or some small number) of activists whose political positions (and often incomes) necessitate an uncompromising stance.  Indeed, whatever the outcome, it’s probably illusory to imagine anything can be solved because there are careers which depend on there being no solution and it’s hard to envisage any government will be prepared to stake scarce political capital on a venture which threatens much punishment and promises little reward.  More likely is a strategy of kicking the can down the road while pretending to be making progress; many committees and boards of enquiry are likely to be in our future and, this being a colonial problem, the most likely diversion on that road will be a colonial fix.

One obvious colonial fix would be a double re-branding exercise.  The New Year’s Day public holiday could be shifted from 1 January to December 31 and re-branded “New Year’s Eve Holiday”, about the only practical change being that instead of the drinking starting in the evening it can begin early in the day (which for many it doubtless anyway does).  Australia Day could then be marked on 1 January and could be re-branded to “Constitution Day” although given the history that too might be found objectionable.  Still, the date is appropriate because it was on 1 January 1901 the country and constitution came into existence as a consequence of an act of the Imperial Parliament, subsequently validated by the parliament of the Commonwealth of Australia (an institution created by the London statute).  It’s the obvious date to choose because that was the point of origin of the sovereign state although in the narrow technical sense, true sovereignty was attained only in steps (such as the Statute of Westminster (1931)), the process not complete until simultaneously both parliaments passed their respective Australia Acts (1986).  The second re-branding would be to call 26 January “Treaty Day” although the actual date is less important than the symbolism of the name and Treaty Day could be nominated as the day on which a treaty between the First Nations and the Commonwealth could be signed.  The trick would be only to name 26 January as the date of the signing, the year a function of whenever the treaty negotiations are complete.  The charm of this approach is the can can be kicked down the road for the foreseeable future.  Any colonial administrator under the Raj would have recognized this fix.

Wednesday, November 23, 2022

Bliss

Bliss (pronounced blis)

(1) Perfect happiness; supreme joy or contentment.

(2) In theology, the ecstatic joy of heaven.

(3) A cause of great joy or happiness (archaic).

(4) A name used for a wide variety of locational, commercial and artistic purposes.

Pre-1000: From the Middle English blys, blice, blisce, blise, blesse & blisse, from the Old English bliss (bliss, merriment, happiness, grace, favor), from a variant of earlier blīds, blīþs & blīths (joy, gladness), from the Proto-West Germanic blithsjo & blīþisi (joy, goodness, kindness), the construct being blīthe (blithe) + -s, source also of the Old Saxon blizza & blīdsea (bliss), the construct being blithiz (gentle, kind) + -tjo (the noun suffix).  The early use was concerned almost exclusively with earthly happiness but, because of the fondness scholars in the Medieval Church felt for the word, in later Old English it came increasingly to describe spiritual ecstasy, perfect felicity and (especially), the joy of heaven.  In that sense as a verb it remains in common use in evangelical churches (especially in the southern US) to suggest the “attaining and existing in a state of perfect felicity”.  The adjective blissful was from the late twelfth century blisfulle (glad, happy, joyous; full of the glory of heaven).  Synonyms in a general sense include euphoria, happiness & joy while in a theological context there’s paradise, beatitude, blessedness, felicity, gladness, heaven & rapture; there is no better antonym than misery.  Bliss & blissfulness are nouns, blissy, blissed & blissless are adjectives, blissful is a noun & adjective and blissfully is an adverb; the noun plural is blisses.

The unrelated verb bless was from the Middle English blessen, from the Old English bletsian & bledsian and the Northumbrian bloedsian (to consecrate by a religious rite, make holy, give thanks), from the Proto-Germanic blodison (hallow with blood, mark with blood), from blotham (blood) and originally it meant the sprinkling of blood on pagan altars.  The pagan origins didn’t deter the early English scribes who chose the word for Old English bibles, translating the Latin benedicere and the Greek eulogein, both of which have a ground sense of “to speak well of, to praise” but were used in Scripture to translate the Hebrew brk (to bend (the knee), worship, praise, invoke blessings).  In late Old English, the meaning shifted towards “pronounce or make happy, prosperous, or fortunate” under the influence of the etymologically unrelated bliss (the resemblance obviously a factor in this) and by the early fourteenth century it was being used in religious services to mean “invoke or pronounce God's blessing upon” and is unusual in that there are no cognates in other languages.

State of bliss.  Lindsay Lohan embraces her inner Zen, Phuket, Thailand, 2017.

In idiomatic use, a "bliss ninny" is (1) one unrealistically optimistic (a Pollyanna, which, in Marxist theory, can align with the concept of "false consciousness"), (2) one who prefers to ignore or retreat from difficult situations rather than dealing with the problem (sometimes expressed as a "state of blissful ignorance") or (3) a student of theology intoxicated with the spiritual aspects of the teachings, but ignorant of the underlying scholarship.  A "bliss out" is the experience of great pleasure, often analogous with a "love rush" and the state in which one can be said to be "blissed up".  In economics, a "bliss point" is the quantity of consumption where any further increase would make the consumer less satisfied (as opposed to the law of diminishing returns, where increases deliver pleasure in decreasing increments); a classic example is alcohol.  It's used also in cooking as the measure of certain critical ingredients (fat, salt, sugar etc) at which point palatability is optimized.  To follow one's bliss is a notion from pop-psychology and the new age which advocates using one's awareness of what causes one to experience rapture as a guide for determining what constitutes authentic and proper living.

Charles O'Rear's original 1996 photograph, licenced in 2000 by Microsoft which used it as the desktop wallpaper for the Windows XP operating system.  Much time was spent in Microsoft's compatibility labs working out what would be the most "blissful" opening music (the "startup chime") to accompany the image's appearance upon boot-up. 

There are claims that Bliss, the default desktop wallpaper used in Microsoft’s Windows XP operating system, is the most viewed photograph of all time.  It was taken in 1996 by Charles O'Rear (b 1941) at Sonoma County, a viticultural region in California, using a Mamiya RZ67 film camera and as used by Microsoft, was barely changed, just cropped to better suit the shape of computer screens, the green hues slightly more saturated to render the image more “wallpaper-like”.

Quite how often Bliss has been viewed isn’t known.  Economists and others use a variety of mathematical models and equations to calculate numbers where exact or even indicative records either don’t exist or can’t be relied upon, a famous example of which is the “piano tuner” problem posed by Italian-American nuclear physicist Enrico Fermi (1901–1954) for his students to ponder.  The challenge for the students was to create a formula to estimate the number of piano tuners in Chicago, based only on the known population of the city.  It would thus be a task of extrapolation, using one constant and a number of assumptions.  Fermi deconstructed his equation thus:

(1) Chicago has a population of 3 million.

(2) Assume an average family contains four members so that the number of families in Chicago must be about 750,000.

(3) Assume one in five families owns a piano, meaning there will be 150,000 pianos in Chicago.

(4) Assume the average piano tuner services four pianos a day and works a five-day week, taking an annual two week vacation.

(5) Therefore, in his (50 week) working year, a tuner would tune 1,000 pianos. The formula is thus 150,000 divided by (4 x 5 x 50) = 150.  There must be around 150 piano tuners in Chicago.
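The five steps above can be sketched directly as a short script; every figure is one of the assumptions stated above rather than real data:

```python
# Fermi estimate of the number of piano tuners in Chicago.
# All inputs are the assumptions from steps (1)-(4) above.
population = 3_000_000               # (1) known constant
people_per_family = 4                # (2) assumption
piano_owning_fraction = 1 / 5        # (3) one family in five
pianos_tuned_per_day = 4             # (4) assumption
work_days_per_week = 5
work_weeks_per_year = 50             # 52 weeks less 2 weeks vacation

families = population / people_per_family            # 750,000 families
pianos = families * piano_owning_fraction            # 150,000 pianos
pianos_per_tuner_per_year = (pianos_tuned_per_day
                             * work_days_per_week
                             * work_weeks_per_year)  # 1,000 pianos/year

tuners = pianos / pianos_per_tuner_per_year
print(round(tuners))  # → 150
```

Changing any single assumption (say, one family in ten owning a piano) simply scales the answer proportionally, which is why the result is indicative rather than exact.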

The method obviously doesn’t guarantee an exactly correct result but it does provide an indicative number which might be off by no more than a factor of 2-3 and almost certainly within a factor of 10-12, so it’s reasonable to conclude there will be neither 15 nor 1,500 piano tuners.  A number with a factor error of even 2-3 is in most cases probably not a great deal of help (except to cosmologists, for whom a factor of 10 error remains “within cosmological accuracy”) but the piano tuner problem does illustrate how the concept can work and the more (useful) constants which are known, the more accurate the result is likely to be.

Bliss, a little greener and cropped to fit on computer monitors.

Even so, it’s probably impossible to estimate how often bliss has been viewed, even were one to assemble as many constants and assumptions as are available such as:

(1) Number of copies of Windows XP sold.

(2) Number of copies of Windows XP in use in each year since it was introduced.

(3) Number of users per copy of Windows XP.

(4) Number of instances which retained bliss as wallpaper.

(5) Number of times per day each user saw bliss.
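Purely as an illustration of the shape such a calculation would take, the five inputs above can be strung together in the same Fermi style.  Every number below is a hypothetical assumption invented for the sketch (none comes from sales records or any real data source), so only the structure, not the output, means anything:

```python
# Fermi-style sketch of total Bliss viewings.
# WARNING: every figure here is a hypothetical assumption for
# illustration only, not real data.
installed_base_by_year = {           # (1)/(2) assumed copies in use, millions
    2002: 100, 2005: 400, 2008: 600, 2011: 500, 2014: 300,
}
years_per_sample = 3                 # each sample point stands for ~3 years
users_per_copy = 1.2                 # (3) assumed
fraction_keeping_bliss = 0.5         # (4) assumed: half never change wallpaper
views_per_user_per_day = 5           # (5) assumed: boots, logons, lock screens
days_per_year = 365

total_views = sum(
    millions * 1_000_000 * users_per_copy * fraction_keeping_bliss
    * views_per_user_per_day * days_per_year * years_per_sample
    for millions in installed_base_by_year.values()
)
print(f"{total_views:.1e}")          # a very big number indeed
```

With these invented inputs the total lands in the trillions, but since the assumptions could each be wrong by a large factor, the only safe conclusion is the one in the text: the true figure is a big number.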

However, even with those and as many more assumptions as can be imagined, it’s doubtful a vaguely accurate number could be derived, simply because data such as the number of users who changed their wallpaper (or had such a change imposed on them by corporate policies) isn’t available and there’s no rational basis for an assumption.  However, although any estimate will almost certainly be out by millions or even billions, the Bliss viewing number will be a big number and the claim it is the world’s most viewed photograph is not implausible.

One of the reasons for the big number was the unexpected longevity of Windows XP which proved more enduring than two of its intended successors, the somewhat misunderstood Windows Vista and the truly awful Windows 8, the ongoing popularity of the thing meaning Microsoft repeatedly extended the end-date for support.  Introduced late in 2001, with a final substantive update made in 2008, support for Windows XP was intended to end in 2012 but such was the response that this was shifted in one form or another to 2014 for the mainstream products while for specialist installations (such as embedded devices), it lingered on until 2019.  That extension appealed to the nerd after-market which quickly provided hacks (with titles like “XP Update Extender”) to allow users to make XP on their desktop or laptop appear to Microsoft’s update services as one of the devices still supported.  Microsoft could have stopped this at any time but never did, which was a nice courtesy.

More productive but less blissful: the scene in Sonoma County, 2006 after the land was given over to a vineyard

Another aspect of XP where “bliss point” could be used was that its user interface proved for many something of an ideal, combining the basic design of the model introduced when the object-oriented GUI (graphical user interface) was offered on Windows 95 (and subsequently bolted to Windows NT4) with a few colorful embellishments.  So compelling was this that when, inexplicably, Microsoft introduced something less usable for Windows 8, the nerd after-market quickly mobilized and many “classic menus” appeared, the best of which remains “Open-Shell” (previously called “Classic Shell” & “Classic Start”), and there are those still so nostalgic for the ways of XP that some add it to their Windows 10/11 systems, even though the menu structures of those are a genuine improvement.  How many also add the Bliss wallpaper (which remains widely available) isn’t known but Microsoft certainly hasn’t attempted to suppress the memory, the Office 365 team including it in 2021 in a set of historical images for use with their Teams communication platform.

Microsoft Windows XP: The startup chime.