
Sunday, November 17, 2024

Now

Now (pronounced nou)

(1) At the present time or moment (literally a point in time).

(2) Without further delay; immediately; at once; at this time or juncture in some period under consideration or in some course of proceedings described.

(3) As “just now”, a time or moment in the immediate past (historically it existed as the now obsolete “but now” (very recently; not long ago; up to the present)).

(4) Under the present or existing circumstances; as matters stand.

(5) Up-to-the-minute; fashionable, encompassing the latest ideas, fads or fashions (the “now look”, the “now generation” etc).

(6) In law, as “now wife”, the wife at the time a will is written (used to prevent any inheritance from being transferred to a person of a future marriage) (archaic).

(7) In phenomenology, a particular instant in time, as perceived at that instant.

Pre 900: From the Middle English now, nou & nu, from the Old English nu (at the present time, at this moment, immediately), from the Proto-West Germanic nu, from the Proto-Germanic nu, from the primitive Indo-European nu (now) and cognate with the Old Norse nu, the Dutch nu, the German nun, the Old Frisian nu and the Gothic nu.  It was the source also of the Sanskrit and Avestan nu, the Old Persian nuram, the Hittite nuwa, the Greek nu & nun, the Latin nunc, the Old Church Slavonic nyne, the Lithuanian nu and the Old Irish nu-.  The original senses may have been akin to “newly, recently” and it was related to the root of new.  Since Old English it has often been merely emphatic, without any temporal sense (as in the emphatic use of “now then”, though that phrase originally meant “at the present time” and also (by the early thirteenth century) “at once”).  In early Middle English it often was written as one word.  The familiar use as a noun (the present time) emerged in the late fourteenth century while the adjective meaning “up to date” is listed by etymologists as a “mid 1960s revival” on the basis the word was used as an adjective with the sense of “current” between the late fourteenth and early nineteenth centuries.  The phrase “now and then” (occasionally; at one time and another) was in use by the mid 1400s, “now or never” having been in use since the early thirteenth century.  “Now” is widely used in idiomatic forms and as a conjunction & interjection.  Now is a noun, adjective & adverb; nowism, nowness & nowist are nouns; the noun plural is nows.

Right here, right now: Acid House remix of Greta Thunberg’s (b 2003) How dare you? speech by Theo Rio.

“Now” is one of the more widely used words in English and is understood to mean “at the present time or moment (literally a point in time)”.  However, it’s often used in a way which means something else: were one to say “I’ll do it now”, in the narrow technical sense that really means “I’ll do it in the near future”.  Even things which are treated as happening “now” really aren’t, such as seeing something.  Because light travels at a finite speed, it takes time for it to bounce from something to one’s eye, so just about anything one sees is an exercise in looking back to the past.  Even when reading something on a screen or page, one’s brain is processing something from a nanosecond (about one billionth of a second) earlier.  For most purposes, “now” is but a convincing (and convenient) illusion and even though, in a certain, special sense, everything in the universe is happening at the same time (now), it’s not something that can ever be experienced because of the implications of relativity.  None of this causes many problems in life but among certain physicists and philosophers, there is a dispute about “now” and there are essentially three factions: (1) that “now” happened only once in the history of the known universe and cannot again exist until the universe ends, (2) that only “now” can exist and (3) that “now” cannot ever exist.
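
The scale of that delay is a matter of simple arithmetic: the travel time is just distance divided by the speed of light, and the figures below bear out the “about one billionth of a second” claim.  What follows is a minimal sketch in Python; the viewing distances are illustrative assumptions rather than measurements, and the Moon figure is simply the familiar mean Earth-Moon distance.

C = 299_792_458  # speed of light in a vacuum (metres per second)

def light_delay_ns(distance_m):
    """Light travel time, in nanoseconds, over a distance given in metres."""
    return distance_m / C * 1e9

for label, metres in (("page at ~0.5 m", 0.5),
                      ("screen at ~0.7 m", 0.7),
                      ("the Moon at ~384,400 km", 384_400_000)):
    print(f"{label}: {light_delay_ns(metres):,.2f} ns")

# A page half a metre away is seen as it was roughly 1.7 nanoseconds ago;
# moonlight is about 1.3 seconds old by the time it reaches the eye, and that
# is before the far longer delays of neural processing are added.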

Does now exist? (2013), oil & acrylic on canvas by Fiona Rae (b 1963) on MutualArt.

The notion that “now” can have happened only once in the history of our universe (and according to the cosmological theorists variously there may be many universes (some which used to exist, some extant and some yet to be created) or our universe may now be in one of its many phases, each of which will start and end with a unique “now”) is tied up with the nature of time, the mechanism upon which “now” depends not merely for definition but also for existence.  That faction deals with what is essentially an intellectual exercise whereas the other two operate where physics and linguistics intersect.  Within the faction which says “now can never exist” there is a sub-faction which holds that to say “now” cannot exist is a bit of a fudge in that it’s not that “now” never happens but only that it can only ever be described as a particular form of “imaginary time”; an address in space-time in the past or future.  The purists however are absolutists and their proposition is tied up in the nature of infinity, something which renders it impossible ever exactly to define “now” because endlessly the decimal point can move so that “now” can only ever be tended towards and never attained.  If pushed, all they will concede is that “now” can be approximated for purposes of description but that’s not good enough: there is no now.

nower than now!: Lindsay Lohan on the cover of i-D magazine No.269, September, 2006.

The “only now can exist” faction find tiresome the proposition that “the moment we identify something as happening now, already it has passed”, making the point that “now” is the constant state of existence and that a mechanism like time exists only as a thing of administrative convenience.  The “only now can exist” faction are most associated with the schools of presentism or phenomenology and argue only the present moment (now) is “real” and that any other fragment of time can only be described, the past existing only in memory and the future only as anticipation or imagination; “now” is the sole verifiable reality.  They are interested especially in what they call “change & becoming”, making the point the very notion of change demands a “now”: events happen and things become in the present; without a “now”, change and causality are unintelligible.  The debate between the factions hinges often on differing interpretations of time: whether fundamentally it is subjective or objective, continuous or discrete, dynamic or static.  Linguistically and practically, “now” remains central to the human experience but whether it corresponds to an independent metaphysical reality remains contested.

Sunday, November 3, 2024

Anonymuncule

Anonymuncule (pronounced uh-non-uh-monk-u-elle)

An insignificant, anonymous writer.

1859: A portmanteau word, the construct being anony(mous) + (ho)muncule.  Homuncule was from the Latin homunculus (a little man), a diminutive of homō (man).  Anonymous entered English circa 1600 and was from the Late Latin anonymus, from the Ancient Greek ᾰ̓νώνῠμος (anṓnumos) (without name), the construct being ᾰ̓ν- (an-) (“not; without; lacking” in the sense of the negating “un-”) + ὄνῠμᾰ (ónuma), an Aeolic & Doric dialectal form of ὄνομᾰ (ónoma) (name).  The construct of the English form was an- +‎ -onym +‎ -ous.  The an- prefix was from the Ancient Greek ᾰ̓ν- (an-), the form of the privative prefix ᾰ̓- (a-) used before vowels and “h”; it was used to create words having the sense opposite to the word (or stem) to which the prefix is attached.  The element -onym (word; name) came from the international scientific vocabulary, reflecting a New Latin combining form, from Ancient Greek ὄνυμα (ónuma).  The –ous suffix was from the Middle English -ous, from the Old French –ous & -eux, from the Latin -ōsus (full, full of); a doublet of -ose in an unstressed position.  It was used to form adjectives from nouns to denote (1) possession of (2) presence of a quality in any degree, commonly in abundance or (3) relation or pertinence to.  In chemistry, it has a specific technical application, used in the nomenclature to name chemical compounds in which a specified chemical element has a lower oxidation number than in the equivalent compound whose name ends in the suffix -ic.  For example, sulphuric acid (H₂SO₄) has more oxygen atoms per molecule than sulphurous acid (H₂SO₃).  The Latin homunculus (plural homunculi) enjoyed an interesting history.  In medieval medicine, it was used in the sense of “a miniature man”, a creature once claimed by the spermists (once a genuine medical speciality) to be present in human sperm while in modern medicine the word was resurrected for the cortical homunculus, an image of a person with the size of the body parts distorted to represent how much area of the cerebral cortex of the brain is devoted to it (ie a “nerve map” of the human body that exists on the parietal lobe of the human brain).  Anonymuncule is a noun; the noun plural is anonymuncules.

Preformationism: Homunculi in sperm (1695) illustrated by Nicolaas Hartsoeker who is remembered also as the inventor in 1694 of the screw-barrel simple microscope.

Like astrology, alchemy once enjoyed a position of orthodoxy among scientists and it was the alchemists who first popularized homunculus, the miniature, fully formed human, a concept with roots in both folklore and preformationism (in biology, the theory that all organisms start their existence already in a predetermined form upon conception and this form does not change in the course of their lifetime (as opposed to epigenesis (the theory that an organism develops by differentiation from an unstructured egg rather than by simple enlarging of something preformed))).  It was Paracelsus (the Swiss physician, alchemist, lay theologian, and philosopher of the German Renaissance Theophrastus von Hohenheim (circa 1493-1541)) who seems to have been the first to use the word in a scientific paper, it appearing in his De homunculis (circa 1529–1532) and De natura rerum (1537).  As the alchemists explained, a homunculus (an artificial humanlike being) could be created through alchemy and in De natura rerum Paracelsus detailed his method.

A writer disparaged as an anonymuncule differs from one who publishes their work anonymously or under a pseudonym, the Chicago Tribune in 1871 explaining the true anonymuncule was a “little creature who must not be confounded with the anonymous writers, who supply narratives or current events, and discuss public measures with freedom, but deal largely in generalities, and very little in personalities.”  That was harsh but captures the place the species enjoy in the literary hierarchy (and it’s a most hierarchical place).  Anonymuncules historically were those writers who published anonymously or under pseudonyms, without achieving renown or even recognition, and there’s often the implication they are “mean & shifty types” who “hide behind their anonymity”.

Primary Colors: A Novel of Politics (1996), before and after the lifting of the veil.

Some however have good and even honourable reasons for hiding behind their anonymity although there is also sometimes mere commercial opportunism.  When former Time columnist Joe Klein (born 1946) published Primary Colors: A Novel of Politics (1996), the author was listed as “anonymous”, a choice made to avoid the political and professional risks associated with openly critiquing a sitting president and his administration.  Primary Colors was a (very) thinly veiled satire of Bill Clinton’s (b 1946; US president 1993-2001) 1992 presidential campaign and offered an insider's view of campaign life, showing both the allure and moral compromises involved.  By remaining anonymous, Klein felt more able candidly to discuss the ethical dilemmas and personal shortcomings of his characters, something that would have been difficult had his identity been disclosed, the conflicts of interest as a working political journalist obvious.  Critically and commercially, the approach seems greatly to have helped the roman à clef (a work of fiction based on real people and events) gain immediate notoriety, the speculation about the author’s identity lying at the core of the book’s mystique.  Others have valued anonymity because their conflicts of interest are insoluble.  Remarkably, Alfred Deakin (1856-1919; prime minister of Australia 1903-1904, 1905-1908 & 1909-1910), even while serving as prime-minister, wrote political commentaries for London newspapers including the National Review & Morning Post and, more remarkably still, some of his pieces were not uncritical of both his administration and his own performance in office.  Modern politicians should be encouraged to pursue this side-gig; it might teach them truthfulness and encourage them more widely to practice it.

For others, it can be a form of pre-emptive self defense.  The French philosopher Voltaire (François-Marie Arouet; 1694–1778) wrote under a nom de plume because he held (and expressed) views which often didn’t please kings, bishops and others in power and this at a time when such conduct was likely to attract persecution worse than censorship or disapprobation.  Mary Ann Evans (1819–1880) adopted the pseudonym George Eliot in an attempt to ensure her works would be taken seriously, avoiding the stigma associated with female authorship at the time.  George Eliot’s style of writing was however that of a certain sort of novelist and those women who wrote in a different manner were an accepted part of the literary scene; although Jane Austen’s name never appeared on her published works, when Sense and Sensibility (1811) appeared its author was listed as “A Lady”.  Although a success, all her subsequent novels were billed as: “By the author of Sense and Sensibility”, Austen's name never appearing on her books during her lifetime.  Ted Kaczynski (1942-2023), the terrorist and author of the Unabomber Manifesto (1995), had his own reasons (wholly logical but evil) for wanting his text to be read but his identity as the writer to remain secret.

Nazi poetry circle at the Berghof: Left to right, Adolf Hitler (1889-1945; Führer (leader) and German head of government 1933-1945 & head of state 1934-1945), Martin Bormann (1900–1945), Hermann Göring (1893–1946; leading Nazi 1922-1945, Hitler's designated successor & Reichsmarschall 1940-1945), and Baldur von Schirach (1907-1974; head of the Hitlerjugend (Hitler Youth) 1931-1940 & Gauleiter (district party leader) and Reichsstatthalter (Governor) of Vienna (1940-1945)), Berchtesgaden, Bavaria, Germany, 1936.  Of much, all were guilty as sin but von Schirach would survive to die in his bed at 67.

The "poet manqué" is a somewhat related term.  A poet manqué is an aspiring poet who never produced a single book of verse (although it’s used also of an oeuvre so awful it should never have been published and the poetry of someone Baldur von Schirach comes to mind.  The adjective manqué entered English in the 1770s and was used originally in the sense of “unfulfilled due to the vagary of circumstance, some inherent flaw or a constitutional lack”.  Because it’s so often a literary device, in English, the adjective does often retain many grammatical features from French, used postpositively and taking the forms manquée when modifying a feminine noun, manqués for a plural noun, and manquées for a feminine plural noun.  That’s because when used in a literary context (“poet manqué”, “novelist manqué” et all) users like it to remain inherently and obviously “French” and thus it’s spelled often with its diacritic (the accent aigu (acute accent): “é”) although when used casually (to suggest “having failed, missed, or fallen short, especially because of circumstances or a defect of character”) as “fly-half manqué”, “racing driver manqué” etc), the spelling manque” is sometimes used.

Manqué (that might have been but is not) was from the French manqué, past participle form of the sixteenth century manquer (to lack, to be lacking in; to miss), from the Italian mancare, from manco, from the Latin mancus (maimed, defective), from the primitive Indo-European man-ko- (maimed in the hand), from the root man- (hand).  Although it’s not certain, the modern slang adjective “manky” (bad, inferior, defective (the comparative mankier, the superlative mankiest)), in use since the late 1950s, may be related.  Since the 1950s, the use in the English-speaking world (outside of North America) has extended to “unpleasantly dirty and disgusting” with a specific use by those stationed in Antarctica where it means “being or having bad weather”.  The related forms are the noun mankiness and the adverb mankily.  Although it’s not an official part of avian taxonomy, bird-watchers (birders) in the UK decided “manky mallard” was perfect to describe a mallard bred from wild mallards and domestic ducks (they are distinguished by variable and uneven plumage patterns).  However, it’s more likely manky is from the UK slang mank which was originally from Polari mank and used to mean “disgusting, repulsive”.

No poet manqué:  In January 2017, Lindsay Lohan posted to Instagram a poem for her 5.2 million followers, the verse a lament of the excesses of IS (the Islamic State), whetting the appetite for the memoir which might one day appear (hopefully "naming names").  The critical reaction to the poem was mixed but the iambic pentameter in the second stanza attracted favorable comment:

sometimes i hear the voice of the one i loved the most
but in this world we live in of terror
who i am to be the girl who is scared and hurt
when most things that happen i cannot explain
i try to understand
when i'm sitting in bed alone at 3am
so i can't sleep, i roll over
i can't think and my body becomes cold
i immediately feel older.....
 
than i realise, at least i am in a bed,
i am still alive,
so what can really be said?
just go to bed and close the blinds,
still and so on, i cannot help but want to fix all of these idle isis
minds
because,
there has to be something i can figure out
rather than living in a world of fear and doubt
they now shoot, we used to shout.
 
if only i can keep trying to fix it all
i would keep the world living loving and small
i would share my smiles
and give too Many kisses

Sunday, October 20, 2024

Heterochromatic

Heterochromatic (pronounced het-er-uh-kroh-mat-ik or het-er-oh-kruh-mat-ik)

(1) Of, having, or pertaining to more than one color (especially as relating to heterochromia (in medicine & ophthalmology, the anatomical condition in which multiple pigmentations or colorings occur in the eyes, skin or hair)).

(2) Having a pattern of mixed colors.

(3) Of light, having more than one wavelength.

(4) In genetics, of or relating to heterochromatin (in cytology, tightly coiled chromosome material; believed to be genetically inactive).

1890–1895: The construct was hetero- + chromat- + -ic.  The hetero- prefix was from the Ancient Greek ἕτερος (héteros) (the other (of two), another, different; second; other than usual) and was itself an ancient compound, the first element meaning "one, at one, together", from the primitive Indo-European root sem- (one; as one, together with), the second cognate with the second element in the Latin al-ter, the Gothic an-þar and the Old English o-ðer "other."  It's familiar in constructs such as “heterosexual”, “heterogeneity” et al.  In Classical Greek, there was quite a range of application including Heterokretes (true Cretan (ie one bred from the old stock)), heteroglossos (of or from a foreign language), heterozelos (zealous for one side (ie “one-eyed” in the figurative sense)), heterotropos (of a different sort or fashion (literally “turning the other way”)) and the useful heterophron (raving (literally “of other mind” (ie “he’s barking mad”))).  The adjective chromatic dates from the turn of the seventeenth century and was used first of music in the sense of “involving tones foreign to the normal tonality of the scale, not diatonic” and was from the Latin chromaticus, from the Ancient Greek khrōmatikos (relating to color, suited for color) (which was applied also to music), from khrōma (genitive khrōmatos) (color, complexion, character (but used mostly metaphorically of embellishments in music)); the original meaning was “skin, surface”.  In the Greek, khrōma was used also for certain modifications of the usual diatonic music scale but quite why it came to be used of music remains unclear, the most supported speculation being the influence of the extended sense of khrōma (“ornaments, makeup, embellishments”) via the idea of it being “characteristic” of a musical scale or speech.  The -ic suffix was from the Middle English -ik, from the Old French -ique, from the Latin -icus, from the primitive Indo-European -kos & -os, formed with the i-stem suffix -i- and the adjectival suffix -kos & -os.  The form existed also in the Ancient Greek as -ικός (-ikós), in Sanskrit as -इक (-ika) and the Old Church Slavonic as -ъкъ (-ŭkŭ); a doublet of -y.  In European languages, adding -kos to noun stems carried the meaning "characteristic of, like, typical, pertaining to" while on adjectival stems it acted emphatically; in English it's always been used to form adjectives from nouns with the meaning “of or pertaining to”.  A precise technical use exists in physical chemistry where it's used to denote certain chemical compounds in which a specified chemical element has a higher oxidation number than in the equivalent compound whose name ends in the suffix -ous (eg sulphuric acid (H₂SO₄) has more oxygen atoms per molecule than sulphurous acid (H₂SO₃)).  Heterochromatic, heterochrome & heterochromous are adjectives, heterochrony, heterochromia & heterochromatism are nouns and heterochromatically is an adverb; the noun plural is heterochromias.

The obvious antonym is monochromatic (although in modern use that’s come often to mean “black & white” rather than “a single color”).  The mono- prefix was from the Ancient Greek μόνος (mónos) (alone, only, sole, single), from the Proto-Hellenic mónwos and the ending may be compared with οἶος (oîos) (only, single) from óywos.  The etymology of the initial element remains murky but may be from the primitive Indo-European men- (small) and it may be compared with the Ancient Greek μανός (manós) (sparse, rare), the Armenian մանր (manr) (slender, small) and even the Proto-West Germanic muniwu (small fish, minnow).  The sense in which it’s understood in photography dates from 1940 when (presumably almost instantly) the verbal shorthand became “mono”, exactly the same pattern of use when the need arose to distinguish between color printers and those using only black consumables.  The word was used as an adjective after 1849 although monochromatic (of one color, consisting of light of one wavelength and probably based either on the French monochromatique or the Ancient Greek monokhrōmatos) had been used thus since at least 1807 (presumably it pre-dated this because the adverb monochromatically is documented since 1784).  The alternative forms are both self-explanatory: unicolour, used usually of single solids, and monotint, rare and used mostly as a technical term in art-production where, properly, it describes a reproduction of a multi-color image using just shades of a single color.  Monochrome is a noun & adjective, monochromaticity, monochromy & monochromist are nouns, monochromic is an adjective and monochromatically is an adverb.

Sectoral heterochromia.

In medicine & ophthalmology, heterochromia describes a coloration variously of hair, skin and the eyes but it’s used most commonly of eyes.  At the cellular level, heterochromia is a function of the production, delivery, and concentration of the pigment melanin and the condition may be inherited or caused by genetic mosaicism, chimerism, disease, or injury.  It is not uncommon in certain breeds of domesticated animals (notably those subject to breeding programmes which tend to reduce the gene pool and cats with a predominantly white coat).  In clinical use heterochromia of the eye is known as heterochromia iridum or heterochromia iridis and can be complete, sectoral, or central.  The complete heterochromia is the best known by virtue of being the most photogenic for Instagram and other purposes: it exists when one iris is a wholly different color from the other.  In sectoral heterochromia, part of one iris is a different color from the rest while in central heterochromia, typically there is a ring around the pupil (less commonly seen as spikes radiating from the pupil) in a different color.

Ms Amina Ependieva (b 2008, left) was born in Grozny, the capital city of Chechnya, Russia; she has two rare, genetic conditions: (1) albinism, which reduces the quantity of the pigment melanin in the skin and (2) heterochromia, the latter manifesting as her having one blue and one brown eye.  Ms Ependieva was photographed in 2019, aged eleven.  Alkira (b 2015, right) is a Persian cat with heterochromia, felines with predominantly white coats the most prone to the condition; in domestic cats, it's most common for them to have one yellow and one blue eye.  The condition does not affect vision.

Bilateral gynandromorphs: butterfly (left), bird (centre) & lobster (right).  The visually related phenomenon is gynandromorphism (an organism (most typically an insect, crustacean or bird) with both male and female attributes & characteristics), the construct being the Ancient Greek γυνή (gynē) (female) + ἀνήρ (anēr) (male) + μορφή (morphē) (form).  In Western science it was first extensively documented where it was most obvious: in Lepidoptera (butterflies and moths), but it has since been identified in literally dozens of species.

Lindsay Lohan likes the heterochromatic look, whether in the vertical (Mercedes-Benz Fall 2004 Fashion Week, Smashbox Studios, Culver City, California, September, 2003, left), the horizontal (Maxim Hot 100 Party, Gansevoort Hotel, New York, May 2007, centre) or the variegated (ABC’s Good Morning America studios, New York, November 2022, right).

1985 BMW 635CSi (E24, 1976-1989) with heterochromatic headlights.

In Europe and beyond, yellow headlights are still seen: although few jurisdictions still mandate their use, for some they’ve become an aesthetic choice.  Between 1936 and 1993, French law required all vehicles to have yellow headlights, the intent being to reduce glare and improve visibility in foggy or rainy conditions, the science being the belief yellow light (which has a longer wavelength) reduced eye strain and made driving at night safer by “cutting through” mist and fog more effectively than white light.  The German philosopher Goethe (Johann Wolfgang von Goethe (1749–1832)) was said to be heterochromatic and he was interested in eyes & color, in 1810 publishing a 1,400-page treatise on the topic, asserting he was the first to question the validity of Sir Isaac Newton’s (1642–1727) ideas about light and color: “That I am the only person in this century who has the right insight into the difficult science of colors, that is what I am rather proud of, and that is what gives me the feeling that I have outstripped many.”  Goethe did misinterpret the results of some of Newton’s experiments, thinking he was explaining their invalidity, but he did reformulate the topic of color in an entirely new way.  While Newton had viewed color as a physical “process” in which light entered the eye after striking and reflecting off an object, Goethe was the first to explain the sensation of color is shaped also by our perception (the mechanics of human vision and the way our brains process information, the eyes being an out-growth of the brain).  So, explained Goethe, what we see is a construct of (1) the properties of the object, (2) the dynamics of lighting and (3) our perception.

Tuesday, October 8, 2024

Decorum

Decorum (pronounced dih-kawr-uhm or dih-kohr-uhm)

(1) Dignified propriety of behavior, speech, dress, demeanour etc.

(2) The quality or state of being decorous, or exhibiting such dignified propriety; orderliness; regularity.

(3) The conventions of social behaviour; an observance or requirement of one’s social group (sometimes in the plural as “decorums” the use an allusion to the many rules of etiquette (the expectations or requirements defining “correct behaviour” which, although most associated with “polite society”, do vary between societal sub-sets, differing at the margins)).

1560–1570: A learned borrowing (in the sense of “that which is proper or fitting in a literary or artistic composition”) from the Latin decōrum, noun use of the neuter of decōrus (proper, decent (ie decorous)), from decor (beauty, elegance, charm, grace, ornament), probably from decus (an ornament; splendor, honor), the Proto-Italic dekos (dignity), from the primitive Indo-European dekos (that which is proper), from dek- (take, perceive) (and used in the sense of “to accept” on the notion of “to add grace”).  By the 1580s the use of decorum had spread from its literary adoption from the Latin to the more generalized sense of “propriety of speech, behavior or dress; formal politeness”, a resurrection of the original sense in Latin (polite, correct in behaviour, that which is seemly).  Decorously (in a decorous manner) is an adverb, decorousness (the state or quality of being decorous; a behavior considered decorous) is a noun and indecorous (improper, immodest, or indecent) and undecorous (not decorous) are adjectives.  The adjective dedecorous (disgraceful; unbecoming) is extinct.  Decorum is a noun; the noun plural is decora or decorums.

Whether on rugby pitches, race tracks, in salons & drawing rooms or geo-politics, disagreements over matters of decorum have over millennia been the source of innumerable squabbles, schisms and slaughter but linguistically, the related adjective decorous (characterized by dignified propriety in conduct, manners, appearance, character, etc) has also not been trouble-free.  Decorous seems first to have appeared in the 1650s from the Latin decōrus and akin to both decēre (to be acceptable, be fitting) and docēre (to teach (in the sense of “to make fitting”)) with the adjectival suffix –ōsus appended.  In Latin, the -ōsus suffix (full, full of) was a doublet of -ose in an unstressed position and was used to form adjectives from nouns, to denote possession or presence of a quality in any degree, commonly in abundance.  English picked this up from the Middle English -ous, from the Old French –ous & -eux, from the Latin -ōsus and it became productive.  In chemistry, it has a specific technical application, used in the nomenclature to name chemical compounds in which a specified chemical element has a lower oxidation number than in the equivalent compound whose name ends in the suffix -ic.  For example, sulphuric acid (H₂SO₄) has more oxygen atoms per molecule than sulphurous acid (H₂SO₃).  Decorous is an adjective, decorousness is a noun and decorously is an adverb.

In use there are two difficulties with decorous: (1) the negative forms and (2) how it should be pronounced, both issues with which mercifully few will be troubled (or even see what the fuss is about) but to a pedantic subset, much noted.  The negative forms are undecorous & indecorous (both of which rarely are hyphenated) but there are differences in meaning.  Undecorous means simply “not decorous” which can be bad enough but indecorous is used to convey “improper, immodest, or indecent” which truly can be damning in some circles, so the two should carefully be applied.  There’s also the negative nondecorous but it seems never to have been a bother.  The problem is made worse by the adjective dedecorous (disgraceful; unbecoming) being extinct; it would have been a handy sort of intermediate state between the “un-” & “in-” forms and the comparative (more dedecorous) & superlative (most dedecorous) would have provided all the nuance needed.  The related forms are the nouns nondecorousness, undecorousness & indecorousness and the adverbs nondecorously, undecorously & indecorously.

The matter of the pronunciation of decorous is one for the pedants but there’s a lot of them about and like décor, the use is treated as a class-identifier, the correlation between pedantry and class-identifiers probably high; the two schools of thought are  dek-er-uhs & dih-kawr-uhs (the second syllable -kohr- more of a regionalism) and in 1926 when the stern Henry Fowler (1858–1933) published his A Dictionary of Modern English Usage, he in his prescriptive way insisted on the former.  By 1965, when the volume was revised by Sir Ernest Gowers (1880–1966), he noted the “pronunciation has not yet settled down”, adding that “decorum pulls one way and decorate the other”.  In his revised edition, Sir Ernest distinguished still between right & wrong (a position from which, regrettably, subsequent editors felt inclined to retreat) but had become more descriptive than his predecessor of how things were done rather than how they “ought to be” done and added while “most authorities” had come to prefer dih-kawr-uhs, that other arbiter, the Oxford English Dictionary (OED) had listed dek-er-uhs first and it thus “may win”.  By the 2020s, impressionistically, it would seem it has.

Décor is another where the pronunciation can be a class-identifier and in this case it extends to the spelling, something directly related.  In English, the noun décor dates from 1897 in the sense of “scenery and furnishings” and was from the eighteenth century French décor, a back-formation from the fourteenth century décorer (to decorate), from the Latin decorare (to decorate, adorn, embellish, beautify), the modern word thus duplicating the Latin decor.  The original use in English was of theatre stages and such but the term “home décor” was in use late in the 1890s to describe the technique of hanging copies of old masters as home decoration.  From this evolved the general use (decorations and furnishings of a room, building etc), well established by the mid 1920s and it’s been with us ever since.  Typically sensibly, the French l'accent aigu (acute accent) (the “é” pronounced ay in French) was abandoned by the Americans without corroding society but elsewhere, décor remained preferred among certain interior decorators and their clients, the companion French pronunciation obligatory too.

Courtroom decorum: Lindsay Lohan arriving at court, Los Angeles, 2011-2013.  All the world's a catwalk.

Top row; left to right: 9 Feb 2011; 23 Feb 2011; 10 Mar 2011; 22 Apr 2011.
Centre row; left to right: 23 Jun 2011; 19 Oct 2011; 2 Nov 2011; 14 Dec 2011.
Bottom row; left to right: 17 Dec 2011; 30 Jan 2012; 22 Feb 2012; 28 Mar 2012.

In English, the original use of decorum was in the technical jargon of what would come to be called literary theory; decorum describing a structuralist adherence to formal convention.  It was applied especially to poetry where rules of construction abound and it was about consistency with the “canons of propriety” (in this context defined usually as “good taste, good manners & correctness” which in our age of cultural (and linguistic) relativism is something many would label as “problematic” but all are free to “plug-in” their own standards).  Less controversially perhaps, decorum was understood as the matter of behavior on the part of the poet qua ("in the capacity or character of; as being" and drawn from the Latin legal qua (acting in the capacity of, acting as, or in the manner of)) their poem and therefore what is proper and becoming in the relationship between form and substance.  That needs to be deconstructed: decorum was not about what the text described because the events variously could be thought most undecorous or indecorous but provided the author respected the character, thought and language appropriate to each, the literary demands of decorum were satisfied.  Just as one would use many different words to describe darkness compared to those used of sunlight, a work on a grand and profound theme should appear in a dignified and noble style while the trivial or humble might be earthier.

The tradition of decorum is noted as a theme in the works by the Classical authors from Antiquity but the problem there is that we have available only the extant texts and they would be but a fragment of everything created; it’s acknowledged there was much sifting and censoring undertaken in the Medieval period (notably by priests and monks who cut out “the dirty bits”) and it’s not known how much was destroyed because it was thought “worthless” or, worse, “obscene”.  What has survived may be presumed to be something of the “best of” Antiquity and there’s no way of knowing if in Athens and Rome there were proto-post modernists who cared not a fig for literary decorum.  The Greek and Roman tradition certainly seems to have been influential however because decorum is obvious in Elizabethan plays.  In William Shakespeare’s (1564–1616) Much Ado About Nothing (circa 1598), the comic passages such as the badinage between Beatrice and Benedick appear for amusing effect in colloquial dramatic prose while the set-piece romantic episodes are in formal verse; the very moment Benedick and Beatrice realize they are in love, that rise in the emotional temperature is signified by them suddenly switching to poetic verse.

Lindsay Lohan and her lawyer in court, Los Angeles, December, 2011.

By contrast, in rhetoric, the conventions of literary decorum were probably most useful when being flouted.  Winston Churchill’s (1875-1965; UK prime-minister 1940-1945 & 1951-1955) World War II (1939-1945) speeches are remembered now for their eloquence and grandeur but there’s much evidence that at the time many listeners regarded their form as an anachronism and preferred something punchier; what made them effective was the way he could mix light & dark, high and low to lend his words a life which transcended the essential artificiality of a speech.  Once, when discussing serious matters of international relations and legal relationships between formerly belligerent powers, he paused to suggest that while Germany might be treated harshly after all that had happened, the Italians “…might be allowed to work their passage back.” [to the community of the civilized world].  What the flouting of decorum could do was make something worthy but dull seem at least briefly interesting or at least amusing, avoiding what the British judge Lord Birkett (1883–1962) would have called listening to “the ‘refayned’ and precious accents of a decaying pontiff”.

In English literature, it was during the seventeenth & eighteenth centuries that decorum became what might now be called a fetish, a product of the reverence for what were thought to be the “Classical rules and tenets” although quite how much these owed to a widespread observance in Antiquity and how much to the rather idealized picture of the epoch painted by medieval and Renaissance scholars really isn’t clear.  Certainly, in the understanding of what decorum was there were influences ancient & modern, Dr Johnson (Samuel Johnson (1709-1784)) observing that while terms like “cow-keeper” or “hog-herd” would be thought too much the vulgar talk of the peasantry to appear in “high poetry”, to the Ancient Greeks there were no finer words in the language.  Some though interpolated the vulgarity of the vernacular just because of the shock value the odd discordant word or phrase could have, the English poet Alexander Pope (1688-1744) clearly enjoying mixing elegance, wit and grace with the “almost brutal forcefulness” of “the crude, the corrupt and the repulsive” and it’s worth noting he made his living also as a satirist.  His example must have appealed to the Romantic poets because they sought to escape the confines imposed by the doctrines of Neoclassicism, William Wordsworth (1770–1850) writing in the preface to Lyrical Ballads (1798 and co-written with Samuel Taylor Coleridge (1772-1834)) that these poems were there to rebel against “false refinement” and “poetic diction”.  He may have had in mind the odd “decaying pontiff”.

Saturday, October 5, 2024

Ballistic

Ballistic (pronounced buh-lis-tik)

(1) Of a projected object, having its subsequent travel determined or describable by the laws of exterior ballistics; used mostly in denoting or relating to the flight of projectiles after the initial thrust has been exhausted, moving under their own momentum and subject to the external forces of gravity and the fluid dynamics of air resistance.

(2) Of or relating to ballistics.

(3) In slang and idiomatic use (as “go ballistic”, “went ballistic” etc), to become overwrought or irrational; to become enraged or frenziedly violent.  For those who need to be precise in describing such instances, the comparative is “more ballistic” and the superlative “most ballistic”.

(4) Of a measurement or measuring instrument, depending on a brief impulse or current that causes a movement related to the quantity to be measured.

(5) Of materials, those able to resist damage (within defined parameters) by projectile weapons (ballistic nylon; ballistic steel etc), the best-known use of which is the “ballistic vest”.

(6) As “ballistics gel(atin)”, a substance which emulates the characteristics and behavior under stress of human or animal flesh (used for testing the effect of certain impacts, typically shells fired from firearms).

(7) As “ballistic podiatry”, industry slang for “the act of shooting oneself in the foot”, used also by military doctors to describe soldiers with such self-inflicted injuries.  The more general term for gunshot wounds is “ballistic trauma”.

(8) In ethno-phonetics, as “ballistic syllable”, a phonemic distinction in certain Central American dialects, characterized by a quick, forceful release and a rapid crescendo to a peak of intensity early in the nucleus, followed by a rapid, un-controlled decrescendo with fade of voicing.

(9) As “ballistic parachute”, a parachute used in light aircraft and helicopters, ejected from its casing by a small explosion.

1765–1775: The construct was the Latin ballist(a) (a siege engine (ancient military machine) for throwing stones to break down fortifications), from the Ancient Greek βαλλίστρα (ballístra), from βάλλω (bállō) (I throw), + -ic.  The -ic suffix was from the Middle English -ik, from the Old French -ique, from the Latin -icus, from the primitive Indo-European -kos & -os, formed with the i-stem suffix -i- and the adjectival suffix -kos & -os.  The form existed also in the Ancient Greek as -ικός (-ikós), in Sanskrit as -इक (-ika) and the Old Church Slavonic as -ъкъ (-ŭkŭ); a doublet of -y.  In European languages, adding -kos to noun stems carried the meaning "characteristic of, like, typical, pertaining to" while on adjectival stems it acted emphatically; in English it's always been used to form adjectives from nouns with the meaning “of or pertaining to”.  A precise technical use exists in physical chemistry where it's used to denote certain chemical compounds in which a specified chemical element has a higher oxidation number than in the equivalent compound whose name ends in the suffix -ous (eg sulphuric acid (H₂SO₄) has more oxygen atoms per molecule than sulphurous acid (H₂SO₃)).  The modern use (of the big military rockets or missiles (those guided while under propulsion, but which fall freely to their point of impact (hopefully the intended target))) dates from 1949 although the technology pre-dated the label.  The term “ballistic missile” seems first to have appeared in 1954 and remains familiar in the “intercontinental ballistic missile” (ICBM).  The figurative use (“go ballistic”, “went ballistic”) to convey “an extreme reaction; to become irrationally angry” is said to have been in use only since 1981 which is surprising.  To “go thermo-nuclear” or “take the nuclear option” are companion phrases but the nuances do differ.  The noun ballistics (art of throwing large missiles; science of the motion of projectiles) seems first to have appeared in 1753 and was from the Latin ballist(a), from the Ancient Greek ballistes, from ballein (to throw, to throw so as to hit that at which the object is aimed (though used loosely also in the sense “to put, place, lay”)), from the primitive Indo-European root gwele- (to throw, reach).  In the technical jargon of the military and aerospace industries, the derived forms included (hyphenated and not) aeroballistic, antiballistic, astroballistic, ballistic coefficient, quasiballistic, semiballistic, subballistic, superballistic & thermoballistic.  In science and medicine, the forms include bioballistic, cardioballistic, electroballistic and neuroballistic.  Ballistic & ballistical are adjectives, ballisticity, ballistician & ballistics are nouns and ballistically is an adverb; the wonderful noun plural is ballisticies.

The basilisk was a class of large bore, heavy bronze cannons used during the late Middle Ages and in their time they were a truly revolutionary weapon, able quickly to penetrate fortifications which in some cases had for centuries enabled attacks to be resisted.  Although there were tales of basilisks with bores between 18-24 inches (460-610 mm), these were almost certainly a product of the ever-fertile medieval imagination and there’s no evidence any were built with a bore exceeding 5 inches (125 mm).  As a high-velocity weapon however, that was large enough for it to be highly effective, the 160 lb (72 kg) shot carrying a deadly amount of energy and able to kill personnel or destroy structures.  Because of the explosive energy needed to project the shot, the barrels of the larger basilisks could weigh as much as 4000 lb (1,800 kg); typically they were some 10 feet (3 m) in length but the more extraordinary, built as long-range devices, could be as long as 25 feet (7.6 m).  Despite the similarity in form, the name basilisk was unrelated to “ballistics” and came from the basilisk of mythology, a fire-breathing, venomous serpent able to kill and destroy, its glance alone deadly.  It was thus a two-part allusion: (1) the idea of “spitting fire” and (2) the thought the mere sight of an enemy’s big cannons would be enough to scare an opponent into retreat.

As soon as it appeared in Europe, it was understood the nature of battlefields would change and the end of the era of the castle was nigh.  It was the deployment of the big cannons which led to the conquest of Constantinople (capital of the Byzantine Empire, now Istanbul in the Republic of Türkiye) in 1453 after a 53-day siege; the city’s great walls which for centuries had protected it from assault were worn down by the cannon fire to the point where the defenders couldn’t repair the damage at the same rate as the destruction.  In an example of the way economics is a critical component of war, the Austrian cannon makers had offered the cannons to the Byzantines but the empire was in the throes of one of the fiscal crises which determined the outcomes of so many conflicts and had no money with which to make the purchase.  The practical Austrians then sold their basilisks to the attacking Ottoman army and the rest is history.  Despite such successes, the biggest of the basilisks became rare after the mid sixteenth century as military tactics evolved to counter their threat by becoming more mobile; the traditional siege of static targets became less decisive and smaller, more easily transported cannon, lighter and cheaper to produce, came to dominate artillery formations.

Queen Elizabeth's Pocket Pistol, Navy, Army and Air Force Institute Building, Dover Castle, Dover, Kent, England.

Queen Elizabeth's Pocket Pistol was a basilisk built in 1544 in Utrecht (in the modern-day Netherlands), the name derived from it being presented to Henry VIII (1491–1547; King of England (and Ireland after 1541) 1509-1547) as a gift for his daughter (the future Elizabeth I (1533–1603; Queen of England & Ireland 1558-1603)) although the first known reference to it being called “Queen Elizabeth's Pocket Pistol” dates from 1767.  Some 24 feet (7.3 m) in length and with a 4.75 inch (121 mm) bore, it was said to be able to launch a 10 lb (4.5 kg) ball almost 2000 yards (1.8 km) although as a typical scare tactic, the English made it known to the French and Spanish that its shots were heavier and able to reach seven miles (12 km).  Just to make sure the point was understood, it was installed to guard the approaches to the cliffs of Dover.  Modern understandings of the physics of ballistics and the use of computer simulations have since suggested there may have been some exaggeration in even the claim of a 2000 yard range and it was likely little more than half that.  Such use of propaganda remains part of the military arsenal to this day.
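
The modern reassessments rest on nothing more exotic than a point-mass model of exterior ballistics: gravity plus air drag, integrated step by step until the shot returns to earth.  Below is a minimal Python sketch of the method only; the ball's mass and bore are taken from the figures above but the muzzle velocity, drag coefficient and elevation are assumptions chosen for illustration, not recorded data for the gun.

import math

def carry_m(v0, elev_deg, mass=4.5, diameter=0.121, cd=0.5,
            rho=1.225, g=9.81, dt=0.001):
    """Horizontal distance (m) a round shot travels before falling back to launch height."""
    area = math.pi * (diameter / 2) ** 2
    k = 0.5 * rho * cd * area / mass   # quadratic-drag constant (per metre)
    vx = v0 * math.cos(math.radians(elev_deg))
    vy = v0 * math.sin(math.radians(elev_deg))
    x = y = 0.0
    while y >= 0.0:
        v = math.hypot(vx, vy)
        vx -= k * v * vx * dt          # drag always opposes the motion
        vy -= (g + k * v * vy) * dt    # gravity plus the vertical drag component
        x += vx * dt
        y += vy * dt
    return x

# Assumed black-powder muzzle velocity of ~350 m/s at 10 degrees of elevation
print(f"estimated carry: {carry_m(350, 10):.0f} m")

In a vacuum, the same shot at the same assumed velocity and elevation would carry roughly 4,300 m (v² sin2θ / g), so the drag term accounts for most of the shortfall; plugging different estimates of muzzle velocity and drag into models of this kind is how claims like the 2000 yard range come to be tested.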

It was fake news:  Responding to viral reports, the authoritative E!-News in April 2013 confirmed Lindsay Lohan did not "go ballistic" and attack her ex-assistant at a New York City club.  For some reason, far and wide, the fake news had been believed.

Despite the costs involved and the difficulties in maintaining and transporting big cannons, some militaries couldn’t resist them and predictably, Adolf Hitler (1889-1945; Führer (leader) and German head of government 1933-1945 & head of state 1934-1945), who thought just about everything (buildings, tanks, trains, monuments, cars, battleships et al) should be bigger, oversaw some of the most massive artillery pieces ever built, often referred to by historians as “super heavy guns”.  The term is no exaggeration and the most striking examples were the Schwerer Gustav and Dora.  With a bore of 31.5 inches (800 mm), the Schwerer Gustav and Dora apparatus weighed 1350 tons (1225 tonnes) and could fire a projectile as heavy as 7.1 tons (6.4 tonnes) some 29 miles (47 km).  Two were built, configured as “railway guns” and thus of most utility in highly developed areas where rail tracks lay conveniently close to the targets.  The original design brief from the army ordnance office required a long-range device able to destroy heavily fortified targets and for that purpose, they could be effective.  However, each demanded a crew of several thousand soldiers, technicians & mechanics with an extensive logistical support system in place to support their operation, the rate of fire sometimes fewer than one round per day.  The Schwerer Gustav’s most successful deployment came during the siege of Sevastopol (1942).  Other big-bore weapons followed but success proved patchy, especially as allied control of the skies made the huge, hard-to-hide machines vulnerable to attack and even mounting them inside rock formations offered no protection against the Royal Air Force’s (RAF) new, ground-penetrating bombs.

Schwerer Gustav being readied for a test firing, Rügenwalde, Germany, 19 March 1943, Hitler standing second from the right with Albert Speer (1905–1981; Nazi court architect 1934-1942; Nazi minister of armaments and war production 1942-1945) to his right.  Hitler referred to the huge gun as “meine stählerne faust” (my steel fist) but it never fulfilled his high expectations and like many of the gigantic machines which so fascinated the Führer (who treated complaints about their ruinous cost as “tiresome”) it was a misallocation of scarce resources.

It was the development of modern ballistic rockets during World War II (1939-1945) which put an end to big guns (although the Iraqi army did make a quixotic attempt to resurrect the concept, something which involved having a British company “bust” UN (United Nations) sanctions by claiming their gun barrel components were “oil pipes”), the Germans’ A4 (V-2) rocket the world’s first true long-range ballistic missile.  The V-2 represented a massive leap forward in both technology and military application and briefly it would touch “the edge of space” before beginning its ballistic trajectory, reaching altitudes of over 100 km (62 miles) before descending toward its target.  Everything in the field since has to some degree been an evolution of the V-2, the three previous landmarks being (1) the Chinese “Fire Arrows” of the early thirteenth century which were the most refined of the early gunpowder-filled rockets which followed a simple ballistic path, (2) the eighteenth century Indian Mysorean Rockets, with the considerable advance of metal casings, their great range a shock to soldiers of the British Raj who had become accustomed to enjoying a technology advantage and (3) the British Congreve Rockets of the early nineteenth century, essentially a refinement of the Mysorean design enhanced by improved metallurgy and aerodynamics and made more effective still when combined with the well organized logistics of the British military.