
Saturday, January 28, 2023

Noon

Noon (pronounced noon)

(1) Midday; twelve o'clock in the daytime or the time or point at which the sun crosses the local meridian (the time of day when the sun is in its zenith).

(2) Figuratively (usually in literary or poetic use), the highest, brightest, or finest point or part; culmination; capstone; apex.

(3) The corresponding time in the middle of the night; midnight (archaic but historic use means old documents with the word must be read with care, entries appearing as both “noon” & “noon of the night”).

(4) Three o’clock in the afternoon (archaic).

(5) To relax or sleep around midday (as “to noon” “nooning” or “nooned”) (archaic).

(6) The letter ن in Arabic script.

(7) Midday meal (archaic).

Pre 900: From the Middle English noen, none & non, from the Old English nōn (the ninth hour), from a Germanic borrowing of the Classical Latin nōna (ninth hour) (short for nōna hōra), the feminine singular of nonus (ninth), contracted from novenos, from novem (nine).  It was cognate with the Dutch noen, the (obsolete) German non and the Norwegian non.  Synonyms (some archaic) include apex, capstone, meridian, midday, noontide, noonday, noontime, nones (the ninth hour of daylight), midpoint (of the day) & twelve.  Descendants include the Modern English none and the Scots nane (none), Noon the proper noun enduring as a surname.  Noon is a noun & verb and noons, nooning & nooned are verbs; the noun plural is noons.

Although derived from the Latin word for the number nine, the English word noon refers to midday, the time when the sun reaches the meridian.  The Romans however counted the hours of the day from sunrise which, for consistency, was declared for this purpose to be 06:00; the ninth hour (nona hora) was thus 15:00.  The early Christians adopted Jewish customs of praying at certain hours and when Christian monastic orders formed, the ecclesiastical reckoning of the daily timetable was structured around the hours for prayer.  In the earliest schedules, the monks prayed at three-hour intervals: 6-9 pm, 9 pm-midnight, midnight-3 am and 3-6 am.  The prayers are known as the Divine Office and the times at which they are to be recited are the canonical hours:

Vigils: night
Matins: dawn
Lauds: dawn
Prime: 6 am (first hour)
Terce: 9 am (third hour)
Sext: noon (sixth hour)
None: 3 pm (ninth hour)
Vespers: sunset
Compline: before bed

The shift in the common meaning of noon from 3 pm to 12 noon began in the twelfth century when the prayers said at the ninth hour were set back to the sixth, the reasoning practical rather than theological: the unreliability of medieval time-keeping devices and the seasonal elasticity of the hours of daylight in northern regions meant it was easier to standardise on an earlier hour.  Additionally, in monasteries and on holy days, fasting ended at nones, which perhaps offered another administrative incentive to nudge it up the clock.  An alternative explanation offered by social historians is that it was simply the abbots deciding to align their noon meal with those taken in the towns and villages, the Old English word non having assumed the meaning “midday” or “midday meal” by circa 1140.  Whatever the reason, the meaning shift from "ninth hour" to "sixth hour" seems to have been complete by the fourteenth century, the same path of evolution followed by the Dutch noen.  Noon is an example of what etymologists call a fossil word, one which embeds customs of former ages.

The use as a synonym for midnight existed between the seventeenth & nineteenth centuries, apparently because the poetic phrase “noon of the night” entered popular use.  The noun forenoon (the morning (ie (be)fore + noon)) applied especially to the latter part of it, those hours “when business is done”, the word emerging circa 1500.  The noun noonday (middle of the day) was first used by Myles Coverdale (1488–1569), the English cleric and ecclesiastical reformer remembered for his printed translation of the Bible into English (1535), and it was used as an adjective from the 1650s.  In the Old English there had been non tid (noon-tide, midday, noon) and non-tima (noon, noon-time, midday).  The noun afternoon (part of the day from noon to evening) dates from circa 1300 and it was subject to an interesting shift in grammatical form.  In the fifteenth & sixteenth centuries it was used as “at afternoon” but from circa 1600 this shifted to “in the afternoon”; it emerged as an adjective from the 1570s.  In the Middle English there had been the mid-fourteenth century aftermete (afternoon, part of the day following the noon meal).

Lindsay Lohan at nuncheon, Scott's Restaurant, Mayfair, London, 2015.

The noun nuncheon was from the mid-fourteenth century nōn-schench (slight refreshment of food (with or without liquor) taken at midday), the name shifting with the meal, the nuncheon originally taken in the afternoon (ie notionally the three o’clock meal), the construct being none (noon) + shench (draught, cup), from the Old English scenc, related to scencan (to pour out, to give to drink) and cognate with the Old Frisian skenka (to give to drink) and the German & Dutch schenken (to give).  The most obvious descendant of nuncheon is luncheon (and thus lunch).

Lāhainā Noon is the solar phenomenon (known only in the tropics) when the Sun culminates at the zenith at solar noon, passing directly overhead, thus meaning objects underneath cast no shadow, creating an effect something like the primitive graphics in some video games.  The name Lāhainā Noon (Lāhainā Noons the plural) was the winner in a contest organised by Hawai'i's Bishop Museum in 1990, the museum noting the word lāhainā (originally lā hainā) may be translated into English as “cruel sun” but makes reference also to the severe droughts experienced in that part of the island of Maui.  The old Hawai'ian name for the event was the much more pleasing kau ka lā i ka lolo (the sun rests on the brains).

Saturday, May 1, 2021

Horology

Horology (pronounced haw-rol-uh-jee)

(1) The science of time.

(2) The art and science of making timepieces or measuring time.

(3) In Orthodox Christianity, the office-book of the Greek Church for the canonical hours.

1852: The construct was the Ancient Greek hōro (combining form of hōra (hour; part of the day; any period of time)) + -logy.  The suffix -ology was formed from -o- (as an interconsonantal vowel) + -logy.  The origin in English of the -logy suffix lies with loanwords from the Ancient Greek, usually via Latin and French, where the suffix (-λογία) is an integral part of the word loaned (eg astrology from astrologia) since the sixteenth century.  French picked up -logie from the Latin -logia, from the Ancient Greek -λογία (-logía).  Within Greek, the suffix is an -ία (-ía) abstract from λόγος (lógos) (account, explanation, narrative), and that a verbal noun from λέγω (légō) (I say, speak, converse, tell a story).  In English the suffix became extraordinarily productive, used notably to form names of sciences or disciplines of study, analogous to the names traditionally borrowed from the Latin (eg astrology from astrologia; geology from geologia) and by the late eighteenth century, the practice (despite the disapproval of the pedants) extended to terms with no connection to Greek or Latin such as those building on French or German bases (eg insectology (1766) after the French insectologie; terminology (1801) after the German Terminologie).  Within a few decades of the intrusion of modern languages, combinations emerged using English terms (eg undergroundology (1820); hatology (1837)).  In this evolution, the development may be thought similar to the latter-day proliferation of “-isms” (fascism; feminism etc).  Descendants of the Greek hōra came into use in many languages including the Hebrew הוֹרָה (hóra), the Romanian horă and the Turkish hora, while from the Modern Greek χορό (choró) (accusative of χορός (khorós) (dance)) came Hora, a circle dance popular in the Balkans and Israel.  In Late Latin, the derived form was horologium.

Between the early sixteenth and nineteenth centuries the meaning was restricted to describing clocks or their dials and by at least 1820 reference books were noting the “term horology is at present more particularly confined to the principles upon which the art of making clocks and watches is established”.  The earlier sense in English reflected the inheritance from the Latin horologium (instrument for telling the hour (and in Medieval Latin “a clock”)), from the Ancient Greek hōrologion (instrument for telling the hour (ie the sundial, water-clock etc)), from hōrologos (telling the hour).  Horological was used as early as the 1590s, horologiography (the art or study of watches and timepieces) by the 1630s and the first horologists (the practitioners of horologiography) appear to have emerged (or at least first advertised themselves) in 1795.  The noun horologe (a clock or sundial) is long obsolete.  Horology, horologiography & horologist are nouns, horological is an adjective and horologically is an adverb; the noun plural is horologists.

Greenwich Mean Time

Greenwich Mean Time (GMT) is the mean solar time at the Royal Observatory in Greenwich, London.  Its daily reset point is now midnight but, in the past, it has been set at different times including noon and for this reason, if GMT is of substantive importance in some historic document, it’s sometimes necessary to determine which method of calculation applied at the time.  Because of Earth's uneven angular velocity in its elliptical orbit and its axial tilt, noon (12:00:00) GMT is rarely the exact moment the Sun crosses the Greenwich meridian and reaches its highest point in the sky.  The event may occur up to 16 minutes before or after noon GMT, a discrepancy included in the calculation of time: noon GMT is thus the annual average (ie "mean") moment of this event, which accounts for the "mean" in GMT.  In the English-speaking world, GMT is often used as a synonym for Coordinated Universal Time (UTC) and while this is close enough for many practical purposes, in the narrow technical sense GMT is now a time zone rather than time’s absolute reference.  For navigation, it is considered equivalent to UT1 (the modern form of mean solar time at 0° longitude) but this meaning can differ from UTC by up to 0.9 seconds, so GMT should no longer be used for purposes demanding a high degree of precision.

Shepherd Gate (slave) Clock, Royal Observatory, Greenwich.

The Shepherd gate clock is installed at the gates of the Royal Observatory in Greenwich and was the first clock ever to display GMT to the public.  It is a “slave clock”, hardwired to the Shepherd “master clock” which was first commissioned at the observatory in 1852.  One obviously unusual aspect of the gate clock is that it has 24 hours on its face rather than the typical 12, thus at 12 noon the hour hand points straight down rather than up.  In digital timepieces the user commonly has the choice of a 12 or 24 hour format but in analogue devices 24-hour dials have historically been rare, although Ford Australia did include one as a novelty in the first series of its locally produced LTD & Landau (1973-1976).  The clock remained a one-off.
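The downward-pointing hand at noon is simple arithmetic: on a 12-hour dial the hour hand makes two revolutions a day, while on a 24-hour dial it makes one, so (assuming the 0/24 mark sits at the top, as on the Shepherd clock) noon falls halfway around.  A minimal sketch (the function name is illustrative only):

```python
def hour_hand_angle(hour: float, hours_on_dial: int = 12) -> float:
    """Clockwise angle in degrees of the hour hand from the top of the
    dial, for a dial divided into `hours_on_dial` hours."""
    return (hour % hours_on_dial) * 360 / hours_on_dial

# On a conventional 12-hour dial, noon puts the hand straight up...
print(hour_hand_angle(12, 12))   # 0.0
# ...but on a 24-hour dial with midnight at the top, noon points straight down
print(hour_hand_angle(12, 24))   # 180.0
```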

Lindsay Lohan wearing Rolex Datejust Blue Diamond.  Ms Lohan has a number of Rolexes and some watch sites have noted her preferences for the larger, chunkier men's versions.  That larger face is certainly easier to read but some also prefer the more extravagant look.

Between 1852 and 1893, the Shepherd master clock was the baseline of the UK’s system of time; its time signal was sent over telegraph wires to London and many other cities, including some in Ireland, and from 1866 the signal was also relayed to a clock at Harvard University, Cambridge, Massachusetts, via the new transatlantic submarine cable.  One of history’s most significant clocks, it originally indicated astronomical time, in which the counting of the 24 hours of each day starts at noon, though this was later changed to start at midnight.  It continues to show GMT and is never adjusted for daylight saving time.

Tuesday, July 23, 2024

Lunch

Lunch (pronounced luhnch)

(1) A light midday meal between breakfast and dinner; luncheon.

(2) Any light meal or snack.

(3) To eat lunch.

(4) In slang, as “out to lunch”, dim, vague, uselessly ineffectual.

(5) In slang as “lunchy”, old-fashioned; passé; out of style (obsolete).

(6) In slang as “eating their lunch”, outwitting an opponent.

(7) In Caribbean slang (among older folk), mid-afternoon tea.

(8) In first-class cricket, the break in play between the first and second sessions (confusingly for those new to cricket, although the first session is often called the "pre-lunch session", the second is known as the "lunch session" and not the "post lunch session").

(9) In Minnesota, USA, any small meal, especially one eaten at a social gathering.

1580:  It’s never been clear which came first: lunch or luncheon.  The origin of both is thought to lie in a dissimilated variant of nuncheon, the Middle English nonechenche (light midday meal and drink), equivalent to none (noon) + schench (from the Old English scenc or scencan (to pour out, give drink)), cognate with the Dutch and German schenken.  Apparently unrelated, the Old English had nonmete (afternoon meal, literally "noon-meat").  Nonechenche was possibly altered by the northern English dialect lunch (hunk of bread or cheese) from 1590 which may be from lump or the Spanish lonja (slice, literally “loin”).  Because dinner (in the sense of the biggest or main meal of the day) could be eaten either at around noon, in the evening or at night, there was a need for a word for the meal filling the gap between breakfast and dinner.  Lunch is a noun & verb, luncher is a noun, lunching is a noun & verb and lunched is a verb; the noun plural is lunches.

A montage of a languid Lindsay Lohan lingering over lunch.

The idea of lunch as it’s now understood took a long time to evolve; to “take a lunch” in 1786 was recorded as eating a chunk of something (perhaps evolved from lump), carved sufficiently large to constitute a filling meal and as late as 1817, the US Webster’s Dictionary offered as the only definition of lunch "a large piece of food", a meaning long obsolete, and in the 1820s the Oxford English Dictionary (OED) thought it either “a vulgarism or a fashionable affectation”.  Nevertheless, lunch’s intrusion into the language in the nineteenth century does suggest some sort of social change was afoot, either in the type, style or timing of meals or at least the words used to describe them.  Lunch-money was attested from 1868; lunch-time from 1821; lunch hour from 1840 and the lunch-break from 1960.  The slang phrase out to lunch in the sense of “a bit vague, dim, clueless (but some way short of actually insane)” was first recorded in 1955, the notion being of someone "not there" and instead at lunch.  The luncheon voucher was a public health measure, introduced in 1946 by the UK’s post-war Labour government (1945-1951).  It was literally a paper voucher which represented the mechanism by which the government would subsidize midday meals taken in private restaurants by employees in workplaces where there was no staff canteen.  Luncheon vouchers were an attempt to improve the national diet by encouraging the consumption of healthy, nutritious food at a time when so many basic items were still subject to the rationing imposed during wartime (indeed, some foodstuffs were subject to rationing only after the conflict ceased).  In an example of bureaucratic inertia, the scheme existed to an extent until 2013 by which time the effects of inflation had made the by then trivial subsidy inconsequential.

Receptacles in which to store one’s lunch for transport have a history.  The lunch-box is documented from 1864, the lunch-pail from 1891.  Those were descriptive nouns whereas lunch-bucket emerged in the 1990s as an adjective indicating working-class men or values, bucket presumably the best word because it was universally understood in the English-speaking world to an extent pail was not.  Lunch-bag seems never to have become a common form despite the bags being widely used but in the 1970s the verb brown-bag (and the related brown-bagging) emerged, referring to bringing lunch or liquor in a brown paper bag.  A long-time staple of a lunch-pail’s contents, lunch-meat (a processed form of meat-based protein produced in a size which, when sliced, was aligned with the slices of standard loaves of bread and thus convenient for making sandwiches) was first documented in 1931.  The lunch-counter (a long, elevated table or bench where customers eat standing or sitting on high stools) is an 1854 invention of US English.

The possible future of lunch: Grilled jellyfish.  Although many fish species are in decline, jellyfish numbers are growing.  The part eaten for lunch is called the umbrella. 

The portmanteau word brunch dates from circa 1890, a British student slang merging of breakfast and lunch, according to the magazine Punch (1 August 1896).  It appeared in 1895 in the defunct Hunter's Weekly but, two years earlier, at the University of Oxford, the students had drawn what must at the time have seemed an important distinction: the combination-meal, when nearer the usual breakfast hour, is "brunch" and, when nearer luncheon, is "blunch".  That’s a linguistic curiosity in that brunch survived while blunch did not, yet the modern understanding of a brunch appears to be something taken closer to the time of lunch than breakfast.  It may be that brunch was just the more pleasingly attractive word, blunch not so well rolling off the tongue.  Several spellings of luncheon were noted in the decades after the 1640s, the now standardised form not widespread until 1706.  Of uncertain origin, in the 1580s it was used to describe something like the northern English dialectal lunch (hunk of bread or cheese), though influenced by the Spanish lonja (a slice, literally "loin"), blended with or influenced by nuncheon, from the mid-fourteenth century Middle English nonechenche (light mid-day meal), from none (noon) + schench (drink), from the Old English scenc, from scencan (pour out).

The possible future of lunch: Fishcakes.  Fishcakes are a way the by-products of the industrial processing of seafood can be sold as a protein source (ie a use for what would otherwise go to agricultural feed or the pet-food business, or end up as waste).

The etymology of all these words is tangled and there are reasons to suspect the similar forms arose independently in different places rather than as forks of anything vaguely lineal.  The OED discounts the notion of luncheon, which dates from the 1650s, being derived from the verb lunch because the verb wasn’t to be attested for another century, suggesting instead there may be some connection (by analogy) with words like truncheon, used to simulate a French origin; that is speculative but such things are not unknown in ever class-conscious England.  Whatever the origin, luncheon does seem to have been used to describe an early afternoon meal eaten by those who took dinner at noon.

Thursday, November 16, 2023

Amethyst

Amethyst (pronounced am-uh-thist)

(1) A purple or violet transparent variety of quartz used as a gemstone.  The color is caused by the presence of iron compounds in the crystal structure.

(2) As the oriental amethyst, a purple variety of sapphire.

(3) A variety of shades of purple; darker hues of fuchsia.

(4) A thing containing or set with an amethyst or amethysts.

(5) A nymph from Greek mythology.

1250-1300: From the Middle English amatist, from the twelfth century Old French ametiste (the Modern French being améthyste) and directly from the Medieval Latin amatistus, from the Classical Latin amethystus, from the Ancient Greek αμέθυστος (améthystos) (amethyst), a noun use of the adjective which translated literally as “not intoxicating; not drunken”, the construct being a- (not) + methyskein (make drunk) (from methys (wine (and a variant stem of methýein (to intoxicate), the source of methylene))) + -tos (the Greek verbal adjective suffix); the source was the primitive Indo-European root medhu- (honey; mead), famous as the nectar the Valkyries would serve to fallen warriors in the halls of Valhalla.  The meaning in Ancient Greek was literal, the belief being that the stone prevented drunkenness, the link to reality being the color which resembled red wine diluted with water, which was of course less intoxicating; chemistry then rather than magic, but those who took their wine pure were still inclined to wear rings with an amethyst stone in the hope of avoiding a hangover.

One (dodgy) legend of Amethyst

Lindsay Lohan in amethyst-colored tank-top.

In antiquity, the Greeks believed amethyst could prevent intoxication and the practice was to wear the gem in a ring if the drinking session was to be epic, although some maintain there were those who kept a stone in their mouth, which seems not a good idea when taking strong drink.  As was often the case, later writers also created their own Greek "myths" and one was the story of how the beautiful nymph Amethyst, while walking to worship at the Temple of Diana, had the misfortune of crossing paths with Bacchus, the god of wine.  Angry (as often he was), he had vowed vengeance on the next person he met, so he unleashed his two guardian tigers upon the poor nymph.  As the great beasts bounded towards her, the goddess Diana intervened and, to spare her from her terrible fate, transformed her into a pure, clear stone.  Remorse immediately seized Bacchus and, in an attempt to atone, he poured his wine over the stone, staining the crystal a deep, violet hue and that's how Amethyst lent her name to the crystal.  Although presented in Classical guise, this "myth" dates only from the Renaissance, the French poet Remy Belleau (1528-1577) creating the tale in 1576.

1994 Porsche 911 Turbo 3.6  (964) in Amethyst Metallic over Classic Gray.

The presence of iron compounds in clear Quartz produces Amethyst, additional trace amounts varying the purple coloration.  It ranges in hue from pale red-violet to deep violet and may be transparent or opaque.  In addition, it is sometimes layered with white Quartz (as Chevron Amethyst), found in combination with Cacoxenite, mixed with Citrine as Ametrine, or in rare cases, “rutilated” with Goethite.  In the modern system of classification it's a semi-precious stone but to the ancients it was a “gem of Fire” and at some points in history it has been as highly valued as diamonds.  Anglican bishops wear an episcopal ring often set with an amethyst, an allusion to Acts 2:15 in which the Apostles are noted to be sober at nine in the morning, the piece of scripture from which is derived that measure of English respectability: never taking a G&T before noon.  Medieval European soldiers wore amethyst amulets into battle in the belief the stone had healing properties and in several cultures they were a popular burial stone, found most often in Anglo-Saxon graves in England.  Faith in the healing power of the stone is maintained by the new-age movement, something probably no more nutty than their other beliefs.

An amethyst crystal cluster.

In the weird world of the new age, crystals are of great significance and each is said to be imbued with its own unique properties, the amethyst known often as the “stone of the dreamers”, apparently because it can inspire positive thoughts and encourage one to go forth and turn one’s dreams into reality.  Long associated with February, the month the Romans dedicated to the water god Neptune, it’s the stone of Saint Valentine and faithful love, signifying ecclesiastical dignity as the Bishop’s Stone.  To new agers, it carries the energy of fire and passion, creativity and spirituality, yet bears the logic of temperance and sobriety, and crystal specialists among the practitioners extol its properties:

"In the modern world, Amethyst’s healing properties and meanings are similar to their historic roots and it remains a remarkable stone of spirituality and contentment; known for its metaphysical abilities to still the mind and inspire an enhanced meditative state.  Its inherent high frequency purifies the aura of any negative energy or attachments, creating a protective shield of light around the body, allowing one to remain clear and centred while being open to spiritual direction.  Amethyst stimulates the Third Eye, Crown, and Etheric Chakras enhancing cognitive perception as well as accelerating the development of intuitive and psychic ability. It initiates wisdom and greater understanding and is a stone of comfort for those grieving the loss of a loved one.  Amethyst’s ability to expand the higher mind also enhances one’s creativity and passion, strengthening the imagination and intuition while refining the thinking processes. It helps in the assimilation of new ideas, putting thought into action, and brings projects to fruition; amethyst is also well-known as a talisman of focus and success.  Amethyst is an exceptional crystal for wearing on the body, for use in healing rituals, and for enhancing one’s environment.  It has however been known to fade if left in direct sunlight so care should be taken and it’s wise from time to time to clear its energies by holding the stone under running water for short periods.  Remarkably, an unpolished amethyst also has special properties which can recharge other crystals so keep one in a dark space and leave some crystals with it to re-energize."

Monday, February 1, 2021

Knownothingism

Knownothingism (pronounced noh-nuhth-ing-is-uhm)

A humorous coining to describe the American Party (1855 on), based on the stock reply its members were instructed to give if asked probing questions.

1855: A compound word, know + nothing + -ism.  Know is from the Middle English knowen, from the Old English cnāwan (to know, perceive, recognise), from the Proto-Germanic knēaną (to know), from the primitive Indo-European ǵneh- (to know).  Nothing is from the Middle English noon thing, non thing, na þing, nan thing & nan þing, from the Old English nāþing & nān þing (nothing (literally “not any thing”)) and was equivalent to no + thing (and can be compared with the Old English nāwiht (nothing (literally “no thing”)) and the Swedish ingenting (nothing (literally “not any thing”, “no thing”)).  The –ism suffix was from the Ancient Greek ισμός (ismós) & -isma noun suffixes, often directly, sometimes through the Latin –ismus & -isma (from where English picked up -ize) and sometimes through the French –isme or the German –ismus, all ultimately from the Ancient Greek (where it tended more specifically to express a finished act or thing done).  It appeared in loanwords from Greek, where it was used to form abstract nouns of action, state, condition or doctrine from verbs and on this model, was used as a productive suffix in the formation of nouns denoting action or practice, state or condition, principles, doctrines, a usage or characteristic, devotion or adherence (criticism; barbarism; Darwinism; despotism; plagiarism; realism; witticism etc).

Knowing nothing

A nineteenth century US political phenomenon, the Know Nothing Party was originally a secret society known as the Order of the Star Spangled Banner (OSSB) which, like organisations such as the Freemasons or Les Clefs d’Or, featured rites of initiation, passwords, hand signs and demanded of its members a solemn pledge never to betray the order.  One practical measure was an instruction to members, if asked probing questions about the society, to answer only “I know nothing.”  The phrase was widely reported and members of the OSSB, despite many name-changes, were always known as “the know nothings”.  As a tactic in politics, there is much to commend it: as easy as it is for one to talk one’s way into trouble, it’s easier still to avoid it by saying nothing.

The roots of the party lay in New York City politics, emerging in 1843 as the American Republican Party, spawning a number of forks in different states which in 1853 merged, becoming the OSSB.  In this form, seeking national influence, it was re-branded, firstly in 1854 as the Native American Party and a year later, the American Party.  Sounding surprisingly modern, Trumpesque even (as opposed to emulating Crooked Hillary Clinton, which would be described as "knoweverythingism"), the platform supported deportation of foreign beggars and criminals, a twenty-one year naturalization period for immigrants and mandatory Bible reading in schools.  Their stated aim was to restore their vision of what America should look like: a society underpinned by temperance, Protestantism and self-reliance with the American nationality and work ethic enshrined as the nation's highest values; a kind of Make America Great Again vibe.  Their especial concern was the infiltration of Roman Catholics and the influence of the Pope and they advocated the dismissal of all Catholics from public office.  In this vein, their catchy campaign slogan was “Rum, Romanism and Ruin”.

The Know Nothings in Louisiana (2018) by Marius M. Carriere Jr, University Press of Mississippi, 230pp.

The Know Nothings were the American political system’s first major third party.  In the early nineteenth century, the two parties left over from the revolution were the Federalists and the Democratic-Republicans.  Later would come the National Republicans, the Whigs, the Democrats and the Republicans but it was the Know Nothings which filled the political vacuum even as the Whigs were disintegrating.  They were the first party to leverage economic concerns over immigration as a major part of their platform and though short-lived, the values and positions of the Know Nothings ultimately contributed to the two-party system which has characterised US politics since the 1860s.

Monday, August 22, 2022

Pleonasm & Tautology

Pleonasm (pronounced plee-uh-naz-uhm)

(1) In rhetoric, the use of more words than are necessary to express an idea; a redundancy in wording.

(2) An instance of this, as “free gift” or “true fact”.

(3) Any redundant word or expression.

(4) In a variety of disciplines, an excess in the number or size of parts (now rare except in pathology).

1580–1590: A learned borrowing from the French pléonasme, from the Late Latin pleonasmus, from the Ancient Greek πλεονασμός (pleonasmós) (redundancy, surplus), from πλεονάζω (pleonázō) (to be superfluous), from pleonázein (to be or have more than enough (in grammatical use "superfluously to add”)), a combining form of πλείων (pleíōn) (more), from the primitive Indo-European root pele- (to fill).  The adjective pleonastic (characterized by pleonasm, redundant in language, using more words than are necessary to express an idea) dates from 1778 although sources list the related pleonastical as being in use since the 1650s.  Pleonasm is a noun, pleonastic and pleonasmic are adjectives and pleonastically & pleonasmically are adverbs; the noun plural is pleonasms.  Despite the modern practice, verb forms seem never to have evolved.

Tautology (pronounced taw-tol-uh-jee)

(1) The needless repetition of an idea, especially in words other than those of the immediate context, without imparting additional force or clarity of meaning.

(2) In formal logic, as a logical tautology, something true under any possible case or interpretation; it differs from the linguistic form in that in propositional logic it’s a compound propositional form in which all instances simultaneously are true.

(3) In pathology, an excess in the number or size of parts (archaic).

(4) In engineering, the addition of a strengthening device to a design in which all calculations prove it unnecessary.  By convention tautology is applied to small-scale instances whereas a redundancy tends to be larger, extending even to duplicated systems.
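The formal-logic sense in (2), a compound proposition true under every possible assignment of truth values, can be checked mechanically by enumerating the truth table.  A minimal Python sketch (the function name is illustrative only):

```python
from itertools import product

def is_tautology(prop, n_vars: int) -> bool:
    """True if `prop` (a function of n_vars booleans) evaluates to True
    for every row of its truth table."""
    return all(prop(*values) for values in product([False, True], repeat=n_vars))

# The law of the excluded middle, p or not-p, is a logical tautology...
print(is_tautology(lambda p: p or not p, 1))        # True
# ...but plain implication, p implies q (ie not-p or q), is not
print(is_tautology(lambda p, q: (not p) or q, 2))   # False
```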

1570–1580: From the Late Latin tautologia (representation of the same thing in other words), from the Ancient Greek ταυτολογία (tautología from tautologos) (a repetition of something already said (the word originally from rhetoric)), the construct being ταὐτός (tautós) (the same) + λόγος (lógos) (saying; explanation), related to legein (to say), from the primitive Indo-European root leg- (to collect, gather).  The modern version is tauto- + -logy.  The origin in English of the -logy suffix lies with loanwords from the Ancient Greek, usually via Latin and French, where the suffix (-λογία) is an integral part of the word loaned (eg astrology from astrologia) since the sixteenth century.  French picked up -logie from the Latin -logia, from the Ancient Greek -λογία (-logía).  Within Greek, the suffix is an -ία (-ía) abstract from λόγος (lógos) (account, explanation, narrative), that in turn a verbal noun from λέγω (légō) (I say, speak, converse, tell a story).  In English the suffix became extraordinarily productive, used notably to form names of sciences or disciplines of study, analogous to the names traditionally borrowed from the Latin (eg astrology from astrologia; geology from geologia) and by the late eighteenth century, the practice (despite the disapproval of the pedants) extended to terms with no connection to Greek or Latin such as those building on French or German bases (eg insectology (1766) after the French insectologie; terminology (1801) after the German Terminologie).  Within a few decades of the intrusion of modern languages, combinations emerged using English terms (eg undergroundology (1820); hatology (1837)).  In this evolution, the development may be thought of as similar to the latter-day proliferation of “-isms” (fascism; feminism etc).  Tautology, tautologism & tautologist are nouns, tautologize is a verb, tautologically & tautologously are adverbs and tautological, tautologic & tautologous are adjectives; the noun plural is tautologies.

A tautology is the unnecessary repetition (often in close proximity) of an idea, statement, or word in circumstances in which the meaning has already been expressed.  In the expression “4 am in the morning”, the tautology is created by morning because am (an abbreviation of the Latin ante meridiem (before noon)) has already established an unambiguous meaning.  For technical reasons however the odd tautology may be required, “4 am in the morning” once used for the lyrics of a pop song because, were either of the tautological elements to be removed, the rhythm of the tune would be lost.  In the same manner a poet might be moved (poets are often moved) to write of the dawn’s sunrise and that’s one word too many but the tautology might be justified if it adds to the lyrical quality (something not guaranteed in poetry).  Tautologies seem sometimes to be used to add emphasis or strengthen a meaning and thus function adjectivally.  To say “completely and totally beyond my comprehension and understanding” technically loses nothing if either of the two tautological pairs is pared down but the practice is common as a rhetorical device and probably often effective as long as the wordiness is restricted to the odd flourish and doesn’t infect the rest of the speech.  A device of oral use therefore but usually an absurdity in writing.

Tautologies abound but those who condemn need to consider the context and history.  The phrase “PIN number” has long been ubiquitous and sounds right but seems wrong once deconstructed: undo the acronym and it becomes “personal identification number number”; what has happened is either PIN has become a word or “PIN number” an encapsulated phrase.  Democratic English resolves the argument in the usual manner: pedants can have their PINs while the rest of us use pin numbers.  In commerce, tautologies are often part of what the law describes as “mere puffery”.  A phrase like “absolutely unique and a one-off”, something of a favorite of antique dealers, is not only a tautology but not infrequently also an untruth but in the business such things are understood.  Forgivable then in a way the linguistic sin “very unique” is not, that phrase rarely tolerated by the fastidious although, strangely, “quite unique” seems to be, presumably because it’s a more elegant construction.

Pleonasm refers to overabundance and is now rarely used outside the medical context in which it describes aspects of tissue growth.  A linguistic pleonasm is usually identified as a phrase with more words than necessary, often by being repetitive or having empty or clichéd words, but is not necessarily wrong or confusing.  At the margins the difference between tautology and pleonasm does get ragged and not all dictionaries and style guides agree.  The Oxford English Dictionary (OED) indicates the difference seems to be between redundancy of expression and repetition and as a general principle that’s probably helpful, if not exhaustive.  One suggested method of defining a tautology is to substitute an antonym for one of the allegedly offending elements.  That works well if it creates contradictions in terms like “4 pm in the morning” or “the dawn’s sunset” but doesn’t resolve everything.  A “pleasurable delight” seems a pleonasm because it uses unnecessary words to make the point and, under the test, a tautology because there are presumably no un-pleasurable delights although even then there are nuances because the rare delicacy most would enjoy as a delight might to someone with a specific allergy be not at all enjoyable.

Actually, biological reactions aside, something most would not find a delight can to others be entirely that.  In Freudian psychoanalysis, Lustprinzip (the pleasure principle) describes the driving force of the id: the human instinct to seek pleasure and avoid pain.  However, the American Psychiatric Association’s Diagnostic and Statistical Manual of Mental Disorders notes the existence of masochism in various forms which involve pleasure being gained from pain.  Thus the connotations of words are a subjective and not objective test for there are those for whom pleasurable pain needs to be distinguished from un-pleasurable pain, the latter a mere tautology to most.  Sexual masochism disorder (SMD) had an interesting history in the DSM.  It wasn’t in the first edition (DSM-I, 1952) but in the second edition (DSM-II, 1968) the only mention of masochism was in the categorization of sexual deviations, then defined as applying to those individuals for whom sexual interest was directed primarily towards objects other than people of the opposite sex, toward sexual acts not usually associated with coitus, or toward coitus performed under bizarre circumstances as in necrophilia, pedophilia, sexual sadism, and fetishism.  It was noted that while many patients found their practices distasteful, they were unable to substitute normal sexual behavior and the diagnostic criteria were also exclusionary, noting the diagnosis was not appropriate for individuals who perform deviant sexual acts because normal sexual objects are not available to them.  This changed little in the third & fourth editions issued between 1980 and 2000 which refined the technical description and diagnostic criteria.
In the fifth editions (2013-2022), while SMD remained classified as one of the paraphilias (algolagnic disorders) and thus an “anomalous activity preference”, clinicians were advised a formal diagnosis was appropriate only if the individual experiences clinically significant distress or impairment in social, occupational, or other important areas of functioning.  By 2013 the DSM seemed to be back where Freud had started.

A mammary pleonasm (or tautology depending on one's view): Jasmine Tridevil during addition and the final result.

Pleonasm should not be confused with pleomastia (now largely supplanted by polymastia in clinical use) which is the condition of having more than two mammary glands (breasts) or nipples.  It’s a rare condition which doesn’t present with the geometric perfection of the example presented in 2014 by Jasmine Tridevil, the stage name of Florida massage therapist Alisha Jasmine Hessler (b 1993).  Ms Tridevil initially claimed to have had the central unit implanted by a plastic surgeon but later admitted it was a construction made substantially of latex and silicone, attached to her with surgical glue, helpfully providing photographs of the maintenance being undertaken.  However, encouraged by enjoying more than fifteen minutes of fame, in 2019 Ms Tridevil sought to crowdfund the money (apparently US$50,000) needed actually to have the surgery performed.  Progress on this project hasn’t been reported but Ms Tridevil has maintained her presence on a number of internet platforms including vlogs on topics as varied as “How to dominate your boyfriend” and “My gothic Christmas tree”.

The offence caused by unnecessary words is such that not only do tautology and pleonasm exist but for serious critics there’s also auxesis (from the Ancient Greek αὔξησις (aúxēsis) (growth; increase), which in rhetoric references various forms of increase) and describes exaggerated language; battology (from the Ancient Greek βαττολογία (battología) (stammering speech)) which is the repeated reiteration of the same words, phrases, or ideas; and perissology (from the Latin perissologia) which is the use of more words than are necessary to convey meaning.  At the margins, there’s often a bit of overlap so care needs to be taken that one’s critique of a redundant (and all the constructions are really forks of that) word or phrase doesn’t itself commit the same offence.  Grammar Nazis of course delight in faulting others when they use a tautology, some particularly pedantic even correcting other obsessives who might wrongly have tagged a tautology when really they should have perceived a pleonasm.

Monday, March 27, 2023

Nothing

Nothing (pronounced nuhth-ing)

(1) No thing; not anything; naught.

(2) No part, share, or trace (usually followed by of).

(3) Something that is nonexistent; non-existence; nothingness.

(4) Something of no importance or significance.

(5) A trivial action, matter, circumstance, thing, or remark.

(6) A person of little or no importance; a nobody.

(7) Something that is without quantity or magnitude.

(8) A cipher or naught; the quantity or quality of zero.  The value represented by the numeral zero (and the empty set: {}).

(9) As “think nothing of it” and related forms, a procedural response to expressions of thanks.

(10) In no respect or degree; not at all.

(11) Amounting to nothing, as in offering no prospects for satisfaction, advancement, or the like.

(12) In architecture, the contents of a void.

Pre 900: From the Middle English nothyng, noon thing, non thing, na þing, nan thing & nan þing, from the Old English nāþing, nān þing & naðinc (nānthing & nathing) (nothing (literally “not any thing”)), the construct being nān- (not one (source of the modern none)) + þing (thing).  The earlier Old English was nāwiht (nothing (literally “no thing”)), related to the Swedish ingenting (nothing (literally “not any thing, no thing”)).  The ultimate source was the primitive Indo-European ne- (not).  In slang and dialectal English there have been many non-standard forms including nuffin, nuffink, nuttin', nuthin, nuthin', nowt, nuthing & nothin'.  Slang has been productive (jack, nada, zip, zippo, zilch, squat, nix) as has vulgar slang (bugger all, jack shit, sod all, fuck all, dick).  Nothing is a noun & adverb and nothingness is a noun; the noun plural is nothings.

Lindsay Lohan wearing nothing (shoes don't count; everybody knows that).  Playboy magazine pictorial, January / February 2012.

The meaning “insignificant thing, a thing of no consequence” emerged circa 1600 (although as an adverb (not at all, in no degree), it was known in late Old English) whereas nothing in the sense of “not at all” had existed since circa 1300.  Phrases in the twentieth century were created as needed: “Nothing to it”, indicating something easily accomplished, was noted from 1925 and “nothing to write home about” was really literal, recorded first and with some frequency by censors monitoring the letters written by soldiers serving at the front in World War I (1914-1918); it appears to date from 1917, the extent of use apparently encouraged by it being a useful phrase exchanged between soldiers by word-of-mouth.  Nothing seems not to have been an adjective until 1961, an evolution of use (or a decline in standards depending on one’s view) which saw words like “rubbish” re-applied in a similar way.  A do-nothing (an idler) is from the 1570s, the noun an adoption from the verbal phrase and as an adjective to describe the habitually indolent, it’s noted from 1832.  The adjective good-for-nothing (a worthless person) is from 1711.  The term know-nothing (an ignoramus) is from 1827 and was later applied (though not deliberately) to the US nativist political party, active between 1853-1856, the bulk of which eventually migrated to the Republican Party.  The noun nothingness (non-existence, absence or negation of being) was first used in the 1630s but is most associated with the ideas around nihilism, the exploration of which became a mainstream part of philosophy in the nineteenth century.  Nothingness is distinct from the noun nothingarian which references “one who has no particular belief”, especially in religious matters, a descriptive dating from 1789.
It’s striking how often in religion, even when factions or denominations are in disputes with one another (sometimes actually at war), one thing which seems to unite them is the feeling that whatever their differences, the nothingarians are the worst sinners of all.

The noun nihilist, in a religious or philosophical sense, is from the French nihiliste, from the Latin nihil (nothing at all).  Nihilism, the word first used in 1817, is “the doctrine of negation”, initially in reference to religion or morals but later extended universally.  It’s from the German Nihilismus, from the Latin nihil (nothing at all) and was a coining of German philosopher Friedrich Heinrich Jacobi (1743-1819).  In philosophy, it evolved quickly into an extreme form of skepticism, the political sense of a “rejection of fundamental social and political structures”, first used circa 1824 by the German journalist Joseph von Görres (1776-1848).  Most associated with a German school of philosophical thought including (rather misleadingly) GWF Hegel (1770–1831) and (most famously) Friedrich Nietzsche (1844–1900), the particular Russian strain was more a revolutionary political movement with something of a premium on violence (that would much influence Vladimir Lenin (1870–1924)).  Thus with an initial capital, Nihilism (Nigilizm in the Russian) as used in this context is specific to the movement of Russian revolutionary anarchism 1863-1917 and limited in that the meaning refers to the participants’ disapproval of all social, economic & political possibilities in pre-Soviet Russia; the sense they viewed “nothing” with favor.

A probably inaccurate representation of nothing.  

The idea of nothing, in a universal sense in which literally nothing (energy, matter, space or time) exists is difficult to imagine, imaginable presumably only as infinite blackness, probably because that’s the closest to a two-dimensional representation of the absence of any sense of the spatial, white implying the existence of light.  That nothingness is perhaps impossible to imagine or visualize doesn’t however prove it’s impossible but the mere fact matter, energy and time now exist in space does imply that because, were there ever nothing, it’s a challenge to explain how anything could have, from nothing, come into existence.  Some have mused that there are aspects of quantum theory which suggest even a state of nothingness can be inherently unstable and where there is instability there is the possibility of an event.  The argument is that under quantum theory, if long enough is allowed to pass (something which, bewilderingly, apparently can happen even if there is no time) then every possible event may happen and from this may evolve energy, matter, space or time.  To speak of a time scale in all of this is irrelevant because (1) time may not exist and (2) infinity may exist but it can for administrative purposes be thought of as a very long time.  The intriguing link between time starting and energy, matter or space coming into existence as a consequence is that at that point (in time), it may be the only time “now” could exist in the absence of the past and future so everything would happen at the same time.  Clearly, the conditions operative at that point would be unusual so anything could happen.

That is of course wholly speculative but in recent decades, the “string theorists” have extended and refined their mathematical models to a degree which not long ago would have been thought impossible so some modelling of a unique point of “now” in nothing would be interesting and the basic framework of that would seem to demand the mathematics of a model which would describe what conditions would have to prevail in order for there truly to be nothing.  That may or may not be possible but might be an interesting basis from which to work for those trying to explain things like dark matter & dark energy, either or both of which also may or may not exist.  Working with the existing universe seems not to be helpful in developing theories about the nature of all this supposedly missing (or invisible) matter and energy whereas were one, instead of working backwards as it were, to start with nothing and then work out how to add what seems to be missing (while remaining still not visible), the result might be interesting and one thing which seems not much discussed is the notion the famous “dark energy” may be time itself.

It’s not a new discussion.  The thinkers from Antiquity were known to ponder the philosophers’ traditional concerns such as “why are we here?” and “what is the meaning of life?” but they also realized a more basic matter was “why does anything exist instead of there being nothing?” and for thousands of years this has been “explained” as the work of gods or a god but that really not a great deal of help.  In the Western tradition, this basic question seems not to have bothered angst-ridden Teutonic philosophers, the German Gottfried Leibniz (1646-1716) writing on the subject, as later would the Austrian Ludwig Wittgenstein (1889–1951).  Martin Heidegger (1889–1976, who was only briefly a Nazi) called it the “fundamental question of metaphysics”.  The English-speaking school, more tied to the empirical, noted the matter.