Friday, February 10, 2023

IIII

IIII (pronounced fawr (U) or fohr (non-U))

A translingual form, an alternative form of IV: the Roman numeral representing four (4), the other known forms being iv, iiii & iiij

Circa 500 BC: The Roman numeral system spread as Roman conquest expanded and remained widely used in Europe until, from circa 1300, it was replaced (for most purposes) with the more adaptable Hindu-Arabic system (including the revolutionary zero (0)), which remains in use to this day.

IIII has long been restricted to representing the value four standing alone.  To avoid larger numbers becoming too cumbersome, the Roman system used subtraction whenever a smaller numeral precedes a larger, so the number 14 would be represented as XIV instead of XIIII.  The convention which emerged was that a numeral can precede only another numeral which is no more than ten times its value, so I can precede only (and thus be subtracted from) V (five) & X (ten).  However, these “rules” didn’t exist during Antiquity and weren’t (more or less) standardized until well into the medieval period; it’s thus not unusual to find old documents where 9 is represented as VIIII instead of IX.  The practical Romans, unlike the Greeks for whom abstraction was a calling, were little concerned with the concepts of pure mathematics, such as number theory, geometric proofs and other abstract ideas, devoted instead to utilitarian purposes such as financial accounting, keeping military records and building things.
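
The modern subtractive convention is easily captured in code.  Below is a minimal sketch (a hypothetical illustration in Python; the table names are inventions for the purpose) which converts an integer to Roman numerals and, by swapping in a purely additive table, reproduces the older medieval style:

# A sketch of Roman numeral conversion; SUBTRACTIVE encodes the
# standardized medieval rule, ADDITIVE the older style in which
# 9 appeared as VIIII.
SUBTRACTIVE = [
    (1000, "M"), (900, "CM"), (500, "D"), (400, "CD"),
    (100, "C"), (90, "XC"), (50, "L"), (40, "XL"),
    (10, "X"), (9, "IX"), (5, "V"), (4, "IV"), (1, "I"),
]
ADDITIVE = [
    (1000, "M"), (500, "D"), (100, "C"),
    (50, "L"), (10, "X"), (5, "V"), (1, "I"),
]

def to_roman(n: int, table=SUBTRACTIVE) -> str:
    """Render 1 <= n <= 3999 using the supplied numeral table."""
    out = []
    for value, numeral in table:
        count, n = divmod(n, value)
        out.append(numeral * count)
    return "".join(out)

print(to_roman(14))            # XIV
print(to_roman(14, ADDITIVE))  # XIIII
print(to_roman(9, ADDITIVE))   # VIIII, as in many old documents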

The numeral system had to be manageable for simple calculations like addition and subtraction, so it was attractive to keep the text strings conveniently short: 44 as XLIV is obviously preferable to XXXXIIII.  Although its limitations seem obvious to modern eyes, given the demands of the times, the system worked remarkably well for almost two millennia despite the largest numeral being M (1000).  It was silly to contemplate writing a string of 1000 M’s to indicate a million (presumably not a value then often used) so the Romans concocted a bar (the vinculum) which, when it appeared above a numeral, denoted a multiplier of 1000: MMMMMM (6000) could thus appear as V̄Ī and a million as M̄.  Compared with the Hindu-Arabic system, it was a fudge but one which for centuries proved serviceable.
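
The vinculum lends itself to the same treatment; in this sketch (assuming the Unicode combining overline U+0305 as a stand-in for the bar, and reusing the to_roman helper from the sketch above), the thousands are written barred and the remainder conventionally:

# A sketch of vinculum notation: a bar above a numeral multiplies it
# by 1000 (assumption: U+0305 COMBINING OVERLINE plays the bar).
OVERLINE = "\u0305"

def with_vinculum(n: int) -> str:
    thousands, remainder = divmod(n, 1000)
    barred = "".join(ch + OVERLINE for ch in to_roman(thousands)) if thousands else ""
    return barred + (to_roman(remainder) if remainder else "")

print(with_vinculum(6000))       # V̄Ī
print(with_vinculum(1_000_000))  # M̄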

Where Roman numbers are occasionally still used (book prefaces & introductions, some aeroplanes & automobiles and, charmingly, some software), the number four is almost always represented by IV rather than IIII.  One exception to this however is watch & clock faces, where the use of IIII outnumbers IV, regardless of the cost of the device.  Watchmakers have provided many explanations for the historical origin of this practice, the most popular of which dates from Antiquity: because “I” stood for “J” and “V” for “U”, IV would be read as JU and thus Jupiter, an especially venerated Roman god, Jupiter Optimus Maximus being the king of all gods, chief of the pantheon and protector of ancient Rome.  The suggestion is that invoking the name of Jupiter for such a banal purpose would be thought offensive if not actually blasphemous.  Thus IIII it became.

Lindsay Lohan wearing a 19mm (¾ inch) Cartier Tank Américaine in 18 karat white gold with a quartz movement and a silver guilloche dial with Roman numerals including the traditional IIII.  The Cartier part-number is B7018L1.

There’s the notion too that the convention arose just because of one of those haphazard moments in time by which history sometimes is made.  The appearance of IIII was said to be the personal preference of Louis XIV (1638–1715; le Roi Soleil (the Sun King), King of France 1643-1715), the Sun King apparently issuing an instruction (though there’s no evidence it was ever a formal decree) that IIII was the only appropriate way to write the number four, watchmakers ever since tending to comply.  Whether Louis XIV wished to retain some exclusivity in the IV which was part of “his” XIV isn’t known and it may be he simply preferred the look of IIII.  Despite the belief of some, it’s anyway wrong to suggest IIII is wrong and IV right.  The design of the IIII was based upon four outstretched fingers, which surely had for millennia been the manner in which the value of 4 was conveyed in conversation, and V denoted 5 in tribute to the shape the hand formed when the thumb was added.  The IV notation came later and, because it better conformed with the conventions used for writing bigger numbers, came in medieval times to be thought correct; it was thus adopted by the Church, becoming the “educated” form and that was that.

Not all agree with those romantic tales however, the German Watch Museum noting that in scholarly, ecclesiastical and daily use, IIII was widely used for a millennium, well into the nineteenth century, while the more efficient “IV” didn’t appear with any great frequency until circa 1500.  The museum argues that the watch- and clockmakers’ concerns may have been readability and aesthetics rather than any devotion to historic practice, IIII having display advantages in an outward-facing arrangement relative to the centre of the dial (ie partially upside down, such as on wall, tower or cuckoo clocks), any confusion between IV (4) & VI (6) eliminated.  Also, a watch, while a functional timepiece, is also decorative and even a piece of jewellery so aesthetics matter, the use of IIII rendering the dial symmetrically balanced because 14 individual characters exist on each side of the dial and the IIII counterbalances the opposite VIII in the manner IX squares off against III.  So there’s no right or wrong about IIII & IV but there are reasons for the apparent anomaly of the more elegant IV appearing rarely on the dials of luxury watches.
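
The museum’s arithmetic checks out; a quick count (the grouping is an assumption: the 12 is here tallied with the right half of the dial and the 6 with the left) shows IIII balancing the dial at 14 characters a side where IV leaves it lopsided:

# Counting the characters on each side of a dial's vertical axis,
# with XII counted to the right and VI to the left (an assumption).
def dial_balance(four: str) -> tuple[int, int]:
    numerals = {1: "I", 2: "II", 3: "III", 4: four, 5: "V", 6: "VI",
                7: "VII", 8: "VIII", 9: "IX", 10: "X", 11: "XI", 12: "XII"}
    right = sum(len(numerals[h]) for h in (12, 1, 2, 3, 4, 5))
    left = sum(len(numerals[h]) for h in (6, 7, 8, 9, 10, 11))
    return right, left

print(dial_balance("IIII"))  # (14, 14)
print(dial_balance("IV"))    # (12, 14)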

Thursday, February 9, 2023

Gown

Gown (pronounced goun) 

(1) A type of woman's dress or robe, especially one full-length and worn on formal occasions and often styled as “evening gown” or “ball gown”.

(2) As nightgown, a loose-fitting garment worn while sleeping (historically by both men & women but now most associated with the latter); the shortened form is “nightie”.

(3) As surgical gown, a light, protective garment worn in hospitals by medical staff, a specialized form of which is the isolation gown.

(4) As dressing gown (also called a bathrobe), a garment in the form of an open robe secured by a tie and often worn over pajamas, after a bath and prior to dressing, or on other occasions where there’s no immediate need to dress.

(5) A loose, flowing outer garment in various forms, worn to denote an office held, profession practiced or as an indication of rank or status, most associated with formal academic dress (sometimes in the phrase “cap & gown”).

(6) Those who work or study at a university as opposed to the other residents of the university town, expressed in the phrase “town & gown”.

(7) Historically, the dress of civil, as opposed to military officers.

(8) To supply with or dress in a gown.

1300-1350: From the Middle English goune & gowne, from the Anglo-Norman gune & goune (fur-trimmed coat, pelisse), from the Old French goune (robe, coat; nun's habit), from the Late Latin gunna (a garment of fur or leather), from the Ancient Greek γούνα (goúna) (coarse garment), of unknown origin but perhaps from a Balkan or Apennine language where it seems to have been used as early as the eighth century to describe a fur (or fur-lined), cloak-like garment worn by old or infirm monks; more speculatively, some scholars suggest a Celtic source.  The alternative explanation suggests a Scythian origin, from the Proto-Iranian gawnám (fur), the possibility of this link supported by the Younger Avestan gaona (body hair) and the Ossetian гъун (ǧun).  The alternative spelling gowne is obsolete and descendants in other languages include the Bengali গাউন (gaun), the Japanese ガウン, the Korean 가운 (gaun), the Malay gaun, the Punjabi ਗਾਊਨ (gāūna) and the Welsh gown.  Gown is a noun and verb and gowned is an adjective; the noun plural is gowns.

Surgeon in blood-splattered surgical gown (also called a hospital or medical gown), mid-surgery.

As late as the eighteenth century, gown was the common word for what is now usually described as a dress; gown in this sense persisted in the US longer than in the UK and on both sides of the Atlantic there was something of a twentieth-century revival, the applied uses (bridal gown, nightgown etc) becoming more or less universal.  The meaning “a loose, flowing outer garment in various forms, worn to denote an office held, profession practiced or as an indication of rank” emerged in the late fourteenth century and the collective singular for “residents of a university” dates from the 1650s, still heard in the rhyming phrase “town & gown”.  The night-gown (worn once by both men & women but now associated almost exclusively with the latter) became a thing in the fourteenth century.

Lindsay Lohan in white & black color-blocked bandage dress.

Dress dates from circa 1300 and was from the Middle English dressen & dresse (to arrange, put in order), from the Anglo-Norman & Old French dresser & drecier (which persists in English as dresser), from the unattested Vulgar Latin dīrēctiāre, from the Classical Latin dīrēctus, the perfect passive participle of dīrigō (to arrange in lines, direct, steer), the construct being dis- (the prefix in this context meaning “apart; asunder; in two”) + regō (to govern, manage), ultimately from the primitive Indo-European h₃reǵ- (straight, right).  The noun dress was derived from the verb and emerged in the sense of “attire” in the early 1600s.  Originally, a dress was always something which covered both the upper and lower parts of the female body but not of necessity in one piece.  The dressing gown seems first to have been described as such in 1854 although in French both robe de chambre (dressing gown) & robe de nuit (nightgown) had been in use for centuries.

Lindsay Lohan in dressing gowns; in the US such things would usually be called bathrobes.

Robe dates from the mid-thirteenth century Middle English robe & robbe and was from the Old French robe, robbe & reube (booty, spoils of war, robe, garment), from the Frankish rouba & rauba (booty, spoils, stolen clothes (literally “things taken”)), from the Old High German roub, from the Proto-Germanic raubō, raubaz & raubą (booty, that which is stripped or carried away), from the primitive Indo-European Hrewp- (to tear away, peel off).  The noun use of robe to refer to garments had entered general use by the late thirteenth century, an adoption of a meaning from the Old French, presumably because fine clothing looted from defeated enemies was among the most prized of the spoils of war.  The Old French robe (and the alternative spellings) had as concurrent meanings both “clothing” & “plunder”, as did the Germanic forms including the Old English reaf (plunder, booty, spoil; garment, armor, vestment).  By the late thirteenth century robe had assumed the meaning “a long, loose outer garment reaching almost to the floor, worn by men or women over other dress”, the closest European equivalents being the twelfth century Old French robe (long, loose outer garment) and the Old High German rouba (vestments).  In royal, academic and ecclesiastical circles, the particular style of robes became regulated to denote rank, function or membership of a religious order and royal courts would include offices like “page of the robes”, “mistress of the robes”, “master of the robes” etc although those titles are (to modern eyes) misleading because their responsibilities extended to garments generally and not just robes as they’re now understood.  The metonymic sense of “the robe” for “the legal profession” dates from the 1640s, a reference to the dark robes worn by advocates when appearing in court.  Robe went on productively to be adopted for other purposes including (1) in the US “the skin of a bison (later applied to other slaughtered beasts) used as a cloak or wrap”, (2) a short form of wardrobe (especially when built into a wall rather than being stand-alone) and (3) the largest and strongest leaves on a tobacco plant.

Singer Dr Taylor Swift in academic gown after being awarded an honorary doctorate in fine arts by New York University, May 2022.

In formal and vocational use, gown and robe are well understood and there tends not to be overlap except among those unacquainted with such things.  That’s understandable because to the casual observer the things can look much the same and the differences in nomenclature are more to do with tradition than style or cut.  Judges for example wear judicial robes and in the US these are usually black whereas elsewhere in the English-speaking world they can be of quite vivid hues, red and scarlet the most admired.  The US influence however seems pervasive and the trend is now almost universally black, certainly among newly established courts; in the same courts, although barristers’ robes look much the same, the term “judicial robe” is exclusive to the bench, the advocates’ garments variously called “barristers’ robes”, “legal robes” or “lawyers’ robes”.  Academics however wear gowns and again, the Americans tend to favor black while in the English tradition, all the colors of the rainbow have been seen.  These differ from surgical gowns (also known as hospital or medical gowns) which, compared with just about every other gown, really aren’t gowns at all.  Surgical gowns are made usually in a blue, beige or green pastel color (the better to show the blood) and are a kind of inverted dress, fastened at the back (by an assistant so the wearer’s fingers don’t pick up germs).  In the UK parliament, there were many robes for offices of state and the one worn by the speaker made its way to colonial and dominion parliaments.  They're now rarely worn except on ceremonial occasions and the best known is probably that of the UK’s chancellors of the exchequer although the last one, dating from the late nineteenth century, is said to have “gone missing” while Gordon Brown (b 1951; UK prime-minister 2007-2010) was chancellor.

New South Wales (Australia) Supreme Court and Court of Appeal judges in judicial robes during the pandemic.

It’s in women’s fashion where the distinction between a gown and a dress can become muddied and probably most illustrative is the matter of the “wedding dress” and the “wedding gown”.  Even among couturiers, there’s no agreed definition of where one ends and the other begins and it’s very much in the eye of the beholder, although the eye of the retailer is doubtless quite an influence, the theory being that the grander the design and the more the fabric, the more plausible is the label “wedding gown” and the higher the price-tag.  These informal (but serviceable) rules of thumb work also for dresses & gowns in general, the distinction more one of semantics and personal preference although, in saying that, it’s only at the margins where there can be confusion; a minimalist LBD (little black dress) would never be confused with a gown and the grandest creations recalling those worn at the famous balls held in conjunction with the Congress of Vienna (1814-1815) would never be called dresses.


Watercolor of one of the many balls held during the Congress of Vienna.

Despite that, in the narrow technical sense, to a seamstress all gowns are dresses but not all dresses are gowns and, as late as the early eighteenth century, the word “dress” was still not the exclusive province of women’s clothing ensembles.  In recent centuries, the dress has been defined by its modifiers (sun-dress, summer-dress, evening-dress, travelling dress, riding-dress etc) and the modern convention seems to be that if an invitation specifies semi-formal then an evening dress is expected, something which might be thought a gown but not necessarily.  However, when an invitation states that the occasion is formal, women are expected to wear an evening gown.  Classically, that’s understood to be something at once precise yet frivolous, with a tight-fitting bodice and a skirt which reaches to the floor; this was once the accepted standard for any red-carpet event of note but the recent trend towards outrageous displays of skin has in the entertainment industry subverted the tradition although the audience is expected still to adhere.


Lindsay Lohan in a diaphanous gown, Met Gala, New York, 2007.

Wednesday, February 8, 2023

Formalism

Formalism (pronounced fawr-muh-liz-uhm)

(1) Strict adherence to, or observance of, prescribed or traditional forms, as in music, poetry and art.

(2) In religion, a strong attachment to external forms and observances.

(3) In philosophy (ethics), a doctrine that acts are in themselves right or wrong regardless of consequences.

(4) In mathematics (formal logic), a doctrine that mathematics, including the logic used in proofs, can be based on the formal manipulation of symbols without regard to their meaning (the mathematical or logical structure of a scientific argument as distinguished from its subject matter; the theory a statement has no meaning but that its symbols, regarded as physical objects, exhibit a structure that has useful applications).

(5) A scrupulous or excessive adherence to outward form at the expense of inner reality or content.

(6) In Marxist criticism, scrupulous or excessive adherence to artistic technique at the expense of social values etc (a view adopted also by some non-Marxist critical theorists).

(7) In performance art and theatre, a deliberately stylized mode of production.

(8) In both structural engineering and computer science, the notation, and its structure, in (or by) which information is expressed.

1830–1840: The construct was formal + -ism.  Formal was from the Middle English formel, from the Old French formel, from the Latin formalis, from forma (form), of unknown origin but possibly from the Etruscan morma, from the Ancient Greek μορφή (morphḗ) (shape, fashion, appearance, outward form, contour, figure), dissimilated as formīca and possibly formīdō.  The –ism suffix was from the Ancient Greek –ismos & -isma noun suffixes, often directly, often through the Latin –ismus & -isma, though sometimes through the French –isme or the German –ismus, all ultimately from the Greek.  It appeared in loanwords from Greek, where it was used to form action nouns from verbs and, on this model, was used as a productive suffix in the formation of nouns denoting action or practice, state or condition, principles, doctrines, a usage or characteristic, devotion or adherence (criticism; barbarism; Darwinism; despotism; plagiarism; realism; witticism etc).  Although actual use of the word formalism dates only from its adoption (1830s) in critical discourse, disputes related to the matter can be found in texts since antiquity in fields as diverse as agriculture, literature and theology.  Formalism is a noun, formalist is a noun & adjective, formalistic is an adjective and formalistically is an adverb; the usual noun plural is formalists.

Comrade Stalin, Comrade Shostakovich and Formalism

Comrade Stalin (1878–1953; leader of the USSR, 1924-1953) didn’t invent the regime’s criticism of formalism but certainly made it famous after comrade Dmitri Shostakovich (1906-1975) was denounced in the Soviet newspaper Pravda (Truth) in January 1936, after the Moscow performance of his opera Lady Macbeth of the Mtsensk District.  Stalin didn’t like music he couldn’t whistle and the complex strains of Shostakovich’s opera, sometimes meandering, sometimes strident, certainly didn’t permit that; he labeled the composition формализм (formalism), “chaos instead of music”, a self-indulgence of technique by a composer interested only in the admiration of other composers, an audience of no value in the school of socialist realism.  It’s believed the Pravda article may have been written by Stalin himself and he used the word “formalism” in the sense it was understood in English; formality being the observance of forms, formalism the disposition to make adherence to them an obsession.  To Stalin, the formal rules of composition were but a means to an end and the only legitimate end was socialist realism; anything other than that was “an attack on the people”.  Lest it be thought the defeat of fascism in the Great Patriotic War (1941-1945) might have mellowed his views in such matters, Stalin at the 1948 party congress made sure the point was hammered home in the Communist Party's brutish way:

"Comrades, while realistic music is written by the People's composers, formalistic music is written by composers who are against the People.  Comrades, one must ask why it is that realistic music is always written by composers of the People? The People's composers write realistic music simply due to the fact that being by nature realists right to their very core, they simply cannot help writing music that is realistic, while those anti-People composers, being by nature unrepentant formalists, cannot help... cannot help... cannot help writing music that is formalistic."

Comrade Stalin signing death warrants.

In the Soviet Union, producing or performing stuff hated by Stalin was not a good career move.  Shostakovich completed his Fourth Symphony in C minor, Opus 43, in May 1936 and, even after the attack in Pravda, planned to stage its premiere in Leningrad that December but found the orchestra unwilling to risk incurring the Kremlin’s wrath and almost as soon as rehearsals began, the orchestra's management cancelled the performance, issuing a statement saying comrade Shostakovich had withdrawn the work.  Actual responsibility for the decision remains unclear but it was certainly in accord with the views of the Kremlin and not until 1961, almost a decade after Stalin’s death, was it performed.

Comrade Shostakovich at his dacha.

Shostakovich became a realist, his response to denunciation the melodic Fifth Symphony in D minor, Opus 47.  Premiered in November 1937 in Leningrad, it was a resounding triumph, attracting a standing ovation that lasted more than thirty minutes.  The following January, just before its first performance in Moscow, an article, under the composer’s name, appeared in the local newspaper Vechernyaya Moskva in which he described the Fifth Symphony as “…a Soviet artist's creative response to justified criticism.”  Whether Shostakovich actually wrote the piece isn’t known but there’s never been any doubt it’d never have been published without Stalin’s approval and the success of the Fifth Symphony re-personed Shostakovich.  Whatever it lacked in glasnost (openness), it made up for in perestroika (restructuring) and the party engineered his rehabilitation as carefully as it had his fall a couple of years earlier, anxious to show that those bowing to its demands could be rewarded as easily and fully as dissidents could be punished.

Tuesday, February 7, 2023

Awful

Awful (pronounced aw-fuhl)

(1) Extremely bad; unpleasant; ugly.

(2) Inspiring fear; dreadful; terrible.

(3) Solemnly impressive; inspiring awe; full of awe; reverential (obsolete).

(4) Extremely dangerous, risky, injurious, etc.

(5) Very; extremely.

1250-1300: From the Middle English agheful, awfull, auful, aueful & aȝefull (worthy of respect or fear, striking with awe; causing dread), the construct of all based on the idea of awe +‎ -ful (aghe the earlier form of awe), the same model as the Old English eġeful & eġefull (terrifying; awful).  Etymologists treat the emergence in the early nineteenth century (1809) of the meaning “very bad” as a weakening of the original sense but it can be regarded as a fork and thus a parallel path, in the same way as the sense of “excessively; very great” which is documented since 1818.  Interestingly, there’s evidence from the late sixteenth century of spasmodic use of awful that was more a variation of the original, meaning “profoundly reverential, full of awe” (awe in this case a thing more of reverence than fear and trepidation).  The spellings awfull, aweful & awefull are all obsolete although some dictionaries list awfull as archaic, a fine distinction of relevance only to lexicographers.  Awful is an adjective & (in colloquial US use, mostly south of the Mason-Dixon Line) an adverb, awfully is an adverb, awfuller & awfullest are adjectives, awfulize is a verb and awfulization & awfulness are nouns; in slang the non-standard noun plural “awfuls” is used in the same sense as the disparaging “ghastlies”.

The adverb awfully (which would later assume a life of its own) meant, around the turn of the fourteenth century, “so as to inspire reverence”; by the end of the 1300s it had come also to mean “dreadfully, so as to strike one with awe” (awe in the sense of “fear & dread”) and this was by the 1830s picked up as a simple intensifier meaning “very, exceedingly”, Henry Fowler (1858–1933) in his A Dictionary of Modern English Usage (1926) noting with his usual weary disapproval that awfully’s “downward path” was such that it was by then nothing but a synonym of “very”.  That seems harsh given “awfully” would seem able to convey a nuance and even Henry Fowler conceded that in Ancient Greek the equivalent word αἰνῶς (ainôs) was used to mean both (1) “horribly, dreadfully, terribly” & (2) “very, extremely, exceptionally” but despite his reverence for all things Hellenic, he didn’t relent.

Awfully good: Lindsay Lohan at the premiere of Mr & Mrs Smith, Los Angeles, June, 2005.  A kind of elaborated bandage dress with some nice detailing, the dress Lindsay Lohan wore in 2005 attracted much favourable comment, as did the designer's sense of restraint, necklaces and other embellishments eschewed, a sprinkle of freckles presumably thought adornment enough.  A dress like this encapsulates the simple rule: When in doubt, stick to the classics.

The adjective god-awful (also as godawful) had an even more muddled evolution, the Oxford English Dictionary (OED) in 1878 listing the meaning “impressive” before, a decade later, revising this to “impressively (ie “very”) terrible”, which better reflects the sense in which it seems always to have been applied since being coined as a colloquialism of US English.  In use it’s thought to have been mostly part of oral speech and, except in dictionary entries, it appeared rarely in print prior to the 1920s so the origin is obscure, etymologists pondering that either “God” was used as a simple intensifier or in the sense of the frequent “God's awful vengeance, judgment etc”, a phrase common in religious literature.

As adjectives, the usual forms of the comparative & superlative are respectively more awful & most awful but dictionaries continue to acknowledge awfuller & awfullest as correct English although most users would probably flag both as “wrong” and their clumsy sound means they’re avoided even by those aware of their status.  The verb awfulize (inflected as awfulizes, awfulizing & awfulized) is a technical term in psychotherapy which describes patients reacting dramatically or catastrophically to distressing events, usually in the sense of a disproportionate reaction; the noun form is awfulization.  Perhaps surprisingly, social media users seem not to have picked up “awfulization”; it would seem a handy descriptor of much content.

A sentence like “it was a godawful book and awfully long but awfully well-written” actually makes sense and few would be confused because the various meanings are conveyed by context.  So, despite the tangled history, awful and its derivatives usually present few problems, even the colloquial “something awful” (“awfully; dreadfully; terribly” except in North America (mostly south of the Mason-Dixon Line & among classes so influenced) where it means “very, extremely”) usually able to be decoded: “I was hungry something awful” and “there’s something awful about crooked Hillary Clinton” both unambiguous even if the former sounds strange to those accustomed to “educated speech”, a term much criticized but well-understood.

Awful: Lindsay Lohan at the afterparty for Roberto Cavalli's fashion show, Milan Fashion Week, March 2010.  Although they tend to group-think, fashion critics are not monolithic but none had a good word to say about this outfit, the consensus being: Awful.  A few grudgingly granted a pass to the glittering Roberto Cavalli harem pants but the fur gilet was condemned as if Ms Lohan had with her bare hands skinned a live mink, eating the liver; these days, even faux fur seems grounds for cancellation.  Some, presumably those who picked up a photo from the agencies, called it a stole and at certain angles it resembled one but it really was a gilet.  As a footnote, many did admire the Fendi platform pumps so there was that, though nobody seemed to think they redeemed things.

Gilet was from the French gilet (vest, waistcoat), from the regional Italian gileccu (Calabria), gilecco (Genoa), gelecco (Naples) & ggileccu (Sicily) (though the standard Italian gilè was borrowed directly from the French), from the Turkish yelek (jelick; vest, waistcoat), from the Proto-Turkic yẹl (wind), with the final syllable modified to match other styles of garments such as corselet and mantelet.  Historically a gilet was (1) a man’s waistcoat & (2) a woman’s bodice a la the waistcoat or a decorative panel either attached to the bodice or worn separately.  In modern use, a gilet is a sleeveless jacket which can be closed to the neck and is often padded to provide warmth.  Some puffer jackets and garments described as bodywarmers can be classed as gilets.

Stole was from the Old English stole, from the Latin stola, from the Ancient Greek στολή (stolḗ) (stole, garment, equipment).  The original stoles were ecclesiastical vestments: decorated bands worn on the back of the neck, each end hanging over the chest (reaching sometimes to the ground), which could, inter alia, be an indication of clerical rank, geographical location or membership of an order.  In English and European universities, stoles were also adopted as academic dress, often added to an undergraduate’s gown for a degree-conferring ceremony.  In fashion, the stole was a garment in the style of a scarf or muffler, worn always for visual effect and sometimes for warmth.  Fur stoles were especially popular until wearing fur became socially proscribed and (trigger warning) there were fox stoles which included the beast's entire pelt, the head and the much admired brush (tail) included.

Monday, February 6, 2023

Ultra

Ultra (pronounced uhl-truh)

(1) The highest point; acme; the most intense degree of a quality or state; the extreme or perfect point or state.

(2) Going beyond what is usual or ordinary; excessive; extreme.

(3) An extremist, as in politics, religion, sporting team supporters, fashion etc, used semi-formally on many occasions in history.

(4) In the history of military espionage, the British code name for intelligence gathered by decrypting German communications enciphered on the Enigma machine during World War II (initial capital letter).

1690–1700: A New Latin adverb and preposition ultrā (uls (beyond) + -ter (the suffix used to form adverbs from adjectives) + -ā (suffixed to the roots of verbs)).  The prefix ultra- was a word-forming element denoting “beyond” (eg ultrasonic) or “extremely” (eg ultralight, as used in aviation) and was in common use from the early nineteenth century, the popularity apparently triggered by the frequency with which it was used of political groupings in France.  As a stand-alone word meaning “extremist” (in the sense now used of the most rabid followers of Italian football teams), it dates from 1817 as a shortening of ultra-royaliste (extreme royalist, which at the time was a thing).  The independent use of ultra (or the shortening of words prefixed with it) may also have been influenced by nē plūs ultrā ((may you) not (go) further beyond (this point)), said to be a warning to sailors inscribed on the Pillars of Hercules at Gibraltar.  This legend comes not from Greek mythology but dates from the resurrection of interest in antiquity which began during the Renaissance, influenced by Plato having said the lost city of Atlantis “lay beyond the Pillars of Hercules”; the most useful translation of nē plūs ultrā is probably something like “go no further, nothing lies beyond here”.

As a prefix, ultra- has been widely appended.  The construct of ultra vires (literally “beyond powers”) was ultra (beyond) + vires (strength, force, vigor, power) and it is quoted usually by courts and tribunals to describe jurisdictional limits, something ultra vires understood as “beyond the legal or constitutional power” of the body in question.  In political science, the term ultranationalism was first used in 1845, describing a tendency which has since ebbed & flowed but certainly hasn't died.  The speed of light being what it is, ultralight refers not to optics but to very small (often home-built or constructed from a kit) aircraft, the term first used in 1979 although it had (briefly) been used in experimental physics in the late 1950s.  Ultrasound in its current understanding as a detection & diagnostic technique in medicine dates from 1958 but the word had been used in 1911 to mean “sound beyond the range of human hearing”, this sense later replaced by ultrasonic (having a frequency beyond the audible range) in 1923, used first of radio transmission; the applied technology ultrasonography debuted in 1960.  The word ultraviolet (beyond the violet end of the visible spectrum) dates from 1840 and in 1870 ultra-red was coined to describe what is now known as infra-red.  First attested in the 1590s, ultramarine (blue pigment made from lapis lazuli) was from the Medieval Latin ultramarinus (beyond the sea), the construct being ultra + marinus (of the sea), from mare (sea, the sea, seawater), from the primitive Indo-European root mori- (body of water), the name said to be derived from the mineral arriving by ship from mines in Asia.  Ultramontane has a varied history and was first used in the 1590s.  It was from the Middle French ultramontain (beyond the mountains (especially the Alps)), from the early fourteenth century Old French, the construct being ultra + the stem of mons (hill), from the primitive Indo-European root men- (to project) and was used particularly of papal authority, though the precise meaning bounced around depending on context.  The acronym UHF (ultra-high frequency) was coined in 1937 although the technology using radio frequencies in the range of 300-3000 megahertz (MHz) became available in 1932.  Other forms (ultramodern, ultra-blonde et al) are coined as required and survive or fall from use in the usual way English evolves.

The Ultras

The prefix ultra- occurred originally in loanwords from Latin, meaning essentially “on the far side of, beyond.”  In relation to the base to which it is prefixed, ultra- has the senses “located beyond, on the far side of” (eg ultraviolet), “carrying to the furthest degree possible, on the fringe of” (eg ultramodern) or “extremely” (eg ultralight); nouns to which it is added denote, in general, objects, properties, phenomena etc that surpass customary norms, or instruments designed to produce or deal with such things (eg ultrasound).  The more recent use as a noun (usually in the collective as “the ultras”) applied to members of an extreme faction dates from early nineteenth-century English parliamentary politics and is associated also with the most extreme supporters of certain Italian football (soccer) teams.

Although never formally a faction in the modern sense of the word, the ultra Tories (the ultras) operated from 1827 (some political scientists insist the aggregation coalesced only in 1828) as a loose and unstructured grouping of politicians, intellectuals and journalists who constituted, in embryonic form, the “extreme right wing” of British and Irish politics.  Essentially reactionary conservatives unhappy with changes associated with the Enlightenment, the industrial revolution and urbanization, they regarded the 1689 Protestant constitution as the unchangeable basis of British social, economic and political life and treated all their opponents with a rare obsessional hatred.  In another echo of recent right-wing politics, the ultras showed some scepticism of economic liberalism and supported measures designed to ameliorate the hardships suffered by the poor during the early industrial age.  Like a number of modern, nominally right-wing populist movements, the ultras were suspicious of “free trade” and the destructive consequences such policies had on industries vulnerable to competition from foreign producers.

Portrait of the Duke of Wellington (1769-1852) by Francisco Goya (1746–1828), circa 1812–14, oil on mahogany panel, National Gallery, London.

The previously inchoate ultras coalesced into a recognizable force in the period of instability which followed the retirement in 1827 of a long-serving prime-minister (Lord Liverpool).  Their first flexing of political muscle, which proved unsuccessful, was an attempt to deny the premiership to a supporter of Catholic emancipation but the ultras emerged as a powerful influence in Tory politics although many claimed to belong to the Whig tradition.  Their annus mirabilis (a remarkable or auspicious year) came in 1830 when the ultras brought down the Duke of Wellington’s government (1828-1830) but the need for reform was unstoppable and while the label was for decades to be applied to the far-right of the Conservative Party, the later iterations never matched the political ferocity of the early adherents.

Ultra Blonde product.

Although there are packaged products labeled as such and the phrase “ultra-blonde” is far from uncommon, there's no precise definition of such a thing and while some blondes are blonder than others, there is a point on the spectrum at which going further means the color ceases to be blonde and becomes some shade tending to grey, white or the dreaded yellow.  For that reason, some hairdressers prefer to describe platinum as a stand-alone color rather than the usual “platinum blonde”, noting that the end result will anyway usually differ to some degree, depending on the shade and physiology of the hair to be treated.  They also caution that the idea of ultra blonde isn't suitable for everyone and base their recommendations on whether a client's skin is warm or cool toned, the practical test being to assess the veins visible in the wrist: if they're mostly blue and purple (the source of the term “blue-blooded”, based on the notion of those with obviously blue veins being rich enough not to have to work in the fields), the undertone is cool; if mostly green, it's warm; and if a mix of both, the undertone is neutral.
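
Reduced to a decision rule, the wrist-vein test looks something like the toy sketch below (the function name and the all-or-nothing logic are inventions for illustration; a colorist would weigh rather more than vein color):

# A toy sketch of the wrist-vein undertone test; only the three-way
# cool / warm / neutral classification is taken from the text.
def undertone(vein_colors: set) -> str:
    has_cool = bool(vein_colors & {"blue", "purple"})
    has_warm = "green" in vein_colors
    if has_cool and has_warm:
        return "neutral"  # a mix of both
    if has_cool:
        return "cool"
    if has_warm:
        return "warm"
    return "indeterminate"  # the test offers no verdict

print(undertone({"blue", "purple"}))  # cool
print(undertone({"green"}))           # warm
print(undertone({"blue", "green"}))   # neutral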

Lindsay Lohan had an ultra-blonde phase but for her Playboy photo shoot in 2012 wore a blonde wig; many would call this “ultra blonde” but to a professional hairdresser it's a “pale”.

The undertone interacts with skin tone: paler, pinky skin tones suit cool, delicate blondes like ash, beige or baby-blonde whereas darker or more golden-toned skins suit honey hues described often as butter, golden or caramel.  For perfectionists, there's also eye color to consider and here the trick is to achieve the desired degree of contrast: soft, multi-tonal shades better complement lighter eye colours whereas deeper, richer blondes flatter the darker eye.  Those especially obsessive can use non-optically corrective contact lenses, eye color often easier to change than hair.  So, while hairdressers think of ultra blonde as a shifting concept rather than a specific color, most agree (whatever the sometimes extraordinary proliferation of imaginatively named products on manufacturers' color charts) there are essentially four stages of blondness, usually described as something like medium, light, pale & platinum.  In each of those categories, it's possible to be an “ultra” though hairdressers will readily admit their technical distinctions resonate little with customers whose expectation of “ultra” is simply the limit of what's physically possible.

Sunday, February 5, 2023

Boulle

Boulle (pronounced bool)

(1) In woodworking, furniture design, cabinet making and bibelots, denoting or relating to a type of marquetry of patterned inlays of brass and tortoiseshell (and occasionally other metals such as pewter or silver), widely used in French (and later Italian) furniture from the late-seventeenth century.

(2) Something ornamented with such marquetry; furniture having ornamentation of this kind.

Circa 1680s: Named after André Charles Boulle (1642–1732), the French cabinet-maker much associated with the style although Boulle was noted also for his work in intarsia (an Italian form of decorative wood inlaying and, in knitting, a design resembling a mosaic).  The alternative spellings are buhl and the less common boule; Boulle (and buhl) are the common short forms for the product (often with an initial capital letter) but among historians of furniture, antique dealers et al, boullework, boulle work & boulle-work are all used as descriptors.  Boulle is a noun & proper noun and an adjective, the verb form usually spelled bouled; the noun plural is boulles.

Armoire (circa 1700) by André-Charles Boulle, Royal Collection Trust, London.

Variations of the type of marquetry which came to be known as boulle work had been around for centuries before the technique was brought to an extraordinary standard of fineness and intricacy by French cabinetmaker André Charles Boulle (1642–1732).  His most memorable creations were veneered furniture with tortoiseshell inlaid primarily with brass, pewter and silver, his elaborate designs often incorporating arabesques.  The large pieces by Boulle and his imitators are a staple of museums and the high-end of the antique market but the technique was used also on countless bibelots.  Those personally crafted by Boulle are the most prized but because of (1) the sheer volume of the eighteenth and nineteenth century imitations and (2) Boulle not signing his work or imposing some verifiable marking, it can at the margins be difficult definitively to authenticate the works.  For this reason, the sign “attributed to André-Charles Boulle” is often seen in museum collections and is not unknown in antique shops.

Pair of oak cabinets by Pierre Garnier (circa 1726-1806), a Master Ébéniste, veneered with ebony and boulle marquetry in brass, pewter and tortoiseshell, representing a later neoclassical rendering of the Boulle technique, Royal Collection Trust, London.

Boulle was appointed furniture-maker, gilder and sculptor to Louis XIV (1638–1715; le Roi Soleil (the Sun King), King of France 1643-1715) and his work adorned the palaces and other royal places of the Ancien Régime but most of the furniture in the Royal Collection made by, or attributed to, Boulle was later acquired by George IV (1762–1830; King of the UK 1820-1830).  A Francophile noted for the extravagance of his tastes, the king had been furnishing the royal palaces with French furniture since the 1780s, a habit he was able to indulge more and more after the French Revolution (1789) because, for a variety of reasons, in the aftermath of that and during the Napoleonic years, much fine French furniture came onto the market, a good deal of it shipped to England.

A boulle tortoiseshell inkwell with brass inlays, circa 1870.

Marquetry is the use of small pieces of different materials (including burl timber, tortoiseshell, pewter, silver, brass, horn and mother-of-pearl) to create elaborate designs inlaid upon furniture.  So skilled was Boulle at pictorial marquetry that he became known as a “painter in wood” but it was his use of tortoiseshell and brass that made his reputation and established him as a favourite of royalty and the nobility.  Pewter or brass inlay on tortoiseshell was known as premier-partie, while tortoiseshell inlay on brass or pewter was contre-partie but the most sumptuous pieces included mother-of-pearl, stained horn and dyed tortoiseshell.

Saturday, February 4, 2023

Anhedonia

Anhedonia (pronounced an-hee-doh-nee-uh)

In psychiatry, the lack of desire for or the capacity to experience pleasure.

1896: From the French anhédonie (an inability to feel pleasure (and an antonym of analgesia)), the construct being the Ancient Greek ἀν (an) (in grammar, the privative prefix, indicating negation or absence) + ἡδονή (hēdonḗ) (pleasure) + -ia (the abstract noun ending).  Hēdonḗ’s better known gift to the language was hedonist (one who seeks pleasure).  The privative an- prefix was used to create words having the sense opposite to the word (or stem) to which the prefix is attached; it was used with stems beginning either with vowels or “h”.  The word was coined in either 1896 or 1897 by French psychologist Professor Théodule-Armand Ribot (1839-1916).  Anhedonia is a noun and anhedonic is an adjective; the noun plural is anhedonias.  Unexpectedly, given the profession's propensity for intricate categorization, anhedonism seems not to exist.

Lindsay Lohan and her lawyer in court, Los Angeles, December 2011.

The term anhedonia encompasses a range of symptoms related to a reduction in the desire for, or the ability to experience, pleasure.  It is a generalized condition which is diagnosed only in those where the experience is universal and does not apply to those with an aversion to specific activities, the latter (usually) considered healthy and not unusual.  The original model in clinical psychiatry was limited to an inability to experience pleasure but this was later extended to a reduction in motivation even to seek experiences which most would find pleasurable.  The fifth edition of the American Psychiatric Association’s (APA) Diagnostic and Statistical Manual of Mental Disorders (DSM-5-TR (2022)) defines anhedonia as a “lack of enjoyment from, engagement in, or energy for life’s experiences; deficits in the capacity to feel pleasure and take interest in things”.  In modern practice, clinicians distinguish between anticipatory and consummatory anhedonia.  Anticipatory pleasure involves the prediction of pleasure from a future reward and the experience of pleasure associated with a positive prediction while consummatory pleasure involves the reward that is the actual moment of experience.  Thus, anticipatory anhedonia reflects an inability to predict the future experience of pleasure as well as lower motivation to take action toward achieving pleasure while consummatory anhedonia is the lack of pleasure in what’s experienced (ie synonymous with the original definition of anhedonia).

Specific instances are not of necessity anhedonic (although an inability to derive any enjoyment from listening to country & western music seems indicative of little more than good taste).  The exception to this seems to be the range of activities clinicians have on their “suspect categories” list, including things like sex and human friendship, and this view may reflect the long shadow Sigmund Freud (1856-1939) has cast over the profession.  Although in both Western theology and philosophy there's a discernible tradition of what verges on an insistence that humans are social creatures and that interaction with others should be both sought and enjoyed, Freud raised the bar by suggesting every form of sexual behavior among humans was “natural” (though some might be neither lawful nor desirable) except the absence of such interest.

Anhedonia accompanies a range of neuropsychiatric conditions and is frequently associated with depression although it’s not an essential component.  Clinically, anhedonia needs to be suffered as a generalised condition, not as the common phenomenon of losing interest in something specific, the latter a normal part of the human condition.  There are no specific treatments for anhedonia and some dissident psychiatrists and psychologists suggest this is a tacit admission it may be a normal part of the spectrum of human behaviour.  It is commonly treated alongside the condition of which it’s a part, including depression, bipolar disorder (the old manic depression), schizophrenia, post-traumatic stress disorder (PTSD) and the various anxiety disorders.  The association with schizophrenia is striking, the medical orthodoxy being that up to 80% of those with schizophrenia may experience anhedonia and, because it’s classified as a negative symptom (indicative of the absence of something that occurs in most healthy individuals), it’s considered more difficult to treat.