Sunday, February 12, 2023

Oikophobia

Oikophobia (pronounced oick-oh-foh-bee-uh)

(1) In political science, an aversion to or rejection of one’s own culture and traditions; a dislike of one's own compatriots.

(2) In psychiatry, one of a number of phobias related to (1) one’s home (either as a building or as place of abode), (2) returning to one’s home or (3) some or all of the contents of one’s home.

From the Ancient Greek οἶκος (oîkos) (house; household; a basic societal unit in Ancient Greece; a household or family line) + -phobia, from φόβος (phóbos) (fear).  The suffix -phobia (fear of a specific thing; hate, dislike, or repression of a specific thing) was from the New Latin, from the Classical Latin, from the Ancient Greek -φοβία (-phobía) and was used to form nouns meaning fear of a specific thing (the idea of hatred came later).  Oikophobia & oikophobe are nouns, oikophobic is a noun and adjective and oikophobically is an adverb; the noun plural is oikophobes.

Roger Scruton in his study.  Although he was a staunch conservative tied to earlier traditions, even The Guardian granted him a deservedly generous obituary.

The political sense of oikophobia (literally the antonym of xenophobia (hatred, fear or strong antipathy towards strangers or foreigners)) dates only from 2004 when it was used by English philosopher Roger Scruton (1944-2020) as part of the culture wars which swirl still around the critiques and defenses of Western civilization, the Enlightenment and the implications of post-modernism.  Scruton’s slim volume England and the Need for Nations (2004, Civitas, 64 pp, ISBN-10 1903386497) argued that empirically, on the evidence of the last two hundred-odd years, it was the nation state which best created the conditions necessary for peace, prosperity, and the defense of human rights.  There are obviously not a few examples of nation states which have proven not to be exemplars of the values Scruton prized but his argument was essentially structural: where there have been attempts to replace the nation-state with some kind of transnational political order, such things have tended to descend into totalitarian dictatorships like the old Soviet Union or evolve into bloated, unaccountable bureaucracies like the post-Maastricht European Union (EU).  It surprised nobody that he enthusiastically supported the UK’s exit (Brexit) from the EU:

 “I believe we are on the brink of decisions that could prove disastrous for Europe and for the world and that we have only a few years in which to take stock of our inheritance and to reassume it.  Now more than ever do those lines from Goethe’s Faust ring true for us: "Was du ererbt von deinen Vätern hast, Erwirb es, um es zu besitzen" (What you have inherited from your forefathers, earn it, that you might own it).  We in the nation states of Europe need to earn again the sovereignty that previous generations so laboriously shaped from the inheritance of Christianity, imperial government and Roman law. Earning it, we will own it, and owning it, we will be at peace within our borders.”

Portrait of Theodor Herzl (circa 1890), oil on canvas by Leopold Pilichowski (1869-1933), Ben Uri Gallery and Museum, London.

Scruton of course rejected the notion he was in any way xenophobic but did reference that as oikophobia’s antonym when he described the latter as a “…need to denigrate the customs, culture and institutions that are identifiably ours”; ominously implicit in his critique was the observation that it is a cultural malaise which tends to befall civilizations in the days of decline before their fall.  Plenty have documented the mechanisms by which faith in Western civilization was undermined, their phrases famous landmarks in the development of post-modernism, including “cultural relativism”, “march through the institutions” & “deconstructionism” et al.  However, in a political context the idea of oikophobia wasn’t then entirely new, the idea of the “self-hating Jew” having been documented by Austro-Hungarian Jewish lawyer Theodor Herzl (1860–1904) in his 1896 book The Jewish State.  Regarded as “the father of modern political Zionism”, Herzl denounced those who opposed his model of a Jewish state in Palestine, calling them “disguised anti-Semites of Jewish origin”.  Essentially, Herzl saw being Jewish not merely as compulsory for Jews but defined the only “true” Judaism as his Zionist vision; despite that, among European Jews, especially the educated and assimilated, Zionism was by no means universally supported and both sides weaponized their vocabularies.  In 1930, German Jewish philosopher Theodor Lessing (1872–1933) published Juedischer Selbsthass (Jewish Self-Hatred) and from then onwards “self-hating Jew” came to be slung at those (often intellectuals) opposed to Zionism.  In 1933, Lessing (who had fled to Czechoslovakia) was murdered at the instigation of the Nazis.  In the post-war years “self-hating Jew” has come to be used by Israeli politicians against any Jew who opposes their policies, often with as little basis as “fascist” came to be deployed in post-Franco Spain.

Before it was picked up in political science and purloined for the culture wars, oikophobia had been a technical term in psychiatry to refer to a patient’s aversion to a home environment, or an abnormal fear (phobia) of being in their own home, the companion terms being (1) ecophobia (fear of a home environment), the construct being eco- (from the French eco-, from the Latin oeco, from the Ancient Greek οἶκος (oîkos) (house, household)) + -phobia & (2) nostophobia (a fear of, or aversion to, returning to one's home), the construct being the Ancient Greek νόστος (nóstos) (a return home) + -phobia.  It was the idea of the “unwillingness to return home” that was later absorbed by the deconstructionists and other post-modernists in the sense of “an aversion to the past, the antithesis of nostalgia” because in their assault on Western society, it was the political and social relics they attacked, condemning them as symbols (indeed tools) of oppression and mechanisms by which the power elite maintained their hegemony.  Thus, Western legal & theological traditions and the artistic & literary canon were just constructs among many and, because of their oppressive history, needed to be overthrown.

In psychiatry, oikophobia, ecophobia & nostophobia could also be used of patients exhibiting the symptoms of phobia relating to all or some of the contents of a house: electrical appliances, the plumbing, the cupboards, the furniture, the light fittings etc.  So specific were some of these cases (and there were some not unjustified, such as a fear of certain allergy-inducing substances such as chemicals) that the profession added domatophobia (a specific fear of a house as opposed to its contents), the construct being domato- (from the Middle French domestique, from the Latin domesticus, from domus (house, home)) + -phobia.  In the years after World War II (1939—1945), the word domatophobia came to be used by journalists to describe what was emerging as a mass phenomenon: women attracted to careers outside the home, this explained by (usually male) journalists as “a fear of or aversion to housework”, housework presumably being their proper role.

Saturday, February 11, 2023

Oiler

Oiler (pronounced oi-ler)

(1) A person or device which in some way delivers oil.

(2) A worker employed to oil machinery.

(3) Any of several devices, other than pressure devices, for feeding lubricating oil to a bearing.

(4) In oil exploration, a productive well.

(5) An oilcan.

(6) An oilskin garment, especially a coat.

(7) A ship which uses oil as fuel (archaic).

(8) In admiralty slang, an oil tanker used to refuel other vessels.

(9) In admiralty slang, an assistant in the engine room of a ship, senior only to a wiper, mainly responsible for keeping machinery lubricated (archaic).

(10) In the cleaning kits of firearms, a small (typically thumb-sized) metal container of oil, often containing an integral brush.

(11) As an ethnic slur (mostly southern US), a Mexican (sometimes extended to others of Latino appearance).

Circa 1290: The construct was oil + -er.  Oiler was from the Middle English olyer, oyller & oyellere.  Oil was from the Middle English oile (the later alternative spellings included oylle, olie, oli, eoli, eoyle, olige, oyll, uile & oyl), from the Anglo-Norman olie and the Old French oile, from the Latin oleum (olive oil), from the Ancient Greek ἔλαιον (élaion) (olive oil).  The –er suffix was from the Middle English –er & -ere, from the Old English -ere, from the Proto-Germanic -ārijaz, thought most likely to have been borrowed from the Latin –ārius where, as a suffix, it was used to form adjectives from nouns or numerals.  In English, the –er suffix, when added to a verb, created an agent noun: the person or thing doing the action indicated by the root verb.  The use in English was reinforced by the synonymous but unrelated Old French –or & -eor (the Anglo-Norman variant -our), from the Latin -ātor & -tor, from the primitive Indo-European -tōr.  When appended to a noun, it created a noun denoting an occupation or describing the person whose occupation is the noun.  The meaning “an appliance for distributing oil in machines" was in use by 1861 and was adopted by the British Admiralty in 1916 to describe "navy vessels carrying oil for use by other ships"; although such vessels had been in use for some years, the Royal Navy having begun the conversion from coal to oil a decade earlier, by 1911 only the submarine fleet ran exclusively on oil and coal (sometimes sprayed with oil) still fuelled most of the navy’s vessels.

Evolution of the Ford 427 side-oiler

The side valve (usually called “the flathead”, an allusion to the almost flat plate covering the combustion chambers) Ford V8 of 1932 is remembered for its vices as well as the many things which made it one of the great engines of the mid-century.  In the 1930s, those vices could be both forgiven and worked around but by 1953 it was still in production and outdated (though in overseas production it would continue: in French Simca cars until 1961, in Brazilian Fords until 1964 and, remarkably, until 1990 in the Simca Unic Marmon Bocquet military truck).  For 1954, Ford responded to the modern overhead valve (OHV) V8s others had introduced with the debut of two new engines, essentially (by the standards of the time) small and big block versions of the same design.  Known as the Y-Blocks because of the shape of the castings, they were sturdy pieces of machinery and addressed many of the problems identified in the flathead over two decades of production but neither was suited to the evolutionary path the American automobile would follow during the 1950s.

1962 Ford 406 FE V8 with 3 x 2 barrel carburetors.

That path was not one which anyone in Detroit was likely to foresee in the late 1940s when the design work on the Y-Blocks began but by 1954, they were at least competitive.  However, in 1955, Chevrolet introduced their small-block V8 which was light, compact and free-breathing, none of which could be said of the Y-Blocks and, more importantly, the design afforded a potential for development which would play out over decades.  By contrast, the Y-Blocks’ potential in both capacity and power output soon plateaued and Ford was forced to resort to exotic solutions like supercharging, something not practical for low-cost mass-manufacturing.  Ford’s solution was not one new V8 but three, all released during 1958: the SD (Super Duty, a large, low-revving truck engine), the MEL (a big block for what were now very large Lincolns and Mercurys) and the FE (thought at the time a big-block but subsequently listed by pedants as a mid-block because later castings would out-weigh it by so much).  The durable SD would remain in the catalogue until 1980, its demise prompted only by the implications of the second oil-shock in 1979, the sole complaint about it being its prodigious thirst.  The MEL would last a decade, early attempts to use it on the race-tracks abandoned because of the penalty imposed by excessive weight although it did enjoy some success in powerboat racing where its capacity to run reliably at full throttle for sustained periods was highly valued.

1966 Ford 427 FE V8 side oiler with tunnel-port cylinder heads and Kar-Kraft transaxle, the specification used in the mid-engined Ford GT40s which recorded a 1-2-3 finish at that year's Le Mans 24 hour classic.

Although it would rapidly earn a stellar reputation which endures to this day, Ford’s FE V8 engine didn’t enjoy a wholly auspicious start, associated as it was with the ill-fated Edsel (FE really did stand for “Ford-Edsel” despite some post-debacle attempts to suggest “Ford Engine”; the contemporary MEL decodes as Mercury-Edsel-Lincoln).  However, whatever the problems of the Edsel, the use of the FE in some models was not one of them.  Offered in its first few seasons in several displacements, the most produced in the 1960s would be the 352 & 390 cubic inch (5.8 & 6.5 litre) versions, both of which briefly were offered in high-performance versions until the decision was taken to develop such engines (to be used in competition) as a separate FE branch, the first fruit of which was the 406 (6.6 litre) which debuted in 1962.  The 406 had performed well on Ford’s test-rigs, its output slightly exceeding the engineers’ projections and when installed in the new, slippery bodies offered that year, the combination proved fast on the racetracks.  The power however came at the cost of reliability and the increasing speeds on the circuits had exposed weaknesses in the bottom-end, the main bearing caps “walking” when vibrations attained a certain resonance.

By their bolts they shall be known.  By convention a "four bolter" was one with all four bolts into the boss while in a "cross bolter" two were in the boss and two into the block.  "Six bolters" (with four in the boss, two in the block) are now common.

The solution was to “cross-bolt” the caps; an additional two securing bolts (installed sideways through the block) per cap augmenting the pairs mounted in the conventional vertical position.  This approach, still widely used to this day, proved successful and was carried over when in 1963 the FE was further enlarged to 425 cubic inches (7.0 litre), Ford labelling the new mill the 427 to align it with the displacement limit used by both NASCAR (National Association for Stock Car Auto Racing) and the FIA (Fédération Internationale de l'Automobile (the International Automobile Federation and world sport’s dopiest regulatory body)).  However, greater capacity meant more power, higher speeds and increased heat and the 427 began also to suffer, the higher internal pressures meaning lubrication to the now cross-bolted main bearings had become marginal.  Ford’s solution was to reverse the priority with which oil was delivered.  The original design (subsequently known as the “top oiler”) lubricated first the valve-train at the top of the engine, then the main bearings which supported the crankshaft.  The new process reversed this order and the design became known as the side-oiler: on one side of the block was cast an additional oil gallery, the bulge being the external distinguishing feature of the new arrangement.

By their oil galleries they shall be known: Lubrication systems of 1964 Ford 427 FE V8 top oiler (left) & Ford 427 FE V8 side oiler (right).

Introduced in 1965, the side-oiling proved the final solution and the 427 became a paragon of reliability, powering even the Le Mans 24 hour-winning GT40s in 1966 & 1967.  Today the 427 is perhaps best remembered as the power-plant in the Shelby American AC Cobra 427 (although some of those actually used the rather more tame FE 428) but in those happy days when one could tick a box and have what was essentially a racing engine installed in a road car, it was available also in full-sized machines (the Galaxie), intermediates (the Fairlane) and, at the tail-end of production, a few (by then somewhat toned down) were even put in the Cougar, Mercury’s Mustang-based take on the pony-car.  By then however, the side-oiler’s days were numbered because not only was it noisy, apt to be cantankerous and a bit of an oil-burner but the complex lubrication and cross bolting also made it quite expensive to build, added to which the big bore was close to the limit the FE block could accommodate so during the manufacturing process, even a slight shift in the casting cores meant a scrapped block.  Thus the attraction, for most (non-racing) purposes, of the 428 with its smaller bore.

Hair oiler: Lindsay Lohan recommends Nexxus Hair Oil and provides the following technical recommendation for hair care products: "I would select three. I love the hair oil because I think we always tend to overdo putting products in our hair, so it's good to give it some rest and let it refresh and replenish. I love a hairspray that doesn't make your hair hard and crunchy. And so the Nexxus XXL Volume Spray does that—it holds, but it doesn't make your hair feel icky, and that's important to me. And then the Slick Stick is good because I mean, I'm always on the go and sometimes I just need it to look nice and chic and back and clean-looking. So that's perfect for that. These are actually products that I use and like."

As supplied ex-factory: Ford 427 SOHC on stand.

The side oiler also provided the basis for one engine which wasn’t exactly mythical, because quite a few were built, but remains mysterious because nobody seems quite sure how many, the consensus being it was somewhere in the mid-three figures, the last of which (in a crate) wasn't sold until 1970 although production ended in 1967.  This was the 427 SOHC (single overhead camshaft (the “sock” in the slang of some)) which for all sorts of reasons never made it onto the circuits for which it was intended nor into even one road car, despite the wishes of many.  Popularly known as “the cammer”, even some sixty years on there’s still a mystique surrounding it and if one can’t find an original for sale (one sold at auction in 2021 for US$60,000), it’s possible still to buy from a variety of manufacturers all the bits and pieces needed to build one (in a quirk of timing and the overlap of simultaneous product development, some of the very early SOHCs used the top oiler block although most were side oilers and the third-party reproductions over the years have always been the latter).  Although the production numbers have never been verified (which seems strange given Ford's accounting system recorded everything which emerged with a serial number), what all agree is that the horsepower of a stock SOHC was somewhere over 600, the number bouncing around a bit because there were versions with single and dual four-barrel carburetors, different camshaft profiles and variations in the cylinder heads; while it never made it into a production car, it remains the ultimate FE.

Friday, February 10, 2023

IIII

IIII (pronounced fawr (U) or fohr (non-U))

A translingual form, an alternative form of IV: the Roman numeral representing four (4), the other known forms being iv, iiii & iiij.

Circa 500 BC: The Roman numeral system spread as Roman conquest expanded and remained widely used in Europe until, from circa 1300, it was replaced (for most purposes) with the more adaptable Hindu-Arabic system (including the revolutionary zero (0)) which remains in use to this day.

IIII as a representation has long been restricted to the value 4 itself rather than appearing as a component of larger numbers.  To avoid numbers becoming too cumbersome, the Roman system came to use subtraction when a smaller numeral precedes a larger numeral, so the number 14 would be represented as XIV instead of XIIII.  The convention which emerged was that a numeral can precede only another numeral which is less than or equal to ten times its own value, so I can precede only (and thus be subtracted from) V (five) & X (ten).  However, these “rules” didn’t exist during Antiquity and weren’t (more or less) standardized until well into the medieval period; it’s thus not unusual to find old documents where 9 is represented as VIIII instead of IX.  The practical Romans, unlike the Greeks for whom abstraction was a calling, were little concerned with the concepts of pure mathematics, such as number theory, geometric proofs and other abstract ideas, devoted instead to utilitarian purposes such as financial accounting, keeping military records and building things.

The numeral system had to be manageable enough for simple calculations like addition and subtraction, so it was attractive to keep the text strings conveniently short: 44 as XLIV is obviously preferable to XXXXIIII.  Although its limitations seem obvious to modern eyes, given the demands of the times, the system worked remarkably well for almost two millennia despite the largest numeral being M (1000).  It was silly to contemplate writing a string of a thousand M’s to indicate a million (presumably not a value then often used) so the Romans concocted a bar (the vinculum) which, when it appeared above a numeral, denoted a multiplier of 1000: MMMMMM (6000) could thus appear as V̄Ī and a million as M̄.  Compared with the Hindu-Arabic system, it was a fudge but one which for centuries proved serviceable.
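For the curious, the difference between the two styles (XLIV versus XXXXIIII) can be sketched in a few lines of Python; this is an illustration only, the greedy table-driven conversion being a standard programming exercise rather than anything the Romans codified:

```python
# Sketch of the two conversion styles described above.
# The value/numeral tables are the conventional modern ones.

def to_roman(n: int) -> str:
    """Render 1-3999 in the subtractive style (4 -> IV, 9 -> IX)."""
    pairs = [(1000, "M"), (900, "CM"), (500, "D"), (400, "CD"),
             (100, "C"), (90, "XC"), (50, "L"), (40, "XL"),
             (10, "X"), (9, "IX"), (5, "V"), (4, "IV"), (1, "I")]
    out = []
    for value, numeral in pairs:
        while n >= value:        # greedily take the largest value that fits
            out.append(numeral)
            n -= value
    return "".join(out)

def to_roman_additive(n: int) -> str:
    """Render in the older, purely additive style (4 -> IIII, 9 -> VIIII)."""
    pairs = [(1000, "M"), (500, "D"), (100, "C"),
             (50, "L"), (10, "X"), (5, "V"), (1, "I")]
    out = []
    for value, numeral in pairs:
        while n >= value:
            out.append(numeral)
            n -= value
    return "".join(out)

print(to_roman(44))           # XLIV
print(to_roman_additive(44))  # XXXXIIII
```

Run for 14, the sketch yields XIV in the subtractive style and XIIII in the additive, matching the example above.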

Where Roman numbers are occasionally still used (book prefaces & introductions, some aeroplanes & automobiles and, charmingly, some software), the number four is almost always represented by IV rather than IIII.  One exception to this however is watch & clock faces where the use of IIII outnumbers IV, regardless of the cost of the device.  Watchmakers have provided many explanations for the historical origin of this practice, the most popular of which dates from Antiquity: because “I” stood for the “J” and “V” for the “U”, IV would be read as JU and thus Jupiter, an especially venerated Roman god, Jupiter Optimus Maximus being the king of all gods, chief of the pantheon and protector of ancient Rome.  The suggestion is that invoking the name of Jupiter for such a banal purpose would be thought offensive if not actually blasphemous.  Thus IIII it became.

Lindsay Lohan wearing 19mm (¾ inch) Cartier Tank Americaine in 18 karat white gold with a quartz movement and a silver guilloche dial with Roman numerals including the traditional IIII.  The Cartier part-number is B7018L1.

There’s the notion too that the convention arose just because of one of those haphazard moments in time by which history sometimes is made.  The appearance of IIII was said to be the personal preference of Louis XIV (1638–1715; le Roi Soleil (the Sun King), King of France 1643-1715), the Sun King apparently issuing an instruction (though there’s no evidence it was ever a formal decree) that IIII was the only appropriate way to write the number four, watchmakers ever since tending still to comply.  Whether Louis XIV wished to retain some exclusivity in the IV which was part of “his” XIV isn’t known and it may be he simply preferred the look of IIII.  Despite the belief of some, it’s anyway wrong to suggest IIII is wrong and IV right.  The design of the IIII was based upon four outstretched fingers, which surely had for millennia been the manner in which the value of 4 was conveyed in conversation, and V denoted 5 in tribute to the shape the hand formed when the thumb was added.  The IV notation came later and, because it better conformed with the conventions used for writing bigger numbers, came in medieval times to be thought correct; it was thus adopted by the Church, becoming the “educated” form and that was that.

Not all agree with those romantic tales however, the German Watch Museum noting that in scholarly, ecclesiastical and daily use, IIII was widely used for a millennium, well into the nineteenth century, while the more efficient IV didn’t appear with any great frequency until circa 1500.  The museum argues that the watch and clock-makers’ concerns may have been readability and aesthetics rather than any devotion to historic practice, IIII having display advantages in an outward-facing arrangement relative to the centre of the dial (ie partially upside down, such as on wall, tower or cuckoo clocks), any confusion between IV (4) & VI (6) eliminated.  Also, a watch, while a functional timepiece, is also decorative and even a piece of jewellery so aesthetics matter, the use of IIII rendering the dial symmetrically balanced because fourteen individual characters exist on each side of the dial and the IIII counterbalances the opposite VIII in the manner IX squares off against III.  So there’s no right or wrong about IIII & IV but there are reasons for the apparent anomaly of the more elegant IV appearing rarely on the dials of luxury watches.

Thursday, February 9, 2023

Gown

Gown (pronounced goun) 

(1) A type of woman's dress or robe, especially one full-length and worn on formal occasions and often styled as “evening gown” or “ball gown”.

(2) As nightgown, a loose-fitting garment worn while sleeping (historically by both men & women but now most associated with the latter); the shortened form is “nightie”.

(3) As surgical gown, a light, protective garment worn in hospitals by medical staff, a specialized form of which is the isolation gown.

(4) As dressing gown (also called bathrobe), a garment in the form of an open robe secured by a tie and often worn over pajamas, after a bath and prior to dressing or on other occasions where there’s no immediate need to dress.

(5) A loose, flowing outer garment in various forms, worn to denote an office held, profession practiced or as an indication of rank or status, most associated with formal academic dress (sometimes in the phrase “cap & gown”).

(6) Those who work or study at a university as opposed to the other residents of the university town, expressed in the phrase “town & gown”.

(7) Historically, the dress of civil, as opposed to military officers.

(8) To supply with or dress in a gown.

1300-1350: From Middle English goune & gowne, from Anglo-Norman gune & goune (fur-trimmed coat, pelisse), from the Old French goune (robe, coat; nun's habit), from the Late Latin gunna (a garment of fur or leather), from the Ancient Greek γούνα (goúna) (coarse garment), of unknown origin but perhaps from a Balkan or Apennine language where it seems to have been used as early as the eighth century to describe a fur (or fur-lined), cloak-like garment worn by old or infirm monks; more speculatively, some scholars suggest a Celtic source.  The alternative explanation suggests a Scythian origin, from the Proto-Iranian gawnám (fur), the possibility of this link supported by the Younger Avestan gaona (body hair) and the Ossetian гъун (ǧun).  The alternative spelling gowne is obsolete and descendants in other languages include the Bengali গাউন (gaun), the Japanese ガウン (gaun), the Korean 가운 (gaun), the Malay gaun, the Punjabi ਗਾਊਨ (gāūna) and the Welsh gown.  Gown is a noun and verb and gowned is an adjective; the noun plural is gowns.

Surgeon in blood-splattered surgical gown (also called hospital or medical gowns), mid-surgery.

As late as the eighteenth century, gown was the common word for what is now usually described as a dress; gown in this sense persisted in the US longer than in the UK and on both sides of the Atlantic there was something of a twentieth-century revival as the applied uses (bridal gown, nightgown etc) became more or less universal.  The meaning “a loose, flowing outer garment in various forms, worn to denote an office held, profession practiced or as an indication of rank” emerged in the late fourteenth century and the collective singular for “residents of a university” dates from the 1650s, still heard in the rhyming phrase “town & gown”.  The night-gown (worn once by both men & women but now associated almost exclusively with the latter) became a thing in the fourteenth century.

Lindsay Lohan in white & black color-blocked bandage dress.

Dress dates from circa 1300 and was from the Middle English dressen & dresse (to arrange, put in order), from the Anglo-Norman & Old French dresser & drecier (which persists in English as dresser), from the unattested Vulgar Latin dīrēctiāre, from the Classical Latin dīrēctus, the perfect passive participle of dīrigō (to arrange in lines, direct, steer), the construct being dis- (the prefix in this context meaning “apart; asunder; in two”) + regō (to govern, manage), ultimately from the primitive Indo-European h₃reǵ- (straight, right).  The noun dress was derived from the verb and emerged in the sense of “attire” in the early 1600s.  Originally, a dress was always something which covered both the upper and lower parts of the female body but not of necessity in one piece.  The dressing gown seems first to have been described as such in 1854 although in French both robe de chambre (dressing gown) & robe de nuit (nightgown) had been in use for centuries.

Lindsay Lohan in dressing gowns; in the US such things would usually be called bathrobes.

Robe dates from the mid-thirteenth century Middle English robe & robbe and was from the Old French robe, robbe & reube (booty, spoils of war, robe, garment), from the Frankish rouba & rauba (booty, spoils, stolen clothes (literally “things taken”)), from the Old High German roub, from the Proto-Germanic raubō, raubaz & raubą (booty, that which is stripped or carried away), from the primitive Indo-European Hrewp- (to tear away, peel off).  The noun use of robe to refer to garments had entered general use by the late thirteenth century, an adoption of a meaning from the Old French, presumably because fine clothing looted from defeated enemies was among the most prized of the spoils of war.  The Old French robe (and the alternative spellings) had as concurrent meanings both “clothing” & “plunder” as did the Germanic forms including the Old English reaf (plunder, booty, spoil; garment, armor, vestment).  By the late thirteenth century robe had assumed the meaning “a long, loose outer garment reaching almost to the floor, worn by men or women over other dress”, the closest European equivalents being the twelfth century Old French robe (long, loose outer garment) and the Old High German rouba (vestments).  In royal, academic and ecclesiastical circles, the particular style of robes became regulated to denote rank, function or membership of a religious order and royal courts would include offices like “page of the robes”, “mistress of the robes”, “master of the robes” etc although those titles are (to modern eyes) misleading because their responsibilities extended to garments generally and not just robes as they’re now understood.  The metonymic sense of “the robe” for "the legal profession" dates from the 1640s, a reference to the dark robes worn by advocates when appearing in court.
Robe went on productively to be adopted for other purposes including (1) in the US, the skin of a bison (later applied to other slaughtered beasts) used as a cloak or wrap, (2) a short form of wardrobe (especially when built into a wall rather than being stand-alone) and (3) the largest and strongest leaves on a tobacco plant.

Singer Dr Taylor Swift in academic gown after being conferred an honorary doctorate in fine arts from New York University, May 2022.

In formal and vocational use, gown and robe are well understood and there tends not to be overlap except among those unacquainted with such things.  That’s understandable because to the casual observer the things can look much the same and the differences in nomenclature are more to do with tradition than style or cut.  Judges for example wear judicial robes and in the US these are usually black whereas elsewhere in the English-speaking world they can be of quite vivid hues, red and scarlet the most admired.  The US influence however seems pervasive and the trend is now almost universally black, certainly among newly established courts; in the same courts, barristers’ robes look much the same but the term “judicial robe” is exclusive to the bench, the advocates’ garments variously called “barristers’ robes”, “legal robes” or “lawyers’ robes”.  Academics however wear gowns and again, the Americans tend to favor black while in the English tradition, all the colors of the rainbow have been seen.  These differ from surgical gowns (also known as hospital or medical gowns) which, compared with just about every other gown, really aren’t gowns at all.  Surgical gowns are made usually in a blue, beige or green pastel color (the better to show the blood) and are a kind of inverted dress which is fastened at the back (by an assistant so the wearer’s fingers don’t pick up germs).  In the UK parliament, there were many robes for offices of state and the one worn by the speaker made its way to colonial and dominion parliaments.  They're now rarely worn except on ceremonial occasions and the best known is probably that of the UK’s chancellors of the exchequer although the last one, dating from the late nineteenth century, is said to have “gone missing” while Gordon Brown (b 1951; UK prime-minister 2007-2010) was chancellor.

New South Wales (Australia) Supreme Court and Court of Appeal judges in judicial robes during the pandemic.

It’s in women’s fashion where the distinction between a gown and a dress can become muddied and probably most illustrative is the matter of the “wedding dress” and the “wedding gown”.  Even among couturiers, there’s actually no agreed definition of where one ends and the other begins and it’s very much in the eye of the beholder although the eye of the retailer is doubtless quite an influence, the theory being that the grander the design and the more the fabric, the more plausible is the label “wedding gown” and the higher the price-tag.  These informal (but serviceable) rules of thumb work also for dresses & gowns in general, the distinction more one of semantics and personal preference although in saying that, it’s only at the margins where there can be confusion; a minimalist LBD (little black dress) would never be confused with a gown and the grandest creations recalling those worn at the famous balls held in conjunction with the Congress of Vienna (1814-1815) would never be called dresses.


Watercolor of one of the many balls held during the Congress of Vienna.

Despite that, in the narrow technical sense, to a seamstress, all gowns are dresses, but not all dresses are gowns and as late as the early eighteenth century the word "dress" was still not the exclusive province of women’s clothing ensembles.  In recent centuries, the dress has been defined by its modifiers (sun-dress, summer-dress, evening-dress, travelling dress, riding-dress etc) and the modern convention seems to be that if an invitation specifies semi-formal then an evening dress is expected; that might be something thought a gown, but not necessarily.  However, when an invitation states that the occasion is formal, women are expected to wear an evening gown.  Classically, that’s understood to be something at once precise yet frivolous, with a tight-fitting bodice and a skirt which reaches to the floor and this was once the accepted standard for any red-carpet event of note but the recent trend towards outrageous displays of skin has, in the entertainment industry, subverted the tradition although the audience is expected still to adhere.


Lindsay Lohan in a diaphanous gown, Met Gala, New York, 2007.

Wednesday, February 8, 2023

Formalism

Formalism (pronounced fawr-muh-liz-uhm)

(1) Strict adherence to, or observance of, prescribed or traditional forms, as in music, poetry and art.

(2) In religion, a strong attachment to external forms and observances.

(3) In philosophy (ethics), a doctrine that acts are in themselves right or wrong regardless of consequences.

(4) In mathematics (formal logic), a doctrine that mathematics, including the logic used in proofs, can be based on the formal manipulation of symbols without regard to their meaning (the mathematical or logical structure of a scientific argument as distinguished from its subject matter; the theory a statement has no meaning but that its symbols, regarded as physical objects, exhibit a structure that has useful applications).

(5) A scrupulous or excessive adherence to outward form at the expense of inner reality or content.

(6) In Marxist criticism, scrupulous or excessive adherence to artistic technique at the expense of social values etc (a view also adopted by some non-Marxist critical theorists).

(7) In performance art and theatre, a deliberately stylized mode of production.

(8) In both structural engineering and computer science, the notation, and its structure, in (or by) which information is expressed.

1830–1840: The construct was formal + -ism.  Formal was from the Middle English formel, from the Old French formel, from the Latin formalis, from forma (form) of unknown origin but possibly from the Etruscan morma, from the Ancient Greek μορφή (morphḗ) (shape, fashion, appearance, outward form, contour, figure), dissimilated as formīca and possibly formīdō.  The –ism suffix was from the Ancient Greek –ismos & -isma noun suffixes, often directly, often through the Latin –ismus & -isma, though sometimes through the French –isme or the German –ismus, all ultimately from the Greek.  It appeared in loanwords from Greek, where it was used to form action nouns from verbs and on this model, was used as a productive suffix in the formation of nouns denoting action or practice, state or condition, principles, doctrines, a usage or characteristic, devotion or adherence (criticism; barbarism; Darwinism; despotism; plagiarism; realism; witticism etc).  Although actual use of the word formalism dates only from its adoption (1830s) in critical discourse, disputes related to the matter can be found in texts since antiquity in fields as diverse as agriculture, literature and theology.  Formalism is a noun, formalist is a noun & adjective, formalistic is an adjective and formalistically is an adverb; the usual noun plural is formalists.

Comrade Stalin, Comrade Shostakovich and Formalism

Comrade Stalin (1878–1953; leader of the USSR, 1924-1953) didn’t invent the regime’s criticism of formalism but certainly made it famous after comrade Dmitri Shostakovich (1906-1975) was denounced in the Soviet newspaper Pravda (Truth) in January 1936, after the Moscow performance of his opera Lady Macbeth of the Mtsensk District.  Stalin didn’t like music he couldn’t whistle and the complex strains of Shostakovich’s opera, sometimes meandering, sometimes strident, certainly didn’t permit that; he labeled the composition формализм (formalism), "chaos instead of music", a self-indulgence of technique by a composer interested only in the admiration of other composers, an audience of no value in the school of Soviet realism.  It’s believed the Pravda article may have been written by Stalin himself and he used the word "formalism" in the sense it was understood in English; formality being the observance of forms, formalism the disposition to make adherence to them an obsession.  To Stalin, the formal rules of composition were but a means to an end and the only legitimate end was socialist realism; anything other than that was "an attack on the people".  Lest it be thought the defeat of fascism in the Great Patriotic War (1941-1945) might have mellowed his views in such matters, Stalin at the 1948 party congress made sure the point was hammered home in the Communist Party's brutish way:  

"Comrades, while realistic music is written by the People's composers, formalistic music is written by composers who are against the People.  Comrades, one must ask why it is that realistic music is always written by composers of the People? The People's composers write realistic music simply due to the fact that being by nature realists right to their very core, they simply cannot help writing music that is realistic, while those anti-People composers, being by nature unrepentant formalists, cannot help... cannot help... cannot help writing music that is formalistic."

Comrade Stalin signing death warrants.

In the Soviet Union, producing or performing stuff hated by Stalin was not a good career move.  Shostakovich completed his Fourth Symphony in C minor, Opus 43, in May 1936 and, even after the attack in Pravda, planned to stage its premiere in Leningrad that December but found the orchestra unwilling to risk incurring the Kremlin’s wrath and almost as soon as rehearsals began, the orchestra's management cancelled the performance, issuing a statement saying comrade Shostakovich had withdrawn the work.  Actual responsibility for the decision remains unclear but it was certainly in accord with the views of the Kremlin and not until 1961, almost a decade on from Stalin’s death, was it performed.

Comrade Shostakovich at his dacha.

Shostakovich became a realist, his response to denunciation the melodic Fifth Symphony in D minor, Opus 47.  Premiered in November 1937 in Leningrad, it was a resounding triumph, attracting a standing ovation that lasted more than thirty minutes.  The following January, just before its first performance in Moscow, an article, under the composer’s name, appeared in the local newspaper Vechernyaya Moskva in which he described the Fifth Symphony as "…a Soviet artist's creative response to justified criticism."  Whether Shostakovich actually wrote the piece isn’t known but there’s never been any doubt it’d never have been published without Stalin’s approval and the success of the Fifth Symphony re-personed Shostakovich.  Whatever it lacked in glasnost (openness), it made up for in perestroika (restructuring) and the party engineered his rehabilitation as carefully as it had his fall a couple of years earlier, anxious to show how those bowing to its demands could be rewarded as easily and fully as dissidents could be punished.

Tuesday, February 7, 2023

Awful

Awful (pronounced aw-fuhl)

(1) Extremely bad; unpleasant; ugly.

(2) Inspiring fear; dreadful; terrible.

(3) Solemnly impressive; inspiring awe; full of awe; reverential (obsolete).

(4) Extremely dangerous, risky, injurious, etc.

(5) Very; extremely.

1250-1300: From the Middle English agheful, awfull, auful, aueful & aȝefull (worthy of respect or fear, striking with awe; causing dread), the construct of all based on the idea of awe +‎ -ful (aghe the earlier form of awe), the same model as the Old English eġeful & eġefull (terrifying; awful).  Etymologists treat the emergence in the early nineteenth century (1809) of the meaning “very bad” as a weakening of the original sense but it can be regarded as a fork and thus a parallel path in the same way as the sense of "excessively; very great" which is documented since 1818.  Interestingly, there’s evidence from the late sixteenth century of spasmodic use of awful that was more a variation of the original, meaning “profoundly reverential, full of awe” (awe in this case a thing more of reverence than fear and trepidation).  The spellings awfull, aweful & awefull are all obsolete although some dictionaries list awfull as archaic, a fine distinction of relevance only to lexicographers.  Awful is an adjective & (in colloquial US use, mostly south of the Mason-Dixon Line) an adverb, awfully is an adverb, awfuller & awfullest are adjectives, awfulize is a verb and awfulization & awfulness are nouns; in slang the non-standard noun plural “awfuls” is used in the same sense as the disparaging “ghastlies”.

The adverb awfully (which would later assume a life of its own) around the turn of the fourteenth century meant "so as to inspire reverence"; by the end of the 1300s it had come also to mean "dreadfully, so as to strike one with awe" (in the sense of “fear & dread”) and this was by the 1830s picked up as a simple intensifier meaning "very, exceedingly", Henry Fowler (1858–1933) in his A Dictionary of Modern English Usage (1926) noting with his usual weary disapproval that awfully’s “downward path” was such that it was now nothing but a synonym of “very”.  That seems harsh given “awfully” would seem able to convey a nuance and even Fowler conceded that in Ancient Greek the equivalent word αἰνῶς (ainôs) was used to mean both (1) “horribly, dreadfully, terribly” & (2) “very, extremely, exceptionally” but despite his reverence for all things Hellenic, he didn’t relent.

Awfully good: Lindsay Lohan at the premiere of Mr & Mrs Smith, Los Angeles, June, 2005.  A kind of elaborated bandage dress with some nice detailing, the dress Lindsay Lohan wore in 2005 attracted much favourable comment, as did the designer's sense of restraint, necklaces and other embellishments eschewed, a sprinkle of freckles presumably thought adornment enough.  A dress like this encapsulates the simple rule: When in doubt, stick to the classics.

The adjective god-awful (also as godawful) had an even more muddled evolution, the Oxford English Dictionary (OED) in 1878 listing the meaning “impressive” before, a decade later, revising this to “impressively (ie “very”) terrible”, which better reflects the sense in which it seems always to have been applied since being coined as a colloquialism in US English.  In use it’s thought to have been mostly part of oral speech and, except in dictionary entries, it appeared rarely in print prior to the 1920s, so the origin is obscure, etymologists pondering whether “God” was used as a simple intensifier or in the sense of the frequent “God's awful vengeance, judgment etc”, a phrase common in religious literature.

As adjectives, the usual forms of the comparative & superlative are respectively more awful & most awful but dictionaries continue to acknowledge awfuller & awfullest as correct English although most users would probably flag both as “wrong” and their clumsy sound means they’re avoided even by those aware of their status.  The verbs awfulize, awfulizes, awfulizing & awfulized are technical terms in psychotherapy which describe patients reacting dramatically or catastrophically to distressing events, usually in the sense of a disproportionate reaction; the noun form is awfulization.  Perhaps surprisingly, social media users seem not to have picked up “awfulization”; it would seem a handy descriptor of much content.

A sentence like “it was a godawful book and awfully long but awfully well-written” actually makes sense and few would be confused because the various meanings are conveyed by context.  So, despite the tangled history, awful and its derivatives usually present few problems, even the colloquial “something awful” (“awfully; dreadfully; terribly” except in North America (mostly south of the Mason-Dixon Line & among classes so influenced) where it means “very, extremely”) usually able to be decoded: “I was hungry something awful” and “there’s something awful about crooked Hillary Clinton” both unambiguous even if the former sounds strange to those accustomed to “educated speech”, a term much criticized but well-understood.

Awful: Lindsay Lohan at the afterparty for Roberto Cavalli's fashion show, Milan Fashion Week, March 2010.  Although they tend to group-think, fashion critics are not monolithic but none had a good word to say about this outfit, the consensus being: Awful.  A few grudgingly granted a pass to the glittering Roberto Cavalli harem pants but the fur gilet was condemned as if Ms Lohan had with her bare hands skinned a live mink, eating the liver; these days, even faux fur seems grounds for cancellation.  Some, presumably those who picked up a photo from the agencies, called it a stole and at certain angles it resembled one but it really was a gilet.  As a footnote, many did admire the Fendi platform pumps so there was that, though nobody seemed to think they redeemed things.

Gilet was from the French gilet (vest, waistcoat), from the regional Italian gileccu (Calabria), gilecco (Genoa), gelecco (Naples) & ggileccu (Sicily) (though the standard Italian gilè was borrowed directly from the French), from the Turkish yelek (jelick; vest, waistcoat), from the Proto-Turkic yẹl (“wind”), with the final syllable modified to match other styles of garments such as corselet and mantelet.  Historically a gilet was (1) a man’s waistcoat & (2) a woman’s bodice a la the waistcoat or a decorative panel either attached to the bodice or worn separately.  In modern use, a gilet is a sleeveless jacket which can be closed to the neck and is often padded to provide warmth.  Some puffer jackets and garments described as bodywarmers can be classed as gilets.

Stole was from the Old English stole, from the Latin stola, from the Ancient Greek στολή (stolḗ) (stole, garment, equipment).  The original stoles were ecclesiastical vestments and were decorated bands worn on the back of the neck, each end hanging over the chest (reaching sometimes to the ground) and could, inter alia, be an indication of clerical rank, geographical location or membership of an order.  In English and European universities, stoles were also adopted as academic dress, often added to an undergraduate’s gown for a degree conferring ceremony.  In fashion, the stole was a garment in the style of a scarf or muffler, worn always for visual effect and sometimes for warmth.  Fur stoles were especially popular until wearing them became socially proscribed and (trigger warning) there were fox stoles which included the beast's entire pelt, including the head and the much admired brush (tail).