
Wednesday, July 13, 2022

Whig

Whig (pronounced wig)

(1) To move along briskly (obsolete except in Scotland).

(2) A political party in Great Britain and the United Kingdom between 1679 and circa 1860, and in the US circa 1834-1855 (initial capital).

(3) Slang for a conservative member of the Liberal Party in Great Britain (used both with and without initial capital).

(4) Slang for certain factions in the US Republican Party (used both with and without initial capital).

(5) A (rarely used) historical term for a seventeenth-century Scottish Presbyterian, especially one in rebellion against the Crown (used both with and without initial capital).

(6) In Northern English dialectal use, acidulated whey, sometimes mixed with buttermilk and sweet herbs, used as a cooling beverage (obsolete).

(7) Buttermilk (now rare).

Circa 1657: The British political movement later called Whig began to emerge in the mid-1650s (“emerged” being a better expression than “was formed”), in part perhaps a disparaging use of the 1640s whigg (a country bumpkin), but the greater influence was the 1649 Whiggamaire (later Whiggamore) (the Covenanters, adherents of the Presbyterian cause in western Scotland who marched on Edinburgh in 1648 to oppose Charles I).  The sense, from circa 1635, of a country bumpkin may have been linked to "a horse drover", from the dialectal verb whig (to urge forward) + mare (in the sense of a horse).  In 1689 the name was first used in reference to members of the British political party opposed to the Tories.  The American Revolution era sense of "colonist who opposes Crown policies" is from 1768 and, as early as 1825, was applied to opponents of Andrew Jackson and taken as the name of a political party (1834), most of the factions of which were absorbed by the Republican Party between 1854-1856.  The adjective whiggish (used usually as a disparaging way of describing the tendencies of some towards the philosophies of the Whigs while claiming alignment with another political faction) is from the 1670s, the noun whiggery (principles or practices of the Whigs) noted during the next decade.  Whig, Whiggishness & Whiggery are nouns, Whiggish is an adjective; the noun plural is Whigs.

Portrait of Lord Shelburne (1776), oil on canvas, by Joshua Reynolds (1723–1792).

The Whigs were a faction of the Tory party which later became a party in its own right.  In its varied forms, the Whig grouping operated between 1679 and the late nineteenth century, its philosophy based on a defense of constitutional monarchism and an opposition to absolutism; the part the Whigs played in British politics was later absorbed by Tory factions and the Liberal Party although strains of their philosophy can sometimes still be seen in the Labour Party (depending on which faction is in the ascendant).  Structurally, the morphing of a Tory faction into a formalized party had far-reaching consequences which continue to this day; no prime-minister since Lord Shelburne (1737-1805; UK prime-minister 1782–1783) has attempted to govern without the support of a party.

In the US, a Whig Party was active in the mid-nineteenth century and four US presidents belonged to the party while in office.  Formed originally in opposition to the policies of Democratic President Andrew Jackson, the Whigs supported the supremacy of the Congress over the presidency and favored a program of modernization, banking, and economic protectionism to stimulate manufacturing.  Not directly related to the British Whigs, the party's founders chose the name to echo the Whigs of the eighteenth century who fought for independence, nodding also in the direction of the earlier Federalist Party, but the party would later dissolve because of internal tensions over the expansion of slavery to the territories.  Charmingly, many joined the short-lived Know Nothing Party; most eventually drifted back to the Democrats or Republicans and although the name is revived from time-to-time, it has enjoyed little electoral success.  Of late, some belonging to the more conservative factions in the Republican Party are labeled Whigs and this can be either in disparagement or self-referentially.

The term “Whig historian” was first recorded in 1924.  Despite the temptation, it really can’t be used in any neutral sense because of the legacy of the words of Sir Herbert Butterfield (1900–1979; Regius Professor of Modern History and Vice-Chancellor of the University of Cambridge) who, early in life, published the book for which he is still remembered: The Whig Interpretation of History (1931).  In that slim volume, he defined Whig history as "the tendency in many historians... to emphasise certain principles of progress in the past and to produce a story which is the ratification if not the glorification of the present."  Both "Whig historian" and the "Whig interpretation of history" are thus loaded terms.  Sir Herbert, it was clear, was thinking of the English tradition of historiography but his critique has been widely adopted, the idea of the retrospective creation of a line of progress toward the glorious present being a theme now explored not only by the odd Whig but also by the post-modernists.

Lindsay Lohan in blonde wig (asymmetric bob) on the Late Night with Jimmy Fallon show, New York, November 2012.

The word wig (a head of real or synthetic hair worn on the head for various reasons) is unrelated to Whig.  Dating from the 1660s, it was a clipping of periwig (a wig, especially the large, stylised constructions worn by both men & women), which was an alteration of the Middle French perruque (wig).  The word “wig” in 1730s England was adapted to create the informal “bigwig” (an important person), based on the fashion at the time for those in authority to wear large, elaborate wigs, the idea (presumably not without foundation) being that the more important the person, the bigger the wig.  The same linkage explains the military slang “brass hat” (a high-ranking officer), based on the brass embellishments or insignia applied to the hats of the upper ranks.  The term persists (even outside the military) even though the metal is now rare even on the hats of dress uniforms, but there's still often gold braid to justify the connection.

Wednesday, December 6, 2023

Bedchamber

Bedchamber (pronounced bed-cheym-ber)

A now archaic word for bedroom; the alternative form was bed-chamber.

1325–1375:  From the Middle English bedchaumbre, the construct being bed + chamber.  Bed was from the Middle English bed or bedde, from the pre-1000 Old English bedd (bed, couch, resting-place; garden-bed, plot), from the Proto-Germanic badją (plot, grave, resting-place, bed) and thought perhaps derived from the Proto-Indo-European bhed (to dig).  It was cognate with the Scots bed and bede, the North Frisian baad and beed, the West Frisian bêd, the Low German Bedd, the Dutch bed, the German Bett, the Danish bed, the Swedish bädd, the Icelandic beður and perhaps (depending on the efficacy of the Proto-Indo-European lineage), the Ancient Greek βοθυρος (bothuros) (pit), the Latin fossa (ditch), the Latvian bedre (hole), the Welsh bedd (grave) and the Breton bez (grave).  Any suggestion of links to Russian or other Slavic words is speculative.

Chamber dates from 1175-1225 and was from the Middle English chambre, borrowed from the Old French chambre, from the Latin camera, derived from the Ancient Greek καμάρα (kamára) (vaulted chamber); the meaning “room” (usually a private one) was drawn from French use.  As applied to anatomy, use emerged in the late fourteenth century; it was applied to ballistics from the 1620s and to machinery in 1769.  The meaning "legislative body" is from circa 1400 and the term chamber music was first noted in 1789, not as a descriptor of any musical form but to indicate that intended to be performed in private rooms rather than public halls.

The Bedchamber Crisis, 1839

A Lady of the Bedchamber, a position held typically by women of noble descent, is a kind of personal assistant to the Queen of England.  Personal appointments by the Queen, the positions have existed for centuries, the roles varying according to the relationships enjoyed.  Most European royal courts from time-to-time also adopted the practice.

The 1839 bedchamber crisis is emblematic of the shifting of political power from monarch to parliament.  Although the eighteenth-century administrative and economic reforms created the framework, it was the 1832 Reform Act which, in doing away with a monarch’s ability to stack parliaments with ample compliant souls, shattered a sovereign’s capacity to dictate election results and within two years the new weakness was apparent.  In 1834, William IV (1765–1837; King of the UK 1830-1837) dismissed the Whig Lord Melbourne (1779–1848; Prime Minister of the UK 1834 & 1835-1841) and appointed the Tory Sir Robert Peel (1788–1850; Prime Minister of the UK 1834–1835 & 1841–1846).  However, the King no longer enjoyed the electoral influence necessary to secure Peel a majority in the Commons and after being defeated in the house six times in as many weeks, the premier was obliged to inform the palace of his inability to govern, compelling the king to invite Melbourne to form a new administration, one which endured half a decade, out-living William IV.  The king's exercise in 1834 of the royal prerogative proved the last time the powers of the head of state would be invoked to sack a prime-minister until an Australian leader was dismissed in 1975 by the governor-general (and in a nice touch, the sacked PM had appointed the clearly ungrateful GG).

Queen Mary's State Bed Chamber, Hampton Court Palace (1819) by Richard Cattermole (1795–1858).

By 1839, Melbourne felt unable to continue and the new Queen Victoria (1819–1901; Queen of the UK 1837-1901), reluctantly, invited Sir Robert Peel to assume the premiership, a reticence some historians attribute as much to her fondness for the avuncular Melbourne as to her preference for his Whig (liberal) politics.  Peel, aware any administration he could form would be nominally in a minority, knew his position would be strengthened by a demonstration of royal support, so asked Victoria, as a gesture of good faith, to replace some of the Whig Ladies of the Bedchamber with a few of Tory breeding.  Most of the ladies were the wives or daughters of Whig politicians and Sir Robert’s request made sense in the world of 1839.

Victoria rejected his request and prevailed upon Melbourne to continue, which he did until a final defeat in 1841.  By then it was clear only Peel could command a majority in the Commons and he insisted on his bedchamber cull, forcing Victoria to acquiesce to the parliament imposing on her the most intimate of her advisors.  This is the moment in constitutional history at which the precedent was established of the parliament, and not the Crown, determining the formation and fate of governments.  Since then, the palace can warn, counsel and advise but not compel.

A lady in, if not of, the bedchamber.  A recumbent Lindsay Lohan in The Canyons (IFC Films, 2013).

Friday, November 18, 2022

Serendipity

Serendipity (pronounced ser-uhn-dip-i-tee)

(1) The faculty of making fortunate discoveries by accident; a combination of events which have come together by chance to make a surprisingly good or wonderful outcome.

(2) Luck, good fortune.

(3) As the serendipity berry (Dioscoreophyllum volkensii), a tropical dioecious rainforest vine in the family Menispermaceae, native to tropical Africa from Sierra Leone east to Eritrea, and south to Angola and Mozambique.

1754: The construct was Serendip + -ity.  The proper noun Serendip (Serendib the alternative form) was an archaic name for the island of Ceylon (known after 1972 as Sri Lanka, from सिंहल (sihala) + द्वीप (dvīpa) (island)), from the Arabic سَرَنْدِيب‎ (sarandīb), from the Persian سرندیپ (sarandip), from the Prakrit sīhaladīva & Sanskrit सिंहलद्वीप (sihaladvīpa (literally “island of the Sinhala people”)).  The –ity suffix was from the French -ité, from the Middle French -ité, from the Old French –ete & -eteit (-ity), from the Latin -itātem, from -itās, from the primitive Indo-European suffix –it.  It was cognate with the Gothic –iþa (-th), the Old High German -ida (-th) and the Old English -þo & -þu (-th).  It was used to form nouns from adjectives (especially abstract nouns), thus most often associated with nouns referring to the state, property, or quality of conforming to the adjective's description.

Serendipity berries, one of the “miracle berries”.

The serendipity berry is noted as a source of monellin, an intensely sweet protein which, if chewed, alters the perception of taste to make tart, acidic or bitter food taste sweet.  Pills containing synthesized monellin are sold as “miracle fruit tablets” for this purpose (a lemon eaten after sucking on one of these tablets being quite a revelation) and as “miracle fruit”, serendipity and related berries are widely used in African folk medicine although there’s scant evidence for their efficacy as a treatment for the many diseases they’re said to cure.  Words with a similar meaning include fluke, happenstance, blessing, break and luck but serendipity carries the particular sense of something very useful and wholly unexpected being the result, while the phrases “Murphy's law” & “perfect storm” are close to being antonyms.  In science and industry, serendipity has played a part in the discovery or development of vaccination, insulin to treat diabetes, penicillin, quinine, Viagra, x-rays, radioactivity, pulsars, cosmic microwave background radiation, Teflon, vulcanized rubber, microwave ovens, Velcro and 3M's (originally Minnesota Mining & Manufacturing) post-it notes (though it seems its part in the invention of stainless steel may be a myth).  Serendipity & serendipitist are nouns, serendipitously is an adverb and serendipitous is an adjective; the noun plural is serendipities.  Serendipiter & serendipper are non-standard noun forms adopted in popular culture.

Serendipity was in 1754 coined by the English Whig politician & writer Horace Walpole (1717–1797), derived from the fairy tale The Three Princes of Serendip, the three, Walpole noted, “always making discoveries, by accidents and sagacity, of things which they were not in quest of”.  The Three Princes of Serendip was an English version of Peregrinaggio di tre giovani figliuoli del re di Serendippo, printed in 1557 by Venetian publisher Michele Tramezzino (1526-1571), the text said to have been the work of a Cristoforo Armeno who had translated the Persian fairy tale into Italian, adapting Book One of Amir Khusrau's Hasht-Bihisht (1302).  The story was translated into French before the first English edition was published and Voltaire (François-Marie Arouet, 1694–1778) used the tale in his 1747 novella Zadig ou la Destinée (Zadig or The Book of Fate), an intriguing fusion of fiction and philosophy which influenced systematic science, the evolution of creative writing about crime and even horror stories.

Portrait of Horace Walpole (1728), aged ten, by William Hogarth (1697-1784), in gilt frame, Strawberry Hill House, Twickenham.  He was the youngest son of Sir Robert Walpole (1676-1745), the Whig politician who between 1721-1742 served as first lord of the Treasury, chancellor of the exchequer and leader of the House of Commons; by virtue of this, he came to be recognized as Britain's first prime-minister.

Walpole used serendipity first in a letter (dated 28 January 1754) he wrote to Florence-based British diplomat Sir Horace Mann (1706–1786) but which seems not to have been published until 1833, the new word remaining almost unnoticed until the 1870s when there was a brief spike in use; it was in the early-to-mid twentieth century that it became popular and until then it was rare to find a dictionary entry, although the adjective serendipitous appeared as early as 1914.  Walpole was compelled to coin serendipity to illustrate his delighted surprise at finding a detail in a painting of Bianca Cappello (1548–1587 and latterly of the clan Medici) by Giorgio Vasari (1511-1574).  The charm of the word is such that it’s been borrowed, unaltered, by many languages and it frequently appears in "favorite word" or "words of the year" lists.

Portrait of Bianca Cappello, Second Wife of Francesco I de' Medici (circa 1580), fresco by Alessandro Allori (1535-1607), Galleria degli Uffizi, Firenze.

The twenty-two-year-old Walpole fell under the charm of the long dead Bianca Cappello while staying in Florence during his grand tour of the continent.  Besotted by the portrait of the peach-skinned Venetian beauty which hung in the Casa Vitelli, it's not clear what immediately drew his eye but a diary note by the French writer Montaigne (1533–1592), who in 1580 had the pleasure of meeting her, might provide a hint: “...belle à l’opinion italienne, un visage agréable et imperieux, le corsage gros, et de tetins à leur fouhait” ("...according to the Italians she is beautiful.  She has an agreeable and imposing face, and large breasts, the way they like them here…").  He confided his passion to Mann who around 1753 purchased the work, sending it to his friend who had by then returned to England, his cover letter including the revelatory “It is an old acquaintance of yours, and once much admired by you... it is the portrait you so often went to see in Casa Vitelli of Bianca Capello… to which, as your proxy, I have made love to for a long while… It has hung in my bedchamber and reproached me indeed of infidelity, in depriving you of what I originally designed for you”.  These days, such things are called objectum sexuality or fictosexualism but in the eighteenth century it was just something the English aristocracy did.

Lindsay Lohan in polka-dots, enjoying a frozen hot chocolate, Serendipity 3 restaurant, New York, 7 January 2019.

Whatever other pleasures the oil on canvas brought him, Walpole must also have devoted some attention to detail for he would soon write back to Mann: “I must tell you a critical discovery of mine a propos in a book of Venetian arms.  There are two coats of Capello… on one of them is added a fleu de luce on a blue ball, which I am persuaded was given to the family by the Grand Duke” (the Medici who was Bianca's second husband (and who may have murdered the first)).  Much pleased at having stumbled upon this link between the two families in a book of Venetian heraldry he happened at the time to be reading in the search for suitable emblems with which to adorn the painting's frame, he told his dear friend: “This discovery indeed is almost of that kind which I call SERENDIPITY”.

Wednesday, June 15, 2022

Blueprint

Blueprint (pronounced bloo-print)

(1) A process of photographic printing, used chiefly in copying architectural and mechanical drawings, which produces a white line on a blue background; also called a cyanotype.

(2) A physical print made by this process.

(3) A slang term for a digital rendition of the process.

(4) A slang term for such a drawing, whether blue or not.

(5) By analogy, a detailed outline or plan of action (in text or image).

(6) To make a blueprint.

(7) A technique for optimizing the performance of internal combustion engines by machining (or matching) components to their exact specifications.

1887: The construct was blue + print (blue print and blue-print (1882) were the rarely used alternative spellings).  The figurative sense of "detailed plan" dates from 1926 and use as a verb is from 1939.

Blue dates from the sixteenth century and was from the Middle English blewe, from the Anglo-Norman blew (blue), from the Frankish blāu (blue) (possibly via the Medieval Latin blāvus & blāvius (blue)), from the Proto-Germanic blēwaz (blue, dark blue), from the primitive Indo-European bhlēw (yellow, blond, grey).  It was cognate with the dialectal English blow (blue), the Scots blue, blew (blue), the North Frisian bla & blö (blue), the Saterland Frisian blau (blue), the Dutch blauw (blue), the German blau (blue), the Danish, Norwegian & Swedish blå (blue), the Icelandic blár (blue).  It was cognate also with the obsolete Middle English blee (color) related to the Welsh lliw (color), the Latin flāvus (yellow) and the Middle Irish blá (yellow). A doublet of blae.  The present spelling in English has existed since the sixteenth century and was common by circa 1700.  Many colors have in English been productive in many senses and blue has contributed to many phrases in fields as diverse as mental health (depression, sadness), semiotics (coolness in temperature), popular music (the blues), social conservatism (blue stocking; blue rinse), politics (conservative (Tory) & Whig identifiers and (unrelated) the US Democratic Party), labor-market segmentation (blue-collar), social class (blue-blood), stock market status (blue-chip) and, inexplicably, as an intensifier (blue murder).

Print dates from circa 1300 and was from the Middle English printen, prenten, preenten & prente (impression, mark made by impression upon a surface), an apheretic form of emprinten & enprinten (to impress; imprint).  It was related to the Dutch prenten (to imprint), the Middle Low German prenten (to print; write), the Danish prente (to print), the Swedish prenta (to write German letters).  The late Old French preinte (impression) was a noun use of the feminine past participle of preindre (to press, crush), altered from prembre, from the Latin premere (to press, hold fast, cover, crowd, compress), from the primitive Indo-European root per- (to strike).  The Old French word was also the source of the Middle Dutch prente (the Dutch prent) and was borrowed by other Germanic languages.

The sense of "a printed publication" (applied later particularly to newspapers) was from the 1560s.  The meaning "printed lettering" is from the 1620s and print-hand (print-like handwriting) from the 1650s.  The sense of "picture or design from a block or plate" dates from the 1660s while the meaning "piece of printed cloth or fabric" appeared first in 1756.  The photographic sense emerged apparently only by 1853, some three decades after the first photographs, the use evolving as printed photographs became mass-market consumer products.  Print journalism seems to have been described as such only from 1962, a form of differentiation from the work of those employed by television broadcasters.

Blueprinting internal combustion engines is the practice of disassembling the unit and machining the critical components (pistons, conrods etc) to the point where they exactly meet the stated specifications (dimensions & weight).  Essentially, the process is one of exactitude: precision tools are used to take components produced with the techniques of mass production (which inherently involve wider tolerances) and re-work them to tighter tolerances, meeting exact design specifications.  It’s most associated with high-performance racing cars, especially those which compete in “standard-production” classes which don’t permit modifications to most components.  In some cases, especially with factory-supported operations, the components might be specially selected prior to assembly.

Blueprint of the USS Missouri (BB-63), an Iowa-class battleship launched in 1944.  Missouri was the last battleship commissioned by the US Navy.

The first blueprint was developed in 1842 by English mathematician, astronomer, chemist & experimental photographer Sir John Herschel (1792-1871).  What he then termed a “cyanotype process” eliminated the need to copy original drawings by means of hand-tracing, a cumbersome, time consuming (and therefore expensive) process.  At what was then an astonishingly low cost, it permitted the rapid and accurate production of an unlimited number of copies.  The cyanotype process used a drawing on semi-transparent paper, weighted down on top of a sheet of paper coated with a mix of ammonium iron citrate and potassium ferricyanide (applied as an aqueous solution and later dried).  When the papers were exposed to light, the chemical reaction produced an insoluble blue compound called blue ferric ferrocyanide (which became famous as Prussian Blue), except where the blueprinting paper was covered and the light was blocked by the lines of the original drawing.  After the paper was washed and dried to preserve those lines, the result was a negative image of white (or whatever color the blueprint paper originally was) against a dark blue background.  White was by far the most used paper and the most common cyanotypes were thus blue with white lines.  At least by 1882 they were being described as “blue prints” but by 1887, they were almost universally called blueprints and in engineering and architecture had become ubiquitous, Herschel’s photochemical process producing copies at a tenth the cost of hand-tracing.

Factory blueprint (quotation drawing produced on diazo machine) of 1955 Mercedes-Benz 300 SLR (W196S Uhlenhaut Coupé).  Two were built, one of which sold in June 2022 for a world record US$142 million at a private auction held at the Mercedes-Benz Museum in Stuttgart.

Refinements and economies of scale meant that during the early twentieth century the quality of blueprints improved and costs further fell but by the 1940s, they began to be supplanted by diazo prints (known also as “whiteprints” or “bluelines”).  Diazo prints were rendered with blue lines on a white background, making them easier to read and they could be produced more quickly on machinery which was simpler and much less expensive than the intricate photochemical devices blueprints demanded.  Accordingly, reprographic companies soon updated their plant, attracted too by the lower running costs, the diazo machinery not requiring the extensive and frequent maintenance demanded by the physically big and intricate photochemical copiers.

1929 Mercedes-Benz SSKL printed in blueprint style.

One tradition of the old ways did however endure.  The diazo machines caught on but “diazo print”, “whiteprint” & “blueline” never did; the drawings, regardless of the process used, the color of the paper or the lines (and many used black rather than blue), continued to be known as “blueprints”.  That linguistic tribute persisted even after diazo printing was phased-out and replaced with the xerographic print process, the standard copy machine technology using toner on bond paper.  Used for some time in commerce, large-size xerography machines became available in the mid-1970s and although originally very expensive, costs rapidly fell and the older printing methods were soon rendered obsolete.  As computer-aided design (CAD) software entered the mainstream during the 1990s, designs increasingly were printed directly from a computer to a printer or plotter and, despite the paper used rarely being blue, the output continued to be known as a blueprint.

Blueprint of the Chrysler Building, New York City, 1930.

Even now, although often viewed only as multi-colored images on screens (which might be on tablets or phones), such electronic drawings are still usually called blueprints.  Nor have blueprints vanished.  There are many things (buildings, bridges, roads, power-plants, railroads, sewers et al) built before the 1990s which have an expected life measured in decades or even centuries and few of these were designed using digital records.  The original blueprints therefore remain important to those engaged in maintenance or repair and can be critical also in litigation.  Old blueprints can be scanned and converted to digital formats but in many cases, the originals are fragile or physically deteriorated and finer details are sometimes legible only if viewed on the true blueprint.  Centuries from now, magnifying glasses in hand, engineers may still be examining twentieth century blueprints.

Lindsay Lohan, blueprinted.    

Thursday, August 3, 2023

Mason

Mason (pronounced mey-suhn)

(1) A person whose trade is building with units of various natural or artificial mineral products, as stones, bricks, cinder blocks, or tiles, usually with the use of mortar or cement as a bonding agent.

(2) A person who dresses stones or bricks.

(3) A clipping of Freemason (which should always use an initial capital although mason and variations in this context (masonry, masonism etc) frequently appear in lower case); a member of the fraternity of Freemasons.

(4) To construct of or strengthen with masonry.

1175–1225: From the Middle English masoun & machun (mason), from the Anglo-Norman machun & masson, from the Old French masson & maçon (machun in the Old North French), from the Late Latin maciō (carpenter, bricklayer), from the Frankish makjon & makjō (maker, builder; to make (which may have some link with the Old English macian (to make))), from makōn (to work, build, make), from the primitive Indo-European mag- (to knead, mix, make), conflated with the Proto-West Germanic mattijō (cutter), from the primitive Indo-European metn- or met- (to cut).  Etymologists note there may have been some influence from another Germanic source such as the Old High German steinmezzo (stone mason (the Modern German Steinmetz has a second element related to mahhon (to make))), from the primitive Indo-European root mag-.  There’s also the theory of some link with the seventh century Medieval Latin machio & matio, thought derived from machina, source of the modern English machine, and the medieval word might be from the root of the Latin maceria (wall).  From the early twelfth century it was used as a surname, one of a number based on occupations (Smith, Wright, Carter etc) and the now-familiar use to denote “a member of the fraternity of freemasons” was first recorded in Anglo-French in the early fifteenth century.  Mason is a noun & verb, masonry & masonism are nouns, masoning is a verb, masoned is an adjective & verb and masonic is an adjective; the noun plural is masons.

The noun masonry was from the mid-fourteenth century masonrie (stonework, a construction of dressed or fitted stones) and within decades it was used to describe the “art or occupation of a mason”.  It was from the fourteenth century Old French maçonerie, from maçon.  The adjective Masonic was adopted in 1767 in the sense of “of or pertaining to the fraternity of freemasons” and although early in the nineteenth century it was used to mean “of or pertaining to stone masons”, that remained rare, presumably because of the potential for confusion; not all stonemasons would have wished to be thought part of the order.  The term stonemason seems first to have been used in 1733.  An earlier name for the occupation was the fifteenth century hard-hewer while stone-cutter was from the 1530s (in the Old English there was stanwyrhta (stone-wright)).  The US television cartoon series The Simpsons parodied the Freemasons in a well-received episode called Homer the Great (1995) in which the plotline revolved around a secret society called the “Stonecutters”.  Dating from 1926, Masonite was a proprietary name of a type of fiberboard made originally by the Mason Fibre Company of Mississippi, named after William H. Mason (1877-1940), a protégé of Thomas Edison (1847-1931), who patented the production process used to make it.  In 1840, the word enjoyed a brief currency in the field of mineralogy to describe a type of chloritoid (a mixed iron, magnesium and manganese silicate mineral of metamorphic origin), the name honoring collector Owen Mason from Rhode Island who first brought the mineral to the attention of geologists.

The Mason jar was patented in 1858 by New York-based tinsmith John Landis Mason (1832–1902); it was a molded glass jar with an airtight screw lid which proved ideal for the storage of preserves (usually fruits or vegetables), a popular practice among domestic cooks who, in season, would purchase produce in bulk and preserve it using high temperature water mixed with salt, sugar or vinegar.  The jars were in mass-production by the mid-1860s and later the jars (optimized in size to suit the quantity of preserved food a family would consume in one meal) proved equally suited to the storage and distribution of moonshine (unlawfully distilled spirits).  Much moonshine was distributed in large containers (the wholesale level) but the small mason jars were a popular form because it meant it could be sold in smaller quantities (the retail level) to those with the same thirst but less cash.

A mason jar (left), a mason jar with pouring spout (centre) and a mason jar with handle (right).

For neophytes, the classic mason jar can be difficult to handle either to drink from or to pour the contents into a glass.  Modern moonshine distillers have however stuck to the age-old jar because it’s part of the tradition and customers do seem to like purchasing their (now lawful) spirits in one.  South of the Mason-Dixon Line, “passing the jar” is part of the ritual of the shared moonshine experience and, being easily re-sealable, it’s a practical form of packaging.  To make things easier still, lids with pourers are available (which true barbarians put straight to their lips, regarding a glass as effete) and there are also mason jars with handles.

The Mason-Dixon Line and the Missouri Compromise Line.  

The Mason-Dixon Line was named after English astronomers Charles Mason (1728–1786) and Jeremiah Dixon (1733–1779) who between 1763-1767 surveyed the disputed boundary between the colonial holdings of the Penns (Pennsylvania) and the Calverts (Maryland), one of the many borders (New South Wales & Victoria in Australia, Kashmir in the sub-continent of South Asia et al) in the British Empire which were ambiguously described (or not drawn at all) and which would be the source of squabbles, sometimes for a century or more.  The line would probably have been little noted by history had it not become the boundary between "free" and "slave" states after 1804, when New Jersey (the last slaveholding state north of the line) passed an act of abolition.  In popular use “south of the Mason-Dixon Line” thus became the term used to refer to “the South” where until the US Civil War (1861-1865) slave-holding prevailed although, in a narrow technical sense, the line created by the Missouri Compromise (1820) more accurately reflected the political and social divisions.

A mason’s mark etched into a stone (left) and an image created from one of the registers of mason’s marks (right).

A mason's mark is literally a mark etched into a stone by a mason and historically they existed in three forms: (1) an identifying notch which could be used by those assembling a structure as a kind of pattern so they would know where one stone was to be placed in relation to another, (2) a mark to identify the quarry from which the stone came (which might also indicate the type of rock or the quality but this was rare within the trade where there tended to be experts at every point in the product cycle) and (3) the unique identifying mark of the stonemason responsible for the finishing (rather in the manner of the way the engineers assembling engines in companies like Aston Martin or AMG stamp their names into the block).  With the masons, these were known also as bankers’ marks because, when the payment was by means of piece-work (ie payment by physical measure of the stone provided rather than the time spent), the tally-master would physically measure the stones and pay according to the cubic volume.  Every mason, upon their admission to the guild, would enter into a register their unique mark.

Reinhard Heydrich (second from left, back to camera) conducting a tour of the SS Freemasonry Museum, Berlin, 1935.

Freemasonry has always attracted suspicion and at times the opposition to it has been formalized.  As recently as the papacy of Pius XII (1876-1958; pope 1939-1958), membership of Freemasonry was proscribed for Roman Catholics, Pius disapproving of the sinister, secretive Masons about as much as he did of communists and homosexuals.  In that he was actually in agreement with the Nazis.  By 1935, the Nazis considered the “Freemason problem” solved and the SS even created a “Freemason Museum” in Berlin’s Prinz-Albrecht-Palais (conveniently close to Gestapo headquarters) to exhibit the relics of the “vanished cult”.  SS-Obergruppenführer (Lieutenant-General) Reinhard Heydrich (1904–1942; head of the Reich Security Main Office 1939-1942) originally included the Freemasons on his list of archenemies of National Socialism which, like Bolshevism, he considered an internationalist, anti-fascist Zweckorganisation (expedient organization) of Jewry.  According to Heydrich, Masonic lodges were under Jewish control and while appearing to organize social life “…in a seemingly harmless way, were actually instrumentalizing people for the purposes of Jewry”.  That wasn’t the position of all the Nazis however.  Hermann Göring (1893–1946; leading Nazi 1922-1945 and Reichsmarschall 1940-1945) revealed during the Nuremberg Trial (1945-1946) that on the day he joined the party, he was actually on his way to join the Freemasons and was distracted from this only by a “toothy blonde” while during the same proceedings, Hjalmar Schacht (1877–1970; President of the German Central Bank (Reichsbank) 1933–1939 and Nazi Minister of Economics 1934–1937) said that even while serving the Third Reich he never deviated from his belief in the principles of “international Freemasonry”.  It’s certainly a trans-national operation and the Secret Society of Les Clefs d’Or has never denied being a branch of the Freemasons.

In an indication they'll stop at nothing, the Freemasons have even stalked Lindsay Lohan.  In 2011, Ms Lohan was granted a two-year restraining order against alleged stalker David Cocordan, the order issued some days after she filed a complaint with police who, after investigation by their Threat Management Department, advised the court Mr Cocordan (who at the time had been using at least five aliases) “suffered from schizophrenia”, was “off his medication” and had a “significant psychiatric history of acting on his delusional beliefs”.  That was worrying enough but Ms Lohan may have revealed her real concerns in an earlier post on Twitter in which she included a picture of David Cocordan, claiming he was "the freemason stalker that has been threatening to kill me- while he is TRESPASSING!"  Being stalked by a schizophrenic is bad enough but the thought of being hunted by a schizophrenic Freemason is truly frightening.  Apparently an unexplored matter in the annals of psychiatry, it seems the question of just how schizophrenia might particularly manifest in Freemasons awaits research, so there may be a PhD there for someone.

The problem Ms Lohan identified has long been known.  In the US, between 1828-1838 there was an Anti-Mason political party which is remembered now as one of the first of the “third parties” which over the decades have often briefly flourished before either fading away or being absorbed into one side or the other of what has for centuries tended towards two-party stability.  Its initial strength was that it was obsessively a single-issue party, which enabled it rapidly to gather support, but that proved ultimately its weakness because it never adequately developed the broader policy platform which would have attracted a wider membership.  The party was formed in reaction to the disappearance (and presumed murder) of a former Mason who had turned dissident and become a most acerbic critic, and the suspicion arose that the Masonic establishment had arranged his killing to silence his voice.  The party attracted much support, including from many church leaders who had long been suspicious of Freemasonry and were not convinced the organization was anything but anti-Christian.  Because the Masons were secretive and conducted their meetings in private, their opponents tended to invent stories about the rituals and ceremonies (stuff with goats often mentioned) and the myths grew.  The myths were clearly enough to secure some electoral success and the Anti-Masons even ran William Wirt (1772-1834 and still the nation’s longest-serving attorney-general (1817-1829)) as their candidate in the 1832 presidential election, where he won 7.8% of the popular vote and carried Vermont, a reasonable achievement for a third-party candidate.  Ultimately though, that proved the electoral high-water mark and most of its members thereafter were absorbed by the embryonic Whig Party.

Monday, February 6, 2023

Ultra

Ultra (pronounced uhl-truh)

(1) The highest point; acme; the most intense degree of a quality or state; the extreme or perfect point or state.

(2) Going beyond what is usual or ordinary; excessive; extreme.

(3) An extremist, as in politics, religion, sporting team supporters, fashion etc, used semi-formally on many occasions in history.

(4) In the history of military espionage, the British code name for intelligence gathered by decrypting German communications enciphered on the Enigma machine during World War II (initial capital letter).

1690–1700: A New Latin adverb and preposition ultrā (uls (beyond) + -ter (the suffix used to form adverbs from adjectives) + -ā (suffixed to the roots of verbs)).  The prefix ultra- was a word-forming element denoting "beyond" (eg ultrasonic) or "extremely" (eg ultralight (as used in aviation)) and was in common use from the early nineteenth century, the popularity of use apparently triggered by the frequency with which it was used of political groupings in France.  As a stand-alone word (in the sense now used of the most rabid followers of Italian football teams) meaning "extremist", it dates from 1817 as a shortening of ultra-royaliste (extreme royalist (which at the time was a thing)).  The independent use of ultra (or the shortening of words prefixed with it) may also have been influenced by nē plūs ultrā ((may you) not (go) further beyond (this point)), said to be a warning to sailors inscribed on the Pillars of Hercules at Gibraltar.  This legend comes not from Greek mythology but dates from the revival of interest in antiquity which began during the Renaissance, influenced by Plato having said the lost city of Atlantis “lay beyond the Pillars of Hercules”; the most useful translation of nē plūs ultrā is probably something like "go no further, nothing lies beyond here".

As a prefix, ultra- has been widely appended.  The construct of ultra vires (literally "beyond powers") was ultra (beyond) + vires (strength, force, vigor, power) and it is quoted usually by courts and tribunals to describe their jurisdictional limits, something ultra vires being understood as "beyond the legal or constitutional power of a court".  In political science, the term ultranationalism was first used in 1845, a trend which has ebbed & flowed but certainly hasn't died.  The speed of light being what it is, ultralight refers not to optics but to very small (often home-built or constructed from a kit) aircraft, the term first used in 1979 although it was (briefly) used in experimental physics in the late 1950s.  Ultrasound in its current understanding as a detection & diagnostic technique in medicine dates from 1958 but it had been used in 1911 to mean "sound beyond the range of human hearing", this sense later replaced by ultrasonic (having frequency beyond the audible range) in 1923, used first of radio transmission; the applied technology ultrasonography debuted in 1960.  Ultraviolet (beyond the violet end of the visible spectrum) was first identified in 1840 and in 1870 ultra-red was coined to describe what is now known as infra-red.  First identified in the 1590s, ultramarine (blue pigment made from lapis lazuli) was from the Medieval Latin ultramarinus ("beyond the sea"), the construct being ultra + marinus (of the sea) from mare (sea, the sea, seawater), from the primitive Indo-European root mori- (body of water), the name said to be derived from the mineral arriving by ship from mines in Asia.  Ultramontane has a varied history and was first used in the 1590s.  It was from the Middle French ultramontain (beyond the mountains (especially the Alps)), from the early fourteenth century Old French, the construct being ultra + the stem of mons (mountain), from the primitive Indo-European root men- (to project) and was used particularly of papal authority, though the precise meaning bounced around depending on context.  The acronym UHF (ultra-high frequency) was coined in 1937 although the technology using radio frequencies in the range of 300-3000 megahertz (MHz) became available in 1932.  Other forms (ultramodern, ultra-blonde et al) are coined as required and survive or fall from use in the usual way English evolves.

The Ultras

The prefix ultra- occurred originally in loanwords from Latin, meaning essentially “on the far side of, beyond.”  In relation to the base to which it is prefixed, ultra- has the senses “located beyond, on the far side of” (eg ultraviolet), “carrying to the furthest degree possible, on the fringe of” (eg ultramodern) or “extremely” (eg ultralight); nouns to which it is added denote, in general, objects, properties, phenomena etc that surpass customary norms, or instruments designed to produce or deal with such things (eg ultrasound).  The more recent use as a noun (usually in the collective as “the ultras”) applied to members of an extreme faction dates from early nineteenth-century English parliamentary politics and is associated also with the most extreme supporters of certain Italian football (soccer) teams.

Although never formally a faction in the modern sense of the word, the ultra Tories (the ultras) operated from 1827 (some political scientists insist the aggregation coalesced only in 1828) as a loose and unstructured grouping of politicians, intellectuals, and journalists who constituted, in embryonic form, the “extreme right wing” of British and Irish politics.  Essentially reactionary conservatives unhappy with changes associated with the Enlightenment, the industrial revolution and urbanization, they regarded the 1689 protestant constitution as the unchangeable basis of British social, economic and political life and treated all their opponents with a rare obsessional hatred.  In another echo of recent right-wing politics, the ultras showed some scepticism of economic liberalism and supported measures designed to ameliorate the hardships suffered by the poor during the early industrial age.  Like a number of modern, nominally right-wing populist movements, the ultras were suspicious of “free trade” and the destructive consequences these policies had on industries vulnerable to competition from foreign producers.

Portrait of the Duke of Wellington (1769-1852) by Francisco Goya (1746–1828), circa 1812–14, oil on mahogany panel, National Gallery, London.

The previously inchoate ultras coalesced into a recognizable force in the period of instability which followed the retirement in 1827 of a long-serving prime-minister.  Their first flexing of political muscle, which proved unsuccessful, was an attempt to deny the premiership to a supporter of Catholic emancipation but the ultras emerged as a powerful influence in Tory politics although many claimed to belong to the Whig tradition.  Their annus mirabilis (a remarkable or auspicious year) came in 1830 when the ultras brought down the Duke of Wellington’s government (1828-1830) but the need for reform was unstoppable and while the label was for decades to be applied to the far-right of the Conservative Party, the later iterations never matched the political ferocity of the early adherents.

Ultra Blonde product.

Although there are packaged products labeled as such and the phrase "ultra-blonde" is far from uncommon, there's no precise definition of such a thing and while some blondes are blonder than others, on the spectrum there is a point at which going further means the color ceases to be blonde and becomes some shade which tends to grey, white or the dreaded yellow.  For that reason, some hairdressers prefer to describe platinum as a stand-alone color rather than the usual "platinum blonde", noting that the end result will anyway usually differ to some degree, depending on the shade and physiology of the hair to be treated.  They also caution the idea of ultra blonde isn't suitable for everyone and base their recommendations on whether a client's skin is warm or cool toned, the practical test being to assess the veins visible in the wrist; if they're mostly blue and purple (the source of the term "blue-blooded", which was based on the notion of those with obviously blue veins being rich enough not to have to work in the fields), then the undertone is cool, if mostly green then it's warm and if a mix of both, the undertone is neutral.

Lindsay Lohan had an ultra-blonde phase but for her Playboy photo shoot in 2012, wore a blonde wig; many would call this "ultra blonde" but to a professional hairdresser it's a "pale".

The undertone interacts with skin tone: paler, pinky skin tones suit cool, delicate blondes like ash, beige or baby-blonde whereas darker or more golden-toned skins suit honey hues described often as butter, golden or caramel.  For perfectionists, there's also eye color to consider and here the trick is to achieve the desired degree of contrast; soft, multi-tonal shades better complement lighter colors whereas deeper, richer blondes flatter the darker eye.  Those especially obsessive can use non-optically corrective contact lenses, eye color often being easier to change than hair.  So, while hairdressers think of ultra blonde as a shifting concept rather than a specific color, most agree (whatever the sometimes extraordinary proliferation of imaginatively named products on manufacturers' color charts) there are essentially four stages of blondness and they’re usually described as something like: medium, light, pale & platinum.  In each of those categories, it's possible to be an "ultra" though hairdressers will readily admit their technical distinctions resonate little with customers whose expectation of "ultra" is simply the limit of what's physically possible.