
Friday, March 26, 2021

Goodly

Goodly (pronounced good-lee)

(1) Of a substantial size or quantity.

(2) Of a good or fine appearance (rare).

(3) Of fine quality (obsolete).

(4) Highly virtuous (obsolete, unless quantity is thought virtuous, which does seem possible).

Pre 1000: From the Middle English, from the Old English gōdlīc, from the Proto-Germanic gōdalīkaz (equivalent to the Old English gōd (good)).  The construct was good + -ly.  Good was from the Middle English good & god, from the Old English gōd (that which is good, a good thing; goodness; advantage, benefit; gift; virtue; property), from the Proto-West Germanic gōd, from the Proto-Germanic gōdaz, from the primitive Indo-European ghed (to unite, be associated, suit).  It was cognate with the Dutch goed, the German gut, the Old Norse gōthr, the Gothic goths & the Russian го́дный (gódnyj) (fit, well-suited, good for) & год (god) (year).  The -ly suffix was from the Middle English -ly, -li, -lik & -lich, from the Old English -līċ, from the Proto-West Germanic -līk, from the Proto-Germanic -līkaz (having the body or form of), from līką (body) (whence the Modern German -lich); in form, it was probably influenced by the Old Norse -ligr (-ly) and was cognate with the Dutch -lijk, the German -lich and the Swedish -lig.  It was used (1) to form adjectives from nouns, the adjectives having the sense of "behaving like, having a likeness or having a nature typical of what is denoted by the noun" and (2) to form adjectives from nouns specifying time intervals, the adjectives having the sense of "occurring at such intervals".  Goodly, goodlier & goodliest are adjectives and goodliness is a noun.

Lindsay Lohan with a largifical show of skin and a goodly sprinkle of freckles in Lynn Kiracofe tiara & Frye boots with Calvin Klein Original’s blue cotton jeans over white polyester and spandex XT Trunk briefs, W Magazine photo shoot, April 2005.

In English, goodly was long used to mean (1) someone or some act thought commendable or virtuous, (2) an item of high quality, (3) something or someone attractive and (4) ample or numerous in quantity.  It was thus variously a synonym of many words including ample, big, biggish, burly, capacious, comprehensive, decent, extensive, good, great, gross, hefty, husky, jumbo, largish, major, massive, ponderous, respectable, sensible, beautiful & virtuous.  However, by the mid-twentieth century most senses of goodly had gone extinct and the word was only ever used of something quantitative.  Even then it was (and remains) rare but it exists in a niche populated by poets and literary novelists so its audience is limited.  As an example of the inconsistency in English’s evolution, the sense of virtue did survive in the noun goodliness.  An alternative to goodly when speaking of quantities was largifical and unlike goodly, it did not survive although large obviously has flourished.  The adjective largifical was from the Latin largificus, from largus (bountiful, liberal), the construct being an adaptation (via the combining form -ficus) of larg(us) + faciō (do, make), from the Proto-Italic fakjō, from the primitive Indo-European dheh- (to put, place, set), the cognates of which included the Ancient Greek τίθημι (títhēmi), the Sanskrit दधाति (dádhāti), the Old English dōn (from which English ultimately gained “do”) and the Lithuanian dėti (to put).  So, beyond the confines of the literary novel, the preferred alternatives to goodly and largifical include sufficient, adequate, plenty, abundant, enough, satisfactory, plentiful, copious, profuse, rich, lavish, liberal, generous, bountiful, large, huge, great, bumper, flush, prolific, overflowing & ample, the choice dictated by the nuances of need.

Thursday, November 20, 2025

Ultracrepidarian

Ultracrepidarian (pronounced uhl-truh-krep-i-dair-ee-uhn)

Of or pertaining to a person who criticizes, judges, or gives advice outside their area of expertise.

1819: An English adaptation of the historic words sūtor, ne ultra crepidam, uttered by the Greek artist Apelles and reported by Pliny the Elder.  It translates literally as “let the shoemaker venture no further” and is sometimes cited as ne supra crepidam sūtor judicare, something like “a cobbler should stick to shoes”.  From the Latin, ultra is beyond, sūtor is cobbler and crepidam is the accusative singular of crepida (from the Ancient Greek κρηπίς (krēpís)) and means sandal or sole of a shoe.  Ultracrepidarian is a noun & adjective and ultracrepidarianism is a noun; the noun plural is ultracrepidarians.  For humorous purposes, forms such as ultracrepidarist, ultracrepidarianish, ultracrepidarianize & ultracrepidarianesque have been coined; all are non-standard.

Ultracrepidarianism describes the tendency among some to offer opinions and advice on matters beyond their competence.  The word entered English in 1819 when used by English literary critic and self-described “good hater”, William Hazlitt (1778–1830), in an open letter to William Gifford (1756–1826), editor of the Quarterly Review, a letter described by one critic as “one of the finest works of invective in the language” although another suggested it was “one of his more moderate castigations”, a hint that though now neglected, for students of especially waspish invective, he can be entertaining; the odd quote from him would certainly lend a varnish of erudition to trolling.  Ultracrepidarian comes from a classical allusion, Pliny the Elder (circa 24-79) recording the habit of the famous Greek painter Apelles (a fourth century BC contemporary of Alexander the Great (Alexander III of Macedon, 356-323 BC)) of displaying his work in public view, then concealing himself close by to listen to the comments of those passing.  One day, a cobbler paused and picked fault with Apelles’ rendering of sandals and the artist immediately took his brushes and palette and touched-up the errant straps.  Encouraged, the amateur critic then let his eye wander above the ankle and suggested how the leg might be improved but this Apelles rejected, telling him to speak only of shoes and otherwise maintain a deferential silence.  Pliny hinted the artist's words of dismissal may not have been polite.

So critics should comment only on that about which they know.  The phrase in English is usually “cobbler, stick to your last” (a last being a shoemaker’s pattern, ultimately from a Germanic root meaning “to follow a track”, hence “footstep”) and exists in many European languages: zapatero a tus zapatos is the Spanish, schoenmaker, blijf bij je leest the Dutch, skomager, bliv ved din læst the Danish and schuster, bleib bei deinen leisten, the German.  Pliny’s actual words were ne supra crepidam judicaret (crepidam a sandal or the sole of a shoe), but the idea is conveyed in several ways in Latin tags, such as Ne sutor ultra crepidam (sutor means “cobbler”, a word which survives in Scotland in the spelling souter).  The best-known version is the abbreviated tag ultra crepidam (beyond the sole), and it’s that which Hazlitt used to construct ultracrepidarian.  Crepidam is from the Ancient Greek κρηπίς (krēpís) and has no link with words like decrepit or crepitation (which are from the Classical Latin crepare (to creak, rattle, or make a noise)) or crepuscular (from the Latin word for twilight); crepidarian is an adjective, rare perhaps to the point of extinction, meaning “pertaining to a shoemaker”.

The related terms are "Nobel disease" & "Nobel syndrome" which are used to describe some of the opinions offered by Nobel laureates on subjects beyond their specialization.  In some cases this is "demand" rather than "supply" driven because, once a prize winner is added to a media outlet's "list of those who comment on X", if they turn out to give answers which generate audience numbers, controversy or clicks, they become "talent" and may be asked questions about matters of which they know little.  This happens because some laureates in the three "hard" prizes (physics, chemistry, physiology or medicine) operate in esoteric corners of their discipline; asking a particle physicist something about plasma physics on the basis of their having won the physics prize may not elicit useful information.  Of course those who have won the economics gong or one of what are now the DEI (diversity, equity and inclusion) prizes (peace & literature) may be assumed to have helpful opinions on everything.

Jackson Pollock (1912-1956): Blue Poles

Number 11 (Blue poles, 1952), Oil, enamel and aluminum paint with glass on canvas.

In 1973, when a million dollars was still a lot of money, the NGA (National Gallery of Australia), a little controversially, paid Aus$1.3 million for Jackson Pollock’s (1912-1956) Number 11, 1952, popularly known as Blue Poles since it was first exhibited in 1954, the new name reputedly chosen by the artist.  Some years ago it was said to be valued at up to US$100 million but, given the increase in the money supply (among the rich who trade this stuff) over the last two decades odd, that estimate may now be conservative although the suggestion in 2016 that the value may have inflated to as much as US$350 million was thought to be “on the high side”.  Blue Poles emerged during Pollock’s “drip period” (1947-1950), a method which involved techniques such as throwing paint at a canvas spread across the floor.  The art industry liked these (often preferring the more evocative term “action painting”) and they remain his most popular works, although at that point he abandoned the dripping and moved to his “black pourings” phase, a darker, simpler style which didn’t attract the same commercial interest.  He later returned to more colorful ways but his madness and alcoholism worsened; he died in a drink-driving accident.

Alchemy (1947), Oil, aluminum, alkyd enamel paint with sand, pebbles, fibres, and broken wooden sticks on canvas.

Although the general public remained uninterested (except in the price tags) or sceptical, there were critics, always drawn to a “troubled genius”, who praised Pollock’s work and the industry approves of any artist who (1) had the decency to die young and (2) produced lots of stuff which can sell for millions.  US historian of art, curator & author Helen A Harrison (b 1943; director (1990-2024) of the Pollock-Krasner House and Study Center, the former home and studio of the Abstract Expressionist artists Jackson Pollock and Lee Krasner in East Hampton, New York) is an admirer, noting the “pioneering drip technique” which “introduced the notion of ‘action painting’, where the canvas became the space with which the artist actively would engage”.  As a thumbnail sketch she offered:

Number 14: Gray (1948), Enamel over gesso on paper.

“Reminiscent of the Surrealist notions of the subconscious and automatic painting, Pollock's abstract works cemented his reputation as the most critically championed proponent of Abstract Expressionism. His visceral engagement with emotions, thoughts and other intangibles gives his abstract imagery extraordinary immediacy, while his skillful use of fluid pigment, applied with dance-like movements and sweeping gestures that seldom actually touched the surface, broke decisively with tradition. At first sight, Pollock's vigorous method appears to create chaotic labyrinths, but upon close inspection his strong rhythmic structures become evident, revealing a fascinating complexity and deeper significance.  Far from being calculated to shock, Pollock's liquid medium was crucial to his pictorial aims.  It proved the ideal vehicle for the mercurial content that he sought to communicate: ‘energy and motion made visible - memories arrested in space’.”

Number 13A: Arabesque (1948), Oil and enamel on canvas.

Critics either less visionary or more fastidious seemed often as appalled by Pollock’s violence of technique as they were by the finished work (or “products” as some labelled the drip paintings), questioning whether any artistic skill or vision even existed, one finding them “…mere unorganized explosions of random energy, and therefore meaningless.”  The detractors used the language of academic criticism but meant the same thing as the frequent phrase of an unimpressed public: “That’s not art, anyone could do that.”

Number 1, 1949 (1949), Enamel and metallic paint on canvas. 

There have been famous responses to “That’s not art, anyone could do that” but Ms Harrison's was practical: offering people the opportunity to try.  To the view that “…people thought it was arbitrary, that anyone can fling paint around”, Ms Harrison conceded it was true anybody could “fling paint around” but that was her point: anybody could, but having flung, they wouldn’t “…necessarily come up with anything”, by which she meant they wouldn’t necessarily come up with anything of which the critical establishment (a kind of freemasonry of the art business) would approve (ie could put a price tag on).

Helen A Harrison, The Jackson Pollock Box (Cider Mill Press, 96pp, ISBN-10:1604331860, ISBN-13:978-1604331868).

In 2010, Ms Harrison released The Jackson Pollock Box, a kit which, in addition to an introductory text, included paint brushes, drip bottles and canvases so people could do their own flinging and compare the result against a Pollock.  After that, they may agree with collector Peggy Guggenheim (1898-1979) that Pollock was “...the greatest painter since Picasso” or remain unrepentant ultracrepidarians.  Of course, many who thought their own eye for art quite well-trained didn't agree with Ms Guggenheim.  In 1945, just after the war, Duff Cooper (1890–1954), then serving as Britain's ambassador to France, came across Pablo Picasso (1881–1973) leaving an exhibition of paintings by English children aged 5-10 and in his diary noted the great cubist saying he "had been much impressed".  "No wonder" added the ambassador, "the pictures are just as good as his".

Dresses & drips: Three photographs by Cecil Beaton (1904-1980), shot for a three-page feature in Vogue (March 1951) titled American Fashion: The New Soft Look which juxtaposed Pollock’s paintings hung in New York’s Betty Parsons Gallery with the season’s haute couture by Irene (1872-1951) & Henri Bendel (1868-1936).

Beaton chose the combinations of fashion and painting; pairing Lavender Mist (1950, left) with a short black ball gown of silk paper taffeta with large pink bow at one shoulder and an asymmetrical hooped skirt best illustrates the value of his trained eye.  Critics and social commentators have always liked these three pages, relishing the opportunity to comment on the interplay of so many of the clashing forces of modernity: the avant-garde and fashion, production and consumption, abstraction and representation, painting and photography, autonomy and decoration, masculinity and femininity, art and commerce.  Historians of art note it too because it was the abstract expressionism of the 1940s which was both uniquely an American movement and the one which in the post-war years saw New York supplant Paris as the centre of Western art.  There have been interesting discussions about when last it could be said Western art had a "centre".

Blue Poles, upside down.

Although the suggestion might offend the trained and discerning eyes of art critics, it’s doubtful that for ultracrepidarians the experience of viewing Blue Poles would be much different were it to be hung upside down.  Fortunately, the world does have a goodly stock of art critics who can explain that while Pollock did more than once say his works should be interpreted “subjectively”, their intended orientation is a part of the whole and an inversion would change the visual dynamics and gravitational illusions upon which the abstraction’s effects depend.  It would still be a painting but, in a sense, not the one the artist painted.  Because the drip technique involved “flinging and pouring paint” onto a canvas spread across a studio’s floor, there was not exactly a randomness in where the paint landed but physics did mean gravity exerted some pull (in flight and on the ground), lending layers and rivulets what must be a specific downward orientation.  Thus, were the work to be hung inverted, what was in the creative process a downward flow would be seen as “flowing uphill” as it were.  The compositional elements which lent the work its name were of course the quasi-vertical “poles” placed at slight angles and it’s these which are the superstructure which “anchors” the rest of the drips and, being intrinsically “directional”, they too have a “right way up”.  There is in the assessment of art the “eye of the beholder” but although it may be something they leave unstated, most critics will be of the “some eyes are more equal than others” school.

Mondrian’s 1941 New York City 1 as it (presumably correctly) sat in the artist's studio in 1944 (left) and as it has since 1945 been exhibited (upside-down) in New York and Düsseldorf (right).  Spot the difference.

So although ultracrepidarians may not “get it” (even after digesting the critics’ explanations) and wouldn’t be able to tell whether or not it was hung correctly, that’s because they’re philistines.  In the world of abstract art however, even the critics can be fooled: in 2022, it was revealed a work in Piet Mondrian’s (1872-1944) 1941 New York City 1 series had for 77 years been hanging upside down.  First exhibited in 1945 in New York’s MoMA (Museum of Modern Art), the piece was created with multi-colored adhesive paper tape and, in an incorrect orientation, it has since 1980 hung in the Düsseldorf Museum as part of the Kunstsammlung Nordrhein-Westfalen’s collection.  The decades-long, trans-Atlantic mistake came to light during a press conference held to announce the Kunstsammlung’s new Mondrian exhibition and the conclusion was the error may have been caused by something as simple as the packing-crate being overturned or misleading instructions being given to the staff.  1941 New York City 1 will remain upside down because of the condition of the adhesive strips.  “The adhesive tapes are already extremely loose and hanging by a thread” a curator was quoted as saying, adding that if it were now to be turned over, “…gravity would pull it into another direction.  And it’s now part of the work’s story.”  Mondrian was one of the more significant theorists of abstract art and its withdrawal from nature and natural subjects.  “Denaturalization” he proclaimed to be a milestone in human progress, adding: “The power of neo-plastic painting lies in having shown the necessity of this denaturalization in painterly terms... to denaturalize is to abstract... to abstract is to deepen.”  Now even ultracrepidarians can understand.

Eye of the beholder: Portrait of Lindsay Lohan in the style of Claude Monet (1840–1926) at craiyon.com and available at US$26 on an organic cotton T-shirt made in a factory powered by renewable energy.

Whether the arguments about what deserves to be called “art” began among prehistoric “artists” and their critics in caves long ago isn’t known but it’s certainly a dispute with a long history.  In the sense it’s a subjective judgment the matter was doubtless often resolved by a potential buyer declining to purchase but during the twentieth century it became a contested topic and there were celebrated exhibits and squabbles which for decades played out until, in the post-modern age, the final answer appeared to be that something was art if variously (1) the creator said it was or (2) an art critic said it was or (3) it was in an art gallery or (4) the price tag was sufficiently impressive.

So what constitutes “art” is a construct of time, place & context which evolves, shaped by historical, cultural, social, economic, political & personal influences, factors which in recent years have had to be cognizant of the rise of cultural equivalency, the recognition that Western concepts such as the distinction between “high” (or “fine”) art and “folk” (or “popular”) art can’t be applied to work from other traditions where cultural objects are not classified by a graduated hierarchy.  In other words, everybody’s definition is equally valid.  That doesn’t mean there are no longer gatekeepers because the curators in institutions such as museums, galleries & academies all discriminate and thus play a significant role in deciding what gets exhibited, studied & promoted, even though few would now dare to suggest what is art and what is not: that would be cultural imperialism.

Eye of the prompt 1.0: An AI (artificial intelligence) generated portrait of Lindsay Lohan by ChatGPT imagined in "drip painting style", this one using an interpretation which overlaid "curated drips" over "flung paint".  This could be rendered using Ms Harrison's Jackson Pollock Box but would demand some talent.

In the twentieth century, it seemed to depend on artistic intent, something which transcended a traditional measure such as aesthetic value but as the graphic art in advertising and that with a political purpose such as agitprop became bigger, brighter and more intrusive, such forms also came to be regarded as art or at least worthy of being studied or exhibited on the same basis, in the same spaces, as oil on canvas portraits & landscapes.  Once though, an unfamiliar object in such places could shock, as French painter & sculptor Marcel Duchamp (1887-1968) managed in 1917 when he submitted a porcelain urinal as his piece for an exhibition in New York, his rationale being “…everyday objects raised to the dignity of a work of art by the artist's act of choice.”  Even then it wasn’t a wholly original approach but the art establishment has never quite recovered and from that urinal to Dadaism, to soup cans to unmade beds, it became accepted that “anything goes” and people should be left to make of it what they will.  Probably the last remaining reliable guide to what really is “art” remains the price tag.

Eye of the prompt 1.1: An AI (artificial intelligence) generated portrait of Lindsay Lohan by ChatGPT imagined in "drip painting style", this one closer to Pollock’s “action painting” technique.

His drip period wholly non-representational, Pollock didn’t produce recognizable portraiture so applying the technique for this purpose demands guesswork.  As AI illustrates, it can be done but, in blending two incompatible modes, whether it looks much like what Pollock would have produced had he accepted a “paint Lindsay Lohan” commission is wholly speculative.  What is more likely is that even if some sort of hybrid, a portrait by Pollock would have been an abstraction altogether more chaotic and owing little to the structure on which such works usually depend in that there probably would have been no central focal point, fewer hints of symmetry and a use of shading producing a face not lineal in its composition.  That’s what his sense of “continuous motion” dictated: no single form becoming privileged over the rest.  So, this too is not for the literalists schooled in the tradition of photo-realism but as a work it’s also an example of how most of those armed with Ms Harrison's Jackson Pollock Box could with “drip & fling” produce this but not necessarily would produce this, chaos on canvas needing talent too.

1948 Cisitalia 202 GT (left; 1947-1952) and 1962 Jaguar E-Type (1961-1974; right), Museum of Modern Art (MoMA), New York City.

Urinals tend not to be admired for their aesthetic qualities but there are those who find beauty in stuff as diverse as math equations and battleships.  Certain cars have long been objects which can exert an emotional pull on those with a feeling for such things and if the lines are sufficiently pleasing, many flaws in execution or engineering can be forgiven.  New York’s MoMA in 1972 acknowledged such creations can be treated as works of art when it added a 1948 Cisitalia 202 GT finished in “Cisitalia Red” (MoMA object number 409.1972) to the collection, the press release noting it was “…the first time that an art museum in the U.S. put a car into its collection.”  Others appeared from time-to-time and while the 1953 Willys-Overland Jeep M-38A1 Utility Truck (MoMA object number 261.2002) perhaps is not conventionally beautiful, its brutish functionalism has a certain simplicity of form and in the exhibition notes MoMA clarified somewhat by describing it as a “rolling sculpture”, presumably in the spirit of a urinal being a “static sculpture”, both to be admired as pieces of design perfectly suited to their intended purpose, something of an art in itself.  Of the 1962 Jaguar E-Type (sometimes known informally in the US as the XKE or XK-E) open two seater (OTS, better known as a roadster and acquired as MoMA object number 113.996), there was no need to explain because it’s one of the most seductive shapes ever rendered in metal.  Enzo Ferrari (1898-1988) attended the 1961 Geneva International Motor Show (now defunct but, on much the same basis as manufacturers east of Suez buying brand-names such as MG, Jaguar and such, the name has been purchased for use by an event staged in Qatar) when the E-Type made its stunning debut and part of folklore is he called it “the most beautiful car in the world”.  Whether those words ever passed his lips isn’t certain because the sources vary slightly in detail and il Commendatore apparently never confirmed or denied the sentiment but it’s easy to believe and to this day many agree just looking at the thing can be a visceral experience.  The MoMA car is finished in "Opalescent Dark Blue" with a grey interior and blue soft-top (there are those who would prefer it in BRG (British Racing Green) over tan leather) and although as a piece of design it's not flawless, anyone who can't see the beauty in a Series 1 E-Type OTS is truly an ultracrepidarian.

Saturday, March 25, 2023

Esurient

Esurient (pronounced ih-soo-r-ee-uhnt)

(1) Hungry; greedy; voracious.

(2) One who is hungry.

1665–1675: A borrowing from the Latin ēsurient & ēsurientem, stem of ēsuriēns (hungering), present participle of ēsurīre (to be hungry; to hunger for something), from edere (to eat), the construct being ēsur- (hunger) + -ens (the Latin adjectival suffix which appeared in English as –ent (and –ant, –aunt etc) and in Old French as –ent).  The form ēsuriō was a desiderative verb from edō (to eat), ultimately from the primitive Indo-European hédti (to eat and from the root ed-) + -turiō (the suffix indicating a desire for an action).  English offers a goodly grab of alternatives including rapacious, ravenous, gluttonous, hoggish, insatiable, unappeasable, ravening, avaricious, avid and covetous.  Esurient is a noun & adjective, esurience & esuriency are nouns and esuriently is an adverb; the noun plural is esurients.

A noted Instagram influencer assuaging her esurience.

For word-nerds to note, the long vowel in ēsuriō (beside the short vowel of edō, from the primitive Indo-European hédti) is illustrative of the application of Lachmann's law (a long-disputed phonological sound rule for Latin named after German philologist and critic Karl Lachmann (1793–1851)).  According to Lachmann, vowels in Latin lengthen before primitive (and the later proto-) Indo-European voiced stops which are followed by another (unvoiced) stop.  Given the paucity of documentary evidence, much work in this field is essentially educated guesswork and Lachmann’s conclusions were derived from analogy and the selective application of theory.  Not all in this highly specialized area of structural linguistics agreed and arguments percolated until an incendiary paper in 1965 assaulted analogy as an explanatory tool in historical linguistics, triggering a decade-long squabble.  This polemical episode appeared to suggest Lachmann had constructed a framework onto which extreme positions could be mapped, one wishing to attribute almost everything to analogy, the other, nothing.  With that, debate seemed to end and Lachmann’s law seems now noted less for what it was than for what it was not.
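Stated as a rule, the law is mechanical enough to express as a toy function.  What follows is a minimal sketch in Python, assuming an invented ASCII transcription and the textbook pre-forms (agtos for āctus, edtos for ēsus, faktos for factus); it is illustrative only, not a serious phonological model:

```python
# A toy rendering of Lachmann's law as stated above: a short vowel
# lengthens when it stands before a voiced stop which is itself
# followed by a voiceless stop. The transcription is invented for
# this sketch (b/d/g = voiced stops, p/t/k = voiceless stops).

VOICED = set("bdg")
VOICELESS = set("ptk")
LENGTHEN = {"a": "ā", "e": "ē", "i": "ī", "o": "ō", "u": "ū"}

def lachmann(pre_form: str) -> str:
    """Apply the lengthening rule to a hypothetical pre-Latin form."""
    out = list(pre_form)
    for i in range(len(out) - 2):
        if out[i] in LENGTHEN and out[i + 1] in VOICED and out[i + 2] in VOICELESS:
            out[i] = LENGTHEN[out[i]]
    return "".join(out)

# Classic pairs: agō → āctus and edō → ēsus show lengthening, while
# factus (voiceless k before the t) stays short.
print(lachmann("agtos"))   # āgtos, pre-form of āctus
print(lachmann("edtos"))   # ēdtos, pre-form of ēsus (whence ēsuriō)
print(lachmann("faktos"))  # faktos, unchanged: no voiced stop
```

On this toy reading, the long ē of ēsuriō against the short e of edō is just the rule applied to the old participle stem.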

In memory of Tenuate Dospan

A seemingly permanent condition of late modernity is weight gain; the companion permanent desire being weight loss.  The human propensity to store fat was a product of natural selection, those who possessed the genes which passed on the trait being more likely to achieve sexual maturity and thus able to procreate.  Storing fat meant that in times of plenty, weight was gained which could be used as a source of energy in times of scarcity and for thousands of generations this was how almost all humans lived.  However, in so much of the world people now live in a permanent state of plenty and one in which that plenty (fats, salt & sugars) doesn’t have to be hunted, gathered or harvested.  Now, with only a minimal expenditure of energy, we take what we want from the shelf or, barely having to move from our chair, it’s delivered to our door.  In our sedentary lives we thus expend much less energy but our brains remain hard-wired to seek out the fats, salt & sugars which best enable the body to accumulate fat for the lean times.  Some call this the "curse of plenty".

For all but a few genetically unlucky souls, the theory of weight loss is simple: reduce energy intake and increase the energy burn.  For many reasons however the practices required to execute the theory can be difficult although much evidence does suggest that once started, exercise does become easier because (1) the brain rewards the body for doing it with what’s effectively a true “recreational drug”, (2) it becomes literally easier because weight-loss in itself reduces the energy required and (3) the psychological encouragement of success (some dieticians actually recommend scales with a digital read-out so progress can be measured in 100 gram (3½ oz) increments).  Still, even starting is clearly an obstacle which is why the pharmaceutical industry saw such potential in finding the means to reduce supply (food intake) if increasing demand (exercise) was just too hard.

Lindsay Lohan about to assuage her esurience.

For centuries physicians and apothecaries had been aware of the appetite suppressing qualities of various herbs and other preparations but these were usually seen as something undesirable and were often a side effect of the early medicines, many of which were of dubious benefit, some little short of poison.  Although the noun anorectic (a back-formation from the adjective anorectic; anorectous is an archaic form) appeared in the medical literature in the early nineteenth century, it was used to describe a patient suffering a loss of appetite; only later would it come to be applied to drugs, firstly those which induced the condition as a side-effect and later, those designed for the purpose.  The adjective anorectic (characterized by want of appetite) appeared first in 1832 and was a coining of medical Latin, from the Ancient Greek ἀνόρεκτος (anórektos) (without appetite), the construct being ἀν- (an-) (not, without) + ὀρέγω (orégō) (from oregein (to long for, desire), which was later to influence the word anorexia).  The noun was first used in 1913.

Tenuate Dospan.  As an industry leader in promoting DEI (diversity, equity and inclusion), Merrell was years ahead in the use of plus-size models.

In the twentieth century, as modern chemistry emerged, anorectic drugs became available by accident as medical amphetamines reached the black market as stimulants, the side effects quickly noted.  Those side effects however were of little interest to the various military authorities which during World War II (1939-1945) made them available to troops by the million, their stimulant properties and the ability to keep soldiers alert and awake for days at a time functioning as an extraordinary force-multiplier.  Not for years was it fully understood just how significant was the supply of the amphetamine Pervitin in the Wehrmacht’s (the German armed forces (1935-1945)) extraordinary military successes in 1939-1941.  In the post-war years, various types of amphetamine were made commercially available as appetite suppressants and while effective, the side effects were of concern although many products remained available in the West well into the twenty-first century.  Probably the best known class of these was amfepramone (or diethylpropion) marketed most famously as Tenuate Dospan which was popular with (1) those who wanted to be thin and (2) those who wanted to stay awake longer than is usually recommended.  Tenuate Dospan usually achieved both.

The regulatory authorities however moved to ensure the supply of Tenuate Dospan and related preparations was restricted, the concern said to be about the side effects although in these matters the true motivations can sometimes be obscure.  In their place, the industry responded with appetite suppressants which essentially didn’t work (compared with the efficient Tenuate Dospan) but sold for two or three times the price, which must have pleased some.  The interest in restricting esurience however continued and one of the latest generation is Liraglutide (sold under various brand names including Victoza & Saxenda) which started life as an anti-diabetic medication, the appetite suppressing properties noted during clinical trials, rather as the side-effects of Viagra (sildenafil) came as a pleasing surprise to the manufacturer.  Being an injection, Liraglutide is harder to use than Tenuate Dospan (which was a daily pill) and users report there are both similarities and differences between the two.

Liraglutide (Saxenda).  The dose increases month by month.

On Tenuate Dospan, one’s appetite diminished rapidly but food still tasted much the same, only the desire for it declined and being an amphetamine, energy levels were elevated and there were the usual difficulties (sleeping, dryness in the mouth, mood swings).  Dieticians recommended combining Tenuate Dospan with a high quality diet (the usual fruit, vegetables, clear fluids etc).  By contrast, although Liraglutide users reported much the same loss of interest in food, they noted also some distaste for the foods they had once so enjoyed and a distinct lack of energy.  It’s still early in the life of Liraglutide but it certainly seems to work as an appetite suppressant although in the trials, the persistent problem of all such drugs was noted: as soon as the treatment ceased, the food cravings returned.  Liraglutide does what the manufacturer’s explanatory notes suggest it does: it is a drug which can be used to treat chronic obesity by achieving weight-loss over several months, during which a patient should seek to achieve a permanent lifestyle change (diet and exercise).  It does not undo thousands of generations of evolution.  The early literature at least hinted Liraglutide was intended for obese adolescents for whom no other weight loss programmes had proved effective but anecdotal evidence suggests adults are numerous among the early adopters.

Sunday, October 6, 2024

Monolith

Monolith (pronounced mon-uh-lith)

(1) A column, large statue etc, formed originally from a single block of stone but latter-day use applies the term to structures formed from any material and not of necessity a single piece (although technically such a thing should be described using the antonym: “polylith”).

(2) Used loosely, a synonym of “obelisk”.

(3) A single block or piece of stone, especially when used in architecture or sculpture and applied most frequently to large structures.

(4) Something (or an idea or concept) having a uniform, massive, redoubtable, or inflexible quality or character.

(5) In architecture, a large hollow foundation piece sunk as a caisson and having a number of compartments that are filled with concrete when it has reached its correct position.

(6) An unincorporated community in Kern County, California, United States (initial capital).

(7) In chemistry, a substrate having many tiny channels that is cast as a single piece, which is used as a stationary phase for chromatography, as a catalytic surface etc.

(8) In arboreal use, a dead tree whose height and size have been reduced by breaking off or cutting its branches (use rare except in UK horticultural use).

1829: The construct was mono- + lith.  Mono was from the Ancient Greek, a combining form of μόνος (monos) (alone, only, sole, single), from the Proto-Hellenic mónwos, from the primitive Indo-European mey- (little; small).  It was related to the Armenian մանր (manr) (slender, small), the Ancient Greek μανός (manós) (sparse, rare), the Middle Low German mone & möne, the West Frisian meun, the Dutch meun, the Old High German muniwa, munuwa & munewa (from which German gained Münne (minnow)).  As a prefix, mono- is often found in chemical names to indicate a substance containing just one of a specified atom or group (eg a monoxide such as carbon monoxide: carbon attached to a single atom of oxygen).

In English, the noun monolith was from the French monolithe (object made from a single block of stone), from Middle French monolythe (made from a single block of stone) and their etymon the Latin monolithus (made from a single block of stone), from the Ancient Greek μονόλιθος (monólithos) (made from a single block of stone), the construct being μονο- (mono-) (the prefix appended to convey the meaning “alone; single”), from μόνος (monos) + λίθος (líthos) (a stone; stone as a substance).  The English form was cognate with the German monolith (made from a single block of stone).  The verb was derived from the noun.  Monolith is a noun & verb, monolithism, monolithicness & monolithicity are nouns, monolithic is an adjective and monolithically is an adverb; the noun plural is monoliths.  The adjective monolithal is listed as "an archaic form of monolithic".

Monolith also begat, by analogy, two formations in the technical jargon of archaeology: a “microlith” is (1) a small stone tool (sometimes called a “microlite”) and (2) the microscopic acicular components of rocks.  A “megalith” is (1) a large stone slab making up a prehistoric monument, or part of such a monument, (2) a prehistoric monument made up of one or more large stones and (3) by extension, a large stone or block of stone used in the construction of a modern structure.  The terms seem not to be in use outside of the technical literature of the profession.  The transferred and figurative use in reference to a thing or person noted for indivisible unity is from 1934 and is now widely used in IT, political science and opinion polling.  The adjective monolithic (formed of a single block of stone) was in use by the early nineteenth century and within decades was used to mean “of or pertaining to a monolith”, the figurative sense noted since the 1920s.  The adjective prevailed over monolithal which seems first to have appeared in a scientific paper in 1813.  The antonym in the context of structures rendered from a single substance is “polylith” but use is rare and multi-component constructions are often described as “monoliths”.  The antonym in the context of “anything massive, uniform, and unmovable, especially a towering and impersonal cultural, political, or social organization or structure” is listed by many sources as “chimera” but terms like “diverse”, “fragmented” etc are usually more illustrative for most purposes.  In general use, there certainly has been something of a meaning-shift: while “monolith” began as meaning “made of a single substance”, it is now probably most used to convey the idea of “something big & tall” regardless of the materials used.

One of the Monoliths as depicted in the film 2001: A Space Odyssey (1968). 

The mysterious black structures in Sir Arthur C Clarke's (1917–2008) Space Odyssey series (1968-1997) became well known after the release in 1968 of Stanley Kubrick's (1928–1999) film of the first novel in the series, 2001: A Space Odyssey.  Although sometimes described as “obelisks”, the author noted they were really “monoliths”.  In recent years, enthusiasts, mischief makers and click-bait hunters have been erecting similar monoliths in remote parts of planet Earth, leaving them to be discovered and publicized.  With typical alacrity, modern commerce noted the interest and soon replicas were being offered for sale, a gap in the market for Christmas gifts between US$10,000-45,000 apparently identified.

The terms “obelisk” and “monolith” are sometimes used interchangeably and while in the case of many large stone structures this can be appropriate, the two terms have distinct meanings.  Classically, an obelisk is a tall, four-sided, narrow pillar that tapers to a pyramid-like point at the top.  Obelisks often are carved from a single piece of stone (and are thus monolithic) but can also be constructed in sections and archaeologists have discovered some of the multi-part structures exist by virtue of necessity; intended originally to be a single piece of stone, the design was changed after cracks were detected.  A monolith is a large single block of stone which can be naturally occurring (such as a large rock formation) or artificially shaped; monoliths take many forms, including obelisks, statues and even buildings.  Thus, while an obelisk can be a monolith, not all monoliths are obelisks.

Highly qualified German content provider Chloe Vevrier (b 1968) standing in front of the Luxor Obelisk, Paris 2010.

The Luxor Obelisk sits in the centre of the Place de la Concorde, one of the world’s most photographed public squares.  Of red granite, 22.5 metres (74 feet) in height and weighing an estimated 227 tonnes (250 short (US) tons), it is one of a pair, the other still standing in front of the first pylon of the Luxor Temple on the east bank of the Nile River, Egypt.  The obelisk arrived in France in May 1833 and less than six months later was raised in the presence of Louis Philippe I (1773–1850; King of the French 1830-1848).  The square hadn’t always been a happy place for kings to stand; in 1789 (then known as the Place de Louis XV) it was one of the gathering points for the mobs staging what became the French Revolution and after the storming of the Bastille (one of history’s less dramatic events despite the legends), the square was renamed Place de la Révolution, living up to the name by being the place where Louis XVI (1754–1793; King of France 1774-1792), Marie Antoinette (1755–1793; Queen Consort of France 1774-1792) and a goodly number of others were guillotined.  Things were calmer by 1833 when the obelisk was erected.

The structure was a gift to France by Pasha Mehmet Ali (1769–1849, Ottoman Albanian viceroy and governor of Egypt 1805-1848) and in return Paris sent a large mechanical clock which to this day remains in place in the clock tower of the mosque at the summit of the Citadel of Cairo; of the 28 obelisks, six remain in Egypt with the rest in various displays around the world.  Some 3000 years old, in its original location the Obelisk contributed to scientific history when in circa 250 BC Greek geographer & astronomer Eratosthenes of Cyrene (circa 276 BC–circa 195 BC) used the shadow it cast to calculate the circumference of the Earth.  By comparing the shadow at a certain time with one in Alexandria, he concluded the angular difference between Alexandria and Aswan was seven degrees and 14 minutes and from this, knowing the distance between the two cities, he could work out the Earth’s circumference.
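The arithmetic is simple enough to check.  Below is a back-of-the-envelope sketch in Python using the seven degrees and 14 minutes reported above together with two figures not in the original and assumed here: the commonly cited 5,000 stadia for the Alexandria-Aswan distance and one scholarly estimate (157.5 m) of the stadion:

```python
# Eratosthenes' method: the arc between the two cities subtends a known
# fraction of the full 360°, so the whole circumference is the distance
# scaled by that fraction. The distance and stadion length below are
# assumptions (the ancient sources and modern scholarship both vary).

angle_deg = 7 + 14 / 60        # 7°14' as decimal degrees
distance_stadia = 5_000        # assumed Alexandria-Aswan distance
stadion_m = 157.5              # one scholarly estimate of a stadion

circumference_stadia = distance_stadia * 360 / angle_deg
circumference_km = circumference_stadia * stadion_m / 1_000

print(f"{circumference_stadia:,.0f} stadia ≈ {circumference_km:,.0f} km")
# ≈ 248,848 stadia ≈ 39,194 km against the modern polar 40,008 km
```

On those assumptions the method lands within about two percent of the modern figure, which is the point usually made about Eratosthenes: the geometry was sound and only the measurements were rough.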

Monolithic drivers

In IT, the term “monolithic driver” was used to refer to a software driver designed to handle multiple hardware components or functionalities within a single, large, and cohesive codebase.  In this it differed from the modular or layered approaches (both earlier and later) in which the functionality is split into separate, smaller drivers or modules, each of which handles specific tasks or addresses only certain hardware components.  Monolithic drivers became common in the late 1980s, a period when both computer architecture and operating systems were becoming more sophisticated in an attempt to overcome the structural limitations imposed by the earlier designs.  It was in that era many of the fundamental concepts which continue to underpin modern systems were conceived although the general adoption of some lay a decade or more away.

During the 1970s & 1980s, many systems were built with a tight integration between software and hardware and some operating systems (OS) were really little more than “file loaders” with a few “add-ons”, and the limitations imposed were “worked around” by some programmers who more-or-less ignored the operating system and addressed the hardware directly using “assembly language” (a human-readable notation for machine code).  That approach made for fast software but at the cost of interoperability and compatibility, such creations being hardware-specific rather than using an OS as what came to be known as the HAL (hardware abstraction layer); at the time, few OSs were like UNIX with its monolithic kernel in which the core OS services (file system management, device drivers etc) were all integrated into a single large codebase.  As the market expanded, it was obvious the multi-fork approach was commercially unattractive except for the odd niche.

After its release in 1981, use of the IBM personal computer (PC) proliferated and because of its open (licence-free) architecture, an ecosystem of third party suppliers arose, producing a remarkable array of devices which either “hung off” or “plugged into” a PC; the need for hardware drivers grew.  Most drivers at the time came from the hardware manufacturers themselves, typically were monolithic (though not yet usually described as such) and written usually for specific hardware, and issues were rife: a change to an OS or even other (apparently unrelated) hardware or software could induce instability or worse.  As operating systems evolved to support more modularity, the term “monolithic driver” came into use to distinguish these large, single-block drivers from the more modular or layered approaches that were beginning to emerge.

It was the dominance of Novell’s Netware (1983-2009) on PC networks which compelled Microsoft to develop Windows NT (“New Technology”, 1993) and it featured a modular kernel architecture, something which made the distinction between monolithic and modular drivers better understood as developers increasingly embraced the modular, layered approach which better handled maintainability and scalability.  Once neutral, the term “monolithic driver” became something of a slur in IT circles, notably among system administrators (“sysadmins” or “syscons”, the latter based on the “system console”, the terminal on a mainframe hard-wired to the central processor) who accepted ongoing failures of this and that as inherent to the business but wanted to avoid SPoFs (single points of failure).
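To make the architectural distinction concrete, here is a schematic sketch, in Python rather than the C of real driver code, with every class and method name invented for illustration; no real driver API is implied:

```python
# Monolithic shape: one codebase owns transport, rendering and job
# control, so a change to any one concern means re-shipping the whole
# driver - the single-point-of-failure complaint described above.

class MonolithicPrinterDriver:
    def render_page(self, text: str) -> bytes:
        return text.upper().encode()      # stand-in for rasterizing

    def send_bytes(self, data: bytes) -> None:
        print(f"[wire] {data!r}")         # stand-in for real I/O

    def print_job(self, text: str) -> None:
        self.send_bytes(self.render_page(text))


# Modular/layered shape: each concern is a small, replaceable component
# composed behind a stable interface.

class Renderer:
    def render(self, text: str) -> bytes:
        return text.upper().encode()

class Transport:
    def send(self, data: bytes) -> None:
        print(f"[wire] {data!r}")

class ModularPrinterDriver:
    def __init__(self, renderer: Renderer, transport: Transport) -> None:
        self.renderer, self.transport = renderer, transport

    def print_job(self, text: str) -> None:
        # Swapping in a different Renderer or Transport never touches
        # this code, which is the maintainability point made above.
        self.transport.send(self.renderer.render(text))


MonolithicPrinterDriver().print_job("hello")
ModularPrinterDriver(Renderer(), Transport()).print_job("hello")
```

The modular shape is what the layered kernels of the NT era encouraged: the interface stays stable while the parts behind it can be fixed or replaced one at a time.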

In political science, the term “monolithic” is used to describe a system, organization, or entity perceived as being unified, indivisible, and operating with a high degree of internal uniformity, often with centralized control.  When something is labeled as monolithic, it implies that it lacks diversity or internal differentiation and presents a singular, rigid structure or ideology.  Tellingly, the most common use of the term is probably when analyzing electoral behavior: groups, societies or sub-sets of either, although often depicted in the media as “monolithic” in their views, voting patterns or political behavior, are anything but, and there’s usually some diversity.  In political science, such divergences within defined groups are known as “cross-cutting cleavages”.

It’s used also of political systems in which a regime is structured (or run) with power highly concentrated, typically in a single dictator or ruling party.  In such systems, usually there is little effective opposition and dissent is suppressed (although some of the more subtle informally tolerate a number of “approved dissenters” who operate within understood limits of self-censorship).  The old Soviet Union (the USSR (Union of Soviet Socialist Republics) 1922-1991), the Islamic Republic of Iran (1979-), the People’s Republic of China (run by the Chinese Communist Party (CCP), 1949-) and the DPRK (Democratic People’s Republic of Korea (North Korea), 1948-) are classic examples of monolithic systems; while the differences between them were innumerable, structurally all were (or are) politically monolithic.  The word is used also as a critique in the social sciences, Time magazine in April 2014 writing of the treatment of “Africa” as a construct in Mean Girls (2004):  “Like the original Ulysses, Cady is recently returned from her own series of adventures in Africa, where her parents worked as research zoologists. It is this prior “region of supernatural wonder” that offers the basis for the mythological reading of the film. While the notion of the African continent as a place of magic is a dated, rather offensive trope, the film firmly establishes this impression among the students at North Shore High School. To them, Africa is a monolithic place about which they know almost nothing. In their first encounter, Karen inquires of Cady: “So, if you’re from Africa, why are you white?” Shortly thereafter, Regina warns Aaron that Cady plans to “do some kind of African voodoo” on a used Kleenex of his to make him like her—in fact, the very boon that Cady will come to bestow under the monomyth mode.”  It remains a sensitive issue and one of the consequences of European colonial practices on the African continent (something which included what would now be regarded as “crimes against humanity”), so the casual use of “Africa” as a monolithic construct is proscribed in a way a similar use of “Europe” would not be.

The limitations of the utility of the term mean it should be treated with caution and while there are “monolithic” aspects or features to constructs such as “the Third World”, “the West” or “the Global South”, the label does over-simplify the diversity of cultures, political systems, and ideologies within these broad categories.  Even something which is to some degree “structurally monolithic” like the United States (US) or the European Union (EU) can be highly diverse in terms of actual behavior.  In the West (and the modern-day US is the most discussed example), the recent trend towards polarization of views has become a popular topic of study and the coalesced factions are sometimes treated as “monolithic” despite in many cases being themselves intrinsically factionalized.