
Friday, June 14, 2024

Lapidify

Lapidify (pronounced luh-pid-uh-fahy)

(1) To convert into stone or stony material; to petrify.

(2) To transform a material into something stony.

(3) Figuratively, to cause to become permanent; to solidify.

1620s: From the French lapidifier, from the Medieval Latin lapidificāre, the construct being the Latin lapis (stone) + -ify.  The origin of the Latin lapis is uncertain but there may be a link with the Ancient Greek λέπας (lépas) (bare rock, crag), which was from either the primitive Indo-European lep- (to peel) or a Mediterranean substrate language, most etymologists tending to favor the latter.  The -ify suffix was from the Middle English -ifien, from the Old French -ifier, from the Latin -ificare, from -ficus, from facio, (“make” or “do”).  It was used to produce verbs meaning “to make”; the alternative form was -fy.  The literal synonym in geology is petrify but also used (in various contexts) are set, harden, clarify, solidify, calcify, mineralize & fossilize.  Lapidify, lapidifies, lapidifying & lapidified are verbs, lapidification is a noun and lapidific & lapidifical are adjectives; the noun plural is lapidifications.

Medusa

In Greek mythology, Medusa (from the Ancient Greek Μέδουσα (Médousa), from μέδω (médō) (rule over)) was the youngest of the three Gorgon sisters and among them, the sole mortal.  In the popular imagination it seems to be believed that only the gaze of Medusa had the power to turn men to stone but her sisters Stheno & Euryale also possessed the gift.  The three were the daughters of Phorcys & Ceto who lived in the far west; the heads of the girls were entwined with writhing snakes and their necks protected with the scales of dragons while they had huge, boar-like tusks, hands of bronze and golden wings.  That alone would have made dating a challenge but anyone who had the misfortune to encounter them was turned instantly to stone.  Only Poseidon (god of the sea and one of the Olympians, the son of Cronus & Rhea) didn’t fear their glance because he had coupled with Medusa and fathered a child (in some tales the ghastly Cyclops Polyphemus, which wasn’t encouraging, but the other Cyclopes were about as disagreeable).

Bust of Medusa in marble (1636) by Gianlorenzo Bernini (1598-1680), Musei Capitolini, Palazzo dei Conservatori, Rome, Italy (left) and Lindsay Lohan in Medusa mode, Confessions of a Teenage Drama Queen (2004) (right).

Born in great secrecy, Perseus was the son of Zeus & Danae but one day, Danae’s father Acrisius heard the baby’s cry and, enraged that Zeus had seduced his daughter, had mother & child sealed in a wooden chest and cast into the sea; it washed up on the shores of the island of Seriphos, the pair rescued by the fisherman Dictys, brother of the ruling tyrant Polydectes.  When Perseus grew to manhood, he was one day among the guests at one of Polydectes' banquets and when the guests were asked what gift they would offer their host, all except Perseus suggested horses.  He instead offered to bring to the table the severed head of Medusa.  It’s not clear if this was intended as a serious suggestion (wine may have been involved) but the tyrant insisted, saying that otherwise he would take Danae by force.  Embarking on this unpromising quest, Perseus was helped by Hermes & Athena who took him to the Graeae; they showed him the way to the nymphs who lent him winged sandals, a kibisis (the backpack of the gods) and the helmet of Hades which rendered the wearer invisible.  Hermes armed him with the harpe, a sickle made of adamant.

Thus equipped, Perseus and Athena began the hunt for the Gorgons.  Of the three sisters, only Medusa was mortal so the project of decapitation had at least some theoretical prospect of success.  The far west was a bleak and uninviting place to which few travelled and they had little trouble in finding their lair, outside which they lay in wait until the family slept.  After midnight, when Medusa had fallen into a deep slumber, Perseus rose into the air on the nymphs’ winged sandals, and, while Athena held a shield of polished bronze over Medusa so it acted as a mirror, protecting them from her gaze, Perseus wielded his harpe, in one stroke striking head from shoulders.  Instantly, from the bloodied neck sprang Pegasus the winged horse and Chrysaor the giant.  Perseus stashed the severed head in the kibisis and quickly took flight for home, pursued by a vengeful Stheno & Euryale but, concealed by the helmet’s cloak of invisibility, he evaded them.  Arriving in Seriphos, he became enraged after discovering Polydectes had attempted to rape Danae who had been compelled to seek refuge at the altars of the gods.  Perseus took Medusa’s head from the backpack and held the visage before Polydectes, lapidifying him in an instant, declaring his rescuer Dictys was now the island’s ruler.  The invaluable accessories he returned to the Nymphs while Athena set the head of Medusa in the middle of her shield, meaning she now possessed the power of lapidification.

Tuesday, June 11, 2024

Ultracrepidarian

Ultracrepidarian (pronounced uhl-truh-krep-i-dair-ee-uhn)

Of or pertaining to a person who criticizes, judges, or gives advice outside their area of expertise.

1819: An English adaptation of the historic words sūtor, ne ultra crepidam, uttered by the Greek artist Apelles and reported by Pliny the Elder.  Translating literally as “let the shoemaker venture no further” and sometimes cited as ne supra crepidam sūtor judicare, the translation something like “a cobbler should stick to shoes”.  From the Latin, ultra is beyond, sūtor is cobbler and crepidam is accusative singular of crepida (from the Ancient Greek κρηπίς (krēpís)) and means sandal or sole of a shoe.  Ultracrepidarian is a noun & adjective and ultracrepidarianism is a noun; the noun plural is ultracrepidarians.  For humorous purposes, forms such as ultracrepidarist, ultracrepidarianish, ultracrepidarianize & ultracrepidarianesque have been coined; all are non-standard.

Ultracrepidarianism describes the tendency among some to offer opinions and advice on matters beyond their competence.  The word entered English in 1819 when used by English literary critic and self-described “good hater”, William Hazlitt (1778–1830), in an open letter to William Gifford (1756–1826), editor of the Quarterly Review, a letter described by one critic as “one of the finest works of invective in the language” although another suggested it was “one of his more moderate castigations”, a hint that though now neglected, for students of especially waspish invective, he can be entertaining.  The odd quote from him would certainly lend a varnish of erudition to trolling.  Ultracrepidarian comes from a classical allusion, Pliny the Elder (circa 24-79) recording the habit of the famous Greek painter Apelles (a fourth century BC contemporary of Alexander the Great (Alexander III of Macedon, 356-323 BC)) of displaying his work in public view, then concealing himself close by to listen to the comments of those passing.  One day, a cobbler paused and found fault with Apelles’ rendering of shoes and the artist immediately took his brushes and palette and touched up the sandal’s errant straps.  Encouraged, the amateur critic then let his eye wander above the ankle and suggested how the leg might be improved but this Apelles rejected, telling him to speak only of shoes and otherwise maintain a deferential silence.  Pliny hinted the artist's words of dismissal may not have been polite.

So critics should comment only on that about which they know.  The phrase in English is usually “cobbler, stick to your last” (a last being a shoemaker’s pattern, ultimately from a Germanic root meaning “to follow a track”, hence “footstep”) and exists in many European languages: zapatero a tus zapatos is the Spanish, schoenmaker, blijf bij je leest the Dutch, skomager, bliv ved din læst the Danish and Schuster, bleib bei deinen Leisten the German.  Pliny’s actual words were ne supra crepidam judicaret (crepidam a sandal or the sole of a shoe), but the idea is conveyed in several ways in Latin tags, such as Ne sutor ultra crepidam (sutor means “cobbler”, a word which survives in Scotland in the spelling souter).  The best-known version is the abbreviated tag ultra crepidam (beyond the sole), and it’s that which Hazlitt used to construct ultracrepidarian.  Crepidam is from the Ancient Greek κρηπίς (krēpís) and has no link with words like decrepit or crepitation (which are from the Classical Latin crepare (to creak, rattle, or make a noise)) or crepuscular (from the Latin word for twilight); crepidarian is an adjective, rare perhaps to the point of extinction, meaning “pertaining to a shoemaker”.

The related terms are "Nobel disease" & "Nobel syndrome" which are used to describe some of the opinions offered by Nobel laureates on subjects beyond their specialization.  In some cases this is "demand" rather than "supply" driven because, once a prize winner is added to a media outlet's "list of those who comment on X", they are sometimes asked questions about matters of which they know little.  This happens because some laureates in the three "hard" prizes (physics, chemistry, physiology or medicine) operate in esoteric corners of their discipline; asking a particle physicist something about plasma physics on the basis of their having won the physics prize may not elicit useful information.  Of course those who have won the economics or one of what are now the "diversity" prizes (peace & literature) may be assumed to have helpful opinions on everything.

Jackson Pollock (1912-1956): Blue Poles

In 1973, when a million dollars was still a lot of money, the National Gallery of Australia, a little controversially, paid Aus$1.3 million for Jackson Pollock’s (1912-1956) Number 11, 1952, popularly known as Blue Poles since it was first exhibited in 1954, the new name reputedly chosen by the artist.  It was some years ago said to be valued at up to US$100 million but, given the increase in the money supply (among the rich who trade this stuff) over the last two-odd decades, that estimate may now be conservative and some have suggested US$400 million might now be at least the ambit claim.

Number 11 (Blue poles, 1952), Oil, enamel and aluminum paint with glass on canvas.

Blue Poles emerged during Pollock’s "drip period" (1947-1950), a method which involved techniques such as throwing paint at a canvas spread across the floor.  The art industry liked these (often preferring the more evocative term "action painting") and they remain his most popular works, although at this point, he abandoned the dripping and moved to his “black pourings” phase, a darker, simpler style which didn’t attract the same commercial interest.  He later returned to more colorful ways but his madness and alcoholism worsened; he died in a drink-driving accident.

Alchemy (1947), Oil, aluminum, alkyd enamel paint with sand, pebbles, fibers, and broken wooden sticks on canvas.

Although the general public remained uninterested (except in the price tags) or sceptical, there were critics, always drawn to a “troubled genius”, who praised Pollock’s work and the industry approves of any artist who (1) had the decency to die young and (2) produced stuff which can sell for millions.  US historian of art, curator & author Helen A Harrison (b 1943; director (1990-2024) of the Pollock-Krasner House and Study Center, the former home and studio of the Abstract Expressionist artists Jackson Pollock and Lee Krasner in East Hampton, New York) is an admirer, noting the “pioneering drip technique” which “introduced the notion of action painting, where the canvas became the space with which the artist actively would engage”.  As a thumbnail sketch she offered:

Number 14: Gray (1948), Enamel over gesso on paper.

“Reminiscent of the Surrealist notions of the subconscious and automatic painting, Pollock's abstract works cemented his reputation as the most critically championed proponent of Abstract Expressionism. His visceral engagement with emotions, thoughts and other intangibles gives his abstract imagery extraordinary immediacy, while his skillful use of fluid pigment, applied with dance-like movements and sweeping gestures that seldom actually touched the surface, broke decisively with tradition. At first sight, Pollock's vigorous method appears to create chaotic labyrinths, but upon close inspection his strong rhythmic structures become evident, revealing a fascinating complexity and deeper significance.  Far from being calculated to shock, Pollock's liquid medium was crucial to his pictorial aims.  It proved the ideal vehicle for the mercurial content that he sought to communicate 'energy and motion made visible - memories arrested in space'.”

Number 13A: Arabesque (1948), Oil and enamel on canvas.

Critics either less visionary or more fastidious seemed often as appalled by Pollock’s violence of technique as they were by the finished work (or “products” as some labelled the drip paintings), questioning whether any artistic skill or vision even existed, one finding them “…mere unorganized explosions of random energy, and therefore meaningless.”  The detractors used the language of academic criticism but meant the same thing as the frequent phrase of an unimpressed public: “That’s not art, anyone could do that.”

Number 1, 1949 (1949), Enamel and metallic paint on canvas. 

There have been famous responses to that but Ms Harrison's was practical, offering people the opportunity to try.  To the view that “…people thought it was arbitrary, that anyone can fling paint around”, Ms Harrison conceded it was true anybody could “fling paint around” but that was her point, anybody could, but having flung, they wouldn’t “…necessarily come up with anything.”  In 2010, she released The Jackson Pollock Box, a kit which, in addition to an introductory text, included paint brushes, drip bottles and canvases so people could do their own flinging and compare the result against a Pollock.  After that, they may agree with collector Peggy Guggenheim (1898-1979) that Pollock was “...the greatest painter since Picasso” or remain unrepentant ultracrepidarians.

Helen A Harrison, The Jackson Pollock Box (Cider Mill Press, 96pp, ISBN-10:1604331860, ISBN-13:978-1604331868).

Dresses & drips: Three photographs by Cecil Beaton (1904-1980), shot for a three-page feature in Vogue (March 1951) titled American Fashion: The New Soft Look which juxtaposed Pollock’s paintings hung in New York’s Betty Parsons Gallery with the season’s haute couture by Irene (1872-1951) & Henri Bendel (1868-1936).

Beaton chose the combinations of fashion and painting and probably pairing Lavender Mist (1950, left) with a short black ball gown of silk paper taffeta with large pink bow at one shoulder and an asymmetrical hooped skirt by Bendel best illustrates the value of his trained eye.  Critics and social commentators have always liked these three pages, relishing the opportunity to comment on the interplay of so many of the clashing forces of modernity: the avant-garde and fashion, production and consumption, abstraction and representation, painting and photography, autonomy and decoration, masculinity and femininity, art and commerce.  Historians of art note it too because it was the abstract expressionism of the 1940s which was both uniquely an American movement and the one which in the post-war years saw New York supplant Paris as the centre of Western art.  There have been interesting discussions about when last it could be said Western art had a "centre".

Eye of the beholder: Portrait of Lindsay Lohan in the style of Claude Monet at craiyon.com and available at US$26 on an organic cotton T-shirt made in a factory powered by renewable energy.

Whether the arguments about what deserves to be called “art” began among prehistoric “artists” and their critics in caves long ago isn’t known but it’s certainly a dispute with a long history.  In the sense it’s a subjective judgment, the matter was doubtless often resolved by a potential buyer declining to purchase but during the twentieth century it became a contested topic and there were celebrated exhibits and squabbles which for decades played out before, in the post-modern age, the final answer appeared to be that something was art if variously (1) the creator said it was or (2) an art critic said it was or (3) it was in an art gallery or (4) the price tag was sufficiently impressive.

So what constitutes “art” is a construct of time, place & context which evolves, shaped by historical, cultural, social, economic, political & personal influences, factors which in recent years have had to be cognizant of the rise of cultural equivalency, the recognition that Western concepts such as the distinction between “high” (or “fine”) art and “folk” (or “popular”) art can’t be applied to work from other traditions where cultural objects are not classified by a graduated hierarchy.  In other words, everybody’s definition is equally valid.  That doesn’t mean there are no longer gatekeepers because the curators in institutions such as museums, galleries & academies all discriminate and thus play a significant role in deciding what gets exhibited, studied & promoted, even though few would now dare to suggest what is art and what is not: that would be cultural imperialism.

In the twentieth century it seemed to depend on artistic intent, something which transcended a traditional measure such as aesthetic value but as the graphic art in advertising and that with a political purpose such as agitprop became bigger, brighter and more intrusive, such forms also came to be regarded as art or at least worthy of being studied or exhibited on the same basis, in the same spaces as oil on canvas portraits & landscapes.  Once though, an unfamiliar object in such places could shock as French painter & sculptor Marcel Duchamp (1887-1968) managed in 1917 when he submitted a porcelain urinal as his piece for an exhibition in New York, his rationale being “…everyday objects raised to the dignity of a work of art by the artist's act of choice.”  Even then it wasn’t a wholly original approach but the art establishment has never quite recovered and from that urinal to Dadaism, to soup cans to unmade beds, it became accepted that “anything goes” and people should be left to make of it what they will.  Probably the last remaining reliable guide to what really is "art" remains the price tag.

1948 Cisitalia 202 GT (left; 1947-1952) and 1962 Jaguar E-Type (1961-1974; right), Museum of Modern Art (MoMA), New York City.

Urinals tend not to be admired for their aesthetic qualities but there are those who find beauty in things as diverse as mathematical equations and battleships.  Certain cars have long been objects which can exert an emotional pull on those with a feeling for such things and if the lines are sufficiently pleasing, many flaws in engineering are often overlooked.  New York’s Museum of Modern Art (MoMA) acknowledged in 1972 that such creations can be treated as works of art when they added a 1948 Cisitalia 202 GT finished in “Cisitalia Red” (MoMA object number 409.1972) to their collection, the press release noting it was “…the first time that an art museum in the U.S. put a car into its collection.”  Others appeared from time-to-time and while the 1953 Willys-Overland Jeep M-38A1 Utility Truck (MoMA object number 261.2002) perhaps is not conventionally beautiful, its brutish functionalism has a certain simplicity of form and in the exhibition notes MoMA clarified somewhat by describing it as a “rolling sculpture”, presumably in the spirit of a urinal being a “static sculpture”, both to be admired as pieces of design perfectly suited to their intended purpose, something of an art in itself.  Of the 1962 Jaguar E-Type (XKE) open two seater (OTS, better known as a roadster and acquired as MoMA object number 113.996), there was no need to explain because it’s one of the most seductive shapes ever rendered in metal.  Enzo Ferrari (1898-1988) attended the 1961 Geneva Motor Show (now defunct) when the Jaguar staged its stunning debut and part of E-Type folklore is he called it “the most beautiful car in the world”.  Whether those words ever passed his lips isn’t certain because the sources vary slightly in detail and il Commendatore apparently never confirmed or denied the sentiment but it’s easy to believe and many to this day agree just looking at the thing can be a visceral experience.  The MoMA car is finished in "Opalescent Dark Blue" with a grey interior and blue soft-top; there are those who think the exhibit would be improved if it was in BRG (British Racing Green) over tan leather but anyone who finds a bad line on a Series 1 E-Type OTS is truly an ultracrepidarian.   

Saturday, April 27, 2024

Molyneux

Molyneux (pronounced mol-un-ewe)

(1) A habitational surname of Norman origin, almost certainly from the town of Moulineaux-sur-Seine, in Normandy.

(2) A variant of the Old French Molineaux (an occupational surname for a miller).

(3) An Anglicized form of the Irish Ó Maol an Mhuaidh (descendant of the follower of the noble).

(4) In law in the state of New York, as the “Molineux Rule”, an evidentiary rule which defines the extent to which a prosecutor may introduce evidence of a defendant’s prior bad acts or crimes, not to show criminal propensity, but to “establish motive, opportunity, intent, common scheme or plan, knowledge, identity or absence of mistake or accident.”

(5) In philosophy, as the “Molyneux Problem”, a thought experiment which asks: “If someone born blind, who has learned to distinguish between a sphere and a cube by touch alone, were suddenly to gain the power of sight, would they be able to distinguish those objects by sight alone, based on memory of tactile experience?”

Pre 900: The French surname Molyneux was from the Old French and is thought to have been a variant of De Molines or De Moulins, both linked to "Mill" (Molineaux the occupational surname for a miller), although the name is believed to have been habitational, from an unidentified place in France; some genealogists have concluded the de Moulins came from Moulineaux-sur-Seine, near Rouen, Normandy.  Despite the continental origin, the name is also much associated with various branches of the family in England and Ireland, the earliest known references pre-dating the Norman Conquest (1066).  The alternative spelling is Molineux.

The "Molyneux Problem" is named after Irish scientist and politician William Molyneux (1656–1698) who in 1688 sent a letter to the English physician & philosopher John Locke (1632–1704), asking: could someone who was born blind, and able to distinguish a globe and a cube by touch, immediately distinguish and name these shapes by sight if given the ability to see?  Obviously difficult to test experimentally, the problem prompted one memorable dialogue between Locke and Bishop George Berkeley (1685–1753), who lent his name (pronounced phonetically) to the US university, but it has long intrigued those from many disciplines, notably neurology and psychology, because sight is such a special attribute, the eyes being an outgrowth of the brain; the experience of an adult brain suddenly being required to interpret visual input would be profound and certainly impossible to imagine.  Philosophers since Locke have also pondered the problem because it raises issues such as the relationship between vision and touch and the extent to which some of the most basic components of knowledge (such as shape) can exist at birth or need entirely to be learned or experienced.

The Molineux Rule in the adversarial system

The Molineux Rule comes from a decision handed down by the Court of Appeals of New York in the case of People v Molineux (168 NY 264 (1901)).  Molineux had at first instance been convicted of murder in a trial which included evidence relating to his past conduct.  On appeal, the verdict was overturned on the basis that as a general principle: “in both civil and criminal proceedings, that when evidence of other crimes, wrongs or acts committed by a person is offered for the purpose of raising an inference that the person is likely to have committed the crime charged or the act in issue, the evidence is inadmissible.”  The rationale for that is it creates a constitutional safeguard which acts to protect a defendant from members of a jury forming an assumption the accused had committed the offence with which they were charged because of past conduct which might have included being accused of similar crimes.  Modified sometimes by other precedent or statutes, similar rules of evidentiary exclusion operate in many common law jurisdictions.  It was the Molineux Rule lawyers for former film producer Harvey Weinstein (b 1952) used to have overturned his 2020 conviction for third degree rape.  In a 4:3 ruling, the court held the trial judge made fundamental errors in having “erroneously admitted testimony of uncharged, alleged prior sexual acts against persons other than the complainants of the underlying crimes because that testimony served no material non-propensity purpose” and that therefore the only remedy “for these egregious errors is a new trial”.

Harvey Weinstein and others.

Reaction to the decision of the appellate judges was of course swift and the opinion of the “black letter” lawyers was that the court was correct because “…we don't want a court system convicting people based on testimony about allegations with which they’ve not been charged”, added to which such evidence might induce a defendant not to submit to the cross-examination they’d have been prepared to undergo if only matters directly relevant to the charge(s) had been mentioned in court.  Although the Molineux Rule has been operative for well over a century, some did think it surprising the trial judge was prepared to afford the prosecution such a generous latitude in its interpretation but it should be noted the Court of Appeal divided 4:3 so there was substantial support from the bench for the view that what was admitted as evidence did fall within what are known as the “Molineux exceptions” which permit certain classes of testimony in what is known as “character evidence”.  That relies on the discretion of the judge who must weigh the value of the testimony against the prejudicial effect it will have on the defendant.  In the majority judgment, the Court of Appeal made clear that in the common law system (so much of which is based on legal precedent), if the trial judge’s decision on admissibility was allowed to stand, there could (and likely would) be far-reaching consequences and their ruling was based on upholding the foundations of the criminal justice system, stated in the opening paragraphs: "Under our system of justice, the accused has a right to be held to account only for the crime charged and, thus, allegations of prior bad acts may not be admitted against them for the sole purpose of establishing their propensity for criminality. It is our solemn duty to diligently guard these rights regardless of the crime charged, the reputation of the accused, or the pressure to convict."

The strict operation of the Molineux Rule (which this ruling will ensure is observed more carefully) does encapsulate much of the core objection to the way courts operate in common law jurisdictions.  The common law first evolved into something recognizable as such in England & Wales after the thirteenth century and it spread around the world as the British Empire grew and that included the American colonies which, after achieving independence in the late eighteenth century as the United States of America, retained the legal inheritance.  The common law courts operate on what is known as the “adversarial system” as opposed to the “inquisitorial system” of the civil system based on the Code Napoléon, introduced in 1804 by Napoleon Bonaparte (1769–1821; leader of the French Republic 1799-1804 & Emperor of the French from 1804-1814 & 1815) and widely used in Europe and the countries of the old French Empire.  The criticism of the adversarial system is that the rules are based on the same principle as many adversarial contests such as football matches where the point of the rules is to ensure the game is decided on the pitch and neither team has any advantage beyond their own skill and application.

That’s admirable in sport but many do criticize court cases being conducted thus, the result at least sometimes being decided by the skill of the advocate and their ability to persuade.  Unlike the inquisitorial system where the object is supposed to be the determination of the truth, in the adversarial system, the truth can be something of an abstraction, the point being to win the case.  In that vein, many find the Molineux Rule strange, based on experience in just about every other aspect of life.  Someone choosing a new car, a bar of chocolate or a box of laundry detergent is likely to base their decision on their knowledge of other products from the same manufacturer, either from personal experience or the result of their research.  Most consumer organizations strongly would advise doing exactly that yet when the same person is sitting on a jury and being asked to decide if an accused is guilty of murder, rape or some other heinous offence, the rules don’t allow them to be told the accused has a history of doing exactly that.  All the jury is allowed to hear is evidence relating only to the matter to be adjudicated.  Under the Molineux Rule there are exceptions which allow “evidence of character” to be introduced but as a general principle, the past is hidden and that does suit the legal industry which is about winning cases.  The legal theorists are of course correct that the restrictions do ensure an accused can’t unfairly be judged by past conduct but for many, rules which seem to put a premium on the contest rather than the truth must seem strange.

Sunday, April 14, 2024

Legside

Legside (pronounced leg-sahyd)

(1) In the terminology of cricket (also as onside), in conjunction with “offside”, the half of the cricket field behind the batter in their normal batting stance.

(2) In the terminology of horse racing, in conjunction with “offside”, the sides of the horse relative to the rider.

Pre 1800s: The construct was leg + side.  Leg was from the Middle English leg & legge, from the Old Norse leggr (leg, calf, bone of the arm or leg, hollow tube, stalk), from the Proto-Germanic lagjaz & lagwijaz (leg, thigh).  Although the source is uncertain, the Scandinavian forms may have come from a primitive Indo-European root used to mean “to bend” which would likely also have been linked with the Old High German Bein (bone, leg).  It was cognate with the Scots leg (leg), the Icelandic leggur (leg, limb), the Norwegian Bokmål legg (leg), the Norwegian Nynorsk legg (leg), the Swedish lägg (leg, shank, shaft), the Danish læg (leg), the Lombardic lagi (thigh, shank, leg), the Latin lacertus (limb, arm), and the Persian لنگ (leng).  After it entered the language, it mostly displaced the native Old English term sċanca (from which Modern English ultimately gained “shank”) which was probably from a root meaning “crooked” (in the literal sense of “bent” rather than the figurative sense used of crooked Hillary Clinton).  Side was from the Middle English side, from the Old English sīde (flanks of a person, the long part or aspect of anything), from the Proto-Germanic sīdǭ (side, flank, edge, shore), from the primitive Indo-European sēy- (to send, throw, drop, sow, deposit).  It was cognate with the Saterland Frisian Siede (side), the West Frisian side (side), the Dutch zijde & zij (side), the German Low German Sied (side), the German Seite (side), the Danish & Norwegian side (side) and the Swedish sida (side).  The Proto-Germanic sīdō was productive, being the source also of the Old Saxon sida, the Old Norse siða (flank; side of meat; coast), the Danish & Middle Dutch side, the Old High German sita and the German Seite.  Legside is a noun & adjective.

A cricket field as described with a right-hander at the crease (batting); the batter will be standing with their bat held to the offside (there’s no confusion with the concept of “offside” used in football and the rugby codes because in cricket there’s no such rule).

In cricket, the term “legside” (used also as “leg side” or “on side”) is used to refer to the half of the field corresponding to a batter’s non-dominant hand (viewed from their perspective); the legside can thus be thought of as the half of the ground “behind” the batter while the “offside” is that in front.  This means that what is legside and what is offside is dynamic depending on whether the batter is left or right-handed and because in a match it’s not unusual for one of each to be batting during an over (the basic component of a match, each over now consisting of six deliveries of the ball directed sequentially at the batters), as they change ends, legside and offside can swap.  This has no practical significance except that many of the fielding positions differ according to whether a left or right-hander is the striker.  That’s not the sole determinant of where a fielding captain will choose to set his field because what’s referred to as a “legside” or “offside” field will often be used in deference to the batter’s tendencies of play.  It is though the main structural component of field settings.  The only exception to this is when cricket is played in unusual conditions such as on the deck of an aircraft carrier (remarkably, it’s been done quite often) but there’s still a legside & offside, shifting as required between port & starboard just as left & right are swapped ashore.

The weird world of cricket's fielding positions.

Quite when legside & offside first came to be used in cricket isn't known but they’ve been part of the terminology of the sport since the rules of the game became formalized when the MCC (Marylebone Cricket Club) first codified the "Laws of Cricket" in what now seems a remarkably slim volume published in 1788, the year following the club’s founding.  There had earlier been rule books, the earliest known to have existed in the 1730s (although no copies appear to have survived) but whether the terms were then in use isn’t known.  What is suspected is legside and offside were borrowed from the turf where, in horse racing jargon, they describe the sides of the horse relative to the rider.  The use of the terms to split the field is reflected also in the names of some of the fielding positions, many of which are self-explanatory while some remain mysterious although presumably they must have seemed a good idea at the time.  One curious survivor of the culture wars which banished "batsman" & "fieldsman" to the shame of being microaggressions is "third man" which continues to be used in the men's game although in women's competition, all seem to have settled on "third", a similar clipping to that which saw "nightwatch" replace "nightwatchman"; third man surely can't last.  The ones which follow the dichotomous description of the field (although curiously “leg” is an element of some and “on” for others) include the pairings “silly mid on & silly mid off” and “long on & long off”, while in other cases the “leg” is a modifier, thus “slip & leg slip” and “gully & leg gully”.  Some positions use different terminology depending on which side of the field they’re positioned, “point” on the offside being “square leg” on the other while fractional variations in positioning means there is lexicon of terms such as “deep backward square leg” and “wide long off” (which experts will distinguish from a “wideish long off”).

Leg theory

Leg theory was a polite term for what came to be known as the infamous “bodyline” tactic.  In cricket, when bowling, the basic idea is to hit the stumps (the three upright timbers behind the batter), the object being to dislodge the bails (the pair of small wooden pieces which sit in grooves, atop the three).  That done, the batter is “dismissed” and the batting side has to send a replacement, this going on until ten batters have been dismissed, ending the innings.  In essence therefore, the core idea is to aim at the stumps but there are other ways to secure a dismissal such as a shot by the batter being caught on the full by a fielder, thus the attraction of bowling “wide of the off-stump” (the one of the three closest to the off side) to entice the batter to hit a ball in the air to be caught or have one come "off the edge" of the bat to be “caught behind”.  It was realized early on there was little to be gained by bowling down the legside except restricting the scoring because the batter safely could ignore the delivery, content they couldn’t be dismissed LBW (leg before wicket, where but for the intervention of the protective pads on the legs, the ball would have hit the wicket) because, under the rules, if the ball hits the pitch outside the line of the leg stump, the LBW rule can’t be invoked.

A batter can however be caught from a legside delivery and as early as the nineteenth century this was known as leg theory, practiced mostly by the slow bowlers who relied on flight in the air and spin off the pitch to beguile the batter.  Many had some success with the approach, the batters unable to resist the temptation of playing a shot to the legside field where the fielders tended often to be fewer.  On the slower, damper pitches of places like England or New Zealand, the technique offered little prospect for the fast bowlers who were usually more effective the faster they bowled but on the generally fast, true decks in Australia, there was an opportunity because a fast, short-pitched (one which hits the pitch first in the bowler’s half of the pitch before rearing up towards the batter) delivery with a legside line would, disconcertingly, fly at upwards of 90 mph (145 km/h) towards the batter’s head.  The idea was that in attempting to avoid injury by fending off the ball with the bat, the batter would be dismissed, caught by one of the many fielders “packed” on the legside, the other component of leg theory.

Leg theory: Lindsay Lohan’s legs.

For this reason it came to be called “fast leg theory” and it was used off and on by many sides (in Australia and England) during the 1920s but it gained its infamy (and the more evocative “bodyline” label) during the MCC’s (the designation touring England teams used until the 1970s) 1932-1933 Ashes tour of Australia.  Adopted as a tactic against the Australian batter Donald Bradman (1908–2001) against whom nothing else seemed effective (the English noting on the 1930 tour of England he’d once scored 300 runs in a day off his own bat at Leeds), bodyline became controversial after a number of batters were struck high on the body, one suffering a skull fracture (this an era in which helmets and other upper-body protection were unknown).  Such was the reaction that the matter became a diplomatic incident, discussed by the respective cabinets in London and Canberra while acerbic cables were exchanged between the ACBC (Australian Cricket Board of Control) and the MCC.

Japanese leg theory: Zettai ryōiki (絶対領域) is a Japanese term which translates literally as “absolute territory” and is used variously in anime, gaming and the surrounding cultural milieu.  In fashion, it refers to that area of visible bare skin above the socks (classically the above-the-knee variety) but below the hemline of a miniskirt, shorts or top.

Japanese schoolgirls, long the trend-setters of the nation's fashions, like to pair zettai ryōiki with solid, fluffy (also called "plushies") leg warmers.  So influential are they that the roaming pack in this image, although they've picked up the aesthetic, are not actually real schoolgirls.  So, beware of imitations: Tokyo, April 2024.

High-level interventions calmed things sufficiently for the tour to continue, and it ended with the tourists winning the series (and thus the Ashes) 4-1.  The tour remains the high-water mark of fast leg theory because although it continued to be used when conditions were suitable, the effectiveness was stunted by batters adjusting their techniques and, later in the decade, the MCC updated their rule book explicitly to proscribe “direct attack” bowling (ie deliveries designed to hit the batter rather than the stumps), leaving the judgment of what constituted that to the umpires.  Although unrelated and an attempt to counter the “negative” legside techniques which had evolved in the 1950s to limit scoring, further rule changes in 1957 banned the placement of more than two fielders behind square on the leg side, thus rendering impossible the setting of a leg theory field.  Despite all this, what came to be called “intimidatory short pitched bowling” continued, one of the reasons helmets began to appear in the 1970s and the rule which now applies is that only one such delivery is permitted per over.  It has never been a matter entirely about sportsmanship and within the past decade, the Australian test player Phillip Hughes (1988-2014) was killed when struck on the neck (while wearing a helmet) by a short-pitched delivery which severed an artery.

Monday, March 18, 2024

Impeach

Impeach (pronounced im-peech)

(1) To accuse (a public official) before an appropriate tribunal of misconduct in office.

(2) In law, as “to impeach a witness”, to demonstrate in court that testimony given under oath contradicts other testimony from the same person, usually testimony taken during a deposition.

(3) To bring an accusation against; to call in question; to cast an imputation upon.

(4) In British criminal law, to accuse of a crime, especially of treason or some other offence against the state

(5) In the US and some other jurisdictions, to charge (a public official) with an offence committed in office.

(6) To hinder, impede, or prevent (archaic).

(7) To call to account (now rare).

1350–1400: From the Middle English empechen & enpeshen, from the Anglo-French empecher (to hinder) from the Old French empeechier from the Late Latin impedicāre (to fetter, trap, entangle or catch), the construct being im- + pedic(a) (a fetter (derivative of pēs (foot))) + -ā- (a thematic vowel) + -re (the Latin infinitive suffix) and cognate with the French empêcher (to prevent); the most usual Latin forms were impedicō & impedicāre.  Impeach is a verb, impeachment & impeachability are nouns, impeaching & impeached are verbs and impeachable & impeachmentworthy are adjectives (although not all authorities acknowledge the latter as a standard form); the noun plural is impeachments.

An English import the Americans made their own 

Although most associated with the US where the constitution permits the House of Representatives to impeach government officials (most notably the president) and send them for trial in the Senate, the concept of impeachment is a borrowing from the procedures of the UK Parliament.  Always a rare mechanism, impeachment was first used in England in 1376 with the last UK case in 1806 and while technically extant, is probably obsolete although it’s not unknown for relics of the UK’s long legal past occasionally to be resuscitated.  What is more likely is that matters once dealt with by impeachment would now be brought before a court although most historians and constitutional lawyers seem to believe it remains part of UK constitutional law and abolition would demand legislation.  That was exactly what select committees recommended in 1967 and again ten years later but nothing was done and despite the New Labour government (1997-2010) imposing some quite radical structural changes on the legal system, the mechanism of impeachment remained untouched.  In September 2019, it was reported that opposition politicians in the House of Commons were considering impeachment proceedings against Boris Johnson (b 1964; UK prime-minister 2019-2022) "on charges of gross misconduct in relation to the unlawful prorogation of parliament", as well as his threat to break the law by failing to comply with the European Union (Withdrawal) (No. 2) Act 2019 (which required the prime-minister in certain circumstances to seek an extension to the Brexit withdrawal date of 31 October 2019).  Mr Johnson survived that one though it proved a temporary reprieve for his premiership.

Although the Sturm und Drang of Donald Trump’s (b 1946; US president 2017-2021) unprecedented two impeachments was entertaining for political junkies, as a spectacle the two trials were muted affairs because the verdicts were both predictable.  Under the US Constitution, the House of Representatives has the “sole Power of Impeachment” (essentially a form of indictment in other proceedings) while the Senate is vested with “the sole Power to try all Impeachments”.  An act of impeachment requires only a majority vote on the floor of the House but conviction in the Senate demands “the concurrence of two thirds of the members present”.  Given the numbers and the state of partisanship which these days characterizes the two-party system, nobody in Washington DC believed there was even a vague prospect of Mr Trump being convicted.  Still, the dreary, confected, set-piece speeches on both sides were like slabs of raw meat thrown to the attack dogs watching Fox News and NBC so in that sense it was a kind of substitute for what the Founding Fathers might have hoped would have been the standard of debate in the Congress, 250-odd years on.  In an ominous sign, the Republicans have since made attempts to stage a retaliatory impeachment trial of Joe Biden (b 1942; US president since 2021) despite knowing there is no prospect of a conviction.  Political scientists have expressed concern this may be a harbinger of something like the situation in some countries (such as Pakistan & Bangladesh (the old West & East Pakistan)) where it is almost a form of ritualized revenge to pursue one's predecessor through the courts, jailing them if possible.  The hope is that such a culture might be peculiar to the Trump era and something less confrontational might emerge when he leaves the stage although what he has threatened in a second term does sound like he has vengeance on his mind.

Lindsay Lohan and her lawyer in court, Los Angeles, December 2011. 

The best impeachment in the US was the one which never was, the one Richard Nixon (1913-1994; US president 1969-1974) avoided by resigning the presidency on 9 August 1974.  That an impeachment became inevitable was Nixon’s own fault.  The evidence of those acts of Nixon which met the standard of “Treason, Bribery, or other high Crimes and Misdemeanors” existed only on the tapes which came to the knowledge of those investigating the White House’s involvement in the Watergate affair only through a chance remark by an aide; prior to that, knowledge of the president’s recording mechanism had been restricted to a small circle around Nixon.  There was a wealth of other material which hinted or suggested there may have been unlawful acts by Nixon but what was lacking was what came to be called the “smoking gun”, the undeniable proof.  That proof was on the tapes and as soon as knowledge of them became public, Nixon should have destroyed them and the ways and means existed close to home.  Even in oppressively hot Washington summers, Nixon would have the air-conditioning turned high to provide a wintery ambiance and have a log fire burning in the fireplace, close to which he would sit while writing his notes on yellow legal pads; it was a lifelong habit.

Washington Post 7 August 1974.

The tapes should have been tossed into that fire and that would have solved the problem: no smoking tape, no smoking gun.  It would of course have created other problems but they were political and could be handled in a way legal difficulties could not.  However, as soon as the tapes were subpoenaed they became evidence and their destruction would have been an obstruction of justice or worse.  Nixon had a narrow window of opportunity and didn’t take it, apparently convinced the doctrine of executive privilege would operate to ensure he wasn’t required to surrender the tapes to the investigators although in some of his subsequent writings he also maintained he genuinely believed they contained nothing which could cause him problems.  Given he genuinely would have had no knowledge of what exactly was on the tapes, that is at least plausible but all the material since published suggests his opinion of the protection executive privilege affords a president was the critical factor.  As it was, the US Supreme Court (SCOTUS) limited the application of the doctrine and compelled Nixon to hand over the tapes.

New York Times, 9 August 1974.

With the release of the “smoking gun tape” which contained recordings proving Nixon was implicated in the cover-up of the involvement in the Watergate break-in by staff connected to the White House, his support in the Congress collapsed and those Republican representatives who previously had refused to vote for impeachment switched sides; the same day, after sounding out the numbers in the Senate, a delegation of senior Republican senators told the president he would be convicted and by a decisive margin.  What was revealed on the tapes was enough to seal his fate but the verdict of history might have been worse still.  To this day, mystery surrounds one tape in particular, a recording of a discussion between Nixon and HR Haldeman (1926–1993; White House chief of staff 1969-1973) on 20 June 1972, three days after the Watergate break-in.  Of obviously great interest, when reviewed, there was found to be a gap of 18½ minutes, the explanations offered of how, why or by whom the erasure was effected ranging from the humorously accidental to the darkly conspiratorial but half a century on, it remains a mystery.  Taking advantage of new data-recovery technology, the US government did in subsequent decades make several attempts to “un-delete” the gap but without success and it may be, given the nature of magnetic tape, that there is literally nothing left to find.  However, the tape is stored in a secure, climate-controlled facility in case technical means emerge and while it’s unlikely the contents would reveal anything not already known or assumed, it would be of great interest to historians.  What would be even more interesting is the identity of who it was that erased the famous 18½ minutes but that will likely never be known; after fifty years, it’s thought that were there to be any death-bed confessions, they should by now have been heard.  Some have their lists of names of those who might have "pressed the erase button" and while mostly sub-sets of Watergate's "usual suspects", one who tends not to appear is Nixon himself, the usual consensus being he was technically too inept to operate a tape machine though it's not impossible he ordered someone to do the deed.  However it happened, the suspects most often mentioned as having had their "finger on the button" (which may have been a foot-pedal) are Nixon's secretary and his chief of staff.

On 8 August 1974, Nixon resigned his office, effective the next day, saying in conclusion during his nationally televised speech:

To leave office before my term is completed is abhorrent to every instinct in my body. But as President, I must put the interest of America first. America needs a full-time President and a full-time Congress, particularly at this time with problems we face at home and abroad. To continue to fight through the months ahead for my personal vindication would almost totally absorb the time and attention of both the President and the Congress in a period when our entire focus should be on the great issues of peace abroad and prosperity without inflation at home. Therefore, I shall resign the Presidency effective at noon tomorrow. Vice President Ford will be sworn in as President at that hour in this office.

Herblock's (Herbert Block; 1909–2001) Watergate affair-era take on Richard Nixon's then novel position on the presidency and the US Constitution, Washington Post, 13 March 1974.  The cartoon has been noted by some in the light of Donald Trump's comments about the extent of presidential immunity.

Wednesday, March 13, 2024

Nerd

Nerd (pronounced nurd)

(1) A person obsessed with a hobby or pursuit or with a particular topic, most associated with IT related or non-fashionable matters.

(2) A person thought socially awkward, boring, unstylish etc (used in both an affectionate and a derogatory sense and also as a self-descriptor by nerds proud of their status (and debatably by those who aspire to be accepted as part of the nerdhood)).

(3) To spend an inordinate amount of time on, or devote extraordinary attention, energy, enthusiasm etc to, an activity or topic of special or obsessive interest to oneself (sometimes used interchangeably with geek and often in conjunction with “nerd out” or “nerding”).

1951: An Americanism described best as an “obscurely derived expressive formation” (the etymology thus unknown) but it seems agreed it began as US student slang.  The rare spelling nurd was either a mistake (probably an imperfect echoic) or an attempt at nuance although the purpose remains obscure while the forms nerdic, nerdism, nerdling, nerdlet, nerdsome & nergasm are usually regarded as non-standard parts of IT slang; arachnerd & cybernerd are both generally recognized, probably because of the long history of use.  Nerd has been widely adopted in other languages, usually unaltered and apparently always in the sense of a “computer geek” while as an acronym, NERD is used for Non-Erosive Reflux Disease, Non-Ester Renewable Diesel, Network Event Recording Device, Nucleic Exchange Research & Development & Neuro-Evolutionary Rostral Developer.  In IT slang, the acronym can decode as Network Emergency Repair Dude & Network Emergency Repair Diva.  Nerd is a noun & verb, nerding & nerded are verbs, nerdy, nerdish, nerdlike & nurdish are adjectives (nerdesque is non-standard); the noun plural is nerds.

If I Ran the Zoo by Dr Seuss (1950)

The word (in capitalized form) appeared in 1950 in the children’s book If I Ran the Zoo by Dr Seuss (Theodor Seuss Geisel; 1904–1991) who used it as the name of one of his imagined animals:

And then, just to show them, I’ll sail to Katroo
And bring back an It-Kutch, a Preep and a Proo,
A Nerkle, a Nerd and a Seersucker too!

All the evidence suggests Dr Seuss chose “Nerd” because he liked the word and it suited his sentence structure but there has been speculation about the etymology.  One suggestion was the character of Mortimer Snerd, a ventriloquist's dummy created by Edgar Bergen (1903-1978), who was versatile enough to also build a career in radio.  Snerd was the archetypical hillbilly (a “country bumpkin” to English audiences), a species derided as tiresome or dull, these qualities magnified by his sophisticated foil, the dummy Charlie McCarthy.  One can see the point but there’s nothing to support the connection.

A year after the publication of If I Ran the Zoo, Newsweek magazine ran a piece about the latest slang terms (the linguistic melting pot of the war years had seen both a proliferation and the geographical spread of the forms) and included was “nerd”, listed as having currency in the Detroit region and used in the same sense as “someone who once would be called a drip or a square” although they added that for the less severe cases, “scurve” seemed to suffice.  From Michigan it must have spread because by the 1960s the word had migrated from lists of slang to more general use and, being the pre-internet era, it was transmitted often orally, thus the appearance of the spelling “nurd” although by the following decade, when frequently it was seen in print, the current spelling was almost universal.  Etymologists date nerd as an established colloquial form from this decade, noting that despite the modern association, it initially had nothing to do with computers and the accepted connotation became “socially inept but brainy”, juxtaposed often in campus use with the “jock” (stereotypically there on a sports scholarship) who excelled in sports (and by implication the conquest of female students) but whose academic aptitudes were slight.

The Nerd as imagined by Dr Seuss (left), Bill Gates (b 1955), the defining nerd of the late twentieth century (centre) and John McAfee (1945–2021), the nerd’s anti-nerd (right).

The link between the nerd imagined by Dr Seuss and the notion of squareness has attracted interest but the character in the book looks more bad-tempered than socially inept although one can perhaps see some resemblance to John McAfee (1945–2021), Bill Gates (b 1955) et al; that though is very much something retrospective and there’s nothing to support any degree of connection between “nerd” and computing until the 1980s when PCs entered the consumer electronics market.  There has been speculation Dr Seuss mentioned the “Seersucker” in the same sentence as the one introducing the Nerd as an attempt to harden the link with “squareness” (seersucker being, in the view of the young, a most uncool fabric) but that seems too clever by half and few have any doubt the author invented or chose the words to suit the rhythm of the text.

Inside Lindsay Lohan there's a nerd trying to escape: In nerd glasses, LAX, February 2012.

Another theory is that nerd was a piece of wordplay, an alteration of nerts, a slang form from the early twentieth century applied to things thought extraordinary (as in “that movie was the nerts”) or used as an interjection like “nuts!”.  An alternative idea was that it was a re-bracketing of "inert" in which “they’re inert” became “they’re a nerd”, the same process which early meme-makers used to take “be alert” and render it as “be a lert; the world needs more lerts”.  In the case of “inert” begetting “nerd”, again, there’s no supporting evidence.  The ultimate folk etymology tale was probably that nerd developed from the campus slang knurd (“drunk” written backwards), the implication being that while a drunken student is obviously cool, the sober knurd would sooner study than party, the distinction explored by Boris Johnson (b 1964; UK prime-minister 2019-2022) who labelled David Cameron (b 1966; UK prime-minister 2010-2016) a “girly swot” to rationalize why Cameron got a First at Oxford and Johnson a Second.  It’s an attractive theory but without any evidence.  Nor is there any support for the notion of a link between nerd and “turd” (shit) or merde (a vulgar French word for “shit”).  There is however no doubt the 1980s slang “nerd pack” referred to the combination of a pocket protector (so the pens wouldn’t leak ink onto a nerd’s polyester shirt) and big lens spectacles with conspicuously unattractive frames although that showed a fundamental misunderstanding of nerd culture: nerds know pencils are much better than pens.

Lindsay Lohan nerding up on rest.

In idiomatic use, to “nerd out” is enthusiastically to immerse oneself in one’s interest or to hold an extended conversation (which may often be a monologue) on the topic.  The best nerd outs can last a day or more; the past tense is “nerded out”, modified when emphasis is demanded as “nerded out hardcore”, “totally nerded out” or “nerded out big time”.  To “nerd up” can mean variously (1) to augment one’s surroundings with the imagery or objects associated with one’s interest, (2) to cram study of some topic for some purpose (an exam, an upcoming date etc) and (3) to describe a discussion which evolved unexpectedly into something highly specific (usually as “nerded up”).

Richard Nixon (left) with Henry Kissinger (1923-2023; US national security advisor 1969-1975 & secretary of state 1973-1977, right), the White House, October 1973.  Dr Kissinger was a policy wonk who became one of history's more improbable sex symbols.

There are a number of words which are used to convey something similar to nerd including geek, wonk & dork.  A word like anorak (mostly UK) is similar but has a different emphasis.  Historically a nerd is someone with an inclination to study, often subjects with technical focus or something truly arcane.  The modern association is with science, mathematics, computers and such but there are poetry nerds and those who nerd out on the strains of Karlheinz Stockhausen (1928–2007), Charles Ives (1874–1954) and Philip Glass (b 1937).  The association with social ineptitude seems no less prevalent.  Geeks are like nerds in that they are obsessive about their specific interests but these niches may be far removed from computer code or respectable academic pursuits and may include comic books, the film franchise Star Wars, baseball statistics or video games.  Often geeks are highly social but many would prefer they were not because their interests are their sole topic of conversation; they’re best left alone with each other.  Wonks are different again and the term has evolved to be used usually as “policy wonk”, describing a particular political creature who is genuinely interested in and has expertise related to specialized fields such as trade, agriculture and other important if dismal matters.  The political operatives admire the wonks and value them for doing the hard work which involves reading long documents of mind-numbing complexity.  Policy wonks think such papers are great.  Finally, there are dorks.  Dorks may or may not be nerds, geeks or wonks and are defined wholly by their social awkwardness and clumsy manners although in the early 1950s, in US slang a dork was “an effeminate male”.  Other slurs, more offensive still, took its place and in less than a decade, dork seems exclusively to have assumed the sense of “social ineptitude and poor taste in clothing”.  Interestingly, although the reign of the policy wonks in government can be said to have begun during the administrations of John Kennedy (JFK, 1917–1963; US president 1961-1963), Lyndon Johnson (LBJ, 1908–1973; US president 1963-1969) & Richard Nixon (1913-1994; US president 1969-1974), the term entered mainstream pop-culture under Bill Clinton (b 1946; US president 1993-2001).