
Monday, October 31, 2022

Halloween

Halloween (pronounced hal-uh-ween or hal-oh-een)

The evening of 31 October.  Historically it was celebrated mostly in the UK, Canada, the US and Ireland, but it spread to Scandinavia and Australia and can now be found in many countries, some participants presumably unaware of its history.

Circa 1745: From the festival All Hallows Even (also as Hallow-e'en & Hallow e'en), celebrated as a popular holiday on the last night of October (the eve of All Saints’ Day).  All Hallows’ Eve was the evening before All Saints’ Day, from the Old English ealra halgena mæssedæg (All Hallows' Mass-day); the literal meaning is "hallowed evening" or "holy evening".  The contracted form derived from the Scottish Allhallowe'en, although throughout the British Isles the date had long been noted in the calendar as "the evening before All-Hallows".  In Scots, the word eve is even, and this was contracted to e'en or een, eventually becoming Hallowe'en.  Hallow was from the otherwise obsolete Middle English noun halwe (holy person, saint), from the Old English halga, the source also of the verb hallow.

A traditional Jack O'Lantern of the kind hung throughout Scotland and Ireland to ward off evil spirits.  Pumpkins, bigger and easier to carve, came later but aesthetically a turnip makes sense because its shape more closely resembles a human skull.

The idea of "All Hallows'" existed in Old English but "All Hallows' Eve" didn’t appear until 1556.  All-Hallows is from the Middle English al-halwe, from the late Old English ealra halgan (all saints, the saints in heaven collectively) and this was both the name of the feast day and of individual churches.  In the regions of the British Isles the fests were celebrated on various days (influenced as in pagan times by the rhythm of the seasons and the demands placed on the allocation and location of labor) but in the Church records the date 31 October was being described as alle halwe eue by the early twelfth century.  The term “Hallow-day” for "All-Saints Day" is from 1590s, replacing the late thirteenth century halwemesse day.  The consequential Hallowtide (the first week of November) emerged in the mid-fifteenth century.

In pagan times it was the last night of the year in the old Celtic calendar, when it was Old Year's Night (a night for witches), and Halloween is thus another of the pagan festivals essentially taken over and re-branded by Christianity.  Because of the association with witches the day was always associated with magic and sorcery and it was this tradition which inspired Robert Burns’ (1759-1796) poem Halloween, penned in 1785 and first published in the Kilmarnock Volume (1786).  Of twenty-eight stanzas (epic length by Burns’ standards) and written in a mix of Scots and English, it shows the clear influence of the twelve-stanza Hallow-E'en (1780) by John Mayne (1759–1836) and the spirit of the evening is captured in Burns’ words which suggest Halloween is "thought to be a night when witches, devils, and other mischief-making beings are all abroad on their baneful midnight errands".

Off to the party.  Lindsay Lohan entering the Cuckoo Club Halloween Party, 31 October 2018.

Although most associated with children going door-to-door in costume demanding candy with the (usually implied) menace of some minor prank if denied (hence trick-or-treat), this aspect is of US origin and dates only from the 1930s.  In these modern, litigious times, children are encouraged to be pragmatic, cut their losses and seek more treats from the more generous rather than visit tricks upon the parsimonious.

Like a number of the festivals in the Christian calendar, it’s a borrowing from pagan rituals, this one the last night of the year in the old Celtic calendar, where it was Old Year's Night, treated as a night for witches, hence the tradition of costumes in this theme with pumpkins carved in demonic form (although the original Jack O'Lanterns in Scotland were turnips rather than pumpkins).  The Christian feast of 31 October begins the three-day observance of Allhallowtide which, in the western liturgical calendar, is dedicated to the remembrance of the dead, including saints (hallows), martyrs, and all the departed faithful.  The view that Halloween is a lineal descendant of old pagan festivals, especially the Gaelic Samhain, is generally accepted, it being one of many Christianized by the early Church which found it more profitable to accommodate rather than suppress popular, unthreatening traditions.  However, there’s always been a purist sect within the Church which has denied the pagan link and insists Halloween’s origins are wholly Christian.  Modern capitalism is neutral on this, the day just another secular event during which much stuff can be sold, and one unusual in that, in the United States, it’s the only event on the calendar free from some sort of moral or spiritual baggage.  Many abstained from meat on All Hallows' Eve, a tradition which endures in the vegetarian dishes of this vigil day such as potato pancakes, toffee-apples and soul cakes.

Pumpkin carving can reflect many influences including the Greek letter π (pi) (left), Lego (centre) and Kim Kardashian (right).

Upon that night, when fairies light
On Cassilis Downans dance,
Or owre the lays, in splendid blaze,
On sprightly coursers prance;
Or for Colean the route is ta'en,
Beneath the moon's pale beams;
There, up the cove, to stray and rove,
Among the rocks and streams
To sport that night.

Among the bonny winding banks,
Where Doon rins, wimplin' clear,
Where Bruce ance ruled the martial ranks,
And shook his Carrick spear,
Some merry, friendly, country-folks,
Together did convene,
To burn their nits, and pou their stocks,
And haud their Halloween
Fu' blithe that night.

Opening stanzas of Halloween by Robert Burns.

Samhainophobia trigger: posters for the 1978 movie Halloween.

One general principle (certainly in the West) which may be gleaned from the work of phenomenologists is that where a cultural practice exists, there may be an associated phobia.  The morbid fear of Halloween is known as samhainophobia, the construct being the Celtic samhuin (from sam (summer) + fuin (end)) + -phobia.  The suffix -phobia (fear of a specific thing; hate, dislike, or repression of a specific thing) was from the New Latin, from the Classical Latin, from the Ancient Greek -φοβία (-phobía) and was used to form nouns meaning fear of a specific thing (the idea of a hatred came later).  The name of the festival Samhuin was from the earlier Samfuin, from the Old Irish.  Samhainophobia can be triggered by many things including the general fear of ghosts, witches, skeletons, spiders, black cats, bats, vampires and any of the other spooky stuff associated with Halloween; the representations in popular culture (axe murderers and such) presumably reinforce these fears.  Although the research seems sparse, it seems likely the symptoms of the condition would be not dissimilar to those suffered by victims of related phobias including phasmophobia (fear of ghosts), wiccaphobia (fear of witches and witchcraft), sanguivoriphobia (fear of vampires), chiroptophobia (fear of bats), nyctophobia (fear of darkness), arachnophobia (fear of spiders), skelephobia (fear of skeletons), placophobia (fear of tombstones), and michaelmyersphobia (fear of Michael Myers).

Wednesday, December 13, 2023

Autophagia

Autophagia (pronounced aw-tuh-fey-juh or aw-tuh-fey-jee-uh)

(1) In cytology, the process of self-digestion by a cell through the action of enzymes originating within the same cell (the controlled digestion of damaged organelles within a cell which is often a defensive and/or self-preservation measure and associated with the maintenance of bodily nutrition by the metabolic breakdown of some bodily tissues).

(2) In cytology, a type of programmed cell death accomplished through self-digestion (distinct from apoptosis, the other principal form of programmed cell death, and associated with the maintenance of bodily nutrition by the metabolic breakdown of some bodily tissues).

(3) In psychiatry, self-consumption; the act of eating oneself.

The construct was auto- + -phagia.  The auto- prefix was a learned borrowing from the Ancient Greek αὐτο- (auto-) (self-) (reflexive, regarding or to oneself (and most familiar in forms like autobiography)), from αὐτός (autós) (himself/herself/oneself), from either (1) a construct of the primitive Indo-European hew (again) + to- (that) or (2) the Ancient Greek reflexes of those words, αὖ (aû) (back, again, other) + τόν (tón) (the), and related to the Phrygian αυτος (autos), the existence of alternatives suggesting there may have been a common innovation.  Phagia was from the Ancient Greek -φαγία (-phagía) (and related to -φάγος (-phágos) (eater)), the suffix corresponding to φαγεῖν (phageîn) (to eat), the infinitive of ἔφαγον (éphagon) (I ate), which serves as aorist (a verb paradigm found in certain languages, usually an unmarked form or one that expresses the perfective or aorist aspect) for the defective verb ἐσθίω (esthíō) (I eat).  The alternative spelling is autophagal and the synonyms (sometimes used in non-specialist contexts) are self-consumption & auto-cannibalism.  Autophagia, autophagophore, autophagosome & autophagy are nouns, autophagically is an adverb, autophagocytotic is an adjective and autophagic is an adjective (and a non-standard noun); the noun plural is autophagies.

In cytology (in biology, the study of cells), autophagy is one aspect of evolutionary development, a self-preservation and life-extending mechanism in which damaged or dysfunctional parts of a cell are removed and used for cellular repair.  Internally, it’s thus beneficial, the removal or recycling of debris both efficient and (by this stage of evolutionary development) essential, most obviously because it removes toxins and effectively “creates” younger cells from the old; it can thus be thought an anti-aging mechanism.  It is something which has also interested cancer researchers because all cancers (as the word and the parameters of the disease(s) are defined) start from some sort of cell-defect and the speculation is it might be possible in some way to adapt the autophagic process, re-purposing it to identify and remove suspect cells.

In psychiatry, autophagia refers to the act of eating oneself, sometimes described as self-consumption or the even more evocative auto-cannibalism.  Perhaps surprisingly, the behavior is not explicitly mentioned in the American Psychiatric Association's (APA) Diagnostic and Statistical Manual of Mental Disorders (DSM) which of course means there are no published diagnostic criteria nor recommendations for treatment.  The DSM’s editors note there are a number of reasons why a specific behavior may not be included in the manual, notably (1) the lack of substantial empirical evidence or research, (2) the rarity of cases and (3) the material to hand being unsuitable (in terms of volume or quality) for the development of practical tools for clinicians to diagnose and treat a disorder.

It would be flippant to suggest autophagia might have been included when the fifth edition of the DSM (DSM-5 (2013)) took a more systematic approach to eating disorders; as well as definitional criteria being set for the range of behaviours within that general rubric, just about every other form of “unusual” consumption was listed, including sharp objects (acuphagia), purified starch (amylophagia), burnt matches (cautopyreiophagia), dust (coniophagia), feces (coprophagia), vomit (emetophagia), raw potatoes (geomelophagia), soil, clay or chalk (geophagia), glass (hyalophagia), stones (lithophagia), metal (metallophagia), mucus (mucophagia), ice (pagophagia), lead (plumbophagia), hair, wool, and other fibres (trichophagia), urine (urophagia), blood (hematophagia (sometimes called vampirism)) and wood or derivatives such as paper & cardboard (xylophagia).  The DSM-5 also codified the criteria for behaviour to be classified as pica (a disorder characterized by craving and appetite for non-edible substances, such as ice, clay, chalk, dirt, or sand and named for the jay or magpie (pīca in Latin), based on the idea the birds will eat almost anything): it must (1) persist for more than one month beyond the age in infancy when eating such objects is not unusual, (2) not be a culturally sanctioned practice and (3), in quantity or consequence, be of sufficient severity to demand clinical intervention.  However, pica encompassed only “non-nutritive substances”, which of course one’s own body parts are not.

Finger food: Severed fingers are a popular menu item for Halloween parties; kids think they're great.  For those who like detail, those emulating nail polish seem to be following Dior shades 742 (top right) and 999 (bottom right). 

In the profession, autophagia seems to be regarded not as a progression from eating one's fingernails or hair but as something with more in common with the behavior of cutters.  Cutters are the best known example of self-harmers, the diagnosis of which is described in the DSM as non-suicidal self-injury (NSSI).  NSSI is defined as the deliberate, self-inflicted destruction of body tissue without suicidal intent and for purposes not socially sanctioned; it includes behaviors such as cutting, burning, biting and scratching skin.  Behaviorally, it’s highly clustered, instances especially prevalent during adolescence and the majority of cases female, although there is some evidence instances among males may be under-reported.  It’s a behavior which has long interested and perplexed the profession because, as something which involves deliberate and intentional injury to body tissue in the absence of suicidal intent, (1) it runs counter to the fundamental human instinct to avoid injury and (2) as defined the injuries are never sufficiently serious to risk death, a well-understood reason for self-harm.  Historically, such behaviors tended to be viewed as self-mutilation and were thought a form of attenuated suicide but in recent decades more attention has been devoted to the syndrome, beginning in the 1980s at a time when self-harm was regarded as a symptom of borderline personality disorder (BPD) (personality disorders first entered the DSM when DSM-III was published in 1980), distinguished by suicidal behavior, gestures, threats or acts of self-mutilation.  Clinicians however advanced the argument the condition should be thought a separate syndrome (deliberate self-harm syndrome (DSHS)), based on case studies which identified (1) a patient’s inability to resist the impulse to injure themselves, (2) a raised sense of tension prior to the act and (3) an experience of release or at least partial relief after the act.  It was noted too that a small number of patients repeatedly self-harmed and it was suggested a diagnosis called repetitive self-mutilation syndrome (RSMS) should be added to the DSM.  Important points associated with RSMS were (1) an absence of conscious suicidal intent, (2) the patient’s perpetually negative affective/cognitive state which was (temporarily) relieved only after an act of self-harm and (3) a preoccupation with and repetitiveness of the behavior.  Accordingly, NSSI Disorder was added to the DSM-5 (2013) and noted as a condition in need of further study.

However, although there would seem some relationship to cutting, eating one’s body parts is obviously a different behavior; the feeling seems to be that autophagia involves a quest for pain, which suggests some overlap with other conditions, and it certainly belongs in the sub-category of self-injurious behavior (SIB).  The literature is said to be sparse and the analysis seems not to have been extensive but the behavior has been noted in those diagnosed with a variety of conditions including personality disorders, anxiety disorders, obsessive compulsive disorder, schizophrenia and bipolar disorder.  The last two have been of particular interest because the act of biting off and eating some body part (most typically fingers) has been associated with the experience of hallucinations and patients have been recorded as saying the pain of the injury “makes the voices stop”.  Importantly, autophagia has a threshold and while in some senses it can be thought a spectrum condition (in terms of frequency & severity), behaviors such as biting (and even consuming) the loose skin of the lips or cheeks (morsicatio buccarum) or the ragged edges of skin which manifest after nail biting (onychophagia) are common and few proceed to autophagia; clinicians note neurological reasons may also be involved.

Lindsay Lohan with bread on the syndicated Rachael Ray Show, April 2019.

Autophagia and related words should not be confused with the adjective artophagous (bread-eating).  The construct was artos + -phagous.  Artos was from the Ancient Greek ἄρτος (ártos) (bread), of pre-Greek origin.  The suffix -phagous was from the Latin -phagus, from the Ancient Greek -φάγος (-phágos) (eating), from φαγεῖν (phageîn) (to eat).  Apparently, in the writings of the more self-consciously erudite, the word artophagous, which enjoyed some currency in the nineteenth century, was still in occasional use as late as the 1920s but most lexicographers now either ignore it or list it as archaic or obsolete.  It’s an example of a word which has effectively been driven extinct even though the practice it describes (the eating of bread) remains as widespread and popular as ever.  Linguistically, this is not uncommon in English and is analogous with the famous remark by Sheikh Ahmed Zaki Yamani (1930–2021; Saudi Arabian Minister of Petroleum and Mineral Resources 1962-1986): “The Stone Age came to an end, not for a lack of stones, and the Oil Age will end, but not for a lack of oil” (the first part of that usually paraphrased as the punchier “the Stone Age did not end because the world ran out of rocks”).

Friday, February 17, 2023

Acephalous

Acephalous (pronounced ey-sef-uh-luhs)

(1) In zoology, a creature without a head or lacking a distinct head (applied to bivalve mollusks).

(2) In the social sciences, political science & sociology, a system of organisation in a society with no centralized authority (without a leader or ruler), where power is in some way distributed among all or some of the members of the community.

(3) In medicine, as (1) acephalia, a birth defect in which the head is wholly or substantially missing & (2), the congenital lack of a head (especially in a parasitic twin).

(4) In engineering, an internal combustion piston engine without a cylinder head.

(5) In botany, a plant having the style spring from the base, instead of from the apex (as is the case in certain ovaries).

(6) In information & communications technology (ICT), a class of hardware and software (variously headless browser, headless computer, headless server etc) assembled or configured to lack some feature or object analogous with a “head” or “high-level” component.

(7) In prosody, deficient in the beginning, as a line of poetry that is missing its expected opening syllable.

(8) In literature, a manuscript lacking the first portion of the text.

1725-1735: From the French acéphale (the construct being acéphal(e) + -ous), from the Medieval Latin acephalus, from the Ancient Greek ἀκέφαλος (aképhalos) (headless), the construct being ἀ- (a-) (not) + κεφαλή (kephalḗ) (head), thus synchronically: a- + -cephalous.  The translingual prefix a- was from the Ancient Greek ἀ- (a-) (not, without) and in English was used to form taxonomic names indicating a lack of some feature that might be expected.  The a- prefix (with differing etymologies) was also used to form words imparting various senses.  Acephalous & acephalic are adjectives, acephalousness, acephalia & acephaly are nouns and acephalously is an adverb; the noun plural is acephali.

In biology (although often literally synonymous with “headless”), it was also used to refer to organisms where the head(s) existed only partially, thus the special use of the comparative "more acephalous" and the superlative "most acephalous", the latter also potentially misleading because it referred to extreme malformation rather than absence (which would be something wholly acephalous).  In biology, the companion terms are anencephalous (without a brain), bicephalous (having two heads), monocephalous (used in botany to describe single-headed, un-branched composite plants) & polycephalous (many-headed).

Acephalous: Lindsay Lohan “headless woman” Halloween costume.

The word’s origins were in botany and zoology, the use in political discussion in the sense of “without a leader” dating from 1751.  The Acephali (plural of acephalus) were a people, said to live in Africa, who were the product of the imagination of the writers of Antiquity, said by both the Greek historian Herodotus (circa 487-circa 425 BC) and the Romano-Jewish historian Flavius Josephus (circa 37–circa 100) to have no heads (sometimes removable heads), and Medieval historians picked up the notion in ecclesiastical histories, describing thus (1) the Eutychians (a Christian sect in the year 482 without a leader), (2) those bishops and certain clergymen not under regular diocesan control and later (3) a class of levelers in the time of Henry I (circa 1068–1135; King 1100-1135).  The word still sometimes appears when discussing religious orders, denominations (or even entire churches) which reject the notion of a separate priesthood or a hierarchical order including such offices as bishops, the ultimate evolution of which is popery.

Acephalousness in its age of mass production: Marie Antoinette (1755–1793; Queen Consort of France 1774-1792) kneeling next to her confessor, contemplates the guillotine on the day of her execution, 16 October 1793.  Colorized version of a line engraving with etching, 1815.

In political science, acephalous refers to societies without a leader or ruler in the Western sense of the word but it does not of necessity imply an absence of leadership or structure, just that the arrangements don’t revolve around the one ruler.  Among the best documented examples were the desert-dwelling tribes of West Africa (notably those inhabiting the Northern Territories of the Gold Coast (now Ghana)), the arrangements of which required the British colonial administrators (accustomed to the ways of India under the Raj with its Maharajas and institutionalized caste system) to adjust their methods somewhat to deal with notions such as distributed authority and collective decision making.  That said, acephalous has sometimes been used too freely.  It is inevitably misapplied when speaking of anarchist societies (except in idealized theoretical models) and often misleading if used of some notionally collectivist models which are often conventional leadership models in disguise or variations of the “dictatorship of the secretariat” typified by the early structure of Stalinism.

The Acephalous Commer TS3

A curious cul-de-sac in engineering, Commer’s acephalous TS3 Diesel engine (1954-1972) was a three-cylinder, six-piston two-stroke, the cylinders in a horizontal layout, each containing two pistons with their crowns facing each other, the layout obviating any need for a cylinder head.  The base of each piston was attached to a connecting rod and a series of rockers which then attached to another connecting rod, joined to the single, centrally located crankshaft at the bottom of the block, a departure from other “opposed piston” designs, almost all of which used twin crankshafts.  The TS3 was compact, powerful and light, the power-to-weight ratio exceptional because without components such as cylinder heads, camshafts or valve gear, internal friction was low and thermal efficiency commendably high, the low fuel consumption especially notable.  In other companies, engineers were attracted to the design but accountants were sceptical and there were doubts reliability could be maintained were capacity significantly increased (the TS3 displaced 3.3 litres (200 cubic inches)) and when Chrysler purchased Commer in 1967, development ceased, although an eight-piston prototype had performed faultlessly in extensive testing.  Production thus ceased in 1972 but although used mostly in trucks, there was also a marine version, many examples of which are still running, the operators maintaining them in service because of the reliability, power and economy (although the exhaust emissions are at the shockingly toxic levels common in the 1960s).

Acephalous information & communications technology (ICT)

A headless computer (often a headless server) is a device designed to function without the usual “head” components (monitor, mouse, keyboard) being attached.  Headless systems are usually administered remotely, typically over a network connection although some still use serial links, especially those emulating legacy systems.  Deployed to save both space and money, numerous headless computers and servers still exist although the availability of KVM (and related) hardware which can permit even dozens of machines to be hard-wired to the one keyboard/mouse/monitor combination has curbed their proliferation.
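
As a sketch of how such remote administration looks in practice (assuming the third-party paramiko SSH library; the address, account name and key path below are hypothetical, not anything described above), a headless server can be driven entirely over the network:

import os
import paramiko  # third-party SSH library; an assumption, not part of the post

# Connect to a hypothetical headless server: with no monitor or keyboard
# attached, all administration happens over the network.
client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect(
    "192.0.2.10",      # placeholder documentation address; substitute a real host
    username="admin",  # hypothetical account
    key_filename=os.path.expanduser("~/.ssh/id_ed25519"),
)

# Run a status command and print what a local monitor would have shown.
stdin, stdout, stderr = client.exec_command("uptime")
print(stdout.read().decode())
client.close()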

A headless browser is a web browser without a graphical user interface (GUI) which can thus be controlled only from a command-line interface or with a (usually) automated script, often deployed in a network environment.  Obviously not ideal for consumer use, they’re ideal for use in distributed test environments or automating tasks which rely on interaction between web pages.  Until methods of detection improved, headless browsers were a popular way of executing ploys such as credential stuffing, page-view building or automated clicking but there’s now little to suggest they’re any more frequently used as a vector for nefarious activity than conventional browsers with a GUI attached.
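
By way of illustration only (assuming a local Chrome installation and the Selenium package, neither of which the post specifies), a script can drive a headless browser like this:

from selenium import webdriver  # assumes Selenium 4+ and a local Chrome

# Ask Chrome to run headless: pages are still fetched, rendered and
# scriptable, there is just nothing drawn on screen.
options = webdriver.ChromeOptions()
options.add_argument("--headless=new")

driver = webdriver.Chrome(options=options)
try:
    driver.get("https://example.com")
    print(driver.title)  # the sort of check a test harness automates
finally:
    driver.quit()  # always release the browser process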

Browsing for nerds: Google’s acephalous Headless Chrome.

Headless software is analogous with but goes beyond the concept of a headless computer in that it’s designed specifically to function not just without a GUI or monitor but even without the hardware necessary to support such things (notably the video card or port).  Whereas some software will fail to load if no video support is detected, headless software proceeds regardless, either because it’s written without such parameter checking or because it includes responses which pass “false positives”, emulating the existence of the absent hardware.  Headless software operates in a specialized (horizontal in terms of industries supplied but vertical in that the stuff exists usually in roles such as back-to-front-end comms on distributed servers) niche, the advantage being the two ends can remain static (as some can be for years) while the bridge between the two remains the more maintenance-intensive application programming interface (API), the architecture affording great flexibility in the software stack.
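
The “proceed regardless” pattern can be sketched with an everyday example (matplotlib is used here purely as a familiar illustration, not something the post discusses): on a Unix-like system, a script can detect the absence of an attached display and fall back to a renderer which needs no video hardware:

import os
import matplotlib  # used only as a familiar example of a headless fallback

# On X11 systems an attached display is advertised via the DISPLAY variable;
# when it is absent, switch to the Agg backend, which renders to memory and
# file without a video card, port or window system.
if not os.environ.get("DISPLAY"):
    matplotlib.use("Agg")

import matplotlib.pyplot as plt  # import after the backend is chosen

fig, ax = plt.subplots()
ax.plot([0, 1, 2], [0, 1, 4])
fig.savefig("plot.png")  # works headless; plt.show() would need a GUI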

Thursday, June 29, 2023

Phlebotomy

Phlebotomy (pronounced fluh-bot-uh-mee)

(1) The act or practice of opening a vein for letting or drawing blood as a therapeutic or diagnostic measure; the letting of blood and known in historic medicine as "a bleeding".

(2) Any surgical incision into a vein (also known as venipuncture & (less commonly) venesection).  It shouldn’t be confused with a phlebectomy (the surgical removal of a vein).

1350–1400: From the earlier flebotomye & phlebothomy, from the Middle French flebotomie, from the thirteenth century Old French flebothomie (phlébotomie in Modern French), from the Late & Medieval Latin phlebotomia, from the Ancient Greek φλεβοτόμος (phlebotómos) (a lancet used to open a vein), the construct being φλέψ (phléps) (genitive phlebos) (vein), of uncertain origin + τομή (tomḗ) (a cutting), from the primitive Indo-European root tem- (to cut).  The form replaced the Middle English fleobotomie.  The noun phlebotomist (one who practices phlebotomy, a blood-letter) is documented only as late as the 1650s but may have been in use earlier and operated in conjunction with the verb phlebotomize.  The earlier noun and verb in English (in use by the early fifteenth century) were fleobotomier & fleobotomien.  The Latin noun phlebotomus (genitive phlebotomī) (a lancet or fleam (the instruments used for blood-letting)) was from the Ancient Greek φλεβότομος (phlebótomos) (opening veins), the construct being φλέψ (phléps) (blood vessel) + τέμνω (témnō) (to cut) + -ος (-os) (the adjectival suffix).  The alternative spelling was flebotomus.  The noun fleam (a sharp instrument for opening veins in bloodletting (and this in the pre-anesthetic age)) was from the late Old English, from the Old French flieme (flamme in Modern French), from the Medieval Latin fletoma, from the Late Latin flebotomus, from the Greek φλεβοτόμος (phlebotómos) (a lancet used to open a vein).  The doublet was phlebotome and in barracks slang, a fleam was a sword or dagger.  Phlebotomy & phlebotomist are nouns, phlebotomize is a verb and phlebotomic & phlebotomical are adjectives; the noun plural is phlebotomies.

Phlebotomy describes the process of puncturing a vein, typically with a needle or cannula, for the purpose of drawing blood.  In modern medicine the preferred term is venipuncture (used also for therapy) although the title phlebotomist continues to be used for those who specialize in the task.  One of the most frequently performed procedures in clinical practice, it’s commonly undertaken also by doctors, nurses and other medical staff.  Although the origins of phlebotomy lie in the ancient tradition of blood letting, it’s now most associated with (1) the taking of blood samples for testing by pathologists and (2) those procedures carried out as “therapeutic phlebotomies” as part of the treatment regimen for certain disorders of the blood.  The inner elbow is the site most often used but in therapeutic medicine or in cases where the veins in the arms are not suitable, other locations can be used.

Bleeding the foot (circa 1840), oil on canvas following Honoré Daumier (1808-1879).

It’s an urban myth the Hippocratic Oath includes the clause: “First, do no harm” but by any reading that is a theme of the document and while the Greek physician Hippocrates of Kos (circa 460-circa 375 BC) wouldn’t have been the first in his field to regard illness as something to be treated as a natural phenomenon rather than something supernatural, he’s remembered because of his document.  His doctrine was one which took a long time to prevail (indeed there are pockets where still it does not), holding that treatment of ailments needed to be based on science (“evidence-based” the current phrase) rather than devotion or appeals to the gods.  His influence thus endures but one of his most famous theories, which persisted for centuries, resulted in much lost blood for no known benefit and an unknown number of deaths.  Drawing from the notion of earlier philosophers that the basis of the universe was air, earth, water & fire, the theory was that there were four “humors” which had to be maintained in perfect balance to ensure health in body & mind, the four being flegmat (phlegm), sanguin (blood), coleric (yellow bile) & melanc (black bile), the source of the four personality types, the phlegmatic, the sanguine, the choleric & the melancholic.  Had Hippocrates and his successors left the humors in the realm of the speculative, it would now be thought some amusing fragment from Antiquity but unfortunately surgical intervention was designed to ensure balance was maintained and the mechanism of choice was bloodletting to “remove excess liquids”.

George Washington in his last illness, attended by Doctors Craik and Brown (circa 1800) engraving by unknown artist, Collection of The New-York Historical Society.

Apparently, bloodletting was practiced by the ancient Egyptians some 3000 years ago and it’s not impossible it was among the medical (or even religious) practices of older cultures.  From there it’s known to have spread to the Middle East, Rome, Greece and West & South Asia, physicians and others spilling blood in the quest to heal, and the evidence suggests it was advocated for just about any symptom.  The very idea probably sounds medieval but in the West that really was the nature of so much medicine until the nineteenth century and even well into the twentieth, there were still some reasonably orthodox physicians advocating its efficacy.  Still, in fairness to Hippocrates, he was a pioneer in what would now be called “holistic health management” which involved taking exercise, eating a balanced diet and involving the mind in art & literature.  He was an influencer in his time.  All the humors were of course good but only in balance so there could be too much of a good thing.  When there was too much, what was in excess had to go and apart from bloodletting, there was purging, catharsis & diuresis, none of which sound like fun.  Bloodletting however was the one which really caught on and was for centuries a fixture in the surgeon’s bag.

Blood self-letting: Lindsay Lohan as Carrie from the eponymous film, Halloween party, Foxwoods Resort & Casino, Connecticut, October 2013.

Actually, as the profession evolved, the surgeons emerged from the barber shops, where they would pull teeth too.  The formal discipline of the physician did evolve but physicians restricted themselves to providing the diagnosis and writing scripts from which the apothecary would mix his potions and pills, some of which proved more lethal than bloodletting.  The bloodletting technique involved draining blood from a large vein or artery (the most productive soon found to be the median cubital at the elbow) but if a certain part of the body was identified as being out-of-balance, there would be the cut.  The mechanisms to induce blood loss included cupping, leeching & scarification and with the leeches, they were actually onto something, the thirsty creatures still used today in aspects of wound repair and infection control, able often to achieve better results more quickly than any other method.  Leeches have demonstrated extraordinary success in handling the restoration of blood flow after microsurgery and reimplantation and the method works because the little parasites generate substances like fibrinase, vasodilators, anticoagulants & hyaluronidase, releasing them into the wound area where they assist the healing process by providing an unrestricted blood flow.  Of course the leeches don't always effect a cure.  When in 1953 doctors were summoned to examine a barely conscious comrade Stalin (1878-1953; Soviet leader 1924-1953), after their tests they diagnosed a haemorrhagic stroke involving the left middle cerebral artery.  In an attempt to lower his blood pressure, two separate applications of eight leeches each were applied over 48 hours but it was to no avail.  Had he lived he might have had the leeches shot but they probably lived to be of further service.

A Surgeon Letting Blood from a Woman's Arm, and a Physician Examining a Urine-flask (in some descriptions named Barber-Surgeon Bleeding a Patient), eighteenth century oil on canvas, attributed to the school of Jan Josef Horemans (Flemish; 1682-1752); previously attributed to Richard Brakenburg (Dutch; 1650-1702) and to the Flemish School.

Scarification was a scraping of the skin and if the circumstances demanded more, leeches could be added.  Cupping used dome-shaped cups placed on the skin to create blisters through suction and once in place, suction was achieved through the application of heat.  However it was done, it could be a messy, bloody business and in the twelfth century the Church banned the practice, calling it “abhorrent”, and that had the effect of depriving priests and monks of a nice, regular source of income, which wasn’t popular.  However, especially in remote villages far from the bishop’s gaze, the friars continued to wield their blades and harvest their leeches, the business of bloodletting now underground.  In the big towns and cities though the barbers added bloodletting to their business model and it’s tempting to wonder whether package deals were offered, bundling a blooding with a tooth pulling or a haircut & shave.  From here it was a short step to getting into the amputations, a not uncommon feature of life before there were antibiotics and to advertise their services, the barber-surgeons would hang out white rags smeared in places with blood, the origin of the red and white striped poles some barbers still display.  To this day the distinction between surgeons and physicians remains and in England the Royal College of Physicians (the RCP, a kind of trade union) was founded by royal charter in 1518.  By the fourteenth century there had already been demarcation disputes between the barber surgeons and the increasingly gentrified surgeons and a number of competing guilds and colleges were created, sometimes merging, sometimes breaking into factions until 1800 when the Royal College of Surgeons (RCS) was brought into existence.  It's said there was a time when fellows of the RCP & RCS, when speaking of each other, would only ever make reference to "the other college", the name of the institution never passing their lips.

Bloodletting tools: Late eighteenth century brass and iron “5-fingered” fleam.

Unfortunately, while doubtlessly lobbying to ensure the fees of their members remained high, the colleges did little to advance science and the byword among the population remained: “One thing's for sure: if illness didn't kill you, doctors would”.  It was the researchers of the nineteenth century, who first suggested and then proved germ theory, who sounded the death knell for most bloodletting, what was visible through their microscopes rendering the paradigm of the four humors obsolete.  By the twentieth century it was but a superstition.

Sunday, July 21, 2024

Harlot

Harlot (pronounced hahr-luht)

(1) A prostitute or promiscuous woman; one given to wantonness; lewd; low; base.

(2) By extension, in political discourse, an unprincipled person (now rare).

(3) A person given to low conduct; a rogue; a villain; a cheat; a rascal (obsolete).

(4) To play the harlot; to practice lewdness.

Circa 1200: From the Middle English harlot (young idler, rogue), from the Old French harlot, herlot & arlot (rascal; vagabond; tramp), of obscure origin but thought probably of Germanic origin, either a derivation of harjaz (army; camp; warrior; military leader) or a diminutive of karilaz (man; fellow); most speculate the first element is from hari (army).  It was cognate with the Old Provençal arlot, the Old Spanish arlote and the Italian arlotto.  The long obsolete Middle English carlot (a churl; a common man; a person (male or female) of low birth; a boor; a rural dweller, peasant or countryman) is thought probably related.  Harlot was historically a noun and (less often) a verb, the present participle harloting (or harlotting), the simple past and past participle harloted (or harlotted), and there’s no evidence exotic forms like harlotistic or harlotic ever existed, however useful they might have been.  Harlot is a noun & verb, harlotry is a noun, harlotish is an adjective, harlotize and harloted & harloting are verbs; the noun plural is harlots.  The adjective harlotesque is non-standard.

Harlot as a surname dates from at least the mid-late 1100s but by circa 1200 it was being used to describe a “vagabond, someone of no fixed occupation, an idle rogue” and was applied almost exclusively to men in the Middle English and Old French.  Geoffrey Chaucer (circa 1345-1400) used harlot in a positive as well as a pejorative sense and in medieval English texts it was applied to jesters, buffoons, jugglers and later to actors.  What is now the prevalent meaning (prostitute, unchaste woman) was originally the secondary sense but it had probably developed as early as the late fourteenth century, being well-documented by the early fifteenth.  Doubtless, it was the appearance in sixteenth century English translations of the Bible (as a euphemism for "strumpet, whore") which cemented the association.

In harlotesque mode: Lindsay Lohan in fancy dress as Suicide Squad's (2016) Harley Quinn, Halloween party, London, November 2016.  It may be a cliché but for purposes of fancy dress, fishnet stockings (or tights) are the motif of choice for those wanting the "harlot look". 

The biblical imprimatur didn’t so much extend the meaning as make it gender-specific.  The noun harlotry (loose, crude, or obscene behavior; sexual immorality; ribald talk or jesting) had been in use since the late fourteenth century and the choice of harlot in biblical translation is thought an example of linguistic delicacy, a word like “strumpet” thought too vulgar for a holy text and “jezebel” too historically specific.  In this, harlot is part of a long though hardly noble tradition of crafting or adapting words as derogatory terms to be applied to women.  It has to be admitted there are nuances between many but one is impressed there was thought such a need to be offensive to women that English contains so many: promiscuous, skeezer, slut, whore, concubine, courtesan, floozy, hooker, hussy, nymphomaniac, streetwalker, tom, strumpet, tramp, call girl, lady of the evening, painted woman et al.  So the Bible is influential, although there’s a perhaps surprising difference in the translations of that prescriptive duo, Leviticus & Ezekiel: in the King James Version (KJV 1611), harlot appears in thirty-eight verses, but once in Leviticus and nine times in Ezekiel, some of the memorable being:

Genesis 38:24: And it came to pass about three months after, that it was told Judah, saying, Tamar thy daughter in law hath played the harlot; and also, behold, she [is] with child by whoredom. And Judah said, Bring her forth, and let her be burnt.

Leviticus 21:14: A widow, or a divorced woman, or profane, [or] an harlot, these shall he not take: but he shall take a virgin of his own people to wife.

Joshua 6:25: And Joshua saved Rahab the harlot alive, and her father's household, and all that she had; and she dwelleth in Israel [even] unto this day; because she hid the messengers, which Joshua sent to spy out Jericho.

Isaiah 1:21: How is the faithful city become an harlot! it was full of judgment; righteousness lodged in it; but now murderers.

Ezekiel 16:15: But thou didst trust in thine own beauty, and playedst the harlot because of thy renown, and pouredst out thy fornications on every one that passed by; his it was.

Ezekiel 16:41: And they shall burn thine houses with fire, and execute judgments upon thee in the sight of many women: and I will cause thee to cease from playing the harlot, and thou also shalt give no hire any more.

Ezekiel 23:19: Yet she multiplied her whoredoms, in calling to remembrance the days of her youth, wherein she had played the harlot in the land of Egypt.

Ezekiel 23:44: Yet they went in unto her, as they go in unto a woman that playeth the harlot: so went they in unto Aholah and unto Aholibah, the lewd women.

Amos 7:17: Therefore thus saith the LORD; Thy wife shall be an harlot in the city, and thy sons and thy daughters shall fall by the sword, and thy land shall be divided by line; and thou shalt die in a polluted land: and Israel shall surely go into captivity forth of his land.

Nahum 3:4: Because of the multitude of the whoredoms of the wellfavoured harlot, the mistress of witchcrafts, that selleth nations through her whoredoms, and families through her witchcrafts.

Stanley Baldwin election campaign poster, 1929.

Phrases like “shameless harlot” and “political prostitution” used to be part of the lively language of politics but social change and an increasing intolerance of gendered terms of derision have rendered them almost extinct (the language of metaphorical violence is next for the chopping-block: guillotined, knifed, axed etc all on death row).  Harlot’s most notable political excursion came in 1931 when Stanley Baldwin (1867–1947; thrice UK prime-minister 1923-1937) was facing an orchestrated campaign against his leadership by the newspaper proprietors, Lords Rothermere (1868–1940) & Beaverbrook (1879-1964), the "press barons" then a potent force (Beaverbrook called them collectively the "press gang").  Before commercial television & radio, let alone the internet and social media, most information was disseminated in newspapers and their influence was considerable.  The press barons though, whatever their desires, couldn't be dictatorial, as Beaverbrook found when his long campaign for empire free-trade achieved little, but they sometimes behaved as if they could at a whim move public opinion and often politicians were inclined to believe them.  Within the UK at the time, Rothermere & Beaverbrook weren’t exactly “by Murdoch out of Zuckerberg” but it’s hard to think of a better way of putting it.

Baldwin in 1931 found a good way of putting it.  With his leadership of the Tory party challenged because he refused to support the press barons in what was even then the chimera of empire free trade, he responded with a strident speech which appealed to the public’s mistrust of them, using a phrase from his cousin Rudyard Kipling (1865-1936), ironically a friend of Beaverbrook.  Rothermere & Beaverbrook he denounced as wanting power without responsibility, “…the prerogative of the harlot throughout the ages.”  It was the most effective political speech in the UK until 1940, Baldwin flourishing and empire free trade doomed, although Beaverbrook would keep flogging the corpse for the rest of the 1930s.  Often underestimated, Baldwin would later be acknowledged by David Lloyd George (1863–1945; UK prime-minister 1916-1922) and Winston Churchill (1875-1965; UK prime-minister 1940-1945 & 1951-1955) as the most formidable political operator of the era.

The oratory of Lloyd George and Churchill may be more regarded by history but Baldwin did have a way with words and less remembered lines from another of his famous speeches may have influenced climate change activist Greta Thunberg (b 2003).  Delivered in the House of Commons on 10 November 1932 in a debate on disarmament, the speech argued for an international agreement to restrict the development of the aircraft as a military weapon:

“I think it is well also for the man in the street to realize that there is no power on earth that can protect him from being bombed, whatever people may tell him.  The bomber will always get through…  The only defence is in offence, which means that you have got to kill more women and children more quickly than the enemy if you want to save yourselves.  I mention that so that people may realize what is waiting for them when the next war comes.”

Prescient about the way the unrestricted bombing of civilians would become one of the Second World War’s novel theatres, the phrase "the bomber will always get through" reverberated around the world, chancelleries and military high commands taking from it not the need for restrictions but the imperative to build bomber fleets, Baldwin not planting the seed of the idea but certainly reinforcing the prejudices and worst instincts of many.  That was the power of the phrase; it subsumed the purpose of the speech, the rest of which was essentially forgotten, including the concluding sentences:

"I do not know how the youth of the world may feel, but it is no cheerful thought to the older men that having got that mastery of the air we are going to defile the earth from the air as we have defiled the soil for nearly all the years that mankind has been on it."

“This is a question for young men far more than it is for us…  Few of my colleagues around me here will see another great war…  At any rate, if it does come we shall be too old to be of use to anyone.  But what about the younger men, they who will have to fight out this bloody issue of warfare; it is really for them to decide.  They are the majority on the earth.  It touches them more closely.  The instrument is in their hands.”

“If the conscience of the young men should ever come to feel, with regard to this one instrument, that it is evil and should go, the thing will be done.  As I say, the future is in their hands, but when the next war comes and European civilization is wiped out, as it will be, and by no force more than by that force, then do not let them lay the blame on the old men, but let them remember that they principally and they alone are responsible for the terrors that have fallen on the earth.”

Hansard recorded Baldwin’s speech being greeted with “loud and prolonged cheers”, his enthusiasm for disarmament making him as popular as Neville Chamberlain (1869–1940; UK prime-minister 1937-1940) would briefly be in 1938 when he returned from Germany with a piece of paper bearing Hitler’s signature and a guarantee of “peace for our time”.  Soon, the views on both men would shift but historians today treat them more sympathetically.

The old and the young.

Greta Thunberg (b 2003) and Donald Trump (b 1946; US president 2017-2021), United Nations, New York, September 2019.  Ms Thunberg was attending a UN climate summit Mr Trump snubbed, going instead to a meeting on religious freedom.  Proving that God moves in mysterious ways, Mr Trump took a whole new interest in evangelical Christianity when he entered the contest for the 2016 presidential election.  Ms Thunberg seems to have noted the final paragraphs of Baldwin's speech: convinced it’s quite right to “lay the blame on the old men”, whose blah, blah, blah she thinks insufficient to lower carbon emissions, she seems confident youth will prove more receptive to doing something about us defiling the earth.

Greta Thunberg, How Dare You? (Acid house mix).