
Monday, April 15, 2024

MADD

MADD, Madd, MaDD (pronounced mad)

(1) The acronym (as MADD) for Mothers Against Drunk Driving, a non-profit education and lobbying operation founded in California in 1980 with a remit to campaign against driving while drink- or drug-affected.

(2) The acronym (as MADD) for Myoadenylate deaminase deficiency (adenosine monophosphate deaminase deficiency).

(3) The acronym (as MADD) for multiple acyl-CoA dehydrogenase deficiency (known also as the genetic disorder Glutaric acidemia type 2).

(4) In computing (as MADD), the acronym for Multiple-Antenna Differential Decoding (a technique in wireless communications which uses multiple antennas for both transmission & reception, improving performance by exploiting the spatial diversity & multipath propagation of the wireless channel).

(5) As the gene MADD (MAP kinase activating death domain), which encodes a death domain-containing protein.

(6) As Madd, the fruit of Saba senegalensis (a fruit-producing plant of the Apocynaceae family, native to the Sahel region of sub-Saharan Africa).

(7) As madd, a clipping of maddah (from the Arabic مَدَّة (madda)), the English form of the Arabic diacritic (a distinguishing mark applied to a letter or character) used in both the Arabic & Persian scripts.

(8) The acronym (as MaDD) for Maladaptive Daydreaming Disorder.

(9) The acronym (as MADD), for mutually assured digital destruction: a theory of cyber-warfare whereby each participant demonstrates to the other their capacity to inflict equal or more severe damage in retaliation, thereby deterring a cyber-attack (based on the earlier MAD (mutually assured destruction), a description of nuclear warfare deterrence).

From AD to MAD, 1962-1965

The period between the addition of nuclear weapons to the US arsenal in 1945 and the USSR's detonation of its first atomic bomb in 1949 was unique, a brief anomaly in the history of great-power conflict.  It's possible to find periods in history when one power possessed an overwhelming preponderance of military strength which would have enabled it easily to defeat any enemy or possible coalition, but never was the imbalance of force so asymmetric as it was between 1945-1949.  Once both the US and USSR possessed strategic nuclear arsenals, the underlying metric of the Cold War became the two sides sitting in their bunkers counting warheads and the centrality of that lasted as long as the bombs were gravity devices delivered by aircraft which needed to reach a point above the target.  At this point, the military's view was that nuclear war was possible and the only deterrent was to maintain a credible threat of retaliation and, still in the age of the "bomber will always get through" doctrine, both sides literally kept squadrons of nuclear-armed bombers in the air 24/7.  Once ground-based intercontinental ballistic missiles (ICBMs) and (especially) submarine-launched ballistic missiles (SLBMs) were deployed, the calculation of nuclear war changed from damage assessment to an acknowledgement that, in the worst-case scenarios made possible by the preservation of large-scale second-strike retaliatory capacity, although the "total mutual annihilation" of the popular imagination was never likely, the damage inflicted would have been many times worse and more extensive than in any previous conflict and, although the climatic implications weren't at the time well understood, the consequences would have been global and lasted to one degree or another for centuries.

It was thus politically and technologically deterministic that the idea of mutually assured destruction (MAD) would evolve and it was a modification of a deterrence doctrine known as AD (assured destruction) which appeared in Pentagon documents as early as 1962.  AD was intended as a way to deter the USSR from staging a first strike against the US, the notion being that the engineering and geographical deployment of the US's retaliatory capacity was such that whatever was achieved by a Soviet attack, Soviet territory would suffer something much worse.  To the Pentagon planners in their bunker, the internal logic of AD was compelling and the term was coined as a description of the prevailing situation rather than a theoretical doctrine.  To the general population, it obviously meant MAD (mutually assured destruction) and while, as a doctrine of deterrence, the metrics remained the same, after 1966 when the term gained currency, it began to be used as an argument against the mere possession of nuclear arsenals, the paradox being the same acronym was also used to underpin the standard explanation of the structural reason nuclear warfare was avoided.  Just as paradoxically, while serving to prevent their use, MAD also fueled the arms race because the stalemate created its own inertia and it would be almost a decade before the cost and absurdity of maintaining the huge number of useless warheads was addressed.  MAD probably also contributed to both sides indulging in conflict by proxy, supporting wars and political movements which served as surrogates for battles made too dangerous by the implications of MAD to be contested directly between the two big protagonists.

Maladaptive Daydreaming Disorder

There are those who criticize the recognition of MADD (Maladaptive Daydreaming Disorder) as an example of the trend to "medicalize" aspects of human behaviour which have for millennia been regarded as "normal", the implication being the sudden creation of a cohort of customers for psychiatrists and the pharmaceutical industry.  The suspicion is that MADD is of such interest to the medical-industrial complex because the catchment is the "worried well", those with sufficient disposable income to make the condition worthwhile, the poor too busy working to ensure food and shelter for their families for there to be much time to daydream.

Still, the consequences of MADD are known to be real and while daydreaming is a common and untroubling experience for many, in cases where it's intrusive and frequent, it can cause real problems with everyday activities such as study or employment as well as being genuinely dangerous if associated with tasks such as driving or the use of heavy machinery.  The condition was first defined by Professor Eli Somer (b 1951; a former President of both the International Society for the Study of Trauma and Dissociation (ISSTD) and the European Society for Trauma and Dissociation (ESTD)) who described one manifestation as possibly an "escape or coping mechanism from trauma or abuse", noting it may "involve long periods of structured fantasy".  Specific research into MADD has been limited but small-scale studies have found some similarities to behavioral addictions, the commonality being a compulsion to engage in activities despite negative impacts on a person's mental or physical health or ability to function in various aspects of life.

Despite the suggestion of similarities to diagnosable conditions, the latest edition of the American Psychiatric Association's (APA) Diagnostic and Statistical Manual of Mental Disorders (DSM-5-TR, 2022) did not add an entry for MADD and the debate among those in the profession interested in the matter is between those arguing it represents an unidentified clinical syndrome which demands a specific diagnosis and those who think it either fits within the rubric of obsessive-compulsive disorder (OCD) or is a dissociative condition.  Accordingly, in the absence of formal recognition of MADD, while a psychiatrist may decline to acknowledge the condition as a specific syndrome, some may assess the described symptoms and choose to prescribe the drugs used to treat anxiety or OCD or refer the patient to sessions of cognitive behavior therapy (CBT) or the mysterious mindfulness meditation.

Mutually Assured Digital Destruction

Authors in 2021 suggested MADD (mutually assured digital destruction) as the term to describe the strategic stalemate achieved by the major powers infecting each other's critical (civilian & military) digital infrastructure with crippleware, logic bombs and other latent tools of control or destruction.  The core of the idea was based on the old notion of "the bomber will always get through", a recognition that it's neither possible to protect these systems from infiltration nor to clean up what's likely there and still undiscovered.  So, rather than being entirely covert, MADD instead makes clear to the other side its systems are also infected and there will be retaliation in kind to any cyber attack, with consequences perhaps even worse than any suffered in the first strike.  Like the nuclear submarines with their multiple SLBMs which silently cruise the world's oceans, the strategic charm of the latent penetration of digital environments is that detection of all such devices is currently impossible; one knows they (and their SLBMs) are somewhere in firing range but not exactly where.  Oceans are big places but so, analogously, is the digital environment and a threat may be in the hardware, software or the mysterious middleware; sometimes a threat can actually be observed yet not understood as such.

For individuals, groups and corporations, there's also the lure of unilateral destruction, something quite common in the social media age.  For a variety of reasons, an individual may choose to "delete" their history of postings and while it's true this means what once was viewable no longer is, it does not mean one's thoughts and images are "forever gone" in the sense one can use the phrase as one watches one's diary burn.  That was possible (with the right techniques or a power drill) when a PC sat on one's desk and was connected to nothing beyond it, but as soon as a connection with a network (most obviously the internet) is made and data is transferred, whatever is sent is in some sense "in the wild".  That was always true but in the modern age it's now effectively impossible to know where one's data may exist, such is the number of "pass-through" devices which may sit between sender and receiver.  On the internet, even if the path of the data packets can be traced and each device identified, there is no way to know where things have been copied (backup tapes, replica servers et al) and that's even before one wonders what copies one's followers have taken.  There may often be good reasons to curate one's social media presence to the point of deletion but that shouldn't be thought of as destruction.

Thursday, March 21, 2024

Hypnopompic

Hypnopompic (pronounced hip-nuh-pom-pik)

Of or relating to the state of consciousness between sleep and becoming fully awake.

1897: The construct was hypno-, from the Ancient Greek ὕπνος (húpnos) (sleep) + the Ancient Greek πομπή (pompḗ) (a sending away) + -ic.  The -ic suffix was from the Middle English -ik, from the Old French -ique, from the Latin -icus, from the primitive Indo-European -kos & -os, formed with the i-stem suffix -i- and the adjectival suffix -kos & -os.  The form existed also in the Ancient Greek as -ικός (-ikós), in Sanskrit as -इक (-ika) and the Old Church Slavonic as -ъкъ (-ŭkŭ); a doublet of -y.  In European languages, adding -kos to noun stems carried the meaning "characteristic of, like, typical, pertaining to" while on adjectival stems it acted emphatically; in English it's always been used to form adjectives from nouns with the meaning "of or pertaining to".  A precise technical use exists in physical chemistry where it's used to denote certain chemical compounds in which a specified chemical element has a higher oxidation number than in the equivalent compound whose name ends in the suffix -ous (eg sulphuric acid (H2SO4) has more oxygen atoms per molecule than sulphurous acid (H2SO3)).  The word was coined in the sense of "pertaining to the state of consciousness when awaking from sleep" by Frederic WH Myers (1843-1901), the construct being from hypno- (sleep) + the second element from the Greek pompe (sending away) from pempein (to send).  The word was introduced in Glossary of Terms used in Psychical Research, Proceedings of the Society for Psychical Research, vol. xii (1896-1897 supplement), the journal of an organization Myers co-founded.  Hypnopompic & hypnopompia were thought to be necessary as companion (in the sense of "bookend") terms to hypnagogic & hypnagogia (illusions hypnagogiques), the "vivid illusions of sight or sound (sometimes referred to as 'faces in the dark') which sometimes accompany the prelude to the onset of sleep".  Hypnopompic is an adjective and hypnopompia is a noun; the noun plural is hypnopompias.

Frederic Myers was a philologist with a great interest in psychical matters, both the orthodox science and aspects like the work of mediums who would "contact the spirits of the dead", the latter, while not enjoying much support in the scientific establishment, both taken seriously and practiced by a remarkable swathe of "respectable society".  Mediums enjoyed a burst in popularity in the years immediately after World War I (1914-1918) when there was much desire by grieving wives & mothers to contact dead husbands and sons, and some surprising figures clung to beliefs in such things well into the twentieth century.  In the early 1960s, a reunion of surviving pilots from the Battle of Britain (1940) was startled when their wartime leader and former head of Royal Air Force (RAF) Fighter Command, Hugh Dowding (1882–1970), told them he regularly communicated with the spirits of their fallen comrades.  Myers also had what might now be called a "varied" love life although it's said in his later life his interest was restricted to women, including a number of mediums, all reputed to be "most fetching".

In the profession, while acknowledging the potential usefulness of the distinction in things like note-taking in a clinical environment, few psychologists & psychiatrists appear to regard hypnopompia & hypnagogia as separate phenomena, both understood as the imagery, sounds and strange bodily feelings sometimes felt in that state between sleep and being fully awake.  In recent years, as the very definition of "sleep" has increasingly been segmented, the state in some literature has also been referred to both as "stage 1 sleep" & "quiet wakefulness" although the former would seem to be most applicable to falling asleep (hypnagogia) rather than waking up (hypnopompia).  Still, the distinction between what's usually a late night versus an early morning thing does seem of some significance, especially as most in the discipline of the science of sleep (now quite an industry) seem to concede wake-sleep & sleep-wake transitions are not fully understood; nor are the associated visual experiences, and debate continues about the extent to which they should (or can) be differentiated from other dream-states associated with deeper sleep.

Waking in a hypnopompic state: Lindsay Lohan in Falling for Christmas (Netflix, 2022).

One striking finding is that so few remember hypnopompic & hypnagogic imagery and that applies even among those who otherwise have some ability to recall their dreams.  What's often reported by subjects or patients is that the memory is fleeting and difficult to estimate in duration and that while the memory is often sustained for a short period after "waking", it quickly vanishes.  An inability to recall one's dreams is not unusual but this behavior is noted also for those with a sound recollection of the dreams enjoyed during deeper sleep states.  What seems to endure is a conceptual sense of what has been "seen": faces known & unknown, fragmentary snatches of light and multi-dimensional geometric shapes.  While subjects report they "know" they have "seen" (and also "heard") more fully-developed scenes, their form, nature or even the predominant colors usually prove elusive.  Despite all this, it's not uncommon for people to remark the hypnopompic experience is "pleasant", especially the frequently cited instances of floating, flying or even a separation from the physical body, something which seems more often called "trippy" than "scary".

For some however, the hypnopompic & hypnagogic experience can be recalled, haunting the memory, and the speculation is that if "nightmarish" rather than "dream-like", recollection is more likely, especially if associated with "paralyzed hypnagogia or hypnopompia" in which a subject perceives themselves "frozen", unable to move or speak while the experience persists (for centuries a reported theme in "nightmares").  Observational studies to determine the length of these events are difficult to perform but some work in neurological monitoring seems to suggest what a patient perceives as lasting some minutes may be active for only seconds, the implication being a long "real-time" experience can be manufactured in the brain in a much shorter time and the distress can be clinically significant.  For this reason, the American Psychiatric Association's (APA) Diagnostic and Statistical Manual of Mental Disorders (DSM) regards hypnagogia & hypnopompia as something similar to synaesthesia (where a particular sensory stimulus triggers a second kind of sensation; things like letters being associated with colors) or certain sexual fetishes (which were once classified as mental disorders) in that they're something which requires a diagnosis and treatment only if the condition is troubling for the patient.  In the fifth edition of the DSM (DSM-5 (2013)), hypnagogia anxiety was characterized by intense anxiety symptoms during this state, disturbing sleep and causing distress; it's categorized with sleep-related anxiety disorders.

The Nightmare (1781), oil on canvas by the Swiss-English painter John Henry Fuseli (1741-1825), Detroit Institute of Arts.  It's a popular image to use to illustrate something "nightmare related".

When the political activist Max Eastman (1883–1969) visited Sigmund Freud (1856-1939) in Vienna in 1926, he observed a print of Fuseli's The Nightmare hung next to Rembrandt's (Rembrandt Harmenszoon van Rijn; 1606-1669) The Anatomy Lesson.  Although well known for his work on dream analysis (though it's the self-help industry more than the neo-Freudians who have filled the book-shelves), Freud never mentions Fuseli's famous painting in his writings but it has been used by others in books and papers on the subject.  The speculation is Freud liked the work (clearly, sometimes, a painting is just a painting) but nightmares weren't part of the intellectual framework he developed for psychoanalysis which suggested dreams (apparently of all types) were expressions of wish fulfilment while nightmares represented the superego's desire to be punished; later he would refine this with the theory a traumatic nightmare was a manifestation of "repetition compulsion".

Thursday, January 25, 2024

Alexithymia

Alexithymia (pronounced ey-lek-suh-thahy-mee-uh)

In psychiatry, a range of behaviors associated with certain conditions which manifests as a difficulty in experiencing, processing, expressing and describing emotional responses.

1973: The construct was the Ancient Greek a- (not) + λέξις (léxis) (speaking) + θυμός (thumós) (heart (in the sense of "soul")) which deconstructs as a- + lexi + -thymia (in a medical context, a suffix meaning "one's state of mind"), alexithymia thus understood as "without words for emotions".  Alexithymia is a noun and alexithymic & alexithymiac are nouns & adjectives; the noun plural of alexithymia is also alexithymia but alexithymics, the plural of alexithymic, is the more common form.

The word alexithymia was coined in 1973 by US-based psychiatrists John Nemiah (1918–2009) and Peter Sifneos (1920-2008) to describe a psychological state as well known to the general population as to the profession, the former preferring terms like "emotionless", "taciturn", "feelingless" or "impassive" although alexithymia has meanings which are more specific.  Translated literally as "no words for emotions", in practice it's a spectrum condition which references individual degrees of difficulty in recognizing, processing or expressing emotional states or experiences.  Although it appears in both the American Psychiatric Association's (APA) Diagnostic and Statistical Manual of Mental Disorders (DSM) and the World Health Organization's (WHO) International Classification of Diseases (ICD), neither classes it as either a diagnosable mental disorder or a symptom.  Instead, it should be regarded as a dimensional construct and one distributed normally in the general population.  In other words, it's a personality trait and like all spectrum conditions, it varies in frequency and intensity between individuals.

Alexithymia was first described as a psychological construct characterized by difficulties in identifying, describing, and interpreting one's emotions but it was soon realized individuals less able to recognize and express their own feelings would often have a diminished ability to understand the emotional experiences of others.  Clinically, alexithymia is classified in two sub-groups: (1) Primary (or Trait) Alexithymia is considered more stable and enduring and the evidence suggests there is often a genetic or developmental basis, those with primary alexithymia displaying indications from an early age.  (2) Secondary (or State) Alexithymia is something usually temporary and often associated with specific psychological or medical conditions, noted especially in patients suffering post-traumatic stress disorder (PTSD) and depressive illnesses.

Available for both Android and iOS, there are Alexithymia apps and it's possible there are those who wish to increase the extent of at least some aspects of the condition in their own lives, the apps presumably a helpful tool in monitoring progress in either direction.  There must be emos who would like to be more alexithymic. 

The characteristics common to alexithymia include (1) a limited imaginative capacity and "fantasy life", (2) a difficulty in identifying and describing emotions, (3) thought processes which focus predominately on external events rather than internal emotional experience, (4) a difficulty in distinguishing between emotions and bodily sensations and (5) challenges in understanding (or even recognizing) the emotions of others.  As a spectrum condition, alexithymia can vary greatly in severity and not all with alexithymia will experience the same symptoms, a high incidence being reported among patients with psychiatric and psychosomatic disorders.  Additionally, it does seem a common feature of neurological disease with most evidence available for patients with traumatic brain injury, stroke, and epilepsy although the numbers may be slanted because of the greater volume of study of those affected and it remains unclear how independent it is from affective disorders such as depression and anxiety, both common in neurological conditions.

A sample from the validation study of the Toronto Alexithymia Scale (TAS-26) (in the Croatian population).

Clinicians have available a number of questionnaires which can be used to assess a patient's state of alexithymia and these can do more than provide a metric; the limitation of drawing a conclusion from observation alone is that with such an approach it can genuinely be impossible to distinguish between the truly alexithymic and those who have no difficulties in experiencing, processing, expressing and describing emotional responses but for some reason choose not to do so.  Such behavior can of course induce problems in inter-personal relationships but it's something distinct from alexithymia and, importantly too, it is clinically distinct from psychiatric personality disorders such as antisocial personality disorder.  However, as a structural view of the DSM over its seventy-odd years would indicate, within psychiatry mission creep has been a growing phenomenon and the definitional nets tend to be cast ever wider, so it's not impossible that alexithymia may in some future edition be re-classified as a diagnostic criterion or at least recognized formally as a symptom.  It has for some time been acknowledged the DSM has over those decades documented the reassessment of some aspects of the human condition as mental disorders but what is less discussed is the relationship between cause and effect and there will be examples of both: it would be interesting to try to work out if there's a pattern in the nature of (1) the changes the DSM has driven compared with (2) those which retrospectively have been codified.

Lindsay Lohan and her lawyer in court, Los Angeles, December 2011.

There may be movement because alexithymia has many of the qualities and attributes which appeal to both academia and the pharmaceutical industry.  The orthodoxy is that it occurs in some 10% of the general population but is disproportionately seen in patients suffering certain mental conditions, notably neuro-developmental disorders; the prevalence among those with autism spectrum disorder (ASD) estimated at anything up to 90%.  What will probably surprise few is that within any sub-group, it is males who are by far the most represented population and there is even the condition normative male alexithymia (NMA) although that describes the behavior and not the catchment, NMA identified also in females.

Monday, January 8, 2024

Solemncholy

Solemncholy (pronounced sol-uhm-kol-ee)

(1) Solemn; serious.

(2) Solemn and melancholic.

1772: The construct was solemn + (melan)choly.  The element -choly was never a standard suffix and was a Middle English variant of the -colie used in French.  The Middle English adjective solemn dated from the late thirteenth century and was from solemne & solempne, from either Old French or directly from the Late Latin sōlennis & sōlempnis or the Classical Latin sōlemnis, a variant of sollemnis (consecrated, holy; performed or celebrated according to correct religious forms) which has always been of obscure origin although Roman scholars thought it could have come only from sollus (whole; complete), the derivative adjective formed by appending the noun annus (year), thus the idea of sollemnis meaning "taking place every year".  Not all modern etymologists are convinced by that but acknowledge "some assimilation via folk-etymology is possible".  In English, the extension of meaning from "annual events; sacred rites, ceremonies, holy days" to "a grave and serious demeanor; mirthless" was associative, describing the behaviour expected of individuals attending such events.  Over time, the later sense became dissociated from the actual events and the original meaning became obsolete, surviving only in a handful of formal ecclesiastical calendars.  The word, without any reference to religious ceremonies and meaning "marked by seriousness or earnestness", was common by the late fourteenth century, the sense of "fitted to inspire devout reflection" noted within decades.  Solemncholy is an adjective and no sources list the noun solemncholic or the adverb solemncholically as standard forms although, by implication, the need would seem to exist.  Emos presumably apply the adjectival comparative (more solemncholy) & superlative (most solemncholy) and perhaps too (during emo get-togethers) the plural forms solemncholics & solemncholies.

Melancholy was from the Middle English melancolie & malencolie (mental disorder characterized by sullenness, gloom, irritability, and propensity to causeless and violent anger), from the thirteenth century Old French melancolie (black bile; ill disposition, anger, annoyance), from the Late Latin melancholia, from the Ancient Greek μελαγχολία (melancholia) (atrabiliousness; sadness (literally "excess of black bile")), the construct being μέλας (mélas) or μελαν- (melan-) (black, dark, murky) + χολή (kholḗ) (bile).  It appeared in Latin as ātra bīlis (black bile) and was for centuries part of orthodox medical diagnosis; the adjectival use was a genuine invention of Middle English although whether the use of -ly as a component of the suffix was an influence or a product isn't known.  Pre-modern medicine attributed what would now be called "depression" to excess "black bile", a secretion of the spleen and one of the body's four "humors" which needed to be "in balance" to ensure physical & mental well-being.  The adjectival use in Middle English to describe "sorrow, gloom" was most associated with unrequited love or doomed affairs but this is likely more the influence of poets than doctors.  As the medical profession's belief in the four humors declined during the eighteenth century with improved understanding of human physiology, the word was in the mid-1800s picked up by the newly (almost) respectable branch of psychiatry where it remained a defined "condition" until well into the twentieth century.

In antiquity the humoral system was a concept rather than a standardized systemization and there existed competing models with more or fewer components, but it's because the description with four was that endorsed by the Greek physician Hippocrates (circa 460–circa 370 BC) that it became famous in the West and was absorbed into medical practice.  The four humors of Hippocratic medicine were (1) black bile (μέλαινα χολή (melaina chole)), (2) yellow bile (ξανθη χολή (xanthe chole)), (3) phlegm (φλέγμα (phlegma)) & (4) blood (αἷμα (haima)), each corresponding with the four temperaments of man and linked also to the four seasons: yellow bile=summer, black bile=autumn, phlegm=winter & blood=spring.  Since antiquity, doctors and scholars wrote both theoretical and clinical works, the words melancholia and melancholy used interchangeably until the nineteenth century when the former came to refer to a pathological condition, the latter to a temperament.  Depression was derived from the Latin verb deprimere (to press down) and from the fourteenth century "to depress" meant to subjugate or to bring down in spirits; by 1665 it was applied to someone having "a great depression of spirit", Dr Johnson (Samuel Johnson, 1709-1784) using the word in a similar sense in 1753.  Later, the term came into use in physiology and economics.

What was for over two-thousand years known as melancholia came gradually to be called depression, a reclassification formalized in the mid-twentieth century when mental illness was subject to codification.  The first edition of the American Psychiatric Association's (APA) Diagnostic and Statistical Manual of Mental Disorders (DSM (1952)) included depressive reaction and the DSM-II (1968) added depressive neurosis, defined as an excessive reaction to internal conflict or an identifiable event, and also included a depressive type of manic-depressive psychosis within the category of Major Affective Disorders.  The term Major Depressive Disorder was introduced by a group of US clinicians in the mid-1970s and was incorporated into the DSM-III (1980).  Interestingly, the ancient idea of melancholia survives in modern medical literature in the notion of the melancholic subtype but, from the 1950s, the newly codified definitions of depression were widely accepted (although not without some dissent) and the nomenclature, with enhancements, continued in the DSM-IV (1994) and DSM-5 (2013).

According to the Oxford English Dictionary (OED), the earliest known instance of solemncholy in text dates from 1772 in the writings of Philip Vickers Fithian (1747–1776), peripatetic tutor, missionary & lay-preacher of the Presbyterian denomination of Christianity, now best remembered for his extensive diaries and letters which continue to provide historians with source material relating to the pre-revolutionary north-eastern colonies which would later form the United States of America.  His observations on slavery and the appalling treatment of those of African origin working the plantations in Virginia remain a revealing counterpoint to the rationalizations and justifications (not infrequently on a theological or scriptural basis) offered by many other contemporary Christians.  Those dictionaries which include an entry for solemncholy often note it as one of the humorous constructions in English, based usually on words from other languages or an adaptation of a standard English form.  That's certainly how it has come to be used but Fithian was a Presbyterian who aspired to the ministry, not a breed noted for jocularity, and in his journal entries it's clear he intended the word to mean only that he was pursuing serious matters, in 1773 writing: "Being very solemncholy and somewhat tired, I concluded to stay there all night."

So it was an imaginative rather than a fanciful coining.  In contemporary culture, with mental health conditions increasingly fashionable, solemncholy (although still sometimes, if rarely, used in its original sense) found a new niche among those who wished to intellectualize their troubled state of mind and distinguish their affliction from mere depression which had become a bit common.  In a roundabout way, this meant it found a role too in humor, a joke about someone’s solemncholy still acceptable whereas to poke fun at their depression would be at least a micro-aggression:

Q: Victoria says she suffers from solemncholy.  Do you think that's a real condition?

A: Victoria is an emo; for her solemncholy is a calling.

Lindsay Lohan and her lawyer in court, Los Angeles, December 2011.

The companion term to solemncholy is the rarely encountered leucocholy (a state of feeling that accompanies preoccupation with trivial and insipid diversions).  The construct of leucocholy was leuco- + (melan)choly.  The leuco- prefix (which had appeared also as leuko-, leuc- & leuk-) was from the Ancient Greek λευκός (leukós) (white; colourless; leucocyte), from the Proto-Hellenic leukós, from the primitive Indo-European lewk- (white; light; bright), the cognates including the Latin lūx, the Sanskrit रोचते (rocate), the Old Armenian լոյս (loys) and the Old English lēoht (light, noun) from which English gained "light".  In the Ancient Greek, the word evolved to enjoy a range of meanings, just as would happen in English, including (1) bright, shining, gleaming, (2) light in color; white, (3) pale-skinned, weakly, cowardly & (4) fair, happy, joyful.  Leucocholy is said to have been coined by the English poet and classical scholar Thomas Gray (1716–1771) whose oeuvre was highly regarded despite being wholly compiled into one slim volume and he's remembered also for declining appointment as England's Poet Laureate, thereby forgoing both the tick of approval from the establishment and the annual cask of "strong wine" which came with the job.  What he meant by a "white melancholy" seems to have been a state of existence in which there may not be joy or enchantment but which is pleasant: unfulfilling yet undemanding.  In such a state of mind, as he put it: ça ne laisse que de s'amuser (which translates most elegantly as something like "all that is left for us is to have some fun").

Wednesday, December 13, 2023

Autophagia

Autophagia (pronounced aw-tuh-fey-juh or aw-tuh-fey-jee-uh)

(1) In cytology, the process of self-digestion by a cell through the action of enzymes originating within the same cell (the controlled digestion of damaged organelles within a cell which is often a defensive and/or self-preservation measure and associated with the maintenance of bodily nutrition by the metabolic breakdown of some bodily tissues).

(2) In cytology, a type of programmed cell death accomplished through self-digestion (known also as autophagic cell death and distinct from apoptosis; associated with the maintenance of bodily nutrition by the metabolic breakdown of some bodily tissues).

(3) In psychiatry, self-consumption; the act of eating oneself.

The construct was auto- + -phagia.  The auto- prefix was a learned borrowing from the Ancient Greek αὐτο- (auto-) (self-) (reflexive, regarding or to oneself (and most familiar in forms like autobiography)), from αὐτός (autós) (himself/herself/oneself), from either a construct of (1) the primitive Indo-European hew (again) + to- (that) or (2) the Ancient Greek reflexes of those words, αὖ (aû) (back, again, other) + τόν (tón) (the) and related to the Phrygian αυτος (autos), the existence of alternatives suggesting there may have been a common innovation.  Phagia was from the Ancient Greek -φαγία (-phagía) (and related to -φάγος (-phagos) (eater)), the suffix corresponding to φαγεῖν (phageîn) (to eat), the infinitive of ἔφαγον (éphagon) (I ate), which serves as aorist (a verb paradigm found in certain languages, usually an unmarked form or one that expresses the perfective or aorist aspect) for the defective verb ἐσθίω (esthíō) (I eat).  The alternative spelling is autophagal and the synonyms (sometimes used in non-specialist contexts) are self-consumption & auto-cannibalism.  Autophagia, autophagophore, autophagosome & autophagy are nouns, autophagically is an adverb, autophagocytotic is an adjective and autophagic is an adjective (and a non-standard noun); the noun plural is autophagies.

In cytology (in biology, the study of cells), autophagy is one aspect of evolutionary development, a self-preservation and life-extending mechanism in which damaged or dysfunctional parts of a cell are removed and used for cellular repair.  Internally, it's thus beneficial, the removal or recycling of debris both efficient and (by this stage of evolutionary development) essential, most obviously because it removes toxins and effectively "creates" younger cells from the old; it can thus be thought an anti-aging mechanism.  It's something which has also interested cancer researchers because all cancers (as the word and the parameters of the disease(s) are defined) start from some sort of cell-defect and the speculation is it might be possible in some way to adapt the autophagic process, re-purposing it to identify and remove suspect cells.

In psychiatry, autophagia refers to the act of eating oneself, sometimes described as self-consumption or the even more evocative auto-cannibalism.  Perhaps surprisingly, the behavior is not explicitly mentioned in the American Psychiatric Association's (APA) Diagnostic and Statistical Manual of Mental Disorders (DSM) which of course means there are no published diagnostic criteria or recommendations for treatment.  The DSM's editors note there are a number of reasons why a specific behavior may not be included in the manual, notably (1) the lack of substantial empirical evidence or research, (2) the rarity of cases and (3) the material to hand being unsuitable (in terms of volume or quality) for the development of practical tools for clinicians to diagnose and treat a disorder.

It would be flippant to suggest autophagia might have been included when the revisions in the fifth edition of the DSM (DSM-5 (2013)) took a more systematic approach to eating disorders; as well as definitional criteria being set for the range of behaviours within that general rubric, just about every other form of "unusual" consumption was listed, including sharp objects (acuphagia), purified starch (amylophagia), burnt matches (cautopyreiophagia), dust (coniophagia), feces (coprophagia), vomit (emetophagia), raw potatoes (geomelophagia), soil, clay or chalk (geophagia), glass (hyalophagia), stones (lithophagia), metal (metallophagia), mucus (mucophagia), ice (pagophagia), lead (plumbophagia), hair, wool, and other fibres (trichophagia), urine (urophagia), blood (hematophagia (sometimes called vampirism)) and wood or derivatives such as paper & cardboard (xylophagia).  The DSM-5 also codified the criteria for behaviour to be classified as pica (a disorder characterized by craving and appetite for non-edible substances, such as ice, clay, chalk, dirt, or sand and named for the jay or magpie (pīca in Latin), based on the idea the birds will eat almost anything): it must (1) persist for more than one (1) month beyond an age in infancy when eating such objects is not unusual, (2) not be a culturally sanctioned practice and (3), in quantity or consequence, be of sufficient severity to demand clinical intervention.  However, pica encompassed only "non-nutritive substances" which of course one's own body parts are not.

Finger food: Severed fingers are a popular menu item for Halloween parties; kids think they're great.  For those who like detail, those emulating nail polish seem to be following Dior shades 742 (top right) and 999 (bottom right). 

In the profession, autophagia seems to be regarded not as a progression from eating one's fingernails or hair but as something with more in common with the cutters.  Cutters are the best known example of self-harmers, the diagnosis of which is described in the DSM as non-suicidal self-injury (NSSI).  NSSI is defined as the deliberate, self-inflicted destruction of body tissue without suicidal intent and for purposes not socially sanctioned; it includes behaviors such as cutting, burning, biting and scratching skin.  Behaviorally, it's highly clustered, instances especially prevalent during adolescence and the majority of cases female, although there is some evidence the instances among males may be under-reported.  It's a behavior which has long interested and perplexed the profession because, as something which involves deliberate and intentional injury to body tissue in the absence of suicidal intent, (1) it runs counter to the fundamental human instinct to avoid injury and (2) as defined the injuries are never sufficiently serious to risk death, a well-understood reason for self-harm.  Historically, such behaviors tended to be viewed as self-mutilation and were thought a form of attenuated suicide but in recent decades more attention has been devoted to the syndrome, beginning in the 1980s at a time when self-harm was regarded as a symptom of borderline personality disorder (BPD) (personality disorders first entered the DSM when DSM-III was published in 1980), distinguished by suicidal behavior, gestures, threats or acts of self-mutilation.  Clinicians however advanced the argument the condition should be thought a separate syndrome (deliberate self-harm syndrome (DSHS)), based on case studies which identified (1) a patient's inability to resist the impulse to injure themselves, (2) a raised sense of tension prior to the act and (3) an experience of release or at least partial relief after the act.  It was also noted a small number of patients repeatedly self-harmed and it was suggested a diagnosis called repetitive self-mutilation syndrome (RSMS) should be added to the DSM.  Important points associated with RSMS were (1) an absence of conscious suicidal intent, (2) the patient's perpetually negative affective/cognitive state which was (temporarily) relieved only after an act of self-harm and (3) a preoccupation with and repetitiveness of the behavior.  Accordingly, NSSI Disorder was added to the DSM-5 (2013) and noted as a condition in need of further study.

However, although there would seem some relationship to cutting, eating one's body parts is obviously a different behavior and the feeling seems to be that autophagia involves a quest for pain, suggesting some overlap with other conditions, and it certainly belongs in the sub-category of self-injurious behavior (SIB).  The literature is said to be sparse and the analysis seems not to have been extensive but the behavior has been noted in those diagnosed with a variety of conditions including personality disorders, anxiety disorders, obsessive-compulsive disorder, schizophrenia and bipolar disorder.  The last two have been of particular interest because the act of biting off and eating some body part (most typically fingers) has been associated with the experience of hallucinations and patients have been recorded as saying the pain of the injury "makes the voices stop".  Importantly, autophagia has a threshold and while in some senses it can be thought a spectrum condition (in terms of frequency & severity), behaviors such as biting (and even consuming) the loose skin on the lips (morsicatio buccarum) or the ragged edges of skin which manifest after nail biting (onychophagia) are common and few proceed to autophagia; clinicians note neurological reasons may also be involved.

Lindsay Lohan with bread on the syndicated Rachael Ray Show, April 2019.

Autophagia and related words should not be confused with the adjective artophagous (bread-eating).  The construct was artos + -phagous.  Artos was from the Ancient Greek ἄρτος (ártos) (bread), of pre-Greek origin.  Phagous was from the Latin -phagus, from the Ancient Greek -φάγος (-phágos) (eating) from φαγεῖν (phageîn) (to eat).  Apparently, in the writings of the more self-consciously erudite, the word artophagous, which enjoyed some currency in the nineteenth century, was still in occasional use as late as the 1920s but most lexicographers now either ignore it or list it as archaic or obsolete.  It's an example of a word which has effectively been driven extinct even though the practice it describes (the eating of bread) remains as widespread and popular as ever.  Linguistically, this is not uncommon in English and is analogous with the famous remark by Sheikh Ahmed Zaki Yamani (1930–2021; Saudi Arabian Minister of Petroleum and Mineral Resources 1962-1986): "The Stone Age came to an end not for a lack of stones, and the Oil Age will end, but not for a lack of oil" (the first part of that paraphrased usually as the punchier "the Stone Age did not end because the world ran out of rocks").

Monday, November 13, 2023

Somnambulism

Somnambulism (pronounced som-nam-byuh-liz-uhm or suhm-nam-byuh-liz-uhm)

Sleepwalking; a condition characterized by walking while asleep or in a hypnotic trance.

1786: A Modern English borrowing, via the French somnambulisme, from the New Latin somnambulismus (sleepwalking), the construct of the original being somn(us) (sleep) + ambul(āre) (to walk) + -ismus (equivalent to the English -ism).  In English, the construct became somnus + ambulo + -ism.  Somnus came from the Proto-Italic swepnos, from the primitive Indo-European swépnos, from the root swep- (to sleep); the form spread east too, including the Lithuanian sãpnas.  Ambulo is from ambi- + alō (to wander), from the primitive Indo-European h₂el- (to wander) and was cognate with the Ancient Greek ἄλη (álē) (wandering) & ἀλύω (alúō) (to wander in mind, to roam).  The suffix -ism is ultimately from either the Ancient Greek -ισμός (-ismós), a suffix that forms abstract nouns of action, state, condition, doctrine, from the stem of verbs in -ίζειν (-ízein) (whence English gained -ize), or from the related Ancient Greek suffix -ισμα (-isma), which more specifically expressed a finished act or thing done.  In the technical jargon of clinicians, there's the mysterious semisomnambulistic, the implication presumably that somnambulism (at least when not raised in court as a defense) may be a spectrum condition.  Somnambulism, somnambulator, somnambulation, somnambulance & somnambulist are nouns, somnambulate & somnambulating are verbs, somnambular, somnambulic & somnambulistic are adjectives and somnambulistically is an adverb; the most common noun plural is somnambulists.

Sleepwalking scene, Lady Macbeth (1829), by Johann Heinrich Ramberg (1763–1840).

As it was in science, philosophy and art, the Enlightenment proved productive in words, coinings needed to describe newly discovered things and novel ideas.  The noun somnambulism came into use originally during the excitement over "animal magnetism"; it won out over noctambulation, which did not long endure.  A flurry of linguistic activity ensued in the early nineteenth century including somnambule (1837), somnambulator (1803), somnambulary (1827) & somnambular (1820).  When the theory of animal magnetism (the doctrine that one person can exercise influence over the will and nervous system of another and produce certain phenomena by virtue of a supposed emanation called animal magnetism) was published in 1778, it created great interest.  Called mesmerism (from the French mesmérisme and named for Franz Anton Mesmer (1734-1815), an Austrian physician who developed a theory of animal magnetism and a mysterious body fluid which allows one person to hypnotize another), the still-used word is synonymous with hypnotism or artificial somnambulism.  Another similar word for the same effect was braidism, named after the Scottish-born physician James Braid (1795-1860), and an early term for "hypnotic suggestion" was "mesmeric promise".

Somnambulism is classified among the parasomnias, sleep-wake disorders characterized by undesirable motor, verbal, or experiential phenomena occurring in association with sleep, specific stages of sleep, or sleep-wake transition phases.  In the fifth edition of the American Psychiatric Association's (APA) Diagnostic and Statistical Manual of Mental Disorders (DSM-5 (2013)), somnambulism is noted as a condition rather than a mental illness, with most attention given to the protocols to be followed when awakening sleepwalkers.

Lindsay Lohan and her lawyer in court, Los Angeles, December 2011.

The other profession to take interest in somnambulism is the criminal bar.  At common law, sleepwalking can in some circumstances be a complete defense to any charge including murder.  That’s because the law (generally) will convict in criminal matters only if intent can be proved and that requires a “guilty mind”.  The legal Latin is mēns rea (literally “guilty mind”), from the English common law precept actus non facit reum nisi mens rea sit (the act does not make a person guilty unless the mind is also guilty).  It’s rarely successful but if it can be proved a defendant was, at the time of the act, “a sane automaton” (ie in effect sleepwalking and thus unaware of their actions), it’s an absolute defense.  Lawyers like it because sane automatism is a defense even against crimes of strict liability like dangerous driving, where no intent is necessary.  If the defense succeeds, the defendant walks free, unlike a finding of insanity (ie the notion of the insane automaton) where even if not found guilty, they're anyway locked-up.

Tuesday, October 24, 2023

Anorexia

Anorexia (pronounced an-uh-rek-see-uh)

(1) In clinical medicine, loss of appetite and inability to eat.

(2) In psychiatry, as anorexia nervosa, a defined eating disorder characterized by fear of becoming fat and refusal of food, leading to debility and even death.

(3) A widely-used (though clinically incorrect) short name for anorexia nervosa.

1590–1600: From the New Latin, from the Ancient Greek ἀνορεξία (anorexía), the construct being ἀν- (an-) (without) + ὄρεξις (órexis) (appetite; desire).  In both the Greek and Latin, it translated literally as "a lack of appetite".  Órexis (appetite, desire) is from oregein (to desire, stretch out) and was cognate with the Latin regere (to keep straight, guide, rule).  Although adopted as a metaphorical device to describe even inanimate objects, anorexia is most often (wrongly) used as verbal shorthand for the clinical condition anorexia nervosa.  The former is the relatively rare condition in which appetite is lost for no apparent reason; the latter the more common eating disorder related in most cases to body image.  Interestingly, within the English-speaking world, there are no variant pronunciations.

Anorexia Nervosa and the DSM

The pro-ana community has created its own sub-set of standard photographic angles, rather as used car sites typically feature certain images such as the interior, the odometer, the engine etc.  Among the most popular images posted on "thinspiration" pages are those which show bone definition through skin and, reflecting the superior contrast possible, there's a tendency to use grayscale, usually converted from color originals.  The favored body parts include the spine, hip bones, clavicles (collar bones) and the shoulder blades.

Although documented since antiquity, the condition in its modern form wasn't noted in western medical literature until an 1873 paper presented to the Royal College of Physicians (RCP) called "Anorexia Hysterica", a description of a loss of appetite without an apparent gastric cause.  That same year, a similar condition was mentioned in a French publication, also called "l'anorexie hystérique", which described food refusal combined with hyperactivity.  Although the author of the earlier work had within a year changed the descriptor to "Anorexia Nervosa", the implication in all these papers was of an affliction exclusively female, something very much implied in "l'anorexie hystérique", hysteria then a mainstream diagnosis and one thought inherently "a condition of women".

A slight Lindsay Lohan demonstrates "an anorexic look" which is something distinct from the clinically defined condition "anorexia nervosa" although there's obviously some overlap.

After its acceptance as a psychogenic disorder in the late nineteenth century, anorexia nervosa (AN) was the first eating disorder placed in the American Psychiatric Association's (APA) Diagnostic and Statistical Manual of Mental Disorders (DSM).  In the first edition (DSM-I (1952)), it was considered a psycho-physiological reaction (a neurotic illness).  In the DSM-II (1968), it was listed with special symptoms & feeding disturbances, which also included pica and rumination.  In DSM-III (1980), eating disorders were classified under disorders of childhood or adolescence, perhaps, at least in part, contributing to the under-diagnosis of later-onset cases.  At that time, the American Psychiatric Association (APA) created two specific categories that formally recognized the diagnosis of eating disorders: AN and binge eating (called bulimia in DSM-III and bulimia nervosa (BN; binge eating followed by compensatory purging) in both the revised DSM-III (1987) and DSM-IV (1994)).  In the DSM-IV, all other clinically significant eating disorder symptoms were absorbed by the residual categories of eating disorder not otherwise specified (EDNOS) and binge-eating disorder (BED), noting the disorders were subjects for further research.  Subsequently, when the DSM-IV was revised (2000), eating disorders moved to an independent section.  The DSM-5 (2013) chapter for eating disorders added to the alphabet soup.  In addition to pica, AN, BN and BED, DSM-5 added avoidant/restrictive food intake disorder (ARFID) and other specified feeding or eating disorder (OSFED), the latter including some other peculiar pathological eating patterns, like atypical AN (where all other criteria for AN are met, but weight is in the normal range).