
Saturday, January 10, 2026

Aphantasia

Aphantasia (pronounced ay-fan-tay-zhuh)

The inability voluntarily to recall or form mental images.

2015: The word (not the diagnosis) was coined by UK neurologist Dr Adam Zeman (b 1957), neuropsychologist Dr Michaela Dewar (b 1976) and Italian neurologist Sergio Della Sala (b 1955), first appearing in the paper Lives without imagery.  The construct was a- (from the Ancient Greek ἀ- (a-), used as a privative prefix meaning “not”, “without” or “lacking”) + phantasía (from the Greek φαντασία (“appearance”, “imagination”, “mental image” or “power of imagination”), from φαίνω (phaínō) (“to show”, “to make visible” or “to bring to light”)).  Literally, aphantasia can be analysed as meaning “an absence of imagination” or “an absence of mental imagery” and in modern medicine it’s defined as “the inability voluntarily to recall or form mental images”.  Even in Antiquity, there was some meaning shift in phantasía, Plato (circa 427-348 BC) using the word to refer generally to representations and appearances whereas Aristotle (384-322 BC) added a technical layer, his sense being a faculty mediating between perception (aisthēsis) and thought (noēsis).  It’s the Aristotelian adaptation (the mind’s capacity to form internal representations) which flavoured the use in modern neurology.  Aphantasia is a noun and aphantasic is a noun & adjective; the noun plural is aphantasics.

Scuola di Atene (The School of Athens, circa 1511), fresco by Raphael (Raffaello Sanzio da Urbino, 1483–1520), Apostolic Palace, Vatican Museums, Vatican City, Rome.  Plato and Aristotle are the figures featured in the centre.

In popular use, the word “aphantasia” can be misunderstood because of the paths taken in English by “phantasy”, “fantasy” and “phantasm”, all derived from the Ancient Greek φαντασία (phantasía) meaning “appearance, mental image, imagination”.  In English, this root was picked up via Latin and French but the multiple forms each evolved in distinct semantic trajectories.  The fourteenth-century phantasm came to mean “apparition, ghost, illusion” so was used of “something deceptive or unreal”, the connotation being “the supernatural; spectral”.  This appears to be the origin of the association of “phantas-” with unreality or hallucination rather than normal cognition.  In the fifteenth & sixteenth centuries, the spellings phantasy & fantasy were for a time interchangeable although divergence came with phantasy used in its technical senses of “mental imagery”; “faculty of imagination”; “internal representation”, this a nod to Aristotle’s phantasía.  Fantasy is the familiar modern form, used to suggest “a fictional invention; daydream; escapism; wish-fulfilment”, the connotation being “imaginative constructions (in fiction); imaginative excess (in the sense of “unreality” or the “dissociative”); indulgence (as in “speculative or wishful thoughts”)”.

While the word “aphantasia” didn’t exist until 2015, in the editions of the American Psychiatric Association's (APA) Diagnostic and Statistical Manual of Mental Disorders (DSM) published between 1952 (DSM-I) and 2013 (DSM-5), there was no discussion (or even mention) of a condition anything like “an inability voluntarily to recall or form mental images”.  That’s because despite being “a mental condition” induced by something happening (or not happening) in the brain, the phenomenon has never been classified as “a mental disorder”.  Instead it’s a cognitive trait or variation in the human condition and technically is a spectrum condition, the “pure” aphantasic at one end of the spectrum, the hyperphantasic (highly vivid, lifelike mental imagery, sometimes called a “photorealistic mind's eye”) at the other.  That would of course imply the comparative adjective would be “more aphantasic” and the superlative “most aphantasic” but neither is a standard form.

The rationale for the “omission” was the DSM’s inclusion criteria, which include the requirement of some evidence of clinically significant distress or functional impairment attributable to a condition.  Aphantasia, in isolation, does not reliably meet this threshold in that many individuals have for decades functioned entirely “normally” without being aware they’re aphantasic while others presumably died of old age in similar ignorance.  That does of course raise the intriguing prospect the mental health of some patients may have been adversely affected by the syndrome only upon a clinician informing them of their status, thus making them realize what they were missing.  This, the latest edition of the DSM (DSM-5-TR (2022)) does not discuss.  The DSM does discuss imagery and perceptual phenomena in the context of other conditions (PTSD (post-traumatic stress disorder), psychotic disorders, dissociative disorders etc), but these references are to abnormal experiences, not the lifelong absence of imagery.  To the DSM’s editors, aphantasia remains a recognized phenomenon, not a diagnosis.

Given that aphantasia concerns aspects of (1) cognition, (2) inner experience and (3) mental representation, it wouldn’t seem unreasonable to expect the condition now described as aphantasia would have appeared in the DSM, even if only in passing or in a footnote.  However, in the seventy years between 1952-2022, over eight editions, there is no mention, even in DSM-5-TR (2022), the first volume released since the word was in 2015 coined.  That apparently curious omission is explained by the DSM never having been a general taxonomy of mental phenomena.  Instead, it’s an (ever-shifting) codification of the classification of mental disorders, defined by (1) clinically significant distress and/or (2) functional impairment and/or (3) a predictable course, prognosis and treatment relevance.  As a general principle the mere existence of an aphantasic state meets none of these criteria.

Crooked Hillary Clinton in orange Nina McLemore pantsuit, 2010.  If in response to the prompt "Imagine a truthful person" one sees an image of crooked Hillary Clinton (b 1947; US secretary of state 2009-2013), that is obviously wrong but it is not an instance of aphantasia because the image imagined need not be correct, it needs just to exist.

The early editions (DSM-I (1952) & DSM-II (1968)) heavily were slanted to the psychoanalytic, focusing on psychoses, neuroses and personality disorders with no systematic treatment of cognition as a modular function; the matter of mental imagery (even as abstract thought separated from an imagined image), let alone its absence, wholly is ignored.  Intriguingly, given what was to come in the field, there was no discussion of the cognitive phenomenology beyond gross disturbances (ie delusions & hallucinations).  Even with the publication of the DSM-III (1980) & DSM-III-R (1987), advances in scanning and surgical techniques, cognitive psychology and neuroscience seem to have made little contribution to what the DSM’s editorial board decided to include and although DSM-III introduced operationalized diagnostic criteria (as a part of a more “medicalised” and descriptive psychiatry), the entries still were dominated by a focus on dysfunctions impairing performance, the argument presumably that it was possible (indeed, probably typical) for those with the condition to lead full, happy lives; the absence of imagery ability thus not considered a diagnostically relevant variable.  Even in sections on (1) amnestic disorders (a class of memory loss in which patients have difficulty forming new memories (anterograde) or recalling past ones (retrograde), not caused by dementia or delirium but a consequence of brain injury, stroke, substance abuse, infections or trauma), with treatment focusing on the underlying cause and rehabilitation, (2) organic mental syndromes or (3) neuro-cognitive disturbance, there was no reference to voluntary imagery loss as a phenomenon in its own right.

Although substantial advances in cognitive neuroscience meant by the 1990s neuropsychological deficits were better recognised, both the DSM-IV (1994) and DSM-IV-TR (2000) continued to be restricted to syndromes with behavioural or functional consequences.  In a way that was understandable because the DSM still was seen by the editors as a manual for working clinicians who were most concerned with helping those afflicted by conditions with clinical salience; the DSM has never wandered far into subjects which might be matters of interesting academic research and mental imagery continued to be mentioned only indirectly, hallucinations (percepts without stimuli) and memory deficits (encoding and retrieval) both discussed only in the consequence of their effect on a patient, not as phenomena.  The first edition for the new century was DSM-5 (2013) and what was discernible was that discussions of major and mild neuro-cognitive disorders were included, reflecting the publication’s enhanced alignment with neurology but even then, imagery ability is not assessed or scaled: not possessing the power of imagery was not listed as a symptom, specifier, or associated feature.  So there has never in the DSM been a category for benign cognitive variation and that is a product of a deliberate editorial stance rather than an omission, many known phenomena not psychiatrised unless in some way “troublesome”.

The term “aphantasia” was coined to describe individuals who lack voluntary visual mental imagery, often discovered incidentally and not necessarily associated with brain injury or psychological distress.  In 2015 the word was novel but the condition had been documented for more than a century, Sir Francis Galton (1822–1911) in a paper published in 1880 describing what would come to be called aphantasia.  That work was a statistical study on mental imagery which doubtless was academically solid but Sir Francis’s reputation later suffered because he was one of the leading lights in what was in Victorian times (1837-1901) the respectable discipline of eugenics.  Eugenics rightly became discredited so Sir Francis was to some extent retrospectively “cancelled” (something like the Stalinist concept of “un-personing”) and these days his seminal contribution to the study of behavioural genetics is acknowledged only grudgingly.

Galton in 1880 noted a wide variation in “visual imagination” (ie it was understood as a “spectrum condition”) and in the same era, in psychology publications the preferred term seems to have been “imageless thought”.  In neurology (and trauma medicine generally) there were many reports of patients losing the power of imagery after a brain injury but no agreed name was ever applied because the interest was more in the injury.  A general unawareness that some people simply lacked the faculty presumably must have prevailed among the population because, as Galton wrote: “To my astonishment, I found that the great majority of the men of science to whom I first applied, protested that mental imagery was unknown to them, and they looked on me as fanciful and fantastic in supposing that the words “mental imagery” really expressed what I believed everybody supposed them to mean. They had no more notion of its true nature than a colour-blind man who has not discerned his defect has of the nature of colour.”

His paper must have stimulated interest because one psychologist reported some subjects possessing what he called a “typographic visual type” imagination in which ideas (which most would visualize as an image of some sort) would manifest as “printed text”.  That was intriguing because, in the way a computer in some respects doesn’t distinguish between an image file (jpeg, TIFF, webp, avif etc) which is a picture of someone and one which is a picture of their name in printed form, it would seem to imply at least some who are somewhere on the aphantasia spectrum retain the ability to visualize printed text, just not the object referenced.  Professor Zeman says he first became aware of the condition in 2005 when a patient reported having lost the ability to visualize following minor surgery and after the case was in 2010 documented in the medical literature in the usual way, it provoked a number of responses in which multiple people informed Zeman they had never in their lifetime been able to visualize objects.  This was the origin of Zeman and his collaborators coining “congenital aphantasia”, describing individuals who never enjoyed the ability to generate voluntary mental images.  Because it was something which came to general attention in the age of social media, great interest was triggered in the phenomenon and a number of “on-line tests” were posted, the best-known of which was the request for readers to “imagine a red apple” and rate their “mind's eye” depiction of it on a scale from 1 (photorealistic visualisation) through to 5 (no visualisation at all).  For many, this was variously (1) one’s first realization they were aphantasic or (2) an appreciation one’s own ability or inability to visualise objects was not universal.
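For the curious, the arithmetic of such a self-report test is simple and below is a minimal sketch (in Python) of a “red apple” style scoring: each item is rated on the inverted 1 (photorealistic) to 5 (no image) scale described above and the mean rating places the respondent on the spectrum.  The items, cut-offs and labels are illustrative assumptions only, not those of any validated clinical instrument.

# A minimal sketch of an informal "red apple" style self-report test.
# The cut-offs and labels are illustrative assumptions, not any
# validated clinical instrument; ratings run from 1 (photorealistic
# visualisation) to 5 (no visualisation at all).

def classify(mean_rating: float) -> str:
    # Lower scores indicate more vivid imagery on this inverted scale.
    if mean_rating <= 1.5:
        return "hyperphantasic end of the spectrum"
    if mean_rating >= 4.5:
        return "aphantasic end of the spectrum"
    return "mid-spectrum visualizer"

def score(ratings: list[int]) -> str:
    if not all(1 <= r <= 5 for r in ratings):
        raise ValueError("each rating must be an integer from 1 to 5")
    return classify(sum(ratings) / len(ratings))

print(score([5, 5, 4]))  # prints: aphantasic end of the spectrum

Fed three item ratings of 5, 5 & 4 (mean 4.67), the sketch reports the aphantasic end of the spectrum; given 1, 1 & 2 it would report the hyperphantasic end.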

How visualization can manifest: Lindsay Lohan and her lawyer in court, Los Angeles, December 2011.  If an aphantasic person doesn't know about aphantasia and doesn't know other people can imagine images, their life is probably little different from anyone else's; it's just their mind has adapted to handle concepts in another way.

Top right: What’s thought “normal” visualization (possessed, it’s believed, by most of the population) refers to the ability to imagine something like a photograph of what’s being imagined.  This too is a spectrum condition in that some will be able to imagine an accurate “picture”, something like a HD (high definition) photograph, while others will “see” something less detailed, sketchy or even wholly inaccurate.  However, even if when asked to visualize “an apple” one instead “sees a banana”, that is not an indication of aphantasia, a condition which describes only an absence of an image.  Getting it that wrong is an indication of something amiss but it’s not aphantasia.

Bottom left: “Seeing” text in response to being prompted to visualize something was the result Galton in 1880 reported as such a surprise.  It means the brain understands the concept of what is being described; it just can’t be imagined as an image.  This is one manifestation of aphantasia but it’s not related to the “everything is text” school of post-modernism.  Jacques Derrida’s (1930-2004) fragment “Il n'y a pas de hors-texte” (literally “there is no outside-text”) is one of the frequently misunderstood phrases from the murky field of deconstruction but it has nothing to do with aphantasia (although dedicated post-modernists probably could prove a relationship).

Bottom right: The absence of any image (understood as a “blankness” which does not necessarily imply “whiteness” or “blackness” although this is the simple way to illustrate the concept), whether of text or something to some degree photorealistic, is classic aphantasia.  The absence does not mean the subject doesn’t understand the relevant object or concept; it means only that their mental processing does not involve imagery and for as long as humans have existed, many must have functioned in this way, their brains adapted to the imaginative range available to them.  What this must have meant was many became aware of what they were missing only when the publicity about the condition appeared on the internet, an interesting example of “diagnostic determinism”.

WebMD's classic Aphantasia test.

The eyes are an outgrowth of the brain and WebMD explains aphantasia is caused by the brain’s visual cortex (the part of the brain that processes visual information from the eyes) “working differently than expected”, noting the often-quoted estimate of it affecting 2-4% of the population may be understated because many may be unaware they are “afflicted”.  It’s a condition worthy of more study because aphantasics handle the characteristic by processing information differently from those who rely on visual images.  There may be a genetic element in aphantasia and there’s interest too among those researching “Long Covid” because the symptom of “brain fog” can manifest much as does aphantasia.

Aphantasia may have something to do with consciousness because aphantasics can have dreams (including nightmares) which can to varying degrees be visually rich.  There’s no obvious explanation for this but whereas aphantasia is the inability voluntarily to generate visual mental imagery while awake, dreaming is an involuntary perceptual experience generated during sleep; while both are mediated by neural mechanisms, these clearly are not identical but presumably must overlap.  The conclusions from research at this stage remain tentative but the current neuro-cognitive interpretation seems to suggest voluntary (conscious) imagery relies on top-down activation of the visual association cortex while dream (unconscious) imagery relies more on bottom-up and internally driven activation during REM (rapid eye movement) sleep.  What that would seem to imply is that in aphantasia, the former pathway is impaired (or at least inaccessible), while the latter may remain intact (or accessible).

The University of Queensland’s illustration of the phantasia spectrum.

The opposite syndrome is hyperphantasia (having extremely vivid, detailed, and lifelike mental imagery) which can be a wonderful asset but can also be a curse, rather as hyperthymesia (known also as HSAM (Highly Superior Autobiographical Memory) and colloquially as “total recall”) can be disturbing.  Although it seems not to exist in the sense of “remembering everything, second-by-second”, there are certainly those who have an extraordinary recall of “events” in their life and this can have adverse consequences for mental health because one of the mind’s “defensive mechanisms” is forgetting or at least suppressing memories which are unwanted.  Like aphantasia & hyperphantasia, hyperthymesia is not listed by the DSM as a mental disorder; it is considered a rare cognitive trait or neurological phenomenon although like the imagery conditions it can have adverse consequences and these include disturbing “flashbacks”, increased rumination and increased rates of anxiety or obsessive tendencies.

Sunday, December 28, 2025

Osphresiolagnia

Osphresiolagnia (pronounced aus-free-see-oh-lag-nee-ah)

A paraphilia characterized by recurrent sexually arousing fantasies, sexual urges, or behaviour involving smells.

Early-mid twentieth century: A coining in clinical psychiatry, the construct being osphres(is) + -lagnia.  Osphresis was from the Ancient Greek ὀσφρῆσις (osphrēsis) (sense of smell; olfaction).  Lagnia was from the Ancient Greek λαγνεία (lagneía) (lust; sexual desire), from λᾰγνός (lagnos) (lustful; sexually aroused).  Osphresiolagnia thus translated literally as “lust or sexual arousal related to or induced by one’s sense of smell”. Osphresiolagnia & osphresiolagnism are nouns and osphresiolagnic is a noun & adjective; the noun plural is osphresiolagnias.

The synonym is olfactophilia (sexual arousal caused by smells or odors, especially from the human body) and in modern clinical use, that seems now the accepted form.  Although now rare, in clinical use a renifleur was a paraphiliac who derived sexual pleasure from certain smells.  Renifleur was from the French noun renifleur (the feminine renifleuse, the plural renifleurs), the construct being renifler +‎ -eur.  The construct of the verb renifler was re- (used in the sense of “to do; to perform the function”) + nifler (to irritate, to annoy); it was from the same Germanic root as the Italian niffo & niffa (snout) and related to the Low German Niff (nose, mouth, bill), the Dutch neb (nose, beak) and the English neb (nose, beak, face).  The French suffix -eur was from the Middle French, from the Old French -eor or -or, from the Latin -ātōrem & -tor and a doublet of -ateur.  It was used to form masculine agent nouns from verbs (some of which were used also as adjectives).

Pioneering Austrian psychoanalyst Sigmund Freud (1856-1939) never developed his hypothesis of osphresiolagnia into a full theory and in his papers it’s mentioned only as an aspect of the psychoanalytic exploration of human sexuality, specifically focusing on the role of olfactory stimuli (sense of smell) in sexual arousal.  It was part of a body of work in which he explored his concept of fetishism and infantile sexuality.  In psychoanalysis, osphresiolagnia described the condition (“the state” might now be thought a better way of putting it) where certain smells become associated with sexual pleasure or arousal and to Freud these naturally were those related to bodily functions, such as sweat, skin, or other natural odors because he believed different sensory experiences, including smell, could become a focus of sexual fixation, particularly if something in early psychosexual development caused this association.  The tie-in with fetishism was that an obsessive focus on the sense of smell can form as a way of displacing or substituting more normative sexual interests.  Freud spoke also of the significance of the senses (including smell) in early childhood development and linked them to psychosexual stages, where early experiences with stimuli can influence later adult sexuality and while he didn’t use the word, he believed a smell associated with some significant childhood experience could, even decades later, act as a “trigger”.  Although it’s been in the literature for more than a century, osphresiolagnia (also now sometimes called “olfactory stimulation”) seems to have aroused more clinical and academic interest in the last fifteen years and while the psychological and physiological responses to certain smells have been well-documented, it was usually in the context of revulsion and the way this response could influence decision-making processes.  However, positive responses can also be influential, thus the renewed interest.

In medicine and the study of human and animal sexuality, the significance of “olfactory attraction” has been researched and appears to be well understood.  At its simplest, the idea of olfactory attraction is that animals (including humans) can be attracted to someone based on scent; in the patients seen by psychiatrists, they can also be attracted to objects based on their smell, either because of their inherent quality or by their association with someone (either someone specific or “anyone”).  The best known aspect of the science is the study of pheromones (in biology, a chemical secreted by an animal which acts to affect the development or behavior of other members of the same species, functioning often as a means of attracting a member of the opposite sex).  Human pheromones have been synthesised and are available commercially in convenient spray-packs for those who wish to enhance their desirability with a chemical additive.  More generally, there is also the notion of “fragrance attraction” which describes the allure another’s smell (either natural or the scent they wear) exerts and this can manifest in “objective transference” (keeping close during periods of absence a lover’s article of clothing or inhaling from the bottle of perfume they wear).

The opposite of being attracted to a smell is finding one repellent.  What is known in the profession technically as ORS (olfactory reference syndrome) has never been classified as a separate disorder in either the American Psychiatric Association's (APA) Diagnostic and Statistical Manual of Mental Disorders (DSM) or the World Health Organization’s (WHO) International Classification of Diseases (ICD).  The DSM-III-R (1987) did mention ORS in the context of “aversion”, noting “convictions that the person emits a foul odor…are one of the most common types of delusional disorder, somatic type”, the idea extended in DSM-IV (1994) which referred to the concept as a type of delusional disorder, somatic type, although the term “olfactory reference syndrome” was not mentioned.

In October 2024, it was reported by Greek news services that a court in Thessaloniki (the capital of the Macedonia region and Greece's second city) in the north of the country had imposed a suspended one-month prison sentence on a man convicted of “…disturbing his neighbors by repeatedly sneaking into their properties to smell their shoes.”  According to the AP (Associated Press), the 28-year-old man was unable to explain his behaviour although he did tell the court he was “embarrassed by it”, adding that he had “…no intention of breaking the law or harming anybody…” and his neighbours did testify he never displayed any signs of aggression during his nocturnal visits to the shoes, left outside the door to air.  The offences were committed in the village of Sindos, some 15 kilometres (9 miles) west of Thessaloniki and the police were called only after the man had ignored requests sent to his family that his conduct stop.  According to the neighbours, there had in the last six months been at least three prior instances of shoe sniffing.  In addition to the suspended sentence, the defendant was ordered to attend therapy sessions.

The postman always sniffs twice, Balnagask Circle, Torry, Aberdeen, Scotland, August 2024.  Helpfully, the video clip was posted by the Daily Mail and from his grave, a hundred-odd years on, old Lord Northcliffe (Alfred Harmsworth, 1865–1922) would be delighted.

Osphresiolagnia is however not culturally specific and in August 2024, a postman delivering mail to an address on Balnagask Circle in the Torry area of South Aberdeen, Scotland was captured on a doorbell camera, pausing “to sniff a girl's shoes”.  All appeared normal until the osphresiolagnic servant of the Royal Mail had put the letters in the slot but then he turned and, after a brief glance at the shoe rack, bent down and picked up a white trainer which he sniffed before leaving to resume his round (and possibly his sniffing).  The mother of the girl whose shoes fell victim to the postman posted the video on social media, tagging the entry: “I would just like to let everyone know just to watch out for this postman; he sniffed my daughter's shoes; what an absolute creep.”  The clip came to the attention of the Scottish police which issued a statement: “We received a report of a man acting suspiciously in the Balnagask Circle area of Aberdeen.  Enquiries were carried out and no criminality was established. Suitable advice was given.”  It wasn’t made clear what that advice was or to whom it was delivered but presumably the constabulary’s attitude was: no shoe being harmed during this sniffing, all’s well that ends well.

Shoe-sniffing should not be confused with podophilia (a paraphilia describing the sexualized objectification of feet (and sometimes footwear), commonly called foot fetishism although the correct clinical description is now “foot partialism”).  The construct was podo- +‎ -philia.  Podo- (pertaining to a foot or a foot-like part) was from the Ancient Greek πούς (poús), from the primitive Indo-European pṓds.  It was cognate with the Mycenaean Greek po, the Latin pēs, the Sanskrit पद् (pad), the Old Armenian ոտն (otn) & հետ (het), the Gothic fōtus and the Old English fōt (from which Modern English gained “foot”).  It was Sigmund Freud who admitted that, lawfulness aside, as animals, the only truly aberrant sexual behavior in humans could be said to be its absence (something which the modern asexual movement re-defines rather than disproves).  It seemed to be in that spirit the DSM-5 (2013) was revised to treat podophilia and many other “harmless” behaviors as “normal” and thus within the purview of the manual only to the extent of being described, clinical intervention no longer required.  Whether all clinicians agree with the new permissiveness isn’t known but there's nothing in the DSM-5-TR (2022) to suggest podophiles will soon again be labeled deviants.

Point of vulnerability to osphresiolagnism: Lindsay Lohan taking off her shoes and putting them on the shoe rack.  The photo shoot featured Ms Lohan as a nueva embajadora de Allbirds (new Allbirds ambassador), in a promotion for Allbirds (Comfortable, Sustainable Shoes & Apparel) and the shoes are the Tree Flyer in Lux Pink which include “no plastics” in their construction.  The photo session may have been shot on a Wednesday.

Shoe sniffing is different and clinicians define it as an instance of “intimacy by proxy” in a similar class to the stealing of women’s underwear from clothes lines; an attempt in some way to be associated with the wearer (or just "women" generally).  This differs from an interest in the shoes or garments as objects because, conveniently & lawfully, such desires can be fulfilled by buying what is wanted from a shop.  How prevalent such proclivities are isn’t known because, the fetish being pursued in a lawful (and in most cases presumably secret) manner, unless self-reported, clinicians would never become aware of the activity.

Monday, December 22, 2025

Psychosis

Psychosis (pronounced sahy-koh-sis)

In psychiatry, a severe mental disorder (sometimes with physical damage to the brain), more serious than neurosis, characterized by disorganized thought processes, disorientation in time and space, hallucinations, delusions and a disconnection from reality.  Paranoia, manic depression, megalomania, and schizophrenia are all psychoses.

1847: From the New Latin & Late Greek psȳ́chōsis, the construct being psycho- + -osis, the source being the Ancient Greek ψύχωσις (psúkhōsis) (animation, principle of life), the psych- element from the Ancient Greek ψυχή (psukhḗ or psykhē) (mind, life, soul).  The suffix –osis is from the Ancient Greek -ωσις (-ōsis) (state, abnormal condition or action), from -όω (-óō) (stem verbs) + -σις (-sis); -oses was the plural form and corresponding adjectives are formed using –otic, thus respectively producing psychoses and psychotic.  The Ancient Greek psykhosis meant "a giving of life; animation; principle of life".  In English, the original 1847 construction meant "mental affection or derangement" while the adjective psychotic (of or pertaining to psychosis) dates from 1889, coined from psychosis, on the model of neurotic/neurosis and ultimately from the Ancient Greek psykhē (understanding, the mind (as the seat of thought), faculty of reason).

In clinical use there are many derived forms (with meanings more precise than is often the case when such words migrate to general use) including antipsychotic, micropsychotic, neuropsychotic, nonpsychotic, postpsychotic, prepsychotic, propsychotic, protopsychotic, quasipsychotic, semipsychotic & unpsychotic.  The useful portmanteau word sarchotic (the construct a blend of sarcastic + psychotic) is used to describe a statement so disturbingly sarcastic it can't be determined whether the remark is intended to be humorous or the person making it genuinely is psychotic and even then there are gradations for which the adverb is used, the comparative being "more psychotically" and the superlative "most psychotically".  Psychosis & psychoticism are nouns, psychotic is a noun & adjective and psychotically is an adverb; the noun plural is psychoses.

Psychosis and the DSM

The word psychosis was a mid-nineteenth century creation necessitated by early psychiatry’s separation of psychiatric conditions from neurological disorders.  Originally a generalized concept referring to psychiatric disorders, gradually it came to denote one of the major classes of mental illness, assumed to be the result of a disease process, and, more recently, a symptom present in many psychiatric disorders.  During this evolution, the diagnostic criteria shifted from the severity of the clinical manifestations and the degree of impairment in social functioning to the presence of one or more symptoms in a set of psychopathological symptoms.  By the early twentieth century, the concept of neurosis (which once embraced both the psychiatric and the neurological disorders) became restricted to one major class of psychiatric disease whereas psychosis (which once embraced all psychiatric disorders) became restricted to the other.

The first consensus-based classification with a description of diagnostic terms was in the first edition (DSM-I (1952)) of the American Psychiatric Association's (APA) Diagnostic and Statistical Manual of Mental Disorders (DSM) in which mental disorders were divided into two classes of illness: (1) organic disorders, caused by or associated with impairment of brain tissue function; and (2) disorders of psychogenic origin without clearly defined physical cause or structural changes in the brain.  When DSM-II (1968) was released, the classifications were revised with mental disorders now classed as (1) psychoses and (2) neuroses, personality disorders, and other non-psychotic mental disorders.  Psychosis was defined as a mental disorder in which mental functioning is impaired to the degree that it interferes with the patient's ability to meet the ordinary demands of life and recognize reality.

Advances in both neurology and psychiatry led to an extensive revision in DSM-III (1980).  Radically, all traditional dichotomies (organic versus functional, psychotic versus neurotic etc) were discarded with psychiatric syndromes assigned to one of fifteen categories of disease.  At the labelling level, the term psychotic was used to describe a patient at a given time, or a mental disorder in which at some time during its course, all patients evaluate incorrectly the accuracy of their perceptions and thoughts but the editors emphasized it should not be applied to patients suffering only minor distortions of reality, regardless of how exactly they might fulfil the clinical criteria.  The revisions in DSM-III-R (1987) extended only to slight changes in terminology.

Mirroring the changes in diagnostic criteria published by the WHO, DSM-IV (1994) noted the diagnosis of psychosis should no longer be based on the severity of the functional impairment but rather on the presence of certain symptoms which included delusions, hallucinations, disorganized speech and grossly disorganized or catatonic behavior.  This emphasis on psychoses being spectrum conditions was continued in DSM-5 (2013) with schizoid (personality) disorder and schizophrenia defining its mild and severe ends.  Additionally, a more precise diagnostic framework was defined in which patients were assessed in terms both of symptoms and duration of suffering.

Two examples of "schizophrenia art".

My Eyes in the Time of Apparition (1913) by August Natterer (1868-1938).

The life of German artist August Natterer began innocuously enough, studying engineering, travelling extensively, marrying and building a successful career as an electrician.  However, in his thirties, he began to experience anxiety attacks and delusions and in 1907 suffered a hallucination in which thousands of images flashed before his eyes in little more than thirty minutes.  So affected by the experience that he attempted suicide, he was admitted to an asylum and would spend the remaining quarter-century of his life in and out of institutes for the insane.  In the literature, Natterer is referred to as Neter, a pseudonym used by his psychiatrist to protect patient and family from the social stigma then associated with mental illness.  He regarded the 1907 hallucination as a vision of the Last Judgment, describing it thus:

"...10,000 images flashed by in half an hour.  I saw a white spot in the clouds absolutely close – all the clouds paused – then the white spot departed and stood all the time like a board in the sky. On the same board or the screen or stage now images as quick as a flash followed each other, about 10,000 in half an hour… God himself occurred, the witch, who created the world – in between worldly visions: images of war, continents, memorials, castles, beautiful castles, just the glory of the world – but all of this to see in supernal images. They were at least twenty meters big, clear to observe, almost without color like photographs… The images were epiphanies of the Last Judgment. Christ couldn't fulfil the salvation because he was crucified early... God revealed them to me to accomplish the salvation."

After his suicide attempt and committal to the first of what would be several mental asylums, Natterer thereafter maintained that he was the illegitimate child of Emperor Napoleon I (Napoleon Bonaparte (1769–1821; leader of the French Republic 1799-1804 & Emperor of the French 1804-1814 & 1815)) and "Redeemer of the World".  The vision inspired Natterer to a prolific production of drawings, all documenting images and ideas seen in the vision, one especially interesting to those studying psychosis and schizophrenia being My Eyes in the Time of Apparition (1913), two bloodshot, wide-open eyes staring from the page.  The irises of the eyes do not match.

The Scream (1893), oil, tempera & paste on cardboard, by Edvard Munch (1863-1944), National Gallery of Norway.

Norwegian Edvard Munch was one of a number of artists modern psychiatrists have written of as having both genetic and environmental predispositions to mental illness, schizophrenia in particular; one of Munch’s sisters had schizophrenia, his father suffered from depression and his mother and another sister died from tuberculosis when he was young.  Munch though was a realist, once telling an interviewer, “I cannot get rid of my illnesses, for there is a lot in my art that exists only because of them.”  The idea of affliction as a source of artistic inspiration appears often in the literature of art, music and such, something of a parallel with those who produce their finest work while living under political oppression; unpleasant as that can be, reform can see careers suffer, famous dissidents abruptly left as "rebels without a cause" after the fall of the Soviet Union (1922-1991) and a generation of the UK's activists finding the grist for their mills less plentiful after the Tory Party had Margaret Thatcher (1925–2013; UK prime-minister 1979-1990) walk the political plank.  Where one door closes however, another sometimes opens and in John Major (b 1943; UK prime-minister 1990-1997) the comedians found a rich vein of material.

His was a troubled life and in 1908, following a psychotic break exacerbated by alcoholism, Munch was admitted to a mental health clinic, later diagnosed with neurasthenia, a clinical condition now known to be closely associated with hypochondria and hysteria.  Adding to his problems, the Nazis labelled Munch’s style “degenerate art” and in 1937 confiscated many of his works but their disapprobation had less of an influence on his painting than his schizophrenia, his output continuing to feature figures obviously tortured by anguish and despair.  The apparently frantic strokes of the brush and his seemingly chaotic palette of colors have long intrigued both critics and clinicians seeking insight into his state of mind, the idea being his paintings provide something of a visual representation of how schizophrenia might lead individuals to see the world.

Lindsay Lohan, following Edvard Munch, rendered by Vovsoft in comicbook style.

Endlessly reproduced, the subject of numerous memes and the inspiration for many re-interpretations, The Scream is Munch’s most famous work and the most emblematic of what now casually is called “schizophrenic art” (unfortunately often conflated with “art by schizophrenics”).  For decades it has been the chosen artistic representation for the angst-ridden modern human condition, the artist in 1890 noting in his diary a still vivid memory: “I was walking along the road with two friends—the sun went down—I felt a gust of melancholy—suddenly the sky turned a bloody red... I felt this big, infinite scream through nature.”  That entry was written some years after the sight and before painting The Scream in 1893 but the moment stayed with him because his vision of the sky caused him to “tremble with pain and angst” and he felt he heard his “…scream passing endlessly through the world.”  For historians those fragments of memory proved of interest and in his book Krakatoa: The Day the World Exploded (2003), detailing the 1883 eruption of the Indonesian volcano Krakatoa, Simon Winchester (b 1944) connected the “blood red” Norwegian sky with the fiery sunsets created by the ash from the explosion circulating the planet, high in the atmosphere.

Krakatoa: The Day the World Exploded.

The idea of a link between the catastrophic geological event and the painting had long intrigued art historians who understood such a sight would have appeared “surreal”, decades before the surrealist movement became established, and that it was a natural phenomenon is well supported by theoretical modelling.  Between 20 May-21 October 1883, Krakatau, a volcanic island in the Sunda Strait, erupted, the “main event” happening on 27 August, during which over two-thirds of the island and its surrounding archipelago was destroyed, the remnants subsequently collapsing into a caldera (in volcanology, a large crater formed by collapse of the cone or edifice of a volcano).  The event created a large tsunami which, much diminished, reached the Atlantic and it’s believed that day’s third explosion was history’s loudest known sound.  What Edvard Munch is thought to have seen is the evening light of the sun being colored by the millions of tons of sulfur dioxide and volcanic dust blasted high into the atmosphere, circulating there for years including over Oslo when the artist was taking his walk.  Nor was he wholly wrong in suggesting “a scream passing” because such was the energy generated by the explosion, the acoustic pressure wave circled the globe at least three times.
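Back-of-envelope arithmetic (sketched below in Python) suggests why such repeated circuits were recordable: at roughly the sea-level speed of sound, one circuit of the globe takes a little over a day, so three circuits span only about four days.  The circumference and speed figures are standard values; the sketch is illustrative only.

# Back-of-envelope arithmetic for the Krakatoa pressure wave: at roughly
# the sea-level speed of sound, one circuit of the Earth takes a little
# over 32 hours, so three circuits take about four days.
EARTH_CIRCUMFERENCE_KM = 40_075      # equatorial circumference
SPEED_OF_SOUND_KMH = 343 * 3.6       # ~343 m/s at sea level, ~1,235 km/h

hours_per_circuit = EARTH_CIRCUMFERENCE_KM / SPEED_OF_SOUND_KMH
print(f"one circuit: ~{hours_per_circuit:.0f} hours")             # ~32 hours
print(f"three circuits: ~{3 * hours_per_circuit / 24:.1f} days")  # ~4.1 days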

Wednesday, December 17, 2025

Inkhorn

Inkhorn (pronounced ingk-hawrn)

A small container of horn or other material (the early version would literally have been hollowed-out horns from animals), formerly used to hold writing ink.

1350-1400: From the Middle English ynkhorn & inkehorn (small portable vessel, originally made of horn, used to hold ink), the construct being ink +‎ horn.  It displaced the Old English blæchorn, which had the same literal meaning but used the native term for “ink”.  It was used attributively from the 1540s as an adjective for things (especially vocabulary) supposed to be beloved by scribblers, pedants, bookworms and the “excessively educated”.  Inkhorn, inkhornery & inkhornism are nouns, inkhornish & inkhornesque are adjectives and inkhornize is a verb; the noun plural is inkhorns.

Ink was from the Middle English ynke, from the Old French enque, from the Latin encaustum (purple ink used by Roman emperors to sign documents), from the Ancient Greek ἔγκαυστον (énkauston) (burned in), the construct being ἐν (en) (in) + καίω (kaíō) (burn). In this sense, the word displaced the native Old English blæc (ink; literally “black” because, while not all inks were black, most tended to be).  Ink came ultimately from a Greek form meaning “branding iron”, one of the devices which should make us grateful for modern medicine because, in addition to using the kauterion to cauterize (seal wounds with heat), essentially the same process was used to seal fast the colors used in paintings.  Then, the standard method was to use wax colors fixed with heat (encauston (burned in)) and in Latin this became encaustum which came to be used to describe the purple ink with which Roman emperors would sign official documents.  In the Old French, encaustum became enque which English picked up as enke & inke which via ynk & ynke, became the modern “ink”.  Horn was from the Middle English horn & horne, from the Old English horn, from the Proto-West Germanic horn, from the Proto-Germanic hurną; it was related to the West Frisian hoarn, the Dutch hoorn, the Low German Hoorn, horn, the German, Danish & Swedish horn and the Gothic haurn.  It was ultimately from the primitive Indo-European ḱr̥h₂nóm, from ḱerh₂- (head, horn) and should be compared with the Breton kern (horn), the Latin cornū, the Ancient Greek κέρας (kéras), the Proto-Slavic sьrna, the Old Church Slavonic сьрна (sĭrna) (roedeer), the Hittite surna (horn), the Persian سر (sar) and the Sanskrit शृङ्ग (śṛṅga) (horn).

Inkhorn terms & inkhorn words

The phrase “inkhorn term” dates from the 1530s and was used to criticize the use of language in a way obscure or difficult for most to understand, usually by an affected or ostentatiously erudite borrowing from another language, especially Latin or Greek.  The companion term “inkhorn word” was used of such individual words and in modern linguistics the whole field is covered by such phrases as “lexiphanic term”, “pedantic term” & “scholarly term”, all presumably necessary now inkhorns are rarely seen.  Etymologists are divided on the original idea behind the meaning of “inkhorn term” & “inkhorn word”.  One faction holds that because the offending words tended to be long or at least multi-syllabic, a scribe would need more than once to dip their nib into the horn in order completely to write things down while the alternative view is that because the inkhorn users were, by definition, literate, they were viewed sometimes with scepticism, one suspicion being they used obscure or foreign words to confuse or deceive the less educated.  The derived forms are among the more delightful in English and include inkhornism, inkhornish, inkhornery, inkhornesque & inkhornize.  The companion word is sesquipedalianism (a marginal propensity to use humongous words).

Lindsay Lohan and her lawyer in court, Los Angeles, December 2011.

Inkhorn words were in the fourteenth & fifteenth centuries known also as “gallipot words”, derived from the use of such words on apothecaries' jars, the construct being galli(s) + pot.  Gallis was from the Latin gallus (rooster or cock (male chicken)), from the Proto-Italic galsos, an enlargement of gl̥s-o-, zero-grade of the primitive Indo-European gols-o-, from gelh- (to call); it can be compared with the Proto-Balto-Slavic galsas (voice), the Proto-Germanic kalzōną (to call), the Albanian gjuhë (tongue; language), and (although this is contested) the Welsh galw (call).  Appearing usually in the plural, a gallipot word was something long, hard to pronounce, obscure or otherwise mysterious, the implication being it was being deployed gratuitously to convey the impression of being learned.  The companion insult was “you talk like an apothecary” and “apothecary's Latin” was a version of the tongue spoken badly or brutishly (synonymous with “bog Latin” or “dog Latin” but different from “schoolboy Latin” & “barracks Latin”, the latter two being humorous constructions, the creators proud of their deliberate errors).  The curious route which led to “gallipot” referencing big words was via the rooster being the symbol used by apothecaries in medieval and Renaissance Europe, appearing on their shop signs, jars & pots.  That was adopted by the profession because the rooster symbolized vigilance, crowing (hopefully) at dawn, signaling the beginning of the day and thus the need for attentiveness and care.  Apothecaries, responsible for preparing and dispensing medicinal remedies, were expected to be vigilant and attentive to detail in their work to ensure the health and well-being of their patients who relied on their skill to provide them the potions to “get them up every morning” in sound health.  Not all historians are impressed by the tale and say a more convincing link is that in Greek mythology, the rooster was sacred to Asclepius (Aesculapius in the Latin), the god of medicine, and was often depicted in association with him.  In some tales, Asclepius had what was, even by the standards of the myths of Antiquity, a difficult birth and troubled childhood.

The quest for the use of “plain English” is not new.  The English diplomat and judge Thomas Wilson (1524–1581) wrote The Arte of Rhetorique (1553), remembered as “the first complete work on logic and rhetoric in English” and in it he observed the first lesson to be learned was never to affect “any straunge ynkhorne termes, but to speak as is commonly received.”  Writing a decade earlier, the English bishop John Bale (1495–1563) had already lent an ecclesiastical imprimatur to the task, condemning one needlessly elaborate text with: “Soche are your Ynkehorne termes” and that may be the first appearance of the term in writing.  A religious reformer of some note, he was nicknamed “bilious Bale”, a moniker which politicians must since have been tempted to apply to many reverend & right-reverend gentlemen.  A half millennium on, the goal of persuading all to use “plain English” is not yet achieved and a fine practitioner of the art was Dr Kevin Rudd (b 1957; Australian prime-minister 2007-2010 & 2013): from no one else would one be likely to hear the phrase “detailed programmatic specificity” and to really impress he made sure he spoke it to an audience largely of those for whom English was not a first language.

An inkhorn attributed to Qes Felege, a scribe and craftsman.

Animal horns were for millennia re-purposed for all sorts of uses including as drinking vessels, gunpowder stores & loaders, musical instruments and military decoration and in that last role they’ve evolved into a political fashion statement, Jacob Chansley (b 1988; the “QAnon Shaman”) remembered for the horned headdress worn during the attack on the United States Capitol building in Washington DC on 6 January 2021.  Inkhorns tended variously to be made from the horns of sheep or oxen, storing the ink when not in use and ideal as a receptacle into which the nib of a quill or pen could be dipped.  Given the impurities likely then to exist, a small stick or nail was left in the horn to stir away any surface film which might disrupt a nib’s ability to take in free-flowing ink, most inks not being pre-packaged products but mixed by the user from a small solid “cake” of the base substance in the desired color, put into the horn with a measure of starchy water and left overnight to dissolve.  The sharp point of a horn allowed it to be driven into the ground because many scribes were not desk-bound and actually travelled from place to place to do their writing, quill and inkhorn their tools of trade.

A mid-Victorian (1837-1901) silver plated three-vat inkwell by George Richards Elkington (1801–1865) of Birmingham, England.  The cast frame is of a rounded rectangular form with outset corners, leaf and cabochons, leaf scroll handle and conforming pen rest.  The dealer offering this piece described the vats as being of "Vaseline" glass with fruit cast lids and in the Elkington factory archives, this is registered: "8 Victoria Chap 17. No. 899, 1 November 1841".

“Vaseline glass” is a term describing certain glasses in a transparent yellow to yellow-green color attained by virtue of a uranium content.  It's an often-used descriptor in the antique business because some find the word “uranium” off-putting although inherently the substance is safe, the only danger coming from being scratched by a broken shard.  Also, some of the most vivid shades of green are achieved by the addition of a colorant (usually iron) and these the cognoscenti insist should be styled “Depression Glass”, a term which has little appeal to antique dealers.  The term “Vaseline glass” wasn’t used prior to the 1950s (after the detonation of the first A-bombs in 1945, there emerged an aversion to being close to uranium) and what's used in this inkwell may actually be custard glass or Burmese glass which is opaque whereas Vaseline glass is transparent.  Canary glass was first used in the 1840s as the trade name for Vaseline glass, a term which would have been unknown to George Richards Elkington.

English silver plate horn and dolphin inkwell (circa 1909) with bell, double inkwell on wood base with plaque dated 1909.  This is an inkwell made using horns; it is not an inkhorn.

So inkhorns were for those on the move while those which sat on desks were called “ink wells” or “ink pots” and these could range from simple “pots” to elaborate constructions in silver or gold.  There are many ink wells which use horns as part of their construction but they are not inkhorns, the animal parts there serving only as decoration or structure.

Dr Rudolf Steiner’s biodynamic cow horn fertilizer.

Horns are also a part of the “biodynamic” approach to agriculture founded by the Austrian occultist & mystic Rudolf Steiner (1861-1925), an interesting figure regarded variously as a “visionary”, a “nutcase” and much between.  The technique involves filling cow horns with cow manure which are buried during the six coldest months so the mixture will ferment; upon being dug up, it will be a sort of humus which has lost the foul smell of the manure and taken on a scent of undergrowth.  Diluted with water and sprayed over the ground, it may then be used to increase the yield generated from the soil.  Dr Steiner believed the forces penetrating the digestive organ of cows through the horn influence the composition of their manure and when returned to the environment, it is enriched with spiritual forces that make the soil more fertile and positively affect it.  As he explained: “The cow has horns to send within itself the etheric-astral productive forces, which, by pressing inward, have the purpose of penetrating directly into the digestive organ. It is precisely through the radiation from horns and hooves that a lot of work develops within the digestive organ itself.  So in the horns, we have something well-adapted, by its nature, to radiate the vital and astral properties in the inner life.”  Now we know.

Saturday, December 6, 2025

Otrovert

Otrovert (pronounced ott-roh-vert)

A person unable to feel a connection to social groups or collectives; despite being welcomed and included in social settings, they feel like outsiders.

2025: A coining by US psychiatrist Dr Rami Kaminski (b 1954), who first used the word in his book The Gift of Not Belonging (2025), the construct being the Spanish otro (other; another) + -vert.  Otro was from the Latin alter, altera & alterum (the other), ultimately from the primitive Indo-European hélteros (the other of two); it may be compared with the Portuguese outro (from the Old Galician-Portuguese outro, from the Latin alterum (the other)) and the French autre (from Old French autre (another), from the Latin alterum).  The –vert suffix was from the Latin vertere (to turn) and was used to refer to a person with a particular personality which manifests when in the presence of others.

Otrovert is a noun; the noun plural is otroverts.  Because otrovert is a “hot word” (newly coined or an adaptation of an existing word and one which has in a short time become popular), most lexicographers are tagging it as “provisional”, the majority of “hot words and phrases” (think “six-seven”) fading from use and never gaining critical mass.  Even the idea of “popular” has (in this context) shifted because whereas once it could take months or years for a word or phrase to spread into general use, on the various platforms of the internet, proliferation can be close to instant.  However, the tools used to assess “use” are rather brute-force and often are counting appearances in “lists” rather than “general use”.  For those reasons, in the technical sense, derived forms really don’t (yet) exist but if constructed the list (based on the model of other “-verts”) might include the nouns otrovertist, otroverting & otrovertness, the verb & adjective otroverted, the adjectives otrovertish, otrovertesque & otrovertive and the adverbs otrovertedly & otrovertly.

Google ngram (a quantitative and not qualitative measure): Because of the way Google harvests data for their ngrams, they’re not literally a tracking of the use of a word in society but can be usefully indicative of certain trends (although one is never quite sure which trend(s)), especially over decades.  As a record of actual aggregate use, ngrams are not wholly reliable because: (1) the sub-set of texts Google uses is slanted towards the scientific & academic and (2) the technical limitations imposed by the use of OCR (optical character recognition) when handling older texts of sometimes dubious legibility (a process AI should improve).  Where numbers bounce around, this may reflect either: (1) peaks and troughs in use for some reason or (2) some quirk in the data harvested.
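For those who wonder what exactly an ngram count is, it’s a relative frequency: for each year, the number of occurrences of a token (or token sequence) divided by the total tokens scanned that year, a purely quantitative measure which says nothing about how a word was used.  Below is a minimal sketch in Python, the tiny year-by-year “corpus” obviously invented for illustration and standing in for the OCR-harvested books.

# A minimal sketch of the measure an ngram viewer reports: occurrences
# of a 1-gram per year divided by total tokens scanned that year.  The
# tiny year -> text "corpus" is invented purely for illustration.
from collections import Counter

corpus = {
    2023: "the introvert and the extrovert met an otrovert",
    2024: "otrovert was coined and otrovert spread online fast",
}

def relative_frequency(word: str) -> dict[int, float]:
    out = {}
    for year, text in corpus.items():
        tokens = text.lower().split()
        out[year] = Counter(tokens)[word.lower()] / len(tokens)
    return out

print(relative_frequency("otrovert"))  # {2023: 0.125, 2024: 0.25}

Even here the quirks noted above are visible: change the texts harvested and the numbers bounce around, which is why the curves are indicative of trends rather than a census of use.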

An ambivert is a person who is neither clearly extroverted nor introverted but has characteristics of each, the construct being ambi- +‎ -vert.  Ambi- was from the Latin ambo (both) and was a doublet of the New Latin amphi-, from the Ancient Greek ἀμφί (amphí) (on both sides).  The dexter element in the Medieval Latin meant “right” and ambidexter thus was understood as “both hands being like a right hand”.  In English, the ambi- prefix is most familiar in “ambidextrous” (possessing an equal or functionally comparable ability to handle objects with both hands (in writing, music, sport etc)) although it has from time to time been used figuratively (not taking sides in conflicts or being equally adept in more than one medium, genre, style etc) and even as a humorous synonym for “bisexual”.  When used in psychology, historically, ambiversion described someone with characteristics of both extroversion and introversion and thus suggested a “balanced personality”, the subject choosing to manifest the different characteristics according to what the circumstances seemed to demand.  Ambivert thus does not imply some sort of split personality or the existence of a condition like bi-polar disorder (the old manic depression) but simply reflects an individual able to undertake their social interactions in an appropriate manner.

Because the “vert words” are not really part of academic or clinical psychology, the definitions can be “elastic” and while centovert (being in the middle between introvert and extrovert) may be a synonym of ambivert, it may also be nuanced in that it suggests someone unable (or at least unwilling) to engage in introverted or extroverted behaviour, regardless of the circumstances.  A variant of the ambivert is the omnivert (someone who fits into both extremes of the extroversion-introversion personality spectrum), the construct being omni- + -vert.  Omni- ultimately was from the Latin omnis (all).  Again, because the “verts” are pop-psychology words, there’s little to be gained from attempting to “parse the overlaps” (ie where one ends and another begins) and it seems likely omniversion is simply an “enabling pre-condition” one must possess if one is to attain the desirable “balanced state” of ambiversion.  Nobody seems yet to have coined ultravert, hypervert or ubervert but one need not spend long on social media to see why such labels might be handy.

A self-described introvert: Lindsay Lohan explaining her introversion in a 2019 interview with broadcaster Howard Stern (b 1954).

Like other “-verts” of this ilk, otrovert was built on the model of the familiar introvert & extrovert, the construct being intro- + -vert.  An introvert (pronounced in-truh-vurt) is an individual who prefers (sometimes actively seeks) tranquil environments, limits social engagement and tends to a greater than average preference for solitude.  In anatomy & zoology there’s a technical meaning (“a part (typically a hollow, cylindrical structure) that is or can be introverted, ie turned in on itself (invaginated)”) but the most commonly used is the psychological sense: a person characterized by concern primarily with their own thoughts and feelings.  Introverts are noted often for having a disposition that finds social engagement at least tiresome (and sometimes threatening), thus the preference for quiet solitude.  Introvert seems first to have appeared in print in the 1660s and was from the New Latin intrōvertere, the construct being intrō (within) + vertere (to turn).  The prefix intro- was from the Latin intrō (inwards; within), related to intrā (within).  Although it’s not infrequent for introvert to be used as a synonym for “shy” (and in terms of observed behaviour the two phenomena can appear indistinguishable), they are definitionally distinct.  While shyness is associated with timidity and social anxiety, introverts have a lack of interest in interpersonal engagement and a limited endurance for social contact; what that means is while the behaviours can often be the same, the underlying motivations differ.

Slaughterhouse-Five (1969) by Kurt Vonnegut.

Introvert & extrovert are popular terms of self-description but they can also be aspirational and while the classic stereotype is of the introvert who “wishes they were more outgoing”, there are other types.  The US pediatrician Dr Mark Vonnegut (b 1947) wrote short stories and in one described the desire of his father (the author Kurt Vonnegut, 1922-2007) to be a cynical, grumpy old man who despaired of humanity, something he could never quite manage because of his “inherent optimism”.  As Dr Vonnegut put it, he was “…like an extrovert who wanted to be an introvert, a very social guy who wanted to be a loner, a lucky person who would have preferred to be unlucky. An optimist posing as a pessimist, hoping people will take heed.”  Explaining the difference, he added: “Introverts almost never cause me trouble and are usually much better at what they do than extroverts.  Extroverts are too busy slapping one another on the back, team building, and making fun of introverts to get much done.  Extroverts are amazed and baffled by how much some introverts get done and assume that they, the extroverts, are somehow responsible.”  On the basis of his clinical experience, he observed: “I understand perfectly why some of my autistic patients scream and flap their arms--it’s to frighten off extroverts.”

An extrovert (pronounced ek-struh-vurt) is described typically as an outgoing, gregarious person who thrives in dynamic environments and seeks to maximize social engagement; in the jargon of psychology, it refers to someone characterized by extroversion: a person concerned primarily with the physical and social environment, thus the usual presentation as a person with a disposition energized through social engagement who tends to languish or chafe in solitude.  The word extrovert (the alternative spelling extravert (an example of the influence of German on psychology) is now rare) also emerged in the 1660s, the construct being extro- + -vert.  In this case, extro- was a pseudo-Latin prefix based upon the Latin extra- (outside, beyond), formed under the influence of the pairing of the Latin intro- (inwards) & intra- (inside; within).  In English, formations using the prefix tend to be restricted to words formed as antonyms of terms formed with intro-.

Introvert & extrovert (in their literal senses) had since the late seventeenth century been used in science and medicine but both entered general use in the twentieth century when certain works by the Swiss psychiatrist Carl Jung (1875–1961) were translated from German into English.  What seems to have given the words their greatest impetus was the appearance of commentaries on Jung written for a general audience, for which purposes binary concepts like “introvert” and “extrovert” were useful devices with which to encapsulate layers of meaning, although the trigger may have been the 1918 paper Psycho-Analytic Study of Auguste Comte [1798-1857; a seminal figure in sociology] by psychologist Dr Phyllis Blanchard (1895-1986).  Being a woman, Dr Blanchard has been neglected by history but, like the Austrian neurologist Sigmund Freud (1856-1939), Jung became what would now be called a “celebrity” psychoanalyst and that happened because advances in their field (and in neurology) had made the public fascinated with the human mind and its processes (especially dreams).  Reflecting what may be a professional distaste at seeing their jargon end up in pop-psychology texts, technical papers often use the spelling “extravert”, following Jung and his contemporaries.

The Gift of Not Belonging (2025) by Dr Rami Kaminski (b 1954).  Psychiatrist Dr Kaminski is the founder and director of the Institute for Integrative Psychiatry in New York City.

Dr Kaminski describes The Gift of Not Belonging as “…the first book to explore the distinct personality style of the otrovert - someone who lacks the communal impulse and does not fit in with any social group, regardless of its members - and to reveal all the advantages of being an otrovert and how otroverts contribute to the world.”  He explained that while otroverts enjoy deep and fulfilling one-on-one relationships, within groups they feel alienated, uncomfortable and alone.  Unlike introverts, who crave solitude and are easily drained by social interactions, otroverts can be quite gregarious and rarely tire from one-on-one socialising; unlike loners, or people who have been marginalised based on their identity, otroverts are socially embraced and often popular - yet are unable to conform with what the group collectively thinks or cares about.  Dr Kaminski positions all this as “the great gifts of being an otrovert”, by which he means that someone with no affinity for a particular group is not constrained by having their sense of self-worth conditioned on the group’s approval.  A champion of the otrovert, Dr Kaminski suggests they “must not be harassed to take part, but allowed to revel in their glorious difference.”

Despite vying with “psychopath” for the title of psychology’s most popular word, neither introvert nor extrovert has ever been used as a diagnostic term in the American Psychiatric Association’s (APA) Diagnostic and Statistical Manual of Mental Disorders (DSM); that doesn’t mean they’re not used by clinicians, just that they’re not part of the formal jargon.  That might seem curious given their not infrequent appearances in the published history of personality psychology, including Jung’s original typology (codified in its most refined form in the 1920s), the ubiquitous MBTI (Myers–Briggs Type Indicator) and the Big Five model, in which Extraversion is one of the five major personality traits.  These frameworks are however psychological, not psychiatric.  The DSM does of course have an extensive section on personality disorders and many of the traits related to introversion & extraversion do appear, including in (1) Avoidant Personality Disorder (social inhibition, which links superficially to introversion but is not the same thing) and (2) Histrionic & Narcissistic Personality Disorders (social boldness; superficially “extraverted” traits).

However, what the DSM’s editors have in recent decades done is avoid the use of potentially ambiguous labels and focus instead on behavioural criteria that may indicate impairment or pathology.  Especially since the 1970s, the DSM has acknowledged (even championed) the idea that many “things” once classified as deviant are really part of the “normal” human condition; reflecting that paradigm, introversion & extraversion came to be understood as “normal-range” personality traits, not indicators of disorder.  As a general principle, the DSM appears to restrict the use of such terms to instances where they relate to clinically significant impairment (the emphasis being on the effect on the patient rather than the mechanics of the process).  This approach was institutionalized with the release of DSM-5 (2013), in which the model clearly had become one of trait-based personality assessment.

To make the point, there exists in DSM-5 & DSM-5-TR (2022) the “Alternative DSM-5 Model for Personality Disorders” (Section III) which describes personality traits that (more or less) correspond to what popular culture calls extraversion and introversion.  The editors however avoid the two popular words and instead break personality into trait domains with pathological versions of ordinary traits.  What general readers think of as “introversion” now appears in the DSM as “Detachment”, although this is not pathologized unless it manifests in maladaptive extremes (chronic or persistent withdrawal; avoidance of social interaction; intimacy avoidance; a reluctance to form close relationships; anhedonia (an inability to experience pleasure); mistrust of others; restricted affectivity (limited emotional expression)).  So, introverts can to some degree be “happy” with their state and just prefer frequent solitude; what the DSM calls “Detachment” is invoked only when the trait is causing significant impairment or distress.

In the popular imagination, “extraversion” is associated with sociability, talkativeness, outgoing behaviour and enthusiasm (ie someone who is the “life of the party”).  That too is obviously a “spectrum condition” and the DSM has never listed a single domain which could be classed as “high extraversion”, which is good because high sociability isn’t intrinsically pathological.  Rather, should extraversion become maladaptive or extreme, the DSM classifies it across several domains:

(1) Attention-seeking (a facet of Antagonism) which manifests especially in Histrionic Personality Disorder.  Symptoms include an excessive need for approval, dramatic or provocative behaviour and an intense desire to be the centre of attention.

(2) Grandiosity (a facet of Antagonism) which is characteristic of Narcissistic Personality Disorder, the symptoms including social boldness (masking fragile self-esteem), entitlement and arrogance (which, in many cases, doesn’t manifest).

(3) Impulsivity & Risk Taking (facets of Disinhibition).  This is outgoing, sensation-seeking behaviour in its pathological form and is associated with thrill-seeking, poor impulse control and a tendency to act without considering the consequences.

(4) Low Detachment: This is acknowledged as the “adaptive end of Detachment” but the editors seem to list it only to “close the circle”; it’s there because logically it has to be but is certainly not treated as a disorder.

So, the DSM intentionally avoids the stark either/or introvert-extrovert dichotomy of popular use, a binary which doesn’t map onto the way the DSM treats personality traits: as spectrums, with only the margins (ie the dysfunctional extremes) described.  What that does is acknowledge that introversion & extraversion are part of the “normal” human condition and not pathological.  Additionally, it’s acknowledged that behaviour which in one subject may indicate “significant impairment or distress” might in another be of no concern.