
Saturday, January 10, 2026

Aphantasia

Aphantasia (pronounced ay-fan-tay-zhuh)

The inability voluntarily to recall or form mental images.

2015: The word (not the diagnosis) was coined by UK neurologist Dr Adam Zeman (b 1957), neuropsychologist Dr Michaela Dewar (b 1976) and Italian neurologist Sergio Della Sala (b 1955), first appearing in the paper Lives without imagery.  The construct was a- (from the Ancient Greek ἀ- (a-), used as a privative prefix meaning “not”, “without” or “lacking”) + phantasía (from the Greek φαντασία (“appearance”, “imagination”, “mental image” or “power of imagination”), from φαίνω (phaínō) (“to show”, “to make visible” or “to bring to light”)).  Literally, aphantasia can be analysed as meaning “an absence of imagination” or “an absence of mental imagery” and in modern medicine it’s defined as “the inability voluntarily to recall or form mental images”.  Even in Antiquity, there was some meaning shift in phantasía, Plato (circa 427-348 BC) using the word to refer generally to representations and appearances whereas Aristotle (384-322 BC) added a technical layer, his sense being a faculty mediating between perception (aisthēsis) and thought (noēsis).  It’s the Aristotelian adaptation (the mind’s capacity to form internal representations) which flavoured the use in modern neurology.  Aphantasia is a noun and aphantasic is a noun & adjective; the noun plural is aphantasics.

Scuola di Atene (The School of Athens, circa 1511), fresco by Raphael (Raffaello Sanzio da Urbino, 1483–1520), Apostolic Palace, Vatican Museums, Vatican City, Rome.  Plato and Aristotle are the figures featured in the centre.

In popular use, the word “aphantasia” can be misunderstood because of the paths taken in English by “phantasy”, “fantasy” and “phantasm”, all derived from the Ancient Greek φαντασία (phantasía) meaning “appearance, mental image, imagination”.  In English, this root was picked up via Latin and French but the multiple forms each evolved in distinct semantic trajectories.  The fourteenth-century phantasm came to mean “apparition, ghost, illusion” so was used of “something deceptive or unreal”, the connotation being “the supernatural; spectral”.  This appears to be the origin of the association of “phantas-” with unreality or hallucination rather than normal cognition.  In the fifteenth & sixteenth centuries, the spellings phantasy & fantasy were for a time interchangeable although divergence came with phantasy used in its technical senses of “mental imagery”; “faculty of imagination”; “internal representation”, this a nod to Aristotle’s phantasía.  Fantasy is the familiar modern form, used to suggest “a fictional invention; daydream; escapism; wish-fulfilment”, the connotation being “imaginative constructions (in fiction); imaginative excess (in the sense of “unreality” or the “dissociative”); indulgence (as in “speculative or wishful thoughts”)”.

While the word “aphantasia” didn’t exist until 2015, in the editions of the American Psychiatric Association's (APA) Diagnostic and Statistical Manual of Mental Disorders (DSM) published between 1952 (DSM-I) and 2013 (DSM-5), there was no discussion (or even mention) of a condition anything like “an inability voluntarily to recall or form mental images”.  That’s because despite being “a mental condition” induced by something happening (or not happening) in the brain, the phenomenon has never been classified as “a mental disorder”.  Instead it’s a cognitive trait or variation in the human condition and technically is a spectrum condition, the “pure” aphantasic being one end of the spectrum, the hyperaphantasic (highly vivid, lifelike mental imagery, sometimes called a “photorealistic mind's eye”) the other.  That would of course imply the comparative adjective would be “more aphantasic” and the superlative “most aphantasic” but neither is a standard form.

The rationale for the “omission” was the DSM’s inclusion criteria, which require some evidence of clinically significant distress or functional impairment attributable to a condition.  Aphantasia, in isolation, does not reliably meet this threshold in that many individuals have for decades functioned entirely “normally” without being aware they’re aphantasic while others presumably have died of old age in similar ignorance.  That does of course raise the intriguing prospect the mental health of some patients may have been adversely affected by the syndrome only by a clinician informing them of their status, thus making them realize what they were missing.  This, the latest edition of the DSM (DSM-5-TR (2022)) does not discuss.  The DSM does discuss imagery and perceptual phenomena in the context of other conditions (PTSD (post-traumatic stress disorder), psychotic disorders, dissociative disorders etc), but these references are to abnormal experiences, not the lifelong absence of imagery.  To the DSM’s editors, aphantasia remains a recognized phenomenon, not a diagnosis.

Given that aphantasia concerns aspects of (1) cognition, (2) inner experience and (3) mental representation, it wouldn’t seem unreasonable to expect the condition now described as aphantasia would have appeared in the DSM, even if only in passing or in a footnote.  However, in the seventy years between 1952 and 2022, over nine editions, there is no mention, even in DSM-5-TR (2022), the first volume released since the word was in 2015 coined.  That apparently curious omission is explained by the DSM never having been a general taxonomy of mental phenomena.  Instead, it’s (an ever-shifting) codification of the classification of mental disorders, defined by (1) clinically significant distress and/or (2) functional impairment and/or (3) a predictable course, prognosis and treatment relevance.  As a general principle the mere existence of an aphantasic state meets none of these criteria.

Crooked Hillary Clinton in orange Nina McLemore pantsuit, 2010.  If in response to the prompt "Imagine a truthful person" one sees an image of crooked Hillary Clinton (b 1947; US secretary of state 2009-2013), that is obviously wrong but is not an instance of aphantasia because the image imagined need not be correct; it needs just to exist.

The early editions (DSM-I (1952) & DSM-II (1968)) heavily were slanted to the psychoanalytic, focusing on psychoses, neuroses and personality disorders with no mention of any systematic treatment of cognition as a modular function; the matter of mental imagery (even as abstract thought separated from an imagined image), let alone its absence, wholly is ignored.  Intriguingly, given what was to come in the field, there was no discussion of the cognitive phenomenology beyond gross disturbances (ie delusions & hallucinations).  Even with the publication of the DSM-III (1980) & DSM-III-R (1987), advances in scanning and surgical techniques, cognitive psychology and neuroscience seem to have made little contribution to what the DSM’s editorial board decided to include and although DSM-III introduced operationalized diagnostic criteria (as a part of a more “medicalised” and descriptive psychiatry), the entries still were dominated by a focus on dysfunctions impairing performance, the argument presumably that it was possible (indeed, probably typical) for those with the condition to lead full, happy lives; the absence of imagery ability thus not considered a diagnostically relevant variable.  Even in sections on (1) amnestic disorders (a class of memory loss in which patients have difficulty forming new memories (anterograde) or recalling past ones (retrograde), not caused by dementia or delirium but a consequence of brain injury, stroke, substance abuse, infections or trauma), with treatment focusing on the underlying cause and rehabilitation, (2) organic mental syndromes or (3) neuro-cognitive disturbance, there was no reference to voluntary imagery loss as a phenomenon in its own right.

Although substantial advances in cognitive neuroscience meant by the 1990s neuropsychological deficits were better recognised, both the DSM-IV (1994) and DSM-IV-TR (2000) continued to be restricted to syndromes with behavioural or functional consequences.  In a way that was understandable because the DSM still was seen by the editors as a manual for working clinicians who were most concerned with helping those afflicted by conditions with clinical salience; the DSM has never wandered far into subjects which might be matters of interesting academic research and mental imagery continued to be mentioned only indirectly, hallucinations (percepts without stimuli) and memory deficits (encoding and retrieval) both discussed only in consequence of their effect on a patient, not as phenomena.  The first edition for the new century was DSM-5 (2013) and what was discernible was that discussions of major and mild neuro-cognitive disorders were included, reflecting the publication’s enhanced alignment with neurology but even then, imagery ability is not assessed or scaled: not possessing the power of imagery was not listed as a symptom, specifier, or associated feature.  So there has never in the DSM been a category for benign cognitive variation and that is a product of a deliberate editorial stance rather than an omission, many known phenomena not psychiatrised unless in some way “troublesome”.

The term “aphantasia” was coined to describe individuals who lack voluntary visual mental imagery, often discovered incidentally and not necessarily associated with brain injury or psychological distress.  In 2015 the word was novel but the condition had been documented for more than a century, Sir Francis Galton (1822–1911) in a paper published in 1880 describing what would come to be called aphantasia.  That work was a statistical study on mental imagery which doubtless was academically solid but Sir Francis’s reputation later suffered because he was one of the leading lights in what was in Victorian times (1837-1901) the respectable discipline of eugenics.  Eugenics rightly became discredited so Sir Francis was to some extent retrospectively “cancelled” (something like the Stalinist concept of “un-personing”) and these days his seminal contribution to the study of behavioural genetics is acknowledged only grudgingly.

Galton in 1880 noted a wide variation in “visual imagination” (ie it was understood as a “spectrum condition”) and in the same era, in psychology publications the preferred term seems to have been “imageless thought”.  In neurology (and trauma medicine generally) there were many reports of patients losing the power of imagery after a brain injury but no agreed name was ever applied because the interest was more in the injury.  Unawareness that some people simply lacked the faculty presumably was widespread among the general population because, as Galton wrote: “To my astonishment, I found that the great majority of the men of science to whom I first applied, protested that mental imagery was unknown to them, and they looked on me as fanciful and fantastic in supposing that the words ‘mental imagery’ really expressed what I believed everybody supposed them to mean. They had no more notion of its true nature than a colour-blind man who has not discerned his defect has of the nature of colour.”

His paper must have stimulated interest because one psychologist reported some subjects possessing what he called a “typographic visual type” imagination in which ideas (which most would visualize as an image of some sort) would manifest as “printed text”.  That was intriguing because, in the same way a computer in some respects doesn’t distinguish between an image file (JPEG, TIFF, WebP, AVIF etc) which is (1) a picture of someone and one which is (2) their name in printed form, it would seem to imply at least some who are somewhere on the aphantasia spectrum retain the ability to visualize printed text, just not the object referenced.  Professor Zeman says he first became aware of the condition in 2005 when a patient reported having lost the ability to visualize following minor surgery and after the case was in 2010 documented in the medical literature in the usual way, it provoked a number of responses in which multiple people informed Zeman they had never in their lifetime been able to visualize objects.  This was the origin of Zeman and his collaborators coining “congenital aphantasia”, describing individuals who had never enjoyed the ability to generate voluntary mental images.  Because it was something which came to general attention in the age of social media, great interest was triggered in the phenomenon and a number of “on-line tests” were posted, the best-known of which was the request for readers to “imagine a red apple” and rate their “mind's eye” depiction of it on a scale from 1 (photorealistic visualisation) through to 5 (no visualisation at all).  For many, this was variously (1) one’s first realization they were aphantasic or (2) an appreciation one’s own ability or inability to visualise objects was not universal.

How visualization can manifest: Lindsay Lohan and her lawyer in court, Los Angeles, December 2011.  If an aphantasic person doesn't know about aphantasia and doesn't know other people can imagine images, their life is probably little different from anyone else's; it's just their mind has adapted to handle concepts in another way.

Top right: What’s thought “normal” visualization (thought to be possessed by most of the population) refers to the ability to imagine something like a photograph of what’s being imagined.  This too is a spectrum condition in that some will be able to imagine an accurate “picture”, something like a HD (high definition) photograph, while others will “see” something less detailed, sketchy or even wholly inaccurate.  However, even if, when asked to visualize “an apple”, one instead “sees a banana”, that is not an indication of aphantasia, a condition which describes only an absence of an image.  Getting it that wrong is an indication of something amiss but it’s not aphantasia.

Bottom left: “Seeing” text in response to being prompted to visualize something was the result Galton in 1880 reported as such a surprise.  It means the brain understands the concept of what is being described; it just can’t be imagined as an image.  This is one manifestation of aphantasia but it’s not related to the “everything is text” school of post-modernism.  Jacques Derrida’s (1930-2004) fragment “Il n'y a pas de hors-texte” (literally “there is no outside-text”) is one of the frequently misunderstood phrases from the murky field of deconstruction but it has nothing to do with aphantasia (although dedicated post-modernists probably could prove a relationship).

Bottom right: The absence of any image (understood as a “blankness” which does not necessarily imply “whiteness” or “blackness” although this is the simple way to illustrate the concept), whether text or to some degree photorealistic, is classic aphantasia.  The absence does not mean the subject doesn’t understand the relevant object or concept; it means only that their mental processing does not involve imagery and for as long as humans have existed, many must have functioned in this way, their brains adapted to the imaginative range available to them.  What this must have meant was many became aware of what they were missing only when the publicity about the condition appeared on the internet, an interesting example of “diagnostic determinism”.

WebMD's classic Aphantasia test.

The eyes are an out-growth of the brain and WebMD explains aphantasia is caused by the brain’s visual cortex (the part of the brain that processes visual information from the eyes) “working differently than expected”, noting the often quoted estimate of it affecting 2-4% of the population may be understated because many may be unaware they are “afflicted”.  It’s a condition worthy of more study because aphantasics handle the characteristic by processing information differently from those who rely on visual images.  There may be a genetic element in aphantasia and there’s interest too among those researching “Long Covid” because the symptom of “brain fog” can manifest much as does aphantasia.

Aphantasia may have something to do with consciousness because aphantasics can have dreams (including nightmares) which can to varying degrees be visually rich.  There’s no obvious explanation for this but while aphantasia is the inability voluntarily to generate visual mental imagery while awake, dreaming is an involuntary perceptual experience generated during sleep; while both are mediated by neural mechanisms, these clearly are not identical but presumably must overlap.  The conclusions from research at this stage remain tentative but the current neuro-cognitive interpretation seems to suggest voluntary (conscious) imagery relies on top-down activation of the visual association cortex while (unconscious) dream imagery relies more on bottom-up and internally driven activation during REM (rapid eye movement) sleep.  What that would seem to imply is that in aphantasia, the former pathway is impaired (or at least inaccessible), while the latter may remain intact (or accessible).

The University of Queensland’s illustration of the phantasia spectrum.

The opposite syndrome is hyperphantasia (having extremely vivid, detailed, and lifelike mental imagery) which can be a wonderful asset but can also be a curse, rather as hyperthymesia (known also as HSAM (Highly Superior Autobiographical Memory) and colloquially as “total recall”) can be disturbing.  Although it seems not to exist in the sense of “remembering everything, second-by-second”, there are certainly those who have an extraordinary recall of “events” in their life and this can have adverse consequences for mental health because one of the mind’s “defensive mechanisms” is forgetting or at least suppressing memories which are unwanted.  Like aphantasia & hyperphantasia, hyperthymesia is not listed by the DSM as a mental disorder; it is considered a rare cognitive trait or neurological phenomenon although like the imaging conditions it can have adverse consequences and these include disturbing “flashbacks”, increased rumination and increased rates of anxiety or obsessive tendencies.

Friday, August 8, 2025

Carnival

Carnival (pronounced kahr-nuh-vuhl)

(1) A traveling amusement show, having sideshows, rides etc.

(2) Any merrymaking, revelry, or festival, as a program of sports or entertainment.

(3) In the Christian ecclesiastical calendar, the season immediately preceding Lent, often observed with merrymaking; Shrovetide.

(4) A festive occasion or period marked by merrymaking, processions etc and historically much associated with Roman Catholic countries in the period just before Lent.

(5) A sports meeting.

(6) In literary theory (as the noun carnivalization & verb carnivalize), to subvert (orthodox assumptions or literary styles) through humour and chaos.

(7) In sociology, a context in which transgression or inversion of the social order is given temporary license (an extension of the use in literary theory).

(8) Figuratively, a gaudily chaotic situation.

(9) As a modifier (often as “carnival atmosphere”) a festive atmosphere.

1540–1550: From the Middle French carnaval, from the Italian carnevale, from the Old Italian carnelevare (taking meat away), from older Italian forms such as the Milanese carnelevale or Old Pisan carnelevare (to remove meat (literally “raising flesh”)), the construct built from the Latin caro (flesh (originally “a piece of flesh”)) from the primitive Indo-European root sker- (to cut) + levare (lighten, raise, remove), from the primitive Indo-European root legwh- (not heavy, having little weight).  Etymologists are divided on the original source of the term used by the Church, the alternatives being (1) carnem levare (to put away flesh), (2) carnem levāmen (meat dismissal), (3) carnuālia (meat-based country feast) and (4) carrus nāvālis (boat wagon; float).  What all agree upon is the ecclesiastical use would have come from one of the forms related to “meat” and the folk etymology favors the Medieval Latin carne vale (flesh, farewell!).  Spreading from the use in Christian feast days, by at least the 1590s it was used in the sense of “feasting or revelry in general” while the meaning “a circus or amusement fair” appears to be a 1920s adoption in US English.  The synonyms can include festival, celebration, festivity, fiesta, jubilee, gala, fete, fête, fest, fair, funfair, exhibit, exhibition, revelry, merriment, rejoicing, jamboree, merrymaking, mardi gras, jollity, revel, jollification, exposition and show.  Which is chosen will be dependent on region, context, history etc and (other than in ecclesiastical use) rules mostly don’t exist but there seems to be a convention that a “sporting carnival” is a less formal event (ie non-championship or lower level competitions).  The alternative spelling carnaval is obsolete.  Carnival & carnivalization are nouns, carnivalize, carnivalizing & carnivalized are verbs, and carnivalic, carnivalistic, carnivalesque, carnivallike, precarnival & noncarnival are adjectives; the noun plural is carnivals.

Not just meat: Francis (1936-2025; pope 2013-2025) on fasting for Lent.

Originally, a carnival was a feast observed by Christians before the Lenten fast began and wasn’t a prelude to a sort of proto-veganism.  It was a part of one of religion’s many dietary rules, one which required Christians to abstain from meat during Lent (particularly on Fridays and during certain fast days), carnival the last occasion on which meat was permissible before Easter.  The Christian practice of abstaining from meat evolved as part of a broader theology of penance, self-denial, and imitation of Christ’s suffering, the rationale combining biblical precedent, symbolic associations and early ascetic traditions, the core of the concept Christ’s 40 days of fasting in the wilderness (Matthew 4:1–11, Luke 4:1–13).  Theologically, the argument was that for one’s eternal soul to enter the Kingdom of Heaven, a price to be paid was Imitatio Christi (earthly participation in Christ’s suffering).  The early church much valued suffering (for the congregants if not the clergy and nobility) and the notion remains an essential theme in some Christian traditions, one which can be summed up in the helpful advice: “For everything you do, there’s a price to be paid.”

Donald Trump (b 1946; US president 2017-2021 and since 2025) in 2016 on his private jet, fasting for Lent.

By voluntarily abstaining from certain foods, Christians imitated Christ’s self-denial and prepared spiritually for Easter: sharing in His suffering to grow in holiness.  Meat was seen as a symbol of feasting and indulgence, an inheritance from Antiquity when “flesh of the beasts of the field” was associated with celebration rather than everyday subsistence, the latter something sustained typically by seafood, fruits and grains, so voluntarily (albeit at the behest of the Church) choosing temporarily to renounce meat symbolized forgoing luxury and bodily pleasure, cultivating humility and penitence.  As well as the theological, there was also a quasi-medical aspect to what Tertullian (Quintus Septimius Florens Tertullianus, circa 155–circa 220) commended as “forsaking worldly indulgence” in that fasting took one’s thoughts away from earthly delights, allowing a focus on “prayer and spiritual discipline”, strengthening the soul against “sinful temptations”.  Another layer was added by the Patristics (from the Latin pater (father)), a school of thought which explored the writings and teachings of the early Church Fathers.  Although it was never a universal view in Patrology, there were those who saw in the eating of meat a connection to animal sacrifice and blood, forbidden in the Old Testament’s dietary laws and later spiritualized in Christianity, thus the idea of abstinence as a distancing from violence and sensuality.  Finally, there was the special significance of Fridays, which, as "Good Friday" reflected the remembrance of the crucifixion of Christ and his death at Calvary (Golgotha); the early Christians treated every Friday as a mini-fast and later this would be institutionalized as Lent.

Lindsay Lohan arriving at the Electric Daisy Carnival (left) and detail of the accessory worn on her right thigh (right), Memorial Coliseum, Los Angeles, June 2010.  The knee-high boots were not only stylish but also served to conceal the court-mandated SCRAM (Secure Continuous Remote Alcohol Monitor) bracelet.

The allowance of fish during Lent had both pragmatic and theological origins, its place in the Christian diet a brew of symbolism, biblical precedent and cultural context.  As a legal and linguistic point, in the Greco-Roman scheme of things fish was not thought “flesh meat” which was understood as coming from warm-blooded land animals and birds.  Fish, cold-blooded and aquatic, obviously were different and belonged to a separate category, one which Christianity inherited and an implication of the distinction was seafood being viewed as “everyday food” rather than an indulgent luxury.  This was a thing also of economics (and thus social class), the eating of fish much associated with the poorer coastal dwellers whereas meat was more often seen on urban tables.  Notably, there was also in this a technological imperative: in the pre-refrigeration age, in hot climates, often it wasn’t possible safely to transport seafood inland.  The Biblical symbolism included Christ feeding the multitudes with a few “loaves and fishes” (Matthew 14:13–21), several of the apostles were fishermen who Christ called upon to be “fishers of men” (Mark 1:16–18) and the ichthys (fish symbol) was adopted as an early Christian emblem for Christ Himself.  Collectively, this made fish an acceptably modest food for a penitential season.  All that might have been thought justification enough but, typically, Medieval scholars couldn’t resist a bit of gloss and the Italian Dominican friar, philosopher & theologian Saint Thomas Aquinas (1225–1274) decided abstinence aimed to “curb the concupiscence of the flesh” and, because meat generated more “bodily heat” and pleasure than fish, it was forbidden while fish was not.  That wasn’t wholly speculative and reflected the humoral theory from Antiquity, still an orthodoxy during the Middle Ages: fish being seen as lighter, cooler and less sensual.

Notting Hill Carnival, London.

Traditionally, there was also a Lenten prohibition of dairy products and eggs, each proscription with its own historical and symbolic logic and the basis of Shrove Tuesday (Pancake Day) and Easter eggs (though not the definitely un-Christian Easter bunny).  The strictness derived partly from Jewish precedents, notably the vegetarian edict in Daniel 10:2–3 and the idea of a “return to Edenic simplicity” where man would eat only plants (Genesis 1:29), but also from an aversion to links with sexuality and fertility, eggs obviously connected with sexual reproduction and dairy with lactation.  What this meant was early Christian asceticism sought to curb bodily impulses and anything connected with fleshly generation and (even if indirectly) thoughts of sex.

Historically, a time of absolution when confessions were made in preparation for Lent, Shrovetide described the three days immediately preceding Lent (Shrove Sunday, Shrove Monday & Shrove Tuesday, preceding Ash Wednesday).  The construct being shrove + -tide, the word was from the late Middle English shroftyde.  Shrove was the simple past of shrive, from the Middle English shryven, shriven & schrifen, from the Old English sċrīfan (to decree, pass judgement, prescribe; (of a priest) to prescribe penance or absolution), from the Proto-West Germanic skrīban, from the late Proto-Germanic skrībaną, a borrowing from the Latin scrībō (write).  The word may be compared with the West Frisian skriuwe (to write), the Low German schrieven (to write), the Dutch schrijven (to write), the German schreiben (to write), the Danish skrive (to write), the Swedish skriva (to write) and the Icelandic skrifa (to write).  The -tide suffix was from the Middle English -tide & -tyde, from the Old English -tīd (in compounds), from tīd (point or portion of time, due time, period, season; feast-day, canonical hour).  Before refrigeration, eggs and dairy naturally accumulated during springtime as hens resumed laying and animals produced more milk.  Being banned during Lent, stocks thus had to be consumed lest they be wasted so a pragmatic way to ensure economy of use was the pancake (made with butter, milk & eggs), served on the feast of Shrove Tuesday (Pancake Day).  Following Easter, when eggs returned to the acceptable list, “Easter eggs” were a natural festive marker of the fast’s end.

Carnival Adventure and Carnival Encounter off Australia’s eastern Queensland coast.

Although dubbed “floating Petri dishes” because of the high number of food poisoning & norovirus cases, cruise ships remain popular, largely because, on the basis of cost-breakdown, they offer value-for-money packages few land-based operators can match.  The infections are so numerous because (1) there are thousands of passengers & crew in a closed, crowded environment, (2) an extensive use of buffets and high-volume food service, (3) a frequent turnover of crew & passengers, (4) port visits to places with inconsistent sanitation, health & food safety standards and (5) sometimes delayed reporting and patient isolation.

However, although the popular conception of Medieval Western Christendom is of a dictatorial, priest-ridden culture, the Church was a political structure and it needed to be cognizant of practicalities and public opinion.  Even dictatorships can maintain their authority only with public consent (or at least acquiescence) and in many places the Church recognized burdensome rules could be counter-productive, onerous dietary restrictions resented especially by the majority engaged for their living in hard, manual labor.  Dispensations (formal exceptions) became common with bishops routinely relaxing the rules for the ill, those pregnant or nursing or workers performing physically demanding tasks.  As is a common pattern when rules selectively are eased, a more permissive environment was by the late Middle Ages fairly generalized (other than for those who chose to live by monastic standards).

Carnival goers enjoying the Sydney Gay & Lesbian Mardi Gras: This is not what Medieval bishops would have associated with the word “carnival” but few events better capture the spirit of the phrase “carnival atmosphere”.

The growth of dispensations (especially in the form of “indulgences” which were a trigger for the Protestant Reformation) was such it occurred to the bishops they’d created a commodity and commodities can be sold.  This happened throughout Europe but, in France and Germany, the “system” became institutionalized, the faithful even able to pay “butter money” for the privilege of eating the stuff over Lent (a kind of inverted “fat tax”!) with the proceeds devoted to that favourite capital works programme of bishops & cardinals: big buildings.  The sixteenth century tower on Normandy’s Rouen Cathedral was nicknamed “Butter Tower” although the funds collected from the “tax” covered only part of the cost; apparently even the French didn’t eat enough butter.  As things turned out, rising prosperity and the population drifts towards towns and cities meant consumption of meat and other animal products increased, making restrictions harder to enforce and the Protestant reformers anyway rejected mandatory fasting rules, damning them as man-made (“Popery!” the most offensive way they could think to express that idea) rather than divine law.  Seeing the writing nailed to the door, one of the results of the Council of Trent (1545–1563) was that while the Church reaffirmed fasting, eggs and dairy mostly were allowed and the ban on meat was restricted to Fridays and certain fast days in the ecclesiastical calendar.

Archbishop Daniel Mannix in his library at Raheen, the Roman Catholic Church’s episcopal palace in Melbourne, 1917-1981.

By the twentieth century, it was clear the Holy See was fighting a losing battle and in February 1966, Paul VI (1897-1978; pope 1963-1978) promulgated the Apostolic Constitution Paenitemini (best translated as “to be penitent”), making abstinence from meat on Fridays optional outside Lent and retaining only Ash Wednesday and Good Friday as obligatory fast days for Catholics.  It was a retreat very much in the corrosive spirit of the Second Vatican Council (Vatican II, 1962-1965) and an indication the Church was descending to a kind of “mix & match” operation, people able to choose the bits they liked, discarding or ignoring anything tiresome or too onerous.  In truth, plenty of priests had been known on Fridays to sprinkle a few drops of holy water on their steak and declare “In the name of our Lord, you are now fish”.  That was fine for priests but for the faithful, dispensation was often the luck of the clerical draw.  At a time in the late 1940s when there was a shortage of good quality fish in south-east Australia, Sir Norman Gilroy (1896–1977; Roman Catholic Archbishop of Sydney 1940-1971, appointed cardinal 1946) granted dispensation but the stern Dr Daniel Mannix (1864–1963; Roman Catholic Archbishop of Melbourne 1917-1963) refused, so when two politicians from New South Wales (Ben Chifley (1885–1951; prime minister of Australia 1945-1949) and Fred Daly (1912–1995)) arrived in the parliamentary dining room for dinner, Chifley’s order was: “steaks for me and Daly, fish for the Mannix men.”

In the broad, a carnival was an occasion, event or season of revels, merrymaking, feasting and entertainments (the Spanish fiestas a classic example) although they could assume a political dimension, some carnivals staged to be symbolic of the disruption and subversion of authority.  The idea was a “turning upside down of the established hierarchical order” and names used included “the Feast of Fools”, “the Abbot of Misrule” and “the Boy Bishop”.  With a nod to this tradition, in literary theory the concept of “carnivalization” was introduced by the Russian philosopher & literary critic Mikhail Bakhtin (1895–1975), the word first used in the essay From the Prehistory of Novelistic Discourse (written in 1940), later collected in his book The Dialogic Imagination (1975).  What carnivalization described was the penetration or incorporation of carnival into everyday life and its “shaping” effect on language and literature.

The Socratic dialogues (most associated with the writings of the Greek philosophers Xenophon (circa 430–355 BC) and Plato (circa 427-348 BC)) are regarded as early examples of a kind of carnivalization in that what appeared to be orthodox “logic” was “stood on its head” and shown to be illogical, although Menippean satire (named after the third-century-BC Greek Cynic Menippus) is, in the extent of its irreverence, closer to the modern understanding which finds expression in personal satire, burlesque and parody.  Bakhtin’s theory suggested the element of carnival in literature is subversive in that it seeks to disrupt authority and introduce alternatives: a deliberate affront to the canonical thoughts of Renaissance culture.  In modern literary use the usual term is “carnivalesque”, referring to that which seeks to subvert (“liberate” sometimes the preferred word) assumptions or orthodoxies by the use of humor or some chaotic element.  This can be on a grand scale (ie an entire cultural movement) or as localized as some malcontent disrupting their book club (usually polite affairs where novels are read and ladies sit around talking about their feelings).

Portrait of Leo Tolstoy (1887), oil on canvas by Ilya Repin (1844-1930), Tretyakov Gallery, Moscow, Russia.

Bakhtin expanded on the theme in his book Problems of Dostoevsky's Poetics (1929) by contrasting the novels of Leo Tolstoy (1828-1910) and Fyodor Dostoevsky (1821–1881).  Tolstoy’s fiction he classified as “monologic”, in which all is subject to the author's controlling purpose and hand, whereas for Dostoevsky the text is “dialogic” or “polyphonic”, an array of different characters expressing a variety of independent views not “controlled” by the author merely to represent the author's viewpoint.  Thus deconstructed, Bakhtin defined these views as “not only objects of the author's word, but subjects of their own directly significant word as well” and thus vested with their own dynamic, a liberating influence which, as it were, “conceptualizes” reality, lending freedom to the individual character and subverting the type of “monologic” discourse characteristic of many nineteenth century authors (typified by Tolstoy).

Portrait of Fedor Dostoyevsky (1872), oil on canvas by Vasily Perov (1834-1882), Tretyakov Gallery, Moscow, Russia.

Dostoevsky’s story Bobok (1873) is cited as an exemplar of carnival.  It has characters with unusual freedom to speak because, being dead, they’re wholly disencumbered of natural laws, able to say what they wish and speak truth for fun.  However, Bakhtin did acknowledge this still is literature and didn’t claim a text could be an abstraction uncontrolled by the author (although such things certainly could be emulated): Dostoevsky (his hero) remained in control of his material because the author is the directing agent.  So, given subversion, literary and otherwise, clearly has a history dating back doubtlessly as many millennia as required to find an orthodoxy to subvert, why was the concept of carnivalization deemed a necessary addition to literary theory?  It went to the form of things, carnivalization able especially to subvert because it tended to be presented in ways less obviously threatening than might be typical of polemics or actual violence.

Thursday, December 12, 2024

Bulla

Bulla (pronounced bool-uh or buhl-uh)

(1) A seal attached to an official document; in the Holy See, a leaden seal affixed to certain edicts issued by the papal chancellery (a papal bull), having a representation of the saints Peter and Paul on one side and the name of the reigning pope on the other.

(2) In archaeology, a clay envelope or hollow ball, typically with seal impressions or writing on its outside indicating its contents.

(3) In Ancient Rome, a type of ornament, especially an amulet worn around the neck as a pendant (or boss), usually by children of “the better classes” (mostly boys) as a protective charm.

(4) In medicine, a large vesicle; alternative name for blister.

(5) In pathology, the tympanic part of a temporal bone (having a bubble-like appearance); any of several hollow structures as features of bones.

(6) In zoology, a blister-like or bubble-like prominence of a bone, as that of the tympanic bone in the skull of certain mammals.

(7) In archaeology and linguistics, a clay envelope, hollow ball or token used in ancient Mesopotamian record-keeping; the link being the rounded, bubble-like form of the objects.

(8) A rich Jamaican cake made with molasses and spiced with ginger and nutmeg.

(9) In surgical use, as bullectomy (a procedure in which bullae (air-filled spaces in the lung larger than 10 mm in diameter, filled with oxygen-depleted air) are removed) and bullostomy (the making of a hole through a bulla).

Circa 1845: From the Latin bulla (round swelling, stud, boss, knob (literally “bubble”)), either from the Latin bullire (to boil), or from the Gaulish, from the primitive Indo-European bew- or beu- (a swelling) or bhel- (to blow, inflate, swell) which may have formed a large group of words meaning “much, great, many” (and also words associated with swelling, bumps, blisters and such, the source also of the Lithuanian bulė (buttocks) and the Middle Dutch puyl (bag)); etymologists remain divided over any link with the Latin bucca (cheek).  In medieval times, it referred to the seal (or stamp) attached to official documents because of its rounded, blister-like shape, familiar from many uses.  The speculative link with the Latin bullire (to boil) was an allusion to the need for heat to be applied to melt or partially melt the material (gold, lead, wax etc) used in the making of seals (once thus softened, the impression was applied).  Historically, while wax seals were the most common, official imperial seals were gold and papal seals of lead (although some were gold).  The use to describe certain documents issued by the papal chancellery is an adoption of Medieval Latin.  Although it was never an absolute rule (the seal with a representation of the saints Peter and Paul on one side and the name of the reigning pope on the other has appeared variously), its existence usually indicates a papal document is a bulla, a specific type of papal document distinguished by its formality, purpose and authentication.  Bulla is a noun; the noun plural is bullas (the Latin bullae used of the papal documents).

Seal of the appropriation of Ospringe Hospital (Headcorn, Kent) by the Archbishop of Canterbury, Boniface of Savoy, in accord with a papal bull of 31 March 1267.

Bulls begin with the phrase Episcopus Servus Servorum Dei (The Bishop, Servant of the Servants of God) and are written in a formal style.  The significance of a document being a bull is that technically it is a decree with enduring legal & doctrinal authority, including ex cathedra pronouncements or administrative acts (which can be as procedural as creating religious orders or dioceses).  In this they differ from (1) encyclicals, which are letters intended for broader purposes, addressed to bishops, clergy and the faithful, often dealing with theological or social issues, (2) Apostolic Constitutions, which usually deal with issues of governance, the promulgation of liturgical texts or matters pursuant to earlier bullae and (3) the Motu Proprio (literally “on his own initiative”), edicts issued personally by the pope which can be used for just about any purpose although they’re most associated with rulings which provide an “instant solution” to a troublesome or controversial matter on which it’s not been possible to find consensus; the Motu Proprio may thus be compared to a “royal decree”.  Papal bulls were more common in the medieval and early modern periods when formal seals were the primary means of authentication but today they are rare, most communication from the Vatican in the form of apostolic letters or exhortations, not all with origins in the papal chancellery.

The last papal resignation but one

Red Bull Chuck Wagon Restaurant (No Bum Bull Served Here), Winnemucca, Nevada, USA, circa 1967.

Even when absolute monarchies were more common, kings usually took care to placate at least elite opinion and today, although the constitutional arrangements in Saudi Arabia, Brunei, Oman and Eswatini (the old Swaziland) remain, on paper, absolute monarchies, even there things are not done quite as once they were.  The Holy See remains an absolute monarchy and is now the only theocracy so structured, although doubtless many popes have lamented their authority seems to exist more in the minds of canon lawyers than among the curia or the flock, something exacerbated now that malcontents can no longer be burned at the stake (as far as is known).  Francis (b 1936; pope since 2013) may recall the words of a world-weary Benedict XIV (1675–1758; pope 1740-1758): “The pope commands, his cardinals do not obey, and the people do what they wish.”

Papal Bull issued by Urban VIII (1568–1644; pope 1623-1644).  By the mid-fifteenth century, papal bulls had ceased to be used for general public communications and were restricted to the more formal or solemn matters.  The papal lead seals (the spellings bulla & bolla both used) were attached to the vellum document by cords made of hemp or silk, looped through slits.

But popes still have great powers not subject to checks & balances or constitutional review, the best known of which is “papal infallibility”.  The Roman Catholic Church’s dogma of papal infallibility holds that a pope’s rulings on matters of faith and doctrine are infallibly correct and cannot be questioned; when making such statements, a pope is said to be speaking ex cathedra (literally “from the chair” (of the Apostle St Peter, the first pope)).  Although ex cathedra pronouncements had been issued since medieval times, as a point of canon law the doctrine was codified first at the First Ecumenical Council of the Vatican (Vatican I; 1869–1870) in the document Pastor aeternus (shepherd forever).  Since Vatican I, the only ex cathedra decree has been Munificentissimus Deus (The most bountiful God), issued by Pius XII (1876–1958; pope 1939-1958) in 1950, in which was declared the dogma of the Assumption: that the Virgin Mary, “having completed the course of her earthly life, was assumed body and soul into heavenly glory”.  Pius XII never made explicit whether the assumption preceded or followed earthly death, a point no pope has since discussed although it would seem of some theological significance.  Prior to the solemn definition of 1870, there had been decrees issued ex cathedra.  In Ineffabilis Deus (Ineffable God (1854)), Pius IX (1792–1878; pope 1846-1878) defined the dogma of the Immaculate Conception of the Blessed Virgin Mary, an important point because of the theological necessity of Christ being born free of sin, a notion built upon by later theologians as the perpetual virginity of Mary.  That doctrine asserts Mary was “always a virgin, before, during and after the birth of Jesus Christ”, explaining the biblical references to brothers of Jesus either as children of Joseph from a previous marriage, cousins of Jesus, or just folk closely associated with the Holy Family.

Lindsay Lohan, posing with a can of Red Bull, photographed by Brian Adams (b 1959) for Harper’s Bazaar magazine, 2007.

Technically, papal infallibility may have been invoked only the once since codification but since the early post-war years, pontiffs have found ways to achieve the same effect, John Paul II (1920–2005; pope 1978-2005) & Benedict XVI (1927–2022; pope 2005-2013, pope emeritus 2013-2022) both adept at using what was in effect a personal decree, a power available to one who sits at the apex of what is in constitutional terms an absolute theocracy.  Critics have called this phenomenon “creeping infallibility” and its intellectual underpinnings owe much to the tireless efforts of Benedict XVI while he was head of the Inquisition (by then called the Congregation for the Doctrine of the Faith (CDF) and now renamed the Dicastery for the Doctrine of the Faith (DDF)) during the late twentieth century (the Holy See probably doesn't care but DDF is also the acronym, inter alia, for “drug & disease free” and (in gaming) “Doom definition file” and there's also the DDF Network which is an aggregator of pornography content).  So while not since 1950 formally invoked, popes have not been reluctant to “play the de facto infallibility card”, possibly thinking of the (probably apocryphal) remark attributed to John XXIII (1881-1963; pope 1958-1963): “When one is infallible, one has to be careful what one says.”

Bulla issued 17 July 1492 by Innocent VIII (1432–1492; pope 1484-1492) granting St Duthac’s Church (Tain) official permission to become a Collegiate Church.

But for a pope’s own purposes, a bulla can prove invaluable.  Pietro Angellerio (1215-1296) was for five months between July and December 1294 installed as Pope Celestine V.  Prior to his elevation, Celestine had for decades been a monk and hermit, living an anchorite existence in remote caves and subsisting on little more than wild vegetables, fruits, honey and the occasional locust, his unworldly background meaning he emerged as the ultimate compromise candidate, declared pope after a two-year deadlock in the church’s last non-conclave papal election.  The cardinals had been squabbling for all of those two years, which so upset the hermit in his cave that he wrote them a letter warning divine retribution would be visited upon them if they didn't soon elect a pope.  Realizing he was entirely un-political, without enemies and likely pliable, the cardinals promptly elected him by acclamation.

Lindsay Lohan mixing a Red Bull & mandarin juice while attending an event with former special friend Samantha Ronson (b 1977), Mandarin Oriental Hotel, London, February 2012.

Shocked, the hermit declined the appointment, only to have his own arguments turned on him, the cardinals insisting if he refused the office he would be defying God himself; trapped, he was crowned at Santa Maria di Collemaggio in Aquila, taking the name Celestine V.  The anchorite, lost in a world of power politics and low skulduggery, was utterly unsuited to the role and within weeks expressed the wish to abdicate and return to his solitary cave in the Abruzzi Mountains.  The cardinals told him it wasn’t possible and only God could release him from the office (with all that implies) but they couldn’t stop him consulting the lawyers, who drafted for him two bulls, the first codifying the regulations concerning a pope’s abdication and the second a sort of “enabling act”.  The second bull (Quia in futurum (for in the future)) restored the constitution Ubi periculum (Where there lies danger) and re-established the papal conclave (the constitution had been suspended by Adrian V (circa 1216-1276; pope 1276)).  The bulls having put in place the required mechanisms, while at Naples, Celestine V abdicated.

Brutum Fulmen issued by Pius V (1504–1572; pope 1566-1572), concerning the Damnation, Excommunication and Deposition of Elizabeth I (1533–1603; Queen of England & Ireland 1558-1603), by Thomas Barlow (circa 1608-1691; Lord Bishop of Lincoln 1675–1691).

That done, he intended to return to his cave but his successor, Boniface VIII (circa 1231-1303; pope 1294-1303), had no wish to have such a puritanical loose cannon at large (he feared some dissidents might proclaim him antipope) and imprisoned him (in agreeable circumstances) in the castle where ultimately he would die.  His resignation from the office was the last until that of Benedict XVI who in 2013 did rather better, retiring to a sort of papal granny flat in the Vatican where he lived (uniquely) as pope emeritus.  Celestine was canonized on 5 May 1313 by Clement V (circa 1265-1314; pope 1305-1314) and no subsequent pontiff has taken the name Celestine.

1966 Lamborghini Miura P400 re-painted in what most would probably call hot pink but professionals list as fashion fuchsia (Hex: #F400A1; RGB: 244, 0, 161; CMYK: 0, 100, 34, 4).  The Miura (1966-1973) was named after a breed of fighting bull and was the first Lamborghini to borrow an identity from bullfighting and the first to wear the corporate logo featuring a bull.  In the film The Italian Job (1969), a Miura is shown being crushed by a bulldozer but that was filmic trickery using a second, pre-wrecked car and the orange Miura seen driven through the Alps still exists.  In many places the film's car is referred to as "tangerine" or "orange", both of which well describe the vivid hue which the factory listed as Arancio Miura (meaning "Miura Orange" although the use in Italian of arancio to mean the color "orange" is untypical, the word used usually in the sense of "orange tree").  In a technological quirk, some film sites call the car "red" but that's a function of some prints or photographs not correctly maintaining the integrity of the original.  The term “hot pink” (Hex: #FF69B4; RGB: 255, 105, 180; CMYK: 0, 59, 29, 0) is used very loosely and has become the general term for bright shades while “baby pink” is used casually of most pastels.  In theory, the color palette is infinitely variable but the practical limitation is the range able to be perceived by the human eye, illustrated by the announcement in April 2025 of the “discovery” of a “new color” (“new” in the sense no human had ever seen it although it may well be common around the universe).
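For those curious about the color codes quoted above, the relationship between a hex code and its RGB triplet is purely mechanical: each pair of hexadecimal digits encodes one 8-bit channel (red, green, blue).  A minimal Python sketch (the function name hex_to_rgb is merely illustrative):

```python
def hex_to_rgb(hex_code: str) -> tuple:
    """Convert a hex color code such as '#F400A1' to an (R, G, B) tuple."""
    h = hex_code.lstrip("#")  # drop the leading '#'
    # Each two-digit hex pair is one 8-bit channel: red, green, blue
    return tuple(int(h[i:i + 2], 16) for i in range(0, 6, 2))

# The two pinks mentioned above:
print(hex_to_rgb("#F400A1"))  # fashion fuchsia → (244, 0, 161)
print(hex_to_rgb("#FF69B4"))  # hot pink → (255, 105, 180)
```

The CMYK figures cannot be derived so simply because that conversion depends on the color profile used for print.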

To this day Lamborghini still uses terms from the tradition of bullfighting for some models which perhaps is surprising given bullfighting is now not as socially respectable as it was during the 1960s but disapprobation of the “sport” is not new and Pius V (1504–1572; pope 1566-1572) as early as 1567 called the practice: “alien from Christian piety and charity”, “better suited to demons rather than men” and “public slaughter and butchery” fit for paganism but not Christendom.  Word nerds will be delighted to note Pius V’s ban on bullfighting was technically a “papal bull”.  De Salute Gregis Dominici (On the Salvation of the Lord’s Flock) was issued on 1 November 1567 as a formal proclamation with the papal lead bulla attached and, as an official decree, it was binding upon Church and Christian princes.  Appalled by the cruelty, Pius called bullfighting “a sin” and condemned the events as “spectacles of the devil”, prohibiting Christians from attending or participating under pain of excommunication.  However, like many papal thought bubbles down the ages which never quite make it to the status of doctrine, his ban was soon ignored and after his death the edict quietly was allowed to lapse.  Predictably, in Spain and Portugal, where bullfighting had deep cultural & political roots, the bulla was either ignored or resisted and Philip II (1527–1598; King of Spain 1556-1598), while as devout a Catholic as any man, was known as Felipe el Prudente (Philip the Prudent) for a reason and quietly he turned the royal blind eye, allowing bullfighting to continue.