
Friday, August 8, 2025

Carnival

Carnival (pronounced kahr-nuh-vuhl)

(1) A traveling amusement show, having sideshows, rides etc.

(2) Any merrymaking, revelry, or festival, as a program of sports or entertainment.

(3) In the Christian ecclesiastical calendar, the season immediately preceding Lent, often observed with merrymaking; Shrovetide.

(4) A festive occasion or period marked by merrymaking, processions etc and historically much associated with Roman Catholic countries in the period just before Lent.

(5) A sports meeting.

(6) In literary theory (as the noun carnivalization & verb carnivalize), to subvert (orthodox assumptions or literary styles) through humour and chaos.

(7) In sociology, a context in which transgression or inversion of the social order is given temporary license (an extension of the use in literary theory).

(8) Figuratively, a gaudily chaotic situation.

(9) As a modifier (often as “carnival atmosphere”), a festive atmosphere.

1540–1550: From the Middle French carnaval, from the Italian carnevale, from the Old Italian carnelevare (taking meat away), from older Italian forms such as the Milanese carnelevale or Old Pisan carnelevare (to remove meat (literally “raising flesh”)), the construct built from the Latin caro (flesh (originally “a piece of flesh”)), from the primitive Indo-European root sker- (to cut), + levare (lighten, raise, remove), from the primitive Indo-European root legwh- (not heavy, having little weight).  Etymologists are divided on the original source of the term used by the Church, the alternatives being (1) carnem levare (to put away flesh), (2) carnem levāmen (meat dismissal), (3) carnuālia (meat-based country feast) and (4) carrus nāvālis (boat wagon; float).  What all agree upon is that the ecclesiastical use would have come from one of the forms related to “meat” and the folk etymology favors the Medieval Latin carne vale (flesh, farewell!).  Spreading from the use in Christian feast days, by at least the 1590s it was used in the sense of “feasting or revelry in general” while the meaning “a circus or amusement fair” appears to be a 1920s adoption in US English.  The synonyms can include festival, celebration, festivity, fiesta, jubilee, gala, fete, fête, fest, fair, funfair, exhibit, exhibition, revelry, merriment, rejoicing, jamboree, merrymaking, mardi gras, jollity, revel, jollification, exposition and show.  Which is chosen will depend on region, context, history etc and (other than in ecclesiastical use) rules mostly don’t exist, but there seems to be a convention that a “sporting carnival” is a less formal event (ie non-championship or lower level competitions).  The alternative spelling carnaval is obsolete.  Carnival & carnivalization are nouns, carnivalize, carnivalizing & carnivalized are verbs, and carnivalic, carnivalistic, carnivalesque, carnivallike, precarnival & noncarnival are adjectives; the noun plural is carnivals.

Not just meat: Francis (1936-2025; pope 2013-2025) on fasting for Lent.

Originally, a carnival was a feast observed by Christians before the Lenten fast began and wasn’t a prelude to a sort of proto-veganism.  It was a part of one of religion’s many dietary rules, one which required Christians to abstain from meat during Lent (particularly on Fridays and during certain fast days), carnival the last occasion on which meat was permissible before Easter.  The Christian practice of abstaining from meat evolved as part of a broader theology of penance, self-denial, and imitation of Christ’s suffering, the rationale combining biblical precedent, symbolic associations and early ascetic traditions, the core of the concept Christ’s 40 days of fasting in the wilderness (Matthew 4:1–11, Luke 4:1–13).  Theologically, the argument was that for one’s eternal soul to enter the Kingdom of Heaven, a price to be paid was Imitatio Christi (earthly participation in Christ’s suffering).  The early church much valued suffering (for the congregants if not the clergy and nobility) and the notion remains an essential theme in some Christian traditions, one which can be summed up in the helpful advice: “For everything you do, there’s a price to be paid.”

Donald Trump (b 1946; US president 2017-2021 and since 2025) in 2016 on his private jet, fasting for Lent.

By voluntarily abstaining from certain foods, Christians imitated Christ’s self-denial and prepared spiritually for Easter: sharing in His suffering to grow in holiness.  Meat was seen as a symbol of feasting and indulgence, an inheritance from Antiquity when “flesh of the beasts of the field” was associated with celebration rather than everyday subsistence, the latter something sustained typically by seafood, fruits and grains, so voluntarily (albeit at the behest of the Church) choosing temporarily to renounce meat symbolized forgoing luxury and bodily pleasure, cultivating humility and penitence.  As well as the theological, there was also a quasi-medical aspect to what Tertullian (Quintus Septimius Florens Tertullianus, circa 155–circa 220) commended as “forsaking worldly indulgence” in that fasting took one’s thoughts away from earthly delights, allowing a focus on “prayer and spiritual discipline”, strengthening the soul against “sinful temptations”.  Another layer was added by the Patristics (from the Latin pater (father)), a school of thought which explored the writings and teachings of the early Church Fathers.  Although it was never a universal view in Patrology, there were those who saw in the eating of meat a connection to animal sacrifice and blood, forbidden in the Old Testament’s dietary laws and later spiritualized in Christianity, thus the idea of abstinence as a distancing from violence and sensuality.  Finally, there was the special significance of Fridays which, as “Good Friday”, reflected the remembrance of the crucifixion of Christ and his death at Calvary (Golgotha); the early Christians treated every Friday as a mini-fast and later this would be institutionalized as Lent.

Lindsay Lohan arriving at the Electric Daisy Carnival (left) and detail of the accessory worn on her right thigh (right), Memorial Coliseum, Los Angeles, June 2010.  The knee-high boots were not only stylish but also served to conceal the court-mandated SCRAM (Secure Continuous Remote Alcohol Monitor) bracelet.

The allowance of fish during Lent had both pragmatic and theological origins, its place in the Christian diet a brew of symbolism, biblical precedent and cultural context.  As a legal and linguistic point, in the Greco-Roman scheme of things fish was not thought “flesh meat”, which was understood as coming from warm-blooded land animals and birds.  Fish, cold-blooded and aquatic, obviously were different and belonged to a separate category, one which Christianity inherited, and an implication of the distinction was seafood being viewed as “everyday food” rather than an indulgent luxury.  This was a thing also of economics (and thus social class), the eating of fish much associated with the poorer coastal dwellers whereas meat was more often seen on urban tables.  Notably, there was also in this a technological imperative: in the pre-refrigeration age, in hot climates, often it wasn’t possible safely to transport seafood inland.  The Biblical symbolism included Christ feeding the multitudes with a few “loaves and fishes” (Matthew 14:13–21), several of the apostles were fishermen whom Christ called upon to be “fishers of men” (Mark 1:16–18) and the ichthys (fish symbol) was adopted as an early Christian emblem for Christ Himself.  Collectively, this made fish an acceptably modest food for a penitential season.  All that might have been thought justification enough but, typically, Medieval scholars couldn’t resist a bit of gloss and the Italian Dominican friar, philosopher & theologian Saint Thomas Aquinas (1225–1274) decided abstinence aimed to “curb the concupiscence of the flesh” and, because meat generated more “bodily heat” and pleasure than fish, it was forbidden while fish was not.  That wasn’t wholly speculative and reflected the humoral theory from Antiquity, still an orthodoxy during the Middle Ages: fish being seen as lighter, cooler and less sensual.

Notting Hill Carnival, London.

Traditionally, there was also a Lenten prohibition of dairy products and eggs, each proscription with its own historical and symbolic logic and the basis of Shrove Tuesday (Pancake Day) and Easter eggs (though not the definitely un-Christian Easter bunny).  The strictness derived partly from Jewish precedents, notably the vegetarian edict in Daniel 10:2–3 and the idea of a “return to Edenic simplicity” where man would eat only plants (Genesis 1:29), but also from an aversion to links with sexuality and fertility, eggs obviously connected with sexual reproduction and dairy with lactation.  What this meant was early Christian asceticism sought to curb bodily impulses and anything connected with fleshly generation and, even if indirectly, thoughts of sex.

Historically, a time of absolution when confessions were made in preparation for Lent, Shrovetide described the three days immediately preceding Lent (Shrove Sunday, Shrove Monday & Shrove Tuesday, preceding Ash Wednesday).  The construct being shrove + -tide, the word was from the late Middle English shroftyde.  Shrove was the simple past of shrive, from the Middle English shryven, shriven & schrifen, from the Old English sċrīfan (to decree, pass judgement, prescribe; (of a priest) to prescribe penance or absolution), from the Proto-West Germanic skrīban, from the late Proto-Germanic skrībaną, a borrowing from the Latin scrībō (write).  The word may be compared with the West Frisian skriuwe (to write), the Low German schrieven (to write), the Dutch schrijven (to write), the German schreiben (to write), the Danish skrive (to write), the Swedish skriva (to write) and the Icelandic skrifa (to write).  The -tide suffix was from the Middle English -tide & -tyde, from the Old English -tīd (in compounds), from tīd (point or portion of time, due time, period, season; feast-day, canonical hour).  Before refrigeration, eggs and dairy naturally accumulated during springtime as hens resumed laying and animals produced more milk.  Being banned during Lent, stocks thus had to be consumed lest they be wasted, so a pragmatic way to ensure economy of use was the pancake (made with butter, milk & eggs), served on the feast of Shrove Tuesday (Pancake Day).  Following Easter, when eggs returned to the acceptable list, “Easter eggs” were a natural festive marker of the fast’s end.

Carnival Adventure and Carnival Encounter off Australia’s eastern Queensland coast.

Although dubbed “floating Petri dishes” because of the high number of food poisoning & norovirus cases, cruise ships remain popular, largely because, on the basis of cost-breakdown, they offer value-for-money packages few land-based operators can match.  The infections are so numerous because of (1) thousands of passengers & crew in a closed, crowded environment, (2) an extensive use of buffets and high-volume food service, (3) a frequent turnover of crew & passengers, (4) port visits to places with inconsistent sanitation, health & food safety standards and (5) sometimes delayed reporting and patient isolation.

However, although the popular conception of Medieval Western Christendom is of a dictatorial, priest-ridden culture, the Church was a political structure and it needed to be cognizant of practicalities and public opinion.  Even dictatorships can maintain their authority only with public consent (or at least acquiescence) and in many places the Church recognized burdensome rules could be counter-productive, onerous dietary restrictions resented especially by the majority engaged for their living in hard, manual labor.  Dispensations (formal exceptions) became common, with bishops routinely relaxing the rules for the ill, those pregnant or nursing, or workers performing physically demanding tasks.  As is a common pattern when rules selectively are eased, a more permissive environment was by the late Middle Ages fairly generalized (other than for those who chose to live by monastic standards).

Carnival goers enjoying the Sydney Gay & Lesbian Mardi Gras: This is not what Medieval bishops would have associated with the word “carnival” but few events better capture the spirit of the phrase “carnival atmosphere”.

The growth of dispensations (especially in the form of “indulgences” which were a trigger for the Protestant Reformation) was such it occurred to the bishops they’d created a commodity and commodities can be sold.  This happened throughout Europe but, in France and Germany, the “system” became institutionalized, the faithful even able to pay “butter money” for the privilege of eating the stuff over Lent (a kind of inverted “fat tax”!) with the proceeds devoted to that favourite capital works programme of bishops & cardinals: big buildings.  The sixteenth century tower on Normandy’s Rouen Cathedral was nicknamed “Butter Tower” although the funds collected from the “tax” covered only part of the cost; apparently even the French didn’t eat enough butter.  As things turned out, rising prosperity and the population drifts towards towns and cities meant consumption of meat and other animal products increased, making restrictions harder to enforce and the Protestant reformers anyway rejected mandatory fasting rules, damning them as man-made (“Popery!” the most offensive way they could think to express that idea) rather than divine law.  Seeing the writing nailed to the door, one of the results of the Council of Trent (1545–1563) was that while the Church reaffirmed fasting, eggs and dairy mostly were allowed and the ban on meat was restricted to Fridays and certain fast days in the ecclesiastical calendar.

Archbishop Daniel Mannix in his library at Raheen, the Roman Catholic Church's Episcopal Palace in Melbourne, 1917-1981.

By the twentieth century, it was clear the Holy See was fighting a losing battle and in February 1966, Paul VI (1897-1978; pope 1963-1978) promulgated the Apostolic Constitution Paenitemini (best translated as “to be penitent”), making abstinence from meat on Fridays optional outside Lent and retaining only Ash Wednesday and Good Friday as obligatory fast days for Catholics.  It was a retreat very much in the corrosive spirit of the Second Vatican Council (Vatican II, 1962-1965) and an indication the Church was descending to a kind of “mix & match” operation, people able to choose the bits they liked, discarding or ignoring anything tiresome or too onerous.  In truth, plenty of priests had been known on Fridays to sprinkle a few drops of holy water on their steak and declare “In the name of our Lord, you are now fish”.  That was fine for priests but for the faithful, dispensation was often the “luck of the clerical draw”.  At a time in the late 1940s when there was a shortage of good quality fish in south-east Australia, Sir Norman Gilroy (1896–1977; Roman Catholic Archbishop of Sydney 1940-1971, appointed cardinal 1946) granted dispensation but the stern Dr Daniel Mannix (1864–1963; Roman Catholic Archbishop of Melbourne 1917-1963) refused, so when two politicians from New South Wales (Ben Chifley (1885–1951; prime minister of Australia 1945-1949) and Fred Daly (1912–1995)) arrived in the parliamentary dining room for dinner, Chifley’s order was: “steaks for me and Daly, fish for the Mannix men.”

In the broad, a carnival was an occasion, event or season of revels, merrymaking, feasting and entertainments (the Spanish fiestas a classic example) although they could assume a political dimension, some carnivals staged to be symbolic of the disruption and subversion of authority.  The idea was a “turning upside down of the established hierarchical order” and names used included “the Feast of Fools”, “the Abbot of Misrule” and “the Boy Bishop”.  With a nod to this tradition, in literary theory the concept of “carnivalization” was introduced by the Russian philosopher & literary critic Mikhail Bakhtin (1895–1975), the word appearing first in the chapter From the Prehistory of Novelistic Discourse (written in 1940), later published in his book The Dialogic Imagination: chronotope and heteroglossia (1975).  What carnivalization described was the penetration or incorporation of carnival into everyday life and its “shaping” effect on language and literature.

The Socratic dialogues (most associated with the writing of the Greek philosophers Xenophon (circa 430–355 BC) and Plato (circa 427-348 BC)) are regarded as early examples of a kind of carnivalization in that what appeared to be orthodox “logic” was “stood on its head” and shown to be illogical, although Menippean satire (named after the third-century-BC Greek Cynic Menippus) is in the extent of its irreverence closer to the modern understanding which finds expression in personal satire, burlesque and parody.  Bakhtin’s theory suggested the element of carnival in literature is subversive in that it seeks to disrupt authority and introduce alternatives: a deliberate affront to the canonical thoughts of Renaissance culture.  In modern literary use the usual term is “carnivalesque”, referring to that which seeks to subvert (“liberate” sometimes the preferred word) assumptions or orthodoxies by the use of humor or some chaotic element.  This can be on a grand scale (ie an entire cultural movement) or as localized as some malcontent disrupting their book club (usually a polite affair where novels are read and ladies sit around talking about their feelings).

Portrait of Leo Tolstoy (1887), oil on canvas by Ilya Repin (1844-1930), Tretyakov Gallery, Moscow, Russia.

He expanded on the theme in his book Problems of Dostoevsky's Poetics (1929) by contrasting the novels of Leo Tolstoy (1828-1910) and Fyodor Dostoevsky (1821–1881).  Tolstoy’s fiction he classified as a type of “monologic” in which all is subject to the author's controlling purpose and hand, whereas for Dostoevsky the text is “dialogic” or “polyphonic” with an array of different characters expressing a variety of independent views (not “controlled” by the author) rather than existing merely to represent the author's viewpoint.  Thus deconstructed, Bakhtin defined these views as “not only objects of the author's word, but subjects of their own directly significant word as well” and thus vested with their own dynamic, being a liberating influence which, as it were, “conceptualizes” reality, lending freedom to the individual character and subverting the type of “monologic” discourse characteristic of many nineteenth century authors (typified by Tolstoy).

Portrait of Fyodor Dostoevsky (1872), oil on canvas by Vasily Perov (1834-1882), Tretyakov Gallery, Moscow, Russia.

Dostoevsky’s story Bobok (1873) is cited as an exemplar of carnival.  It has characters with unusual freedom to speak because, being dead, they’re wholly disencumbered of natural laws, able to say what they wish and speak truth for fun.  However, Bakhtin did acknowledge this still is literature and didn’t claim a text could be an abstraction uncontrolled by the author (although such things certainly could be emulated): Dostoevsky (his hero) remained in control of his material because the author is the directing agent.  So, given subversion, literary and otherwise, clearly has a history dating back doubtlessly as many millennia as required to find an orthodoxy to subvert, why was the concept of carnivalization deemed a necessary addition to literary theory?  It went to the form of things, carnivalization able especially to subvert because it tended to be presented in ways less obviously threatening than might be typical of polemics or actual violence.

Monday, August 4, 2025

Exposome

Exposome (pronounced eks-poh-sohm)

(1) A concept describing both the environmental exposures an individual encounters throughout life and how these factors impact that individual's biology and health.

(2) The collection of environmental factors (stress, diet, climate, health-care etc) to which an individual is exposed and which can have an effect on health outcomes.

2005: The construct was expos(e) + -ome, the word coined by cancer epidemiologist Dr Christopher Wild, later director of the International Agency for Research on Cancer (IARC).  Expose (in the sense of “to lay open to danger, attack, harm etc”; “to lay open to something specified”) dates from the mid-fifteenth century and was from the late Middle English exposen, from the Middle French exposer (to lay open, set forth), from the Latin expōnō (set forth), with contamination from poser (to lay, place).  The -ome suffix was an alteration of -oma, from the Ancient Greek -ωμα (-ōma).  It was only partially cognate to -some (body), from σῶμα (soma) (body), in that both share the case ending -μα (-ma), but the ω was unrelated.  The sense was of “a mass of something” and use is familiar in forms such as genome (in genetics, the complete genetic information (DNA (deoxyribonucleic acid) or RNA (ribonucleic acid))) and phenome (the whole set of phenotypic entities in a cell, tissue, organ, organism or species).  Exposome is a noun and exposomic is an adjective; the noun plural is exposomes.

The study and assessment of the external and internal factors (chemical, physical, biological, social, climatic etc) that may influence human health is not new and evidence of interest in the topic(s) exists in the literature of physicians and philosophers (there was sometimes overlap) from the ancient civilizations of Greece, Rome, China, Persia and India.  One of the paradoxes of modernity in medicine was that simultaneously there developed an interest in (1) interdisciplinary and holistic approaches while (2) specialization became increasingly entrenched, the latter leading sometimes to a “siloing” in research and data accumulation.  What makes the exposome a useful tool is that it provides a way of expressing the interplay between genetics and environmental factors in the development of diseases with a particular focus on chronic conditions, and the concept has been applied widely in many fields of medicine beyond public health.  What it does is calculate the cumulative effect of multiple exposures, allowing researchers to “scope-down” to specific or general gene-environment interactions, producing data to permit a more accurate assessment of disease risk and thus the identification of useful modes of intervention.

Dr Wild’s coining of exposome came about because some word or phrase was needed to describe his innovation, which was to apply to environmental exposures the sort of systematic measurement then being brought to the human genome; in a sense it was an exercise in cause and effect, the three components being (1) the external exposome, (2) the internal exposome and (3) the biological response.  The external exposome included factors such as air pollution, diet and socioeconomic factors as well as specific external factors like chemicals and radiation.  The internal exposome included endogenous factors, such as hormones, inflammation, oxidative stress, and gut microbiota.  The biological response described the complex interactions between the external and internal exposome factors and their influence on an individual's physiology and health.

At its most comprehensive (and complex), the exposome is a cumulative measure of all environmental exposures to which an individual has been subject throughout their entire life.  While that’s something that can be modelled for an “imagined person”, in a real-world instance it will probably always be only partially complete, not least because in some cases critical environmental exposures may not become known until long after their effect has been exerted; indeed, some may be revealed only by an autopsy (post mortem).  Conceptually however, the process can be illustrated by example and one illustrative of the approach is to contrast the factors affecting the same individual living in three different places.  What that approach does is emphasize certain obvious differences between places but variations in an exposome don’t depend on the sample being taken in locations thousands of miles apart.  For a variety of reasons, the same individual might record a radically different outcome if (in theory) living their entire life in one suburb compared with one adjacent, or even in one room in one dwelling compared with another perhaps only a few feet away.  Conditions can be similar across a wide geographical spread or different despite close proximity (even between people sitting within speaking distance), the phenomenon of “micro-climates” in open-plan offices well documented.  The number of variables which can be used usefully to calculate (estimate might be a better word) an individual’s (or a group’s) exposome is probably at least in the dozens but could easily be expanded well into three figures were one to itemize influences (such as chemicals or specific types of pollutant matter) and such is the complexity of the process that the mere existence of some factors might be detrimental to some individuals yet neutral or even beneficial to others.  At this stage, although the implications of applying AI (artificial intelligence) to the interaction of large data sets with an individual’s genetic mix have intrigued some, the exposome remains an indicative conceptual model rather than a defined process.
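Because the exposome remains an indicative conceptual model rather than a defined process, any calculation can only be a sketch.  Purely as an illustration (the factor names, levels, weights and the additive scoring scheme below are invented for the example and come from no published model), a toy version might treat the external and internal exposomes as weighted collections of exposures and accumulate them into a crude score:

# A toy "exposome score" sketch: every factor, level and weight here is
# hypothetical and chosen only to show the shape of the calculation; real
# exposome research relies on far richer longitudinal measurement.
from __future__ import annotations
from dataclasses import dataclass

@dataclass
class Exposure:
    name: str      # label, eg "PM2.5" or "cortisol"
    domain: str    # "external" or "internal"
    level: float   # estimated intensity on an arbitrary 0-1 scale
    weight: float  # assumed relative contribution to risk

def exposome_score(exposures: list[Exposure]) -> dict[str, float]:
    """Accumulate weighted exposure levels by domain and overall (toy model)."""
    totals = {"external": 0.0, "internal": 0.0}
    for e in exposures:
        totals[e.domain] += e.level * e.weight
    totals["overall"] = totals["external"] + totals["internal"]
    return totals

profile = [
    Exposure("air pollution (PM2.5)", "external", 0.7, 1.5),
    Exposure("UV radiation", "external", 0.4, 0.8),
    Exposure("ultra-processed diet", "external", 0.6, 1.2),
    Exposure("chronic stress (cortisol)", "internal", 0.5, 1.0),
    Exposure("gut microbiota disruption", "internal", 0.3, 0.9),
]
print(exposome_score(profile))  # {'external': 2.09, 'internal': 0.77, 'overall': 2.86}

The point is only the shape of the arithmetic (a weighted accumulation across the external and internal domains Dr Wild described); a real model would populate it with measured data and something far more sophisticated than a linear sum.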

As an example, consider the same individual living variously in New York City, Dubai or Los Angeles.  In each of those places, some factors will be universal within the locality while others will vary according to which part of the place one inhabits and even at what elevation at the same address; the physical environment on a building’s ground floor greatly can vary from that which prevails on the 44th floor:

Lindsay Lohan in New York City in pastel yellow & black bouclé tweed mini-dress.  Maintaining an ideal BMI (body mass index) is a positive factor in one's exposome.

(1) Air Quality and Pollution: Moderate to high levels of air pollution, especially from traffic (NO₂, PM2.5). Seasonal heating (oil and gas) contributes in winter.  Subway air has unique particulate matter exposure.

(2) Climate and UV Radiation: Humid continental climate—cold winters and hot summers. Seasonal variability affects respiratory and cardiovascular stressors.

(3) Diet and Food Environment: Diverse food options—high availability of ultra-processed foods but also global cuisines. Food deserts in poorer boroughs can reduce fresh produce access.

(4) Built Environment and Urban Design: Dense, walkable, vertical urban environment. High reliance on public transport; more noise pollution and crowding stress.  Lower car ownership can reduce personal emissions exposure.

(5) Cultural and Psychosocial Stressors: High-paced lifestyle, long working hours. High density increases social stress, noise, and mental health challenges.  Diversity can be enriching or alienating, depending on context.

(6) Economic and Occupational Exposures: Highly competitive job market. Occupational exposures vary widely—white-collar vs service industries. Union protections exist in some sectors.

(7) Healthcare Access and Public Policy: Robust healthcare infrastructure, but disparities remain by borough and income. Medicaid and public hospitals provide some safety net.

Lindsay Lohan in Dubai in J.Lo flamingo pink velour tracksuit.  A healthy diet and regular exercise are factors in one's exposome. 

(1) Air Quality and Pollution: Frequently exposed to dust storms (fine desert dust), high PM10 levels, and air conditioning pollutants. Limited greenery means less natural air filtration.  Desalination plants and industrial expansion add further exposure.

(2) Climate and UV Radiation: Extreme desert heat (45°C+), intense UV exposure, little rain. Heat stress and dehydration risks are chronic, especially for outdoor workers.

(3) Diet and Food Environment: High import dependency. Abundant processed and fast foods, especially in malls. Dietary pattern skewed toward high sugar and fat content.  Cultural fasting (eg Ramadan) introduces cyclical dietary stressors.

(4) Built Environment and Urban Design: Car-centric city. Pedestrian-unfriendly in many areas due to heat and design. Heavy air conditioning use is a major indoor exposure pathway.

(5) Cultural and Psychosocial Stressors: Strict social codes and legal restrictions influence behavioral exposures. Expat life often means social disconnection and job insecurity for migrant workers.

(6) Economic and Occupational Exposures: Large migrant workforce faces occupational health risks, including long hours in extreme heat. Labor protections are inconsistent.

(7) Healthcare Access and Public Policy: Healthcare access stratified—good for citizens and wealthy expats, less so for low-wage migrants. Private sector dominates.

Lindsay Lohan in Los Angeles in her 2005 Mercedes-Benz SL 65 AMG (R230; 2004-2011) roadster.  Smoking is a factor in one's exposome.

(1) Air Quality and Pollution: Known for smog due to vehicle emissions and topography (valley trap). Ozone levels high, especially in summer. Wildfire smoke increasingly common.

(2) Climate and UV Radiation: Mediterranean climate with mild, dry summers. High UV exposure, though moderated by coastal influence. Drought conditions affect water quality and stress.

(3) Diet and Food Environment: Strong health-food culture, organic and plant-based diets more common. Yet fast food and food deserts remain in less affluent areas.  Hispanic and Asian dietary influences prominent.

(4) Built Environment and Urban Design: Sprawling, suburban in many parts. High car dependence means more exposure to vehicle exhaust.  Outdoor activities more common in certain demographics (eg, beach culture).

(5) Cultural and Psychosocial Stressors: Cultural emphasis on appearance, wealth, and entertainment may increase psychosocial pressure.  Homelessness crisis also creates variable community stress exposures.

(6) Economic and Occupational Exposures: Gig economy widespread, leading to precarious employment. Hollywood and tech industries also introduce unique workplace stress patterns.

(7) Healthcare Access and Public Policy: California’s public health programs are progressive, but uninsured rates still high. Proximity to cutting-edge research centers can boost care quality for some.

So one's exposome is a product of what one wants or gets from life, mapped onto a risk analysis table.  In New York City, one copes with urban pollution and persistent subway dust in an increasingly variable climate marked by periods of high humidity, a dietary range determined by one's wealth, the advantage of a good (if not always pleasant) mass transit system and the possibility of a “walking distance” lifestyle, albeit in usually crowded, fast-paced surroundings.  Employment conditions are mixed and access to quality health care is a product of one's insurance status or wealth.

In Dubai, one lives with frequent dust storms, months of intense heat and UV exposure, a dependence on food imports and the constant temptation of fast food (FSS; fat, salt, sugar).  The car-centric lifestyle has created a built environment described as “pedestrian-hostile” and there are sometimes severe legal limits on personal freedom, especially for migrant workers who are subject to heat exposure and limited labor rights (even those which exist often not enforced).  The health system distinctly is tiered (based on wealth) and almost exclusively privatized.

The air quality in Los Angeles greatly has improved since the 1970s but climate change has resulted in the more frequent intrusion of smoke from wildfires and the prevailing UV exposure tends to be high; the climate is not as “mild” as once it was rated.  While there are pockets in which walkability is good, Los Angeles mostly is a car-dependent culture and the coverage and frequency of mass-transit has in recent decades declined.  Although this is not unique to the city, there is a heightened sensitivity to specific cultural pressures based on appearances and perceptions of lifestyle while housing stress is increasing.  Economic pressures are being exacerbated by the growth of the gig economy and traditionally secure forms of employment are being displaced by AI (bots, robots and hybrids).  Although California's healthcare system is sometimes described as “progressive”, on the ground, outcomes are patchy.

So each location shapes the exposome in distinctive ways and the potential exists for the process better to be modelled so public health interventions and policies can be adjusted.  Of course, some risks are global: anywhere on the planet there’s always the chance one might be murdered by the Freemasons but some things which might seem unlikely to be affected by location turn out also to be an exposome variable.  Because planet Earth (1) is roughly spherical, (2) travels through space (where concepts like up & down don’t apply) and (3) constantly is exposed to meteoroids (every day Earth receives tons of “space dust”), it would be reasonable to assume one is equally likely to be struck by a meteoroid wherever one may be.  However, according to NASA (the US National Aeronautics and Space Administration), strikes are not equally likely everywhere, some latitudes (and regions) being more prone, due to several factors:

(1) Because Earth’s rotation and orbital motion create a bias, meteoroids tend more often to approach from the direction of Earth’s orbital motion (the “apex direction”), meaning the leading hemisphere (the side facing Earth's motion, near the dawn terminator) sees more meteoroid entries than the trailing hemisphere.  On a global scale, the effect is small but is measurable with the risk increasing as one approaches the equatorial regions where rotational velocity is greatest.

(2) Because most meteoroids approach from near the plane of the Solar System (the ecliptic plane), there’s what NASA calls a “latitude distribution bias”: Earth’s equator being tilted only some 23.5° from the ecliptic, meteoroids are more likely to intersect Earth’s atmosphere near lower latitudes (the tropical & sub-tropical zones) than near the poles.  So, those wishing to lower their risk should try to live in the Arctic or Antarctic although those suffering chronic kosmikophobia (fear of cosmic phenomena) are likely already residents.

(3) Some 70% of the Earth’s surface area being the seas and oceans, statistically, most meteoroids land in the water rather than on land so the lesson is clear: avoid living at sea.  The calculated probability is of course just math (a back-of-the-envelope sketch follows this list); because erosion is low, sparsely populated deserts preserve meteorites well, so a large number have been found in places like the Sahara and outback Australia, but those numbers reflect a preservation bias and don’t necessarily confirm a higher strike rate.  The lesson from the statisticians is: Don’t dismiss the notion of living in a desert because of a fear of being struck by a meteoroid.

(4) Gravitational focusing, although it does increase Earth’s meteoroid capture rates (disproportionately so for objects travelling more slowly), is a global effect so there is no known locational bias.  While there is at least one documented case of a person being struck by a meteoroid, the evidence does suggest the risk is too low to be statistically significant and should thus not be factored into the calculation of one’s exposome because one is anywhere at greater risk of being murdered by the Freemasons.
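To illustrate the “just math” point in (3), here is a back-of-the-envelope sketch (mine, not NASA's) which assumes falls are distributed uniformly over the sphere and so ignores the apex and ecliptic biases described in (1) & (2): the fraction of a sphere's surface between two latitudes is half the difference of the sines of those latitudes, which alone puts about 40% of the surface (and hence of uniform falls) within the tropics, about 8% in the polar caps and roughly 70% of falls in the water:

# Back-of-the-envelope only: assumes meteoroid falls are uniformly distributed
# over the globe, ignoring the apex and ecliptic biases discussed above.
import math

OCEAN_FRACTION = 0.71  # approximate share of Earth's surface covered by water

def band_fraction(lat_south_deg: float, lat_north_deg: float) -> float:
    """Fraction of a sphere's surface lying between two latitudes."""
    return (math.sin(math.radians(lat_north_deg)) -
            math.sin(math.radians(lat_south_deg))) / 2.0

tropics = band_fraction(-23.5, 23.5)        # ~0.40 of the surface
polar_caps = 2 * band_fraction(66.5, 90.0)  # both caps combined, ~0.08
print(f"surface fraction within the tropics: {tropics:.2f}")
print(f"surface fraction in the polar caps:  {polar_caps:.2f}")
print(f"chance a uniform fall lands in water: {OCEAN_FRACTION:.2f}")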

Ms Ann Hodges with bruise, Alabama, 1954.  Painful though it would have been, she did get her 15 minutes of fame and eventually sold the fragment for US$25 so there was that.

In the narrow technical sense, many people have been struck by objects from space (an estimated 40+ tons of the stuff arrives every day) but most fragments are dust particles, too small to be noticed.  The only scientifically verified injury a person has suffered was an impressively large bruise a meteorite (the part of a meteoroid that survives its fiery passage through the atmosphere to land on Earth’s surface) on 30 November 1954 inflicted on Ms Ann Hodges (1920-1972) of Sylacauga, Alabama in the US.  Weighing 7.9 lb (3.6 kg), the intruder crashed through the roof of her house and bounced off a radio, striking her while she was enjoying a nap on the sofa.  The meteoroid was called Sylacauga and, just as appropriately, the offending meteorite was named the Hodges Fragment.  Anatomically modern humans (AMH) have been walking the planet for perhaps 300,000 years and we’ve been (more or less) behaviorally modern (BMH) for maybe a quarter of that so it’s possible many more of us have been struck.  In the absence of records, while it’s impossible to be definitive, it’s likely more have been murdered by the Freemasons than have ever been killed by stuff falling from space although, as the history of species extinction illustrates, a direct hit on someone is not a prerequisite for dire consequences.

Dashcam footage of meteorite fragment in the sky over Lexington, South Carolina.

The cosmic intruder crashed through the roof of a house on 26 June 2025 and although there were no injuries, Fox News reported the fragment left a hole in the floor “about the size of a large cherry tomato”.  Analysis determined the rock was from the asteroid belt between Mars and Jupiter and as well as the dramatic fireball many captured on their dashcams, it would briefly have broken the sound barrier as it entered Earth’s atmosphere.  It was also very old, dating from slightly before the formation of the Solar System’s rocky inner planets (one of which is Earth) some 4.56 billion years ago and such fragments are of interest to many branches of science because they represent a small part of the “basic building blocks” of those planets and can thus assist in understanding the processes active during the Solar System’s earliest days.  Curiously (to those not trained in such things), the cosmologists explained “such a small fragment didn’t present a threat to anyone” which seems strange given its impact left a small crater in a floor, one implication being one wouldn’t wish for such a thing to hit one’s skull.  That the impact happened in Georgia, a state adjacent to Alabama where seven decades earlier the unfortunate Ms Hodges was struck, may make some add “meteorite fragments” to their list of exposome factors “south of the Mason-Dixon Line” but the sample size is too small for conclusions to be drawn and the events are mere geographic coincidences.

Thursday, July 24, 2025

Kamikaze

Kamikaze (pronounced kah-mi-kah-zee or kah-muh-kah-zee)

(1) A member of a World War II era special corps in the Japanese air force charged with the suicidal mission of crashing an aircraft laden with explosives into an enemy target, especially Allied Naval vessels.

(2) In later use, one of the (adapted or specifically built) airplanes used for this purpose.

(3) By extension, a person or thing that behaves in a wildly reckless or destructive manner; as a modifier, something extremely foolhardy and possibly self-defeating.

(4) Of, pertaining to, undertaken by, or characteristic of a kamikaze; a kamikaze pilot; a kamikaze attack.

(5) A cocktail made with equal parts vodka, triple sec and lime juice.

(6) In slang, disastrously to fail.

(7) In surfing, a deliberate wipeout.

1945: From the Japanese 神風 (かみかぜ) (kamikaze) (suicide flyer), the construct being kami(y) (god (the earlier form was kamui)) + kaze (wind (the earlier form was kanzai)), usually translated as “divine wind” (“spirit wind” appearing in some early translations), a reference to the winds which, according to Japanese folklore, destroyed Kublai Khan's Mongol invasion fleet in 1281.  In Japanese military parlance, the official designation was 神風特別攻撃隊 (Shinpū Tokubetsu Kōgekitai (Divine Wind Special Attack Unit)).  Kamikaze is a noun, verb & adjective and kamikazeing & kamikazed are verbs; the noun plural is kamikazes.  When used in the original sense, an initial capital is used.

HESA Shahed 136 UAV.

The use of kamikaze to describe the Iranian delta-winged UAVs (unmanned aerial vehicles, popularly known as “drones”) being used by Russia against Ukraine reflects the use of the word which developed almost as soon as the existence of Japan’s wartime suicide bomber programme became known.  Kamikaze was the name of the aviators and their units but it was soon also applied to the aircraft used, some re-purposed from existing stocks and some rocket powered units designed for the purpose.  In 1944-1945 they were too little, too late but they proved the effectiveness of precision targeting although not all military cultures would accept the loss-rate the Kamikaze sustained.  In the war in Ukraine, the Iranian HESA Shahed 136 (شاهد ۱۳۶, literally “Witness-136”, designated Geran-2 (Герань-2, literally “Geranium-2”) by the Russians) kamikaze drones have proved extraordinarily effective, being cheap enough to deploy en masse and capable of precision targeting.  They’re thus a realization of the century-old dream of the strategic bombing theorists to hit “panacea targets” at low cost while sustaining no casualties.  Early in World War II, the notion of panacea targets had been dismissed, not because as a strategy it was wrong but because the means of finding and bombing such targets didn’t exist, thus “carpet bombing” (bombing for several square miles around any target) was adopted because it was at the time the best option.  Later in the war, as techniques improved and air superiority was gained, panacea targets returned to the mission lists but the method was merely to reduce the size of the carpet.  The kamikaze drones however can be pre-programmed or remotely directed to hit a target within the tight parameters of a GPS signal.  The Russians know what to target because so many blueprints of Ukrainian infrastructure sit in Moscow’s archives and the success rate is high because, being so cheap, the drones can be deployed in swarms; the old phrase from the 1930s can be updated for the UAV age: “The drone will always get through”.

Imperial Japan’s Kamikazes

By 1944, it was understood by the Japanese high command that the strategic gamble simultaneously to attack the US Pacific Fleet at anchor in Pearl Harbor and the Asian territories of the European powers had failed.  Such was the wealth and industrial might of the US that within three years of the Pearl Harbor raid, the preponderance of Allied warships and military aircraft in the Pacific was overwhelming and Japan’s defeat was a matter only of time.  That couldn’t be avoided but within the high command it was thought that if the Americans understood how high would be the casualty rate if they attempted an invasion of the Japanese home islands, that and the specter of occupation might be avoided and some sort of “negotiated settlement” might be possible, the notion of the demanded “unconditional surrender” being unthinkable.

HMS Sussex hit by Kamikaze (Mitsubishi Ki-51 (Sonia)), 26 July 1945 (left) and USS New Mexico (BB-40) hit by Kamikaze off Okinawa, 12 May 1945 (right).

Although on paper, late in the war, Japan had over 15,000 aircraft available for service, a lack of development meant most were at least obsolescent and shortages of fuel increasingly limited the extent to which they could be used in conventional operations.  From this analysis came the estimate that if used as “piloted bombs” on suicide missions, it might be possible to sink as many as 900 enemy warships and inflict perhaps 22,000 casualties and, in the event of an invasion, when used at shorter range against landing craft or beachheads, it was thought an invading force would sustain over 50,000 casualties from suicide attacks alone.  Although the Kamikaze attacks didn't achieve their strategic objective, they managed to sink dozens of ships and kill some 5000 allied personnel.  All the ships lost were smaller vessels (the largest an escort carrier) but significant damage was done to fleet carriers and cruisers and, like the (also often dismissed as strategically insignificant) German V1 & V2 attacks in Europe, resources had to be diverted from the battle plan to be re-tasked to strike the Kamikaze air-fields.  Most importantly however, so vast by 1944 was the US military machine that it was able easily to repair or replace as required.  Brought up in a different tradition, US Navy personnel, the target of the Kamikaze, dubbed the attacking pilots Baka (Japanese for “idiot”).

A captured Japanese Yokosuka MXY-7 Ohka (Model 11), Yontan Airfield, April 1945.

Although it’s uncertain, the first Kamikaze mission may have been an attack on the carrier USS Franklin by Rear Admiral Arima (1895-1944) flying a Yokosuka D4Y Suisei (Allied codename Judy) and the early flights were undertaken using whatever airframes were available and regarded, like the pilots, as expendable.  Best remembered however, although only 850-odd were built, were the rockets designed for the purpose.  The Yokosuka MXY-7 Ohka (櫻花, (Ōka), (cherry blossom)) was a purpose-built, rocket-powered attack aircraft which was essentially a powered bomb with wings, conceptually similar to a modern “smart bomb” except that instead of the guidance being provided by on-board computers and associated electronics which were sacrificed in the attack, there was a similarly expendable human pilot.  Shockingly single-purpose in its design parameters, the version most produced could attain 406 mph (648 km/h) in level flight at relatively low altitude and 526 mph (927 km/h) in an attack dive but the greatest operational limitation was that the range was limited to 23 miles (37 km), forcing the Japanese military to use lumbering Mitsubishi G4M (Betty) bombers as “carriers” (the Ohka the so-called “parasite aircraft”) with the rockets released from under-slung assemblies when within range.  As originally conceived (with a range of 80 miles (130 km)), the Ohka might have worked as a delivery system but such was the demand on the designers to provide the highest explosive payload that the fuel load was reduced, restricting the maximum speed to 276 mph (445 km/h), making the barely maneuverable little rockets easy prey for fighters and even surface fire.

Yokosuka MXY-7 Ohka.

During the war, Japan produced more Mitsubishi G4Ms than any other bomber and its then remarkable range (3130 miles (5037 km)) made it a highly effective weapon early in the conflict but as the US carriers and fighters were deployed in large numbers, its vulnerabilities were exposed: the performance was no match for fighters and it was completely un-armored without even self-sealing fuel tanks, hence the nick-name “flying lighter” gained from flight crews.  However, by 1945 Japan had no more suitable aircraft available for the purpose so the G4M was used as a carrier and the losses were considerable, an inevitable consequence of having to come within twenty-odd miles of the US battle-fleets protected by swarms of fighters.  It had been planned to develop a variant of the much more capable Yokosuka P1Y (Ginga) (as the P1Y3) to perform the carrier role but late in the war, Japan’s industrial and technical resources were stretched and P1Y development was switched to night-fighter production, desperately needed to repel the US bombers attacking the home islands.  Thus the G4M (specifically the G4M2e-24J) continued to be used.

Watched by Luftwaffe chief Hermann Göring (1893–1946; leading Nazi 1922-1945, Hitler's designated successor & Reichsmarschall 1940-1945), Adolf Hitler (1889-1945; Führer (leader) and German head of government 1933-1945 & head of state 1934-1945) presents test pilot Hanna Reitsch (1912-1979) with the Iron Cross (2nd class), Berlin, March 1941 (left); she was later (uniquely for a woman) awarded the 1st-class distinction.  Conceptual sketch of the modified V1 flying bomb (single cockpit version, right).

The idea of suicide missions also appealed to some Nazis, predictably most popular among those never likely to find themselves at the controls, non-combatants often among the most blood-thirsty of politicians.  The idea had been discussed earlier as a means of destroying the electricity power-plants clustered around Moscow but early in 1944, the intrepid test pilot Hanna Reitsch suggested to Adolf Hitler (1889-1945; German head of government 1933-1945 & of state 1934-1945) a suicide programme as the most likely means of hitting strategic targets.  Ultimately, she settled on using a V1 flying bomb (the Fieseler Fi 103R, an early cruise missile) to which a cockpit had been added, test-flying it herself and even mastering the landing, a reasonable feat given the high landing speed.  As a weapon, assuming a sufficient supply of barely-trained pilots, it would probably have been effective but Hitler declined to proceed, feeling things were not yet sufficiently desperate.  The historic moment passed although in the skies above Germany, in 1945 there were dozens of what appeared to be "suicide attacks" by fighter pilots ramming their aircraft into US bombers.  The Luftwaffe was by this time so short of fuel that training had been cut to the point new recruits were being sent into combat with only a few hours of solo flying experience so it's believed some incidents may have been "work accidents" but the ad-hoc Kamikaze phenomenon was real.

According to statistics compiled by the WHO (World Health Organization) in 2021, globally there were an estimated 727,000 suicides and within the total: (1) among 15–29-year-olds, suicide was the third leading cause of death, (2) for 15–19-year-olds, it was the fourth leading cause and (3) for girls aged 15–19, suicide ranked third.  What was striking was that in middle & high income nations, suicide is the leading cause of death in the young (typically defined as those aged 15-29 or 15-34).  Because such nations are less affected by the infectious disease, armed conflict and accident mortality seen in lower income countries, it appeared there was a “mental health crisis”, one manifestation of which was the clustering of self-harm and attempted suicides, a significant number of the latter successful.  As a result of the interplay of the economic and social factors reducing mortality from other causes, intentional self-harm stands out statistically, even though suicide rates usually are not, in absolute terms, “extremely” high.  Examples quoted by the WHO included:

Republic of Korea (ROK; South Korea): Among people aged 10–39, suicide is consistently the leading cause of death and that’s one of the highest youth suicide rates in the OECD (Organisation for Economic Co-operation and Development, sometimes called the “rich countries club” although changes in patterns of development have compressed relativities and that tag is not as appropriate as once it was).

Japan (no longer styled the “Empire of Japan” although the head of state remains an emperor): Suicide is the leading cause of death among those aged 15-39 and while there was a marked decline in the total numbers after the government in the mid 1990s initiated a public health campaign, the numbers did increase in the post-COVID pandemic period.  Japan is an interesting example to study because its history has meant cultural attitudes to suicide differ from those in the West.

New Zealand (Aotearoa): New Zealand has one of the highest youth suicide rates in the developed world, especially among Māori youth and although the numbers can bounce around, for those aged 15–24, suicide is often the leading or second leading cause of death.

Finland: For those aged 15-24, suicide is always among the leading causes of mortality and in some reporting periods the leading one.  Because in Finland there are extended times when the hours of darkness are long and the temperatures low, there have been theories these conditions may contribute to the high suicide rate (building on research into rates of depression) but the studies have been inconclusive.

Australia: Suicide is the leading cause of death for those in the cohorts 15–24 and 25–44 and a particular concern is the disproportionately high rate among indigenous youth, the incidents sometimes happening while they’re in custody.  In recent years, suicide has overtaken road accidents and cancer as the leading cause in these age groups.

Norway & Sweden: In these countries, suicide is often one of the top three causes of death among young adults and in years when mortality from disease and injury are especially low it typically will rise to the top.

Kamikaze Energy Cans in all six flavors (left) and potential Kamikaze Energy Can customer Lindsay Lohan (right).

Ms Lohan was pictured here with a broken wrist (fractured in two places in an unfortunate fall at Milk Studios during New York Fashion Week) and a 355 ml (12 fluid oz) can of Rehab energy drink, Los Angeles, September 2006.  Some recovering from injuries find energy drinks a helpful addition to the diet.  The car is a 2005 Mercedes-Benz SL 65 AMG (R230; 2004-2011) which earlier had featured in the tabloids after a low-speed crash.  The R230 range (2001-2011) was unusual because of the quirk of the SL 550 (2006-2011), a designation used exclusively in the North American market, the RoW (rest of the world) cars retaining the SL 500 badge even though both used the 5.5 litre (333 cubic inch) V8 (M273).

Given the concerns about suicide among the young, attention in the West has been devoted to the way the topic is handled on social media and the rise in the use of novel applications for AI (artificial intelligence) has flagged new problems, one of the “AI companions” now wildly popular among youth (the group most prone to attempting suicide) recently recommending their creator take his own life.  That would have been an unintended consequence of (1) the instructions given to the bot and (2) the bot’s own “learning process”, the latter something the software developers would have neither anticipated nor intended.  Given the sensitivities to the way suicide is handled in the media, on the internet or in popular culture, it’s perhaps surprising there’s an “energy drink” called “Kamikaze”.  Like AI companions, the prime target for the energy drink suppliers is males aged 15-39, which happens to be the group most at risk of suicidal thoughts and most likely to attempt suicide.  Despite that, the product’s name seems not to have attracted much criticism and the manufacturer promises: “With your Kamikaze Energy Can, you'll enjoy a two-hour energy surge with no crash.”  Presumably the word “crash” was chosen with some care although, given the decline in the teaching of history at school & university level, it may be that a sizeable number of youth have no idea about the origin of “Kamikaze”.  Anyway, containing “200mg L-Citrulline, 160mg Caffeine Energy, 1000mg Beta Alanine, vitamin B3, B6 & B12, zero carbohydrates and zero sugar”, the cans are available in six flavours: Apple Fizz, Blue Raspberry, Creamy Soda, Hawaiian Splice, Mango Slushy & Rainbow Gummy.