
Monday, August 4, 2025

Exposome

Exposome (pronounced eks-poh-sohm)

(1) A concept describing both the environmental exposures an individual encounters throughout life and how these factors impact an individual's biology and health.

(2) The collection of environmental factors (stress, diet, climate, health-care etc) to which an individual is exposed and which can have an effect on health outcomes.

2005: The construct was expos(e) + -ome, the word coined by cancer epidemiologist Dr Christopher Wild, later director of the International Agency for Research on Cancer (IARC).  Expose (in the sense of “to lay open to danger, attack, harm etc”; “to lay open to something specified”) dates from the mid-fifteenth century and was from the late Middle English exposen, from the Middle French exposer (to lay open, set forth), from the Latin expōnō (set forth), with contamination from poser (to lay, place). The –ome suffix was an alteration of -oma, from the Ancient Greek -ωμα (-ōma).  It was only partially cognate to -some (body), from σῶμα (soma) (body), in that both share the case ending -μα (-ma), but the ω was unrelated.  The sense was of “a mass of something” and use is familiar in forms such as genome (in genetics, the complete genetic information (DNA (deoxyribonucleic acid) or RNA (ribonucleic acid))) and phenome (the whole set of phenotypic entities in a cell, tissue, organ, organism and species). Exposome is a noun and exposomic is an adjective; the noun plural is exposomes.

The study and assessment of the external and internal factors (chemical, physical, biological, social, climatic etc) that may influence human health is not new and evidence of interest in the topic(s) exists in the literature of physicians and philosophers (there was sometimes overlap) from the ancient civilizations of Greece, Rome, China, Persia and India.  One of the paradoxes of modernity in medicine was that simultaneously there developed an interest in (1) interdisciplinary and holistic approaches while (2) specialization became increasingly entrenched, the latter sometimes leading to a “siloing” in research and data accumulation.  What makes the exposome a useful tool is that it provides a way of expressing the interplay between genetics and environmental factors in the development of diseases with a particular focus on chronic conditions and widely the concept has been applied in many fields of medicine beyond public health.  What it does is calculate the cumulative effect of multiple exposures, allowing researchers to “scope-down” to specific or general gene-environment interactions, producing data to permit a more accurate assessment of disease risk and thus the identification of useful modes of intervention.

Dr Wild’s coining of exposome came about because some word or phrase was needed to describe his innovation, which was the application of a systematic approach to measuring environmental exposures and relating them to what was coming to be known about the human genome; in a sense it was an exercise in cause and effect, the three components being (1) the external exposome, (2) the internal exposome and (3) the biological response.  The external exposome included factors such as air pollution, diet and socioeconomic factors as well as specific external factors like chemicals and radiation.  The internal exposome included endogenous factors, such as hormones, inflammation, oxidative stress, and gut microbiota.  The biological response described the complex interactions between the external and internal exposome factors and their influence on an individual's physiology and health.

At its most comprehensive (and complex), the exposome is a cumulative measure of all environmental exposures to which an individual has been subject throughout their entire life.  While that’s something that can be modelled for an “imagined person”, in a real-world instance it will probably always be only partially complete, not least because in some cases critical environmental exposures may not be known until long after their effect has been exerted; indeed, some may be revealed only by an autopsy (post mortem).  Conceptually however, the process can be illustrated by example and one illustrative of the approach is to contrast the factors affecting the same individual living in three different places.  What that approach does is emphasize certain obvious differences between places but variations in an exposome don’t depend on the sample being taken in locations thousands of miles apart.  For a variety of reasons, the same individual might record a radically different outcome if (in theory) living their entire life in one suburb compared with one adjacent or even in one room in one dwelling compared with another perhaps only a few feet away.  Conditions can be similar across a wide geographical spread or different despite close proximity (even between people sitting within speaking distance), the phenomenon of “micro-climates” in open-plan offices well documented.  The number of variables which can be used usefully to calculate (estimate might be a better word) an individual’s (or a group’s) exposome is probably at least in the dozens but could easily be expanded well into three figures were one to itemize influences (such as chemicals or specific types of pollutant matter) and such is the complexity of the process that the mere existence of some factors might be detrimental to some individuals yet neutral or even beneficial to others.  At this stage, although the implications of applying AI (artificial intelligence) to the interaction of large data sets with an individual’s genetic mix have intrigued some, the exposome remains an indicative conceptual model rather than a defined process.
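Although no standard algorithm exists, the arithmetic usually imagined is a weighted aggregation of normalized exposure values.  The Python sketch below is purely illustrative: the factor names, weights and city values are invented for the example and do not constitute a validated instrument.

# Purely illustrative: a real exposome model would involve thousands of
# time-varying, interacting variables and no agreed weighting exists.
EXPOSURE_WEIGHTS = {
    "pm2_5": 0.30,          # fine particulate air pollution
    "uv_index": 0.15,       # cumulative UV radiation
    "noise_db": 0.10,       # ambient noise
    "heat_stress": 0.20,    # extreme heat exposure
    "diet_quality": -0.25,  # protective factors carry negative weight
}

def exposome_score(exposures: dict) -> float:
    """Toy cumulative risk index: weighted sum of exposures normalized to 0-1."""
    return sum(EXPOSURE_WEIGHTS[k] * v for k, v in exposures.items()
               if k in EXPOSURE_WEIGHTS)

# The same (imagined) individual modelled in two cities (values invented):
nyc = {"pm2_5": 0.6, "uv_index": 0.4, "noise_db": 0.8, "heat_stress": 0.3, "diet_quality": 0.7}
dubai = {"pm2_5": 0.8, "uv_index": 0.9, "noise_db": 0.5, "heat_stress": 0.9, "diet_quality": 0.5}
print(f"NYC: {exposome_score(nyc):+.2f}  Dubai: {exposome_score(dubai):+.2f}")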

As an example, consider the same individual living variously in New York City, Dubai or Los Angeles.  In each of those places, some factors will be universal within the locality while others will vary according to which part of the place one inhabits and even at what elevation at the same address; the physical environment on a building’s ground floor greatly can vary from that which prevails on the 44th floor:

Lindsay Lohan in New York City in pastel yellow & black bouclé tweed mini-dress.  Maintaining an ideal BMI (body mass index) is a positive factor in one's exposome.

(1) Air Quality and Pollution: Moderate to high levels of air pollution, especially from traffic (NO₂, PM2.5). Seasonal heating (oil and gas) contributes in winter.  Subway air has unique particulate matter exposure.

(2) Climate and UV Radiation: Humid continental climate—cold winters and hot summers. Seasonal variability affects respiratory and cardiovascular stressors.

(3) Diet and Food Environment: Diverse food options—high availability of ultra-processed foods but also global cuisines. Food deserts in poorer boroughs can reduce fresh produce access.

(4) Built Environment and Urban Design: Dense, walkable, vertical urban environment. High reliance on public transport; more noise pollution and crowding stress.  Lower car ownership can reduce personal emissions exposure.

(5) Cultural and Psychosocial Stressors: High-paced lifestyle, long working hours. High density increases social stress, noise, and mental health challenges.  Diversity can be enriching or alienating, depending on context.

(6) Economic and Occupational Exposures: Highly competitive job market. Occupational exposures vary widely—white-collar vs service industries. Union protections exist in some sectors.

(7) Healthcare Access and Public Policy: Robust healthcare infrastructure, but disparities remain by borough and income. Medicaid and public hospitals provide some safety net.

Lindsay Lohan in Dubai in J.Lo flamingo pink velour tracksuit.  A healthy diet and regular exercise are factors in one's exposome. 

(1) Air Quality and Pollution: Frequently exposed to dust storms (fine desert dust), high PM10 levels, and air conditioning pollutants. Limited greenery means less natural air filtration.  Desalination plants and industrial expansion add further exposure.

(2) Climate and UV Radiation: Extreme desert heat (45°C+), intense UV exposure, little rain. Heat stress and dehydration risks are chronic, especially for outdoor workers.

(3) Diet and Food Environment: High import dependency. Abundant processed and fast foods, especially in malls. Dietary pattern skewed toward high sugar and fat content.  Cultural fasting (eg Ramadan) introduces cyclical dietary stressors.

(4) Built Environment and Urban Design: Car-centric city. Pedestrian-unfriendly in many areas due to heat and design. Heavy air conditioning use is a major indoor exposure pathway.

(5) Cultural and Psychosocial Stressors: Strict social codes and legal restrictions influence behavioral exposures. Expat life often means social disconnection and job insecurity for migrant workers.

(6) Economic and Occupational Exposures: Large migrant workforce faces occupational health risks, including long hours in extreme heat. Labor protections are inconsistent.

(7) Healthcare Access and Public Policy: Healthcare access stratified—good for citizens and wealthy expats, less so for low-wage migrants. Private sector dominates.

Lindsay Lohan in Los Angeles in 2005 Mercedes-Benz SL65 AMG (2005-2011) Roadster (R230, 2002-2011).  Smoking is a factor in one's exposome.

(1) Air Quality and Pollution: Known for smog due to vehicle emissions and topography (valley trap). Ozone levels high, especially in summer. Wildfire smoke increasingly common.

(2) Climate and UV Radiation: Mediterranean climate with mild, dry summers. High UV exposure, though moderated by coastal influence. Drought conditions affect water quality and stress.

(3) Diet and Food Environment: Strong health-food culture, organic and plant-based diets more common. Yet fast food and food deserts remain in less affluent areas.  Hispanic and Asian dietary influences prominent.

(4) Built Environment and Urban Design: Sprawling, suburban in many parts. High car dependence means more exposure to vehicle exhaust.  Outdoor activities more common in certain demographics (eg, beach culture).

(5) Cultural and Psychosocial Stressors: Cultural emphasis on appearance, wealth, and entertainment may increase psychosocial pressure.  Homelessness crisis also creates variable community stress exposures.

(6) Economic and Occupational Exposures: Gig economy widespread, leading to precarious employment. Hollywood and tech industries also introduce unique workplace stress patterns.

(7) Healthcare Access and Public Policy: California’s public health programs are progressive, but uninsured rates still high. Proximity to cutting-edge research centers can boost care quality for some.

So one's exposome is a product of what one wants or gets from life, mapped onto a risk analysis table.  In New York City, one copes with urban pollution and persistent subway dust in an increasingly variable climate marked by periods of high humidity, a dietary range determined by one's wealth, the advantage of a good (if not always pleasant) mass transit system and the possibility of a “walking distance” lifestyle, albeit in usually crowded, fast-paced surroundings.  Employment conditions are mixed and access to quality health care is a product of one's insurance status or wealth.

In Dubai, one lives with frequent dust storms, months of intense heat and UV exposure, a dependence on food imports and the constant temptation of fast food (FSS: fat, salt, sugar).  The car-centric lifestyle has created a built environment described as “pedestrian-hostile” and there are sometimes severe legal limits on personal freedom, especially for migrant workers, who are subject to heat exposure and limited labor rights (even those rights which exist often are not enforced).  The health system distinctly is tiered (based on wealth) and almost exclusively privatized.

The air quality in Los Angeles greatly has improved since the 1970s but climate change has resulted in the more frequent intrusion of smoke from wildfires and the prevailing UV exposure tends to be high; the climate is not as “mild” as once it was rated.  While there are pockets in which walkability is good, Los Angeles mostly is a car-dependent culture and the coverage and frequency of mass-transit have in recent decades declined.  Although this is not unique to the city, there's a heightened sensitivity to specific cultural pressures based on appearance and perceptions of lifestyle, while housing stress is increasing.  Economic pressures are being exacerbated by the growth of the gig economy and traditionally secure forms of employment are being displaced by AI (bots, robots and hybrids).  Although California's healthcare system is sometimes described as "progressive", on the ground, outcomes are patchy.

So each location shapes the exposome in distinctive ways and the potential exists for the process better to be modelled so public health interventions and policies can be adjusted.  Of course, some risks are global: anywhere on the planet there’s always the chance one might be murdered by the Freemasons but some things which might seem unlikely to be affected by location turn out also to be an exposome variable.  Because planet Earth (1) is roughly spherical, (2) travels through space (where concepts like up & down don’t apply) and (3) constantly is exposed to meteoroids (every day Earth receives tons of “space dust”), it would be reasonable to assume one is equally likely to be struck by a meteoroid wherever one may be.  However, according to NASA (the US National Aeronautics and Space Administration), strikes are not equally likely everywhere, some latitudes (and regions) being more prone, due to several factors:

(1) Because Earth’s rotation and orbital motion create a bias, meteoroids tend more often to approach from the direction of Earth’s orbital motion (the “apex direction”), meaning the leading hemisphere (the side facing Earth's motion, near the dawn terminator) sees more meteoroid entries than the trailing hemisphere.  On a global scale, the effect is small but is measurable with the risk increasing as one approaches the equatorial regions where rotational velocity is greatest.

(2) Because most meteoroids approach from near the plane of the Solar System (the ecliptic plane), there’s what NASA calls a “latitude distribution bias”: Earth’s equator being tilted only some 23.5° from the ecliptic, meteoroids are more likely to intersect Earth’s atmosphere near lower latitudes (the tropical & sub-tropical zones) than near the poles.  So, those wishing to lower their risk should try to live in the Arctic or Antarctic although those suffering chronic kosmikophobia (fear of cosmic phenomena) are likely already residents.

(3) Some 70% of the Earth’s surface area being the seas and oceans, statistically, most meteoroids land in the water rather than on land so the lesson is clear: avoid living at sea.  The calculated probability is of course just math; because erosion rates are low, sparsely populated deserts preserve meteorites well and a large number have been found in places like the Sahara and outback Australia but those numbers reflect a preservation bias and don’t necessarily confirm a higher strike rate.  The lesson from the statisticians is: Don’t dismiss the notion of living in a desert because of a fear of being struck by a meteoroid.

(4) Gravitational focusing, although it does increase Earth’s meteoroid capture rates (disproportionately so for objects travelling more slowly), is a global effect so there is no known locational bias.  While there is at least one documented case of a person being struck by a meteoroid, the evidence does suggest the risk is too low to be statistically significant and should thus not be factored into the calculation of one’s exposome because one is anywhere at greater risk of being murdered by the Freemasons.

Ms Ann Hodges with bruise, Alabama, 1954.  Painful though it would have been, she did get her 15 minutes of fame and eventually sold the fragment for US$25 so there was that.

In the narrow technical sense, many people have been struck by objects from space (an estimated 40+ tons of the stuff arrives every day) but most fragments are dust particles, too small to be noticed.  The only scientifically verified injury a person has suffered was an impressively large bruise, inflicted on 30 November 1954 on Ms Ann Hodges (1920-1972) of Sylacauga, Alabama in the US by a meteorite (the part of a meteoroid that survives its fiery passage through the atmosphere to land on Earth’s surface).  Weighing 7.9 lb (3.6 kg), the intruder crashed through the roof of her house and bounced off a radio, striking her while she was enjoying a nap on the sofa.  The meteoroid was called Sylacauga and, just as appropriately, the offending meteorite was named the Hodges Fragment.  Anatomically modern humans (AMH) have been walking the planet for perhaps 300,000 years and we’ve been (more or less) behaviorally modern (BMH) for maybe a quarter of that so it’s possible many more of us have been struck.  In the absence of records, while it’s impossible to be definitive, it’s likely more have been murdered by the Freemasons than have ever been killed by stuff falling from space although, as the history of species extinction illustrates, a direct hit on someone is not a prerequisite for dire consequences.

Dashcam footage of meteorite fragment in the sky over Lexington, South Carolina.

The cosmic intruder crashed through the roof of a house on 26 June 2025 and although there were no injuries, Fox News reported the fragment left a hole in the floor “about the size of a large cherry tomato”.  Analysis determined the rock was from the asteroid belt between Mars and Jupiter and as well as the dramatic fireball many captured on their dashcams, it would briefly have broken the sound barrier as it entered Earth’s atmosphere.  It was also very old, dating from slightly before the formation of the Solar System’s rocky inner planets (one of which is Earth) some 4.56 billion years ago and such fragments are of interest to many branches of science because they represent a small part of the “basic building blocks” of those planets and can thus assist in understanding the processes active during the Solar System’s earliest days.  Curiously (to those not trained in such things), the cosmologists explained “such a small fragment didn’t present a threat to anyone” which seems strange given its impact left a small crater in a floor, one implication being one wouldn’t wish for such a thing to hit one’s skull.  That the impact happened in Georgia, a state adjacent to Alabama where some seven decades earlier the unfortunate Ms Hodges was struck, may make some add “meteorite fragments” to their list of exposome factors “south of the Mason-Dixon Line” but the sample size is too small for conclusions to be drawn and the events are mere geographic coincidence.

Wednesday, July 9, 2025

Lollipop

Lollipop (pronounced lol-ee-pop)

(1) A (usually spherical or disc-shaped) piece of hard candy attached to the end of a small stick, held in the hand while the candy is sucked or licked (It was essentially a toffee-apple without the apple; a stick dipped in toffee and the older spelling used in the UK was lollypop (which exists still in modern commerce)).

(2) Something in a shape resembling the candy on a stick.

(3) In the UK, Ireland and the Commonwealth, as lollipop lady (and lollipop man), a school crossing attendant (based on the shape of the "stop/go" signs traditionally used); in the slang of children they're also "lollipoppers".

(4) In computer networking, a routing protocol using sequence numbering starting at a negative value, increasing until zero, at which point it switches indefinitely to cycle through a finite set of positive numbers (a sketch of the scheme appears after this list of senses).

(5) In the labeling of the Android operating system, v5.0 to 5.1.1.

(6) In motorsport, a circular sign on a long stick, used by a pit crew to convey messages to drivers (a system still used despite advances in radio communication because (1) it's retained as a backup in case of radio failure and (2) the messages can't electronically be monitored and, done with care, can be secret).

(7) In the slang of fashion and related photography, a term for very thin models whose heads thus appear disproportionately large.

(8) Figuratively, something sweet but unsubstantial (originally of literature).

(9) In the slang of musical criticism, a short, entertaining, but undemanding piece of classical music.
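Apropos sense (4), the shape explains the name: the sequence space is a short, straight “stick” of negative numbers joined to a circular “candy” of positive ones, so a restarted router can announce a number guaranteed to be recognized as old.  A minimal Python sketch follows; the sizes of the stick and the loop are arbitrary here, chosen only for illustration.

NEG_START = -8    # the "stick": a restarted node begins here
CIRC_MOD = 16     # size of the circular part (the "candy"), values 1..15

def next_seq(s: int) -> int:
    """Advance a lollipop sequence number: -8, -7 ... -1, 0, 1 ... 15, 1 ..."""
    if s < 0:
        return s + 1   # still climbing the stick
    if s >= CIRC_MOD - 1:
        return 1       # wrap within the loop; never re-enter the stick
    return s + 1

def newer(a: int, b: int) -> bool:
    """True if a is "newer" than b under lollipop ordering."""
    if a < 0 or b < 0:
        return a > b   # anything on the stick is older than the loop
    return 0 < (a - b) % CIRC_MOD < CIRC_MOD // 2  # nearer half of the circle decides

# After a restart, NEG_START is older than any loop value already circulating:
assert newer(7, NEG_START) and newer(1, 15) and not newer(15, 1)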

1784: A creation of Modern English of uncertain origin but the construct may be the obvious lolly + pop.  Lolly was from the Northern English dialect loll (dangle the tongue) and pop was an alternative name for “slap”.  The alternative theory is it was borrowed from the Angloromani (literally "English Romani" and the language combining aspects of English and Romani), which was spoken by the Romani (gypsy, traveller, Roma etc) people in England, Ireland & Wales.  It was in the twentieth century displaced by English but traces remain in the variant English used by modern Roma.  The suggestion is of links with the Angloromani loli phabai (or lollipobbul (red or candy apple)), which was a blend from the Middle Indic lohita (from Sanskrit) and loha (red), drawn from reudh which had Indo-European roots.  Among etymologists, the Angloromani connection has most support.  Originally, lollipop seems to have referred just to the boiled sweet (ie "stickless") with the meaning "hard candy on a stick" not noted until the 1920s while the figurative sense (something sweet but unsubstantial) was in use by at least 1849.  Used in the slang of catwalk photographers, the verb lollipopping (a stick-thin model walking down a catwalk) and adjective lollipopish (a model close to thin enough to be a true "lollipop") are both non-standard.  Among the pill-poppers, there seems to be a consensus that post-rave, the best lollipops are lemon-flavored.  In commerce, the spelling varies including lollipop, lollypop, loli-pop, lollypopp and lolly-pop.  Lollipop & lollipopper are nouns and lollipoplike is an adjective; the noun plural is lollipops.

Lindsay Lohan (b 1986) enjoying a giant lollipop.

In classical music criticism, the term “lollipop” refers to short, appealing and often melodically charming pieces which were nevertheless judged as “lightweight in musical substance”.  Deployed often as “palate-cleansers” or encores, despite the opinions of many critics, composers, conductors and musicians, the bulk of the audience tended to enjoy them because in character they were often jaunty and playful, not something which endeared them to the earnest types who decided what deserved to be the canon of the “serious” repertoire in which complexity was valued above accessibility.  A well-known exponent of the genre was Johann Strauss II (1825–1899) and his An der schönen blauen Donau, Op. 314 (On the Beautiful Blue Danube, better known in English as The Blue Danube (1866)) and Tritsch-Tratsch-Polka, Op. 214 (Chit-chat (1858)) are exemplars of his technique.  The reason the lollipops were and remain popular with general audiences (typically not trained in any aspect of music) is that they paid their money to be entertained by listening to something they could enjoy, not always the experience delivered by the composers who preferred “the experimental”, valuing originality over beauty; these were the “formalists” (as comrade Stalin (1878-1953; Soviet leader 1924-1953) once labeled comrade Dmitri Shostakovich (1906-1975)) and they may be compared with the modern generation of architects churning out ugly buildings because prizes in their profession are awarded on the basis of work being “new” rather than “attractive”.  Neither art deco nor mid-century modern buildings are in any way “lollipops” but the committees which award prizes in architecture probably think of them that way.

Britney Spears (b 1981) with lollipop, emerging from a session in a West Hollywood tanning salon, Los Angeles, October, 2022.

Many composers at least dabbled in lollipop production and some were memorable, French composer Claude Debussy’s (1862–1918) Clair de Lune (1890) hauntingly beautiful and demanding nothing more from a listener than to sit and let it wash over them; even comrade Stalin (who liked tunes he could hum) would have enjoyed it despite Debussy being French.  Others were specialists in the genre including: (1) the Austrian-American Friedrich "Fritz" Kreisler (1875–1962) who published a few of his compositions under wholly fictitious “old” names to lend them some “classic” respectability, (2) the English conductor Sir Thomas Beecham (1879–1961) who had a reputation among his peers for treating his music with about the same seriousness as he handled his many relationships with women and it was his encores and brief “concert fillers” which more than anything popularized use of “lollipop” in this context; he was also a practical impresario who noted what pleased the crowd and sometimes constructed entire concerts of them, (3) Leopold Stokowski (1882–1977), a British conductor of Polish extraction noted for his arrangements of the works of Johann Sebastian Bach (1685–1750), pieces for which the appellation “lush” would have had to be coined had it not existed and (4) the Australian Percy Grainger (1882–1961) a man of not always conventional tastes & predilections who enjoyed an unusually close relationship with his mother although whether any of that in any way influenced his folk-inspired miniatures (quintessential lollipops) is a matter for debate.  What can’t be denied is that for the untrained, an hour or two of lollipop music will probably be enjoyed more than listening to the strains of stuff by Béla Bartók (1881-1945) or Arnold Schoenberg (1874–1951), the composers the critics think would be good for us.

A pandemic-era Paris Hilton (b 1981) in face mask with Whirly Pop lollipop.  Always remove facemask before attempting to lick or suck lollipop.

How to make lemon lollipops

Among the pill-poppers (and there are a lot of them about), there seems, at least impressionistically, to be a consensus that post-rave, the best lollipops are lemon-flavored.  It’s thought lemon lollipops work best in this niche because the acidic content interacts with taste receptors enjoying a heightened sensitivity induced by the pills’ chemistry.  Ideally, pill-poppers should pre-purchase lemon lollipops and at all times carry a few (on the basis of the (Boy) Scout motto: “Be prepared”) but that’s not always possible because, there being so many pill-poppers, shops often are out of stock of the lemon flavor.

Lemon Lollipops.

This recipe is therefore provided as a courtesy to pill-poppers and, having a shelf-life of weeks, lollipops can be prepared in advance; except for those popping at a heroic level, a batch should last a week so users should add the task to their routine, scheduling it perhaps after church every Sunday.  Lollipop sticks and one or more (depending on production target) lollipop molds will be required and the volume of ingredients quoted here should yield 24 small or 10-12 large lollipops.  Sticks and molds are available at supermarkets and speciality stores as are the small cellophane bags (needed only if some or all are being stored).  The taste can be varied by (slightly) adjusting the volumes of sugar, citric acid & lemon oil and preferences will vary between pill-poppers who are encouraged to experiment.
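For those planning production beyond a single batch, quantities scale (near enough) linearly with the target count; the trivial Python sketch below assumes the yields quoted above and is offered only as a convenience (cooking time does not scale the same way, so large targets are better made as repeated batches).

BATCH_YIELD_SMALL = 24   # small lollipops per batch, per the recipe below
BATCH = {"sugar_g": 200, "water_ml": 120, "corn_syrup_ml": 60,
         "citric_acid_tsp": 1.25, "lemon_oil_tsp": 0.75}

def shopping_list(target: int) -> dict:
    """Scale the base batch to a production target (in small lollipops)."""
    factor = target / BATCH_YIELD_SMALL
    return {ingredient: round(quantity * factor, 2)
            for ingredient, quantity in BATCH.items()}

print(shopping_list(60))   # eg, a week's supply for the heroic pill-popper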

Ingredients (lollipops)

1 cup (200 g) sugar
½ cup (120 ml) water
¼ cup (60 ml) light corn syrup
1¼ teaspoons citric acid
¾ teaspoon lemon oil
2-4 (according to preference) drops liquid yellow food coloring

Ingredients (sour powder)

½ cup (50 g) confectioners' sugar
2 teaspoons citric acid

Directions (lollipops)

(1) Coat lollipop molds with non-stick cooking spray.

(2) Place lollipop sticks in the molds.

(3) Combine the sugar, water, and corn syrup in a large, heavy saucepan and then bring mix to a boil over medium-high heat.

(4) Continue cooking until mixture reaches 300°F (150°C) which is the “hard-crack” stage.  Immediately remove saucepan from the heat.  The timing is critical so watch the pot during cooking.

(5) Add citric acid, lemon oil and food coloring and stir to combine.  (Because of the acidic nature of the mix, don’t allow face to come too close to pot because fumes can irritate the eyes).

(6) Pour the mixture into a heatproof measuring container with spout (or a candy funnel (which every pill-popper should own)).

(7) Divide the mixture among prepared molds and leave lollipops to cool and harden.  After about 15 minutes, they should be ready to remove from mold (may take longer if temperature or humidity are high).

Directions (sour powder)

(1) Mix confectioners’ sugar and citric acid in bowl.

(2) Holding by stick, dip lollipops in mixture, coating entire surface.

(3) Lollipops may immediately be consumed but if being stored, wrap in cellophane bags and twist-tie.  Store lollipops in cool, dark, dry place (they'll remain in a “best by” state for about a month). 

Wednesday, June 11, 2025

Hardwired

Hardwired (pronounced hahrd-wahyuhrd)

(1) In electronics, built into the hardware.

(2) In mainframe computing, a terminal connected to the CPU(s) by a direct cable rather than through a switching network.

(3) In the behavioral sciences, a cluster of theories pertaining to or describing intrinsic and relatively un-modifiable patterns of behavior by both humans and animals.  Published work describes genetically determined, instinctive behavior, as opposed to learned behavior.

(4) In computer programming, a kludge temporarily or quickly to fix a problem, done historically by bypassing the operating system and directly addressing the hardware (assembly language).

(5) Casual term for anything designed to perform a specific task.

1969:  A compound word: hard + wired.  Hard was from the Middle English hard from the Old English heard, from the Proto-Germanic harduz, derived ultimately from the primitive Indo-European kort-ús from kret (strong, powerful).  Cognate with the German hart, the Swedish hård, the Ancient Greek κρατύς (kratús), the Sanskrit क्रतु (krátu) and the Avestan xratu.  Wire was from the Middle English wir & wyr from the Old English wīr (wire, metal thread, wire-ornament) from the Proto-Germanic wīraz (wire) from the primitive Indo-European wehiros (a twist, thread, cord, wire) from wehy (to turn, twist, weave, plait).  The suffix -ed was used to form past tenses of (regular) verbs and in linguistics is used for the base form of any past form.  It was from the Middle English -ede & -eden, from the Old English -ode & -odon (a weak past ending) from the Proto-Germanic -ōd & -ōdēdun.  Cognate with the Saterland Frisian -ede (first person singular past indicative ending), the Swedish -ade and the Icelandic -aði.  The earliest known citation is from 1969 although there are suggestions the word or its variants had been used earlier, both in electronics and forms of mechanical production, the word migrating to zoology, genetics and human behavioral studies in 1971.  The spellings hardwired, hard wired and hard-wired are used interchangeably and no rules or conventions of use have ever emerged.

Lindsay Lohan in leather, hardwired to impressively chunky headphones, visiting New York’s Meatpacking District for a photo-shoot, Soho, November 2013.

The coming of wireless hardware devices really pleased many women who, for whatever reason, often showed an aversion to the sight of cables, whether lying across the floor or cluttering up their desks, noting their curious way of attracting dust and, adding insult to injury, an apparently insoluble tendency to tangle.  There are though still genuine advantages to using a cabled connection and although wireless headphones have long been the preferred choice of most, there remains a niche in which the old ways still are the best.  The advantages include (1) typically superior sound quality (which obviously can be subjective but there are metrics confirming the higher fidelity), (2) no batteries required, (3) inherently lower latency (thus ideal for gaming and audio or video editing because of the precision in synchronization), (4) simplified internal construction which should mean lower weight for equivalent dimensions and improved reliability and (5) close to universal compatibility with any device with a headphone jack or adapter.  The drawbacks include (1) one’s physical movement can be limited by the tethering (thus not ideal for workouts), (2) cables can be prone to damage, (3) cables can be prone to snags & tangles and (4) compatibility emerging as an issue on mobile devices, an increasing number lacking headphone jacks or demanding adaptors.  Of course for some the existence of Bluetooth pairing will be a compelling reason to go wireless and it has to be admitted modern devices are of such quality that even lower cost units are now good enough to please even demanding audiophiles.

SysCon

IBM explains by example.

In the pre-modern world of the mainframes, there might be a dozen or thousands of terminals (a monitor & keyboard) attached to a system but there was always one special terminal, SysCon (system console), hardwired to the central processor (something not wholly synonymous with the now familiar CPU (central processing unit) in PCs).  Unlike other terminals which connected, sometimes over long distances, through repeaters and telephone lines, SysCon, often used by system administrators (who sometimes dubbed themselves "SysCon", the really nerdy ones not using capitals), plugged directly into the core CPU.  When Novell released NetWare in 1983, they reprised SysCon as the name of the software layer which was the core administration tool.

Google ngram: The pre-twentieth century use of "hardwired" would have been unrelated to the modern senses.  Because of the way Google harvests data for their ngrams, they’re not literally a tracking of the use of a word in society but can be usefully indicative of certain trends (although one is never quite sure which trend(s)), especially over decades.  As a record of actual aggregate use, ngrams are not wholly reliable because: (1) the sub-set of texts Google uses is slanted towards the scientific & academic and (2) the technical limitations imposed by the use of OCR (optical character recognition) when handling older texts of sometimes dubious legibility (a process AI should improve).  Where numbers bounce around, this may reflect either: (1) peaks and troughs in use for some reason or (2) some quirk in the data harvested.
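For the curious, ngram series can be pulled programmatically.  A hedged Python sketch follows: the JSON endpoint used below is unofficial and undocumented, so the URL, parameters and response shape are assumptions which may break without notice.

import json
import urllib.parse
import urllib.request

# Assumed (unofficial) endpoint behind the Ngram Viewer's charts:
params = urllib.parse.urlencode({
    "content": "hardwired",
    "year_start": 1900,
    "year_end": 2019,
    "corpus": "en-2019",   # corpus naming has changed over the years
    "smoothing": 3,
})
url = "https://books.google.com/ngrams/json?" + params
with urllib.request.urlopen(url) as resp:
    data = json.load(resp)

for series in data:
    # "timeseries" is assumed to hold one relative frequency per year queried
    print(series.get("ngram"), series.get("timeseries", [])[-5:])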

In recent decades, the word “hardwired” has become a popular form, used figuratively to describe traits, behaviors, or tendencies believed to be innate, automatic, or deeply ingrained, the idea being things “permanently programmed into a human or animal”, on the model of the fixed circuitry in an electronic device.  Although probably over-used and sometimes with less than admirable precision, the term has come to be well-understood as referring to things (1) biologically pre-determined (instincts, reflexes), (2) psychologically ingrained (personality traits, cognitive biases) or (3) culturally conditioned but so deeply entrenched they appear intrinsic.  Even in professions such as medicine, psychiatry & psychology, all noted for their lexicons of technical terms with meanings often (in context) understood only by those with the training, it has become a popular metaphor in colloquial use.  It seems also to be an established element in academic writing because it’s such convenient verbal shorthand to convey meaning.  In that sense, it’s an acceptable metaphor in a way the phrase “it’s in the DNA” is not because that can be literal in a way “it's hardwired” cannot because living organisms have no wires.  DNA (deoxyribonucleic acid) is the famous double helix of polymers which constitute the so-called “building blocks” of life and sometimes the expression “it’s in the DNA” simply is incorrect because what’s being discussed is not connected with the double helix and it would be better to say “it’s hardwired” because the latter is vague enough to convey the idea without being so specific as to mislead.  The best use of the metaphoric “hardwired” is probably in neuroscience because the brain’s neural circuits may directly be compared with electronic circuitry.  The difficulty with using “hardwired” in the behavioural sciences is that very vagueness: it’s not helpful in suggesting where the lines exist between what’s determined by evolution and what are an individual’s temperamental traits.  That said, it remains a useful word but, used carelessly, it can overstate biological determinism.

Friday, May 30, 2025

Tatterdemalion

Tatterdemalion (pronounced tat-er-di-meyl-yuhn or tat-er-di-mal-yuhn)

(1) A person in tattered clothing; a shabby person.

(2) Ragged; unkempt or dilapidated.

(3) In fashion, (typically as “a tatterdemalion dress” etc), garments styled deliberately frayed or with constructed tears etc (also described as “distressed” or “destroyed”).

(4) A beggar (archaic).

1600–1610: The original spelling was tatter-de-mallian (the “demalion” rhymed with “Italian” in English pronunciation), the construct thus tatter + -demalion, of uncertain origin although the nineteenth century English lexicographer Ebenezer Cobham Brewer (1810-1897) (remembered still for his marvelous Dictionary of Phrase and Fable (1894)) suggested it might be from de maillot (shirt) which does seem compelling.  Rather than the source, tatter is thought to have been a back-formation from tattered, from the Middle English tatered & tatird, from the Old Norse tǫturr.  Originally, it was derived from the noun, but it was later re-analysed as a past participle (the construct being tatter + -ed) and from this came the verb.  As a noun a tatter was "a shred of torn cloth or an individual item of torn and ragged clothing" while the verb implied both (as a transitive) "to destroy an article of clothing by shredding" & (as an intransitive) "to fall into tatters".  Tatterdemalion is a noun & adjective and tatterdemalionism is a noun; the noun plural is tatterdemalions.

In parallel, there was also "tat", borrowed under the Raj from the Hindi टाट (ṭāṭ) (thick canvas) and in English it assumed a variety of meanings including as a clipping of tattoo, as an onomatopoeia referencing the sound made by dice when rolled on a table (and came to be used especially of a loaded die) and as an expression of disapprobation meaning “cheap and vulgar”, either in the context of low-quality goods or sleazy conduct.  The link with "tatty" in the sense of “shabby or ragged clothing” however apparently comes from tat as a clipping of the tatty, a woven mat or screen of gunny cloth made from the fibre of the Corchorus olitorius (jute plant) and noted for its loose, scruffy-looking weave.  The historic synonyms were shoddy, battered, broken, dilapidated, frayed, frazzled, moth-eaten, ragged, raggedy, ripped, ramshackle, rugged, scraggy, seedy, shabby, shaggy, threadbare, torn & unkempt and in the context of the modern fashion industry, distressed & destroyed.  An individual could also be described as a tramp, a ragamuffin, a vagabond, a vagrant, a gypsy or even a slum, some of those terms reflecting class and ethnic prejudice or stereotypes.  Historically, tatterdemalion was also a name for a beggar.

A similar word in Yiddish was שמאַטע‎ (shmate or shmatte and spelled variously as schmatte, schmata, schmatta, schmate, schmutter & shmatta), from the Polish szmata, of uncertain origin but possibly from szmat (a fair amount).  In the Yiddish (and as adopted in Yinglish) it meant (1) a rag, (2) a piece of old clothing & (3) in the slang of the clothing trade, any item of clothing.  That was much more specific than the Polish szmata which meant literally "rag or old, ripped piece of cloth" but was used also figuratively to mean "publication of low journalistic standard" (ie analogous to the English slang use of "rag") and in slang to refer to a woman of loose virtue (used as skank, slut etc might be used in English), a sense which transferred to colloquial use in sport to mean "simple shot", "easy goal" etc.

Designer distress: Lindsay Lohan illustrates the look.

Tatterdemalion is certainly a spectrum condition (the comparative “more tatterdemalion”; the superlative “most tatterdemalion”) and this is well illustrated by the adoption of the concept by fashionistas, modern capitalism soon there to supply demand.  In the fashion business, tatterdemalion needs to walk a fine line because tattiness was historically associated with poverty while designers need to provide garments which convey a message of wealth.  The general term for such garments is “distressed” although “destroyed” is (rather misleadingly) also used.

Highly qualified content provider Busty Buffy (b 1996) in “cut-off” denim shorts with leather braces while beltless.

The ancestor of designer tatterdemalion was a pair of “cut off” denim shorts, improvised not as a fashion statement but as a form of economy, gaining a little more life from a pair of jeans which had deteriorated beyond the point where mending was viable.  Until the counter-culture movements of the 1960s (which really began the previous decade but didn’t until the 1960s assume an expression in mass-market fashion trends), wearing cut-off jeans or clothing obviously patched and repaired generally was a marker of poverty although common in rural areas and among the industrial working class where it was just part of life.  It was only in the 1960s when an anti-consumerist, anti-materialist vibe attracted the large cohort of youth created by the post-war “baby boom” that obviously frayed or torn clothing came to be an expression of disregard or even disdain for the prevailing standards of neatness (although paradoxically they were the richest “young generation” ever).  It was the punk movement in the 1970s which took this to whatever extremes seemed possible, the distinctive look of garments with rips and tears secured with safety pins so emblematic of (often confected) rebellion that in certain circles it remains to this day part of the “uniform”.  The fashion industry of course noted the trend and what would later be called “distressed” denim appeared in the lines of many mainstream manufacturers as early as the 1980s, often paired with the acid-washing and stone-washing which previously had been used to make a pair of jeans appear “older”, sometimes a desired look.

Dolce & Gabbana Distressed Jeans (part number FTCGGDG8ET8S9001), US$1150.

That it started with denim makes sense because it's the ultimate "classless" fabric in that it's worn by both rich and poor and while that has advantages for manufacturers, it does mean some are compelled to find ways to ensure buyers are able (blatantly or with some subtlety) to advertise what they are wearing is expensive; while no fashion house seems yet to have put the RRP (recommended retail price) on a leather patch, it may be only a matter of time.  The marketing of jeans which even when new gave the appearance of having been “broken in” by the wearer was by the 1970s a defined niche, the quasi-vintage look of “fade & age” achieved with processes such as stone washing, enzyme washing, acid washing, sandblasting, emerizing and micro-sanding but this was just to create an effect, the fabrics not ripped or torn.  Distressed jeans represented the next step in the normal process of wear, fraying hems and seams, irregular fading and rips & tears now part of the aesthetic.  As an industrial process that’s not difficult to do but if done in the wrong way it won’t resemble exactly a pair of jeans subject to gradual degradation because different legs would have worn the denim at different places.  In the 2010s, the look spread to T-shirts and (predictably) hoodies, some manufacturers going beyond mere verisimilitude to (sort of) genuine authenticity, achieving the desired decorative effect by shooting shirts with bullets, managing a look which presumably the usual tricks of “nibbling & slashing” couldn’t quite emulate.  Warming to the idea, the Japanese label Zoo released jeans made from material torn by lions and tigers, the company anxious to mention the big cats in Tokyo Zoo seemed to "enjoy the fun" and to anyone who has seen a kitten with a skein of wool, that will sound plausible.  Others emulated the working-class look, the “caked-on muddy coating” and “oil and grease smears” another variant although one apparently short-lived; appearing dirty never a fashionable choice.  All these looks had of course been seen for centuries, worn mostly by the poor with little choice but to eke a little more wear from their shabby clothes but in the late twentieth century, as wealth overtook Western society, the look was adopted by many with disposable income; firstly the bohemians, hippies and other anti-materialists before the punk movement which needed motifs with some capacity to shock, something harder to achieve than had once been the case.

Distressed top and bottom.  Gigi Hadid (b 1995) in distressed T-shirt and "boyfriend" jeans.

For poets and punks, improvising the look from the stocks of thrift shops, that was fine but for designer labels selling scruffy-looking jeans for four-figure sums, it was more of a challenge, especially as the social media generation had discovered that above all they liked authenticity and faux authenticity would not do, nobody wanting to look as if they were trying too hard.  That might have seemed a problem, given the look was inherently fake but the aesthetic didn’t matter for its own sake, all that had to be denoted was “conspicuous consumption” (the excessive spending on wasteful goods as proof of wealth) and the juxtaposition of thousand dollar distressed jeans with the odd expensive accessory achieved that and more, the discontinuities offering irony as a look.  The labels, the prominence of which remained a focus, were enough for the message to work although one does wonder if any of the majors have been tempted to print a QR code on the back pocket, linked to the RRP because, what people are really trying to say is “My jeans cost US$1200”.

1962 AC Shelby American Cobra (CSX 2000), interior detail, 2016.

The value of selective scruffiness is well known in other fields.  When selling a car, usually a tatty interior greatly will depress the price (sometimes by more even than the cost of rectification).  However, if the tattiness is of some historic significance, it can add to a car’s value, the best example being if the deterioration is part of a vehicle's provenance and proof of originality, a prized attribute to the segment of the collector market known as the “originality police”.  In 2016, what is recognized as the very first Shelby American AC Cobra (CSX 2000) sold for US$13.75 million, becoming the highest price realized at auction for what is classified as an "American car".  Built in 1962, it was an AC Ace shipped to California without an engine (and apparently not AC's original "proof-of-concept" test bed which was fitted with one of the short-lived 221 cubic inch (3.6 litre) versions of Ford's new "thin-wall" Windsor V8) where the Shelby operation installed a 260 cubic inch (4.2 litre) Windsor and the rest is history.  The tatterdemalion state of the interior was advertised as one of the features of the car, confirming its status as “an untouched survivor”.  Among Cobra collectors, patina caused by Carroll Shelby's (1923–2012) butt is a most valuable tatterdemalion.

Patina plus and beyond buffing out: Juan Manuel Fangio, Mercedes-Benz W196R Stromlinienwagen (Streamliner), British Grand Prix, Silverstone, 17 July 1954.

Also recommended to be repaired before sale are dents, anything battered unlikely to attract a premium.  However, if a dent was put there by a Formula One (F1) world champion, it becomes a historic artefact.  In 1954, Mercedes-Benz astounded all when their new grand prix car (the W196R) appeared with all-enveloping bodywork, allowed because of a since closed loophole in the rule-book.  The sensuous shape made the rest of the field look antiquated although underneath it was a curious mix of old and new, the fuel-injection and desmodromic valve train representing cutting edge technology while the swing axles and drum brakes spoke to the past and present, the engineers’ beloved straight-eight configuration (its last appearance in F1) definitely the end of an era.  On fast tracks like Monza, the aerodynamic bodywork delivered great speed and stability but the limitations were exposed when the team ran the Stromlinienwagen at tighter circuits and in the 1954 British Grand Prix at Silverstone, Juan Manuel Fangio (1911–1995; winner of five F1 world-championship driver's titles) managed to clout a couple of oil-drums (those and bales of hay being how track safety was then done) because it was so much harder to determine the extremities without being able to see the front wheels.  Quickly, the factory concocted a functional (though visually unremarkable) open-wheel version and the sleek original was thereafter used only on the circuits where the highest speeds were achieved.  In 1954, the factory was unconcerned with the historic potential of the dents and repaired the tatterdemalion W196R so an artefact of the era was lost.  That apart, as used cars the W196s have held their value well, an open-wheel version selling at auction in 2013 for US$29.7 million while in 2025 a Stromlinienwagen realized US$53.9 million.

1966 Ferrari 330 GTC (1966-1968) restored by Bell Sport & Classic.  Many restored Ferraris of the pre-1973 era are finished to a much higher standard than when they left the showroom.  Despite this, genuine, original "survivors" (warts and all) are much-sought in some circles.

In the collector car industry, tatterdemalion definitely is a spectrum condition and for decades the matter of patina versus perfection has been debated.  There was once the idea that in Europe the preference was for a vehicle to appear naturally aged (well-maintained but showing the wear of decades of use) while the US market leaned towards cars restored to the point of being as good (or better) than they were on the showroom floor.  Social anthropologists might have some fun exploring that perception of difference and it was certainly never a universal rule but the debate continues, as does the argument about “improving” on the original.  Some of the most fancied machinery of the 1950s and 1960s (notably Jaguars, Ferraris and Maseratis) is now a staple of the restoration business but, although when new the machines looked gorgeous, it wasn’t necessary to dig too deep to find often shoddy standards of finish, the practice at the time something like sweeping the dirt “under the rug”.  When "restored", many of these cars are re-built to a higher standard, what was often left rough because it sat unseen somewhere now smoothed to perfection.  That’s what some customers want and the best restoration shops can do either though there are questions about whether what might be described as “fake patina” is quite the done thing.  Mechanics and engineers who were part of building Ferraris in the 1960s, upon looking at some immaculately “restored” cars, have been known wryly to remark: “that wasn't how we built them then.”

Gucci offered Distressed Tights at US$190 (for a pair so quite good value).  Rapidly, they sold out.

The fake patina business however goes back quite a way.  Among antique dealers, it’s now a definite niche but from the point at which the industrial revolution began to create a new moneyed class of mine and factory owners, there was a subset of the new money (and there are cynics who suggest it was mostly at the prodding of their wives) who wished to seem more like old money and a trend began to seek out “aged” furniture with which a man might deck out his (newly acquired) house to look as if things had been in the family for generations.  The notoriously snobbish (and amusing) diarist Alan Clark (1928–1999) once referred to someone as looking like “they had to buy their own chairs”, prompting one aristocrat to respond: “That’s a bit much from someone whose father (the art historian and life peer Kenneth Clark (1903–1983)) had to buy his own castle.”  The old money were of course snooty about such folk and David Lloyd George (1863–1945; UK prime-minister 1916-1922) would lament many of the “jumped-up grocers” in his Liberal Party were more troublesome and less sympathetic to the troubles of the downtrodden than the "backwoodsmen" gentry in their inherited country houses.